Make lua-resty-ffi run on Envoy

Barton Fink

Background

Both Nginx and Envoy are high-performance proxy servers.

They are written in C/C++, so they run fast, but the C/C++ ecosystem is notoriously weak. Could we do hybrid programming with Go, Java, Python, Rust, or Node.js, so that we could tap into the rich ecosystems of those popular mainstream programming languages?

Envoy officially supports a Lua filter, so could we port lua-resty-ffi to Envoy? Then everything would fall into place.

Solution

Envoy has a filter mechanism that gives us full control over the request and response flow. Filters allow asynchronous processing, and the Lua filter implements it on top of coroutines: a script can suspend in the middle of a request and be resumed later.
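
This coroutine-based suspension is already visible in stock Envoy. For example, the built-in httpCall API yields the current coroutine and Envoy resumes it once the upstream answers (a minimal sketch; the cluster name is illustrative):

function envoy_on_request(request_handle)
  -- httpCall() yields this coroutine; Envoy resumes it when the
  -- upstream response arrives ("some_cluster" is an illustrative name)
  local headers, body = request_handle:httpCall(
    "some_cluster",
    {[":method"] = "GET", [":path"] = "/", [":authority"] = "example.com"},
    nil,   -- no request body
    5000)  -- timeout in milliseconds
  request_handle:logInfo("upstream status: " .. headers[":status"])
end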

Envoy uses a multi-threaded model in which each worker thread accepts and handles requests independently.

  • The calling direction of lua-resty-ffi is platform-independent: it uses a mutex-protected task queue with pthread condition-variable notification.
  • The callback direction of lua-resty-ffi has to be adapted to Envoy's callback mechanism, which wakes the event loop via eventfd (through libevent) and drains a locked queue.

Unlike Nginx worker processes, each source code block defined in the filter configuration has its own Lua VM instance, so lua-resty-ffi may be instantiated multiple times, once per Lua VM.

Call

We just need to change the post function in resty_ffi.lua.

In the Envoy Lua filter, the entry function (such as envoy_on_request) receives a wrapper handle that represents the filter in the Lua realm. But the wrapper is a userdata, and the corresponding C++ object is stored at an aligned offset inside it, so the userdata pointer is not the object's address. We therefore need to implement a method that returns the C++ address, so that the C API can call methods on the C++ object.

local r = wrapper and wrapper:this() or get_request()

We also need a yield method (because a coroutine cannot yield across a LuaJIT FFI call):

if wrapper then
    return wrapper:ffiYield()
end

ffiYield() sets the state to FFICall, which in turn suspends this request (Http::FilterHeadersStatus::StopIteration) until the FFI callback resumes it:

int StreamHandleWrapper::ffiYield(lua_State* state) {
  // Mark the stream as waiting on an FFI result, then suspend the coroutine.
  state_ = State::FFICall;
  return lua_yield(state, 0);
}

...

Http::FilterHeadersStatus StreamHandleWrapper::start(int function_ref) {
  // We are on the top of the stack.
  coroutine_.start(function_ref, 1, yield_callback_);
  Http::FilterHeadersStatus status = (state_ == State::WaitForBody || state_ == State::FFICall ||
                                      state_ == State::HttpCall || state_ == State::Responded)
                                         ? Http::FilterHeadersStatus::StopIteration
                                         : Http::FilterHeadersStatus::Continue;

  if (status == Http::FilterHeadersStatus::Continue) {
    headers_continued_ = true;
  }

  return status;
}
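
Putting the two pieces together, the Envoy branch of the call path in resty_ffi.lua looks roughly like this (a simplified sketch, not the actual source; apart from this(), ffiYield(), and get_request(), the names are hypothetical):

-- sketch of the adapted post path; queue_task() and yield_request()
-- are hypothetical stand-ins for the real enqueue/yield logic
local function post(wrapper, task)
  -- raw pointer for the C API: the C++ StreamHandleWrapper on Envoy,
  -- or the per-request pointer from get_request() on nginx
  local r = wrapper and wrapper:this() or get_request()
  queue_task(r, task)  -- hypothetical: push onto the locked task queue
  if wrapper then
    -- Envoy: yield through the wrapper's C method, since LuaJIT
    -- cannot yield across an FFI call
    return wrapper:ffiYield()
  end
  return yield_request(r)  -- hypothetical nginx yield path
end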

Callback

The most notable change is the callback path: we need to use Dispatcher::post, so that the callback is invoked on the worker thread that owns the request, where it resumes the coroutine and calls callbacks_.continueIteration().

// Called from the foreign language runtime's thread to deliver the result.
void ngx_http_lua_ffi_respond(void* tsk, int rc, char* rsp, int rsp_len) {
  ngx_thread_task_t* task = static_cast<ngx_thread_task_t*>(tsk);
  ffi_task_ctx_t* ctx = static_cast<ffi_task_ctx_t*>(task->ctx);
  pthread_mutex_lock(&ctx->mtx);
  if (!ctx->is_abort) {
    ctx->rc = rc;
    ctx->rsp = rsp;
    ctx->rsp_len = rsp_len;
    auto wrapper = static_cast<StreamHandleWrapper*>(ctx->wrapper);
    // Hop onto the worker thread that owns this request; the lambda runs
    // in the Envoy event loop (woken via eventfd under the hood).
    wrapper->dispatcher().post([ctx, wrapper] {
      if (!ctx->is_abort) {
        wrapper->onFFICallback(ctx);
      }
    });
  } else {
    // The request was aborted in the meantime: free the task and response.
    ngx_http_lua_ffi_task_free(ctx);
    if (rsp) {
      free(rsp);
    }
  }
  pthread_mutex_unlock(&ctx->mtx);
}

void StreamHandleWrapper::onFFICallback(ffi_task_ctx_t* ffi_ctx) {
  ASSERT(state_ == State::FFICall);
  ...

  try {
    // Resume the suspended Lua coroutine with the FFI result.
    resumeCoroutine(3, yield_callback_);
    markDead();
  } catch (const Filters::Common::Lua::LuaException& e) {
    filter_.scriptError(e);
  }

  if (state_ == State::Running) {
    headers_continued_ = true;
    callbacks_.continueIteration();
  }
}
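
From the Lua script's point of view, this whole suspend/resume round trip is invisible: a lua-resty-ffi call simply looks synchronous, as in the demo below.

-- caller's view: post() yields inside foobar(), onFFICallback()
-- resumes the coroutine, and the call returns as if it had blocked
local demo = ngx.load_ffi("ffi_go_echo")
local ok, res = demo:foobar("foobar", request_handle)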

Demo

envoy.yaml

  http_filters:
  - name: lua_filter_with_custom_name_0
    typed_config:
      "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.Lua
      default_source_code:
        inline_string: |
          function envoy_on_request(request_handle)
            require("resty_ffi")
            local demo = ngx.load_ffi("ffi_go_echo")
            local ok, res = demo:foobar("foobar", request_handle)
            assert(ok)
            assert(res == "foobar")
            local demo = ngx.load_ffi("resty_ffi_python", "ffi.echo?,init", {is_global = true})
            local ok, res = demo:echo("hello", request_handle)
            assert(ok)
            assert(res == "hello")
          end          

Test:

# run demo
cd /opt/envoy/examples/lua
PYTHONPATH=/opt/lua-resty-ffi/examples/python/ \
LD_LIBRARY_PATH=/opt/lua-resty-ffi/examples/python \
LUA_PATH='/opt/lua-resty-ffi/?.lua;;' \
envoy -c envoy.yaml --concurrency 1

curl -v localhost:8000/httpbin/get

Output:

* Connected to localhost (127.0.0.1) port 8000 (#0)
> GET /httpbin/get HTTP/1.1
> Host: localhost:8000
> User-Agent: curl/7.68.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< date: Sun, 09 Jul 2023 06:05:36 GMT
< content-type: application/json
< content-length: 248
< access-control-allow-origin: *
< access-control-allow-credentials: true
< x-backend-header-rtt: 0.004774
< alt-svc: h3=":443"; ma=3600, h3-29=":443"; ma=3600
< server: envoy
< via: 1.1 nghttpx
< x-frame-options: SAMEORIGIN
< x-xss-protection: 1; mode=block
< x-content-type-options: nosniff
< x-envoy-upstream-service-time: 146
<
{
  "args": {},
  "headers": {
    "Accept": "*/*",
    "Host": "localhost:8000",
    "User-Agent": "curl/7.68.0",
    "X-Envoy-Expected-Rq-Timeout-Ms": "15000"
  },
  "origin": "192.168.0.1",
  "url": "http://localhost:8000/httpbin/get"
}
* Connection #0 to host localhost left intact

Conclusion

With lua-resty-ffi, you can use your favorite mainstream programming language, e.g. Go, Java, Python, Rust, or Node.js, to develop for Nginx and Envoy, and enjoy their rich ecosystems directly.

Discussion is welcome, and please star my GitHub repo:

https://github.com/kingluo/lua-resty-ffi