Background
Both nginx and envoy are high-performance proxy servers. They are written in C/C++, so they run fast, but the C/C++ ecosystem is notoriously weak. Can we do hybrid programming with Go, Java, Python, Rust, or Node.js, so that we can use the rich ecosystems of those popular mainstream programming languages?
Envoy officially supports a Lua filter, so could we port lua-resty-ffi to Envoy? Then everything falls into place.
Solution
Envoy has a filter mechanism that gives us full control over the request and response flow. Filters allow asynchronous processing, and the Lua filter supports it through coroutines.
Envoy uses a multi-threaded model in which each worker thread accepts requests independently.
- The calling direction of lua-resty-ffi is platform-independent: it uses a queue protected by a pthread mutex plus pthread condition-variable notification.
- The callback direction of lua-resty-ffi has to be adapted to the Envoy callback mechanism, which uses an eventfd (via libevent) and a locked queue.
Unlike nginx worker processes, each source block defined in the filter configuration has its own Lua VM instance, so lua-resty-ffi may have multiple separate instances, one per Lua VM.
Call
We just need to change the post function in resty_ffi.lua.
In the Envoy Lua filter, the entry function, e.g. envoy_on_request, receives a wrapper handle that represents the filter in the Lua realm. But the wrapper is of userdata type, which hides the address of the corresponding C++ object, so we need to implement a method that returns that address, allowing us to call the C++ methods through the C API.
We also need a yield method, because LuaJIT FFI calls cannot yield.
ffiYield() sets the status to FFICall, which in turn suspends this request (Http::FilterHeadersStatus::StopIteration) until the ffi side resumes it.
Callback
The most notable change is the callback path: we need to use Dispatcher::post, so that the callback is invoked on the worker thread, where it calls callbacks_.continueIteration().
Demo
envoy.yaml
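The original configuration is not reproduced here; a minimal envoy.yaml of the kind the demo implies might look like this (the listener port matches the curl output below, while the route, cluster, and inline Lua are illustrative assumptions):

```yaml
static_resources:
  listeners:
  - address:
      socket_address: { address: 0.0.0.0, port_value: 8000 }
    filter_chains:
    - filters:
      - name: envoy.filters.network.http_connection_manager
        typed_config:
          "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
          stat_prefix: ingress_http
          route_config:
            virtual_hosts:
            - name: backend
              domains: ["*"]
              routes:
              - match: { prefix: "/" }
                route: { cluster: upstream }
          http_filters:
          - name: envoy.filters.http.lua
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.Lua
              default_source_code:
                inline_string: |
                  local ffi = require("resty_ffi")
                  function envoy_on_request(handle)
                    -- call into Go/Java/Python/Rust/Node.js via lua-resty-ffi here
                  end
          - name: envoy.filters.http.router
            typed_config:
              "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
  clusters:
  - name: upstream
    type: STRICT_DNS
    load_assignment:
      cluster_name: upstream
      endpoints:
      - lb_endpoints:
        - endpoint:
            address:
              socket_address: { address: 127.0.0.1, port_value: 8080 }
```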
Test:
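The test command itself was not preserved; judging from the verbose transcript below, it is presumably:

```shell
curl -v http://localhost:8000/httpbin/get
```

This requires the demo Envoy from the configuration above to be listening on port 8000.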
Output:
* Connected to localhost (127.0.0.1) port 8000 (#0)
> GET /httpbin/get HTTP/1.1
> Host: localhost:8000
> User-Agent: curl/7.68.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< date: Sun, 09 Jul 2023 06:05:36 GMT
< content-type: application/json
< content-length: 248
< access-control-allow-origin: *
< access-control-allow-credentials: true
< x-backend-header-rtt: 0.004774
< alt-svc: h3=":443"; ma=3600, h3-29=":443"; ma=3600
< server: envoy
< via: 1.1 nghttpx
< x-frame-options: SAMEORIGIN
< x-xss-protection: 1; mode=block
< x-content-type-options: nosniff
< x-envoy-upstream-service-time: 146
<
{
"args": {},
"headers": {
"Accept": "*/*",
"Host": "localhost:8000",
"User-Agent": "curl/7.68.0",
"X-Envoy-Expected-Rq-Timeout-Ms": "15000"
},
"origin": "192.168.0.1",
"url": "http://localhost:8000/httpbin/get"
}
* Connection #0 to host localhost left intact
Conclusion
With lua-resty-ffi, you can use your favorite mainstream programming language, e.g. Go, Java, Python, Rust, or Node.js, to develop in both Nginx and Envoy, and enjoy their rich ecosystems directly.
Discussion is welcome, and please star my GitHub repo: