Serve multiple requests from a single proxy request


Serve multiple requests from a single proxy request

wld75
Hello,
I'm wondering if nginx is able to serve multiple requests from a single
proxy request before it completes.

I am using the following configuration:

    proxy_cache_lock on;
    proxy_cache_lock_timeout 5s;
    proxy_cache ram;
    proxy_pass myUpstream;

My upstream uses chunked transfer encoding and serves the request in 10 sec.
Now if I send 2 requests to nginx, the first one starts responding
immediately, but the second starts 5 sec later (the lock timeout) and then
performs a second request to my upstream.

Is there a way to configure nginx to immediately respond to multiple
requests with a single request to my upstream?

Thank you in advance,
Traquila

Posted at Nginx Forum: https://forum.nginx.org/read.php?2,281058,281058#msg-281058

_______________________________________________
nginx mailing list
[hidden email]
http://mailman.nginx.org/mailman/listinfo/nginx

Re: Serve multiple requests from a single proxy request

Roman Arutyunyan
Hi,

On Fri, Aug 31, 2018 at 05:02:21AM -0400, traquila wrote:

> Hello,
> I'm wondering if nginx is able to serve multiple requests from a single
> proxy request before it completes.
>
> I am using the following configuration:
>
>     proxy_cache_lock on;
>     proxy_cache_lock_timeout 5s;
>     proxy_cache ram;
>     proxy_pass myUpstream;
>
> My upstream uses chunked transfer encoding and serves the request in 10 sec.
> Now if I send 2 requests to nginx, the first one starts responding
> immediately, but the second starts 5 sec later (the lock timeout) and then
> performs a second request to my upstream.

This is all normal considering you have proxy_cache_lock enabled.  Your second
request waits until the first request completes and fills the cache entry,
which is then supposed to serve the second request.  But the lock times out
after 5 seconds, and at that point the second request goes to the upstream itself.
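[A sketch of one possible mitigation, not from the original thread: assuming the
`ram` zone and `myUpstream` upstream from the config above, and a ~10-second
upstream response, raising the lock timeout above the upstream response time lets
waiting requests be served from the completed cache entry instead of hitting the
upstream themselves. The cache path is hypothetical.]

    # Sketch only: path and zone size are assumptions.
    proxy_cache_path /var/cache/nginx keys_zone=ram:64m;

    location / {
        proxy_cache              ram;
        proxy_cache_lock         on;
        # Wait longer than the ~10s the upstream needs, so a waiting
        # request is answered from the now-complete cache entry.
        proxy_cache_lock_timeout 15s;
        # If the filling request stalls, let a waiting request take
        # over the cache fill (available since nginx 1.7.8).
        proxy_cache_lock_age     15s;
        proxy_pass               http://myUpstream;
    }

[Note this only avoids the duplicate upstream request; the waiting client is
still delayed until the entry is complete, so it does not give the "respond
immediately" behavior asked about.]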

> Is there a way to configure nginx to immediately respond to multiple
> requests with a single request to my upstream?

There is no proxy/cache option to enable that behavior.  However, if your
response is big enough, you can try using the slice module:

http://nginx.org/en/docs/http/ngx_http_slice_module.html

With this module you can slice the response into pieces and cache each one
separately.  While the same logic as above applies to the slices too,
the fact that they are usually much smaller than the entire response makes
it look like the response is proxied to multiple clients simultaneously.
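[A configuration sketch following the ngx_http_slice_module documentation,
reusing the `ram` zone and `myUpstream` upstream from the original config;
the 1m slice size is an arbitrary assumption.]

    location / {
        slice              1m;
        proxy_cache        ram;
        proxy_cache_key    $uri$is_args$args$slice_range;
        proxy_set_header   Range $slice_range;
        proxy_cache_valid  200 206 1h;
        proxy_cache_lock   on;
        proxy_pass         http://myUpstream;
    }

[Note that slicing relies on the upstream honoring byte-range requests and
answering with 206 Partial Content; an upstream that only streams via chunked
transfer encoding, as described earlier in the thread, may not support this.]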

--
Roman Arutyunyan

Re: Serve multiple requests from a single proxy request

wld75
Thank you for your answer.
This means nginx is not suitable for CMAF and low-latency streaming.

I tried the slice module and read its code, but it does not cover my needs.
I guess I have to develop a new proxy module.

Thanks,
Traquila


Roman Arutyunyan Wrote:
-------------------------------------------------------

> Hi,
>
> On Fri, Aug 31, 2018 at 05:02:21AM -0400, traquila wrote:
> > Hello,
> > I'm wondering if nginx is able to serve multiple requests from a single
> > proxy request before it completes.
> >
> > I am using the following configuration:
> >
> >     proxy_cache_lock on;
> >     proxy_cache_lock_timeout 5s;
> >     proxy_cache ram;
> >     proxy_pass myUpstream;
> >
> > My upstream uses chunked transfer encoding and serves the request in 10 sec.
> > Now if I send 2 requests to nginx, the first one starts responding
> > immediately, but the second starts 5 sec later (the lock timeout) and then
> > performs a second request to my upstream.
>
> This is all normal considering you have proxy_cache_lock enabled.  Your second
> request waits until the first request completes and fills the cache entry,
> which is then supposed to serve the second request.  But the lock times out
> after 5 seconds, and at that point the second request goes to the upstream
> itself.
>
> > Is there a way to configure nginx to immediately respond to multiple
> > requests with a single request to my upstream?
>
> There is no proxy/cache option to enable that behavior.  However, if your
> response is big enough, you can try using the slice module:
>
> http://nginx.org/en/docs/http/ngx_http_slice_module.html
>
> With this module you can slice the response into pieces and cache each one
> separately.  While the same logic as above applies to the slices too,
> the fact that they are usually much smaller than the entire response makes
> it look like the response is proxied to multiple clients simultaneously.
>
> --
> Roman Arutyunyan

Posted at Nginx Forum: https://forum.nginx.org/read.php?2,281058,281062#msg-281062
