nginx caching proxy


nginx caching proxy

wld75
Hello,

I didn't find the answer in the documentation, but am I right in assuming,
from my observation, that when proxy_cache is enabled for a location and a
client requests a file that isn't in the cache yet, nginx starts transmitting
that file only after it has been fully received from the upstream? I'm asking
because I'm seeing lags equal to the upstream request time.

If I'm right, is there a way to start transmitting without waiting for the
end of the file?

Posted at Nginx Forum: https://forum.nginx.org/read.php?2,281621,281621#msg-281621
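
For context, a minimal sketch of the kind of setup being described (the cache
path, zone name, and upstream name below are assumptions, not taken from the
post):

    # Hypothetical caching proxy configuration matching the scenario above.
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m
                     max_size=1g inactive=60m use_temp_path=off;

    server {
        listen 80;

        location / {
            proxy_pass http://backend;              # hypothetical upstream
            proxy_cache my_cache;
            proxy_cache_valid 200 10m;

            # $upstream_cache_status (MISS, HIT, ...) helps confirm whether
            # the observed lag really coincides with a cache miss.
            add_header X-Cache-Status $upstream_cache_status;
        }
    }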


Re: nginx caching proxy

Roman Arutyunyan
Hello,

On Wed, Oct 17, 2018 at 05:13:15AM -0400, drookie wrote:
> Hello,
>
> I didn't find the answer in the documentation, but am I right in assuming,
> from my observation, that when proxy_cache is enabled for a location and a
> client requests a file that isn't in the cache yet, nginx starts
> transmitting that file only after it has been fully received from the
> upstream? I'm asking because I'm seeing lags equal to the upstream request
> time.

The short answer is no: the nginx cache does not introduce any delay here.
Maybe your client waits until it has received the full file before letting you know.

When a client requests a file that is missing from the cache, the file is
requested from the upstream and is sent to the client SIMULTANEOUSLY with
being saved to the cache.  However, if another client requests a file that is
currently being received and cached in the first client's context, and
proxy_cache_lock is enabled, then that second client will wait for the file to
be fully cached by nginx and will only receive it from the cache after that.

[..]
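
As a sketch only (the timeout values are illustrative, not a recommendation),
the waiting behaviour described above is controlled by these directives inside
the caching location:

    # With proxy_cache_lock on, only one request per key populates a new
    # cache element; other requests for the same key wait for that fill
    # and are then served from the cache.
    proxy_cache_lock on;

    # Waiting requests are passed to the upstream (without being cached)
    # once this timeout expires.
    proxy_cache_lock_timeout 5s;

    # If the filling request takes longer than this, one more request may
    # be passed to the upstream.
    proxy_cache_lock_age 5s;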

--
Roman Arutyunyan

Re: nginx caching proxy

wld75
And what about the resulting cache file?
Will there be only one object in the cache when two clients GET the same file
SIMULTANEOUSLY, or two different objects under the nginx proxy_cache_path?

Posted at Nginx Forum: https://forum.nginx.org/read.php?2,281621,281686#msg-281686


Re: nginx caching proxy

Roman Arutyunyan
On Thu, Oct 25, 2018 at 07:21:35AM -0400, vizl wrote:
> And what about the resulting cache file?
> Will there be only one object in the cache when two clients GET the same
> file SIMULTANEOUSLY, or two different objects under the nginx
> proxy_cache_path?

Only one response can be cached for a single key.  Once a new response is
cached, the previous one is discarded.
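
For illustration only (the behaviour follows from the cache key; the value
shown is nginx's documented default for proxy_cache_key):

    # Two clients requesting the same URI through the same proxied host
    # produce the same key, so they share a single object under
    # proxy_cache_path.
    proxy_cache_key $scheme$proxy_host$request_uri;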

--
Roman Arutyunyan

Re: nginx caching proxy

Gryzli Bugbear
Except in the case when your site returns "Vary: Accept-Encoding", for
example, and the two clients use different Accept-Encoding headers
(the same applies to whatever header Vary names).  :)
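
A common way to keep the number of such variants small, sketched here with
hypothetical names, is to normalize Accept-Encoding before the request reaches
the upstream:

    # Collapse arbitrary Accept-Encoding values into "" or "gzip", so a
    # response carrying "Vary: Accept-Encoding" yields at most two cache
    # variants per key.
    map $http_accept_encoding $ae_norm {
        default   "";
        "~*gzip"  gzip;
    }

    server {
        location / {
            proxy_set_header Accept-Encoding $ae_norm;
            proxy_pass http://backend;     # hypothetical upstream
            proxy_cache my_cache;          # hypothetical zone name
        }
    }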

On 10/25/18 4:20 PM, Roman Arutyunyan wrote:
> On Thu, Oct 25, 2018 at 07:21:35AM -0400, vizl wrote:
>> And what about the resulting cache file?
>> Will there be only one object in the cache when two clients GET the same
>> file SIMULTANEOUSLY, or two different objects under the nginx
>> proxy_cache_path?
> Only one response can be cached for a single key.  Once a new response is
> cached, the previous one is discarded.
>
--
-- Gryzli

https://gryzli.info
