To ensure optimum origin performance during times of increased demand or during scheduled downtime for your servers, consider the following best practices for your service's caching configurations.

Integrate Fastly with your application platform

You can optimize caching with Fastly by customizing your application platform settings. For instructions, see our documentation on integrating third-party services and configuring web server software. We also provide a variety of plugins to help you directly integrate Fastly with your content management system.

Understand your cache hit ratio

The number of requests delivered by a cache server, divided by the number of cacheable requests (hits + misses), is called the cache hit ratio. A high cache hit ratio means you've kept request traffic from hitting your origin unnecessarily. In general, you want your cache hit ratio as high as possible, usually in excess of 90%. You can check your hit ratio by viewing the Observability page for your service.

Understand how TTLs work

The amount of time information can be retained in cache memory is considered its time to live, or TTL. TTL is set based on the cache-related header information returned from your origin server. You can also explicitly set a fallback TTL (sometimes called a default TTL). See our Google Cloud Storage instructions if you're changing the default TTL for a GCS bucket.

Understand how cache control headers work

You can use cache control headers to set policies that determine how long your data is cached. Fastly looks for caching information in each of these headers, as described in our documentation on cache freshness. Surrogate headers are a relatively new addition to the cache management vocabulary (described in this W3C tech note). These headers provide a specific cache policy for proxy caches in the processing path. Surrogate-Control accepts many of the same values as Cache-Control, plus some other more esoteric ones (read the tech note for all the options). One use of this technique is to provide conservative cache interactions to the browser (for example, Cache-Control: no-cache). This causes the browser to re-validate with the source on every request for the content, which makes sure that the user is getting the freshest possible content.
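To make the Surrogate-Control idea concrete, here is a minimal sketch (not Fastly's actual logic) of how a proxy cache might pick a TTL from an origin response: the proxy prefers Surrogate-Control and only falls back to Cache-Control, while the browser only ever sees Cache-Control. The header names are standard; the parsing, precedence, and fallback value are simplified assumptions for illustration.

```python
# Simplified sketch of proxy-side TTL selection. Not Fastly's implementation.
import re

FALLBACK_TTL = 3600  # hypothetical fallback (default) TTL, in seconds


def proxy_ttl(headers: dict) -> int:
    """Return the TTL in seconds a proxy cache would apply to a response."""
    # The proxy honors Surrogate-Control first, then Cache-Control.
    for name in ("Surrogate-Control", "Cache-Control"):
        value = headers.get(name, "")
        match = re.search(r"max-age=(\d+)", value)
        if match:
            return int(match.group(1))
        if "no-cache" in value:
            return 0  # re-validate with the origin on every request
    return FALLBACK_TTL  # no caching headers: use the fallback TTL


# Conservative caching for the browser, aggressive caching at the proxy:
origin_response = {
    "Cache-Control": "no-cache",           # browser re-validates every time
    "Surrogate-Control": "max-age=86400",  # proxy may cache for a day
}
print(proxy_ttl(origin_response))                # 86400
print(proxy_ttl({"Cache-Control": "no-cache"}))  # 0
print(proxy_ttl({}))                             # 3600 (fallback)
```

Because the proxy strips or ignores Surrogate-Control before responding to the client, the same response can be cached for a day at the edge while the browser re-validates on every request.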
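As defined earlier, the cache hit ratio is hits divided by all cacheable requests (hits + misses). The arithmetic can be sketched as follows; the traffic numbers are made up for illustration.

```python
def cache_hit_ratio(hits: int, misses: int) -> float:
    """Requests served from cache divided by all cacheable requests."""
    total = hits + misses
    return hits / total if total else 0.0


# Hypothetical day of traffic: 950 hits and 50 misses.
ratio = cache_hit_ratio(950, 50)
print(f"{ratio:.0%}")  # 95%, above the 90% target mentioned above
```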