Frequently Asked Questions (FAQ)
This can happen when we process an asset for the first time.
The very first time we see an asset, we serve the original asset to your users with a very short cache time. In parallel, we start optimizing the asset on our servers. Once it is optimized, we cache the optimized asset on our CDN for a longer period. From that point on, your visitors enjoy the optimized asset, which will very likely be smaller than your original.
We reuse the Cache-Control headers that your server sends and cache assets on our CDN for that period.
There is one exception, however. If we find that your server is not setting any Cache-Control header at all (which is more common than one might expect), we cache the assets on our CDN for a conservative period of 4 hours.
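The caching rule above can be sketched as a small function. This is illustrative logic only, not Dexecure's actual implementation; it simply mirrors the behaviour described here (reuse max-age if present, otherwise fall back to 4 hours):

```javascript
// Illustrative sketch of the caching rule described above (not Dexecure's actual code).
// Returns the CDN cache lifetime in seconds for a given Cache-Control header.
const DEFAULT_TTL_SECONDS = 4 * 60 * 60; // conservative 4-hour default

function cdnCacheTtl(cacheControlHeader) {
  if (!cacheControlHeader) {
    // Origin sent no Cache-Control header at all: use the 4-hour default.
    return DEFAULT_TTL_SECONDS;
  }
  // Reuse the origin's max-age directive if one is present.
  const match = /max-age=(\d+)/.exec(cacheControlHeader);
  return match ? Number(match[1]) : DEFAULT_TTL_SECONDS;
}

console.log(cdnCacheTtl("public, max-age=86400")); // 86400
console.log(cdnCacheTtl(null));                    // 14400
```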
You can cache-bust assets on Dexecure much like you would cache-bust resources in your users' browsers: for example, by using a different URL or changing the query parameters.
If you find yourself doing this often, you can set a shorter Cache-Control header to instruct Dexecure to cache for a shorter time. Read Question 2 for more information on how Dexecure uses Cache-Control headers.
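As a concrete example of the query-parameter approach, a version parameter can be bumped whenever the underlying asset changes. The parameter name "v" is just a common convention, not something Dexecure requires:

```javascript
// Illustrative: cache-bust an asset URL by changing a query parameter.
// The "v" parameter name is a convention chosen for this example.
function cacheBust(url, version) {
  const u = new URL(url);
  u.searchParams.set("v", version);
  return u.toString();
}

console.log(cacheBust("https://example.com/img/logo.png", "2"));
// https://example.com/img/logo.png?v=2
```

Bumping the version produces a different URL, so both browsers and the Dexecure CDN treat it as a new asset.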
In the very unlikely event of a Dexecure server being down, a fail-safe mechanism kicks in automatically to ensure uninterrupted service to your users.
If an optimized asset is not accessible from the Dexecure POPs, the original asset from your server is automatically served to the user's browser. We implement this using Service Workers, and we also have automated DNS failover back to your original servers.
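The fallback idea can be sketched as a small helper. This is a simplified illustration, not Dexecure's actual Service Worker code; the fetch function is injected so the logic is testable outside a browser:

```javascript
// Simplified sketch of the fallback idea (not Dexecure's actual Service Worker code):
// try the optimized asset first, and fall back to the origin URL if that fails.
async function fetchWithFallback(optimizedUrl, originUrl, fetchFn) {
  try {
    const res = await fetchFn(optimizedUrl);
    if (res.ok) return res;
  } catch (_) {
    // Network error reaching the Dexecure POP: fall through to the origin.
  }
  return fetchFn(originUrl);
}

// Inside a Service Worker this would be wired up roughly as:
//   self.addEventListener("fetch", (event) => {
//     event.respondWith(
//       fetchWithFallback(optimizedUrlFor(event.request.url), // hypothetical helper
//                         event.request.url, fetch));
//   });
```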
Dexecure provides a multi-cloud architecture using different commodity CDNs such as Amazon CloudFront and Cloudflare. These are seamlessly managed from our side, and no configuration is required on yours. Rest assured that whichever CDN we choose, all your assets will be served over a fast HTTP/2 connection.
You can inspect the response headers of any request you are interested in. There will be a CDN cache-hit response header, in addition to several “Dex-” headers which indicate that the asset has been optimized by Dexecure.
Sure, just make sure that the script installation call in the HTML file points to the right location. Also, since the script uses a Service Worker, a script installed in a subdirectory can only control pages within that subdirectory. To get around this limitation, you should set a particular response header for the Service Worker. Read more about the Service-Worker-Allowed header here.
Our optimizations can be disabled in one of the following three ways:
- To disable Dexecure locally for debugging, the easiest way is to disable Service Workers in your browser temporarily. For example, on Google Chrome this involves checking the Bypass for network option in the Application panel of Chrome DevTools. On Firefox, open your page in Private Browsing mode, since Firefox disables Service Workers in private browsing.
- Reload the page using Cmd/Ctrl + Shift + R (this bypasses the Service Worker too).
- Change the optimisationsEnabled key to false in the Dexecure JS file that you were provided.
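For reference, the flag in the last option might look something like the following. The exact shape of the Dexecure JS file you were provided may differ, so treat this purely as a sketch:

```javascript
// Hypothetical shape of the Dexecure configuration flag (your provided file may differ).
const dexecureConfig = {
  optimisationsEnabled: false, // set to false to turn optimizations off
};

console.log(dexecureConfig.optimisationsEnabled); // false
```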
There are three different ways to go about this:
- Set the optimisationsEnabled option to false in the Dexecure Service Worker script.
- Replace your Dexecure Service Worker script file with an empty file. Deleting the file from your server would not work.
- Replace the Dexecure installation code in your HTML with the code in this gist. This will unregister any active Service Workers on your domain.
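The gist mentioned in the last option presumably boils down to unregistering any active Service Workers. A minimal sketch of that idea (the gist's actual code may differ) — with the container injected so it can be exercised outside a browser — is:

```javascript
// Sketch of unregistering all active Service Workers (the linked gist may differ).
// `container` is navigator.serviceWorker in a real page; injected here for clarity.
async function unregisterAll(container) {
  const registrations = await container.getRegistrations();
  await Promise.all(registrations.map((reg) => reg.unregister()));
  return registrations.length; // number of workers removed
}

// In a page you would call: unregisterAll(navigator.serviceWorker);
```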
This is relevant to users of the JS file with Service Workers (Step 1 of the integration).
Service Workers are installed per scope (origin and path). When the user visits a page with the JS script, a Service Worker is installed for that particular subdirectory. All pages in that subdirectory that the user subsequently visits are then optimized automatically. However, if you would like to optimize only a few pages with Dexecure, contact us at [email protected] and we can come up with a script that optimizes only certain pages of your website.
Nonetheless, for production environments, we suggest adding the optimizing code to all HTML files that are directly accessible to the user (thus allowing the Service Worker to kick in).