Serving gzipped CSS and JavaScript from Amazon CloudFront via S3

I've been looking for ways to make my site load faster, and one avenue I'd like to explore is making better use of CloudFront.

Because CloudFront was not originally designed as a custom-origin CDN and did not support gzipping, I have so far used it to host all my images, which are referenced by their CloudFront CNAME in my site's code and optimised with far-future Expires headers.

CSS and JavaScript files, on the other hand, have been hosted on my own server, because until now my impression was that they could not be served gzipped from CloudFront, and that the gain from gzipping (around 75%) outweighs that from using a CDN (around 50%): Amazon S3 (and therefore CloudFront) did not support serving gzipped content in the standard way, i.e. by using the HTTP Accept-Encoding header sent by browsers to indicate their support for gzip compression, so they could not gzip and serve components on the fly.

So until now, my impression was that one had to choose between two alternatives:

  1. Move all assets to Amazon CloudFront and forget about gzipping.

  2. Keep components self-hosted and configure our server to detect incoming requests and perform on-the-fly gzipping where appropriate, which is what I have chosen to do so far.

There were workarounds to solve this issue, but essentially they didn't work. [link]

Now, it seems Amazon CloudFront supports custom origins, and that it is now possible to use the standard HTTP Accept-Encoding method for serving gzipped content if you are using a custom origin. [link]

I haven't so far been able to implement the new feature on my server. The blog post I linked to above (which is the only one I found detailing the change) seems to imply that you can only enable gzipping (bar workarounds, which I don't want to use) if you opt for a custom origin, which I'd rather not: I find it simpler to host the corresponding files on my CloudFront server and link to them from there. Despite carefully reading the documentation, I don't know:

  • whether the new feature means the files should be hosted on my own domain server via custom origin, and if so, what code setup will achieve this;

  • how to configure the CSS and JavaScript headers to make sure they are served gzipped from CloudFront.


UPDATE: Amazon now supports gzip compression, so this is no longer needed. Amazon Announcement

Original answer:

The answer is to gzip the CSS and JavaScript files. Yes, you read that right.

gzip -9 production.min.css

This will produce production.min.css.gz. Remove the .gz extension, upload the file to S3 (or whichever origin server you're using), and explicitly set its Content-Encoding header to gzip.
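The steps above can be sketched with the AWS CLI; the stylesheet contents and the bucket name are stand-ins, and only the local gzip/rename steps run as-is (the upload command is shown commented out):

```shell
# Pre-compress locally at the highest level, then strip the .gz suffix
# so the object keeps its original name on S3.
printf 'body { margin: 0; }\n' > production.min.css   # stand-in stylesheet
gzip -9 production.min.css                            # -> production.min.css.gz
mv production.min.css.gz production.min.css           # drop the .gz suffix

# Upload with the encoding header set explicitly (bucket name assumed):
#   aws s3 cp production.min.css s3://my-bucket/production.min.css \
#     --content-type text/css --content-encoding gzip
```

Setting `--content-encoding gzip` at upload time is what makes S3/CloudFront send the `Content-Encoding: gzip` response header, so browsers know to decompress the file.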

It's not on-the-fly gzipping, but you could very easily wrap it up into your build/deployment scripts. The advantages are:

  1. It requires no CPU for Apache to gzip the content when the file is requested.
  2. The files are gzipped at the highest compression level (assuming gzip -9).
  3. You're serving the file from a CDN.

Assuming that your CSS/JavaScript files are (a) minified and (b) large enough to justify the CPU required to decompress on the user's machine, you can get significant performance gains here.

Just remember: if you change a file that is cached in CloudFront, make sure you invalidate the cache afterwards.

My answer is a take-off on this: http://blog.kenweiner.com/2009/08/serving-gzipped-javascript-files-from.html

Building off Skyler's answer, you can upload both a gzipped and a non-gzipped version of the CSS and JS. Be careful with naming, and test in Safari, because Safari won't handle .css.gz or .js.gz files.

site.js and site.js.jgz, and site.css and site.gz.css (you'll need to set the Content-Type header to the correct MIME type and the Content-Encoding header to gzip to get these to serve right)
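Producing both variants can be sketched as follows; file contents are stand-ins, and the `aws s3 cp` calls showing the headers each compressed copy needs are illustrative and commented out:

```shell
# Keep the plain files, and write pre-compressed copies alongside them,
# using the Safari-safe suffixes from the answer above.
printf 'var x = 1;\n' > site.js
printf 'p { color: #000; }\n' > site.css
gzip -9 -c site.js  > site.js.jgz    # -c writes to stdout, keeping the original
gzip -9 -c site.css > site.gz.css

# On upload, the compressed copies need Content-Type set to the real MIME
# type and Content-Encoding set to gzip (bucket name assumed):
#   aws s3 cp site.js.jgz  s3://my-bucket/site.js.jgz \
#     --content-type application/javascript --content-encoding gzip
#   aws s3 cp site.gz.css  s3://my-bucket/site.gz.css \
#     --content-type text/css --content-encoding gzip
```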

Then put this in your page:

<script type="text/javascript">var sr_gzipEnabled = false;</script>
<script type="text/javascript" src="http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr.gzipcheck.js.jgz"></script>


<noscript>
<link type="text/css" rel="stylesheet" href="http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css">
</noscript>
<script type="text/javascript">
(function () {
    var sr_css_file = 'http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css';
    if (sr_gzipEnabled) {
        sr_css_file = 'http://d2ft4b0ve1aur1.cloudfront.net/css-050/sr-br-min.css.gz';
    }

    var head = document.getElementsByTagName("head")[0];
    if (head) {
        var scriptStyles = document.createElement("link");
        scriptStyles.rel = "stylesheet";
        scriptStyles.type = "text/css";
        scriptStyles.href = sr_css_file;
        head.appendChild(scriptStyles);
        //alert('adding css to header:' + sr_css_file);
    }
}());
</script>

gzipcheck.js.jgz contains just sr_gzipEnabled = true;. This tests whether the browser can handle the gzipped code and provides a fallback if it can't.

Then do something similar in the footer, assuming all of your JS is in one file and can go in the footer.

<div id="sr_js"></div>
<script type="text/javascript">
(function () {
    var sr_js_file = 'http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr-br-min.js';
    if (sr_gzipEnabled) {
        sr_js_file = 'http://d2ft4b0ve1aur1.cloudfront.net/js-050/sr-br-min.js.jgz';
    }
    var sr_script_tag = document.getElementById("sr_js");
    if (sr_script_tag) {
        var scriptStyles = document.createElement("script");
        scriptStyles.type = "text/javascript";
        scriptStyles.src = sr_js_file;
        sr_script_tag.appendChild(scriptStyles);
        //alert('adding js to footer:' + sr_js_file);
    }
}());
</script>

UPDATE: Amazon now supports gzip compression, so this is no longer needed. Amazon Announcement

We've made a few optimisations for uSwitch.com recently to compress some of the static assets on our site. Although we set up a whole nginx proxy to do this, I've also put together a little Heroku app that proxies between CloudFront and S3 to compress content: http://dfl8.co

Given that publicly accessible S3 objects can be accessed using a simple URL structure, http://dfl8.co just uses the same structure. That is, the following URLs are equivalent:

http://pingles-example.s3.amazonaws.com/sample.css
http://pingles-example.dfl8.co/sample.css
http://d1a4f3qx63eykc.cloudfront.net/sample.css

CloudFront supports gzipping.

CloudFront connects to your server via HTTP 1.0. By default, some web servers, including nginx, don't serve gzipped content over HTTP 1.0 connections, but you can tell them to by adding:

gzip_http_version 1.0

to your nginx config. The equivalent config could be set for whichever web server you're using.

This does have the side effect of disabling keep-alive for HTTP 1.0 connections, but as the benefits of compression are huge, it's definitely worth the trade-off.

Taken from http://www.cdnplanet.com/blog/gzip-nginx-cloudfront/
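For context, a minimal nginx gzip block incorporating this directive might look like the following sketch (the MIME types and compression level shown are illustrative choices, not requirements):

```nginx
gzip              on;
gzip_http_version 1.0;   # allow gzip on CloudFront's HTTP/1.0 requests
gzip_types        text/css application/javascript;
gzip_comp_level   6;
gzip_vary         on;    # emit Vary: Accept-Encoding for downstream caches
```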

Edit

Serving content that is gzipped on the fly through Amazon CloudFront is dangerous and probably shouldn't be done. Basically, if your web server is gzipping the content, it will not set a Content-Length header and will instead send the data chunked.

If the connection between CloudFront and your server is interrupted and prematurely severed, CloudFront still caches the partial result and serves that as the cached version until it expires.

The accepted answer of gzipping it first on disk and then serving the gzipped version is a better idea, as nginx will be able to set the Content-Length header, and so CloudFront will discard truncated versions.

Yesterday, Amazon announced a new feature: you can now enable gzip compression on your distribution.

It works with S3 without adding .gz files yourself. I tried the new feature today and it works great. (You need to invalidate your current objects, though.)

More info

You can configure CloudFront to automatically compress files of certain types and serve the compressed files.

See AWS Developer Guide
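In the distribution settings, this corresponds to enabling the Compress option on a cache behavior; in the distribution config it is a single flag (a fragment is shown here, with the other required fields of a real distribution config omitted):

```json
{
  "DefaultCacheBehavior": {
    "Compress": true
  }
}
```

With this enabled, CloudFront compresses eligible file types itself when the viewer sends Accept-Encoding, so no pre-gzipped copies are needed on S3.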