Benefits and pitfalls of hosting jQuery locally

We are currently pulling the jQuery and jQueryUI libraries (plus the jQueryUI CSS) from the Google CDN. I like this because I can call google.load("jquery", "1"); and it will always use the latest jQuery 1.x.x.
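
Roughly, the setup looks like this (a minimal sketch of the Google loader usage):

<script src="http://www.google.com/jsapi"></script>
<script>
  // "1" is a version alias: the loader resolves it to the newest 1.x.x release
  google.load("jquery", "1");
</script>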

Now, for security reasons, I am going to serve these libraries locally.

I'm happy to pull them in locally, but I'd like to know what other benefits and pitfalls I should be aware of.


The main benefit of having them on a CDN is that the files can be downloaded in parallel with files downloaded from your own domain. This reduces latency on every page. The flip side is the pitfall of hosting locally: increased latency. The main reason is that browsers limit the number of simultaneous connections they will make to the same domain. In IE6 the default was 2 concurrent connections per domain, shared between all open IE windows! IE8+ improved this to a default of 6, which is in line with Firefox/Chrome, but if you have a lot of images and you are not using sprites, you will still experience heavy latency.

Using a CDN, I would always set the library version explicitly rather than getting the latest one. This reduces the risk of new versions breaking your code. Not very likely with jQuery, but possible.
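
With the Google loader the question mentions, pinning is just a matter of spelling out the full version instead of the "1" alias (the exact number below is only an illustration):

google.load("jquery", "1.4.2");   // always this exact release
// rather than
google.load("jquery", "1");       // whatever the newest 1.x.x happens to be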

The other main benefit of using a CDN is reduced traffic on your site. If you pay per GB or you are on a virtual server with limited resources, you might find that overall site performance increases and hosting costs come down when you farm off some of your content to a public CDN.

Make sure you also read @Xaver's answer to this question; it describes a very good trick.

Google CDN:

  • caching: good for performance, more users are likely to have it cached already, and it downloads in parallel
  • if, heaven forbid, the CDN ever goes down, you're screwed
  • if a new version breaks your existing plugins or site, you may find out about it too late

Locally:

  • development without being connected to the net is possible
  • can still get some performance benefits by gzipping, in addition to minifying (see the sketch after this list)
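
If you host locally, the gzipping happens on your own server. A minimal sketch, assuming a Node/Express static server (on Apache or nginx you would enable the equivalent compression module instead):

var express = require('express');          // assumes the express package is installed
var compression = require('compression');  // assumes the compression (gzip/deflate) middleware
var app = express();
app.use(compression());                    // gzip text responses, including jquery.min.js
app.use(express.static('public'));         // serves e.g. /public/jquery-1.4.2.min.js locally
app.listen(8080);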

I always use the CDN (Content Delivery Network) from Google. But just in case it's offline:

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>!window.jQuery && document.write('<script src="jquery-1.4.2.min.js"><\/script>')</script>

Grab Google CDN's jQuery and fall back to a local copy if necessary

Edit: If you don't need to support IE6 and parts of your site are served over HTTPS, you can drop the http: prefix as well:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

Benefits: (Specifically for Google's CDN)

  1. Downloads in parallel with your own files; other answers address this further
  2. Google's servers are likely able to physically deliver the content faster
  3. Common libraries and frameworks might already be in the user's cache, since the browser's cache entry for a CDN URL is shared across every site that references it
  4. Your bandwidth doesn't have to go towards serving large library files

Virtually every way you look at it, using Google's CDN is a good thing.

Performance will be improved (albeit fairly marginally, unless your site is really busy), and the amount of data your servers have to transmit will go down (although jQuery isn't exactly a massive thing to download), etc.

The only reason you wouldn't want to use it is if you don't trust Google. By using it, you are effectively giving Google an additional window of information into your site's traffic profile, including knowledge of URLs that you may otherwise not want to make public (e.g. secure areas of your site).

If you are paranoid about security then this may be enough to persuade you not to use them (after all, hosting it yourself isn't exactly going to slow your site down to a crawl), but in general most people would take the pragmatic view that Google knows enough about their site already that adding this won't make much difference.

I prefer to use my local version, because I don't have control over what they will serve. For example, I don't want my users to be affected by Google Analytics or anything similar, because that is a legal problem in my country.

Others have covered the benefits. Pitfalls:

  • If you only include content from your own server, that's one server that needs to be running—and not blocked by firewalls etc—to make your site work. Pull script from a third party and now that's two servers that need to be running and unblocked to make your site work.

  • Any site you pull <script> from can completely control the user's experience on your site. If Google were feeling evil they could put something in their copy of jQuery to log your keypresses, steal personal information from the page you're on to tie into their web tracking database, make you post “I love Google!” comments to every form, and so on.

Google probably aren't actually going to do that, but it's a factor that's out of your control, and certainly something to worry about with other script-hosting services. There have been incidents before where stats scripts have been compromised with malware loaders.

Before including any script from a third party—even on one single page of your site—you must 100% trust them with all user-accessible functionality visible on that hostname (including web-facing admin functions).
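
One way to narrow this risk in modern browsers, while still pulling from a CDN, is Subresource Integrity: the browser refuses to execute the file if it doesn't match a hash you record up front. A sketch (the integrity value is a placeholder you generate yourself):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"
        integrity="sha384-REPLACE_WITH_THE_HASH_OF_THE_FILE"
        crossorigin="anonymous"></script>
<!-- one way to produce the value (prefix the output with "sha384-"):
     openssl dgst -sha384 -binary jquery.min.js | openssl base64 -A -->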

To me it really depends on how much control you want to have. If you are like me and need to develop on localhost while working and traveling, having the jQuery files locally is better than having them hosted on Google or elsewhere.

I'm probably in the minority nowadays, but I'd say that you don't want to use a CDN unless you really need to. The key factors for starting to use one are:

  • Cross-geo users. If you host your website in the US but have a visible number of European users, a CDN will improve their loading time.
  • A large number of users and/or heavy content, so that one main server is no longer enough. Think of any porn-video website (or Netflix, if you want): video streaming is a heavy load, and with a CDN there would be much less load on the main server.

But... the point is that these factors are not really applicable to 90% of the websites in the world. I bet you're not Facebook with millions of users online around the globe, and you're not Pornhub with hundreds of GB transferred every second.

If your website is targeted at users in your city/country and the capacity of one server is enough for the number of users you have, why would you ever want a CDN? Serving everything from your main server is quicker for your nearby users and simpler for you.


That was about CDNs in general; now let me get closer to the actual question about jQuery or any other library.

If you want your website to stay accessible and working without maintenance for more than, say, a year, host the library locally. Libraries nowadays are updated at a crazy tempo that you probably don't want to follow, and old versions eventually get deleted. Moreover, the whole library can die (probably not applicable to jQuery, though).

From my recent experience: I updated TinyMCE on a website I maintain from 3.x.x (dated 2012) to 5.x.x (dated spring 2019). That website had been working for seven (!) years without any maintenance in this part of the logic. There was no "minifying" concept back then, and CDNs were not as common as they are now. But even if they had been common, you never know what will happen 3, 5, or 10 years from now. Usually you want your website to stay alive even without you maintaining it, don't you? However, if you pull jQuery from a CDN today, that link may (and probably will) break within 5 years.

A solution with a CDN AND a fallback to the local version, as @Xaver suggested, can be a good compromise. But... maybe just get rid of the CDN link? ;)