Main Area and Open Discussion > General Software Discussion
Pros & Cons of Remote Libraries?
barney:
Folk,
Didn't find anything here on the topic, but perhaps did not search as diligently as I might :-\.
I've been reading a book on HTML5, and it seems that there's a lot of use of various JS components. I'm interested in opinions/discussion on the use of remote libraries, e.g., jquery or modernizr, as opposed to copying them to a local Web site.
One obvious pro is that you'll almost always be using the most current version(s) :up:, and an equally obvious con is that you're at the mercy of Internet traffic and remote server uptime/availability :down:.
So what other advantages/disadvantages do you see? Which would you recommend for Web work?
justice:
My recommendation is to use Google's hosted libraries API. Chances are visitors already have the JavaScript library cached, so they won't have to download it again, speeding up your site. Also, Google's network infrastructure is probably better than yours.
Also, link to a specific version of the library, not the latest, as changes to the library could break your code. Just test new versions first, then change the link.
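To make that concrete, here's a minimal sketch of linking a pinned library version from Google's CDN (the jQuery version number here is just an example; substitute whichever release you've actually tested against):

```html
<!-- Pin an exact version in the URL: the file at this address never changes,
     so the browser can cache it aggressively, and a new library release
     can't silently break your page. -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```

Because each version lives at its own URL, "upgrading" is a deliberate act: you change the version number in the link, retest, and deploy.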
barney:
Chances are people have the javascript library cached already so won't have to download it again
--- End quote ---
And how do I verify the cached version is the one I'm using on the page? Wouldn't it be safest to force download each time the page is reloaded? Or are browsers smart enough to discern the difference(s) 'tween cached and page versions :-\?
Also, link to a specific version of the library, not the latest
--- End quote ---
I'd think that would be a given, but prolly worth mentioning, at that ;D.
Google's network infrastructure is probably better than yours.
--- End quote ---
Yeah, but if the ubiquitous they are loading my page, they have access to a locally-stored file, whereas network traffic could delay the load of a remote one. In some cases, that could break a page - or am I being paranoid about that :tellme:?
Maybe a little background? This question arose during a happy hour conversation as a result of an event occurring at a young Webmaster's site. She swears she'll never use a remote reference again, and another agrees with her. Then a couple of scripters are adamant about remote usage, and two of us just cannot make up our minds - we can see both sides, always a detriment to convincing argument ;D. Hence, this post.
Stoic Joker:
Maybe a little background? This question arose during a happy hour conversation as a result of an event occurring at a young Webmaster's site. She swears she'll never use a remote reference again, and another agrees with her. Then a couple of scripters are adamant about remote usage, and two of us just cannot make up our minds - we can see both sides, always a detriment to convincing argument ;D. Hence, this post.-barney (October 26, 2010, 07:03 AM)
--- End quote ---
Two things come to mind:
1. Nothing is foolproof, so no matter what you do there will be a potential for hiccups... just try to aim for the smaller ones.
2. Sounds like a job for more robust code, like doing some feature checking so people with iPads aren't locked out of a Flash-only website. The bells-N-whistles are nice as long as you don't make the site too dependent on them.
In a nutshell, how big is the library file?
Huge, let somebody else handle it (especially if it's free).
Tiny, sure can't hurt to toss it in.
Browser caching seems fairly safe/reliable these days.
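One common way to get that "more robust code" for a remote library is a local fallback: try the CDN first, and if the script didn't arrive, write in a copy hosted on your own site. A hedged sketch (the local path `/js/jquery-1.4.2.min.js` is hypothetical; point it at wherever you keep your copy):

```html
<!-- Try the CDN copy first. -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined;
  // fall back to a copy served from our own host.
  window.jQuery || document.write('<script src="/js/jquery-1.4.2.min.js"><\/script>');
</script>
```

This gives you the caching and parallel-download benefits of the CDN in the common case, while a CDN outage degrades to a slightly slower page instead of a broken one.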
justice:
The browser and the server exchange request, Last-Modified, and expiry timestamps, and from those the browser determines when a cached file is stale. For JavaScript files you can set these headers yourself; if you use the Google link, Google does all of this for you (the URL never changes, so the file can be cached for a year, I believe). Every new version gets a new filename, so there's minimal re-checking. If you want to read more about this kind of thing, try running something like PageSpeed on your website, and google "caching web requests".
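For illustration, the response headers a CDN sends with a versioned library URL look roughly like this (the values are representative, not an exact capture of Google's servers):

```
Cache-Control: public, max-age=31536000
Expires: Wed, 26 Oct 2011 07:03:00 GMT
Last-Modified: Mon, 15 Feb 2010 00:00:00 GMT
```

`max-age=31536000` is one year in seconds, which is why the browser can keep serving the cached copy without ever re-contacting the server; since a new library version gets a new URL, that long lifetime is safe.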
When requesting a page, the HTTP protocol limits how many files can be downloaded from the same host at the same time. So say you're already loading the maximum number of files from your webhost (images, stylesheets, HTML, etc. can easily add up to 50 requests); with the JavaScript library hosted on a different host (such as Google, or your own subdomain like assets.yourdomain.com), the browser can download and process it in parallel.
Hope that's not too technical. The problems with using external resources are usually:
* you trust the external resource. (security)
* you realise they can disappear (sustainability)
* you realise they can make changes to the file (see trust)
* you realise you are directing traffic from your website to theirs (responsibility)
I don't think that's a problem with Google, but, for example, 'hotlinking' images from another website (pointing to an image hosted on an external site) would be a bad idea: they could easily replace the image with an offensive one. I've also seen hotlinking used to attack an external site by overloading it with traffic, which is obviously irresponsible (and could be criminal).
But in the context of your first question: if you're looking at common JavaScript libraries, I still say use the Google-hosted one; they've optimised it for speed, and it won't disappear easily.