The browser and the server exchange request, last-modified, and expiry timestamps, and from these determine when a cached file is stale. For your own JavaScript files you can set these caching headers yourself; if you use the Google-hosted link, Google handles all of this for you (the URL won't change, so the file can be cached for a year, I believe). Each new version of the library gets a new filename, so there is minimal revalidation. If you want to read more about this kind of thing, try running a tool like PageSpeed on your website and search for "HTTP caching".
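To make the expiry part concrete, here is a minimal sketch of the freshness check a browser effectively performs. It assumes a simplified model (just a response date plus a `max-age` lifetime from `Cache-Control`); the function name and numbers are illustrative, not any browser's actual API:

```javascript
// Simplified freshness check: a response is "fresh" until the time it was
// generated plus its max-age lifetime, after which the browser revalidates.
function isFresh(responseDateMs, maxAgeSeconds, nowMs) {
  const expiresAtMs = responseDateMs + maxAgeSeconds * 1000;
  return nowMs < expiresAtMs;
}

// A library cached for a year (as on Google's CDN) stays fresh the whole
// year; a new version gets a new URL, so no revalidation is ever needed.
const served = Date.parse("2024-01-01T00:00:00Z");
const oneYear = 365 * 24 * 60 * 60;
console.log(isFresh(served, oneYear, Date.parse("2024-06-01T00:00:00Z"))); // true
console.log(isFresh(served, oneYear, Date.parse("2025-06-01T00:00:00Z"))); // false
```

Real HTTP caching has more moving parts (ETags, `Last-Modified` revalidation, heuristics when no expiry is given), but this is the core idea: no request at all while the file is fresh.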
When requesting a page, browsers limit how many connections they will open to the same host at once. So say you are already loading the maximum number of files from your web host (images, stylesheets, HTML and so on can easily add up to 50 requests); if the JavaScript library is hosted on a different host (such as Google, or your own assets.yourdomain.com), the browser can download and process it in parallel.
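Here is a rough back-of-the-envelope sketch of why the second host helps. It assumes a classic per-host connection limit of about 6 parallel connections (a historical browser default; the exact number varies) and ignores file sizes; the function is purely illustrative:

```javascript
// Each host serves its files in batches of `perHostLimit` parallel
// connections; hosts work in parallel with each other, so the total
// number of batches is set by the busiest host.
function downloadRounds(fileCountsPerHost, perHostLimit = 6) {
  return Math.max(...fileCountsPerHost.map(n => Math.ceil(n / perHostLimit)));
}

// 50 files from one host: 9 sequential batches of downloads.
console.log(downloadRounds([50])); // 9
// 44 files from your host plus 6 (e.g. the library) from a CDN: 8 batches.
console.log(downloadRounds([44, 6])); // 8
```

In reality latency, file sizes, and connection reuse all matter too, but the principle holds: moving the library to another hostname takes it out of your host's queue.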
Hope that's not too technical. The problems with using external resources are usually:
* you have to trust the external resource (security)
* it can disappear (sustainability)
* its owner can change the file at any time (see trust)
* you are directing traffic from your website to theirs (responsibility)
I don't think that is a problem with Google, but 'hotlinking' images from another website (pointing straight at the image on the external site), for example, would be a bad idea. The owner could easily replace an image with an offensive one. I've also seen hotlinking used to attack an external site by overloading it with traffic, which is obviously irresponsible (and could be criminal).
But in the context of your first question: for common JavaScript libraries, I still think you should use the Google-hosted one. They have optimised it for speed, and it won't disappear easily.