A good rundown of the pros and cons of both technologies:
http://www.choice.com.au/viewArticle.aspx?id=104527&catId=100533&tid=100008&p=3&title=FireWire+vs+Hi-Speed+USB
Short version:
FireWire uses a "peer-to-peer" architecture in which the peripherals are intelligent and can negotiate bus conflicts to determine which device can best control a data transfer.
Hi-Speed USB 2.0 uses a "master-slave" architecture in which the computer handles all arbitration and dictates data flow to, from, and between the attached peripherals (adding system overhead and resulting in slower data flow control). -Cloq
Some people in this forum are probably not hardware savvy, so generalizations like these need careful qualification.
FireWire is DMA-capable and peer-to-peer, so it has much better real-time, deterministic service times. The DMA capability lets a device write directly into DMA buffers without involving Windows and its drivers. This means less CPU overhead for large file transfers, and real-time activities (like watching videos) will be much smoother.
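The copy-overhead point above can be sketched in a few lines of Python. This is purely an illustration by analogy (real DMA happens in the host controller hardware, not in application code): a zero-copy view stands in for a DMA transfer, and an explicit byte copy stands in for the CPU-mediated buffering USB requires.

```python
# Illustration only: real DMA is done by the host controller in hardware.
# Analogy: a DMA-capable bus (FireWire) delivers data straight into the
# destination buffer, while a non-DMA bus (USB 2.0) makes the CPU copy
# data between intermediate buffers.

device_data = bytearray(b"frame-0001")

# "DMA-style": the consumer sees the producer's buffer directly (zero-copy).
dma_view = memoryview(device_data)

# "CPU-copy style": the driver duplicates the data into its own buffer.
cpu_copy = bytes(device_data)

# The device writes a new frame into the same buffer...
device_data[6:10] = b"0002"

print(bytes(dma_view))  # zero-copy view reflects the update immediately
print(cpu_copy)         # the copy is stale until the CPU copies again
```

Every extra copy like `cpu_copy` costs CPU cycles and adds variable latency, which is where the smoothness difference for streaming video comes from.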
However, when TI defined the peer-to-peer IEEE 1394 (FireWire) standard, it covered only the transport layer, not the session layer (the middleware component, as some call it). As a result, if you plug a 1394 Sony DVD re-recorder into a Panasonic HDTV, it won't work. If you plan to use the peer-to-peer features, you need to buy all your equipment from a single manufacturer until the required middleware can be standardized, probably around the year 2012 or so. When there eventually is a middleware standard, there will probably be a logo for it that you'll need to look for.
In contrast, USB is not DMA-based; the CPU itself moves data between USB buffers, which means much more CPU overhead. The timing of these moves is non-deterministic, which causes jitter in real-time traffic, especially when a heavily burdened CPU is doing lots of background I/O.
Because USB is not typically peer-to-peer, the smarts (or middleware) required to make it interoperable are programmed into the daughter half of the USB driver. This daughter half can be made smart enough to negotiate resources such as power, bandwidth, and response time when a device is connected. This is why it takes forever for your computer to boot when lots of USB devices are plugged in: time is required for this negotiation, which is a good thing, BTW, because you get the best trade-offs. Try booting with your USB devices UNplugged and you'll see it's much faster.
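Here's a toy model of that enumeration-time negotiation. The device names and bandwidth numbers are made up for illustration; the real USB protocol negotiates through descriptors and the host controller's schedule, not a loop like this. It just shows the trade-off the host makes: earlier-allocated devices get what they ask for, and an oversubscribed bus leaves later devices with the leftovers.

```python
# Toy model of the host-side resource negotiation described above.
# Names and numbers are hypothetical; this is not the actual USB protocol.

USB2_BANDWIDTH_MBPS = 480  # Hi-Speed USB 2.0 nominal bus bandwidth

def enumerate_devices(requests):
    """Grant each device's requested bandwidth in turn; when the bus is
    oversubscribed, later devices get whatever is left."""
    remaining = USB2_BANDWIDTH_MBPS
    grants = {}
    for name, wanted in requests:
        granted = min(wanted, remaining)
        grants[name] = granted
        remaining -= granted
    return grants

grants = enumerate_devices([
    ("webcam", 200),
    ("external-disk", 200),
    ("audio-interface", 150),  # only 80 Mbps left by this point
])
print(grants)
```

Doing this arbitration once at connect time is exactly why boot slows down with many devices plugged in, and also why the end user never has to resolve the conflicts by hand.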
Embedded-systems engineers love the negotiation feature of USB because it avoids a lot of service calls from end users trying to overload their USB bandwidth or create device conflicts. For the non-geek, USB is plug-and-play.
There is a peer-to-peer version of USB called "USB On-The-Go". Some of your peripheral devices (especially cameras and photo printers) may have this feature; check the box your photo printer came in. It's based in part on Universal Plug and Play (UPnP), which Microsoft developed. If your USB device supports UPnP, then it probably supports USB On-The-Go as well. If you can plug your camera directly into your photo printer, then the printer must have it.
I like USB better for non-computer geeks because it's smarter. But for professional use, 1394 is faster and deterministic, which matters for real-time data streaming. In a commercial environment, you should pay the price and buy a 1394 camera for streaming data transfers.