That's what computing originally was. Then we got personal computers.
I like that way of mainframing things!
Renegade: Video transcoding was only one example. I can see many other uses: image manipulation, 3D rendering tasks, complex OCR tasks for a lot of documents, and so on. Basically, any task where
(upload time + download time + cloud processing time) < local processing time
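That rule of thumb is easy to sketch in code. A minimal, hypothetical estimate, assuming output roughly the same size as the input and a cloud that is simply N times faster than the local machine (all names and numbers below are made up for illustration):

```python
# Sketch of the offload rule: cloud wins only when
# upload + cloud processing + download beats local processing time.

def should_offload(file_size_mb, upload_mbps, download_mbps,
                   cloud_speedup, local_seconds):
    """Return True if cloud processing is estimated to be faster overall."""
    upload_s = file_size_mb * 8 / upload_mbps      # MB -> Mbit, then Mbit / (Mbit/s)
    download_s = file_size_mb * 8 / download_mbps  # assume output ~ input size
    cloud_s = local_seconds / cloud_speedup        # cloud assumed N times faster
    return upload_s + cloud_s + download_s < local_seconds

# Example: 4 GB video, symmetric 100 Mbit/s line, cloud 20x faster,
# local transcode takes 2 hours (7200 s).
print(should_offload(4000, 100, 100, 20, 7200))  # → True
```

On a 100 Mbit line the transfers cost about 640 s round trip, so even a modest cloud speedup wins easily over a two-hour local job. On a 5 Mbit line the same file takes ~3.5 hours just to move, and the inequality flips.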
Bandwidth might be an issue, but it shouldn't be exaggerated, even for video transcoding. A fairly large number of people have had 100 Mbit connections at home for some time now. Far from everyone, of course, but enough, I'd think, for this kind of service to take off.
Cloud transcoding might pose some legal (copyright) problems, though I'm not sure about that. Couldn't a generic cloud processing service claim to be mere infrastructure? Recording digital TV broadcasts with a TV card is legal where I live, and so is transcoding those recordings on my PC for personal use. And the system could perhaps be built so that the local computer only uploads obfuscated calculation tasks to the cloud. The cloud processing service could then sincerely say that it can't know or control what the processing is for.
Anyway, if legal risk explains why no company tries it, a P2P version of the same idea could still be possible.
We already have P2P file sharing and P2P proxies (Tor). I've also seen attempts at P2P cloud storage (though I can't name a specific example). So why not P2P processing?
We actually already have a large P2P processing system at work in SETI@home. But there users only donate their CPU cycles to science.
We also have many examples of limited cloud processing for end users: web apps that let us upload a file for a malware check, sites that convert PDFs to other formats, and so on. But those are limited to very specific tasks that the user can't modify much.