An Analysis of CamSpace as a Webcam Mouse and Gaming Controller

VideoInPicture:
Well, it seems that CamSpace, http://www.camspace.com, is in Beta 3 and they have sent out invites to the people who signed up (I never got mine, though. Anyone got theirs?). Here are three videos of someone trying to use CamSpace to emulate a mouse: http://www.youtube.com/watch?v=0AYnNzr_uO4, http://www.youtube.com/watch?v=_Y2hT-ejlrA&feature=related, http://www.youtube.com/watch?v=qEuIKe4HUEI&feature=related, and here is a video of someone using it with a Google Earth-type application: http://www.youtube.com/watch?v=9xoSS5QiyX4. Note that these videos appear to be from people who have actually gotten copies of the program; they are not promotional videos.

Some of you may remember the WebCam Signature program I submitted for the DC Programming Contest, which contained an experimental webcam mouse mode (http://webcamsignature.wikidot.com). From coding that program and watching these videos, I feel I can provide some useful information for people who haven't gotten an invite yet.

From reading CamSpace's wiki and watching the CamSpace demo videos, I can conclusively say that CamSpace primarily tracks colors (like WebCam Signature) to determine where to move your controls. One piece of supporting evidence is the colored wheel printouts at http://www.camspace.com/featured/.

The advantage of this method is that it is fairly simple to program and does not require any complex mathematics. You tell the CamSpace program what colors you want to track by holding up your colored object in certain positions that the program specifies. This is what allows you to use “any object” as a controller for your computer. More on the “any object” part later in this post.
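For anyone curious what “tracking colors” actually boils down to, here is a rough Python sketch of the basic idea, written by me purely for illustration; it is not CamSpace's code, and the numbers (frame size, tolerance) are just placeholders. You sample a reference color from the region where the user holds up the object, then flag every pixel in later frames whose color is close to that reference; the flagged pixels form the “blob” that gets tracked:

import numpy as np

# frame: H x W x 3 uint8 RGB image from the webcam. A synthetic frame is used
# here so the example runs without a camera; the green square stands in for
# the colored object you hold up during calibration.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:140, 150:200] = (40, 200, 60)

# Calibration: average the pixels in the region where the user held the object.
reference = frame[100:140, 150:200].reshape(-1, 3).mean(axis=0)

# Tracking: mark every pixel whose color is within a tolerance of the reference.
tolerance = 60.0
diff = frame.astype(np.float32) - reference
mask = np.linalg.norm(diff, axis=2) < tolerance   # H x W boolean "blob" mask

ys, xs = np.nonzero(mask)
if xs.size:
    cx, cy = xs.mean(), ys.mean()   # blob centroid = controller position
    print(f"blob of {xs.size} px, centroid at ({cx:.1f}, {cy:.1f})")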

The program tracks the areas of the blobs of colors to determine the z-depth. However, the problem with this method is that it is difficult, if not impossible, to distinguish whether you are actually pulling your controller backwards/forwards or just tilting it at an angle so that it presents a smaller projected frontal area. Also, I’m not sure how useful the z-depth tracking will be in a real-world setting, because for it to work properly the whole of your controlling object must be visible to the camera at all times or you will get jerky tracking. This arises from the fact that a camera has a limited field of view, and as you move an object in the z-direction you may inadvertently move it out of that field of view. I will have to get the beta to test before I can judge how well the z-depth tracking works in their implementation.
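To make the z-depth idea concrete: projected area falls off with the square of distance, so the usual trick is to compare the blob's current area against the area recorded at calibration. A small sketch of that relationship (my own illustration of the general technique, not CamSpace's implementation):

import math

def estimate_depth(blob_area_px, calib_area_px, calib_depth=1.0):
    # Projected area scales with 1/distance^2, so
    # depth ~ calib_depth * sqrt(calib_area / current_area).
    # Note this cannot tell a blob that moved farther away from one that is
    # merely tilted away from the camera, which is the ambiguity described above.
    if blob_area_px <= 0:
        return None   # lost the blob, e.g. it left the field of view
    return calib_depth * math.sqrt(calib_area_px / blob_area_px)

# Example: calibrated at 2000 px; the blob now covers 500 px -> about twice as far.
print(estimate_depth(500, 2000))   # 2.0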

They track the rotation of the blobs of colors to determine the angles. They do not appear to track the exact shape of the object because you can use objects of various shapes. Instead, they appear to track the longest axis of the colored blobs to determine the angle of orientation of your colored objects. In some videos, you can see a shorter axis that is perpendicular to the longest axis of the colored blobs. This is used to find the center of the blob. This center point appears to be what is used to calculate the position of your controller in the X-Y plane.
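The nice thing about the “longest axis” approach is that it needs no knowledge of the object's shape: the blob's centroid and its principal axis fall straight out of the statistics of the pixel coordinates. Here is a sketch of that calculation, again my own guess at the general technique rather than their actual code:

import numpy as np

def blob_center_and_angle(mask):
    # Centroid and long-axis angle (in degrees) of a boolean blob mask.
    ys, xs = np.nonzero(mask)
    if xs.size < 2:
        return None
    cx, cy = xs.mean(), ys.mean()
    # Covariance of the pixel coordinates: its largest eigenvector is the
    # blob's longest axis, the other one the perpendicular short axis.
    cov = np.cov(np.vstack((xs - cx, ys - cy)))
    eigvals, eigvecs = np.linalg.eigh(cov)
    long_axis = eigvecs[:, np.argmax(eigvals)]
    angle = np.degrees(np.arctan2(long_axis[1], long_axis[0]))
    return (cx, cy), angle

# A 40x10 horizontal rectangle of "on" pixels stands in for a colored marker.
mask = np.zeros((240, 320), dtype=bool)
mask[100:110, 100:140] = True
print(blob_center_and_angle(mask))   # centroid (119.5, 104.5), angle 0 or 180 deg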

The program is most likely written in Microsoft Visual C++ from the screenshots I've seen, so don't hold your breath for a Linux or Mac version any time soon.  I’m deducing this from the program icons seen on http://www.camspace.com/wiki/index.php/Emulation_Authoring. If you are a Microsoft Visual Studio user, you will recognize the familiar program icons in the top left of the windows. They most likely used Visual C++ to gain as much of a performance advantage as possible due to the inherently high cost of image processing in real time.

In the first three videos listed at the beginning of this post, you can see someone trying to use CamSpace to emulate a mouse. The first video shows that CamSpace can be made to work smoothly and allow you to move windows around by using a pinching motion of your fingers with colored plastic on them. However, there is a major hurdle that must be overcome before you can use a webcam as a general mouse: you must design a control scheme that signals to the computer when you want to select an object, move it, click, double-click, right-click, middle-click, scroll, and, for those of you with extended mouse buttons, press the 4th and 5th buttons. The implementation must allow you to move the mouse accurately while executing the above commands. You will most likely have to use two hands to successfully emulate a standard mouse: one hand for moving the mouse and the other for signalling the other commands. It is difficult to do this with just one hand, because as you move and shift your hand to create a different gesture, you will likely shift whatever it is on your hand that the program is tracking to position the mouse. This makes clicking on a button a frustrating affair, let alone clicking on a text link.
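The positioning half of that problem is also trickier than it sounds, because camera coordinates have to be scaled up to the screen and smoothed, or every pixel of blob jitter turns into several pixels of cursor jitter (see the resolution discussion further down). Here is a minimal sketch of just that mapping, with made-up resolutions and smoothing factor, and with the click gestures left out entirely (I don't know how CamSpace handles those):

CAM_W, CAM_H = 320, 240    # tracking resolution
SCR_W, SCR_H = 1440, 900   # monitor resolution

class CursorMapper:
    # Map blob centroids to screen coordinates with exponential smoothing.
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0..1, lower = smoother but laggier
        self.sx = self.sy = None

    def update(self, cx, cy):
        # Scale camera coordinates up to the screen (x is mirrored so the
        # cursor follows your hand instead of moving the opposite way).
        tx = (1.0 - cx / CAM_W) * SCR_W
        ty = (cy / CAM_H) * SCR_H
        if self.sx is None:
            self.sx, self.sy = tx, ty
        else:
            self.sx += self.alpha * (tx - self.sx)
            self.sy += self.alpha * (ty - self.sy)
        return int(self.sx), int(self.sy)

m = CursorMapper()
for cx, cy in [(160, 120), (161, 120), (162, 121)]:   # one pixel of blob jitter
    print(m.update(cx, cy))   # even smoothed, the cursor shifts a few pixels per step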

The second video listed above shows some inherent problems of color-based tracking methods. It shows that the performance of this method can sometimes be quite slow, as indicated by the less-than-10 fps frame rate seen while trying to track three objects. I’m not sure whether this is due to the emulation used, a lack of processing power, or a problem with the CamSpace program. Either way, if you are playing a game that already taxes your computer’s resources, you will have to upgrade to a better computer if you want decent tracking and game frame rates. From coding WebCam Signature and playing around with other webcam tracking programs out on the web, I estimate that CamSpace will likely use around 20-30% of the processing time of a middle-of-the-pack dual-core CPU, such as an AMD Athlon 64 X2. RAM usage should be fairly low, probably under 50 MB when running.

During the second video, you will notice that you can see the colored hand (being tracked) of the person but not the person’s body. At the end of the video, you can see the person come in front of the camera, and his shirt (normally black) becomes white and pixelated around the edges. This highlights the problem that the ability to track “any object” is in fact the ability to track “any object with a color that is not similar to colors seen in the background.” Similar colors will fool the program, as will shiny objects used as controllers. Changing light conditions or sunlight shining on your tracking object will also have adverse effects. Sunlight is bad in particular because it contains a good deal of near-infrared light that is not visible to the human eye but is picked up by many webcams. This creates noise in the colored blobs that affects the accuracy of tracking and may introduce jerky motion. Evidence of this can be seen in the constantly changing, pixelated edges of the colored blobs on the person’s hand in the second video.
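Those twitchy blob edges are basically per-pixel noise on the color threshold, and the standard countermeasures are to clean up the mask (throw away isolated pixels and small specks) and then smooth the tracked position over time. Here is one simple way the mask clean-up could be done; I'm not claiming this is what CamSpace does, it's just the kind of filtering a color tracker typically needs:

import numpy as np

def despeckle(mask):
    # Keep a pixel only if most of its 3x3 neighbourhood is also "on".
    # This majority filter knocks out the isolated noise pixels that sensor
    # noise and near-infrared contamination produce at the blob edges, so the
    # centroid (and hence the cursor) stops twitching from frame to frame.
    m = mask.astype(np.uint8)
    padded = np.pad(m, 1)
    count = sum(padded[dy:dy + m.shape[0], dx:dx + m.shape[1]]
                for dy in range(3) for dx in range(3))
    return count >= 5

mask = np.zeros((240, 320), dtype=bool)
mask[100:140, 150:200] = True   # the real blob
mask[10, 10] = True             # a stray noise pixel
clean = despeckle(mask)
print(mask.sum(), clean.sum(), clean[10, 10])   # the stray pixel is gone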

From comments I have seen on CamSpace’s site and elsewhere, many people are calling CamSpace a Wiimote killer and a gaming revolution. I believe these people are wrong. I do not believe you will be able to obtain tracking as accurate as you can with a Wiimote, nor will it be anywhere near as responsive. The Wiimote uses a specialized infrared camera coupled with dedicated hardware that allows it to outperform the vast majority of webcam/computer combinations out there. It is also much more resistant to lighting changes and is not affected by similar colors. The Wiimote only has to track two infrared LEDs, which provides a much more predictable and reliable method of control than trying to emulate human vision. I also do not believe this will revolutionize gaming, because there are already other virtual reality systems out there that let you track objects with greater accuracy than is possible with a color-based system. Some of them are commercial systems with dedicated hardware, but some are open source, like http://www.free-track.net/english/.

A webcam system on current PC hardware is inherently unable to provide the fine, accurate tracking that a standard mouse can, because the vast majority of webcams have a maximum resolution of 640x480 (0.3 megapixels) whereas most monitors have a resolution that is much greater, such as 1440x900 (1.3 megapixels) and up. This means that the position of your mouse on the screen must be interpolated from a 0.3 megapixel image, before even considering the issue of processing power. Most likely, you will be using a resolution of 320x240 so that your system can track at a high enough frame rate to prevent lag.
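To put rough numbers on that: at a 320x240 tracking resolution mapped straight onto a 1440x900 desktop, one camera pixel corresponds to 1440/320 = 4.5 screen pixels horizontally and 900/240 = 3.75 vertically. Without sub-pixel estimation (for example, averaging the centroid over the whole blob) and some temporal smoothing, the cursor can only move in jumps of 4-5 pixels, and every pixel of tracking jitter gets magnified by the same factor.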

Even if you used a camera with a resolution equal to that of your monitor, you would still not have a usable system, because the processing requirements would be enormous. Think of trying to process a DVD movie frame by frame in real time and you will have an idea of the power needed to match the precision a standard mouse provides.

This is not to say that CamSpace is bad. In fact, it appears to be an excellent piece of software, and it is about time someone made use of the webcam and turned computer vision into something useful. However, we must be aware of the inherent limitations of computer vision systems and not get carried away thinking we are ready to throw away our mice and Wiimotes.

One thing I hope CamSpace will do is release their software under an open source license so that others can analyze the code and use it. While I was writing WebCam Signature, I did some extensive searching for example code that demonstrated computer vision in a useful manner, something more useful than turning on a security camera. Well, the field is very sparse: most of the programs people are writing are proprietary and there is very LITTLE sharing going on. If computer vision is to advance to a useful state, we must share code so that new ideas can be readily implemented and old ones improved upon. This was the primary reason I made WebCam Signature open source, and I hope CamSpace will be open source as well.

Well, that is a long enough article for now. I’m sure there are some spelling mistakes and I’ll fix those up after a bit. I’ll write more if I have something to add.

Stay tuned for my second program that I’m submitting for the DC Programming Contest. It’s a good one.

vitalyb:
Good article. Some comments:

I've got my copy too and it is pretty responsive. A lot more than I expected. It has several issues:

1) You must be in a well-lit environment.
2) The overall GUI feel is a bit buggy.

However, with that said, the Soccer game (the one seen in the promotional video) plays great. You can actually bounce the virtual ball and it feels very responsive and AWESOME. They need to develop some better (physics-based) games, and it CAN be a Wii killer after all.

VideoInPicture:
I've got my copy too and it is pretty responsive. [...]
-vitalyb (July 08, 2008, 08:32 AM)
--- End quote ---

I'm still waiting for my beta invite even though I signed up a long time ago. Have you tried using it as a mouse? Also, what about the similar color issue?

vitalyb:
The betas are being sent slowly on purpose to handle issues that the current beta users report, so you'd have to be patient.

I tried to use the mouse pointer with it. It works, but it is hardly comfortable.

Similar colors do pose a problem; however, in proper lighting it is a very minor one. CamTrax works by locking onto a NEW object that it sees in a specified location. And the lock is a lot more complex than described in the first post: it locks onto the form, onto the color, AND it keeps learning as you play with it. If you intentionally take the object out of the camera's view it will try to lock onto a different object with the same color; however, that hardly ever happens in normal use.

mouser:
Thank you very much for posting your impressions of CamSpace -- really very interesting reading.  :up: :up: :up:
