After some more experimentation, it appears that the framerate issues come from the image being too dark. It doesn't matter whether the image is dark because of gamma, exposure, brightness or gain settings: if the image has a very low average brightness, the framerate plummets, even with every 'auto' option turned off. I don't recall this happening when I was using just one camera, but that was on a different machine a long time ago.
Because the framerate varies with brightness, and because the brightness changes all the time, the sync issues mentioned in Part 1 are not predictable. Fortunately, the good old "z += (newz - z) / filteramt" filter came to the rescue, and as long as the cameras maintain a reasonable framerate, things go well.
To be honest, this whole thing would be a bzillion times easier if TrackIR cameras (with their framerate and optics geared towards this task) could show up as VFW devices, but my questions to Naturalpoint regarding that got bounced around internally, then went unanswered. Then again, I suppose they really might not want to talk to me considering I kinda made a competing product. Doot de doo.
1 comment:
Webcam stereo vision is difficult, but I've been doing it for years now and I've got a pretty good working system.
I had very much the same problems as you. Just getting two identical cameras to run at the same time was initially a big problem which took ages to resolve. Almost all the stereo correspondence algorithms which you can find on the web are also extremely poor. I initially used the Birchfield algorithm, but then dumped that and developed something of my own.
If done properly stereo vision is extremely useful for detecting people and doing navigation or visual obstacle avoidance. The fast version of my system runs at 12 frames per second, which would be fine for avoiding collisions.