Robots With Vision

For the past few months, I've been trying to get a wireless camera that I could use with my NXT robots. I ended up getting the camera from Geeks.com that Jim posted about earlier (this was before the Mindsensors camera came out, so I didn't know about that one), plus a USB adapter from Hauppauge. Both items were recommended by RoboRealm, the free vision software I'm using with the camera. I recently got everything working and tried a few things with it, and it performs quite well. The software can do a lot. It isn't made specifically for the NXT, but it has an interface for communicating with the Brick via Bluetooth, and the RoboRealm folks have put up a nice tutorial on using the software with NXT robots. Although the tutorial uses some VBScript programming, that's not actually necessary: you can use drag-and-drop "modules" to process the video and then send variables with data to the robot, which can use an NXT-G program to react accordingly. Below are two robots I've made to test the capabilities of the camera.
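To give a feel for the "send variables to the robot" side of this, here's a minimal sketch of pulling a tracking variable out of a RoboRealm-style XML response on the PC side. The tag names and message layout are my assumptions for illustration (check the RoboRealm API documentation for the real protocol); the parsing itself is plain Python:

```python
import xml.etree.ElementTree as ET

def parse_variable(response_xml, name):
    """Pull one named variable out of a RoboRealm-style XML response.

    The exact tag layout here is an assumption for illustration; see the
    RoboRealm API docs for the actual message format it uses.
    """
    root = ET.fromstring(response_xml)
    node = root.find(name)
    return int(node.text) if node is not None else None

# Example: a response carrying the tracked object's center of gravity.
sample = "<response><COG_X>96</COG_X><COG_Y>144</COG_Y></response>"
print(parse_variable(sample, "COG_X"))  # 96
```

In a real setup, the value you extract this way would then be forwarded to the Brick over Bluetooth for the NXT-G program to act on.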

The first robot I made is the usual track-the-ball robot. I wanted to see how well the software's color detection and tracking worked, and I was very satisfied with the result. Although the surroundings need to be controlled (to keep other objects of the same color out of view), the software does a great job of tracking objects. One disadvantage that is very noticeable with this robot, however, is that the Bluetooth communication takes a little while, so the robot doesn't have very good "reaction time". Hence, I had to make it turn slowly so it wouldn't go past the ball before receiving the command to stop.
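The slow-turn workaround amounts to steering with a wide "dead zone": only send a turn command when the ball is well off-center, so the latency can't carry the robot past it. Here's a rough sketch of that logic; the frame width, dead-zone size, and command names are illustrative assumptions, not values from my actual setup:

```python
def steer_toward(cog_x, frame_width=320, dead_zone=40):
    """Map the ball's horizontal position to a motor command.

    A wide dead zone plus a slow turn speed compensates for Bluetooth
    latency: by the time the NXT receives "stop", the ball should still
    be inside the zone. All values here are illustrative assumptions.
    """
    center = frame_width // 2
    offset = cog_x - center
    if abs(offset) <= dead_zone:
        return "stop"
    return "turn_right" if offset > 0 else "turn_left"

print(steer_toward(300))  # turn_right
print(steer_toward(160))  # stop
```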

The video below first shows the robot following the ball from side to side, and then following it forwards and backwards. I had some trouble getting it to do both at the same time, and didn't want to take the time to figure it out. :P

For the second robot, I made a "paper-thin" remote control. Basically, I took a piece of paper with four differently colored squares on it. When you put your hand over one of the squares, the robot reacts by moving forward, moving backward, spinning left, or spinning right, depending on which square is covered. The camera makes this possible by watching the paper from overhead and detecting when a square is covered up. Here's a video of it in action:
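The detection idea can be sketched very simply: each square has a known brightness when uncovered, and a hand over it darkens that region well below its baseline. This is a toy version of that check, not the actual RoboRealm configuration; the square names, brightness numbers, and threshold are all illustrative assumptions:

```python
def covered_square(brightness, baseline, threshold=0.5):
    """Return which of the four squares is covered, if any.

    `brightness` and `baseline` map a square's name to its current and
    uncovered average brightness (as a vision package might report per
    region). A hand over a square darkens it well below its baseline.
    All names and numbers here are illustrative assumptions.
    """
    for name in ("forward", "backward", "left", "right"):
        if brightness[name] < baseline[name] * threshold:
            return name
    return None

baseline = {"forward": 200, "backward": 190, "left": 210, "right": 205}
now = dict(baseline, left=60)  # hand over the "left" square
print(covered_square(now, baseline))  # left
```

The returned name would then be sent to the NXT as a variable, and the NXT-G program would pick the matching movement.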



-Jonathan

Comments

Laurens Valk said…
That is interesting. Also the idea of using paper as 'remote' is very original. Good work :)
That is great stuff, Jonathan! Now I want to go back and start playing with the wireless camera again.

Jim
Thanks... it's a lot of fun using a camera, because it opens up a lot of capabilities for robots.

-Jonathan
Anonymous said…
I've been using RoboRealm for about a year now. It's GREAT!

Although my work was lost to a faulty hard drive, I may have to return to it again (and back it up this time).

BOB on NXTLog was my work.

I noticed you have the same camera I have. Did you know it has a microphone too? RoboRealm also has voice capabilities, if you're on Windows and train the voice recognition software.

Anyway, nice paper remote!
Nicthegr
Yeah, I saw that it had a microphone, but unfortunately my USB adapter only gives me video and no sound, so I can't use it on my robots (unless I get a different adapter, that is). That's too bad, especially since RoboRealm can do voice recognition; that would be really neat.

Can you provide a link to your NXTLog stuff? I couldn't find it by searching.

-Jonathan
Anonymous said…
Sure!

http://mindstorms.lego.com/nxtlog/ProjectDisplay.aspx?id=4e8e70b4-4697-44e2-9e25-ade9e1824851

and its updated model

http://mindstorms.lego.com/nxtlog/ProjectDisplay.aspx?id=4985092f-641a-468e-9a25-6c2a4e958412

Any ideas for improvement? I already have a new, faster base.

Nicthegr
Wow, that's impressive... the self-recharging is an awesome idea.

-Jonathan
Anonymous said…
Impressive!

"Although the tutorial uses some VBScript programming, it's actually not necessary. You can use drag-and-drop "modules" to process the video, and then send variables with data to the robot, which can use an NXT-G program to react correctly."

I can connect RoboRealm to my NXT and obtain sensor and battery readings, BUT the tutorial does not seem to show how to use the drag-and-drop modules (it only shows VBScript programming...). Am I missing something?

Any help is appreciated.
The tutorial does talk about the modules; maybe you just didn't recognize them. The modules are things like "RGB Filter" and "Blob Filter", which are discussed on pages 3 and 4 of the tutorial. To use a module, you double-click on it in the left side of the application. The modules you're using in the program are listed at the bottom of the application, and you can double-click on them there to configure them. You kind of have to figure a lot of them out by trial and error, but the Documentation section of the RoboRealm website documents each module.

Some of the modules I've used a lot so far are the RGB Filter (to find colors), Blob Filter (to differentiate between same-colored objects), and Center Of Gravity (to track an object I've found using stuff like the RGB and Blob filters).
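To make the filter-then-track idea concrete, here's a toy stand-in for that pipeline: keep only the pixels where one color clearly dominates (a crude analogue of the RGB Filter), then average their coordinates (the Center of Gravity step). This is my own illustrative sketch, not RoboRealm's actual implementation, and the thresholds are arbitrary assumptions:

```python
def track_red(image):
    """A toy version of the RGB Filter -> Center of Gravity pipeline.

    `image` is a list of rows of (r, g, b) tuples. Keep pixels where red
    clearly dominates (a crude stand-in for an RGB filter), then average
    their coordinates (the center-of-gravity step). Thresholds are
    arbitrary illustrative values.
    """
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, (r, g, b) in enumerate(row)
            if r > 150 and r > 2 * max(g, b)]
    if not hits:
        return None
    return (sum(x for x, _ in hits) / len(hits),
            sum(y for _, y in hits) / len(hits))

# 3x3 image with a single red pixel at (2, 1)
black, red = (0, 0, 0), (255, 20, 20)
img = [[black, black, black],
       [black, black, red],
       [black, black, black]]
print(track_red(img))  # (2.0, 1.0)
```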

-Jonathan
Anonymous said…
Hi all,

You might be interested in this video shot at Robodevelopment on adding a bluetooth camera to the NXT:
http://vishots.com/2008/02/10/braintech-demonstrates-vision-sdk-for-microsoft-robotics-studio/

[Full disclosure: I'm the guy in the video :-)]