Computer Vision on Mobile Testbed

Arduino, LEGO, and Bluetooth

After swearing off robotics, I found myself intrigued by the possibility of using a modern mobile handset as an integrated robot controller. A modern cell phone has its own battery and power management, accelerometers, gyroscopes, a compass, a camera capable of video, WiFi, and an embarrassing amount of processing power. Add a set of wheels and motors and you’ve got a complete robot.

From a software point of view, I decided to go with an Android handset. It’s the most versatile platform and offers the widest range of options for connecting to external hardware. It also helped that I’d already made an app or three for iOS and wanted to see how Android worked. If push came to shove, I could also target Unity3D to Android, adding a third platform to my Unity3D deployment checklist.

I knew that the name of the game was flexibility since this was going to be a research platform. Therefore I built the mechanics using LEGO. For electronics and motor control, I went with Arduino. I’m very familiar with these two systems, and the goal wasn’t to re-hash the electro-mechanical aspects.

Unfortunately this project got sidelined somewhat when I decided to better learn iOS development. There are only so many hours in a day!

Fiducials and Camera Calibration Grid

To date, I’ve got PID control working for the motors, with a Bluetooth interface to my laptop. I’ve also started working with OpenCV on my Mac to test various computer vision algorithms. I’m pretty happy with the fiducial tracking library ArUco. My landlord has graciously agreed to let me stick fiducials all over the house. (It helps to rent from a nerd!)
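The motor loop itself is nothing fancy. As a rough sketch (not my actual firmware; the pin number, gains, and the encoder stub below are placeholders), the PID speed loop on the Arduino looks something like this, with the setpoint arriving over the Bluetooth serial link:

    // Minimal sketch of the PID speed loop idea -- placeholders, not real firmware.
    const int MOTOR_PWM_PIN = 9;

    float Kp = 2.0, Ki = 0.5, Kd = 0.1;   // gains tuned by hand
    float setpoint = 0;                    // target speed, sent over Bluetooth
    float integral = 0, lastError = 0;

    // Placeholder: replace with a real reading from the wheel encoder.
    float readEncoderSpeed() {
      return 0;
    }

    void setup() {
      pinMode(MOTOR_PWM_PIN, OUTPUT);
      Serial.begin(9600);                  // Bluetooth module on the serial port
    }

    void loop() {
      // A new setpoint arrives as a plain number over the Bluetooth serial link.
      if (Serial.available()) {
        setpoint = Serial.parseFloat();
      }

      float error = setpoint - readEncoderSpeed();
      integral += error;
      float derivative = error - lastError;
      lastError = error;

      float output = Kp * error + Ki * integral + Kd * derivative;
      analogWrite(MOTOR_PWM_PIN, constrain((int)output, 0, 255));

      delay(20);                           // fixed control period
    }

On the vision side, the ArUco samples make basic detection almost trivial. Here’s a hedged sketch of the kind of test program I run on the Mac; the class and method names follow the standalone ArUco library’s examples and may differ between versions, and "camera.yml" stands in for a real calibration file:

    // Grab webcam frames, detect markers, draw them, and print each
    // marker's translation vector.
    #include <iostream>
    #include <opencv2/opencv.hpp>
    #include <aruco/aruco.h>

    int main() {
      cv::VideoCapture cap(0);                    // laptop webcam
      aruco::MarkerDetector detector;
      aruco::CameraParameters camParams;
      camParams.readFromXMLFile("camera.yml");    // from an OpenCV calibration run
      const float markerSize = 0.10f;             // marker side length, meters

      cv::Mat frame;
      while (cap.read(frame)) {
        std::vector<aruco::Marker> markers;
        detector.detect(frame, markers, camParams, markerSize);

        for (size_t i = 0; i < markers.size(); i++) {
          markers[i].draw(frame, cv::Scalar(0, 0, 255), 2);   // outline and ID
          std::cout << "marker " << markers[i].id
                    << " Tvec=" << markers[i].Tvec << std::endl;
        }

        cv::imshow("aruco", frame);
        if (cv::waitKey(1) == 27) break;          // Esc quits
      }
      return 0;
    }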

On the handset side, I’ve secured an older HTC One X+, downloaded the Android SDK, and built a Hello World. I also used a trial copy of Unity for Android to deploy a test application to the phone. The next step is to write a simple Bluetooth remote control app in Java on Android. Then I’ll start integrating OpenCV for Android; luckily, there’s already a package for that. Finally, I’ll use ArUco to let the robot get its 3D position from the camera. The sky’s the limit from there!
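The payoff of that last step is turning a detected marker back into the robot’s own position. ArUco reports each marker’s rotation and translation in the camera frame; inverting that transform gives the camera’s position in the marker’s frame, and if I record where each fiducial is stuck around the house, that localizes the robot. Here’s a sketch of the math with OpenCV (the helper name is just for illustration; Rvec and Tvec are assumed to come from the detector as above):

    // Given a marker's pose in the camera frame (Rvec, Tvec from ArUco),
    // compute where the camera sits in the marker's own coordinate frame.
    #include <opencv2/opencv.hpp>

    cv::Mat cameraPositionInMarkerFrame(const cv::Mat &rvec, const cv::Mat &tvec) {
      cv::Mat R;
      cv::Rodrigues(rvec, R);          // rotation vector -> 3x3 rotation matrix

      // Inverting [R|t]: the camera's position in marker coordinates is -R^T * t.
      cv::Mat camPos = -(R.t() * tvec);
      return camPos;                   // 3x1 vector: x, y, z in marker coordinates
    }

From there, one more fixed transform per fiducial maps that camera position into room coordinates.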

Last edited on Feb 06, 2013
This document (c) 2013 by Ed Paradis.
