UMass Amherst hardware hackers Blake Foster, Rui Wang, and Erik Learned-Miller built this articulated real-time tracking rig using a GPU, an Arduino, and an FPV pan-tilt camera. [Thanks, Rui!]
The human eye is amazingly adept at tracking moving objects. The process is so natural to humans that it happens without any conscious effort. While this remarkable ability depends in part on the human brain's immense processing power, the fast response of the extraocular muscles and the eyeball's light weight are also vital. Even a small point-and-shoot camera mounted on a servo is typically too heavy and slow to move with the agility of the human eye. How, then, can we give a computer the ability to track movement quickly and responsively?
Thanks to recent progress in camera miniaturization, small, easily manipulable cameras are now readily available. In this project, we use a first-person view (FPV) camera intended for use on model airplanes. The camera is mounted on servo motors that can aim it with two degrees of freedom. The entire assembly weighs only 32 grams, slightly more than a typical human eyeball. Coupled with a GPU-based tracking algorithm, the FPV camera allows the computer to robustly track a wide array of patterns and objects with excellent speed and stability.
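To make the two-degree-of-freedom setup concrete, here is a minimal sketch of how a tracker's output could drive the pan and tilt servos: the tracker reports the target's pixel offset from the image center, and a simple proportional correction nudges the servo angles to re-center it. All names, the resolution, and the gain here are illustrative assumptions, not the authors' actual code.

```python
# Hypothetical pan-tilt control step (proportional control).
# Assumed values: 640x480 camera image, 0-180 degree hobby servos,
# and a hand-picked gain; none of these come from the original build.

IMAGE_W, IMAGE_H = 640, 480          # assumed camera resolution (pixels)
SERVO_MIN, SERVO_MAX = 0.0, 180.0    # typical hobby-servo range (degrees)
GAIN = 0.05                          # degrees of correction per pixel of error


def clamp(angle, lo=SERVO_MIN, hi=SERVO_MAX):
    """Keep a commanded angle inside the servo's mechanical range."""
    return max(lo, min(hi, angle))


def update_servos(pan, tilt, target_x, target_y):
    """Return new (pan, tilt) angles that move the target toward center."""
    err_x = target_x - IMAGE_W / 2   # positive: target is right of center
    err_y = target_y - IMAGE_H / 2   # positive: target is below center
    pan = clamp(pan + GAIN * err_x)
    tilt = clamp(tilt - GAIN * err_y)  # tilt up when target is above center
    return pan, tilt


# One tracking step: target detected at pixel (420, 180), servos at 90/90.
pan, tilt = update_servos(90.0, 90.0, 420, 180)
print(pan, tilt)  # camera pans right and tilts up toward the target
```

In the real rig, the corrected angles would be sent over serial to the Arduino, which generates the servo PWM signals while the GPU-side tracker computes the next target position.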