Over the past two years, makers have created innovative and surprising projects with their Google AIY kits. Here are a few you may want to try yourself!
An animatronic toy is a great place to start hacking. Paul Trebilcox-Ruiz used a Star Wars Porg as the framework for an AIY Voice-powered translation tool designed to help youngsters learn new languages. Press the Porg’s belly to put it into listening mode; it will hear your phrase, then repeat it back in the predetermined language while flapping its wings.
This elaborate setup from Mike Rigsby has the perfect payoff — it listens for your voice (via an Amazon Echo), then uses the AIY Vision Kit to check whether you’re smiling. If you are, a robot catapult aims itself in the direction the Echo heard your voice coming from and flings some candy your way.
Hand Command Recognizer
By restricting classification to a specific search region and using a training set of just 1,500 images, Dmitri Villevald built a program with his Vision Kit that can recognize direction-oriented hand gestures. “It can be used to control your mobile robot, replace your TV remote control, or for many other applications,” he writes in his notes — and it all happens without the need for the cloud. His instructions for building a companion display box, with light-up arrows that respond to your gestures, are included in the how-to.
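The search-region trick is what makes such a small training set workable: instead of classifying the whole camera frame, the program only looks at a fixed window where the hand is expected to appear. Here’s a minimal sketch of that idea in plain Python with NumPy; the region coordinates and the `classify()` stub are hypothetical stand-ins, not code from Villevald’s project.

```python
import numpy as np

# Hypothetical search region (top, left, height, width) inside the frame;
# the real project picks a window where the hand is expected to appear.
SEARCH_REGION = (60, 80, 120, 120)

def crop_search_region(frame, region=SEARCH_REGION):
    """Restrict recognition to a fixed window, shrinking the problem enough
    that a ~1,500-image training set can cover it."""
    top, left, h, w = region
    return frame[top:top + h, left:left + w]

def classify(crop):
    """Stand-in for the Vision Kit's on-device model, which maps the
    cropped window to a gesture label."""
    return "left" if crop.mean() > 127 else "none"

# Toy grayscale frame with a bright "hand" filling the search region.
frame = np.zeros((240, 320), dtype=np.uint8)
frame[60:180, 80:200] = 255

crop = crop_search_region(frame)
print(crop.shape)      # (120, 120)
print(classify(crop))  # left
```

The crop runs before the model ever sees the image, which is also why everything can stay on the device rather than in the cloud.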
Voice-Controlled Robot Arm
Press the AIY Voice button, speak an orientation and value (in degrees), and the affordable MeArm robot gripper leaps to do your bidding. The interface and code control all four servos on the robot for directional control. “More complex procedures, like ‘pick up something in the lower left, move it to the upper right and release it there’ can easily be coded and combined with an individual keyword to activate them,” project creator Dr. H writes in his how-to.
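The “orientation plus value” pattern boils down to parsing a short transcript into a servo channel and a signed angle. Here’s a rough sketch of that step in plain Python — the servo mapping and command words are assumptions for illustration, not Dr. H’s actual code, and the real project gets its transcript from the AIY Voice Kit’s recognizer.

```python
# Hypothetical mapping from spoken direction words to MeArm servo channels:
# left/right share the base servo, up/down the shoulder, open/close the gripper.
SERVOS = {"left": 0, "right": 0, "up": 1, "down": 1, "open": 3, "close": 3}

def parse_command(transcript):
    """Turn a transcript like 'left 30' into (servo_channel, signed_degrees).

    Negative degrees mean motion in the second direction of each pair
    (right, down, close).
    """
    word, value = transcript.lower().split()
    degrees = int(value)
    if word in ("right", "down", "close"):
        degrees = -degrees
    return SERVOS[word], degrees

print(parse_command("left 30"))  # (0, 30)
print(parse_command("down 15"))  # (1, -15)
```

Chaining several parsed commands under one trigger phrase is exactly how the more complex “pick up and move” procedures Dr. H describes could be composed.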
This project from Larry Lindsey and Christiana Caro uses the Vision Kit to identify boats that pass by their office, and save the corresponding video files for their art project, Yacht TV. Their documentation includes interesting notes about fine-tuning the program to increase the number of correctly identified boat sightings. It’s open source and configurable — they say that just by editing a text file, you could build a Dog TV, Plane TV, you name it.
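The “just edit a text file” configurability can be sketched as a config-driven label filter: the target label and confidence threshold live in a settings file, so switching from Yacht TV to Dog TV means changing one line. The file name, format, and keys below are assumptions for illustration, not the project’s actual config.

```python
from configparser import ConfigParser

# Hypothetical settings file; the real project's config format may differ.
config_text = """
[watcher]
target_label = boat
min_confidence = 0.6
"""

config = ConfigParser()
config.read_string(config_text)
target = config["watcher"]["target_label"]
threshold = config.getfloat("watcher", "min_confidence")

def keep(detection):
    """Record the clip only when the detector reports the configured
    label with enough confidence."""
    label, score = detection
    return label == target and score >= threshold

print(keep(("boat", 0.9)))  # True
print(keep(("dog", 0.9)))   # False
```

Changing `target_label` to `dog` (and retraining or reusing a model that knows dogs) is all the rewiring the filter itself would need.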
Similar to Yacht TV, Kev Hester set up his Vision Kit to identify hummingbirds that might be visiting the feeder outside his kitchen window. When the system detects one, it snaps a photo of it and tweets it. It runs on 100 lines of Python code, and he’s even included the STL for the printable window bracket he designed for it.
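The detect-then-tweet loop has one practical wrinkle worth noting: a feeding hummingbird appears in many consecutive frames, so some kind of cooldown keeps one visit from producing a burst of identical tweets. Here’s a sketch of that flow in plain Python; the cooldown value is an assumption, and the camera, detector, and Twitter calls are reduced to simple stand-ins rather than Hester’s actual code.

```python
def watch(events, cooldown=30.0):
    """events: (timestamp_seconds, labels) pairs from the detector.
    Returns the timestamps at which a photo would be snapped and tweeted.

    The cooldown (an assumed value) suppresses repeat tweets while the
    same bird lingers at the feeder.
    """
    tweets, last = [], None
    for t, labels in events:
        if "hummingbird" in labels and (last is None or t - last >= cooldown):
            tweets.append(t)  # real project: capture a photo and tweet it here
            last = t
    return tweets

events = [
    (0, ["hummingbird"]),
    (5, ["hummingbird"]),   # same visit — suppressed by the cooldown
    (40, ["hummingbird"]),  # later visit — tweeted
    (60, ["sparrow"]),      # wrong label — ignored
]
print(watch(events))  # [0, 40]
```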