In today’s world, designers often create environments that rely on their users being able to see. Accessibility for the visually impaired is either an afterthought or not considered at all. For the 285 million visually impaired people around the world, tasks like finding one’s keys or walking along busy sidewalks become arduous or even impossible. Blindsight, however, seeks to change this reality. A new assistive technology designed by high school students and invited to showcase at the China-US Young Makers Competition in Beijing, Blindsight promises to dramatically improve the quality of life for the blind.

Even a century after their introduction, blind people today still rely on sighted guides, seeing-eye dogs, and canes. Modern apps like TapTapSee and Seeing AI require users to awkwardly point their phones at their surroundings, limiting autonomy and increasing discomfort. Critically, these apps do not enable the visually impaired to actively engage with their environments, effectively relegating them to spectator status in our society.

Enter Blindsight: a wearable device that greatly increases the independence of the blind. Blindsight is an armband and smartphone app combination, designed to help the visually impaired by integrating machine learning with haptic feedback. After waking Blindsight with a wake word or button press, users can ask the smart assistant to read signs or identify products in a grocery store. Blindsight can also find a misplaced object, using the integrated camera to locate it and then driving eight vibration motors to guide the user's hand toward the target. Through facial recognition, Blindsight can even remember and identify faces to help users recognize people they have met before. After a full day of use, Blindsight can be recharged on any standard Qi wireless charging pad.
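The idea of steering a hand with a ring of vibration motors can be illustrated with a short sketch. This is a hypothetical illustration, not Blindsight's actual firmware: the motor layout (eight motors evenly spaced around the armband) and the cosine-weighted intensity scheme are assumptions made for the example.

```python
import math

NUM_MOTORS = 8  # assumed: motors evenly spaced in a ring around the armband

def motor_intensities(target_angle_rad, strength=1.0):
    """Return an intensity in [0, 1] for each of the eight motors.

    The motor pointing closest to the target's direction (as seen by the
    camera, relative to the hand) vibrates hardest; its neighbors vibrate
    in proportion to how closely they point at the target, and motors
    facing away stay silent.
    """
    intensities = []
    for i in range(NUM_MOTORS):
        motor_angle = 2 * math.pi * i / NUM_MOTORS
        # Cosine similarity between motor direction and target direction,
        # clipped at zero so motors facing away from the target stay off.
        alignment = math.cos(target_angle_rad - motor_angle)
        intensities.append(max(0.0, alignment) * strength)
    return intensities

# Example: a target straight "up" (angle pi/2) drives motor 2 hardest,
# since motor 2 sits at angle 2*pi*2/8 = pi/2 on the ring.
levels = motor_intensities(math.pi / 2)
```

The appeal of a scheme like this is that the user feels a smooth gradient of vibration rather than a single on/off buzz, so small corrections in hand position are immediately perceptible.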

Bringing Blindsight from concept to working prototype has been a challenging yet exciting journey. As a team of high school students, we ran into several engineering problems, including a faulty camera purchase, incorrectly scaled 3D prints, and the struggle of sewing together an armband. On the software side, processing images on our server took tens of seconds per request, degrading the user experience.

However, after bringing on new members with additional expertise, our team was able to overcome these hurdles and produce a functional prototype. We refactored our server code to keep our machine learning models continuously loaded in VRAM, dramatically reducing the wait after each command. We also discovered that the material used in men's dress socks was perfect for a comfortable yet secure armband. Finally, after searching through detailed product catalogs and contacting vendors, we found a compatible wide-angle camera that met our size constraints. We were ready to present our first Blindsight design.
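The server fix amounts to a classic load-once pattern. The sketch below is a simplified stand-in, not Blindsight's actual server code: `load_model` and `run_inference` are placeholders, and in practice the loaded weights live in GPU memory (VRAM) rather than a Python variable.

```python
import time

def load_model():
    # Stand-in for an expensive load (e.g., reading weights from disk
    # and transferring them to the GPU). The sleep simulates the cost.
    time.sleep(0.01)
    return {"name": "detector", "ready": True}

def run_inference(model, image):
    # Placeholder for the real object-detection call.
    return {"model": model["name"], "objects": []}

# Slow pattern: reload the model on every request, paying the full
# load cost each time a command comes in.
def handle_request_cold(image):
    model = load_model()
    return run_inference(model, image)

# Fast pattern: load once at server startup and reuse the resident
# model for every request; each command only pays for inference.
MODEL = load_model()

def handle_request_warm(image):
    return run_inference(MODEL, image)
```

With real models the difference is far larger than this toy example suggests: loading weights into VRAM can take many seconds, while a single inference pass is typically a fraction of a second.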

With our functional prototype, our team was proud to advance through the semifinal round of the Google-sponsored 2018 China-US Young Makers Competition. As one of ten teams representing the United States, we will attend a showcase and final competition in Beijing later this summer. In the meantime, our team is working diligently to improve our prototype and incorporate new features, including a dynamic Braille 'display', a currency detector, and more. Be sure to visit blindsight.app to follow our progress and leave your feedback. Our team is committed to improving the lives of the visually impaired, one cutting-edge innovation at a time.


Team Blindsight consists of high school students Christy Koh, Michael Zhu, Krishna Veeragandham, Megan Leng, Sean Tseng; incoming Boston University freshman Devin Mui; incoming UC Berkeley freshmen Jaiveer Singh and Aaron Huang; incoming University of Washington freshman Jaimie Jin; and incoming UC San Diego freshman Jesse Liang. Article written with the assistance of Kyle Shi.