By Albert Rondan, Computer Graphics Engineer at Lynx Laboratories
What happens when a few of the engineers at Lynx Laboratories decide to enter a statewide hackathon and make a real-time, interactive music visualizer with our device? Well, we pumped up the Daft Punk, invited some friends over, and ended up with a pretty trippy video:
This video was made with no post-processing or visual-effects software. The only editing involved is the cuts from live-action footage to the colorful, morphing visuals. The visuals are achieved using a Kinect-like camera and some custom 3D processing software. The program takes in raw depth and color data from the camera and renders it in OpenGL. It doesn’t end there; the program modifies the visual data in real time according to properties of the audio. These modifications are based on a set of “filters,” each implemented to do something specific to the visual data (like change its color according to the intensity of the song, or change the position of each point in space by some function). Each filter is applied at different points in the song, similar to the way a particle system works. In this case, the time interval corresponds directly to the current state of the song that’s playing, so each 3D “particle’s” position can be recalculated as a function of time. There are so many interesting effects we could create with different filters in this program; we haven’t had time to think of them all just yet. This is just the tip of the iceberg in terms of the kind of revolutionary 3D technology that we have produced.
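To make the filter idea concrete, here is a minimal sketch of what such a filter might look like. This is not Lynx's actual code — the function names, parameters, and the specific sine-wave displacement are my own illustrative assumptions — but it shows the core pattern described above: each point's position (and color) is recomputed as a function of time and the song's current intensity.

```python
import math

def wave_filter(points, t, intensity, amplitude=0.1, frequency=4.0):
    """Hypothetical position filter: displace each 3D point along z by a
    sine wave whose amplitude scales with the song's intensity (0.0-1.0).
    Called once per frame with the current time t."""
    out = []
    for (x, y, z) in points:
        dz = amplitude * intensity * math.sin(frequency * (x + t))
        out.append((x, y, z + dz))
    return out

def intensity_color_filter(colors, intensity):
    """Hypothetical color filter: shift each RGB color toward red as the
    song's intensity rises, dimming green slightly for contrast."""
    return [(min(1.0, r + 0.5 * intensity), g * (1.0 - 0.3 * intensity), b)
            for (r, g, b) in colors]
```

In a real renderer these would run every frame before the points are uploaded to OpenGL, with `intensity` derived from the audio stream (for example, a smoothed RMS of the current buffer), and different filters would be swapped in as the song moves between sections.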
Our company has completed the first version of our vision — a 3D structural capture camera called the Lynx A that captures the shape and motion of what it sees, quickly and easily. It’s really as easy as taking a 2D picture or video with a camera; you can point-and-shoot to capture 3D models and motion data of whatever you choose. Polygonal structures form right before your eyes as you move the camera around. This 3D data can then be used in visual effects, architectural rendering, video games, and much more. It’s a high-powered, handheld device that integrates right into the workflow of people who need 3D data, and it’s affordable! There’s nothing “hacky” about it; the camera has been built from the ground up with a robust platform and in-house algorithms to provide an all-in-one user experience.
We (Lynx Laboratories) are a technology startup out of the University of Texas at Austin. The UT Perception Lab, started by graduate student Chris Slaughter and Professor Sriram Vishwanath, became a company with Chris, Sriram, and engineers Jeff Mahler and Dustin Hopper. Development on the core software platform continued, and two more engineers joined: Nick Shelton and me. Since then we have relocated to the UT startup camp, where we’ve met other entrepreneurs and great mentors like Bob Metcalfe, Josh Baer, and Carol Thompson.
We aren’t done yet. The Lynx A is a stepping stone to the pinnacle of 3D capture, and we need the help of the people that would use this kind of device to make it better than ever. Whether it’s a particular feature or workflow enhancement that would be useful, we want all the feedback we can get. That’s the great thing about Kickstarter; backers have the opportunity to purchase a camera and be at the epicenter of a community that’s going to shape our core technology.
If you’re involved in a particularly revolutionary or awesome project and would like to write about it for 3D Thursday, or you have a related product that you’d like us to review or write about, please contact Eric Weinhoffer at [email protected]. Thanks for reading!