If you’ve been living on Earth this year, you’ve probably heard someone mention self-driving cars. If you were tuned in, you might have heard the $70,000 price tag for just one of the early lidar units used in Google’s driverless cars. The devices are a bit outside the typical price point for a family auto, let alone a hobbyist’s project.
Lidar is at once an acronym for Light Detection and Ranging and a portmanteau of “light radar.” Its name is fairly descriptive: The unit bounces laser light off of objects and times the light’s return to measure the distance to those objects. This is key to object detection in a self-driving car.
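The underlying arithmetic is the same as radar’s: take the round-trip time of a light pulse, halve it, and multiply by the speed of light. A minimal sketch (the example timing value is hypothetical):

```python
# Time-of-flight ranging: the pulse travels out to the object and back,
# so the one-way distance is (speed of light * round-trip time) / 2.

C = 299_792_458  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Convert a laser pulse's measured round-trip time into distance."""
    return C * round_trip_s / 2

# An echo arriving after ~66.7 nanoseconds puts the object about 10 m away.
print(tof_distance_m(66.7e-9))  # ≈ 10.0
```

The numbers also show why lidar timing hardware is demanding: each meter of range corresponds to only about 6.7 nanoseconds of round-trip time.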
But forget cars for a moment. What about golf carts?
Alex Rodrigues, Michael Skupien, and Brandon Moak built a self-driving golf cart as part of a mechatronics engineering program at the University of Waterloo in Ontario, Canada. The sensor that made it possible was a new-ish, $8,000 lidar unit from Velodyne called the VLP-16, affectionately known as the Puck. After a $35,000 grant, Rodrigues, Skupien, and Moak dropped out of Waterloo to move to California and attend Y Combinator. They founded Varden Labs, which just finished a round of fundraising, and has done pilot demos for automated shuttles, including bringing VIPs to a Sacramento Kings game.
That’s not to say lidar makes automated driving simple, but if you can take some of the challenges out, reasoned Rodrigues, you could make it a lot easier to solve. “We intentionally chose to put constraints on the problem,” he says. “We said building a fully self-driving car that can work in all conditions, in all places, at all times, is really hard. But making something that can be useful for transporting people doesn’t have to be that hard. So we specifically are targeting shuttle service on private campuses.” That means they could slow it down, place it in a controlled environment like a university or retirement community, and keep it on private property to cut down on the regulations they have to deal with.
A Changing Landscape
Varden Labs is just one of a number of groups tackling self-driving golf carts as people movers, and still more are working in other segments of lidar-based computer vision and control. And not all are targeting big, expensive vehicles; the technology is rapidly becoming more accessible to makers — who are in turn making it cheaper and more accessible for the masses.
So forget self-driving golf carts. Reduce the problem even more. What about lawn mowers?
“We are trying to make it better than the average mower,” says Alexander Grau, one of five members of the Ardumower project, an Arduino-powered DIY automated cutter.
“If you want average quality or average features, then you just can buy a normal mower, commercial mower. If you want something special, then you have to build yourself a mower.”
Basically a Roomba for your lawn, right?
If only it were that simple. Roombas and other robotic vacuums (some of which use lidar) have a couple of advantages: They (mostly) don’t operate in direct sunlight; they never have to go up- or downhill; and their environment doesn’t change much. Take it outside, put a blade on it, and you have to contend with many other confounding factors.
“If you have flat ground, just a room for example, indoor it works nice,” says Grau. “But outdoor it’s too difficult, because the ground is not flat. So sometimes you get data, signal from the ground, and the processing software … thinks it’s an obstacle, but it’s just the ground.”
Grau would know. He’s a computer scientist who is currently tackling all these issues, and more, in pursuit of the Ardumower. You can buy an automated lawn mower, points out Grau, but it relies on a cable strung around the perimeter to stay in bounds. Lidar seemed a better option.
In principle, lidar isn’t very difficult. But its basic element only gives you one data point: You know that something is a certain distance away, but not how wide or tall it is, if it’s moving, how it’s oriented, or really anything else. A person appears the same as a wall appears the same as a chair, if the chair is even tall enough to be struck by the laser. To make lidar useful, you need to take many measurements over a period of time. And there’s more than one way to do that.
Capturing Routes
A multi-laser lidar unit like those from Velodyne can quickly capture a three-dimensional model of a locale when set in a static position. The more advanced the unit, the more lasers it contains; more lasers collect a higher density of data, allowing for fairly detailed 3D images of the surroundings.
Once you start moving a lidar unit, however, the data it captures overlaps itself, and without knowing where each measurement was taken, the recording becomes impossible to parse. Sophisticated systems overcome this by adding positional awareness like GPS. Some of the mapping vehicles from big tech companies couple GPS with lidar units to track the exact point where each moment of lidar data is captured, “painting” their route with lasers. Combined with the location data, this can be used to create an enormous 3D point cloud, valuable for applications such as providing detailed street information to self-driving cars.
Most simply, affix a unit on a spinning apparatus and let it go round, giving you a 360° picture every time it spins. Grau bought a $115 Lidar-Lite single-point sensor from PulsedLight, mounted it on a 3D-printed platform, and attached a DC motor. Using the open-source hector_slam code (for Simultaneous Localization and Mapping) from the Robot Operating System (ROS), Grau was able to get visual maps of the lidar data, and have the mower interpret them.
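The raw output of such a rig is a list of angle-and-distance pairs, one per sample, which SLAM software consumes as points in a plane. A minimal sketch of that polar-to-Cartesian conversion (the readings are made up):

```python
import math

def scan_to_points(readings):
    """Turn (angle_deg, distance_m) pairs from one revolution of a
    spinning single-point lidar into x/y coordinates in the plane."""
    return [(d * math.cos(math.radians(a)), d * math.sin(math.radians(a)))
            for a, d in readings]

# Toy revolution: an object 2 m dead ahead and another 1 m to the side.
points = scan_to_points([(0, 2.0), (90, 1.0)])
print(points)  # ≈ [(2.0, 0.0), (0.0, 1.0)]
```

A full revolution of such points traces the outline of the surrounding space, which is the 2D map hector_slam builds on.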
But ultimately, the rig was giving the Ardumower too much data, at least too much to process with an Arduino, especially when Grau experimented with 3D, in an attempt to address the signal interference due to topography. A more expensive unit, with still greater processing needs, could accomplish this. (For comparison, Rodrigues’ Velodyne-powered golf cart requires the computing power of a desktop.)
More Detailed Data
Grau’s project is similar to an apparatus called Sweep, a small, rotating lidar scanning unit that also uses a Lidar-Lite. Sweep, developed by two-person company Scanse, just wrapped up a successful $273,000 Kickstarter.
Scanse co-founders Kent Williams and Tyson Messori were biding their time as robotics consultants when they decided they should start manufacturing a lidar unit. “What we found is that, you’re basically not going to be able to accomplish any really capable autonomy outside unless you’re using lidar,” says Williams. “The existing cost of the sensors made that pretty much impossible.” They spent two years designing Sweep before launching the Kickstarter, and have now begun production thanks to both the Kickstarter and $250,000 worth of venture funding from Rothenberg Ventures’ River accelerator.
The tuna can-sized device, which looks like a little head with lopsided eyes, is on preorder for $255 (some 30 times cheaper than Velodyne’s least expensive unit), also runs on ROS, can take data points up to 40 meters away, and fits nicely on a drone or robot. The Lidar-Lite unit employs a series of micropulses of laser light; the receiver recognizes these pulses, meaning it’s easier to pull the signal out of ambient noise, requiring less processing power and giving a clearer picture. The sensor sits on a brushless motor, and Williams and Messori also developed a “spherical scanning kit” that mounts the unit on a servo, oriented at 90° to the brushless motor, so that it can record data in three dimensions.
Grau, too, was able to design a version of his sensor that offered a fuller, 3D view by adding another motor. In both cases, the results are limited by the sample rate of the unit: If you’re spinning it at 10Hz (the max speed for Scanse’s device) and recording 500 samples per second (the max for PulsedLight’s Lidar-Lite), you get only 50 points per revolution, and adding a second axis divides that again by however fast the unit nods up and down: You see the full picture only once per vertical sweep. This means that, while the device is useful for real-time 2D scanning of a single plane of an environment, to get a 3D representation Sweep needs to sit still for around a minute.
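The arithmetic behind that limit is simple enough to sketch with the figures above (the 600-step vertical resolution is an assumed value for illustration, not a Sweep spec):

```python
def points_per_rev(sample_rate_hz: float, spin_rate_hz: float) -> float:
    """Range samples that land in a single 360-degree sweep."""
    return sample_rate_hz / spin_rate_hz

def scan_3d_s(vertical_steps: int, spin_rate_hz: float) -> float:
    """A two-axis scan needs one full revolution per vertical step."""
    return vertical_steps / spin_rate_hz

# 500 samples/s at a 10Hz spin gives 50 points per revolution...
print(points_per_rev(500, 10))  # 50.0
# ...and a hypothetical 600-step vertical sweep takes a full minute.
print(scan_3d_s(600, 10))       # 60.0
```

Spinning slower buys resolution at the cost of an even longer scan; spinning faster does the opposite.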
Set on a table, Sweep rotates quietly, and a set of white dots appears on a black field on a ROS visualizer on Williams’ laptop. It looks like an architectural drawing of the room, writ in pixel art, and you can see people, visualized as a line of dots, as they move. The 3D version is more like a pin art toy, showing a globe of dots with recognizable features — trees, tables, people.
Data like this, as recorded by a Velodyne unit, is naturally much more detailed. A “point cloud” appears in real time, colored lines rolling out in succession as the device progresses across space, with physical features appearing like waves in those lines.
Velodyne manages this, with both the Puck and its larger units, by adding more lasers. The Puck has 16, arranged to angle upward and downward by up to 15°, obviating the need for the unit to spin on two axes. Combine this with a spin rate of up to 20Hz and a full firing cycle of all 16 lasers every 55.296 microseconds, and you get roughly 300,000 points per second.
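Those published figures can be sanity-checked with one line of back-of-the-envelope arithmetic:

```python
# Check of the Puck's data rate using the figures above:
# 16 lasers, one full firing cycle every 55.296 microseconds.
LASERS = 16
CYCLE_S = 55.296e-6

points_per_second = LASERS / CYCLE_S
print(int(points_per_second))  # ≈ 289,000; the quoted 300,000 is rounded up
```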
A Maturing Technology
There are other ways to measure environments with lidar. In the early ’80s in Paris, France, Omar-Pierre Soubra founded a company called Mensi based on laser triangulation. The goal was to offer a detailed 3D model of the inside of a nuclear power plant before it was decommissioned. To accomplish this, Mensi designed a one-meter-long metal tube with a rotating mirror on one end and a receiver on the other. A laser sent through the mirror would bounce off the interior of a space, and trigonometry could then locate each data point. As the tech advanced, incorporating a time-of-flight measurement based on sending identifiable packets of light rather than a constant stream, it was able to achieve extremely high resolution, capturing a million points per second and recording terabytes of information. Trimble, a multinational location services company, purchased Mensi in 2003.
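Triangulation ranging of this sort can be sketched with basic trigonometry: The emitter and receiver sit a known baseline apart, each observes the lit spot at a known angle, and the law of sines fixes the distance. The angles below are made-up illustrative values, not Mensi’s actual geometry:

```python
import math

def triangulate_range_m(baseline_m, emit_deg, receive_deg):
    """Distance from the emitter to the laser spot, given the baseline
    between emitter and receiver and the two observed angles."""
    apex = math.radians(180 - emit_deg - receive_deg)  # angle at the spot
    # Law of sines: side opposite the receiver's angle, over sin(apex).
    return baseline_m * math.sin(math.radians(receive_deg)) / math.sin(apex)

# 1 m baseline (like Mensi's tube), beam at 80°, receiver sees spot at 70°.
print(triangulate_range_m(1.0, 80, 70))  # ≈ 1.88 m
```

The geometry also explains triangulation’s limits: As the spot gets farther away, the two sight lines become nearly parallel and small angle errors blow up into large range errors, which is why long-range units favor time of flight.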
Velodyne is a company with maker roots — founder David Hall is a BattleBots veteran, and developed their lidar, now used by Google and many other companies working on autonomous vehicles, for his entry in the DARPA Grand Challenge. But the units from Trimble and Velodyne aren’t designed with makers in mind: “We would not want to give the impression that our product is in any type of plug and play,” a Velodyne representative told me. Trimble, points out Soubra, bought Mensi because the technology is useful for some of Trimble’s core industrial businesses, like mining, construction, civil engineering, etc.
The uses for an inexpensive, single-laser unit are still myriad, though. Put it on a boat or wheelchair for assisted guidance. Place it in a stationary position in a room, and it can act as a security device, or as a smart light switch, or to tell how busy a room is, by recognizing movement. “All sorts of 3D detection stuff that you really would have a lot of trouble doing with a camera, is suddenly very, very simple with a lidar,” says Rodrigues. Grau intends to combine his scanner with other sensors — perhaps a radio automatic direction finder — for a more robust mower.
Soubra says lidar is also useful as a scanner in conjunction with 3D printing: “It’s one of those sectors that started to follow what I call the aftermath of 3D printing,” he says. “One of the issues with 3D printing is that, you play with it, and quickly get tired of downloading objects that others have created … sometimes you want to have a copy of something that you have. And you want to replicate it into a 3D printing model, maybe scale it down, or scale it bigger, or whatever it is. So that’s when 3D scanning becomes part of the toolchain of doing stuff yourself.”
Lidar is still progressing, despite challenges in speed and resolution. “Compared to most other sensors in this price range, like sonar and infrared range finders, this provides way more data,” says Messori. Along with these improvements will come additional uses, with lidar tailored to them. “This first device is really intended to be a developer product,” says Messori. “We definitely intend on improving it for specific applications, or tuning it for specific applications.” That could be adding accelerometers or gyroscopes to make it work better on drones, upping its durability, or adding connectivity to pair with phones, and Messori and Williams anticipate, eventually, building a robot around their device.
Lidar is just hitting the DIY market. The cost continues to drop, and better lasers and processors will make resolution go up, enabling ever more applications.
“The technology is starting to specialize into different fields,” says Soubra. “Now it is well known enough that it can be tweaked for specific applications. That’s how you know when a technology is mature.”