On a dusty lot at Moffett Field, CA, that’s been made up to simulate a lunar landscape, NASA showed off what it hopes will be the next step in our exploration of the solar system: telerobotics. That’s a fancy name for remote control, but these machines are a lot fancier than most remote-control vehicles.

We’re already quite familiar with very long-distance telerobotics: the Mars rovers do this. The problem with the Mars rovers is that one command from Earth can take seven minutes or longer to reach the rover, and the result can’t be known for another seven minutes. What NASA is testing here is much shorter range. In fact, today the range was about 250 miles – the distance between the test site and the International Space Station, which was directly controlling the K10 robot rolling around on our moonscape.
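To put those delays in perspective, here’s a back-of-the-envelope light-travel-time calculation. The Earth–Mars distances are approximate astronomical figures, not values from the article; the 250-mile figure is the one quoted above.

```python
# Rough one-way signal delay at the speed of light.
C_KM_S = 299_792.458  # speed of light in km/s

def one_way_delay_s(distance_km: float) -> float:
    """One-way light-travel time in seconds over the given distance."""
    return distance_km / C_KM_S

# Earth-Mars distance varies from roughly 54.6 million km (closest
# approach) to about 401 million km (opposite sides of the Sun).
print(f"Mars, closest:  {one_way_delay_s(54.6e6) / 60:.1f} min")
print(f"Mars, farthest: {one_way_delay_s(401e6) / 60:.1f} min")

# ISS to the test site, ~250 miles (~402 km): effectively real time.
print(f"ISS overhead:   {one_way_delay_s(402) * 1000:.2f} ms")
```

The spread (roughly three to twenty-two minutes each way, depending on where Mars is in its orbit) brackets the article’s seven-minute figure, while the ISS-to-ground hop is measured in milliseconds – which is exactly why short-range telerobotics allows real-time control.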

NASA is calling this “Surface Telerobotics,” and the K10s have been designed with the idea that they’ll be landed as probes, operated by astronauts in orbit above the surface of the Moon, Mars, or other sites where the communication lag will be minimal, allowing real-time control. Really, just like the M.A.L.P. from Stargate SG-1.

Today’s test was a simulation of a possible future lunar mission. The concept was that astronauts in the Orion spacecraft could park themselves at the Earth-Moon L2 point, about 40,000 miles above the far side of the moon, and then direct the robot to deploy a radio telescope on the surface of the “dark side,” where radio signals from Earth are blocked out.

What the rover sees.

Of course, just as exciting is learning about the hardware and software going into the systems, which, in line with the “better, faster, cheaper” ethic of late, is all off-the-shelf stuff. The controller software is written in Java and run by Flight Engineer Luca Parmitano of the ESA on a Lenovo laptop running, of all things, Windows XP SP3. The point of that hardware and OS is that they are very stable, very reliable builds; known quantities. The ISS communicates with the rover over the Ku microwave band (12-18 GHz) at speeds up to 3 Gb/s, sending instructions and receiving telemetry from the various cameras and sensors on board.
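At the quoted link rate, shipping telemetry is trivially fast. As a rough illustration (the 5 MB camera-frame size here is an assumption for the sake of arithmetic, not a K10 spec):

```python
# Time to push one data payload over the article's quoted Ku-band rate.
LINK_BPS = 3e9  # quoted link speed: up to 3 Gb/s

def transfer_time_ms(size_bytes: float, rate_bps: float = LINK_BPS) -> float:
    """Milliseconds to send size_bytes at rate_bps (bits per second)."""
    return size_bytes * 8 / rate_bps * 1000

# An assumed 5 MB camera frame takes just over 13 ms on the wire.
print(f"{transfer_time_ms(5e6):.1f} ms")
```

In other words, at these rates the link itself adds far less delay than even the few milliseconds of light-travel time, so camera views and sensor readings feel live to the operator.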

The rover itself has multiple systems for locating and positioning itself, including GPS, radar, and stereo cameras (again, all off-the-shelf parts), so that the controller can move it around on a distant landscape safely. The rover even runs off rechargeable laptop batteries. Here’s a gallery of some of the hardware they’re using:

But what’s the ultimate point of all this driving around on vacant lots? Terry Fong, the Director of the Intelligent Robotics Group at NASA Ames, puts it clearly: “This work really tests the notion that robots can project human presence to other planetary surfaces. Ultimately, this will allow us to discover and explore dangerous and remote places, whether they’re at the bottom of the ocean or at the far reaches of our solar system.”

Personally, while I wish there were a lot more money going into the space program overall, this kind of project is inspiring. With the funding limitations NASA has to deal with, they’re getting more creative and working with what’s available. This is the maker spirit in action!