Meet Carol Reiley

Biohacking Science


Have you ever made a robot that didn’t work right, at least at first? In hobby robotics this is no big deal, and it still isn’t the end of the world with industrial robots, but with surgical robotics, being one millimeter off can mean life or death — or at least complications. Welcome to Carol Reiley’s world. She’s the co-maker and co-author of the Air Guitar Hero and Blood Pressure Monitor projects featured in this issue of MAKE, and she recently completed a Ph.D. in surgical robotics at Johns Hopkins, where she (among other things) developed precise force-feedback arms for remotely tying surgical knots.

Now Reiley works at Intuitive Surgical, makers of the da Vinci Surgical System. These robots don’t work autonomously; instead, human doctors use them to execute precise surgical procedures or to augment their ability to see inside a patient’s body. The improved precision and reduced trauma of robotic surgery translate to less patient pain and faster recovery times.

Yet even as she spends her days helping to develop multi-million-dollar surgical robots, Reiley remains active as a maker who enjoys looking for simple, low-cost ways to solve big healthcare problems. MAKE spoke with her to learn how she found her niche and how she wants to change surgery in the decades ahead.

Pool Bot: Triton RUV (remote underwater vehicle) from Prof. Chris Kitts’ robotics class at Santa Clara University, where Carol Reiley began working with robots.

What led you to surgical robotics?

My dad is an engineer, and he taught me and my brother how to program computers when I was in eighth grade. I’ve always had an interest in mixing computing and engineering.

I never really wanted to be a doctor, but all through high school I volunteered at a hospital. I saw a lot there and realized that I wanted to have an impact. As a doctor you save one life at a time, but as an engineer I could build something that could change the way surgery is performed and potentially save millions of lives.

How did you begin working with robots?

As an undergrad at Santa Clara University, I was very fortunate to work with a professor who had a laboratory for land, sea, and space robots. It was very hands-on, and underwater robotics just seemed so cool! We built a low-cost, underwater, remotely controlled vehicle that operated in a little pool. I got a scuba license, we went out into the field — that’s how I started building robots.

It’s quite a leap to go from underwater robots to the operating room!

My professors have had such an amazing influence on my life. As a grad student I met a professor at Hopkins who was doing a lot of work with surgical robotics. I realized there were so many problems that needed to be solved, like how to give robots a sense of touch so that a doctor using a surgical robot can feel hardness or texture. I ended up doing my master’s thesis on haptic technology.

Is there a proper balance between robotic technology and human expertise?

I’ve never been interested in robot autonomy; I’m much more interested in human-machine interaction. I don’t want robots to be able to operate on you autonomously, but I do want computers to understand what’s going on, so they can assist doctors in ways that even a human assistant could not — say, with an extra arm, or by providing enhanced vision. Right now I’m doing an internship at Intuitive Surgical, the company that makes the da Vinci surgical robot (youtu.be/rP25mga2x8M), which can already scale motions, magnify views, and provide 3D vision. Robots can help a doctor become a Super Surgeon.

How do doctors control surgical robots?

That’s what I like to think about — how humans interact with the machines. There’s still a lot of work to be done in this area. Whatever a robot “feels,” we want to convey that feedback to a doctor in a realistic way. That can mean a slave robot — a physical device that gives direct feedback to a doctor in a physical environment. Or it can be a virtual environment, where conditions are simulated on a computer to replicate and manipulate a physical environment.

In either case, the goal is to provide a more realistic experience. That might mean an immersive sense of touch, with attributes such as weight, pressure, and feel. Or, in a virtual environment, to provide additional information like numerical data. I want things to feel more natural, and to filter out extraneous information.
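The motion scaling and force feedback Reiley describes can be sketched in a few lines of code. This is purely an illustrative toy — the scaling ratio, gain, and force limit below are made-up numbers, not values from the da Vinci or any real surgical system:

```python
# Toy sketch of a master-slave teleoperation loop: scale the surgeon's
# hand motion down for the tool, and feed sensed tool forces back to the
# surgeon's hand, clamped so the feedback stays stable.

MOTION_SCALE = 0.2   # 5:1 scaling: a 5 mm hand motion -> 1 mm tool motion
FORCE_GAIN = 1.5     # amplify faint tool-tip forces for the operator
FORCE_LIMIT = 4.0    # clamp feedback (newtons) to keep the master stable

def scale_motion(master_delta_mm):
    """Map a master (hand) displacement to a scaled slave (tool) displacement."""
    return tuple(d * MOTION_SCALE for d in master_delta_mm)

def feedback_force(sensed_force_n):
    """Map a force sensed at the tool tip to a clamped haptic cue."""
    amplified = sensed_force_n * FORCE_GAIN
    return max(-FORCE_LIMIT, min(FORCE_LIMIT, amplified))

# A 10 mm hand motion becomes a 2 mm tool motion.
print(scale_motion((10.0, 0.0, 0.0)))  # (2.0, 0.0, 0.0)
# A 5 N tool force would amplify to 7.5 N, but is clamped to 4 N.
print(feedback_force(5.0))             # 4.0
```

The real engineering problem she points to — deciding which of the many pressures on the tool should become haptic feedback at all — is the hard part that a sketch like this leaves out.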

It’s hard to imagine a doctor performing surgery without some tactile feedback.

In open-hand surgery, a surgeon can feel the hardness of tissue, the pressure of surrounding pieces of anatomy, any physical changes that happen over the course of a procedure, and so on. Surgical robots work in a very difficult environment. Unlike industrial robots, which function more or less in free space, surgical robots operate inside very small incisions in the body. The constrained space means there are a lot of physical pressures on the surgical tools, which makes it very hard to determine which of those pressures should be translated into haptic feedback. Plus the sensors on the robotic arms have to be sterilized in an autoclave, which is very hard on them. Haptics are viable in research, but in the real world, it’s much more complicated.

So where are the frontiers in surgical robotics?

Surgical robots don’t have intelligence right now, but that’s the kind of thing I worked on in grad school. I see adding that intelligence as the next level; that’s how we can really revolutionize the operating room. There are three frontiers. The first is robots that work in the operating room as true assistants, anticipating a surgeon’s needs over the course of a procedure the way an experienced surgical nurse does. The second is augmentation, with capabilities like augmented motion to guide the surgeon who is performing a task, or augmented vision, to give surgeons capabilities they really wouldn’t have otherwise. The third is automating tasks that are tedious, repetitive, or boring — like closing up a surgical incision.

That all sounds great, but it also sounds expensive.

It is! Today, a surgical robot costs around $2 million. But that’s also why I’ve launched a project called TinkerBelle Laboratories. TinkerBelle is a group of engineers I’ve gotten to know, and together we try to develop simple solutions to attack big, global problems. In healthcare, for example, many pregnant women in the developing world have hypertension, which can cause a lot of health problems. But in those kinds of environments, there are relatively few people who know how to take a proper blood pressure reading. So we created an automated blood pressure device that’s very easy to use, doesn’t require any training, and could be produced at very low cost.

I tend to think on two different levels at the same time. I love being able to work with the most sophisticated robotic machines in the world, but I also love working on problems that need low-cost solutions. I realize those two impulses pull in opposite directions, but they’re both really interesting to me because they require you to design and build things in very different ways.

