Noah Feehan meets with Alexis Lloyd (left) and Jane Friedhoff around the New York Times’ Listening Table

Noah Feehan is in an office workroom that’s strewn with screwdrivers, a digital oscilloscope, and fume extractors. On a desk sits a half-finished circuit printer, filled with cartridges of nano silver particle ink and ascorbic acid. It prints circuits on paper. Feehan and his co-workers have been floating in and out of the room to work on it.

“I like to organize by project,” he says, gesturing to boxes filled with wires and scrap materials. Of the half-finished circuit printer: “I’d love to print an eight-and-a-half by eleven RF-harvesting antenna. Might be a good opportunity to experiment with algorithmic design.”

Feehan and his co-workers don’t work at a hardware store, electronics workshop, or tech startup. They’re standing on the 28th floor of 620 Eighth Avenue in Manhattan — the office of The New York Times.

Feehan — whose official, LinkedIn-approved, resume-topping title is “Maker” — isn’t the only tinkerer who works at the 163-year-old newspaper. Seven other makers populate the Times’ R&D Lab, which launched in 2006.

Their mission: to forecast game-changing technology trends that will unfold over the next three to five years. The team then builds prototypes to envision how those trends will shape media’s future — and how they upend our notion of the communicated word. How will content be delivered? What sort of devices will bridge information and audience? How will platforms change? The idea is not so much to create products based on these questions as to discover what creative director Alexis Lloyd describes as “tangible artifacts of potential futures that have relevance to the Times.”

The lab is full of builders, coders, fixers, and the various things they’ve cooked up. “We all come from very different backgrounds, from video art to statistics,” says Lloyd. “We all have a background that sits at the intersection of art or design, technology and critical theory.”

Their latest invention, which they finished last September, is a four-foot-wide table, dotted with 14 capacitive strips, that sits in the middle of the lab, surrounded by stools. This is the “Listening Table”: part transcriptionist, part smart furniture, and part, well, table.

This puppy transcribes, in real time, what people say in meetings, using Android speech recognition. In the middle of the table is a microphone that captures every idea, pitch, suggestion, disagreement, digression, jabber, and joke. Around the edge are eight single-pixel thermal cameras that figure out who’s talking or gesticulating.
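The Times hasn’t published how it fuses those thermal readings with the audio, but a minimal sketch of the speaker-attribution idea might look like this — one reading per seat, with the current utterance attributed to the warmest seat. The threshold value is invented for illustration.

```python
def active_seat(thermal_readings, threshold=0.5):
    """Given one value per single-pixel thermal camera (one per seat),
    return the index of the warmest seat above a minimum threshold,
    or None if nobody registers. The threshold is illustrative."""
    if not thermal_readings:
        return None
    hottest = max(range(len(thermal_readings)),
                  key=lambda i: thermal_readings[i])
    return hottest if thermal_readings[hottest] >= threshold else None
```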

But the table’s a lot more than just a note-taker. On a flat-screen TV a couple feet away, the words appear on the screen nearly as they’re being spoken. Each word appears in a varying shade: lighter, grayer words are deemed less relevant (“the,” “a,” and other articles), while key topics are solid black.
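That shading scheme can be sketched in a few lines — low-relevance “glue” words rendered light gray, everything else solid black. The stopword list and hex values here are assumptions for illustration, not the lab’s actual relevance model.

```python
# Illustrative stopword set; the real system's relevance scoring is unknown.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it"}

def shade(word):
    """Return a light gray for low-relevance words, black for key terms."""
    return "#bbbbbb" if word.lower().strip(".,!?") in STOPWORDS else "#000000"

transcript = "The table records every idea in the meeting"
shaded = [(w, shade(w)) for w in transcript.split()]
```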

And if you touch one of those capacitive strips on the table, the system flags the 30 seconds before and after the tap as a key moment in the meeting, making it easy to tease out important chunks of the transcript. The Listening Table is not just recording what’s being said — it’s recording why it’s being said, and what’s important about it.
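The flagging logic described above is simple enough to sketch: each tap marks a one-minute window centered on the tap time, and overlapping windows can be merged into a single span. This is a guess at the bookkeeping, not the lab’s code.

```python
def flagged_spans(tap_times, window=30.0):
    """Each tap flags the `window` seconds before and after it as a
    key moment; overlapping windows are merged into one span.
    Returns a list of (start, end) times in seconds."""
    spans = []
    for t in sorted(tap_times):
        start, end = max(0.0, t - window), t + window
        if spans and start <= spans[-1][1]:
            spans[-1] = (spans[-1][0], end)  # merge with previous span
        else:
            spans.append((start, end))
    return spans
```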

NYT Table diagram

The table was actually designed and built by François Chambard, a seasoned maker the Times contracted for the job. (He’s also known for creating keyboard stands for Wilco.) Chambard describes the proposed timeline as “aggressive”: Two months. On one hand, the project seems like a cinch — “It’s a table,” he says — but the challenge was making sure the seven hidden layers meshed perfectly together, to eliminate any looseness or gaps.

The surface is white Corian, like you see on countertops, and the base is custom bent-laminated and veneered plywood, with the central microphone sitting atop it, underneath a perforated cage. Inside the base is an Arduino Mega with a custom board, a Mac Mini that runs the server and an Android tablet that communicates with it, and some simple cabling.

The table is surrounded by video monitors that display the R&D Lab’s other projects. The table is just the latest effort in what the lab calls “semantic listening,” a trend they’ve been tracking for the past several years.

In the footsteps of the Quantified Self movement, the team started wondering what values were slipping under the radar of all that quantification. That’s when they realized that qualitative values, like meaning and context, needed to be examined, too.

Sure, your footsteps and budget can be tallied up — but what about the thoughts and feelings running through your head? How could the lab build a means to interact with that data?

So they made the table, not only to monitor how much data passes through an environment like an office or a meeting, but also to capture why that information is important. The table addresses how people, whether consumers or publishers, can answer those questions in a physical, tangible, touchable way. (And you can still scribble notes or plop your coffee on it, too.)

Surrounding the table is an arsenal of the Lab’s other projects, all designed to explore the way media will be delivered and consumed in the short- to medium-term.

One nearby monitor shows news articles annotated in real time with New York Times tags, applied not just to whole articles but to individual words and phrases. Once identified, these key terms could be surfaced as bullet points in a mobile app, or used to build an interactive map from the locations mentioned in a story. A particular location could even be contextualized on a wearable device. (The database of important words is compiled manually, and continually, by librarians and word taxonomists.)
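The annotation step described above amounts to matching a curated vocabulary against the text, then filtering by tag type — locations become map pins, for example. The tag entries below are invented for illustration; the Times’ actual taxonomy and matcher are not public.

```python
# Hypothetical tag vocabulary of the kind librarians might curate.
TAGS = {"brooklyn": "location", "de blasio": "person", "subway": "topic"}

def annotate(text):
    """Return {phrase: tag_type} for every known phrase found in the text."""
    lowered = text.lower()
    return {phrase: kind for phrase, kind in TAGS.items() if phrase in lowered}

def map_pins(annotations):
    """Pull out just the location tags, e.g. to seed an interactive map."""
    return [phrase for phrase, kind in annotations.items()
            if kind == "location"]
```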

Back in 2011, the Lab brewed up a different table-focused technology using Microsoft’s Surface interface, a touchscreen tabletop that doubles as an interactive bulletin board. On the Lab’s prototype, users could flick, twist, and drag photos (which open news stories) across the table and organize them into stacks. Tables, by nature, promote sharing; colleagues can gather around one over coffee. If users place their cell phones on the surface, the table can automatically conjure articles their friends have shared.

Feehan views signals on an oscilloscope at the New York Times’ R&D Lab.

This stuff isn’t actually used in the Times newsroom. There aren’t any board meetings held around the Listening Table. But that’s not the point. The point is to help the Times think about how emerging technologies will affect the industry. And makers help the 163-year-old newspaper company do this.

“Having something tangible to stimulate ideas and conversation is really useful — and for us as designers and makers, in our research process, there’s a lot of information and knowledge that we get from reading about topics, from thinking about them, from having larger discussions about them,” Lloyd says. “But there’s a whole different set of knowledge that you get from having to build a thing, and having to figure out what that button does.” (The table aside, most stuff gets built in-house in the lab.)

As the lab forecasts tech trends of the not-too-distant future, the team also considers the threats posed by information-slurping furniture.

“These are technologies that could pretty clearly be used for surveillance or nefarious purposes,” says Lloyd. “One of our research goals is to develop a set of design principles: How can we allow for transparency, so I can have a sense of control over my participation with the system?” Knowing what kind of data is being collected, when it’s being collected, and the ability to opt out, are key — thus, the moving lights on the table, which indicate whether it’s listening.

“One of the most valuable things we do is to be able to have a deep understanding of an emerging technology and a tangible interface or artifact that can stimulate a conversation within the organization about what we might see happening three to five years out,” says Lloyd.

Matt Boggie is the lab’s executive director. (He’s also been tinkering with that circuit printer from before.) “We think about how you read an article, or how you watch a video, or how we report on news — and how that finds its way into other experiences.” That’s what the lab is all about, and why stuff like the Listening Table is a big deal.

But at the end of the day, the lab is really a roomful of scrappy, smart people who love to work with their hands.

“I’ve been doing a lot of strategy work for next year, which has been a lot of writing and a lot of PowerPoint,” Boggie says. “And I come in here and start to screw things together or test some circuits.”

“I’ll get to a point around three in the afternoon when I’m like, ‘I need to do something with my hands.’”


Editor’s note: This article was updated to clarify the purpose of the flagging system.