Over the last few years, we’ve seen a huge growth in the number and variety of both microcontroller boards and single-board computers. Just as in the early 1980s, when the arrival of cheap home computers led to an explosion of variety and choice, the growth in the number of microcontroller boards in today’s market has meant that manufacturers have experimented with both features and form factors.
However, we live in a different time, and the trends driving the growth in microcontrollers have led us down a different path than they did last time around. In the 1980s, we looked at the new home computers and saw not just glowing screens, but boxes that could be manipulated in any number of ways. Today’s computers, smartphones, and tablets instead are seen as a way to communicate. Nowadays a microcontroller — or even a “real” computer — without an internet connection is just a brick.
Today’s microcontroller board market began with development boards. Essentially, these were breakout boards for new chips that manufacturers wanted to bring to market. They allowed professional engineers to experiment before they placed orders of thousands, or perhaps millions, of chips to put inside their products.
From a hobbyist perspective, these development boards were built for professionals and were generally too expensive to be useful. For the most part, the now venerable PIC microcontroller was the backbone of the maker movement’s electronic builds, and it was bought by the chip rather than by the board.
The modern era, defined by microcontrollers becoming conveniently packaged on boards, began with the Arduino. The “little blue board that could” has changed the way that we do electronics, not just for hobbyists, but for professionals as well. Those expensive — and badly documented — developer boards for the professional market have given way to cheaper microcontroller boards that are far more easily accessible. That’s been good for everyone, including the professionals, and we have makers to thank for that.
It’s safe to say that the growing popularity of internet-connected smart devices, the so-called Internet of Things (IoT), has changed the face of the microcontroller board market.
The current generation of boards comes with radios, sometimes lots of radios. Before the IoT, microcontrollers, like computers in the past, were seen as a way to automate, or control. Now they, too, have become communication tools.
It’s just that, for the most part, they’re talking to each other, rather than to us.
Throwing in the Kitchen Sink
The microcontroller board market is in transition. Just as the way we’re using computers is changing, the way we build hardware is changing with it. Because of that, manufacturers aren’t entirely sure how people are going to use their products. The response from many has been to panic and “throw another radio on it.”
The arrival of what I call “kitchen sink” boards, which try to be all things to all people, has been one of the main trends over the last year or two. This is especially evident on Kickstarter, where people are desperately seeking to differentiate their board from all the others.
Microcontrollers are ultimately used to control things, and that means there isn’t a single use case. But that doesn’t mean it’s a good idea to have one board — with all the power and all the necessary radios — to do all the jobs a microcontroller might be asked to do. A typical kitchen sink board comes with multiple radios, and more CPU and RAM than most embedded devices will ever need to do their jobs. And this hardware is expensive. “One board to rule them all” will never be the right board to use. As with the UNIX command line, people should try to focus on building small, simple hardware tools, not giant monoliths.
One casualty of the end of the early home computer era is now repeating itself: the shrinking number and variety of form factors those computers came in. We’re seeing the same thing happen to microcontrollers and, to some degree, single-board computers.
The “classic” Arduino layout, including the irritating, irregular offset between pins 7 and 8, has become a standard, almost by default. In addition to clones and imitators, the huge community surrounding the board has brought with it shields and other hardware designed for its configuration. This means boards that might not resemble an Arduino computationally still resemble it physically.
Other board makers are now starting to see their designs become standards, too. For instance, Adafruit’s range of Feather boards has a standard layout, one that imitators and competitors are starting to duplicate.
There’s also a movement occurring at the smaller end of the market in which manufacturers have started to produce integrated modules on a single board. Often destined to be mounted on other circuit boards, the castellated module is now a default way to get today’s tiny surface-mount parts into the hands of a wider community that often doesn’t have the tools, or the skills, to make use of them directly. This became especially obvious with the arrival of the ESP8266, which led to an ESP-12-like form becoming the default. Competitors like the RTL8710 now come in very similar configurations. Some are even pin compatible.
Similarly, the Raspberry Pi’s layout has been imitated, with several newer boards duplicating it exactly. One of those, the Asus Tinker Board, is rapidly carving out a niche as an inexpensive media center. And the popularity of the Raspberry Pi Zero, along with the recent arrival of a wireless variant that has made the board far more useful, may start to drive imitators. But we aren’t witnessing full form factor standardization for single-board computers — at least not yet. Like the Arduino’s pin headers, the Raspberry Pi’s header block has become a standard by default, and for the SBC market, perhaps that’s enough.
Computing That Is Cheap Enough to Throw Away
General-use microcontroller boards with onboard Wi-Fi can now be found for less than two dollars, while a single-board computer can be picked up for only a few dollars more. Even for those of us who have grown up with Moore’s Law, that seems almost inconceivable. And yet, we’re getting to a place where computing is not just cheap, it’s essentially free.
That changes how people are using microcontrollers. The ESP8266 has been a runaway success, and in many ways is the opposite of the “kitchen sink” boards that manufacturers — unsure of their markets — are pushing as the solution to the IoT.
The ESP8266 is also successful due to the community that has rapidly grown around it. This community coalesced not because of the features the board offered — there have been other small-form-factor wireless boards — but because of the one thing the other boards didn’t offer: the price point. As a result, the ESP8266 has become the “third community” of the maker electronics world, alongside the Arduino and the Raspberry Pi. Although some of that success can be attributed to the ESP8266’s Arduino compatibility, the chip’s community-built Lua development environment is actually far more widely used, which suggests the price point really was what drove community adoption. It appears that “good enough” is sometimes all that’s needed.
The Arrival of FPGA
Field-programmable gate arrays (FPGAs) are a very different kind of beast than a microcontroller. With microcontrollers, what you have control over is the software, the code living on the chip. With an FPGA, you start with a blank slate. You design the circuit. There is no processor to run software on until you design it.
It might sound crazy, but what this gives you is flexibility, and the age of the maker FPGA has arrived without much real fanfare. There is now an open-source toolchain for Lattice’s iCE40 FPGA, and FPGA boards specifically targeting the maker market — like Alorium’s XLR8 — are starting to appear. These boards provide hardware-level flexibility, allowing you to adapt hardware rather than replace it as your project evolves — something maker projects have a tendency to do over time.
It’s also been interesting to see the appearance of FPGA-like chips inside “real” products. For instance, Apple’s new AirPods are actually built around a Cypress PSoC chip.
Packaging Machine Learning
One of the most intriguing features of the Arduino 101 board when it was released was the 128-node neural network hidden inside the Intel Curie driving the board. For months after its release, it was almost impossible to get any information about, or access to, the network, with Intel promising that documentation and library support were “coming soon.” That’s changed with the arrival of the CurieNeurons library from General Vision. A free version gives limited access; the “Pro” library offers full support at a cost of $19 per user (almost two-thirds the cost of the board itself), and that’s going to be too rich for most makers.
That pricing misstep mirrors what happened to the rest of Intel’s offerings to the maker market. Pitched toward high-end makers in need of high performance, the Galileo, Joule, and Edison boards were recently withdrawn from the market with little warning. In a market where low-end boards are routinely stretched to do things most people thought they couldn’t, the expensive, and badly documented, boards were always going to be a hard sell.
What Do People Really Want in a Board?
Most people, and most makers, want to solve a problem. While, for some, the specifications of the board really matter, those people are by far the minority. What some manufacturers fail to understand, sometimes repeatedly, is that most people don’t need more performance than their problem demands, and would rather pay less for the right tool than pay extra for overkill. In the end, most people aren’t interested in the kitchen sink, except when they need to do the washing.