The Arduino Nano 33 BLE Sense is a Nano-sized board with Bluetooth connectivity that is jam-packed with sensors. These include a 9-axis inertial measurement unit (IMU) that adds a compass to the usual gyroscope and accelerometer, a microphone, a light and color sensor, and temperature, humidity, and air pressure sensors. This board should not be confused with the Arduino Nano 33 BLE: the two boards are identical except for the additional sensors on the Arduino Nano 33 BLE Sense.
The Arduino Nano 33 BLE Sense is built around a Bluetooth module from u-blox. This module contains an ARM Cortex-M4F-based microcontroller from Nordic Semiconductor running at 64 MHz. The “M4F” means this micro has a floating-point unit in addition to digital signal processing instructions. The microcontroller runs Mbed OS, an operating system for ARM-based embedded systems, and your Arduino sketches run on top of that. The Scheduler library supports cooperative time-sharing among multiple threads on Mbed OS.
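As a quick illustration, here is a minimal two-loop sketch, assuming the Scheduler library that ships alongside the Mbed core is available for this board; the loops and timings are just my own example:

```cpp
#include <Scheduler.h>  // cooperative scheduling on top of Mbed OS

void loop2();  // second loop, registered in setup()

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  Scheduler.startLoop(loop2);  // run loop2() alongside the main loop()
}

void loop() {
  // Blink the built-in LED; delay() gives the other loop time to run
  digitalWrite(LED_BUILTIN, HIGH);
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
}

void loop2() {
  // Print the uptime once a second without blocking the blink
  Serial.println(millis());
  delay(1000);
}
```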
The form factor of the Arduino Nano 33 BLE Sense matches that of other Nano-sized boards. Its narrow shape and holes for header pins make it breadboard-friendly, and castellations along the edge plus a smooth bottom allow it to be surface-mounted to a PCB. Like the other Arduino Nano 33 boards, it runs at 3.3V and is not 5V tolerant. Except for the voltage, it is pin-compatible with the Arduino Nano Every. As on other recent Nano-sized boards, the pins are labeled only on the bottom, so you’ll have to keep a pinout diagram handy for prototyping.
Machine Learning Capabilities
Arduino has billed this board as a platform for TinyML, or tiny machine learning. The idea is that you use a more powerful computer to create a machine learning model that takes input from the board’s many sensors. Once that model is trained, you deploy it to the board, and the board has enough processing power to run the model and make decisions from the sensor input. Arduino has created a walkthrough to demonstrate the potential of the board for TinyML, and I followed it to familiarize myself with the board.
If you’re going to program this board in the Arduino IDE, you’ll first have to install several libraries. To talk to the board, go to the Boards Manager and install support for Arduino Mbed OS Nano Boards. Once that’s installed, select Arduino Nano 33 BLE as your board type; this works because the BLE and BLE Sense boards are identical except for the sensors, most of which communicate over I2C. To access those sensors, you next have to install a library for each one. Confusingly, the getting started guide references the PDM library for accessing the microphone, but you won’t find this library in the library manager; it is silently installed along with the board definition files. I found the example code for each sensor library to be excellent, just what I would expect from an official Arduino library.
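Here is a minimal microphone sketch, modeled on the pattern the bundled PDM examples use (the buffer size and sample rate are choices of mine), that streams raw samples to the serial monitor:

```cpp
#include <PDM.h>  // installed with the Mbed OS Nano board package, not via the library manager

short sampleBuffer[256];       // raw 16-bit PCM samples from the microphone
volatile int samplesRead = 0;  // set by the receive callback, consumed in loop()

void onPDMdata() {
  int bytesAvailable = PDM.available();
  PDM.read(sampleBuffer, bytesAvailable);
  samplesRead = bytesAvailable / 2;  // two bytes per 16-bit sample
}

void setup() {
  Serial.begin(9600);
  while (!Serial);

  PDM.onReceive(onPDMdata);    // called whenever new samples arrive
  if (!PDM.begin(1, 16000)) {  // mono, 16 kHz
    Serial.println("Failed to start the PDM microphone!");
    while (1);
  }
}

void loop() {
  if (samplesRead) {
    for (int i = 0; i < samplesRead; i++) {
      Serial.println(sampleBuffer[i]);
    }
    samplesRead = 0;
  }
}
```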
Once support for the board and sensors is installed, you can get the machine learning examples by installing a library called Arduino_TensorFlowLite. I ran each example to get a sense of what the developers wanted to show off about the board. It wasn’t always obvious what an example sketch was supposed to do, because the developers didn’t put much documentation in the code, and the walkthrough’s links to more information were broken. One example briefly described in the walkthrough was a speech recognizer that could detect “yes” or “no” and turn the RGB LED on the board green for “yes,” red for “no,” or blue for undetermined. It worked when I spoke clearly and loudly toward the board. Given the time it takes the board to make a decision on the microphone input, it’s reasonable to conclude that two words is the practical limit of the board’s vocabulary.
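The LED handling itself is simple. A sketch fragment of my own (the helper names are mine, not the example’s) shows the idea; LEDR, LEDG, and LEDB are defined by the board package, and the RGB LED is active low:

```cpp
void setupLeds() {
  pinMode(LEDR, OUTPUT);
  pinMode(LEDG, OUTPUT);
  pinMode(LEDB, OUTPUT);
}

// Light the on-board RGB LED according to the recognized word.
// The LED is active low, so LOW turns a color on.
void indicateCommand(bool heardYes, bool heardNo) {
  digitalWrite(LEDG, heardYes ? LOW : HIGH);                  // green for "yes"
  digitalWrite(LEDR, heardNo ? LOW : HIGH);                   // red for "no"
  digitalWrite(LEDB, (!heardYes && !heardNo) ? LOW : HIGH);   // blue when undetermined
}
```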
I searched the web to find documentation for the remaining example sketches. I discovered that running the gesture recognition example requires patching one of the sensor libraries by hand, and the “person detection” example requires connecting an Arducam camera, which I didn’t have. Another minor annoyance was that the board would disconnect from USB each time a sketch was uploaded, then reconnect on a different COM port. This meant the serial monitor, for example, couldn’t be used until the board’s port was reselected.
The ultimate test of the board’s TinyML abilities is gathering training data from the Arduino, building an ML model from it, and then deploying that model back to the board. The walkthrough gives an overview of the process: first the training data is captured via the serial monitor, then it is processed into a model by a number of Python scripts in a Google Colab notebook, and finally you download a header file containing the trained model and integrate it into your sketch. Getting this to work took several tries because the Python scripts produced various errors, but eventually I had a working model that could distinguish between a punch gesture and a bicep flex gesture while holding the board in my hand. So although the edges are a little rough in the TinyML demonstration, it does work.
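For reference, the data-capture step boils down to printing IMU readings as comma-separated values that you copy out of the serial monitor. A simplified capture sketch of my own, using the Arduino_LSM9DS1 library (the motion threshold is an arbitrary choice, not the walkthrough’s), looks like this:

```cpp
#include <Arduino_LSM9DS1.h>  // driver for the on-board 9-axis IMU

const float accelThreshold = 2.5;  // in g's; only log rows during significant motion

void setup() {
  Serial.begin(9600);
  while (!Serial);

  if (!IMU.begin()) {
    Serial.println("Failed to initialize the IMU!");
    while (1);
  }
  Serial.println("aX,aY,aZ,gX,gY,gZ");  // CSV header for the training data
}

void loop() {
  float aX, aY, aZ, gX, gY, gZ;

  if (IMU.accelerationAvailable() && IMU.gyroscopeAvailable()) {
    IMU.readAcceleration(aX, aY, aZ);
    IMU.readGyroscope(gX, gY, gZ);

    // Each gesture shows up as a distinct burst of CSV rows in the serial monitor
    if (fabs(aX) + fabs(aY) + fabs(aZ) >= accelThreshold) {
      Serial.print(aX); Serial.print(',');
      Serial.print(aY); Serial.print(',');
      Serial.print(aZ); Serial.print(',');
      Serial.print(gX); Serial.print(',');
      Serial.print(gY); Serial.print(',');
      Serial.println(gZ);
    }
  }
}
```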