Machine learning possible on microcontrollers

ARM’s Zach Shelby introduced the use of microcontrollers for machine learning and artificial intelligence at the ECF19 event in Helsinki last Friday. The talk showed that artificial intelligence and machine learning can be applied to small embedded devices in addition to the cloud-based model. In particular, artificial intelligence is well suited to the devices of the Internet of Things. The use of machine learning in IoT also makes sense from an energy-efficiency point of view when unnecessary power-consuming communication can be avoided (for example, local keyword detection before sending voice data to the cloud for more detailed analysis).

According to Shelby, we are now moving to a third wave of IoT that comes with comprehensive device security and voice control. In this model, machine learning techniques are one new capability that can be added to previous work done on IoT.

To use machine learning successfully on small embedded devices, the problem to be solved needs to have reasonably little incoming information and a very limited number of possible outcomes. An ARM Cortex-M4 processor equipped with a DSP unit is powerful enough for simple handwriting decoding or for detecting a few spoken words with a machine learning model. In the examples shown, the machine learning models needed less than 100 kilobytes of memory.
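
To put those numbers in perspective, a sub-100-kilobyte model is essentially a set of constant weight tables plus multiply-accumulate loops. The minimal C++ sketch below is not from the talk; all sizes, names and values are invented for illustration. It shows a single int8-quantized fully connected layer, the kind of inner loop that a Cortex-M4 and its DSP instructions handle comfortably.

    // Illustrative only: one int8-quantized fully connected layer, the basic
    // building block of the small models mentioned above. All sizes, weights
    // and meanings are invented assumptions, not values from the talk.
    #include <cstddef>
    #include <cstdint>

    constexpr size_t kInputs  = 64;  // e.g. audio features per frame (assumed)
    constexpr size_t kOutputs = 8;   // e.g. number of keywords (assumed)

    // In a real application these arrays come from a training tool and live in
    // flash; the 64 x 8 int8 weights here occupy only 512 bytes.
    const int8_t  weights[kOutputs][kInputs] = {};
    const int32_t biases[kOutputs]           = {};

    // One dense layer: out[o] = sum_i(in[i] * w[o][i]) + bias[o]. On a Cortex-M4
    // the inner loop maps well onto the DSP multiply-accumulate instructions
    // (or the optimized CMSIS-NN kernels), which is why such models run easily.
    void dense_int8(const int8_t* in, int32_t* out) {
      for (size_t o = 0; o < kOutputs; ++o) {
        int32_t acc = biases[o];
        for (size_t i = 0; i < kInputs; ++i) {
          acc += static_cast<int32_t>(in[i]) * weights[o][i];
        }
        out[o] = acc;
      }
    }

    int main() {
      int8_t  features[kInputs] = {};  // would come from a microphone front end
      int32_t scores[kOutputs];
      dense_int8(features, scores);    // the highest score picks the keyword
      return 0;
    }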


The presentation can now be viewed on YouTube:

Important tools and projects mentioned in the presentation:

TinyML

TensorFlow Lite

uTensor (ARM MicroTensor)

TensorFlow Lite Micro

Articles on the presentation:

https://www.uusiteknologia.fi/2019/05/20/ecf19-koneoppiminen-mahtuu-mikro-ohjaimeen/

http://www.etn.fi/index.php/72-ecf/9495-koneoppiminen-mullistaa-sulautetun-tekniikan

 

406 Comments

  1. Tomi Engdahl says:

    Using an Edge Impulse tinyML model and an OLED library from Adafruit Industries, Atul Yadav developed a Nano 33 BLE Sense system that detects and displays when a myna bird sings.

    Visual Indication for bird chirping (Mynah song detection)
    https://m.youtube.com/watch?v=9ibp7H7qPnk

  2. Tomi Engdahl says:

    ML in Crop Quality and Environmental Tracking: A look at how many #environmental and #agricultural problems can be addressed within the same framework of audio analysis and environmental sensing through #machinelearning. https://bit.ly/3Ab89bf

    A Continuously Sprouting Project: ML in Crop Quality and Environmental Tracking
    https://www.sparkfun.com/news/3915?utm_content=171277991&utm_medium=social&utm_source=facebook&hss_channel=fbp-153488801415
    A look at how many environmental and agricultural problems can be addressed with the same framework of audio analysis and environmental sensing through machine learning!

  3. Tomi Engdahl says:

    A FANtastic predictive maintenance project by Edge Impulse ambassador Andri Yadi! This system uses a tinyML model on an Arduino Nano 33 BLE Sense to classify fan operation and identify faults.

    https://m.youtube.com/watch?v=GB6ArqH_eLo&feature=youtu.be

  4. Tomi Engdahl says:

    AIfES is a standalone AI framework that allows on-device training without a PC and works on almost any hardware. Even the 8-bit Arduino Uno!

    AIfES is an AI/ML framework written in C for even the smallest microcontrollers
    https://blog.arduino.cc/2021/07/06/aifes-is-an-ai-ml-framework-written-in-c-for-even-the-smallest-microcontrollers/

  5. Tomi Engdahl says:

    Recognising Bird Sounds With A Microcontroller
    https://hackaday.com/2021/07/06/recognising-bird-sounds-with-a-microcontroller/

    Machine learning is an incredible tool for conservation research, especially for scenarios like long term observation, and sifting through massive amounts of data. While the average Hackaday reader might not be able to take part in data gathering in an isolated wilderness somewhere, we are all surrounded by bird life. Using an Arduino Nano 33 BLE Sense and an online machine learning tool, [Errol Joshua] demonstrates how to set up an automated bird call classifier.

    The Arduino Nano 33 BLE Sense is a fully featured little dev board that features the very capable nRF52840 microcontroller with Bluetooth Low Energy and a variety of onboard sensors, including a microphone. Training a machine learning model might seem daunting to many people, but online services like Edge Impulse make the process very beginner-friendly.

    A massive online library of bird calls from all over the world is available on Xeno-Canto.

    https://hackaday.io/project/180538-bird-sound-classifier-on-the-edge

  6. Tomi Engdahl says:

    This device uses machine learning to detect the ripening stages of various fruits and vegetables by spectral color.

    Vegetables and Fruits Ripeness Detection by Color w/ TF © CC BY
    https://create.arduino.cc/projecthub/kutluhan-aktar/vegetables-and-fruits-ripeness-detection-by-color-w-tf-041f92

    Collate spectral color data of varying fruits and vegetables and interpret this data set with a neural network to predict ripening stages.

  7. Tomi Engdahl says:

    RECOGNISING BIRD SOUNDS WITH A MICROCONTROLLER
    https://hackaday.com/2021/07/06/recognising-bird-sounds-with-a-microcontroller/

    Machine learning is an incredible tool for conservation research, especially for scenarios like long term observation, and sifting through massive amounts of data. While the average Hackaday reader might not be able to take part in data gathering in an isolated wilderness somewhere, we are all surrounded by bird life. Using an Arduino Nano 33 BLE Sense and an online machine learning tool, a team made up of [Errol Joshua], [Ajith KJ], [Mahesh Nayak], and [Supriya Nickam] demonstrate how to set up an automated bird call classifier.

    https://hackaday.io/project/180538-bird-sound-classifier-on-the-edge

  8. Tomi Engdahl says:

    ‘Droop, There It Is!’ is a smart irrigation system that uses ML to visually diagnose drought stress
    https://blog.arduino.cc/2021/07/13/droop-there-it-is-is-a-smart-irrigation-system-that-uses-ml-to-visually-diagnose-drought-stress/

  9. Tomi Engdahl says:

    Aksha is a Nano 33 BLE Sense-equipped pencil that uses tinyML to recognize different Hindi letters drawn in the air: https://medium.com/@naveenmanwani/aksha-an-arduino-based-ml-pencil-powered-by-tensorflow-lite-micro-e7ba854f42f3

  10. Tomi Engdahl says:

    Using a tinyML model on the Nano 33 BLE Sense, this device automatically detects when someone is snoring and begins to vibrate as an alert.

    The Snoring Guardian listens while you sleep and vibrates when you start to snore
    https://blog.arduino.cc/2021/07/20/the-snoring-guardian-listens-while-you-sleep-and-vibrates-when-you-start-to-snore/

    Snoring is an annoying problem that affects nearly half of all adults and can cause others to lose sleep. Additionally, the ailment can be a symptom of a more serious underlying condition, so being able to know exactly when it occurs could be lifesaving. To help solve this issue, Naveen built the Snoring Guardian — a device that can automatically detect when someone is snoring and begin to vibrate as an alert.

  11. Tomi Engdahl says:

    Can tinyML enable you to “hear” the difference between pouring hot and cold water? Marcelo Rovai and Marco Zennaro used the Nano 33 BLE Sense and Edge Impulse to find out.

    “Listening Temperature” with TinyML © GPL3+
    https://create.arduino.cc/projecthub/mjrobot/listening-temperature-with-tinyml-7e1325

    Can we “hear” a difference between pouring hot and cold water? Amazing proof-of-concept by a quick real deployment using Edge Impulse Studio

  12. Tomi Engdahl says:

    A̶I̶ A-Thigh. This wearable system counts your squats using a TensorFlow Lite model on the Nano 33 BLE Sense.

    Squats Counter Using TensorFlow Lite and Tiny Motion Trainer © MIT
    https://create.arduino.cc/projecthub/Manasmw333/squats-counter-using-tensorflow-lite-and-tiny-motion-trainer-4d9dcf

    A squats counter that detects squats from the accelerometer readings and prints the count to the serial monitor.

  13. Tomi Engdahl says:

    ICYMI! During last week’s Arm #AITechTalk, Massimo Banzi showed off our machine learning-capable boards, some new tools and frameworks to support your project, and a few example tinyML applications to help get you started.

    https://m.youtube.com/watch?v=mj2J445pdNc&feature=youtu.be

  14. Tomi Engdahl says:

    Is it possible for a neural network to “hear” the difference between pouring hot and cold water? Marcelo Rovai and Marco Zennaro decided to use a sound classification model on the Nano 33 BLE Sense to find out.

    “Listening Temperature” with TinyML © GPL3+
    https://create.arduino.cc/projecthub/mjrobot/listening-temperature-with-tinyml-7e1325

    Can we “hear” a difference between pouring hot and cold water? Amazing proof-of-concept by a quick real deployment using Edge Impulse Studio

  15. Tomi Engdahl says:

    This tutorial shows how to train an artificial neural network using the AIfES ML framework on a PC and run it on your Arduino board afterwards.

    How to Use AIfES on a PC for Training © GPL3+
    https://create.arduino.cc/projecthub/aifes_team/how-to-use-aifes-on-a-pc-for-training-9ad5f8

    This tutorial shows how to train an artificial neural network with AIfES on a PC and run it on your Arduino board afterwards.

  16. Tomi Engdahl says:

    BABL is a tinyML-powered baby monitor that uses a Nano 33 BLE Sense with Edge Impulse’s new EON Tuner to distinguish cries from other household sounds.

    BABL x EON: A Baby Monitor Tuned by Edge Impulse! © MIT
    https://create.arduino.cc/projecthub/ishotjr/babl-x-eon-a-baby-monitor-tuned-by-edge-impulse-b8aa30

    Edge Impulse automatically improves model accuracy with one weird trick!

    In the original BABL project, we achieved 86.3% accuracy without a ton of effort, but what if we could improve that number – and do so without a ton of manual tweaking and training? That’s the concept behind Edge Impulse’s new EON Tuner, which automatically generates the optimal model based on your target and dataset type. Here’s a look at how EON Tuner can dramatically improve accuracy in just a few clicks!

  17. Tomi Engdahl says:

    New to embedded machine learning? This tutorial walks you through the process of training a custom model with Edge Impulse and running it on the Portenta + Vision Shield.

    https://docs.arduino.cc/tutorials/portenta-vision-shield/vs-openmv-ml

  18. Tomi Engdahl says:

    During the tinyML Foundation EMEA Technical Forum, University of Bologna PhD student Federica Zonzini explored how anomaly detection models on devices like the Nano 33 BLE Sense could be used for vibration-based structural health monitoring.

    https://m.youtube.com/watch?v=RmUQzFTnGGY

  19. Tomi Engdahl says:

    Our friends at Edge Impulse recently announced official support for the Arduino Tiny Machine Learning Kit! This makes it easy to acquire images and other sensor data from your Nano 33 BLE Sense and OV7675 camera module, then build, train and deploy your ML model.

    Announcing Support for the Arduino Tiny Machine Learning Kit
    https://www.edgeimpulse.com/blog/announcing-support-for-the-arduino-tiny-machine-learning-kit

  20. Tomi Engdahl says:

    Keep tabs on the pH level of a hydroponic plant’s water supply with a Nano 33 BLE Sense and determine whether it’s right using an Edge Impulse tinyML model.

    Monitor the pH levels of a hydroponic plant’s water supply with Arduino and tinyML
    https://blog.arduino.cc/2021/09/02/monitor-the-ph-levels-of-a-hydroponic-plants-water-supply-with-arduino-and-tinyml/

  21. Tomi Engdahl says:

    AIfES – Inference Tutorial © GPL3+
    This tutorial shows the different ways to perform inference in AIfES®.
    https://create.arduino.cc/projecthub/aifes_team/aifes-inference-tutorial-f44d96

  22. Tomi Engdahl says:

    Build And Code This AI Self-Driving Car For Less Than $110
    https://www.iflscience.com/editors-blog/build-and-code-this-ai-selfdriving-car-for-less-than-110/

    The car has a camera with built-in Bluetooth and Wi-Fi so you can connect it to other devices wirelessly. Wheelson also has an LCD display that allows you to see what the robot is viewing. It’s powered by four small electromotors and there’s a rechargeable Li-Po battery included.

  23. Tomi Engdahl says:

    Tauno Erik used tinyML on the Nano 33 BLE Sense to create a keyword spotting system that lights up the corresponding LED.

    https://m.youtube.com/watch?v=GdRD4VHg54Q

  24. Tomi Engdahl says:

    TinyML is a timely topic in the Telenor Research unit’s research program. There is an endless number of use cases, from cars to agriculture and from public transport to tourism. But what does TinyML mean, and why is it considered revolutionary? Read more in the article!

    How do you make small sensors smart? – Telenor’s research unit is working on the next revolutionary IoT innovation
    TinyML is a timely topic in the Telenor Research unit’s research program for many reasons. It is a perfect match for NB-IoT, a network optimized for IoT. In addition, it has an endless number of use cases, from cars to agriculture and from public transport to tourism. So what does TinyML mean, and why is it considered revolutionary?
    https://www.dna.fi/yrityksille/blogi/-/blogs/kuinka-tehda-pienista-antureista-alykkaita-telenorin-tutkimusyksikko-tyostaa-seuraavaa-vallankumouksellista-iot-innovaatiota?utm_source=facebook&utm_medium=linkad&utm_content=ILTE-artikkeli-kuinka-tehda-pienista-antureista-alykkaita-telenorin-tutkimusyksikko-tyostaa-seuraavaa-vallankumouksellista-iot-innovaatiota&utm_campaign=H_MES_21-36-40_artikkelikampanja&fbclid=IwAR2GJX07CvOyxHsx9tez3vU-F24GL6dXVbkgUNdpM4sRHxeGzIDG3VlpWsE

  25. Tomi Engdahl says:

    A tinyML application with huge potential for EVs! Quickly predict a lithium-ion battery’s lifecycle using the Nano 33 BLE Sense and Edge Impulse.

    Battery Life Cycle Predictor Powered by Edge Impulse © GPL3+
    https://create.arduino.cc/projecthub/manivannan/battery-life-cycle-predictor-powered-by-edge-impulse-820ab8

    A TinyML model to predict a lithium-ion battery’s life cycle in a shorter time using Edge Impulse.

  26. Tomi Engdahl says:

    Overengineering A Smart Doorbell
    https://hackaday.com/2021/09/12/overengineering-a-smart-doorbell/

    Fresh from the mediaeval splendour of the Belgian city of Gent, we bring you more from the Newline hacker conference organised by Hackerspace Gent. [Victor Sonck] works at the top of his house, and thus needed a doorbell notifier. His solution was unexpected and, as he admits, over-engineered: using machine learning on an audio stream from a microphone to detect the doorbell’s sound.

    https://www.youtube.com/watch?v=GMLE4b0AKFE

  27. Tomi Engdahl says:

    Jeremy Ellis wanted to see if the Arduino Portenta could recognize itself using tinyML.

    And the answer is…
    twitter.com/rocksetta/status/1438910453351022592

  28. Tomi Engdahl says:

    Use this tinyML tool to identify poison ivy on your next hike!

    Poison ivy have you on edge? This is poison ivy at the Edge!
    https://www.hackster.io/justinelutz/poison-ivy-have-you-on-edge-this-is-poison-ivy-at-the-edge-6d9975

    Poison ivy causes painful rashes & is found in places with poor reception (forests). Use this edge ML tool to ID poison ivy on your hike!

  29. Tomi Engdahl says:

    TinyML Aerial Forest Fire Detection © MIT
    Detecting forest fires with a long-range autonomous airplane equipped with satellite communication.
    https://create.arduino.cc/projecthub/team-sol/tinyml-aerial-forest-fire-detection-78ec6b

  30. Tomi Engdahl says:

    Syntiant Partners with Edge Impulse for NDP101-Powered Arduino-Compatible TinyML Development Board
    https://www.hackster.io/news/syntiant-partners-with-edge-impulse-for-ndp101-powered-arduino-compatible-tinyml-development-board-8116a480c347

    Compact dev board can run speech applications at under 140 microwatts and sensor applications at under 100 microwatts, the company claims.

  31. Tomi Engdahl says:

    This paper presents the successful first attempt at deploying federated transfer learning to a resource-constrained microcontroller, which in this case was a Nano 33 BLE Sense: arxiv.org/pdf/2110.01107.pdf

  32. Tomi Engdahl says:

    In his latest Eloquent Arduino post, Simone Salerno has put together a basic tutorial to help you create a gesture recognition system using Arduino.

    https://eloquentarduino.github.io/2021/10/arduino-gesture-recognition-the-easy-way-with-machine-learning/

  33. Tomi Engdahl says:

    Performing gesture recognition on Arduino boards can be very straightforward and consists of two steps: capture data to train a machine learning model on your PC, then export that model to plain C++ back to your board to predict the gestures. It only requires a very small amount of work on your side with the Eloquent Arduino software.
    https://eloquentarduino.github.io/2021/10/arduino-gesture-recognition-the-easy-way-with-machine-learning/
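
    As a rough sketch of the second step, the snippet below collects accelerometer windows on an Arduino Nano 33 BLE Sense (an assumption; the post covers other boards as well) and hands them to an exported classifier. The header gesture_model.h and the predictGesture() function are hypothetical stand-ins for whatever the export tool actually generates; see the linked post for the real Eloquent Arduino class names.

        // Two-step workflow sketched above: the model is trained on a PC and
        // exported as plain C++; the board only records a window of IMU data
        // and classifies it. gesture_model.h / predictGesture() are placeholders.
        #include <Arduino.h>
        #include <Arduino_LSM9DS1.h>  // onboard IMU of the Nano 33 BLE Sense
        // #include "gesture_model.h" // hypothetical header exported from the PC

        const int kSamples = 30;       // samples per gesture window (assumed)
        float features[kSamples * 3];  // ax, ay, az interleaved

        // Placeholder standing in for the exported classifier.
        int predictGesture(const float* f) { (void)f; return 0; }

        void setup() {
          Serial.begin(115200);
          while (!Serial) {}
          if (!IMU.begin()) {
            Serial.println("Failed to initialise IMU");
            while (true) {}
          }
        }

        void loop() {
          // Record one window of acceleration data.
          for (int i = 0; i < kSamples; ) {
            float x, y, z;
            if (IMU.accelerationAvailable() && IMU.readAcceleration(x, y, z)) {
              features[i * 3 + 0] = x;
              features[i * 3 + 1] = y;
              features[i * 3 + 2] = z;
              ++i;
            }
          }
          // Classify the window with the exported model and print the result.
          Serial.print("Predicted gesture: ");
          Serial.println(predictGesture(features));
        }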

  34. Tomi Engdahl says:

    TensorFlow Lite for MCUs is AI on the Edge
    By Michael Parks for Mouser Electronics
    https://www.mouser.com/empowering-innovation/ai?utm_source=endeavor&utm_medium=display&utm_campaign=ed-personifai-eit-ai-#article2-ai

    TensorFlow, a Google-led effort, is a set of open-source software libraries that enable developers to easily integrate complex numerical computation algorithms and machine learning (ML) into their projects (Figure 1). According to Google, these libraries provide stable application programming interfaces for Python (Python 3.7+ across all platforms) and C. Also, they provide APIs without backward compatibility guarantees for C++, Go, Java and JavaScript. Additionally, an alpha release is available for Apple’s Swift language.

    TensorFlow offers so-called end-to-end machine learning support for the development and use of deep neural networks (DNNs). DNNs are an implementation of ML that is particularly adept at pattern recognition and at object detection and classification. TensorFlow libraries support both phases of the machine-learning process: training and inferencing. The first is the training of deep neural networks, which requires significant computing horsepower typically found in server-grade hardware and graphics processing units (GPUs). More recently, application-specific integrated circuits known as Tensor Processing Units (TPUs) have been developed to support the training effort. The second phase, inferencing, uses the trained DNNs in the real world to respond to new inputs and make recommendations based on the analysis of those inputs against the trained models. This is the phase that should be of keen interest to embedded product developers.

    The release of TensorFlow Lite for Microcontrollers (a subset of the TensorFlow libraries) is specifically geared for performing inferencing on memory-constrained devices typically found in most embedded systems applications. It does not allow you to train new networks. That still requires the higher-end hardware.
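
    For embedded developers, that inferencing phase boils down to a short, fixed sequence of calls. The skeleton below is a rough sketch of that sequence using the TensorFlow Lite for Microcontrollers C++ API; header paths and constructor signatures vary between library versions, and g_model, the operator list and the arena size are placeholder assumptions rather than values from this article.

        // Rough TensorFlow Lite for Microcontrollers inference skeleton (sketch only).
        #include <cstdint>
        #include "tensorflow/lite/micro/micro_interpreter.h"
        #include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
        #include "tensorflow/lite/schema/schema_generated.h"

        extern const unsigned char g_model[];  // trained model exported as a C array

        constexpr int kArenaSize = 20 * 1024;  // working memory for tensors (assumed)
        static uint8_t tensor_arena[kArenaSize];

        int run_inference(const float* input_data, int input_len) {
          // Map the flatbuffer into a model object.
          const tflite::Model* model = tflite::GetModel(g_model);

          // Register only the operators the model needs to keep code size small.
          static tflite::MicroMutableOpResolver<3> resolver;
          resolver.AddFullyConnected();
          resolver.AddRelu();
          resolver.AddSoftmax();

          // Build the interpreter on top of a static arena (no heap allocation).
          static tflite::MicroInterpreter interpreter(model, resolver,
                                                      tensor_arena, kArenaSize);
          interpreter.AllocateTensors();

          // Copy features in, run the model, and pick the most likely class.
          TfLiteTensor* input = interpreter.input(0);
          for (int i = 0; i < input_len; ++i) {
            input->data.f[i] = input_data[i];
          }
          interpreter.Invoke();

          TfLiteTensor* output = interpreter.output(0);
          int best = 0;
          for (int i = 1; i < output->dims->data[1]; ++i) {
            if (output->data.f[i] > output->data.f[best]) best = i;
          }
          return best;  // index of the most likely class
        }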

  35. Tomi Engdahl says:

    A winner of the TensorFlow Lite for Microcontrollers Challenge, Naveen Kumar’s device can be embedded in a pillow to listen throughout the night and vibrate to alert the sleeper when snoring is heard.

    https://blog.arduino.cc/2021/10/20/the-snoring-guardian-listens-while-you-sleep-and-vibrates-when-you-start-to-snore/

  36. Tomi Engdahl says:

    Little Device, Big Data
    https://www.hackster.io/news/little-device-big-data-d2660d2bc8da

    Resource-constrained devices can continually learn from large datasets, while preserving data privacy, with TinyFedTL.

  37. Tomi Engdahl says:

    This system highlights the potential of tinyML for the performing arts.

    Mapping Dance syncs movement and stage lighting using tinyML
    https://blog.arduino.cc/2021/10/22/mapping-dance-syncs-movement-and-stage-lighting-using-tinyml/

    Being able to add dynamic lighting and images that can synchronize with a dancer is important to many performances, which rely on both music and visual effects to create the show. Eduardo Padrón aimed to do exactly that by monitoring a performer’s moves with an accelerometer and triggering the appropriate AV experience based on the recognized movement.

  38. Tomi Engdahl says:

    Looking for a project to get started with machine learning on Arduino? Simone Salerno shows how to easily create a gesture recognition system using the Nano RP2040 Connect’s built-in accelerometer.

    https://eloquentarduino.github.io/2021/10/arduino-gesture-recognition-the-easy-way-with-machine-learning/

  39. Tomi Engdahl says:

    RISC-V-Based VEGA Brings Continual Learning to TinyML with an Order of Magnitude Efficiency Gain
    Based on the PULP Platform, this RISC-V chip could offer hundreds of thousands of hours of operation on a single battery charge.
    https://www.hackster.io/news/risc-v-based-vega-brings-continual-learning-to-tinyml-with-an-order-of-magnitude-efficiency-gain-6189df9bb365

  40. Tomi Engdahl says:

    TinyML Dog Bark Stopper © GPL3+
    A fun and simple project that uses TinyML to detect and respond to dog barks.
    https://create.arduino.cc/projecthub/NathanielF/tinyml-dog-bark-stopper-77e436

  41. Tomi Engdahl says:

    Using Simone Salerno’s EloquentTinyML library, you only need a few lines of code to perform state-of-the-art person detection on the Portenta Vision Shield.

    https://eloquentarduino.github.io/2021/11/3-lines-of-code-person-detection-on-arduino-portenta-and-esp32/

    Person Detection on Arduino Portenta Vision Shield and ESP32 with Just 3 Lines of Code
    10 NOVEMBER 2021
    Have you ever wanted to perform person detection on Arduino or ESP32 boards? Found it difficult to customize the sketch to suit your needs? Fear no more!

    Person detection on Arduino and ESP32 microcontrollers doesn’t have to be difficult: with the right library, you only need 3 lines of code to perform state-of-the-art person detection.

    EloquentTinyML is my library to run TensorFlow Neural Networks on Arduino boards without hassle.

    EloquentTinyML supports both ARM Cortex-M boards (Arduino Portenta, Arduino Nano variants, STM32) and ESP32 boards.

    I tested person detection on two different boards: Arduino Portenta + Vision Shield and M5Stack fisheye camera.

    As promised, you will only need 3 lines of code to run person detection at the bare minimum:

    Instantiate the detector
    Configure the detector
    Feed an image to the detector and get the response back
    The following is a working sketch that comes to 60 lines of code, including comments, debug messages and security checks!

  42. Tomi Engdahl says:

    OpenGL Machine Learning Runs On Low-End Hardware
    https://hackaday.com/2021/11/13/opengl-machine-learning-runs-on-low-end-hardware/

    If you’ve looked into GPU-accelerated machine learning projects, you’re certainly familiar with NVIDIA’s CUDA architecture. It also follows that you’ve checked the prices online, and know how expensive it can be to get a high-performance video card that supports this particular brand of parallel programming.

    But what if you could run machine learning tasks on a GPU using nothing more exotic than OpenGL? That’s what [lnstadrum] has been working on for some time now, as it would allow devices as meager as the original Raspberry Pi Zero to run tasks like image classification far faster than they could using their CPU alone. The trick is to break down your computational task into something that can be performed using OpenGL shaders, which are generally meant to push video game graphics.

    [lnstadrum] explains that OpenGL releases from the last decade or so actually include so-called compute shaders specifically for running arbitrary code. But unfortunately that’s not an option on boards like the Pi Zero, which only meets the OpenGL for Embedded Systems (GLES) 2.0 standard from 2007.

    Constructing the neural net in such a way that it would be compatible with these more constrained platforms was much more difficult, but the end result has far more interesting applications to show for it. During tests, both the Raspberry Pi Zero and several older Android smartphones were able to run a pre-trained image classification model at a respectable rate.

    Towards GPU-accelerated image classification on low-end hardware
    https://medium.com/analytics-vidhya/towards-gpu-accelerated-image-classification-on-low-end-hardware-ec592e125ad9

  43. Tomi Engdahl says:

    Machine Learning Shushes Stressed Dogs
    https://hackaday.com/2021/11/15/machine-learning-shushes-stressed-dogs/

    He trained a TinyML neural net to detect when she barked and used an Arduino to play a sound byte to soothe her. The sound bytes in question are recordings of [Nathaniel]’s mom either praising or scolding [Clairette], and as you can see from the video below, they seem to work quite well. To train the network, [Nathaniel] worked with several datasets to avoid overfitting, including one he created himself using actual recordings of barks and ambient sounds within his own house. He used EON Tuner, a tool by Edge Impulse, to help find the best model to use and to perform the training. He uploaded the trained network to an Arduino Nano 33 BLE Sense running Mbed OS, and a second Arduino handled playing sound bytes via an Adafruit Music Maker FeatherWing.

    TinyML Dog Bark Stopper
    A fun and simple project that uses TinyML to detect and respond to dog barks.
    https://www.hackster.io/NathanielF/tinyml-dog-bark-stopper-77e436

  44. Tomi Engdahl says:

    Based on a Nano 33 BLE Sense, this prototype wristband doesn’t need a high-power radio and nearby computer to run a gesture-detecting CNN.

    TinyML Gives an Arduino Nano 33 BLE Sense Wristband High-Speed, Low-Power Gesture Recognition
    https://www.hackster.io/news/tinyml-gives-an-arduino-nano-33-ble-sense-wristband-high-speed-low-power-gesture-recognition-0e23b3ee1ebd

    Built using low-cost parts, this prototype wristband doesn’t need a high-power radio and nearby computer to run a gesture-detecting CNN.

