
M5Stack UnitV2 is the easiest way to learn AI vision and Edge Computing

Probably the easiest way to get started with AI vision projects

It’s remarkable how quickly technology moves. A few years ago, AI vision development kits would cost you an arm and a leg and have the footprint of a mini-computer, but in 2021 these things can be enclosed in devices barely bigger than your thumb. M5Stack UnitV2 is just that – an AI vision and Edge Computing platform that fits in your pocket. If you ever wanted to try AI vision and play with Edge Computing, it’s the easiest way to do so.

M5Stack UnitV2


You won’t believe how small it is until you hold one in your hands. It’s an ARM-based dual-core Cortex-A7 device clocked at 1.2GHz, wrapped in a shell so small that my colleagues doubted its stand-alone AI capabilities. Rest assured, it has them! A closer look at the spec sheet reveals a really powerful device and a departure from the ESP32 chips usually found in other M5Stack development boards:

  • AI core: Sigmastar SSD202D
  • CPU: Dual-core Cortex-A7 1.2GHz processor
  • RAM: 128MB DDR3
  • Storage: 512MB NAND Flash + microSD
  • Camera: GC2145 1080P colour sensor
  • WiFi: 150Mbps 2.4GHz 802.11 b/g/n
  • Other: microphone, USB-C, Grove connector, microSD card reader

M5Stack continues to innovate. M5Stack UnitV2 is only slightly bigger than the M5StickC PLUS (review), which I aptly called the most featured ESP board per mm². This board can single-handedly capture images, process them using AI models uploaded to the board, and deliver JSON-formatted results over WiFi or USB-C.

I expect these to sell like hot cakes, so here are the official stores where you can try to get yours:

AI vision and Edge Computing

M5Stack has a strong history of making programming as accessible as possible. Their UIFlow turns tinkering with ESP devices into wireless magic thanks to OTA programming and built-in batteries. Playing with M5Stack UnitV2 is no different. To get started, all you need is a web browser.

The web server created by the device (10.254.239.1 or unitv2.py) will load the camera interface with basic settings, a preview of the output and over a dozen operational modes to test with your camera straight away.

  • Audio FFT
  • Code Detector
  • Face Detector
  • Lane Line Tracker
  • Motion Tracker
  • Shape Matching
  • Camera Stream
  • Online Classifier
  • Color Tracker
  • Face Recognition
  • Target Tracker
  • Shape Detector
  • Object Recognition

At the time of writing, there is a glitch in the firmware which disabled WiFi on my unit, but I reached out to the M5Stack team and they are aware of the issue. It should be resolved soon. For the time being, I’m using USB-C to access the interface.

M5Stack UnitV2 connects via USB-C and creates a WiFi access point which you can connect to and configure to your liking. The data stream is then available via the serial port or the web interface.
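To give a feel for consuming that data stream in code, here is a minimal Python sketch. The serial port name, the baud rate, and the exact payload shape (a JSON object with an `obj` list of detections) are my assumptions based on typical detector output, not an official schema – check what your unit actually prints before relying on it. Reading live data requires the third-party pyserial package and a connected device; the parsing part runs anywhere.

```python
import json

def parse_payload(line):
    """Parse one JSON line from the UnitV2 into (type, x, y) tuples.

    The payload shape here is an assumption (a dict with an "obj" list
    of detections), not a documented schema.
    """
    msg = json.loads(line)
    return [(o.get("type"), o.get("x"), o.get("y")) for o in msg.get("obj", [])]

def read_stream(port="/dev/ttyUSB0"):
    """Read live results over USB-C serial; needs pyserial and a connected unit."""
    import serial  # third-party: pip install pyserial
    with serial.Serial(port, 115200, timeout=1) as ser:
        while True:
            line = ser.readline().decode("utf-8", errors="ignore").strip()
            if line.startswith("{"):
                print(parse_payload(line))

# Offline demo with a hand-made payload (no device needed):
sample = '{"num": 1, "obj": [{"type": "face", "x": 120, "y": 80, "w": 64, "h": 64}]}'
print(parse_payload(sample))  # → [('face', 120, 80)]
```

The parser is kept separate from the serial loop on purpose, so you can unit-test it without hardware.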

Web interface

If you load up the web interface, you will see all the tools available on the device. The web interface can be used to troubleshoot and aid your development, as for the most part you will be working with JSON-formatted results. If you are new to JSON, I have a quick guide here to get you started.

M5Stack UnitV2 interface

The interface lets you try and preview all the modes listed above and monitor the payloads submitted by M5Stack UnitV2 in real time. It’s a great tool to visualise what M5Stack UnitV2 sees and to refine the AI vision process in your code.

Extra options allow you to upload your own training models and work with more than one at a time. Be aware that, depending on the model and the calculations involved, this can slow down the rate at which M5Stack UnitV2 delivers results.

M5Stack UnitV2 displays a resized 480p stream in the web browser and draws overlays indicating the shapes and objects found by the AI models. The included models are pretty fast and update at 15fps – impressive if you consider that the device is barely bigger than your thumb, and a good showcase of its capability. For more specific tasks, you are expected to create your own recognition models.

The video stream is available through the web browser and you can access it through:

http://unitv2.py/video_feed
or
http://10.254.239.1/video_feed
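If you want to grab frames from that feed in your own code, the sketch below assumes it is a standard MJPEG stream (JPEG images concatenated in a multipart HTTP response) – a common convention for camera feeds, though I haven’t confirmed the exact format here. The frame extractor naively scans for JPEG start/end markers, which is good enough for a demo; `save_first_frame` needs a connected unit, while the marker logic runs anywhere.

```python
import urllib.request

SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start-of-image / end-of-image markers

def extract_frame(buf):
    """Return (jpeg_bytes, remaining_buffer), or (None, leftover) if no full frame yet.

    Naive: assumes the EOI marker does not occur inside the JPEG payload.
    """
    start = buf.find(SOI)
    if start < 0:
        return None, b""
    end = buf.find(EOI, start)
    if end < 0:
        return None, buf[start:]
    return buf[start:end + 2], buf[end + 2:]

def save_first_frame(url="http://unitv2.py/video_feed", path="frame.jpg"):
    """Read the (assumed) MJPEG stream chunk by chunk and save one frame. Needs the device."""
    buf = b""
    with urllib.request.urlopen(url) as stream:
        while True:
            buf += stream.read(4096)
            frame, buf = extract_frame(buf)
            if frame:
                with open(path, "wb") as f:
                    f.write(frame)
                return path

# Offline demo with synthetic multipart-style data (no device needed):
fake = b"--boundary\r\n" + SOI + b"jpegdata" + EOI + b"\r\n"
frame, rest = extract_frame(fake)
print(len(frame))  # → 12
```

Only the standard library is used here, so the same approach works on the UnitV2 itself or on the host machine.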

Jupyter

M5Stack UnitV2 - Jupyter

Jupyter is a web-based notebook/IDE which you can use to quickly test or modify your code without going back and forth between M5Stack UnitV2 and the computer. It’s handy for smaller edits, but if you want to take full advantage of AI vision, use a proper IDE for MicroPython & Python 3.8.

M5Stack has a basic guide for Jupyter too, so you won’t feel lost if you have never used it before. Feel free to try their basic examples to understand the programming flow and get a better grasp of things. If Jupyter isn’t sufficient, you can always SSH into M5Stack UnitV2 and interact with it like any other Linux machine.

ssh m5stack@10.254.239.1
# user: m5stack
# pwd: 12345678

# user: root
# pwd: 7d219bec161177ba75689e71edc1835422b87be17bf92c3ff527b35052bf7d1f

AI training

A development board is only as good as the tools you get to work with it. For ESP32-based boards we have UIFlow to turn programming into OTA magic, but for M5Stack UnitV2 we have something cool too: a dedicated training engine that lets you train image recognition models for specific goals.

M5Stack UnitV2 - AI training

While the training process is relatively simple – upload and annotate the pictures, then download a trained model file – digital elbow grease and time are required. The bigger the picture sample, the better the training model. Online guides suggest annotating batches of 30-50 images to get reasonably good training models.

The M5Stack UnitV2 training module is browser-based and creates the training models for you, but the task of collecting the right pictures and annotating the targets correctly lies with you. It’s a potential time sink, but if you want great results, expect to put the time in.

There are built-in tools that speed up the annotation process (COCO SSD), which use rectangles to mark up objects in your uploaded pictures, but you are in charge of assigning labels to them. You will need at least 30 pictures, the process takes about 10 minutes, and it’s recommended that the pictures are of similar quality to what your testing scenario will look like.
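Before spending time in the annotation tool, a quick pre-flight check of your image batch can save a wasted upload. The 30-50 image guideline comes from the advice above; the function name, the extension filter, and the exact checks are my own illustrative convention, not anything the training engine requires.

```python
def check_training_batch(image_paths, min_count=30, max_count=50):
    """Sanity-check a list of training image filenames before uploading.

    The 30-50 image target follows the common guideline; the JPEG/PNG
    filter is an assumed convention for this sketch.
    """
    issues = []
    if len(image_paths) < min_count:
        issues.append(f"only {len(image_paths)} images; aim for {min_count}-{max_count}")
    wrong_type = [p for p in image_paths if not p.lower().endswith((".jpg", ".jpeg", ".png"))]
    if wrong_type:
        issues.append(f"{len(wrong_type)} files are not JPEG/PNG")
    return issues

# A batch of 35 JPEGs passes cleanly:
print(check_training_batch([f"bird_{i:02d}.jpg" for i in range(35)]))  # → []
```

You could extend this with resolution or brightness checks so the batch matches your testing scenario, as recommended above.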

M5Stack UnitV2 - interface

I won’t dive into much detail in this article, as I’m planning an entire project that will show you how to use AI vision. Order your M5Stack UnitV2 now and follow along when the project is out!

What can you do with it?

The uses of AI vision are virtually unlimited. M5Stack UnitV2 is too slow to turn your car into a driverless machine, but there are many projects you can use this platform for. I have a couple of ideas I would like to try down the line once I feel more comfortable with it:

M5Stack UnitV2
  1. AI Doorbell – with the ability to recognise faces, couriers and possibly unwanted guests
  2. RC car – it may not be fast enough for real cars, but you can try with RC cars
  3. Sorting systems – sorting items by shape, colour, etc.
  4. and more

If you would like to see how M5Stack UnitV2 handles traffic recognition, I have uploaded the full video to my 2nd YouTube channel. You will be able to compare different training models and the speed at which the device processes the data.

Final thoughts

I’m waiting for the WiFi fix, as I want to take the camera to my garden. I would like to train the AI to recognise the birds feeding off my feeder and log the feeding times for each species, to find out which birds eat most of the grain and nuts. For that, I will need WiFi to interact with the camera remotely.

M5Stack UnitV2 (official, AliExpress, Banggood) is a great little device to get you started with AI vision. At $75 it’s reasonably priced, and if the support is as good as it has been with M5Paper (review), we should expect the fixes soon. In the package from M5Stack, I also got Core Ink (review pending) and Core2 (review pending) devices, which I’m going to cover soon. What would you use M5Stack UnitV2 for? Let me know in this Reddit thread.

🆓📈 – See the transparency note for details.


