Ricardo's Place: Robotics, machine learning, or simply random thoughts!

Bye bye, LinkedIn

Today, I decided to close my LinkedIn account. I’d been mulling over this idea for a while, trying to weigh the pros and cons. The final result was that I couldn’t see much value in having that account, and I also realized I should be posting things on my own website instead :smiley:

I still have my Twitter account (@rdeazambuja78), but I’m thinking about shutting it down as well.

Variable Power Supply, with Current Limitation, from Commercial Off-the-Shelf parts... powered by USB-C!

From time to time I have a project with some electronics that needs testing. This weekend I was checking how to power my Maple Syrup Pi Camera with a solar panel. However, prototypes always have a chance of releasing the magic smoke, so it’s nice to be able to limit the current to avoid that fate. In addition, I already have a fancy soldering iron that is powered by USB-C, so why not a cordless power supply powered by USB-C too? Below you can see the result of my weekend tinkering :sweat_smile:.

Digital Variable Power Supply, with Current limitation, powered by a USB-C power bank.


Studying FlatBuffers to play with TFLite models

I may write another post about this in the near future, but for now it will be yet another very-short-post™ :wink:. I’m working with TinyML (or Edge AI, or simply trying to run complex stuff on not-so-great hardware) and, currently, my focus is on the Google Coral Edge TPU. In general, I like Google, TensorFlow, etc., but a lot of the things they release are badly documented (or the documentation is just plain outdated) and others are simply overcomplicated (ok, that may be useful when many people work on the same codebase…). Sometimes I even think this is some sort of business strategy, because a gigantic company like Google couldn’t do these things by mistake, but who knows. So, back to TFLite models: most users know they are FlatBuffers, but it’s annoyingly hard to do simple things with them because you can’t find proper documentation (a Google search should ALWAYS return perfect results for Google stuff, shouldn’t it????).
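One concrete FlatBuffers detail you can verify without any generated schema code: a FlatBuffers binary starts with a 4-byte little-endian offset to the root table, followed by an optional 4-byte file identifier, and TFLite models use the identifier `TFL3`. A minimal, stdlib-only sketch (the `model.tflite` path in the usage comment is just a placeholder):

```python
import struct

def tflite_file_identifier(buf: bytes) -> str:
    """Return the FlatBuffers file identifier of a binary buffer.

    Bytes 0-4 hold the little-endian uint32 offset to the root table,
    and bytes 4-8 hold the optional 4-char file identifier; TFLite
    models are tagged with "TFL3".
    """
    if len(buf) < 8:
        raise ValueError("buffer too small to be a FlatBuffer file")
    (root_offset,) = struct.unpack_from("<I", buf, 0)  # root table offset (unused here)
    return buf[4:8].decode("ascii", errors="replace")

# Hypothetical usage with a model file on disk:
# with open("model.tflite", "rb") as f:
#     print(tflite_file_identifier(f.read()))  # expect "TFL3" for a TFLite model
```

To actually walk the tables inside the model you'd still need the schema (or the generated Python bindings for it), but this quick check is handy for telling a real TFLite FlatBuffer apart from a mislabeled file.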


How to use VSCode remotely to edit files on your Raspberry Pi

This is yet another very-short-post™. I really like VSCode because I think it speeds up lots of things. However, when I’m developing stuff on the Raspberry Pi, I would keep moving files back & forth or just use vim. So, today I decided to google a little bit and found a simple solution: sshfs.
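The idea is to mount the Pi's filesystem locally over SSH so VSCode can open it like any local folder. A minimal sketch of building (and optionally running) the mount command; the `pi@raspberrypi.local` user/hostname and the mount point are assumptions based on Raspberry Pi OS defaults:

```python
import subprocess  # used by the commented-out mount step below

def sshfs_mount_cmd(remote: str, mountpoint: str) -> list:
    """Build the sshfs command that mounts `remote` at `mountpoint`.

    The `reconnect` option keeps the mount alive across dropped
    connections, which matters for a Wi-Fi-only Raspberry Pi.
    """
    return ["sshfs", "-o", "reconnect", remote, mountpoint]

# pi@raspberrypi.local is the Raspberry Pi OS default (an assumption):
cmd = sshfs_mount_cmd("pi@raspberrypi.local:/home/pi", "/home/me/rpi")
# subprocess.run(cmd, check=True)  # actually mount (needs sshfs installed)
```

After mounting, "File > Open Folder" in VSCode on the local mount point is all it takes; `fusermount -u` unmounts when you're done.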


Using Google Coral Edge TPU USB accelerator to create a virtual (fake) webcam

Anonymized Webcam :)

After all that story about the lawyer cat, I decided to try to make something interesting to use during webinars, virtual meetings, etc. With the help of the Google Coral Edge TPU USB Accelerator, it’s possible to run deep neural models at a very high framerate without the need for a GPU (and without all the noise coming from its cooling fans). Above, I’m using segmentation to transform myself into some sort of semi-invisible blob while showing the results from PoseNet.

If you want to try it, you will need a computer running Linux and a Google Coral Edge TPU USB Accelerator.
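The "semi-invisible blob" effect itself is just alpha-blending: wherever the segmentation model marks a person, the live frame is mixed with a captured background. A minimal sketch on plain lists of grayscale pixels (the real pipeline would use NumPy arrays and the Coral model's mask, but the arithmetic is the same):

```python
def semi_invisible(frame, background, mask, alpha=0.4):
    """Blend `frame` toward `background` wherever `mask` marks a person.

    frame/background: 2D lists of grayscale pixels (0-255);
    mask: 2D list of 0/1 from a segmentation model;
    alpha: how visible the person remains (0 = fully invisible).
    """
    out = []
    for f_row, b_row, m_row in zip(frame, background, mask):
        row = []
        for f, b, m in zip(f_row, b_row, m_row):
            # person pixels become mostly background, slightly person;
            # everything else passes through untouched
            row.append(round(alpha * f + (1 - alpha) * b) if m else f)
        out.append(row)
    return out
```

Feeding the blended frames into a virtual video device (e.g. a v4l2loopback device on Linux) is what makes the result show up as a webcam in meeting software.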


No more Disqus comments in this blog

My apologies. I seem to have totally ignored the existence of the Disqus comments in this blog. Yesterday, I randomly read an article showing all the tracking Disqus does on people, and I didn’t like that. I was using it just because I thought it was an easy way to have comments that looked like a forum. So, until I find a new way to add comments to this blog without the risk of people being tracked, I will keep this blog without comments. If you have a suggestion, please, get in touch.

Creating a docker container from a Raspberry Pi Zero image... and the other way around

I’m a big fan of the Raspberry Pi Foundation and a user of their single-board computers as well. In the past two years, I worked on developing tiny, under-250g, collision-resilient quadcopters that used a Raspberry Pi Zero W (RPI Zero W) as their main computer. The reasons why I chose the RPI Zero W were size/weight, power consumption, price and the huge community of users. I even considered using the Banana Pi Zero because it had a faster CPU with more cores, but I gave up in favor of the RPI after talking to a friend who was struggling to set it up.

Nowadays, I’m starting a new project on smart IoT sensors that, I hope, will help businesses in the tourism sector recover faster by understanding the flow of tourists while respecting people’s privacy. For that reason, I will need hardware that is low power, small, reasonably priced and well supported… the RPI Zero W was the first thing that came to mind, but it is not powerful enough for some of the on-the-edge image processing I’m planning to do. One way to speed things up is to compile them directly for (on) the RPI Zero W. It’s possible to use cross-compilers, but I was having trouble cross-compiling the TensorFlow Lite runtime library, and that’s why I’m writing this post.
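The trick behind building "on" a Pi Zero from a desktop is running an ARM container under qemu-user-static, so the x86 host executes ARMv6 binaries transparently. A hedged sketch that just assembles the `docker run` command (the `arm32v6/alpine` image tag and mount paths are assumptions, and binfmt handlers for qemu must already be registered on the host):

```python
def docker_arm_run_cmd(image: str, command: list, workdir: str) -> list:
    """Build a `docker run` command that executes `command` inside an
    ARM image, mounting the host directory `workdir` at /src.

    Needs qemu-user-static binfmt handlers registered on the host so
    an x86 machine can execute the image's ARMv6 binaries.
    """
    return ["docker", "run", "--rm",
            "-v", f"{workdir}:/src", "-w", "/src",
            image] + command

# Hypothetical usage: check the emulated architecture inside the container
cmd = docker_arm_run_cmd("arm32v6/alpine:3.12", ["uname", "-m"], "/tmp/proj")
# subprocess.run(cmd, check=True)  # should report an ARM machine type
```

Anything compiled this way runs at emulation speed, but the resulting binaries are native ARMv6 and can be copied straight onto the RPI Zero W.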


Labeling images directly on Google Colab (Colaboratory)

TL;DR: I like Google Colab (Colaboratory) and I use it quite a lot because that way I can work during the night without waking up my wife with my laptop’s crazily loud GPU fan noises. Not so long ago I wrote a post where I shared two notebooks that allowed you to save images and sounds directly from your webcam/mic to use inside a Colab notebook.

Now, I’ve put everything together in a Python module and added a super cool way to label images directly from a Colab notebook! I’m not 100% sure, but I couldn’t find anything like it after googling a lot.

Here is an example where I added some labels to an image captured from my webcam using a Colab notebook:

colab_utils labeling example

For more details, I suggest going straight to the colab_utils repo.

Creating Singularity containers from local Docker images

TL;DR: Singularity containers are like Docker containers that don’t force you to be root to run them. OK, if you want a better explanation, I suggest this presentation or just try searching for it.
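The key detail is that `singularity build` accepts a `docker-daemon://` source, so it can convert an image that only exists in your local Docker daemon into a SIF file, without pushing it to a registry first. A minimal sketch (the `myimage:latest` tag is a placeholder for whatever local image you want to convert):

```python
import subprocess  # used by the commented-out build step below

def singularity_build_cmd(sif_path: str, docker_image: str) -> list:
    """Build the command that converts a local Docker image (must be
    tagged, e.g. "myimage:latest") into a Singularity SIF file via
    the docker-daemon:// source, so no registry round-trip is needed.
    """
    return ["singularity", "build", sif_path, f"docker-daemon://{docker_image}"]

cmd = singularity_build_cmd("myimage.sif", "myimage:latest")
# subprocess.run(cmd, check=True)  # needs both Singularity and Docker installed
```

Once built, the `.sif` file is a single portable artifact you can copy to an HPC cluster and run without root.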


Interactively playing with MNIST

My interactive MNIST toy running on a Jupyter notebook

The very first example used to introduce neural nets to students nowadays is almost always something based on the MNIST handwritten digits. Therefore, I decided to create an interactive notebook where you can draw your own digits directly to test your brand-new trained neural net.
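The fiddly part of such a toy is converting whatever you draw on a canvas into the 28x28 grayscale grid the network was trained on. A minimal sketch of that preprocessing step on plain lists (an actual notebook would use a drawing widget and NumPy, but the block-averaging idea is the same):

```python
def canvas_to_mnist(canvas, size=28):
    """Downsample a square binary canvas (2D list of 0/1 pixels)
    to a `size` x `size` grid of floats in [0, 1] by averaging each
    block of pixels, matching MNIST's 28x28 grayscale layout.
    """
    n = len(canvas)
    if n % size != 0:
        raise ValueError("canvas side must be a multiple of `size`")
    block = n // size  # side length of each averaged block
    out = []
    for i in range(size):
        row = []
        for j in range(size):
            total = sum(canvas[i * block + di][j * block + dj]
                        for di in range(block) for dj in range(block))
            row.append(total / (block * block))
        out.append(row)
    return out
```

The resulting grid can be flattened and fed straight to the trained model; drawing on, say, a 280x280 canvas and downsampling by 10 gives much smoother strokes than drawing at 28x28 directly.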