Meet R3–14, My Personal Assistant

R3–14 is a robot personal assistant that answers questions, helps with daily tasks, and brightens the mood.

Over the last two years, during my school holidays, I have been building a robot assistant around the Raspberry Pi. It has been a great learning experience, and seeing my idea come to life was immensely satisfying.

When the idea of making a small robot assistant popped into my mind, it was just a stab in the dark. I wanted to incorporate multiple features to learn more about electronics and 3D design and manufacture, and to improve my Python programming.

Before diving into the details, here is a breakdown of my project:

  • Google Assistant

  • SiriControl Integration

  • Web App

  • Face Tracking

  • 433MHz RF Control

  • 3D Printed PLA Body

  • RGB LED eyes

I began by designing my initial idea in Tinkercad and building an aluminium prototype. And then the fun began…

At first, I used the Jasper voice platform, but because of its limitations I switched to the Google Assistant SDK when it was released, which offered far better voice recognition and responses.

I also integrated SiriControl with the Google Assistant, so that both Siri and Google can trigger all physical actions and commands. SiriControl is a Python framework that retrieves user-initiated commands from Siri.
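To give a feel for how a command framework like this plugs in, here is a rough sketch of a command module: it declares trigger words and an execute callback that a dispatcher can match against incoming text. The names and return values are illustrative assumptions, not SiriControl's exact API.

```python
# Illustrative sketch of a SiriControl-style command module.
# The module-level names (commandWords, execute) are assumptions
# for illustration, not the framework's documented interface.

commandWords = ["lights", "lamp"]

def matches(command, words=commandWords):
    """True if any trigger word appears in the spoken command."""
    return any(w in command.lower() for w in words)

def execute(command):
    """Handle a command string captured from Siri or the Assistant."""
    tokens = command.lower().split()
    if "on" in tokens:
        return "lights_on"   # in the real robot this would fire the 433MHz transmitter
    if "off" in tokens:
        return "lights_off"
    return "unknown"
```

A dispatcher would simply loop over registered modules, calling `execute()` on the first one whose `matches()` returns true.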

And then the face tracking 👋:

To add a human touch, I used RGB LED eyes that change colour according to Google Assistant events, so the user can see the current state: speaking, listening, loading, and so on. This was done with my RGB LED library, which allows smooth colour transitions; I wrote it because at the time I couldn't find one online. 😄
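The core idea is a simple one: map each assistant state to a colour, and fade between colours rather than switching abruptly. A minimal sketch of that logic, with state names and colours chosen for illustration (the real library would push each step to the LED pins via PWM):

```python
# Sketch: map assistant states to eye colours and interpolate between
# them for smooth fades. State names and colour choices are assumptions;
# on the robot, each step would be written to the RGB LED via PWM.

STATE_COLOURS = {
    "listening": (0, 255, 0),     # green
    "speaking":  (0, 0, 255),     # blue
    "loading":   (255, 165, 0),   # orange
    "idle":      (255, 255, 255), # white
}

def lerp_colour(start, end, t):
    """Blend two (r, g, b) tuples; t runs from 0.0 to 1.0."""
    return tuple(round(s + (e - s) * t) for s, e in zip(start, end))

def fade_steps(start, end, steps=50):
    """List of intermediate colours for a smooth transition."""
    return [lerp_colour(start, end, i / steps) for i in range(steps + 1)]
```

Sending each colour in `fade_steps(...)` to the LEDs with a short delay between steps produces the smooth transition effect.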

For the home automation aspect, I used 433MHz transmitters and receivers to record and replay remote-control codes. Combined with the web app, this approach turns out to be much cheaper than buying smart bulbs and sockets, with the Raspberry Pi acting as the hub.
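The record-and-replay trick works because most cheap 433MHz remotes just send a fixed code as a train of pulse timings: record the timings once, then transmit the same timings back. Below is a hedged sketch of that encoding; the exact protocol varies by remote, and the simple on-off keying with a 350µs base pulse used here is an assumption for illustration.

```python
# Sketch of 433MHz code record/replay logic as pure pulse-timing maths.
# The protocol (on-off keying, 350µs base pulse, 24-bit codes) is an
# assumed example; real remotes differ, and the robot would record the
# actual timings from the receiver pin and replay them on the transmitter.

BASE_US = 350  # base pulse width in microseconds (assumed)

def encode_bit(bit):
    """One bit as (high_us, low_us): '1' = long-short, '0' = short-long."""
    return (3 * BASE_US, BASE_US) if bit == "1" else (BASE_US, 3 * BASE_US)

def encode_code(code, bits=24):
    """Turn an integer remote code into a list of pulse timings."""
    return [encode_bit(b) for b in format(code, f"0{bits}b")]

def decode_pulses(pulses):
    """Recover the integer code from recorded pulse timings."""
    bit_string = "".join("1" if high > low else "0" for high, low in pulses)
    return int(bit_string, 2)
```

Round-tripping a code through `encode_code` and `decode_pulses` returns the original value, which is exactly what makes replaying a recorded signal work.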

Over time, as my idea became clearer, I wanted a simpler, friendlier design, which led to the final look.

I also created a web app using the lightweight Lighttpd web server, shown in the video, with a range of features including:

  • Automatic/manual mode for remote control and face tracking

  • Home automation with devices that can be toggled on and off

  • Live streaming of webcam feed for remotely controlling the robot

  • Speaking text with various (inaccurate but funny 😏) accents, using eSpeak
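For the accent feature, one common approach is to shell out to the eSpeak command line, where the `-v` flag selects a voice variant. A small sketch of how the web app could build that command; the accent names and mapping here are illustrative assumptions, and available voices vary per install:

```python
# Sketch: build an eSpeak command line for a requested accent.
# The ACCENTS mapping is an assumed example; eSpeak's -v flag is real,
# but the set of installed voices differs between systems.

ACCENTS = {
    "scottish": "en-sc",
    "american": "en-us",
    "default":  "en",
}

def espeak_command(text, accent="default"):
    """Argv list for subprocess.run; unknown accents fall back to 'en'."""
    voice = ACCENTS.get(accent, "en")
    return ["espeak", "-v", voice, text]
```

The web app would then run this with something like `subprocess.run(espeak_command(text, accent))` on the Pi.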

This has been an incredible journey for me, through which I have learned a variety of skills: the importance of design and prototyping, electronics basics, and 3D design and printing. Along the way, I developed the RGB LED library and the SiriControl framework, which was great fun.

It all started when I got a Raspberry Pi for my birthday a few years ago. I wanted to create a personal assistant, which would have a fun, friendly design with a human touch.

Although it would not be a viable consumer product, I feel people become more emotionally attached to devices with human-like features, such as the RGB LED eyes.

‘How can I help?’

On an ending note, always imagine the satisfaction you will feel after achieving your goal, and you will never have to search for motivation. 👍