Béa by MAIF


3D Printing

Rhinoceros 3D

Laser Cut


Anne-Sophie Charroin


Béa is a project made in partnership with MAIF, a French insurance company. The brief was to create a connected object addressing the needs of dependent elderly people. After a field study and user interviews, we decided to focus on a service built around a connected object using voice recognition. The idea was, on one hand, to allow the people close to the dependent person to organize themselves so that the burden of care does not always fall on the same person, and on the other hand, to let the dependent person manage their own needs.

Initial Brief

MAIF's goal with this project was to create a new service around the silver economy and the IoT. The target audience was elderly people: how to allow them to stay autonomous at home as long as possible while guaranteeing their safety.

On Field Study

The first three weeks of the project were dedicated to a field study. We met different kinds of people: dependent people staying at home, autonomous people, auxiliary nurses, etc. From this study we built two personas that summarized our findings quite well.


We developed two concepts:

A connected object allowing a dependent person to ask their entourage for help through voice interaction.
The idea behind this concept was to promote good deeds, socialization, and mutual assistance.

Our second idea was to let the person anticipate the visits of professional assistants throughout the day in order to avoid frustration.
Here we wanted to enhance non-verbal communication, sensitivity, and mutual consideration.

Finally, we chose to work with the people close to the dependent person in order to preserve self-esteem and non-stigmatizing relationships. So we created Béa.


As a personal challenge, I wanted to create a working prototype. Using a Raspberry Pi and some Node.js, I was able to use the Google Speech-to-Text API to send voice actions from our object to our companion app.
The logic behind the object is quite simple. The server gets a string from the speech recording and then analyzes it, looking for specific trigger words: the need ("I need", "I want"), the when ("at 10am", "next Monday"), and the what ("carrots", "medical appointment").
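The trigger-word logic described above could be sketched roughly like this in Node.js. This is a simplified illustration: the function name, the trigger lists, and the naive "when" regex are assumptions for the example, not the exact code of the prototype.

```javascript
// Illustrative sketch of the keyword-based parsing described above.
// Trigger phrases and the time pattern are simplified examples.
const NEED_TRIGGERS = ["i need", "i want"];

function parseRequest(transcript) {
  const text = transcript.toLowerCase();

  // Look for a trigger phrase ("I need" / "I want")
  const trigger = NEED_TRIGGERS.find((t) => text.includes(t));
  if (!trigger) return null; // not a request we understand

  // Very naive "when" extraction: "at <time>" or "next <day>"
  const whenMatch = text.match(/\b(at \d{1,2}(?::\d{2})?\s?(?:am|pm)?|next \w+)\b/);

  // The "what" is whatever follows the trigger phrase, minus the "when" part
  let what = text.slice(text.indexOf(trigger) + trigger.length).trim();
  if (whenMatch) what = what.replace(whenMatch[0], "").trim();

  return { what, when: whenMatch ? whenMatch[0] : null };
}
```

So a sentence like "I need carrots next Monday" would be broken down into a "what" ("carrots") and a "when" ("next monday") before being pushed to the companion app.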
From there, a person using the app who wants to take care of one of these needs simply picks it from the list, so the other helpers know someone is already dealing with it.
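The shared list could work along these lines: once a helper claims a need, it disappears from everyone else's open list. Again, this is a hedged sketch with made-up function names, not the actual app code.

```javascript
// Illustrative sketch of the shared task list: a claimed task is
// visible as taken so no two helpers handle the same need.
const tasks = [];

function addTask(what, when) {
  const task = { id: tasks.length + 1, what, when, claimedBy: null };
  tasks.push(task);
  return task;
}

function claimTask(id, helperName) {
  const task = tasks.find((t) => t.id === id);
  if (!task || task.claimedBy) return false; // unknown or already taken
  task.claimedBy = helperName;
  return true;
}

function openTasks() {
  return tasks.filter((t) => !t.claimedBy);
}
```

The first helper to claim a task wins; a second claim on the same task is simply refused, which is what lets the entourage share the load without always relying on the same person.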

User Tests

To validate our concept, we ran a user test with Monique, a 90-year-old woman who receives a lot of professional help. We wanted our object to have simple interactions, but had to admit that we had missed some points during development and needed to rethink some usages, such as: How do you cancel a voice command? How do you avoid the same need being recorded twice if the person forgets she already made the request?

Setup and explanation of the service to Monique


It was a very interesting project, because designing for elderly people is not as easy as one might think: they don't approach technology and the IoT the same way most of the younger generation does, so we had to think about new interactions and avoid robot-like services.
Our concept still needs work and is clearly not ready for the market, but it allowed us to go further than most of our usual projects.