
Can I joke on you?

This application was built for a course during my semester abroad at UPC in Barcelona. It combines emotion detection with a recommender system that adapts to the user in order to make them smile more often. To this end, we implemented a Next.js application that performs face tracking on the user's face with MediaPipe Tasks Vision. We use MediaPipe's FaceBlendShapes and, based on them, calculate the user's smile degree. To make the measurement more accurate and adapted to the individual user, we first run a calibration phase in which the user looks into the camera once with a neutral expression and once with a big smile.

The main part of the application is an infinite loop in which jokes are told to the user and the smile degree is measured. Optionally, the jokes are read aloud using the browser's Web Speech API (text-to-speech). For each joke, the smile degree is calculated and sent to a recommender system. The recommender uses a Q-learning based approach to recommend further jokes from categories the user likes. We conducted an HRI user study with "Can I joke on you?", testing the application with 21 participants and evaluating our approach.
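As a rough illustration of the smile-degree computation, the sketch below averages the `mouthSmileLeft` and `mouthSmileRight` blendshape scores from MediaPipe Tasks Vision and rescales them to the user's calibrated neutral/smile range. The blendshape choice and the min-max rescaling are assumptions for illustration, not necessarily the exact formula used in the app.

```ts
import type { FaceLandmarkerResult } from "@mediapipe/tasks-vision";

// Raw smile signal: average of the two mouth-smile blendshape scores (0..1).
// Assumed blendshape names; the app may weight or combine them differently.
function rawSmileScore(result: FaceLandmarkerResult): number {
  const categories = result.faceBlendshapes[0]?.categories ?? [];
  const score = (name: string) =>
    categories.find((c) => c.categoryName === name)?.score ?? 0;
  return (score("mouthSmileLeft") + score("mouthSmileRight")) / 2;
}

// Calibration values captured once per user: neutral face and big smile.
interface Calibration {
  neutral: number;
  smile: number;
}

// Smile degree in [0, 1], rescaled to the user's personal range.
function smileDegree(result: FaceLandmarkerResult, cal: Calibration): number {
  const raw = rawSmileScore(result);
  const range = Math.max(cal.smile - cal.neutral, 1e-6);
  return Math.min(Math.max((raw - cal.neutral) / range, 0), 1);
}
```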
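The optional read-aloud step uses the browser's SpeechSynthesis interface, which is part of the Web Speech API. A minimal usage could look like the following; the language setting is a placeholder.

```ts
// Read a joke aloud via the browser's built-in text-to-speech.
function speakJoke(joke: string): void {
  const utterance = new SpeechSynthesisUtterance(joke);
  utterance.lang = "en-US"; // assumed default; jokes may be in another language
  window.speechSynthesis.speak(utterance);
}
```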
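For the recommender, a minimal tabular Q-learning sketch is shown below, under the assumption that the state is the category of the previously told joke, the actions are the candidate categories, and the reward is the measured smile degree. The hyperparameters and the state/action design are illustrative, not the project's exact implementation.

```ts
type Category = string;

class JokeRecommender {
  private q = new Map<string, number>(); // Q-table, keyed by `${state}|${action}`

  constructor(
    private categories: Category[],
    private alpha = 0.3,   // learning rate (assumed)
    private gamma = 0.8,   // discount factor (assumed)
    private epsilon = 0.2, // exploration rate (assumed)
  ) {}

  private key(state: Category, action: Category): string {
    return `${state}|${action}`;
  }

  // Epsilon-greedy choice of the next joke category.
  nextCategory(state: Category): Category {
    if (Math.random() < this.epsilon) {
      return this.categories[Math.floor(Math.random() * this.categories.length)];
    }
    return this.categories.reduce((best, a) =>
      (this.q.get(this.key(state, a)) ?? 0) > (this.q.get(this.key(state, best)) ?? 0)
        ? a
        : best,
    );
  }

  // Q-learning update after the smile degree (reward) has been measured.
  update(state: Category, action: Category, reward: number, nextState: Category): void {
    const maxNext = Math.max(
      ...this.categories.map((a) => this.q.get(this.key(nextState, a)) ?? 0),
    );
    const old = this.q.get(this.key(state, action)) ?? 0;
    this.q.set(
      this.key(state, action),
      old + this.alpha * (reward + this.gamma * maxNext - old),
    );
  }
}
```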
