In this post, we describe the development of a further demonstrator for an exhibition we are planning, using a Raspberry Pi. The Digital Environment team is preparing to support NERC at the American Association for the Advancement of Science (AAAS) annual conference and exhibition in Seattle, USA in February 2020. One of our tasks is to develop a stand-alone technical demonstrator for the event, showcasing British science and technology – and for this, what could be better than a Raspberry Pi? We will develop a digital demonstrator employing facial expression detection on a Raspberry Pi with an attached Pi camera, using the Xnor AI2GO libraries.
The project and its construction are described fully on GitHub here: https://github.com/rendzina/FacialExpression
In brief, this project develops a Raspberry Pi demonstrator that uses the Xnor AI2GO library to detect facial expressions from an attached Pi Camera v2, running unattended (no attached keyboard or mouse) and changing a ‘face’ graphic on an attached Pimoroni HyperPixel display. As the camera detects faces with differing expressions, the image onscreen changes accordingly.
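The heart of this behaviour is a simple mapping from the detected expression label to the face graphic to display. The sketch below illustrates that step only; the label names, filenames, and confidence threshold are our own illustrative assumptions, not the actual AI2GO model outputs (the full camera and display code is in the GitHub repository linked above):

```python
# Illustrative sketch only: choose which 'face' graphic to show for a
# detected expression. Labels and filenames here are assumptions, not
# the actual AI2GO model outputs.

# Hypothetical mapping of expression labels to face graphics
FACE_IMAGES = {
    "happy": "faces/happy.png",
    "sad": "faces/sad.png",
    "surprised": "faces/surprised.png",
    "neutral": "faces/neutral.png",
}

def face_image_for(label, confidence, threshold=0.5):
    """Return the graphic to display for a detected expression.

    Falls back to the neutral face when the label is unknown or the
    classifier's confidence is below the threshold.
    """
    if confidence >= threshold and label in FACE_IMAGES:
        return FACE_IMAGES[label]
    return FACE_IMAGES["neutral"]

# A confident 'happy' detection selects the happy graphic;
# an unknown label or a low-confidence result falls back to neutral.
print(face_image_for("happy", 0.9))    # faces/happy.png
print(face_image_for("angry", 0.9))    # faces/neutral.png
```

In the demonstrator, a loop would call the classifier on each camera frame and redraw the display only when the selected graphic changes, keeping the unattended device responsive.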