Ellie Displays is a concept in which Ellie shows emotions and interactions. It is her way of giving feedback to the people who walk through her. A grid of 7938 RGB LEDs attached to Ellie's ceiling displays this feedback. The interactions can be anything, from showing Ellie's emotion in color to visualizing what she hears. In our proof of concept we made her react to sound, show videos, and let us draw on the display through an app.
Ellie API
Since we think controlling the LEDs should be easily accessible for any project, we started writing a RESTful API that makes it easy to display anything on the LED array. Previously the LEDs could only be controlled with ArtNet (DMX over IP). ArtNet is powerful but not easy to learn, and converting an image into the right UDP packets is a lengthy process. That's why we decided to write the API and keep it as simple as possible. The most used option is the RESTful endpoint, which sends an HTTP request to the server with XY and RGB data. Although this option is meant for simple things like sending a single image or blacking everything out, it is fast enough to display real-time GIFs and images. For better performance we also added the option to use websockets.
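To give an idea of how simple such a request can be, here is a minimal sketch in Python. The host name, endpoint paths, and payload fields (x, y, r, g, b) are assumptions for this illustration, not the actual Ellie API schema.

```python
import requests

# Hypothetical base URL and endpoints -- the real Ellie API paths
# and field names may differ.
API_BASE = "http://ellie.local:8080"

def set_pixel(x, y, r, g, b):
    """Light a single LED at grid position (x, y) with an RGB colour."""
    payload = {"x": x, "y": y, "r": r, "g": g, "b": b}
    response = requests.post(f"{API_BASE}/pixels", json=payload, timeout=1)
    response.raise_for_status()

def blackout():
    """Turn every LED off (assumed convenience endpoint)."""
    requests.post(f"{API_BASE}/blackout", timeout=1).raise_for_status()

if __name__ == "__main__":
    set_pixel(10, 4, 255, 0, 0)  # single red pixel
    blackout()                   # clear the ceiling again
```

The point of the API is that a single HTTP call like this replaces hand-building ArtNet UDP packets.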
For now the Ellie API can only control the LED ceiling, but it is built modularly so we can add any actuator we want in the future.
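A modular actuator layer could look something like the sketch below. The class and function names are purely illustrative and not part of the actual codebase; the idea is only that the HTTP layer talks to an abstract actuator, so new outputs can be added without changing the API itself.

```python
from abc import ABC, abstractmethod

class Actuator(ABC):
    """Illustrative base class for anything the API can drive."""

    @abstractmethod
    def render(self, frame: dict) -> None:
        """Apply one frame of output data to the actuator."""

class LedCeiling(Actuator):
    def render(self, frame: dict) -> None:
        # Here the frame's XY/RGB data would be converted into
        # ArtNet (DMX over IP) packets and sent to the controllers.
        ...

# Registering actuators by name keeps the request handlers unaware
# of the hardware behind them.
ACTUATORS = {"ceiling": LedCeiling()}
```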
Exploration Experiments
To see what we could do with the API, we made a drawing app that lets you draw live on the LED ceiling. It works on mobile devices as well as desktop browsers. We also made it possible to send GIFs to the LEDs and let the ceiling react to your voice. More recently we started an experiment where people push a big red button and sometimes a video plays on the LED ceiling.
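For the live drawing app, pixels have to arrive faster than one HTTP round trip each, which is where the websocket option helps. The sketch below only illustrates that idea; the ws:// URL and the JSON message shape are assumptions, not the real Ellie API protocol.

```python
import asyncio
import json
import websockets  # pip install websockets

async def draw_stroke(points, colour=(0, 120, 255)):
    """Send a list of (x, y) points as one stroke over a websocket."""
    # Hypothetical websocket endpoint and message format.
    async with websockets.connect("ws://ellie.local:8080/ws") as ws:
        for x, y in points:
            msg = {"x": x, "y": y,
                   "r": colour[0], "g": colour[1], "b": colour[2]}
            await ws.send(json.dumps(msg))

if __name__ == "__main__":
    # Draw a short horizontal line on the ceiling.
    asyncio.run(draw_stroke([(x, 20) for x in range(10, 30)]))
```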