Faceshift's software analyzes in real time the facial motions of an actor and describes them as a mixture of basic expressions, plus head orientation and gaze. This description is then used to animate virtual characters in movie or game production.

Sunnie: What do your respective companies, Faceshift and Labster, do? What is your founding story?

Thibaut Weise (Faceshift): Faceshift is a spin-off of ETH Zurich and EPFL. I did my PhD at ETH Zurich, where I worked in the Computer Vision Lab on 3D scanning, modeling, and animation. For my postdoc I continued my work at EPFL with 3D cameras. The game-changing event at the time, in 2010, was the release of the Microsoft Kinect, the first widely available consumer 3D camera. With hacks for the camera appearing in its first week, we adapted some of our software and algorithms to this new device. Since that day, our goal has been to understand faces: to fully capture, with the help of a 3D camera, whatever a person is doing with their face and then reproduce those expressions in real time on a virtual avatar. This in turn enables live online communication through avatars, where the avatar itself displays in real time the emotions and facial expressions of the person it represents. For example, imagine you are in a virtual game like World of Warcraft or another virtual world, and your avatar fully expresses the visual feedback and nonverbal cues your face produces, in real time. So the timeline is that we developed our software successfully at the university, published an article about our work in 2011, and then made the software commercially available to the professional animation industry in 2012. We have since worked with clients in the film and game industries and in the educational sector, and have also taken on various custom visual-effects projects.

Mads Bonde (Labster): Labster develops online laboratory simulators where students can perform expensive or hazardous experiments not possible in a real lab environment, with proven effectiveness with regard to student learning and motivation. Through both my own studies and my teaching experience, I had observed that laboratory teaching was severely limited by budget, time, and safety, as well as being quite ineffective. When I found that the available research supported my observations, I set out to create such a laboratory simulator. Our mission at Labster is to pioneer online tools for teaching science. In particular, we develop virtual laboratories that simulate real labs in every aspect. Students at the high school or college level gain confidence and become comfortable with the lab environment while learning on their own time and at their own pace. This freedom allows them to experiment and make mistakes without creating a hazardous or costly situation.

Sunnie: How did you meet, and how did you decide to collaborate?

Thibaut Weise (Faceshift): In February, Faceshift presented at a demo session at the Technopark Zürich. It's a casual monthly event organized by the ETH Founders Community where a start-up, in this case us, presents their company to fellow founders, most of them also from the Technopark Zürich. Michael Bodekaer, Co-Founder and CTO of Labster, was a friend of one of the attendees and was the first to volunteer to try out our software. He was very interested in what we were doing, and after a follow-up meeting we found a great opportunity for collaboration.

[Photo caption: Michael Bodekaer, Co-Founder and CTO of Labster, trying out Faceshift's software under the instruction of Dr. Brian Amberg, Co-Founder and CTO of Faceshift, at the demo session organized by the ETH Founders Community.]

Sunnie: What was the opportunity? And how did you end up applying for the EuroStars grant?

Michael Bodekaer (Labster): We talked about incorporating Faceshift's software into Labster's online tools for teaching science. In particular, the idea was that Faceshift's software would enable Labster to recognize in real time the facial gestures of students as they work through virtual lab experiments.
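The "mixture of basic expressions" representation mentioned above is commonly implemented as a blendshape model: a neutral face mesh plus weighted per-expression vertex offsets. The sketch below is a minimal illustration of that general idea, not Faceshift's actual implementation; all names and the toy data are hypothetical.

```python
import numpy as np

def blend_expression(neutral, blendshapes, weights):
    """Combine a neutral face mesh with weighted expression offsets.

    neutral:      (V, 3) array of vertex positions for the resting face
    blendshapes:  (K, V, 3) array of per-expression vertex offsets
    weights:      (K,) array, typically in [0, 1], one per basic expression
    Returns the (V, 3) deformed mesh: neutral + sum_k weights[k] * blendshapes[k].
    """
    return neutral + np.tensordot(weights, blendshapes, axes=1)

# Toy example: a 2-vertex "mesh" with two hypothetical basic expressions.
neutral = np.zeros((2, 3))
blendshapes = np.array([
    [[1.0, 0.0, 0.0], [0.0, 0.0, 0.0]],   # e.g. a "smile" offset
    [[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]],   # e.g. a "brow raise" offset
])
weights = np.array([0.5, 1.0])            # 50% smile, full brow raise

mesh = blend_expression(neutral, blendshapes, weights)
print(mesh)  # vertex 0 moved 0.5 along x, vertex 1 moved 1.0 along y
```

A real-time tracker in this style would estimate the weight vector (and head pose) from each camera frame and stream it to the renderer, so only K numbers per frame need to cross the wire rather than a full mesh.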