Hackathon Insights: Playing around with Augmented Reality
A recap from our last internal hackathon. We used AR to visualize sensor data, created an audio transcriber, and built a new Alexa skill.
Deep Tech and Data Science
July 30, 2017
5 min read
We recently organized another one of our internal wattx hackathons where small, dedicated teams spent two days hacking together on three different projects.
One of the projects, called SnukAR, was initiated by Franziska and Tassilo. Their objective? To combine augmented reality with tangible data. The use case? Snuk, our venture revolving around IoT building infrastructure. After the hackathon, we sat down together to talk about their idea, their vision, and the actual implementation.
What did you set out to do?
Fran: Tassilo and I have both been thinking about using AR (augmented reality) in combination with Snuk data. One of the major things about Snuk is that we are collecting a lot of interesting data from sensors spread throughout a building. However, presenting that data in a way that is inviting, easy to digest, and fun is not simple. So we wanted to see if we could take real-time data from sensors and visualize it through AR, making building data a bit more tactile and enjoyable for the viewer.
What exactly did you end up building?
Fran: Actually, Tassilo and I ended up building two different solutions using similar technologies. Both of us used graphical identifiers, or markers, a bit like QR codes, to tag our sensors. However, Tassilo used a webcam to read the identifiers, while I used my phone camera and a dedicated app. We also used two different AR libraries, namely ARToolkit and OpenHybrid.
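The idea behind both prototypes can be sketched in a few lines of plain Python. The marker detection itself (handled by ARToolkit or OpenHybrid in the actual projects) is stubbed out here, and the sensor names, IDs, and readings are invented for illustration:

```python
# Sketch of the marker-to-sensor mapping behind both prototypes.
# The actual marker detection was done by an AR library (ARToolkit /
# OpenHybrid); here it is stubbed out, and the sensor IDs, names,
# and readings are invented for illustration.

# Each printed marker carries an ID that maps to a sensor in the building.
MARKER_TO_SENSOR = {
    7: {"name": "meeting-room-co2", "unit": "ppm"},
    12: {"name": "lobby-temperature", "unit": "°C"},
}

def detect_markers(frame):
    """Stand-in for the AR library's detection step.

    A real implementation would scan the camera frame for fiducial
    markers and return their IDs plus screen coordinates.
    """
    return [7, 12]  # pretend both markers are currently visible

def build_overlays(frame, latest_readings):
    """Turn detected markers into text labels to render over the video feed."""
    overlays = []
    for marker_id in detect_markers(frame):
        sensor = MARKER_TO_SENSOR.get(marker_id)
        if sensor is None:
            continue  # unknown marker, nothing to show
        value = latest_readings.get(sensor["name"])
        if value is not None:
            overlays.append(f'{sensor["name"]}: {value} {sensor["unit"]}')
    return overlays

readings = {"meeting-room-co2": 612, "lobby-temperature": 21.4}
print(build_overlays(None, readings))
```

The interesting part is the indirection: the camera only ever sees an anonymous marker ID, and a lookup table turns that into a live sensor reading to draw on screen.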
Why did you want to work with AR?
Fran: Our monthly hackathons are an opportunity to try out and get a feeling for new technologies. For us, this was an opportunity to bring an idea we had while we were working on Snuk to life, and to learn more about existing AR technologies. Hackathons are generally awesome as they allow us to “learn by doing”. We take two days to research what technologies are out there, what the differences between them are, and how we can use them.
Also, my own motivation for working on the project was that it could make a nice showcase. Imagine walking through a building with your AR glasses on and seeing all of the data generated by the building visualized in front of your eyes. That could be a great way for Snuk to show the added value they bring to building managers in their day-to-day work.
Other hackathon projects
Audio-to-text transcriber and a brand-new Amazon Alexa skill
Omar spent the two hackathon days building an audio-to-text transcriber. It's a great addition to the internal toolset, especially for the UX team, as we frequently conduct interviews with experts and potential users during our project discovery phase. From now on, instead of taking notes while interviewing or transcribing the conversations by hand afterwards, we simply upload our audio recordings and get a full transcription back in a matter of minutes, saving us hours of work.
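A minimal sketch of such a pipeline, with the actual speech-to-text call stubbed out (the post doesn't say which engine Omar used, so `transcribe_chunk` is a placeholder): long recordings are split into chunks, each chunk is transcribed, and the partial results are stitched back together.

```python
# Sketch of an audio-to-text pipeline: split a long recording into
# chunks, transcribe each, and stitch the results together.
# transcribe_chunk() stands in for whatever speech-to-text engine is
# used; the chunk length and placeholder texts are illustrative only.

CHUNK_SECONDS = 60  # many STT APIs limit the length of a single request

def split_into_chunks(duration_seconds, chunk_seconds=CHUNK_SECONDS):
    """Yield (start, end) offsets in seconds covering the whole recording."""
    start = 0
    while start < duration_seconds:
        end = min(start + chunk_seconds, duration_seconds)
        yield (start, end)
        start = end

def transcribe_recording(duration_seconds, transcribe_chunk):
    """Transcribe a recording chunk by chunk and join the partial texts."""
    parts = [transcribe_chunk(start, end)
             for start, end in split_into_chunks(duration_seconds)]
    return " ".join(p.strip() for p in parts if p)

# Fake engine for demonstration: returns a placeholder per chunk.
fake_engine = lambda start, end: f"[speech from {start}s to {end}s]"
print(transcribe_recording(150, fake_engine))
```

Swapping `fake_engine` for a real transcription call is all that's needed to turn the sketch into a working tool; the chunking and stitching logic stays the same.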
Pedro and Wen were busy experimenting with Alexa and building a skill specifically for our new venture, Loopstock. Although Alexa sometimes let them down, the end result is a first step towards a voice interface that lets nurses and doctors do inventory management without having to carry a phone or tablet with them.
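To give a feel for what such a skill's backend looks like, here is a hedged sketch of an intent handler. The intent name (`CheckStockIntent`), slot name (`Item`), and stock data are invented for illustration; the request/response envelope follows the general Alexa Skills Kit JSON shape, not Pedro and Wen's actual implementation.

```python
# Sketch of an Alexa skill backend for hands-free inventory checks.
# The intent name ("CheckStockIntent"), slot name ("Item"), and stock
# data are hypothetical; the JSON envelope follows the general Alexa
# Skills Kit request/response shape.

STOCK = {"gloves": 40, "syringes": 12}  # hypothetical inventory

def speak(text):
    """Wrap plain text in the Alexa response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def handle_request(event):
    """Route an incoming Alexa request to the matching handler."""
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "CheckStockIntent"):
        item = request["intent"]["slots"]["Item"]["value"].lower()
        count = STOCK.get(item)
        if count is None:
            return speak(f"I couldn't find {item} in the inventory.")
        return speak(f"There are {count} {item} left in stock.")
    return speak("Sorry, I didn't understand that.")

event = {"request": {"type": "IntentRequest",
                     "intent": {"name": "CheckStockIntent",
                                "slots": {"Item": {"value": "Gloves"}}}}}
print(handle_request(event)["response"]["outputSpeech"]["text"])
```

The appeal for a hospital ward is exactly this shape: a nurse asks a question out loud, the skill resolves the slot value against the inventory, and Alexa speaks the answer back, no hands required.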
As for AR more broadly: for it to be fully adopted into the mainstream, it will need a breakthrough application, such as on-site navigation in unfamiliar surroundings like big commercial centers or large train stations. At the end of the day, the business case will decide.