Hackathon Insights - September and October Projects



October 19, 2017

Each month, we organize a two-day internal hackathon during which WATTx-ers dedicate their time to side projects and get a chance to explore new trends in design, tech, and data science. In September and October, we hacked on projects revolving around IoT networking, smart devices, data science, automation, and OpenCV. Here’s what we’ve learned.

Data Science for inventory management

We know from previous user and market research that hospitals waste money and resources due to inefficient stock-management processes. This is what inspired us to found our venture loopstock - an intelligent healthcare inventory management system. Loopstock creates transparency in inventory management, primarily through the use of RFID tags.

During the September hackathon, our Data Scientist Christian jumped in to help loopstock better understand which insights they could provide to hospitals. He developed a statistical model and then wrote a script that let him generate fake tracking data for products and push it into the loopstock database.
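As an illustration, a generator for such fake tracking data could look like the following sketch; the event fields, tag format, and hospital locations here are invented assumptions, not loopstock's actual schema.

```python
import random
from datetime import datetime, timedelta

# Hypothetical hospital locations where an RFID reader might log a product.
LOCATIONS = ["central_storage", "ward_a", "ward_b", "operating_room"]

def fake_tracking_events(n_products=10, reads_per_product=5, seed=42):
    """Generate synthetic RFID read events for a set of products."""
    rng = random.Random(seed)
    start = datetime(2017, 9, 1)
    events = []
    for product_id in range(n_products):
        t = start
        for _ in range(reads_per_product):
            # Each product is read again somewhere between 5 min and 2 h later.
            t += timedelta(minutes=rng.randint(5, 120))
            events.append({
                "rfid_tag": f"TAG-{product_id:04d}",
                "location": rng.choice(LOCATIONS),
                "timestamp": t.isoformat(),
            })
    return events

events = fake_tracking_events()
print(len(events))  # 10 products x 5 reads = 50 events to insert into the database
```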

With Christian’s help, the company is now able to visualize data for hospitals via an intuitive dashboard.

Building our own Smart Video Car

Our hardware engineer Wen decided to test the functionality and reliability of the SunFounder Smart Video Car Kit (see picture below), which, in simple words, is a smart car with a camera attached to it.

The problem Wen tried to solve was also related to our venture loopstock, as the company is constantly looking for the most efficient ways to replace lengthy manual labour in tracking medical supplies and medication in hospitals through RFID tags. In a real setting, this process is highly repetitive and happens hundreds of times a day, which gives it enormous potential for automation.

In his project, Wen tested a simple approach in which the car used line-detection sensors to follow lines he had drawn on the floor. The results were very promising - more to come in future hackathons!
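The control logic for this kind of two-sensor line follower can be sketched roughly as below; this is an illustration of the general approach, not Wen's actual firmware, and it assumes two digital sensors that each report whether they currently see the line.

```python
def steer(left_on_line: bool, right_on_line: bool) -> str:
    """Decide the next driving action from the two line-sensor readings."""
    if left_on_line and right_on_line:
        return "forward"      # line is centred under the car
    if left_on_line:
        return "turn_left"    # line drifting left, steer back towards it
    if right_on_line:
        return "turn_right"   # line drifting right, steer back towards it
    return "stop"             # line lost entirely

print(steer(True, True))   # forward
print(steer(False, True))  # turn_right
```

In a real loop this decision would run many times per second, translating each action into motor commands.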

Fetching analytic data from Venture Capital firms to support our business team

Our Venture Developers (VD) often need to find specific data on ventures in order to run market or competitor analyses, or to find potential investors for our ventures. Usually, they consult Tracxn, a website that tracks startups across 230+ sectors. The data Tracxn provides is qualitative and informative; however, our venture developers struggle with the large amount of data displayed on the unintuitive dashboard. This pain point inspired our UX researcher Alex, in collaboration with our Software Engineer Mikhail, to simplify and accelerate the whole process of finding relevant information for the VD team.

Their process was to first collect data on all funded startups in Berlin over the last few years (approx. 2,400 in total) and then put it into GitHub. From there, the project was divided into two distinct parts: data scraping and data analysis.

Data Scraping

As pro-subscribers, we get access to Tracxn's APIs and to data on all companies based in Berlin. Mikhail focused on Berlin because the website only allows a certain number of API calls each month.

In the next step, Mikhail wrote a simple scraper that queried Berlin companies, combined the data, and saved it as a pickled Python object.
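The overall shape of such a scraper might look like the sketch below. The paging scheme and response format are assumptions for illustration; Tracxn's real API differs and requires authentication, so the example is wired up with a stub fetcher instead of live calls.

```python
import pickle

def scrape_companies(fetch_page, city="Berlin"):
    """Page through an API and combine the results.

    fetch_page(city, page) -> list of company dicts, [] when exhausted.
    """
    companies, page = [], 0
    while True:
        batch = fetch_page(city, page)
        if not batch:
            break
        companies.extend(batch)
        page += 1
    return companies

def save(companies, path="berlin_companies.pkl"):
    """Persist the combined data as a pickled Python object."""
    with open(path, "wb") as f:
        pickle.dump(companies, f)

# Usage with a stub fetcher standing in for real, rate-limited API calls:
def stub_fetch(city, page):
    return [{"name": f"{city.lower()}-startup-{page}"}] if page < 3 else []

data = scrape_companies(stub_fetch)
print(len(data))  # 3
```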

Data Analysis

At this point in the project, Rafael, one of our Data Scientists, joined the team. Rafael used Pandas to analyse the data that resulted from the scraping, and Matplotlib to plot simple graphs, as in the following example:

In a final step, the team separated the data into three different dataframes: companies, investments, and feeds.
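Splitting nested records into those three dataframes could look roughly like this; the record structure below is an invented stand-in for Tracxn's actual data.

```python
import pandas as pd

# Hypothetical scraped records, each nesting investments and news feeds.
records = [
    {"name": "startup-a", "sector": "IoT",
     "investments": [{"round": "Seed", "amount": 1.2}],
     "feeds": [{"title": "startup-a raises seed round"}]},
    {"name": "startup-b", "sector": "Health",
     "investments": [{"round": "Series A", "amount": 5.0}],
     "feeds": []},
]

# One flat dataframe per entity, joined on the company name.
companies = pd.DataFrame(
    [{"name": r["name"], "sector": r["sector"]} for r in records])
investments = pd.DataFrame(
    [{"name": r["name"], **i} for r in records for i in r["investments"]])
feeds = pd.DataFrame(
    [{"name": r["name"], **f} for r in records for f in r["feeds"]])

print(companies.shape, investments.shape, feeds.shape)
```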

Overall, the team was happy with the outcome, as they were able to identify relevant VCs for our daily work and projects at WATTx.

Building a face recognition system to automatically open our office door - KiwiPI meets OpenCV

KiwiPI - the automated lock system Rafal built back in August by combining a Raspberry Pi with KIWI (read more here) - was working great, but it wasn’t particularly fast: on average, it took 4.5 to 5 s to process data - that is, to take a picture of someone waiting at the door, detect and then encode their face, and recognize them as an employee (or as an intruder).

To improve KiwiPI's performance, Rafal came up with the idea of extracting the face-detection step and migrating it to OpenCV (Open Source Computer Vision Library). The results were impressive: after hacking around and adjusting his code, Rafal managed to reduce the processing time from almost 5 s to 0.4 s, so employees can enter the office even faster now!
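Because detection was one pluggable step in a longer pipeline, it could be swapped out without touching the rest of the system. The sketch below illustrates that idea with stubbed components (the lambdas stand in for the real camera frame, detector, face encoder, and employee database; none of this is Rafal's actual code).

```python
def recognize(frame, detect, encode, match):
    """Door pipeline: detect a face, encode it, match it to an employee."""
    faces = detect(frame)
    if not faces:
        return None           # nobody at the door
    return match(encode(faces[0]))

# Stub components for illustration; in KiwiPI, `detect` is the step that was
# migrated to a faster OpenCV-based detector.
detect = lambda frame: [frame]          # pretend exactly one face was found
encode = lambda face: hash(face)        # pretend face embedding
employees = {hash("rafal"): "Rafal"}    # pretend employee database
match = lambda emb: employees.get(emb, "intruder")

print(recognize("rafal", detect, encode, match))     # Rafal
print(recognize("stranger", detect, encode, match))  # intruder
```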

Building models to predict cargo ship arrival at port terminals

The problem many ports encounter nowadays is that they don’t know the exact arrival time of cargo ships. Ships usually arrive with big delays, causing efficiency losses across the whole value chain within port terminals. In fact, the arrival time of ships is one of the most important data points port terminals use to organize internal activity.

This hackathon project took place in the context of one of our big research topics: exploring the shipping industry. Christian from the Data Science team hacked around a set of data on cargo destinations and arrival times. He took the rich, openly available AIS (Automatic Identification System) data on container ships in order to model an individual ship’s whereabouts and calculate the ETA (estimated time of arrival) as well as the turnaround time at a given port.

Christian first filtered and pre-processed the available data to make machine learning possible, and then trained a basic model.
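A trained model like Christian's would be benchmarked against a naive baseline. One such baseline, sketched below, estimates the ETA directly from the ship's last AIS position and speed over ground as great-circle distance divided by speed; the coordinates and speed in the example are made up.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def eta_hours(ship_lat, ship_lon, port_lat, port_lon, speed_knots):
    """Naive ETA: straight-line distance over current speed."""
    distance_km = haversine_km(ship_lat, ship_lon, port_lat, port_lon)
    return distance_km / (speed_knots * 1.852)  # 1 knot = 1.852 km/h

# Example: a ship in the German Bight making 15 knots towards Hamburg.
print(round(eta_hours(54.5, 7.5, 53.55, 9.99, 15), 1))
```

A learned model improves on this by accounting for routes, congestion, and historical delays rather than assuming a straight line at constant speed.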

The results were positive for small batches of data, as you can see in the following example:

Predicted time with Christian’s model in comparison with real time of arrival

In a further iteration, Christian and the Data Science team will work on refining the model for bigger batches of data in order to optimise the estimated time of arrival.

Building our own WATTx Router

Another field of investigation at WATTx is how to develop security systems for connected devices and the Internet of Things (IoT). In this regard, our Software Engineer Marcin took on the challenge for the October hackathon of building a simple router (using the Quotom MiniPC as hardware platform). The functional goals were to allow devices on the local network to communicate with each other while also accessing the internet, and to tunnel all internet-bound traffic through a VPN (Virtual Private Network).

In the end, Marcin built two router versions: the first was capable of routing within the local network and also accessing the internet, while the second successfully worked as a VPN client.
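On Linux, the core of the first version boils down to enabling forwarding and NAT. The snippet below is a minimal sketch under the assumption that eth0 is the internet-facing interface and eth1 the LAN side; the interface names and rules are illustrative, not Marcin's actual configuration.

```shell
# Let the kernel forward packets between interfaces
sysctl -w net.ipv4.ip_forward=1

# NAT LAN traffic out through the uplink
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE

# Forward LAN -> internet, and allow established replies back in
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT
```

For the VPN-client version, once a tunnel interface (e.g. tun0) is up, pointing the MASQUERADE and FORWARD rules at it instead of eth0 sends all internet-bound traffic through the VPN.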

Monitoring for the WATTx Kubernetes cluster

As we run many of our services on the Google Cloud Platform, Wen felt we should also create a monitoring mechanism for the WATTx Kubernetes cluster. The problem he saw was that we lacked a monitoring tool that allowed checking resources (such as CPU, memory, network usage, etc.), custom metrics, and alerts in case of extraordinary events.

In his project, Wen upgraded the WATTx Kubernetes cluster (Kubernetes is an open-source system for automating deployment, scaling, and management of containerized applications) to the latest version, which made it possible to run a log aggregator. He then ran InfluxDB and Grafana (two platforms for analytics and monitoring) as additional services to store metrics data and to provide an easy-to-use frontend webpage.
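Deploying the two monitoring services onto a cluster could look roughly like the following kubectl sketch; the namespace and manifest file names are assumptions for illustration, not the exact setup Wen used.

```shell
# Keep monitoring components in their own namespace
kubectl create namespace monitoring

# InfluxDB stores the metrics, Grafana serves the dashboard frontend
kubectl apply -n monitoring -f influxdb-deployment.yaml
kubectl apply -n monitoring -f grafana-deployment.yaml

# Verify that both services came up
kubectl get pods -n monitoring
```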

As a result, fine-grained information on each application we run on the Google Cloud Platform is now visible.

From here, we can also enable the mechanism that sends out notifications in case something unusual happens - which he will test in the hackathons to come!

Extreme City Networking

First Iteration

During our September hackathon, Pedro, our IoT Engineer, took on the challenge of testing MIOTY, a wireless IoT platform developed by Fraunhofer, for its performance and range in an unfriendly environment (unfriendly meaning bad for radio-frequency transmission: many buildings, long distances such as from the office to the basement, thick walls, etc.). If MIOTY proved reliable, Pedro saw potential for applying it in smart-city environments, and for adding options to the offering of our venture Snuk.

The Setup

The Result

Given the rainy weather and the number of buildings standing between the MIOTY node and the base station, the outcomes were fairly impressive: Pedro achieved a range a little under 1 kilometer.

Second Iteration

During the October hackathon, Pedro continued his efforts with the goal of proving that MIOTY can be used instead of WiFi in places the latter can’t reach, such as the basement. He used a temperature and humidity sensor connected to a Raspberry Pi and the MIOTY node, which acted as the communication channel for the sensor node.
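On the Raspberry Pi side, the sensor-node logic amounts to reading the sensor, packing a compact payload, and handing it to the MIOTY node for transmission. The sketch below illustrates that shape; the sensor read is stubbed, and the payload format and transport are assumptions, not Pedro's actual setup.

```python
import struct

def read_sensor():
    """Stub standing in for a real temperature/humidity sensor read."""
    return 21.5, 48.0  # degrees Celsius, percent relative humidity

def build_payload(temperature, humidity):
    # Two little-endian 32-bit floats: 8 bytes per uplink message, which
    # suits the small payloads typical of low-power wide-area links.
    return struct.pack("<ff", temperature, humidity)

def send(payload, transport):
    """transport stands in for the link to the MIOTY node (e.g. serial)."""
    transport.append(payload)

link = []  # stand-in for the connection to the MIOTY node
send(build_payload(*read_sensor()), link)
print(len(link[0]))  # 8 bytes on the wire
```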

The Setup

The Result

In the second MIOTY testing round, Pedro was able to get positive results as well: Temperature and humidity data was sent from our office building’s basement up to the base station in our office (more than three floors, in a Berlin Altbau).