As part of our internal innovation activities, we organized the Oriflame Hackathon in November. The idea of a hackathon had been on the table for a long time, but we struggled to find the right topic and format for the event. That changed with a business requirement asking for the ability to search for products using an image.
We liked this topic because it combined the needs of our customers with the current trend in image recognition. Beyond the requirements, we decided to implement the assignment as a mobile application to explore the various possibilities of building mobile apps. For the event itself, we just needed to put together a team that would help with preparation and evaluation. In addition to our architects, we invited Michal Marušan and Valdemar Závadský from the Czech office of Microsoft to provide the initial training on AI in Azure and then mentor the individual teams.

On Tuesday, December 4, 12 participants gathered and formed 3 teams that worked on the assignment …. The Hackathon officially started at 11 am with a presentation by Michal Marušan from Microsoft, who introduced the cognitive services in Azure, so all the teams started from the same knowledge base they could build on. After about 30 minutes of presentation, the hacking began. The teams hacked until the evening hours, and then we moved to a restaurant for dinner. The next day started early in the morning with the final tuning of the trained models and application testing. At 12 o’clock, the hacking officially ended and it was up to the teams to present the results of their efforts.

The resulting applications were evaluated against the same benchmark test set. We verified that the products from the training set were detected reliably, and that products the model had never seen (the test set) were correctly matched to a known product based on the highest similarity.
- The first team implemented a Progressive Web Application (PWA) using React, with Azure Custom Vision as the image recognition tool. To improve the results, image segmentation was applied whenever the model could not find the product with precision above a given threshold (see the sketch after this list). The PWA turned out to be a good approach, so the team could focus more on tuning the model rather than on solving issues with the mobile app.
- The second team tried two different approaches to the mobile client, based on Xamarin and PowerApps. The model behind both was the same and was hosted on Azure Custom Vision. PowerApps was one of the biggest surprises of the Hackathon, as we discovered an easy, no-code way of building mobile applications.
- The third team also built their solution on top of Xamarin and Azure Custom Vision. Unlike the others, they focused on improving the results with Optical Character Recognition (OCR). They achieved great results with unknown products because their approach could handle them without prior knowledge of the given product: they read the text information from the product itself and used NLP together with a search of the Oriflame product database to suggest the most similar alternatives (a simplified sketch of the matching step follows below). The combination of Azure Custom Vision and OCR looked like a very promising solution for production use.
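All three teams relied on Azure Custom Vision for the classification itself. The following Python sketch shows roughly what a call to the Custom Vision prediction REST endpoint with a confidence threshold could look like, in the spirit of the first team's fallback logic; the endpoint, project ID, iteration name, key and threshold value are placeholders, and the teams' actual apps called the service from their respective mobile clients (the exact API version and URL shape may differ for your resource).

```python
# Sketch only: classify a product photo with an Azure Custom Vision prediction endpoint
# and report "not confident" below a threshold. All identifiers are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"   # placeholder
PROJECT_ID = "<project-guid>"                                      # placeholder
ITERATION = "<published-iteration-name>"                           # placeholder
PREDICTION_KEY = "<prediction-key>"                                # placeholder
THRESHOLD = 0.75                                                   # assumed value

def classify_product(image_bytes):
    url = (f"{ENDPOINT}/customvision/v3.0/Prediction/{PROJECT_ID}"
           f"/classify/iterations/{ITERATION}/image")
    headers = {"Prediction-Key": PREDICTION_KEY,
               "Content-Type": "application/octet-stream"}
    response = requests.post(url, headers=headers, data=image_bytes, timeout=10)
    response.raise_for_status()
    predictions = response.json()["predictions"]
    best = max(predictions, key=lambda p: p["probability"])
    if best["probability"] < THRESHOLD:
        # Below the threshold the first team re-ran the model on segmented crops;
        # that step is omitted from this sketch.
        return None, best["probability"]
    return best["tagName"], best["probability"]
```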
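The third team's OCR route can be sketched in a similar spirit: read the text printed on the packaging (for example with an OCR service) and match it against the product catalogue. The snippet below only illustrates the matching half, using a naive token-overlap score and a made-up two-item catalogue; the team's actual NLP and database search were considerably richer.

```python
# Illustrative only: match OCR text from a product photo against a product catalogue.
# The scoring is a naive token overlap; the team used a richer NLP + search approach.
import re

def tokenize(text):
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def suggest_similar(ocr_text, catalogue, top_n=3):
    """Return the top_n catalogue products whose names best overlap the OCR text."""
    query = tokenize(ocr_text)
    scored = []
    for product_id, name in catalogue.items():
        overlap = len(query & tokenize(name))
        scored.append((overlap, product_id, name))
    scored.sort(reverse=True)
    return [(pid, name) for score, pid, name in scored[:top_n] if score > 0]

# Example usage with a tiny, invented catalogue:
catalogue = {
    "12345": "Optimals Hydra Radiance Day Cream",
    "67890": "Giordani Gold Essenza Parfum",
}
print(suggest_similar("OPTIMALS hydra radiance day cream 50 ml", catalogue))
```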

Based on the great feedback from the participants and the rest of the company, we decided to hold another Hackathon, this time open to participants from outside Oriflame as well. Learn more during spring 2019.