At Snips, we are working on the next generation of context-aware interfaces for mobile devices.
By looking at the user's location patterns and enriching them with their personal, social and urban contexts, we can predict their intentions, and thus build apps that dynamically adapt to what they actually want to do. For speed, battery efficiency and privacy, most of the data processing and machine learning involved is performed directly on-device.
In this talk, we show how we work with Swift and ReactiveCocoa, and how this combination allows for fast prototyping and profiling of algorithms in a real-world setting, while ensuring a robust data-processing model that scales and can be carried into production.