Building Assistive Apps with App Actions

In 2017, Google announced app predictions: five apps suggested to you in the App Drawer with a 60% prediction rate. This year, they are taking it to the next level: predicting the action you are about to take at a specific moment, based on your current context and on usage patterns analyzed by machine learning algorithms that run locally on your phone. Such actions might be, for example, continuing a Spotify playlist when you plug in your headphones, or calling your best friend on Friday nights. In this talk, I'll show how developers can take advantage of this concept by declaring the actions their app supports, so that Google can index them and suggest them to users at the right time. I'll end by explaining how a building block of App Actions, Built-in Intents, can then be used to create a Conversational Action for the Google Assistant.
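As a rough illustration of the declaration step described above, here is a minimal sketch of an `actions.xml` resource registering a Built-in Intent. The deep-link URL scheme (`myapp://`) and the exact parameter names are assumptions for illustration; the real intent and parameter identifiers must be taken from Google's Built-in Intents catalog.

```xml
<!-- res/xml/actions.xml (sketch; identifiers below are illustrative) -->
<actions>
    <!-- Declare support for a music-playback Built-in Intent -->
    <action intentName="actions.intent.PLAY_MUSIC">
        <!-- Map the intent's parameters onto a deep link into the app;
             "myapp" and "artistName" are hypothetical names -->
        <fulfillment urlTemplate="myapp://play{?artistName}">
            <parameter-mapping
                intentParameter="artist.name"
                urlParameter="artistName" />
        </fulfillment>
    </action>
</actions>
```

The same Built-in Intent declaration is what later lets the Assistant route a matching user request (spoken or predicted) into the app's existing deep-link handling.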


elainedb

July 09, 2018