icon to launch the AR filters menu
› Scroll the menu until you find a filter you like
› Apply the filter, and repeat the process if it is not what you want
This interaction can break ongoing conversations because the user's attention must be directed to filter selection.
gestures and speech
› Users perform physical or on-screen gestures along with a speech command.
› The system immediately chooses the AR filter the user wants and automatically applies it to the screen.
› This is a collaboration project with Prof. Koji Yatani and IIS Lab at the University of Tokyo, supported by the Center for Robust Intelligence and Social Technology (CRIS).
what to apply.
› Physical gestures define the scope where an AR filter will be applied (hand pose estimation).
› On-screen gestures apply an AR filter to the chat partner (currently under development).
› Speech identifies which AR filter the user wants to apply (speech recognition).
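The division of labor above — speech names the filter, the gesture decides whom it is applied to — can be sketched as a small selection function. This is a minimal illustration with hypothetical names (`FILTERS`, `select_filter`, the `"self"`/`"partner"` scopes), not the project's actual implementation:

```python
# Hypothetical keyword-to-filter mapping; the real system's filter set
# and recognition pipeline are not specified in the source.
FILTERS = {"cat": "cat_ears", "dog": "dog_nose", "heart": "heart_eyes"}

def select_filter(transcript, gesture_scope):
    """Fuse a speech transcript with a gesture-derived scope.

    transcript: recognized speech, e.g. "give me the cat filter"
    gesture_scope: "self" or "partner", decided by the detected gesture
    Returns (filter_id, target), or None if no filter keyword was heard.
    """
    for keyword, filter_id in FILTERS.items():
        if keyword in transcript.lower():
            return filter_id, gesture_scope
    return None

# Speech picks the filter; the gesture picks the target.
print(select_filter("Give me the cat filter", "self"))  # ('cat_ears', 'self')
```

Because the two modalities carry independent information, neither alone interrupts the conversation: the user keeps talking while a quick gesture sets the scope.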