Slides presented at VLDB 2015; full paper PDF available at: http://www.vldb.org/pvldb/vol8/p1250-jiang.pdf
Video of SnapToQuery in action: http://go.osu.edu/snaptoquery
A critical challenge in the data exploration process is discovering and issuing the “right” query, especially when the space of possible queries is large. This problem of exploratory query specification is exacerbated by the use of interactive user interfaces driven by mouse, touch, or next-generation, three-dimensional, motion-capture-based devices, which are often imprecise due to jitter and sensitivity issues. In this paper, we propose SnapToQuery, a novel technique that guides users through the query space by “snapping” to the user’s likely intended queries, providing interactive feedback during query specification. These intended queries can be derived from prior query logs, or from the data itself, using methods described in this paper. To provide interactive response times over large datasets, we propose two data reduction techniques for snapping to these queries. Performance experiments demonstrate that our algorithms maintain an interactive experience while allowing for accurate guidance. User studies over three kinds of devices (mouse, touch, and motion capture) show that SnapToQuery helps users specify queries more quickly and accurately, resulting in a query specification time speedup of 1.4× for mouse- and touch-based devices and 2.2× for motion-capture-based devices.
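To give a flavor of the core “snapping” idea, here is a minimal sketch (not the paper’s implementation): likely intended queries are represented as points in a two-dimensional query-parameter space, and an imprecise cursor position is pulled to the nearest candidate when it falls within a snap radius. The candidate set, the Euclidean distance metric, and the `snap_radius` parameter are illustrative assumptions, not details taken from the paper.

```python
import math

def snap_to_query(cursor, candidates, snap_radius=10.0):
    """Snap an imprecise cursor position to the nearest likely
    intended query, if one lies within snap_radius.

    cursor      -- (x, y) position in query-parameter space
    candidates  -- (x, y) points for likely intended queries, e.g.
                   mined from prior query logs or from the data itself
    snap_radius -- illustrative threshold; not from the paper
    """
    best, best_dist = None, float("inf")
    for c in candidates:
        d = math.dist(cursor, c)
        if d < best_dist:
            best, best_dist = c, d
    # Snap only when a candidate is close enough; otherwise keep the
    # raw cursor position so the user retains full control.
    return best if best is not None and best_dist <= snap_radius else cursor

# Example: a jittery cursor near a logged query at (42, 17)
print(snap_to_query((40.5, 18.2), [(42, 17), (100, 60)]))  # -> (42, 17)
```

In an interactive setting, a function like this would run on every input event (mouse move, touch drag, or motion-capture frame), which is why the paper’s data reduction techniques matter for keeping response times interactive over large datasets.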