For developers working with custom views and touch handling for the first time, it can be difficult to decipher what happens when a user touches the screen. In this talk we'll trace through the steps Android takes to decide which view should handle a touch, and how the system differentiates between swipes and clicks. We'll start by visually examining how a view hierarchy is laid out and walk through the methods involved when the system receives a touch. Then we'll look at the different kinds of gestures Android recognizes and how the system ultimately decides to invoke the ubiquitous OnClickListener.
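
As a taste of what that tracing looks like, here is a minimal Kotlin sketch (the class name `LoggingFrameLayout` is hypothetical, and this is an illustration rather than code from the talk) that logs the three methods a ViewGroup participates in during touch dispatch, so you can watch a MotionEvent travel through the hierarchy in Logcat.

```kotlin
import android.content.Context
import android.util.AttributeSet
import android.util.Log
import android.view.MotionEvent
import android.widget.FrameLayout

// Hypothetical ViewGroup that logs each step of touch dispatch.
class LoggingFrameLayout @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null
) : FrameLayout(context, attrs) {

    // Entry point: the parent hands the MotionEvent to this ViewGroup first.
    override fun dispatchTouchEvent(ev: MotionEvent): Boolean {
        Log.d(TAG, "dispatchTouchEvent: ${MotionEvent.actionToString(ev.actionMasked)}")
        return super.dispatchTouchEvent(ev)
    }

    // Returning true here would steal the gesture from this ViewGroup's children.
    override fun onInterceptTouchEvent(ev: MotionEvent): Boolean {
        Log.d(TAG, "onInterceptTouchEvent: ${MotionEvent.actionToString(ev.actionMasked)}")
        return super.onInterceptTouchEvent(ev)
    }

    // Reached only if no child claimed the event, or interception took it back.
    override fun onTouchEvent(event: MotionEvent): Boolean {
        Log.d(TAG, "onTouchEvent: ${MotionEvent.actionToString(event.actionMasked)}")
        return super.onTouchEvent(event)
    }

    private companion object {
        const val TAG = "LoggingFrameLayout"
    }
}
```

Dropping a container like this anywhere in a layout makes the dispatch order visible: DOWN, MOVE, and UP actions appear in the log in the same sequence the framework offers them to each level of the hierarchy.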