Hello readers! iOS 8 is at the gates: only a few weeks remain until the official release of the updated version of the operating system, and along with it, the release of the Swift programming language. So, as you understand, we are preparing to enter a new era of the iOS SDK, where wonderful new technologies are waiting for us to explore them! However, here at Appcoda we decided to dedicate one more tutorial to the existing SDK, using the Objective-C language. My next tutorials will focus on new iOS 8 technologies, and we'll use the Swift language. For this one, there were many candidate topics to write about, but ultimately the chosen one is gesture recognizers. So, let's see a few things about them.
A gesture recognizer is actually an object of a concrete subclass of the abstract class UIGestureRecognizer. Such an object is attached to a view, and monitors for predefined gestures made on that view. Going one level deeper, I would say that gestures are actually touches and movements of one or more fingers that happen on a specific area of the screen, where a view of interest exists. In the early versions of the iOS SDK, gesture recognizers were not provided to developers, so implementing such ways of interaction required a lot of manual work. Thankfully, Apple wrapped up all that manual work and gave it to developers as a single tool, and in that way working with gestures became a really easy part of iOS programming.
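To get a first taste of the pattern we'll work with throughout this tutorial, here is a minimal sketch of creating a recognizer and attaching it to a view. The class name and the `handleTap:` action method are just placeholders of my own, not something predefined by UIKit:

```objective-c
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Create the recognizer, pointing it at the action method
    // that should run when the gesture is recognized.
    UITapGestureRecognizer *tapRecognizer =
        [[UITapGestureRecognizer alloc] initWithTarget:self
                                                action:@selector(handleTap:)];

    // Attach it to the view we want to monitor for taps.
    [self.view addGestureRecognizer:tapRecognizer];
}

// Called every time the tap gesture is recognized.
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    CGPoint location = [recognizer locationInView:self.view];
    NSLog(@"Tapped at: %@", NSStringFromCGPoint(location));
}

@end
```

The target–action pairing shown here is the core of the mechanism: the recognizer does all the touch tracking, and your code only deals with the moment the gesture is recognized.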