You may have used gesture recognisers in your Swift projects already, for example UIPinchGestureRecognizer for handling pinch-to-zoom or UIRotationGestureRecognizer for detecting a two-touch rotation. This post looks at creating bespoke gesture recognisers and illustrates the technique with a single-touch rotation gesture. The companion project for this post is available at my GitHub repository here.
The original code for the single touch rotation gesture comes from an ActionScript project I did back in 2008. Although I originally thought it would be useful for games, it could also be useful as a jog/shuttle control for a video application.
The UIGestureRecognizer base class allows us to decouple the logic for recognising and responding to user gestures from the view itself. Rather than adding this code directly to a view or view controller, delegating it to a UIGestureRecognizer subclass allows for code reuse and supports the single responsibility principle.
The Swift code for using an existing gesture recogniser, for example UILongPressGestureRecognizer, is simple in its most basic form: I create an instance of the recogniser, set its target to the object that I want to respond to the gesture, and define the method to invoke in response to it:
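A minimal sketch of that pattern might look like the following. The view controller, the handler method name and the printed message are my own illustrative choices, not taken from the companion project:

```swift
import UIKit

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Create the recogniser with a target (the object that responds to
        // the gesture) and an action (the method to invoke).
        let longPress = UILongPressGestureRecognizer(
            target: self,
            action: #selector(longPressHandler(_:)))

        // Attach the recogniser to the view it should monitor.
        view.addGestureRecognizer(longPress)
    }

    // Invoked repeatedly as the long press begins, moves and ends;
    // the recogniser's state distinguishes those phases.
    @objc func longPressHandler(_ recognizer: UILongPressGestureRecognizer) {
        if recognizer.state == .began {
            print("Long press began at \(recognizer.location(in: view))")
        }
    }
}
```

The target/action pair is the whole integration surface: the view knows nothing about how the gesture is detected, which is exactly the decoupling that makes a bespoke recogniser worthwhile.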