#269: Taptic Engine Basics πŸ‘‹

With the iPhone 7 and iPhone 7 Plus, Apple added a remarkable new piece of hardware to our devices. It's called the Taptic Engine and it is a big change from the old vibration feedback mechanisms we're used to. Today we'll look at how we can use it in our apps. Let's get started.

Haptic feedback on iOS is all about subtlety.

We want to help inform and/or guide the user, not annoy them. For these reasons, Apple has provided a few different classes we can use to generate haptic feedback that matches what we find around the system.

First up, UIImpactFeedbackGenerator. This is great for more prominent user interactions like when a part of our UI "snaps" into place (for example, imagine a drawing app that snaps when you drag over the exact center of the document).
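As a quick sketch (the "snap" moment here is hypothetical):

let impactGenerator = UIImpactFeedbackGenerator(style: .medium)

// later, when our drawing view "snaps" to the center of the document:
impactGenerator.impactOccurred()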

Next, UISelectionFeedbackGenerator. These are perfect for small, subtle changes in selection. We can preview this one by 3D Touching on an app's icon on the home screen (one with a few 3D Touch shortcuts). If we keep our finger down and drag between the menu items, we'll feel the slightest "click" as we change selection. Neat!

Last, we have UINotificationFeedbackGenerator. These are pretty heavy and are meant for things like errors or warnings. (Imagine a login form that uses this generator when the user enters an incorrect password).
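As another quick sketch (the failed-login check is just an illustration):

let notificationGenerator = UINotificationFeedbackGenerator()

// later, when our hypothetical login attempt fails:
notificationGenerator.notificationOccurred(.error)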

Now, let's try this out in code. First, we need to instantiate a generator and call its prepare function:

let generator = UISelectionFeedbackGenerator()
generator.prepare()

The prepare function is crucial here. It has to do with the physical hardware inside the device.

Basically, prepare will "wake up" the Taptic Engine hardware and put it into a state where it's ready to generate feedback immediately when we ask it to. iOS is relentless about saving battery and power. The Taptic Engine isn't always using "full" power, and only does so for a few seconds after the prepare function is called. Neat.

Things will still work if we don't call prepare, but the feedback may lag slightly behind the changes on screen (and keeping the two in sync is another crucial part of crafting effective haptic feedback).

Now, all we need to do to trigger the actual haptic to "play" is:

generator.selectionChanged()

Each generator has its own function for "playing" its haptic: impactOccurred, selectionChanged, and notificationOccurred, respectively.

Finally, let's look at a "complete" example using a gesture recognizer:

var generator: UISelectionFeedbackGenerator? = nil

@objc func panned(_ sender: UIPanGestureRecognizer) {
  switch sender.state {
  case .began:
    generator = UISelectionFeedbackGenerator()
    generator?.prepare()

  case .changed:
    // selectionChanged(translationPoint:) is our own helper that
    // returns true when the drag has moved to a new "selection".
    if selectionChanged(translationPoint: sender.translation(in: view)) {
      generator?.selectionChanged()

      // we call prepare again right after "playing",
      // to make sure the Taptic Engine is "ready" if the user
      // moves their finger again quickly.
      generator?.prepare()
    }

  case .cancelled, .ended, .failed:
    generator = nil

  default:
    break
  }
}

Success!