
In Bite #101 we started working on a custom camera view controller.

Today we'll complete it by adding a way for users to capture a photo and do something with it. We'll start by making capturing easy: we'll turn the whole screen into a capture button by adding a tap gesture recognizer to our view:

let tapGR = UITapGestureRecognizer(target: self, action: "capturePhoto:")
tapGR.numberOfTapsRequired = 1
view.addGestureRecognizer(tapGR)

Next, we want to ask our output to capture a still image. Before we can, we'll need an AVCaptureConnection.

Connections were already implicitly created for us by our session. They represent the conceptual pipes that move data between inputs and outputs.

We grab a connection and use it to ask our output to capture a still image, asynchronously:

func capturePhoto(tapGR: UITapGestureRecognizer) {
  guard let connection = output.connectionWithMediaType(AVMediaTypeVideo) else { return }
  connection.videoOrientation = .Portrait

  output.captureStillImageAsynchronouslyFromConnection(connection) { (sampleBuffer, error) in
    guard sampleBuffer != nil && error == nil else { return }

    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
    guard let image = UIImage(data: imageData) else { return }

    // hand the photo off to the user on the main queue
    dispatch_async(dispatch_get_main_queue()) {
      // e.g. present a UIActivityViewController with the image
    }
  }
}
In the closure, we do a quick safety check, then convert the CMSampleBuffer we're given first into an NSData, then into a UIImage.

Lastly, we'll use UIActivityViewController (covered in Bite #71) to allow the user to do something with their new photo.
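That presentation step might look something like this sketch, written in the same Swift 2-era syntax as the rest of this Bite (the `showActivityVC` helper name is hypothetical, not from the original project):

```swift
func showActivityVC(image: UIImage) {
  // Pass the captured photo as the only activity item;
  // the system offers Save, Share, Copy, etc. for free.
  let activityVC = UIActivityViewController(
    activityItems: [image],
    applicationActivities: nil
  )

  // UI work must happen on the main queue, since the
  // capture completion handler may call us from elsewhere.
  dispatch_async(dispatch_get_main_queue()) {
    self.presentViewController(activityVC, animated: true, completion: nil)
  }
}
```

Calling this at the end of our capture closure gives users an immediate way to save or share the photo they just took.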

Download the project we built in Bites #101 & #102 at j.mp/bite102