#115: Generating Thumbnails from Videos 📼

It's quite common for an app to need to display one or more thumbnails (small still-image previews) of what's in a video. However, depending on where the video is coming from, we might not have easy access to pre-made thumbnail(s) for it. Let's look at how we can use AVAssetImageGenerator to grab our own.

We start with a simple NSURL for the video; it can be local or remote. We'll create an AVAsset with it and pass that to a new AVAssetImageGenerator object. We'll also configure the generator to apply preferred track transforms so our thumbnails come out in the correct orientation.

import AVFoundation

let asset = AVAsset(URL: videoURL)
let durationSeconds = CMTimeGetSeconds(asset.duration)
let generator = AVAssetImageGenerator(asset: asset)

generator.appliesPreferredTrackTransform = true

let time = CMTimeMakeWithSeconds(durationSeconds / 3.0, 600)

generator.generateCGImagesAsynchronouslyForTimes([NSValue(CMTime: time)]) {
  (requestedTime, thumbnail, actualTime, result, error) in
  guard let thumbnail = thumbnail where result == .Succeeded else { return }

  // The completion handler isn't called on the main queue,
  // so hop over before touching any UI.
  dispatch_async(dispatch_get_main_queue()) {
    self.videoThumbnailImageView.image = UIImage(CGImage: thumbnail)
  }
}

Then we just create a new CMTime that describes at which point in the video we'd like to capture a thumbnail image (we use 1/3 of the way through here). Finally, we kick off the actual thumbnail generation process and wait for it to complete. We only requested a single frame here, but we can ask for as many as our app needs. The closure will be called once per generated thumbnail.
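
For example, here's a rough sketch (not from the original code above) of requesting thumbnails at several points in the video, assuming a hypothetical thumbnails array property to collect the results:

let times = [0.25, 0.5, 0.75].map {
  NSValue(CMTime: CMTimeMakeWithSeconds(durationSeconds * $0, 600))
}

generator.generateCGImagesAsynchronouslyForTimes(times) {
  (requestedTime, cgImage, actualTime, result, error) in
  guard let cgImage = cgImage where result == .Succeeded else { return }

  // Called once per requested time, possibly on a background queue.
  dispatch_async(dispatch_get_main_queue()) {
    self.thumbnails.append(UIImage(CGImage: cgImage))
  }
}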

Today we'll take a look at how to record audio from the microphone on a user's device. Let's get started.

The first thing we'll need is an Audio Session. This will be a singleton (Bite #4), and we'll also create a property to hold our recorder:

import AVFoundation

class RecordViewController : UIViewController {
  let session = AVAudioSession.sharedInstance()
  var recorder: AVAudioRecorder?
}

Next, we'll create a function to start recording:

func beginRecording() {
  session.requestRecordPermission { granted in
    guard granted else { return }

    do {
      try self.session.setCategory(AVAudioSessionCategoryPlayAndRecord)
      try self.session.setActive(true)

      let recordingFileName = "recording.caf"

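      // documentsDirectoryURL() is a small helper (not shown here) that
      // returns the app's Documents directory URL.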
      guard let recordingURL = documentsDirectoryURL()?
        .URLByAppendingPathComponent(recordingFileName) else { return }

      let settings: [String : AnyObject] = [
        AVEncoderAudioQualityKey: AVAudioQuality.High.rawValue,
        AVSampleRateKey: 12000.0,
        AVNumberOfChannelsKey: 1,
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC)
      ]

      self.recorder = try AVAudioRecorder(
        URL: recordingURL,
        settings: settings
      )

      self.recorder?.record()
    } catch { }
  }
}

We'll first need to request permission from the user to record audio. Once granted, we'll try to set our Audio Session's category to PlayAndRecord, the category Apple suggests for apps that simultaneously record and play back audio.

We'll create a place for our recording to live, then assemble the settings dictionary for our recorder. We instantiate and store our AVAudioRecorder object, then tell it to start recording. Later, we'll call .stop() on it to stop recording. We can also optionally wire up a delegate to get completion callbacks.
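
Here's a rough sketch of what stopping might look like, assuming we set recorder.delegate = self after creating it (and declared AVAudioRecorderDelegate conformance):

func finishRecording() {
  recorder?.stop()
  do { try session.setActive(false) } catch { }
}

// AVAudioRecorderDelegate
func audioRecorderDidFinishRecording(recorder: AVAudioRecorder, successfully flag: Bool) {
  print("Finished recording. Success: \(flag)")
}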

Finally, we can play back our file using AVAudioPlayer:

do {
  let audioPlayer = try AVAudioPlayer(contentsOfURL: recordingURL)
  audioPlayer.prepareToPlay()
  audioPlayer.play()
} catch { }

#105: Media Library Basics 🎵

The Media Player Framework is how we can interact with the media library on a user's device. Today we'll look at the basics and build a fun little personal-use app to play some of our own favorite albums. Let's get started.

We'll start with a simple UICollectionViewController. It'll have a flow layout and we'll size the items so they make 2 equal columns:

override func viewDidLayoutSubviews() {
  super.viewDidLayoutSubviews()
  guard let flowLayout = collectionView?.collectionViewLayout as? UICollectionViewFlowLayout else { return }

  squareSize = view.bounds.size.width / 2.0
  flowLayout.itemSize = CGSizeMake(squareSize, squareSize)
}

In viewWillAppear we'll retrieve the albums we want to display.

We'll use the Async library (Bite #35) to make hopping queues easier:

Async.background {
  self.albums = (MPMediaQuery.albumsQuery().collections ?? [])
}.main { self.collectionView?.reloadData() }

When dequeuing cells, we grab the artwork from the representative item of each album. We use our squareSize so it displays nice and crisp:

let album = albums[indexPath.item]
let item = album.representativeItem!
let artwork = item.valueForProperty(MPMediaItemPropertyArtwork) as! MPMediaItemArtwork
cell.imageView.image = artwork.imageWithSize(CGSizeMake(squareSize, squareSize))

Now we'll personalize our app. We'll add a filter to our MPMediaQuery so it only returns some of the albums in our library:

query.addFilterPredicate(
  MPMediaPropertyPredicate(
    value: "Bad Religion",
    forProperty: MPMediaItemPropertyArtist,
    comparisonType: .Contains
  )
)

We chose to filter to just a single artist here, but MPMediaPropertyPredicate is quite powerful. We can supply multiple filter predicates and filter on many different properties.

Finally, when the user selects a cell, we'll grab the system's music player and play the selected album:

let album = albums[indexPath.item]

let player = MPMusicPlayerController.systemMusicPlayer()
player.setQueueWithItemCollection(album)
player.play()

Success!

This is just the beginning of what's possible with the Media Player Framework. Be sure to download the project at j.mp/bite105 and try it for yourself!

In Bite #101 we started working on a custom camera view controller.

Today we'll complete it by adding a way for users to capture a photo and do something with it. We'll start by making it easy to use. We'll make the whole screen a capture button by adding a tap gesture recognizer to our view:

let tapGR = UITapGestureRecognizer(target: self, action: "capturePhoto:")
tapGR.numberOfTapsRequired = 1
view.addGestureRecognizer(tapGR)

Next, we want to ask our output to capture a still image. Before we can, we'll need an AVCaptureConnection.

Connections were already implicitly created for us by our session. They represent the conceptual pipes that move data between inputs and outputs.

We grab a connection and use it to ask our output to capture a still image, asynchronously:

func capturePhoto(tapGR: UITapGestureRecognizer) {
  guard let connection = output.connectionWithMediaType(AVMediaTypeVideo) else { return }
  connection.videoOrientation = .Portrait

  output.captureStillImageAsynchronouslyFromConnection(connection) { (sampleBuffer, error) in
    guard sampleBuffer != nil && error == nil else { return }

    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
    guard let image = UIImage(data: imageData) else { return }

    self.presentActivityVCForImage(image)
  }
}

In the closure, we'll do a safety check then convert the CMSampleBuffer we've been given into an NSData then a UIImage.

Lastly, we'll use UIActivityViewController (covered in Bite #71) to allow the user to do something with their new photo.
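
That helper isn't shown above, but a minimal version might look something like this (hopping to the main queue, since the capture completion handler isn't guaranteed to arrive there):

func presentActivityVCForImage(image: UIImage) {
  dispatch_async(dispatch_get_main_queue()) {
    let activityVC = UIActivityViewController(
      activityItems: [image],
      applicationActivities: nil
    )

    self.presentViewController(activityVC, animated: true, completion: nil)
  }
}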

Download the project we built in Bites #101 & #102 at j.mp/bite102

We looked at allowing our users to capture photos/videos using UIImagePickerController in Bite #83. Now we'll take things to the next level by starting to create our own custom camera view controller. Today we'll get all the plumbing wired up and get the preview on the screen. Let's get started.

We'll start with the "single view" template in Xcode. There are a number of different objects we'll need to set up and glue together, so we'll go into our view controller and add a function called setupSession. We'll call this in viewWillAppear(animated:).

func setupSession() {
  session = AVCaptureSession()
  session.sessionPreset = AVCaptureSessionPresetPhoto

  let camera = AVCaptureDevice
    .defaultDeviceWithMediaType(AVMediaTypeVideo)

  do { input = try AVCaptureDeviceInput(device: camera) } catch { return }

  output = AVCaptureStillImageOutput()
  output.outputSettings = [ AVVideoCodecKey: AVVideoCodecJPEG ]

  guard session.canAddInput(input) && session.canAddOutput(output) else { return }

  session.addInput(input)
  session.addOutput(output)

  previewLayer = AVCaptureVideoPreviewLayer(session: session)
  previewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
  previewLayer!.frame = view.bounds
  previewLayer!.connection?.videoOrientation = .Portrait

  view.layer.addSublayer(previewLayer!)

  session.startRunning()
}

First we'll instantiate an AVCaptureSession. It's sort of the central hub of all this. We can configure it with a number of different presets. We'll use a preset for taking high quality still photos. Now, our session needs some inputs and outputs.

We'll use defaultDeviceWithMediaType and pass in AVMediaTypeVideo to get the default hardware device for capturing and recording images on the user's device (usually the back camera). Then we'll try to create an AVCaptureDeviceInput from the device. Next up, an output.

Capture sessions can return us data in all sorts of interesting ways: Still images, videos, raw pixel data, and more. Here we'll set up an AVCaptureStillImageOutput and ask it for JPEG photos. We'll do one more safety check then add both our input and output to our session.

Finally, let's display our camera so the user can see what they're photographing.

We'll pass our session into a new AVCaptureVideoPreviewLayer and add it to our view. Then we just need to start the session. If we run the app, we'll see it's starting to look like a camera. Neat!

Tomorrow, we'll finish up by adding the ability to actually capture some photos.

Update: Part 2 (Bite #102) is covered above.

#83: UIImagePickerController Basics 📷

UIImagePickerController has been part of iOS since its first release, and it has evolved quite a bit over the years. Let's take a look at what it can do:

  • Capture images and videos
  • Choose images and videos from the Photos library
  • Crop images after choosing/capturing
  • Trim videos after choosing/capturing

Whew! That's quite a bit of functionality packed into this one class.

We can't cover all of that in this Bite, instead let's look at a simple example use case. We'll be letting our users take a photo, crop it, and then show how to access it for use in our app.

The first step is to find out whether the device we're running on has a camera and can take photos. Then we'll configure the UIImagePickerController and present it.

import MobileCoreServices

let types = UIImagePickerController.availableMediaTypesForSourceType(.Camera)!
let canTakePhotos = types.contains(kUTTypeImage as String)

if UIImagePickerController.isSourceTypeAvailable(.Camera) && canTakePhotos {
  let ipc = UIImagePickerController()

  ipc.sourceType = .Camera
  ipc.mediaTypes = [kUTTypeImage as String]
  ipc.allowsEditing = true
  ipc.delegate = self

  presentViewController(ipc, animated: true, completion: nil)
}

Then we'll add a function from UIImagePickerControllerDelegate, where we'll get a userInfo dictionary. We'll use the values inside to extract the captured image in either its original or cropped form. We can also access a few other details, like the crop rect as a CGRect or the image's metadata as a dictionary.

Note that we'll need to declare conformance to the UINavigationControllerDelegate protocol since UIImagePickerController is actually a subclass of UINavigationController under the hood.

class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate {
  func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: AnyObject]) {
    let uncroppedImage = info[UIImagePickerControllerOriginalImage] as? UIImage
    let croppedImage = info[UIImagePickerControllerEditedImage] as? UIImage
    let cropRect = (info[UIImagePickerControllerCropRect] as? NSValue)?.CGRectValue()

    picker.dismissViewControllerAnimated(true, completion: nil)
  }

  func imagePickerControllerDidCancel(picker: UIImagePickerController) {
    picker.dismissViewControllerAnimated(true, completion: nil)
  }
}

#67: CloudKit Assets ⛅️

One of the best parts of CloudKit is how great it is at handling not just our models, but also larger assets like images, audio, or video.

Assets are saved just like any other property. Here we'll attach an image captured from the user's camera to a new record. Then we'll upload it to CloudKit using a CKModifyRecordsOperation (covered in more detail in Bite #31). In our case we're only saving a single record, but we're using an operation anyway, so we can take advantage of its perRecordProgressBlock, and track the upload progress of our asset.

func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
  guard let mediaURL = info[UIImagePickerControllerMediaURL] as? NSURL else { return }

  let spaceshipRecord = CKRecord(recordType: "Spaceship")

  spaceshipRecord["model"] = "Tantive IV"
  spaceshipRecord["maxSpeed"] = 950 // in km
  spaceshipRecord["image"] = CKAsset(fileURL: mediaURL)

  let operation = CKModifyRecordsOperation(recordsToSave: [spaceshipRecord], recordIDsToDelete: nil)

  operation.perRecordProgressBlock = { _, progress in
    // These blocks aren't called on the main queue, so hop over before touching UIKit.
    dispatch_async(dispatch_get_main_queue()) {
      self.progressView.progress = Float(progress)
    }
  }

  operation.completionBlock = {
    dispatch_async(dispatch_get_main_queue()) { self.progressView.hidden = true }
  }

  progressView.hidden = false

  CKContainer.defaultContainer().publicCloudDatabase.addOperation(operation)
}

It's worth noting that CloudKit doesn't seem to report progress constantly as we might expect. It seems to report between 0-3 times depending on the size of the asset we're uploading.

After fetching a record containing an asset, we can grab the downloaded file from disk using the fileURL property of the CKAsset:

let asset = spaceshipRecord["image"] as! CKAsset

imageView.image = UIImage(
  contentsOfFile: asset.fileURL.path!
)
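
For completeness, here's a rough sketch of what that fetch might look like, assuming we held on to the record's ID after saving (the spaceshipRecordID below is hypothetical):

let database = CKContainer.defaultContainer().publicCloudDatabase

database.fetchRecordWithID(spaceshipRecordID) { record, error in
  guard let record = record where error == nil else { return }

  let asset = record["image"] as! CKAsset

  dispatch_async(dispatch_get_main_queue()) {
    self.imageView.image = UIImage(contentsOfFile: asset.fileURL.path!)
  }
}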

#25: Picture in Picture 📺

One of the coolest new features in iOS 9 is the new Picture in Picture functionality on iPad. This lets users watch video content from an app even while it's in the background.

To support it in our app, we'll first make sure to set the Playback audio category in our application(_:didFinishLaunchingWithOptions:) function:

do {
  try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch { }

Then we'll use AVPlayerViewController to play video content. Picture in Picture mode will automatically kick in if our app enters the background, but only if: 1) our player is full screen, 2) video content is playing in it, and 3) Picture in Picture is supported on the device.
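
As a rough sketch (assuming a videoURL pointing at our content), presenting the player might look like this:

import AVKit

let playerVC = AVPlayerViewController()

playerVC.player = AVPlayer(URL: videoURL)
playerVC.delegate = self

presentViewController(playerVC, animated: true) {
  playerVC.player?.play()
}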

Next we'll implement this wonderfully long delegate method to restore our player UI when the user returns from Picture in Picture mode:

func playerViewController(playerViewController: AVPlayerViewController, restoreUserInterfaceForPictureInPictureStopWithCompletionHandler completionHandler: (Bool) -> Void) {
  navigationController?.presentViewController(playerViewController, animated: true) {
    completionHandler(true)
  }
}

More About PIP

  • If we need to support a generic AVPlayerLayer, AVKit also includes a new AVPictureInPictureController (sketched below).

  • We also get PIP for free in WKWebView assuming our app has the Playback audio session category set.
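
Here's a rough sketch of that manual route, assuming we already have an AVPlayerLayer (playerLayer) on screen and keep the controller around in a property:

import AVKit

if AVPictureInPictureController.isPictureInPictureSupported() {
  pipController = AVPictureInPictureController(playerLayer: playerLayer)
  pipController?.delegate = self

  // Typically called in response to a user action (e.g. tapping a PIP button).
  pipController?.startPictureInPicture()
}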