Playing audio is an important part of many apps. One common trick is to fade in the volume of audio playback so we don't surprise or startle the user. This year, Apple has made this much simpler to implement using AVAudioPlayer. Let's take a look.
First we'll set up a standard AVAudioPlayer, and begin playing it at a volume of 0:
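Here's a sketch of how that can look using the setVolume(_:fadeDuration:) function Apple added to AVAudioPlayer in iOS 10. The file name and two-second duration are just example choices:

import AVFoundation

// "intro.m4a" is a placeholder for whatever audio file we want to play.
// In real code we'd store the player in a property so it isn't deallocated.
guard let url = Bundle.main.url(forResource: "intro", withExtension: "m4a"),
  let player = try? AVAudioPlayer(contentsOf: url) else { return }

player.volume = 0
player.play()

// iOS 10's setVolume(_:fadeDuration:) handles the volume ramp for us
player.setVolume(1.0, fadeDuration: 2.0)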
Ever since 1983 when Matthew Broderick's IMSAI 8080 began speaking out loud, we've dreamed of computers that can have conversations with us.
In iOS 9, Apple added the ability to synthesize speech using the high-quality 'Alex' voice. Sadly it's only available on US devices for now, but that's sure to change. Let's try it out:
guard let voice = AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex) else { return }

let synth = AVSpeechSynthesizer()
synth.delegate = self

let utter = AVSpeechUtterance(string: "Would you like to play a game?")
utter.voice = voice

synth.speakUtterance(utter)
We start by making sure 'Alex' is available, then we make a new synthesizer. Next, we create an AVSpeechUtterance and set its voice. Then we simply tell the synthesizer to speak. Very cool.
Even cooler, we can implement one of the optional functions of AVSpeechSynthesizerDelegate to get live progress callbacks as each word is spoken. Neat!
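Here's a rough sketch of what that can look like. The willSpeakRangeOfSpeechString callback hands us the range of characters about to be spoken; what we do with it (printing, here) is up to us:

func speechSynthesizer(synthesizer: AVSpeechSynthesizer, willSpeakRangeOfSpeechString characterRange: NSRange, utterance: AVSpeechUtterance) {
  // Grab the word that's about to be spoken so we could, say, highlight it on screen
  let word = (utterance.speechString as NSString).substringWithRange(characterRange)
  print("Speaking: \(word)")
}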
We'll first need to request permission from the user to record audio. Once it's granted, we'll try to set our Audio Session's category to PlayAndRecord, the category Apple suggests for apps that record and play back audio simultaneously.
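Something like this (a minimal sketch, with error handling trimmed for brevity):

import AVFoundation

let session = AVAudioSession.sharedInstance()

session.requestRecordPermission { granted in
  guard granted else { return }

  do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setActive(true)
  } catch { print("Couldn't configure the audio session.") }
}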
We'll create a place for our recording to live, then assemble the settings dictionary for our recorder. We instantiate and store our AVAudioRecorder object, then tell it to start recording. Later, we'll call .stop() on it to stop recording. We can also optionally wire up a delegate to get completion callbacks.
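Here's a sketch of those steps. The file name, AAC format, and sample rate are just example choices:

let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory())
  .URLByAppendingPathComponent("recording.m4a")

let settings: [String: AnyObject] = [
  AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
  AVSampleRateKey: 44100.0,
  AVNumberOfChannelsKey: 2
]

do {
  // 'recorder' is a property on our view controller so it isn't deallocated
  recorder = try AVAudioRecorder(URL: fileURL, settings: settings)
  recorder.delegate = self // optional, for completion callbacks
  recorder.record()
} catch { print("Couldn't start recording.") }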
Finally, we can play back our file using AVAudioPlayer:
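Assuming the fileURL from before, that might look like:

do {
  // Again, 'player' should be a property so playback isn't cut short
  player = try AVAudioPlayer(contentsOfURL: fileURL)
  player.play()
} catch { print("Couldn't play the recording.") }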
In Bite #101 we started working on a custom camera view controller.
Today we'll complete it by adding a way for users to capture a photo and do something with it. We'll start by making it easy to use. We'll make the whole screen a capture button by adding a tap gesture recognizer to our view:
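A sketch of that wiring, where capturePhoto is a hypothetical function we'll fill in to actually grab the image:

// in viewDidLoad()
let recognizer = UITapGestureRecognizer(target: self, action: "capturePhoto")
view.addGestureRecognizer(recognizer)

func capturePhoto() {
  // We'll capture a still image from our output here
}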
We looked at allowing our users to capture photos/videos using UIImagePickerController in Bite #83. Now we'll take things to the next level by starting to create our own custom camera view controller. Today we'll get all the plumbing wired up and get the preview on the screen. Let's get started.
We'll start with the "single view" template in Xcode. There are a number of different objects we'll need to set up and glue together, so we'll go into our view controller and add a function called setupSession. We'll call this in viewWillAppear(animated:).
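The shell might look something like this, with the body of setupSession filled in as we go:

override func viewWillAppear(animated: Bool) {
  super.viewWillAppear(animated)
  setupSession()
}

func setupSession() {
  // We'll build up our session, input, and output here
}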
First we'll instantiate an AVCaptureSession. It's sort of the central hub of all this. We can configure it with a number of different presets. We'll use a preset for taking high quality still photos. Now, our session needs some inputs and outputs.
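A sketch of that, with the session held as a property so it lives beyond setup:

session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetPhoto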
We'll use defaultDeviceWithMediaType and pass in the video media type to get the default hardware device for capturing and recording images on the user's device (usually the back camera). Then we'll try to create an AVCaptureDeviceInput from the device. Next up, an output.
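A sketch of that, bailing out if the input can't be created:

let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

guard let input = try? AVCaptureDeviceInput(device: device) else { return }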
Capture sessions can return us data in all sorts of interesting ways: Still images, videos, raw pixel data, and more. Here we'll set up an AVCaptureStillImageOutput and ask it for JPEG photos. We'll do one more safety check then add both our input and output to our session.
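Something like:

let output = AVCaptureStillImageOutput()
output.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

// The safety check: only proceed if the session accepts both
guard session.canAddInput(input) && session.canAddOutput(output) else { return }

session.addInput(input)
session.addOutput(output)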
Finally, let's display our camera so the user can see what they're photographing.
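A sketch of that last step (the resize-aspect-fill video gravity is our own choice here):

let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = view.bounds
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
view.layer.addSublayer(previewLayer)

session.startRunning()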
We'll pass our session into a new AVCaptureVideoPreviewLayer and add it to our view. Then we just need to start the session. If we run the app, we'll see it's starting to look like a camera. Neat!
Tomorrow, we'll finish up by adding the ability to actually capture some photos.