#112: Layout Guides & Anchors ⚓️

Layout Guides & Anchors were added in iOS 9/OS X 10.11 as a more convenient way to add Auto Layout constraints. Let's dive in.

Layout Guides let us create sort of "invisible boxes" that exist solely for the purpose of layout. Instead of adding "spacer" views or using other such tricks, we can add some layout guides to a view, then constrain them. It's a nice way to avoid needless performance costs (like rendering dummy "spacer" views), and the code itself ends up being much more readable:

let textContainerGuide = UILayoutGuide()
view.addLayoutGuide(textContainerGuide)

Note: Layout Guides are defined in code using UILayoutGuide on iOS and NSLayoutGuide on OS X.

Anchors are best explained with an example. They allow us to turn verbose code like these manually configured constraints:

NSLayoutConstraint(item: someView,
  attribute: .Leading,
  relatedBy: .Equal,
  toItem: anotherView,
  attribute: .LeadingMargin,
  multiplier: 1.0,
  constant: 0.0
).active = true

NSLayoutConstraint(item: someView,
  attribute: .Trailing,
  relatedBy: .Equal,
  toItem: anotherView,
  attribute: .TrailingMargin,
  multiplier: 1.0,
  constant: 0.0
).active = true

...into the much shorter, more readable lines shown below. The new code reads naturally from left to right, and is much easier to scan and reason about.

let margins = anotherView.layoutMarginsGuide

someView.leadingAnchor.constraintEqualToAnchor(margins.leadingAnchor).active = true
someView.trailingAnchor.constraintEqualToAnchor(margins.trailingAnchor).active = true

UIView, NSView, and UILayoutGuide all have lots of new anchor properties available to make adding constraints to them much simpler than before. We also get a little extra type safety as a bonus.
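
Since UILayoutGuide exposes these same anchors, we can combine the two ideas. Here's a small sketch reusing the textContainerGuide from earlier (the height value is arbitrary):

let margins = view.layoutMarginsGuide

// Pin the guide to the view's margins and give it a fixed height.
textContainerGuide.leadingAnchor.constraintEqualToAnchor(margins.leadingAnchor).active = true
textContainerGuide.trailingAnchor.constraintEqualToAnchor(margins.trailingAnchor).active = true
textContainerGuide.topAnchor.constraintEqualToAnchor(margins.topAnchor).active = true
textContainerGuide.heightAnchor.constraintEqualToConstant(120.0).active = true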

#111: Reachability.swift 📡

Mobile devices can lose or change connectivity state suddenly and unexpectedly. It's important for us to make our apps respond well and degrade gracefully when this occurs.

Reachability describes how "reachable" the Internet is at a given time. Apple has long offered sample code for dealing with Reachability, but the process has always been a bit of a nuisance.

Today we'll look at a great little library from Ashley Mills called Reachability.swift that aims to make this less painful. Let's begin.

We'll start by trying to create a new Reachability object:

let reachability: Reachability

do {
  reachability = try Reachability.reachabilityForInternetConnection()
} catch {
  print("ERROR: Unable to create Reachability")
  assumeConnectivity() // degrade gracefully if we can't even create one
  return
}

Then we'll set some closures on it that will be called when the Reachability state changes. These will be called on a background queue, so we'll hop to the main queue before updating any UI.

reachability.whenReachable = { reachability in
  dispatch_async(dispatch_get_main_queue()) { /* TODO */ }
}

reachability.whenUnreachable = { reachability in
  dispatch_async(dispatch_get_main_queue()) { /* TODO */ }
}

Now, we can tell the object to start listening for connectivity changes:

do { try reachability.startNotifier() } catch {
  print("ERROR: Unable to start Reachability notifier")
  assumeConnectivity()
}

Then stop it later with:

reachability.stopNotifier()

It's important to consider whether our app even needs this functionality.

If possible, we should try to avoid degrading the experience at all when connectivity goes away. If this isn't possible, for example if our app is streaming live video, we should respond to the change in Reachability, and tell the user why playback was interrupted.

Note: We looked at the simple closure syntax here, but Reachability.swift has great support for NSNotificationCenter notifications as well as Wifi vs. Cellular detection.
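Here's a rough sketch of the notification flavor (the ReachabilityChangedNotification name and the isReachableViaWiFi() helper are from the library's README at the time of writing, so double-check them against the version in use):

NSNotificationCenter.defaultCenter().addObserver(
  self,
  selector: "reachabilityChanged:",
  name: ReachabilityChangedNotification,
  object: reachability
)

func reachabilityChanged(notification: NSNotification) {
  guard let reachability = notification.object as? Reachability else { return }

  if reachability.isReachableViaWiFi() {
    // plenty of bandwidth: prefetch, load full-size images, etc.
  } else {
    // cellular or offline: tread lightly
  }
}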

More info about Reachability.swift can be found at git.io/reachability

Weekly Sponsor: Imgix 🖼

A huge continued thanks to imgix for sponsoring this week's Bites! imgix is real-time image resizing as a service. It allows us to resize, crop, and process images on the fly, simply by changing their URLs. Let's take a look.

By now we've learned all about how imgix has client libraries for all sorts of languages and environments, and how we might not even need one depending on our use case. That's part of what makes imgix so awesome: we can play around with sizes, crops, and effects, even drawing text or watermarks on images, just by changing some URL parameters. imgix uses our existing image storage locations as sources (public web folders, S3 buckets, and Web Proxies are all supported). We can configure multiple image sources, set caching times, and set custom domains on their site. Best of all, response times for images average around 70ms!

imgix is packed with features; here's a handy one: cropping images to a circle. It may sound trivial, but doing this processing on the server side, and retrieving the avatar as a pre-flattened image in our UI, can be a huge performance win.

GET /avatars/123.jpg?mask=ellipse&fit=crop&h=200&w=200

If we combine this trick with what we learned last week about cropping images around faces, we have a pretty great user avatar system with nothing but a few query string parameters. Not too shabby!
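Combining the two might look something like this (an illustrative URL mixing the ellipse mask with imgix's face-cropping parameters):

GET /avatars/123.jpg?crop=faces&fit=facearea&facepad=1.5&mask=ellipse&h=200&w=200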

One last fun (and very useful) imgix feature before we go: text. Yep, we can easily add completely customized text to any image (with full emoji support, of course!). Let's try it out:

GET /hairforce1.jpg?fit=crop
 &w=204
 &h=142
 &txt=WATCH+THE+HAIR+%F0%9F%92%87
 &txtfont=Impact
 &txtclr=fff
 &txtline=2
 &txtlineclr=000
 &txtfit=max
 &txtsize=24
 &txtalign=center%2Cbottom
 &txtpad=6

Stop maintaining your Frankenstein ImageMagick setup and let imgix handle it. It's free to get started. Use the link littlebitesofcocoa.com/imgix so they know you heard about them here.

#110: Snapshot 📸

Fastlane is a collection of incredibly helpful tools built by Felix Krause.

This week, Twitter announced that Fastlane would be joining their already-great Fabric suite of tools. Congrats Felix! 🎉

We'll be taking a look at some of Fastlane's tools individually in the coming weeks. Today we'll begin with snapshot.

If we're being good localization citizens in our apps, we could have something like 20 or more languages that we support. We also want to market our app well, so we use all 5 screenshot slots on the App Store. Oh, and we support lots of different iOS devices. Ack!

This all adds up to potentially hundreds of new screenshots to capture and upload for each release. We'd better make sure the status bar looks clean, that we don't accidentally capture any loading spinners, etc. Eesh. That sounds like a whole lot of time spent not building apps. Let's see how we can use snapshot to save our sanity:

snapshot uses Xcode 7's UI Testing functionality (Bite #30) to help us automate our apps and specify exactly when to capture screenshots. It also makes sure they look great with presentable status bars, etc.

Let's use the springboard app we created in Bite #104 to try it out.

We'll install the snapshot gem, then head into our app's main directory and run snapshot init.
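For reference, that's just a couple of terminal commands (assuming a working Ruby setup, and with the path standing in for wherever our app lives):

gem install snapshot
cd path/to/our-app
snapshot init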

This will create a SnapshotHelper.swift file we'll need to add to the UI Test target in our project.

Instead of calling launch on XCUIApplication() directly, we need to plug snapshot into the mix by calling setLanguage on our app, then launching it. In our test, we'll snap a screenshot of the initial screen and give it a name.

class SnapshotTests: XCTestCase {
  override func setUp() {
    super.setUp()

    continueAfterFailure = false

    let app = XCUIApplication()
    setLanguage(app)
    app.launch()
  }

  func testExample() {
    snapshot("01MainScreen")
  }
}

Now we just head into our app's directory and run snapshot.

Our screenshots will be captured, and saved to disk. We can check out the results in Finder.

To configure the list of devices and languages to capture, we can add the Snapfile (created earlier when we ran snapshot init) to our project as well, and customize it to our needs.
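
A Snapfile is plain Ruby. A minimal one might look something like this (the device and language lists here are just examples):

devices([
  "iPhone 4s",
  "iPhone 6",
  "iPhone 6 Plus"
])

languages([
  "en-US",
  "de-DE",
  "ja"
])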

More info about snapshot can be found at git.io/snapshot.

#109: Xcode Code Snippets 🐰

Today we're talking Code Snippets. These are a great way to speed up our workflow. A well-kept library of snippets can save us hours of writing repetitive boilerplate code. Let's take a look.

We'll begin by opening the Utilities sidebar by clicking the button on the far right of the toolbar. Then we'll click the curly braces { } icon in the bottom panel to see our library of snippets.

Xcode ships with a decent starting set of snippets, but the real power comes when we start to define our own. Let's make a snippet for something we do all the time: Hopping to a background queue, then hopping back to the main queue to update some UI.

We'll start by writing the code we want to turn into a snippet. We can put it anywhere; we'll delete it when we're done:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
  // background code

  dispatch_async(dispatch_get_main_queue()) {
    // main queue code
  }
}

Next, we'll replace those comments with some tokens. When we use our snippet, we'll be able to press the tab key to cycle through each token and easily replace it. Tokens are defined like this:

<# token name #>

We'll replace our two comments with tokens, then select our whole snippet and drag and drop the code itself onto our library of snippets. A panel will open, where we'll be able to describe our snippet, and configure when it can be triggered.
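
With the tokens in place, the snippet we drag into the library looks like this:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
  <#background code#>

  dispatch_async(dispatch_get_main_queue()) {
    <#main queue code#>
  }
}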

We'll use hop for our completion shortcut, then click Done. Now, typing hop and pressing tab inserts our code. Success!

#108: Error Handling 📛

Today we're going to talk about Error Handling. First though, a bit of a pitch: Great apps handle errors gracefully. Think of error handling as the dental flossing of creating apps. Sure, not the most exciting part of the job, but very important. Alright, let's dive in:

enum BlueprintFileError: ErrorType {
  case Interrupted
  case CorruptData
  case ShipBoardedByVader
}

func decryptDeathStarPlans() throws -> BlueprintFile {
  // ...

  throw BlueprintFileError.Interrupted
}

To allow one of our functions to throw errors, we add the throws keyword to its definition, and we describe what can go wrong with an enum that conforms to the system's ErrorType protocol.

Then, anywhere inside our function, when something goes wrong, we can throw the appropriate error.

Elsewhere in our code, we can use a do/catch block to handle these errors. We'll try our dangerous code in the do block, then catch any errors below.

do {
  try decryptDeathStarPlans()
} catch BlueprintFileError.Interrupted {
  alert("Decryption Interrupted!", options: [ "Try Again", "Cancel" ])
} catch BlueprintFileError.CorruptData {
  alert("Sorry, the file could not be read from disk.", options: [ "OK" ])
} catch BlueprintFileError.ShipBoardedByVader {
  transferFilesToDroid("R2-D2")
  alert("Ship is being boarded, " +
    "decryption will continue on R2-D2", options: [ "OK" ])
} catch {
  alert("Decryption Failed", options: [ "Try Again", "Cancel" ])
}

Finally, let's look at how to actually handle these errors. Every app is unique, and each will need its own consideration. That being said, here are some guidelines that should apply in most cases, and are illustrated above:

  • Fail as gracefully as possible, and preserve as much of the user's work as possible.
  • If necessary, tell the user what happened in clear simple terms. (No jargon).
  • When possible, give the user a way to try the task again.
  • Handle all cases, even unlikely ones. If it can go wrong, it will, for someone.
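
One more note on the code above: alert(_:options:) isn't a UIKit function, just a stand-in helper. A minimal sketch of one possible implementation (inside a UIViewController, with no retry wiring) might look like this:

func alert(message: String, options: [String]) {
  let controller = UIAlertController(title: nil, message: message, preferredStyle: .Alert)

  // One action per option title; real handlers would go where nil is.
  for option in options {
    controller.addAction(UIAlertAction(title: option, style: .Default, handler: nil))
  }

  presentViewController(controller, animated: true, completion: nil)
}

In a real app we'd pass action handlers along with the titles so "Try Again" could actually retry.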

#107: Recording Audio 🎙

Today we'll take a look at how to record audio from the microphone on a user's device. Let's get started.

The first thing we'll need is an Audio Session. This will be a singleton (Bite #4), and we'll also create a property to hold our recorder:

import AVFoundation

class RecordViewController : UIViewController {
  let session = AVAudioSession.sharedInstance()
  var recorder: AVAudioRecorder?
}

Next, we'll create a function to start recording:

func beginRecording() {
  session.requestRecordPermission { granted in
    guard granted else { return }

    do {
      try self.session.setCategory(AVAudioSessionCategoryPlayAndRecord)
      try self.session.setActive(true)

      let recordingFileName = "recording.caf"

      guard let recordingURL = documentsDirectoryURL()?
        .URLByAppendingPathComponent(recordingFileName) else { return }

      let settings: [String : AnyObject] = [
        AVEncoderAudioQualityKey: AVAudioQuality.High.rawValue,
        AVSampleRateKey: 12000.0,
        AVNumberOfChannelsKey: 1,
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC)
      ]

      self.recorder = try AVAudioRecorder(
        URL: recordingURL,
        settings: settings
      )

      self.recorder?.record() // begin capturing audio
    } catch { print("ERROR: Unable to begin recording") }
  }
}

We'll first need to request permission from the user to record audio. Once granted, we'll try to set our Audio Session's category to PlayAndRecord, the category Apple suggests for apps that simultaneously record and playback audio.

We'll create a place for our recording to live, then assemble the settings dictionary for our recorder. We instantiate and store our AVAudioRecorder object, then tell it to start recording. Later, we'll call .stop() on it to stop recording. We can also optionally wire up a delegate to get completion callbacks.

Finally, we can play back our file using AVAudioPlayer:

do {
  // In a real app, store the player in a property so it isn't deallocated mid-playback.
  let audioPlayer = try AVAudioPlayer(contentsOfURL: recordingURL)
  audioPlayer.prepareToPlay()
  audioPlayer.play()
} catch { print("ERROR: Unable to play recording") }

#106: EasyAnimation 🏃

Animation is one of the greatest parts about building (and using) iOS apps. The APIs, however, can feel a bit scattered. UIView's animation functions are wonderful, but some animations require using Core Animation directly.

When they do, things get progressively more complex depending on whether we need to run multiple animations in succession, or just run code after an animation completes.

Today we'll look at a great library from Marin Todorov called EasyAnimation that improves on all of this. Let's dive in:

EasyAnimation makes animating CALayers that normally would require CABasicAnimation (or one of its siblings) work with the standard UIView.animateWithDuration functions:

UIView.animateWithDuration(0.3, animations: {
  self.view.layer.position.y = 64.0
})

Under the hood, EasyAnimation does all the heavy lifting of translating our animations back into CAAnimation code and handling all of the implementation details for us. Neat!

Normally, if we wanted to run code after one of the animations on a CALayer finished, we'd need to wire up an animation delegate, implement the callback functions, make sure to clean up after ourselves, etc.

With EasyAnimation though, we're able to just use the normal completion closure.

UIView.animateWithDuration(
  0.3,
  delay: 0.1,
  options: [.BeginFromCurrentState],
  animations: {
    self.view.layer.borderWidth = 2.0
    self.view.layer.cornerRadius = 12.0
  },
  completion: { finished in
    self.tableView.reloadData()
  }
)

Last but certainly not least, EasyAnimation makes "chaining" multiple animations together (running one after another) extremely convenient. It also supports cancelling the chain, repeating, delays and more:

let chain = UIView.animateAndChainWithDuration(0.3, animations: {
  self.avatarView.center = headerView.center
}).animateWithDuration(0.2, animations: {
  self.headerView.alpha = 1.0
})
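
That chain constant isn't just decoration: keeping it around is what lets us cancel later. Per the library's README, cancelling looks roughly like this:

chain.cancelAnimationChain {
  // optional cleanup, runs once the chain has stopped
}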

More info about EasyAnimation can be found at git.io/easyanimation

Weekly Sponsor: Imgix 🎨

A huge continued thanks to imgix for sponsoring this week's Bites! imgix is real-time image resizing as a service. It allows us to resize, crop, and process images on the fly, simply by changing their URLs. Let's take a look.

By now we've learned all about how imgix has client libraries for all sorts of languages and environments, and how we might not even need one depending on our use case. That's part of what makes imgix so awesome: we can play around with sizes, crops, and effects, even drawing text or watermarks on images, just by changing some URL parameters. imgix uses our existing image storage locations as sources (public web folders, S3 buckets, and Web Proxies are all supported). We can configure multiple image sources, set caching times, and set custom domains on their site. Best of all, response times for images average around 70ms!

imgix is packed with features; here's a fun one: if we had an app that served up album art, it'd be neat to theme our UI to match the colors in each album cover. With imgix, this is super simple! We just request the image's palette:

GET /albums/the-process-of-belief.jpg?palette=json&colors=5

imgix will then return some JSON containing colors we can set on our UI elements:

{
  "average_luminance": 0.294337,
  "colors": [{ "blue": 0.027451, "green": 0.847059, "red": 1 },
             { "blue": 0.603922, "green": 0.792157, "red": 0.427451 }]
}
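
On the app side, turning one of those entries into a UIColor is straightforward. A quick hypothetical helper, assuming the JSON has already been deserialized into a dictionary of Doubles:

func colorFromPaletteEntry(entry: [String : Double]) -> UIColor {
  // imgix returns each channel as a 0.0–1.0 value, matching UIColor's initializer.
  return UIColor(
    red: CGFloat(entry["red"] ?? 0.0),
    green: CGFloat(entry["green"] ?? 0.0),
    blue: CGFloat(entry["blue"] ?? 0.0),
    alpha: 1.0
  )
}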

If we were building a website, we could even specify a CSS class name and request the palette as 100% valid CSS, then include it with a <link> tag in our page's <head>.

Another really powerful feature of imgix is their intelligent cropping tools. If our app allowed folks to upload avatar images for their profiles, we could easily automatically crop those images right around the users' faces with an imgix URL like this:

GET /jony.jpg?crop=faces&fit=facearea&h=50&w=50&facepad=1.5

With that simple URL, imgix will:

  • Find the face(s)
  • Add some padding
  • Crop the image

Stop maintaining your Frankenstein ImageMagick setup and let imgix handle it. It's free to get started. Use the link littlebitesofcocoa.com/imgix so they know you heard about them here.

#105: Media Library Basics 🎵

The Media Player Framework is how we can interact with the media library on a user's device. Today we'll look at the basics and build a fun little personal-use app to play some of our own favorite albums. Let's get started.

We'll start with a simple UICollectionViewController. It'll have a flow layout and we'll size the items so they make 2 equal columns:

override func viewDidLayoutSubviews() {
  super.viewDidLayoutSubviews()
  guard let flowLayout = collectionView?.collectionViewLayout as? UICollectionViewFlowLayout else { return }

  squareSize = view.bounds.size.width / 2.0
  flowLayout.itemSize = CGSizeMake(squareSize, squareSize)
}

In viewWillAppear we'll retrieve the albums we want to display.

We'll use the Async library (Bite #35) to make hopping queues easier:

Async.background {
  self.albums = (MPMediaQuery.albumsQuery().collections ?? [])
}.main { self.collectionView?.reloadData() }

When dequeuing cells, we grab the artwork from the representative item of each album. We use our squareSize so it displays nice and crisp:

let album = albums[indexPath.item]
let item = album.representativeItem!
let artwork = item.valueForProperty(MPMediaItemPropertyArtwork) as! MPMediaItemArtwork
cell.imageView.image = artwork.imageWithSize(CGSizeMake(squareSize, squareSize))

Now we'll personalize our app. We'll add a filter to our MPMediaQuery so it only returns some of the albums in our library:

query.addFilterPredicate(
  MPMediaPropertyPredicate(
    value: "Bad Religion",
    forProperty: MPMediaItemPropertyArtist,
    comparisonType: .Contains
  )
)

We chose to filter to just a single artist here, but MPMediaPredicate is quite powerful. We can stack multiple filters across tons of properties.
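
For example, narrowing further by genre (the value here is just a hypothetical) would be one more addFilterPredicate call:

query.addFilterPredicate(
  MPMediaPropertyPredicate(
    value: "Punk",
    forProperty: MPMediaItemPropertyGenre,
    comparisonType: .EqualTo
  )
)

Finally, when the user selects a cell, we'll grab the system's music player and play the selected album: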

let album = albums[indexPath.item]

let player = MPMusicPlayerController.systemMusicPlayer()
player.setQueueWithItemCollection(album)
player.play()

Success!

This is just the beginning of what's possible with the Media Player Framework. Be sure to download the project at j.mp/bite105 and try it for yourself!
