We first looked at iMessage Apps in Bite #237, when we created a Sticker Pack app. Today we'll go one step further and create an iMessage app that provides its own UI for displaying Stickers. Let's get started.
This time, we'll make an iMessage app to send Little Bites of Cocoa Bites to our friends!
We'll start by creating a new Messages Application in Xcode. Then, we'll make a new file called BiteBrowserViewController.swift. We'll make it a subclass of MSStickerBrowserViewController.
We'll need to conform to MSStickerBrowserViewDataSource, so we'll add a property to hold the Bite images and implement a couple of simple functions:
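Here's a minimal sketch of what that might look like (the stickers property and the loadStickers() call in viewDidLoad are assumptions for illustration):

import UIKit
import Messages

class BiteBrowserViewController: MSStickerBrowserViewController {
    var stickers = [MSSticker]()

    override func viewDidLoad() {
        super.viewDidLoad()
        loadStickers()
    }

    // MSStickerBrowserViewController already adopts MSStickerBrowserViewDataSource,
    // so we just override its two data source functions:
    override func numberOfStickers(in stickerBrowserView: MSStickerBrowserView) -> Int {
        return stickers.count
    }

    override func stickerBrowserView(_ stickerBrowserView: MSStickerBrowserView, stickerAt index: Int) -> MSSticker {
        return stickers[index]
    }
}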
The loadStickers function loads Bite images from disk, then adds them as MSSticker instances. The loading plumbing code isn't important, as everyone will likely implement this specifically for their own use case.
The important bit is that we need to create MSStickers and get them into our stickers array:
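A rough sketch of that piece (the biteImageURLs() helper and the description text are hypothetical placeholders):

func loadStickers() {
    // biteImageURLs() is a hypothetical helper returning the URLs of our bundled Bite images.
    for url in biteImageURLs() {
        do {
            let sticker = try MSSticker(contentsOfFileURL: url, localizedDescription: "A Little Bites of Cocoa Bite")
            stickers.append(sticker)
        } catch {
            print("Couldn't create sticker: \(error)")
        }
    }
}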
Finally, we'll update our Bundle Display Name in Info.plist to something like "Bite Sender". We can now Build & Run our app in Messages and start sending Bites to everyone we know!
WWDC brought us a whirlwind of changes all over Apple's platforms. One interesting announcement was Xcode Source Editor Extensions. These allow us to add commands to Xcode that can work with the contents of the code we're working on. Yes, Apple also announced that Xcode won't load plugins (like Alcatraz, etc.) anymore. Bummer.
Today we'll try to shake off our feels about plugins going away by making our first Source Editor Extension, let's do it!
We're going to make a super-simple Extension that lets us easily "sign" our code comments. Like this:
// - @jakemarsh
We'll start by creating a new macOS app. Then we'll add a new target to it, and choose Xcode Source Editor Extension.
Xcode gives us a nice template of files to start with. We can head into SourceEditorCommand.swift to implement our command.
We start by looking at the invocation's selection and guard'ing to make sure there's a valid insertion point for us to append to, then create a new comment using the handy NSUserName() function in macOS.
Finally, we use the provided XCSourceEditorCommandInvocation one more time to insert the new comment text into the buffer. We call the completionHandler with a nil to let Xcode know we've completed our work without any errors, and we're done!
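Here's a minimal sketch of how the command might look (the exact fallback behavior when the guard fails is an assumption):

import Foundation
import XcodeKit

class SourceEditorCommand: NSObject, XCSourceEditorCommand {
    func perform(with invocation: XCSourceEditorCommandInvocation, completionHandler: @escaping (Error?) -> Void) {
        // Make sure there's a selection (insertion point) we can append after.
        guard let selection = invocation.buffer.selections.firstObject as? XCSourceTextRange else {
            completionHandler(nil)
            return
        }

        // Build the "signature" comment from the current macOS user name.
        let signature = "// - @\(NSUserName())"

        // Insert the new comment line into the buffer, just below the insertion point.
        invocation.buffer.lines.insert(signature, at: selection.end.line + 1)

        // Passing nil tells Xcode we finished without any errors.
        completionHandler(nil)
    }
}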
We can customize the name of our command in our Info.plist, like this:
Now, how are we going to test this thing out? Well, Xcode has some nice support built in for debugging these, but first we'll have to endure a little pain to get things working (at least during this first beta; later Xcode 8 releases will hopefully not need these steps).
We'll only need this while running OS X 10.11; macOS 10.12 users can skip this step:
We'll need to run sudo /usr/libexec/xpccachectl in Terminal, then reboot our Mac.
Once it's back up, we can open Xcode 8 again, and Build & Run our Extension. We'll be asked to choose which app we'd like to run it in. We'll choose Xcode 8 (be careful not to choose an older Xcode version as the icons are easy to miss).
Another instance of Xcode 8 will launch in our dock and… Boom! Xcode's dark heart is revealed!
Just kidding, Xcode is displaying itself this way to help us differentiate between the one that we're debugging, and the "real" one.
We can head to the Editor menu to test out our new command. That's not all though, we can go back to the other Xcode instance and set breakpoints, inspect variables, etc. Neat!
Plugins may be gone, but Source Editor Extensions still offer a ton of interesting possibilities. Looking for inspiration? There are already a few on GitHub.
With all the WWDC excitement recently, it might have been easy to miss a wonderful new library release from the fine folks at Lickability. It's called PinpointKit and it can completely transform how we collect feedback from users testing our apps. Let's check it out!
After integrating PinpointKit into our app via CocoaPods/Carthage, or just manually, we can trigger a new bug report like this:
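Something along these lines (a sketch; the feedback address and viewController are placeholders, and the exact initializer and presentation call may vary between PinpointKit versions):

import PinpointKit

// Configure PinpointKit with the email address(es) that should receive feedback.
let pinpointKit = PinpointKit(feedbackRecipients: ["feedback@example.com"])

// Present the feedback flow from the current view controller
// (for example, when the user shakes the device or taps a "Report a Bug" button).
pinpointKit.show(from: viewController)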
We're provided a Configuration struct where we can customize the look and feel, and a PinpointKitDelegate to hook into the state of the feedback being sent.
Now, whenever a user reports a bug they'll be able to:
Send along system logs automatically (opt-in)
Add arrows, boxes, and text to screenshots
Redact sensitive info before reporting
By default, PinpointKit reports via email, but this (and just about everything else) is completely customizable.
iOS 10 adds a whole new category of apps with iMessage Apps. These run inside the Messages app and can be as simple as a set of stickers or as complex as a person-to-person payment system. Today we'll dip our toes into the water by creating the simplest of all iMessage Apps: a Sticker Pack App. Let's get started.
To begin, we'll need to actually create the artwork for whatever we want our stickers to be.
Stickers can be animated or static, and can be any of these formats: PNG, APNG, JPEG, or GIF. All stickers are limited to a file size of 500 KB each.
We'll keep things simple and just make the Swift logo into a sticker.
Now, let's actually create our app. We'll open Xcode, create a new project and choose Sticker Pack Application.
We'll give our app a name, then we'll see our new project.
At this point, all we have to do is select the Stickers.xcstickers file in Xcode's navigation and drag and drop our image file(s) into Xcode.
Last but not least, we can set a name and choose our desired sticker size in the Attributes Inspector on the right. There's Small, Medium, and Large. Each one will display differently depending on which device a user is running. (On @3x devices like iPhone 6S and 6S Plus, Small is 100x100, Medium is 136x136, and Large is 204x204).
We can now build and run on our device, choose Messages as the app to run in, fire up a conversation, pick our new app and start sending Swift-y stickers to all our friends!
WWDC 2016 brought us a ton of new goodies, let's dive right in. Today we'll take a look at SFSpeechRecognizer. It allows us to recognize spoken words in audio files or even audio buffers. Let's check it out.
We'll start by importing Speech, and requesting the user's authorization:
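A minimal sketch:

import Speech

SFSpeechRecognizer.requestAuthorization { status in
    switch status {
    case .authorized:
        // We're good to go, we can start recognizing speech.
        break
    case .denied, .restricted, .notDetermined:
        // Handle the user declining (or being unable to grant) permission.
        break
    }
}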
Before we can proceed, we'll need to add a key called NSSpeechRecognitionUsageDescription to our app's Info.plist and give it a value explaining how our app will use the functionality.
Users will see this text, so we should try to be short and simple. Something like "Speech recognition will be used to provide closed captioning of your Instagram videos." (for example) should work fine.
Next we create a recognizer, then configure a request with the URL to our audio file.
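For example (audioFileURL here is a placeholder for the URL of whatever audio file we want to transcribe):

let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechURLRecognitionRequest(url: audioFileURL)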
Then we'll kick off a recognition task. We configure it to report even partial results, then print each one.
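Roughly like this:

// Ask for partial results so we see transcriptions as they're recognized.
request.shouldReportPartialResults = true

recognizer?.recognitionTask(with: request) { result, error in
    guard let result = result else { return }
    print(result.bestTranscription.formattedString)
}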
Plugins are a great way for us to extend Xcode, adding functionality that Apple didn't ship with it. We've covered Xcode plugins a bit here in the past, notably in Bite #147, where we learned about using Alcatraz to install plugins. Today we'll look at a new plugin called BuildTimeAnalyzer from Robert Gummesson. It can help uncover why our code is compiling slowly. Let's dive in.
We'll start by completing the steps in Bite #147 to install Alcatraz. (If we're going to be installing and trying out plugins, Alcatraz is a good way to organize and manage everything.)
Once installed, we'll open the Alcatraz Package Manager by going to Window > Package Manager.
We can search for "BuildTimeAnalyzer" in the search field to find the plugin and install it. Once installed, we'll restart Xcode and open our project.
We can open the analyzer window by selecting View > Build Time Analyzer from Xcode's menu.
Next, we'll need to add some compiler flags.
(Remember to add these to any other targets your app may depend on to see compile times for those too).
We'll add -Xfrontend and -debug-time-function-bodies to the "Other Swift Flags" in our app's target's build settings.
After that, one last clean and build and we should start to see a list of how long each bit of our code is taking to compile. Neat!
We can click on each result to jump straight to the line of code that's causing the slow down. The results are often surprising!
For example: Adding Collection types together seems to bring the Swift compiler to a slow crawl.
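As a hypothetical illustration, an expression like this one (concatenating several collection literals in a single statement) is the kind of thing the analyzer tends to flag as slow to type-check:

// Concatenating several collection literals in one expression can take the
// type checker a surprisingly long time to resolve.
let allNames = ["Ripley", "Dallas"] + ["Kane", "Lambert"] + ["Parker", "Brett"]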
In Bite #231, we took a look at Realm's new Fine-grained notifications functionality. Today we'll build upon this by checking out another new Realm feature: Queryable, Live Inverse Collections. Whew! That's a fancy name. This feature deserves one though, let's check it out.
Here's a Realm object with a relationship defined on it:
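Something like this (a sketch using a simple Person/Dog model):

import RealmSwift

class Dog: Object {
    dynamic var name = ""
}

class Person: Object {
    dynamic var name = ""
    let dogs = List<Dog>()
}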
That dogs property can be used in a query, and it will even stay updated automatically to reflect changes to the property's value made elsewhere in our app.
None of that is new though. What is new is the inverse of this mechanic.
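With LinkingObjects we can define the other side of that relationship directly on the model. Continuing the Person/Dog sketch from above:

class Dog: Object {
    dynamic var name = ""

    // All Person objects whose `dogs` list contains this Dog.
    let owners = LinkingObjects(fromType: Person.self, property: "dogs")
}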
LinkingObjects are live and auto-updating. When new relationships are formed or removed, they will update to reflect the new state.
LinkingObjects can be used in queries, natively. (Previously, this would need to be done in our own code):
// People that have a child that have a parent named Diane.
realm.objects(Person).filter("ANY children.parents.name == 'Diane'")

// People whose parents have an average age of > 65.
realm.objects(Person).filter("parents.@avg.age > 65")
LinkingObjects behave like regular Realm collections:
// Which of my parents are over the age of 56?
self.parents.filter("age > 56")

// Calculate the age of my parents.
self.parents.average("age")
Animation plays a key role in how we understand the user interfaces in the software we use. That role expands when animations are driven directly by a user's gestures or interactions with the interface. Today we'll look at a new framework that can help us create these types of experiences without breaking a sweat. Let's dive in.
As the project's README puts it: "all animation is the interpolation of values over time."
Interpolate helps us describe the relationships that we want to exist between a user's gesture and the interpolated values that should result for the properties of our views. Let's try it by animating a color.
Since a pan gesture's progress can be expressed as a simple float (from 0.0 to 1.0), we can set that progress percentage value directly on the Interpolate object.
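Here's a rough sketch of what that can look like (the progress calculation from the pan's translation is an assumption, not part of the library, and the initializer may differ slightly between Interpolate versions):

import UIKit
import Interpolate

class ColorViewController: UIViewController {
    // Describe how the background color should change as progress moves from 0.0 to 1.0.
    lazy var colorChange: Interpolate = Interpolate(from: UIColor.white, to: UIColor.red, apply: { [weak self] color in
        self?.view.backgroundColor = color
    })

    // Hooked up as the action of a UIPanGestureRecognizer on our view.
    func handlePan(_ recognizer: UIPanGestureRecognizer) {
        // Convert the pan's translation into a 0.0 - 1.0 progress value,
        // then hand it straight to Interpolate.
        let progress = max(0, min(1, recognizer.translation(in: view).x / view.bounds.width))
        colorChange.progress = progress
    }
}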
There's tons more too: Interpolate supports easing functions, and works on all sorts of foundation types (points, rects, colors, etc.).
Swift Protocols are awesome. Understanding how they can (or should) fit into our code can be tricky. Today we'll take a baby step into the world of protocols with a simple, but "real" example and try to illustrate the upsides. Let's get started.
We're going to be fancy and abstract away some of our layout code. So we'll create a little struct to hold some layout settings like this:
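Something like this (the specific fields are just illustrative):

import UIKit

struct LayoutSettings {
    let columns: Int
    let itemSpacing: CGFloat
    let margin: CGFloat
}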
See? Fancy. This is great if we want to specify each individual combination each time, but it'd be nice if we could define some sort of "pre-canned" layouts that we could use by name. Sounds like a great job for a Swift Enum.
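Here's a sketch of such an enum (the case names are just examples):

enum CannedLayout {
    case list
    case grid
    case gallery
}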
Lovely, this will be handy. How are we going to wire all this together though? Simple: we'll make a protocol whose only responsibility is to convert itself into a LayoutSettings.
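The protocol itself is tiny (its name here is just an illustration):

protocol LayoutSettingsConvertible {
    func layoutSettings() -> LayoutSettings
}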
Making our CannedLayout enum adopt our new protocol is a bit more involved, but really just means switch-ing over self and returning the proper combination of settings for each case.
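Continuing the sketch, that might look like this (the specific values are made up):

extension CannedLayout: LayoutSettingsConvertible {
    func layoutSettings() -> LayoutSettings {
        switch self {
        case .list: return LayoutSettings(columns: 1, itemSpacing: 0, margin: 0)
        case .grid: return LayoutSettings(columns: 3, itemSpacing: 4, margin: 8)
        case .gallery: return LayoutSettings(columns: 2, itemSpacing: 8, margin: 16)
        }
    }
}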
We first looked at Realm way back in Bite #49. It's a great data storage solution for our mobile apps. Today we'll start looking at some of the latest improvements in Realm and the new capabilities they offer. First up is Fine-grained notifications. Let's dive in:
Realm has offered notifications of write operations for a while. They look like this:
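The classic, coarse-grained version looks roughly like this:

import RealmSwift

let realm = try! Realm()

// Fires whenever a write transaction touches this Realm,
// with no detail about *what* changed.
let token = realm.addNotificationBlock { notification, realm in
    // Refresh the UI, etc.
}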
These are still around and work great, but it might help to know more about what changed. That's where the new Collection Notifications come in.
Collection notifications give us access to the changes that just occurred at a fine-grained level, including the specific indices of insertions, deletions, etc.
.Update's associated values can be easily mapped to NSIndexPath objects suitable for use in table views and collection views.
Here's a complete example showing all of this in action:
class SpaceshipsViewController: UITableViewController {
    var notificationToken: NotificationToken? = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        let realm = try! Realm()
        let results = realm.objects(Spaceships).filter("maxSpeed > 0")

        // Observe Results Notifications
        notificationToken = results.addNotificationBlock { [weak self] (changes: RealmCollectionChange) in
            guard let tableView = self?.tableView else { return }

            switch changes {
            case .Initial:
                // Results are now populated and can be accessed without blocking the UI
                tableView.reloadData()
                break
            case .Update(_, let deletions, let insertions, let modifications):
                // Query results have changed, so apply them to the UITableView
                tableView.beginUpdates()
                tableView.insertRowsAtIndexPaths(insertions.map { NSIndexPath(forRow: $0, inSection: 0) }, withRowAnimation: .Automatic)
                tableView.deleteRowsAtIndexPaths(deletions.map { NSIndexPath(forRow: $0, inSection: 0) }, withRowAnimation: .Automatic)
                tableView.reloadRowsAtIndexPaths(modifications.map { NSIndexPath(forRow: $0, inSection: 0) }, withRowAnimation: .Automatic)
                tableView.endUpdates()
                break
            case .Error(let error):
                // An error occurred while opening the Realm file on the background worker thread
                fatalError("\(error)")
                break
            }
        }
    }

    deinit {
        notificationToken?.stop()
    }
}