Realm is a database built specifically for mobile devices. It runs on Mac and iOS (and even supports Android!), making it a great alternative to Core Data or raw SQLite.
There's plenty to love about Realm, but the best part is its ultra-simple API:
Define an Object
import RealmSwift

class User: Object {
  dynamic var firstName = ""
  dynamic var lastName = ""
}

class Spaceship: Object {
  dynamic var name = ""
  dynamic var topSpeed = 0 // in km
  dynamic var owner: User? // to-one relationship, used below
}
var ship = Spaceship()
ship.name = "Outrider"
ship.topSpeed = 1150

var dash = User()
dash.firstName = "Dash"
dash.lastName = "Rendar"

ship.owner = dash

// need one Realm per thread
let realm = Realm()

realm.write {
  realm.add(ship)
}
Find Objects
Queries in Realm couldn't be simpler: they're chainable. You can add as many calls to .filter as you'd like, and you can order the results with the chainable sorted function.
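As a sketch of that chaining (assuming the Spaceship model above and Realm's Swift 2-era API; the predicate strings are illustrative):

```swift
import RealmSwift

// Each .filter call narrows the results further;
// sorted orders them. Predicates use NSPredicate format syntax.
let fastShips = Realm().objects(Spaceship)
  .filter("topSpeed > 1000")
  .filter("name BEGINSWITH 'O'")
  .sorted("topSpeed", ascending: false)
```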
MKDirections and MKDirectionsRequest are the classes we use to retrieve directions between two locations. They also make it very easy to display the resulting directions route on a map. Let's dive in:
Setup Directions Request
Note: Off camera, we have let the user input a keyword search String, then geocoded it into an MKPlacemark object called placemark.
let request = MKDirectionsRequest()
request.source = MKMapItem.mapItemForCurrentLocation()

let destination = MKPlacemark(coordinate: placemark.location.coordinate, addressDictionary: nil)
request.destination = MKMapItem(placemark: destination)

let d = MKDirections(request: request)

d.calculateDirectionsWithCompletionHandler { response, error in
  guard let route = response?.routes.first else { return }
  // we'll do this bit next
}
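The author picks up displaying the route next; as a sketch, showing it on an MKMapView usually means adding the route's polyline as an overlay (mapView here is a hypothetical outlet):

```swift
// Inside the completion handler, once we have a route:
// add its polyline so the map can draw it.
mapView.addOverlay(route.polyline, level: .AboveRoads)
```

For the overlay to actually render, the map view's delegate also needs to return an MKPolylineRenderer from mapView(_:rendererForOverlay:).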
MKLocalSearch is a wonderful little gem hidden inside MapKit. It allows an app to perform a natural language query for local points of interest near a given latitude and longitude. Today we'll use it to build a simple app to find us a place to eat lunch.
manager = CLLocationManager()
manager!.delegate = self
manager!.desiredAccuracy = kCLLocationAccuracyThreeKilometers
manager!.requestWhenInUseAuthorization()
manager!.requestLocation()

// then later, once we have a location:

let request = MKLocalSearchRequest()
request.naturalLanguageQuery = "Restaurants"
request.region = MKCoordinateRegionMakeWithDistance(location.coordinate, 1600, 1600)

MKLocalSearch(request: request).startWithCompletionHandler { (response, error) in
  guard error == nil else { return }
  guard let response = response else { return }
  guard response.mapItems.count > 0 else { return }

  let randomIndex = Int(arc4random_uniform(UInt32(response.mapItems.count)))
  let mapItem = response.mapItems[randomIndex]

  mapItem.openInMapsWithLaunchOptions(nil)
}
First we set up our CLLocationManager, and use requestLocation(), the new iOS 9 method for requesting a single location instead of continuous updates.
Then we create the MKLocalSearchRequest object and tell it to search "Restaurants" within 1600 meters (about a mile) of the user's current location.
After that it's just a matter of waiting for results, choosing one at random, and then we cheat a bit (since this is just a demo) and pop the user out to the Maps app to view the chosen result.
NSFileManager is one of the primary ways you interact with the file system on iOS and OS X. Today we’ll look at both how to use it, as well as how to put some of the new Swift 2 features we’ve been learning about to real world use.
let fm = NSFileManager.defaultManager()

guard let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true).first else { return }
let logPath = documentsPath + "/spaceship.log"

guard fm.fileExistsAtPath(logPath) else { return }

do {
  let attributes = try fm.attributesOfItemAtPath(logPath)
  let createdAt = attributes[NSFileCreationDate] as? NSDate

  // > 5 minutes (in seconds) ago?
  if let created = createdAt where fabs(created.timeIntervalSinceNow) > 300 {
    try fm.removeItemAtPath(logPath)
    try "[BEGIN SPACESHIP LOG]".writeToFile(logPath, atomically: false, encoding: NSUTF8StringEncoding)
  }
} catch let error as NSError {
  print("Error: \(error.localizedDescription)")
}
Here we use NSFileManager to wipe out a log file if it's more than 5 minutes old:
First we use guard to make sure our documents directory is available.
Then, we create a path for our log file, and check that it exists.
Many of NSFileManager's functions are marked as throws, so we need a do/catch block.
Next we check when the log file was created.
We use another new Swift 2 feature, the where clause on an optional binding, to do our age check on the optional inside the if statement.
Finally we try to remove the file, and replace it with a fresh, blank log.
We then use the catch statement to capture and print any errors that have been thrown.
There's a ton of great functionality packed in the UIPasteboard and NSPasteboard classes. Let's take a look at some things you can do:
Get the System Pasteboard
// iOS
let pb = UIPasteboard.generalPasteboard()

// OS X
let pb = NSPasteboard.generalPasteboard()
Copy to Clipboard
On iOS, similar convenience methods exist for images, URLs, and colors.
// iOS
pb.string = "Delicious!"

// OS X
pb.clearContents()
pb.setString("Delicious!", forType: NSPasteboardTypeString)
Read Contents
On OS X, you'll use the NSPasteboardType* family of keys.
pb.string // iOS
pb.stringForType(NSPasteboardTypeString) // OS X
Get Notified Of Changes
// iOS only. On OS X, you'll need to poll :(
NSNotificationCenter.defaultCenter().addObserver(self,
  selector: "pasteboardChanged:",
  name: UIPasteboardChangedNotification,
  object: nil)

func pasteboardChanged(n: NSNotification) {
  if let pb = n.object as? UIPasteboard {
    print(pb.items)
  }
}
Share Data Between Apps
If you're using App Groups, you can share data between your apps by creating your own custom named pasteboard:
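A minimal sketch (the pasteboard name here is a made-up example):

```swift
// Create (or look up) a named pasteboard that other apps
// in the same App Group can also open by name.
// "com.myteam.shared" is a hypothetical identifier.
let shared = UIPasteboard(name: "com.myteam.shared", create: true)
shared?.string = "Some shared value"
```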
Alert Views and Action Sheets are some of the oldest parts of UIKit. In iOS 8, Apple overhauled how you create and display them using UIAlertController. Let's take a look at how it can improve the adaptability of your UI as well as the readability of your code:
@IBAction func launchButtonTapped(launchButton: UIButton) {
  let controller: UIAlertController = UIAlertController(
    title: "Where To?",
    message: "Where would you like to go?",
    preferredStyle: .ActionSheet)

  controller.addAction(UIAlertAction(title: "Cancel", style: .Cancel, handler: nil))

  controller.addAction(UIAlertAction(title: "Home", style: .Default) { action in
    flyToPlace("Home")
  })

  controller.addAction(UIAlertAction(title: "Pluto", style: .Default) { action in
    flyToPlace("Pluto")
  })

  controller.popoverPresentationController?.sourceView = launchButton

  presentViewController(controller, animated: true, completion: nil)
}
First we create the controller, and set the style to .ActionSheet. We could have used .Alert instead, and our actions would be presented as an Alert View.
Then we add actions, each with a block callback that will fire when its action is selected. A nil handler means the action simply dismisses the controller.
Then we set .sourceView to the button that originated the action so our action sheet's arrow will ‘point' to the right spot on iPad.
And finally we present our controller just like any other view controller.
Local Notifications are a way to display a system notification to a user of your app without needing to implement a push server. You simply schedule them using an NSDate and they are presented at that time. Let's take a look.
Register for Notifications
Since iOS 8, we've had two different things to register for: User Notifications and Remote notifications. Since we'll be “sending” Local Notifications, we only need to register for User Notifications.
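A registration sketch under the iOS 8/9-era API (the settings types chosen here are an example):

```swift
// Ask permission to show alerts, badges, and sounds.
let settings = UIUserNotificationSettings(
  forTypes: [.Alert, .Badge, .Sound],
  categories: nil)

UIApplication.sharedApplication().registerUserNotificationSettings(settings)
```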
Registering will prompt the user for permission to display notifications. The notifications themselves won't come from a backend server; they'll be generated locally on the device.
Schedule a Local Notification
After we get the didRegisterUserNotificationSettings callback, we're ready to schedule a notification. We use the UILocalNotification class to configure all the aspects of how the notification will look and behave:
func landSpaceship() {
  autopilotPerformAction(.LandShip)

  let landingTime = 30.0 // in seconds

  let n = UILocalNotification()
  n.alertTitle = "Spaceship Landed!"
  n.alertBody = "Good news everyone! Our ship has successfully landed."
  n.timeZone = NSCalendar.currentCalendar().timeZone
  n.fireDate = NSDate(timeIntervalSinceNow: landingTime)

  UIApplication.sharedApplication().scheduleLocalNotification(n)
}
Let's use HealthKit to retrieve the user's current weight. HealthKit requires itemized authorization for each bit of data we want to read or write. Here we ask to write nothing, and read weight only.
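An authorization sketch using the Swift 2-era HealthKit API (store and weightQuantity are the names the query code that follows relies on):

```swift
let store = HKHealthStore()
let weightQuantity = HKQuantityType.quantityTypeForIdentifier(
  HKQuantityTypeIdentifierBodyMass)!

// Share nothing, read weight only.
store.requestAuthorizationToShareTypes(nil, readTypes: [weightQuantity]) { success, error in
  guard success else { return }
  // safe to run our sample query here
}
```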
Then the fun part: We execute a sample query, grabbing the latest weight result. Then we format the weight using NSMassFormatter.
let sortDescriptor = NSSortDescriptor(key: HKSampleSortIdentifierEndDate, ascending: false)

let query = HKSampleQuery(sampleType: weightQuantity, predicate: nil, limit: 1, sortDescriptors: [sortDescriptor]) { query, results, error in
  guard let results = results where results.count > 0 else { return }

  let sample = results[0] as! HKQuantitySample
  let weightInPounds = sample.quantity.doubleValueForUnit(HKUnit.poundUnit())
  let suffix = NSMassFormatter().unitStringFromValue(weightInPounds, unit: .Pound)
  let formattedWeight = NSNumberFormatter.localizedStringFromNumber(NSNumber(double: weightInPounds), numberStyle: .NoStyle)

  print("User currently weighs: \(formattedWeight) \(suffix)")
}

store.executeQuery(query)
CMPedometer is part of the Core Motion framework. One of the features it provides is the ability to retrieve the number of steps a user has taken during a given time period. Let’s take a look at how to do it:
import CoreMotion
import Timepiece

let pedometer = CMPedometer()

if CMPedometer.isStepCountingAvailable() {
  // NSDate.beginningOfDay comes from the Timepiece library
  pedometer.queryPedometerDataFromDate(NSDate.beginningOfDay, toDate: NSDate()) { data, error in
    guard let data = data else { return }
    print("The user has taken \(data.numberOfSteps) steps so far today, that's \(data.distance) meters!")
  }
}
We first import the Timepiece library (covered in Bite #3) and create a pedometer object. Then we check if the device supports step counting, and then we ask for the pedometer data since the beginning of today. The call hands back a CMPedometerData object which has properties representing the number of steps taken and the distance traveled.
Whether you're applying a circular crop to user avatars or just resizing a photo downloaded from a web service, processing images can be a bit of a chore. Toucan is a Swift image processing library from Gavin Bunney that makes working with images a breeze.
Let's import Toucan and take a look at what it can do:
Resize Images
// userAvatar is a UIImage downloaded from the network
let resizedAvatar = Toucan.Resize.resizeImage(userAvatar, size: CGSize(width: 100, height: 100))
Toucan provides another syntax for chaining different processing steps together. Just call .image at the end to get a final processed UIImage. Here we can also see how to apply a 1 point wide border to the final image.
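A sketch of that chained style (method names follow Toucan's Swift 2-era API; treat the exact parameters as assumptions):

```swift
// Chain processing steps, then call .image for the final UIImage.
let processedAvatar = Toucan(image: userAvatar)
  .resize(CGSize(width: 100, height: 100))
  .maskWithEllipse(borderWidth: 1, borderColor: UIColor.whiteColor())
  .image
```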