UICollectionViewController has a new property called installsStandardGestureForInteractiveMovement that defaults to true, so once we implement the delegate function above, we're good to go.
To customize things further, we'll need to manually call UICollectionView's new interactive movement functions. Let's look at a very rough idea of how we might make the picked up cell "hover" and the others "wiggle" like iOS's home screen.
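As a rough, hedged sketch (written in current Swift; the class name is a placeholder, not from the original post), a long-press recognizer of our own can drive those interactive movement calls:

```swift
import UIKit

// Sketch: replace the standard gesture with our own long-press recognizer
// so we can hook in custom animations at each stage of the movement.
class MovableCellsViewController: UICollectionViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        installsStandardGestureForInteractiveMovement = false
        let longPress = UILongPressGestureRecognizer(
            target: self, action: #selector(handleLongPress(_:)))
        collectionView?.addGestureRecognizer(longPress)
    }

    @objc func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        guard let collectionView = collectionView else { return }
        switch gesture.state {
        case .began:
            let point = gesture.location(in: collectionView)
            guard let indexPath = collectionView.indexPathForItem(at: point) else { break }
            collectionView.beginInteractiveMovementForItem(at: indexPath)
        case .changed:
            collectionView.updateInteractiveMovementTargetPosition(
                gesture.location(in: collectionView))
        case .ended:
            collectionView.endInteractiveMovement()
        default:
            collectionView.cancelInteractiveMovement()
        }
    }
}
```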
We'll also start/stop wiggling when we dequeue cells. Lastly, we'll apply the same changes that are in animatePickingUpCell to the cell's layout attributes. To do this we can subclass UICollectionViewFlowLayout and override layoutAttributesForInteractivelyMovingItemAtIndexPath.
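The flow layout subclass might look something like this rough sketch (current Swift naming; the 0.8 alpha and 1.1 scale are stand-in example values, not the post's exact numbers from animatePickingUpCell):

```swift
import UIKit

// Sketch: mirror the "picked up" cell's appearance in the layout
// attributes the collection view uses while the item is being dragged.
class WiggleFlowLayout: UICollectionViewFlowLayout {
    override func layoutAttributesForInteractivelyMovingItem(
        at indexPath: IndexPath,
        withTargetPosition position: CGPoint) -> UICollectionViewLayoutAttributes {
        let attributes = super.layoutAttributesForInteractivelyMovingItem(
            at: indexPath, withTargetPosition: position)
        attributes.alpha = 0.8
        attributes.transform = CGAffineTransform(scaleX: 1.1, y: 1.1)
        return attributes
    }
}
```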
After all that's done, this is the outcome:
Here's a direct link to this example video, just in case.
Download the project at j.mp/bite104 to see a complete working example.
Update on March 22nd, 2019: Thanks to reader Le Zeng, the project download now supports Swift 4.2. Thanks so much!
Like in Bite #99, we'll use Auto Layout Visual Format Language to pin the stack view's edges to its superview. Look at that, not bad for our first try!
Those views are close talkers, let's add some spacing:
stackView.spacing = 10.0
Ack! What the heck is going on here?!
Let's break it down: in the default 'Fill' distribution mode, if the arranged views don't naturally fill the stack view's axis, the stack view will resize one (or more) of them according to their hugging priority (covered in Bite #69).
We'll solve our issue by setting a low hugging priority on our label, signaling to the stack view that it should be the one to stretch, not our image view.
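A minimal sketch of that fix, assuming hypothetical `label` and `imageView` names (current Swift):

```swift
import UIKit

let label = UILabel()
let imageView = UIImageView()
let stackView = UIStackView(arrangedSubviews: [imageView, label])
stackView.spacing = 10.0

// A lower hugging priority means "more willing to stretch" along the
// axis, so the stack view stretches the label and leaves the image alone.
label.setContentHuggingPriority(.defaultLow, for: .horizontal)
imageView.setContentHuggingPriority(.defaultHigh, for: .horizontal)
```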
In Bite #101 we started working on a custom camera view controller.
Today we'll complete it by adding a way for users to capture a photo and do something with it. We'll start by making it easy to use. We'll make the whole screen a capture button by adding a tap gesture recognizer to our view:
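A rough sketch of the recognizer setup (the `capturePhoto` method name is a placeholder for our own capture code):

```swift
import UIKit

// Sketch: the whole screen becomes the shutter button.
class CameraViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(capturePhoto))
        view.addGestureRecognizer(tap)
    }

    @objc func capturePhoto() {
        // Ask the still image output for a photo here.
    }
}
```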
We looked at allowing our users to capture photos/videos using UIImagePickerController in Bite #83. Now we'll take things to the next level by starting to create our own custom camera view controller. Today we'll get all the plumbing wired up and get the preview on the screen. Let's get started.
We'll start with the "single view" template in Xcode. There are a number of different objects we'll need to setup and glue together, so we'll go into our view controller and add a function called setupSession. We'll call this in viewWillAppear(animated:).
First we'll instantiate an AVCaptureSession. It's sort of the central hub of all this. We can configure it with a number of different presets. We'll use a preset for taking high quality still photos. Now, our session needs some inputs and outputs.
We'll use defaultDeviceWithMediaType and pass in Video to get the default hardware device for capturing and recording images on the user's device (usually the back camera). Then we'll try to create an AVCaptureDeviceInput from the device. Next up, an output.
Capture sessions can return us data in all sorts of interesting ways: Still images, videos, raw pixel data, and more. Here we'll set up an AVCaptureStillImageOutput and ask it for JPEG photos. We'll do one more safety check then add both our input and output to our session.
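Here's a hedged sketch of setupSession() following the steps above, in current Swift. AVCaptureStillImageOutput matches the era of the post (newer SDKs replace it with AVCapturePhotoOutput), and error handling is abbreviated:

```swift
import AVFoundation

let session = AVCaptureSession()

func setupSession() {
    session.sessionPreset = .photo // high quality still photos

    // Input: the default video-capable device (usually the back camera).
    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera) else { return }

    // Output: JPEG still images.
    let output = AVCaptureStillImageOutput()
    output.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

    // One more safety check, then wire both into the session.
    if session.canAddInput(input) && session.canAddOutput(output) {
        session.addInput(input)
        session.addOutput(output)
    }
}
```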
Finally, let's display our camera so the user can see what they're photographing.
We'll pass our session into a new AVCaptureVideoPreviewLayer and add it to our view. Then we just need to start the session. If we run the app we'll see it's starting to look like a camera, neat!
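A sketch of that last step inside our view controller (assuming `session` is the AVCaptureSession configured in setupSession()):

```swift
import AVFoundation
import UIKit

// Sketch: display the live camera feed, then start the session.
class CameraViewController: UIViewController {
    let session = AVCaptureSession() // configured in setupSession()

    func setupPreview() {
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        session.startRunning()
    }
}
```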
Tomorrow, we'll finish up by adding the ability to actually capture some photos.
My continued thanks to imgix for sponsoring this week's Bites! imgix is real-time image resizing as a service. They allow us to resize, crop, and process images on the fly, simply by changing their URLs. Let's take a look.
There are client libraries for all sorts of languages and environments, but for basic usage we don't even really need one. That's part of what makes imgix so awesome: we can play around with sizes, crops, effects, even draw text or watermarks on images just by changing some URL parameters.
They use our existing image storage locations as a source (supports public web folders, S3 buckets, and Web Proxies). We can configure multiple image sources, set caching times, and set custom domains on their site. Response times for images average around 70ms.
Let's look at using imgix in an iOS or OS X app. After signing up, we can use imgix's great Objective-C/Swift client library maintained by Sam Soffes. It's called imgix-objc (more info can be found at git.io/imgixobjc).
We'll create a client, and our first image URL. This will generate a signed image URL we can load anywhere.
Boom, say hello to Little Bites of Cocoa night mode.
Fun! A more realistic use case would be handling Retina displays. We can store our images' original size in full-resolution, then use imgix's awesome dpr parameter (plus a width and height) to serve up perfect, crisp images on any iOS device (or any display on just about any device for that matter).
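As a hedged illustration of the idea (example.imgix.net and the /photo.jpg path are placeholders, and the real client library would also append a signature for us), dpr is just another query parameter on the image URL:

```swift
import Foundation

// Hypothetical helper: build an imgix-style URL with width, height, and
// dpr query parameters. The host and path are placeholders.
func imgixURL(path: String, width: Int, height: Int, dpr: Int) -> URL? {
    var components = URLComponents()
    components.scheme = "https"
    components.host = "example.imgix.net"
    components.path = path
    components.queryItems = [
        URLQueryItem(name: "w", value: String(width)),
        URLQueryItem(name: "h", value: String(height)),
        URLQueryItem(name: "dpr", value: String(dpr)),
    ]
    return components.url
}

// On a 2x Retina screen, ask for double the pixels at the same point size.
let retinaURL = imgixURL(path: "/photo.jpg", width: 320, height: 200, dpr: 2)
// retinaURL → https://example.imgix.net/photo.jpg?w=320&h=200&dpr=2
```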
Stop maintaining your Frankenstein ImageMagick setup and let imgix handle it. It's totally free to get started. Use the link littlebitesofcocoa.com/imgix so they know you heard about them here.
At one point or another, we've all heard some form of "Oh you make apps? I'm interested in that, but I'm having trouble just getting started". Today we'll take a look at the specific steps to create our first app and where to go from there. Let's dive in:
We'll need a Mac. We'll open the Mac App Store and search for "Xcode", then install it. (It's free).
Once it's installed, we'll open Xcode. We'll allow it to verify its installation, and install any components it needs to.
Then we'll go to the File menu and choose File > New > Project… We'll leave the Master-Detail iOS Application template selected and click Next. We'll give our new app a name: "Foodstagram". We'll enter "com.somename" as the organization identifier, select Swift as the language, and click Next. Finally, we'll choose where to save our new project, then click Create.
When our project opens, we'll hit the little ▶ button in the top left corner. Xcode will build our new app and launch it in the Simulator.
Hey, it's an app! Hit the + button to add some entries, then Edit to delete them.
Congratulations, our new app-venture (sorry, had to) has begun! Next, we need to learn how to learn. When first starting out, much of our time will be spent Googling for things we don't know yet. This is completely normal. Let's start right now.
We'll look at going from idea to implementation. Our idea: "The top bar of the app should be green."
We'll Google for how to do it, using as specific a phrase as we can given what we know so far:
"ios swift change top bar background color".
We'll read the first result, from Stack Overflow. The first answer contains some possibly helpful code snippets, but we'll read a few more to be sure. It seems the rest of the answers all suggest a similar piece of code talking about "appearance". Sounds promising. One even mentions "color" and "green", so we'll copy it for pasting later:
We'll keep reading our search results until we find a coderwall post from Eranga Bandara that answers our question: "Add following code to didFinishLaunchingWithOptions function in AppDelegate.swift".
Perfect, we'll do just that and click the AppDelegate file in Xcode, then paste our copied code at the end of the specified function.
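The pasted snippet probably looked something like this (barTintColor is the property that tints a navigation bar's background; the exact color value would be whatever the answer used):

```swift
import UIKit

// Inside application(_:didFinishLaunchingWithOptions:):
UINavigationBar.appearance().barTintColor = UIColor.green
```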
We can check if it worked by clicking the ▶ button again. Success! Our top bar is now green!
We can't learn everything this way, so our next step is to try to find something to teach us the rest of the basics. There's plenty of great resources out there, including one that's just getting started.
In Bite #98, we learned how to work with Auto Layout in code. While this helped us gain a ton of flexibility, our code is now a bit... verbose. For simple layouts this won't be a big deal, but for anything complex we would quickly rack up hundreds of lines of code just for our layout. Today we'll look at one solution for this issue: Auto Layout's Visual Format Language. Let's take a look.
Before jumping into code, let's look at the Visual Format Language itself. It's essentially just a string that the system parses and turns into an array of NSLayoutConstraint objects:
"|-[header]-|"
The two pipes on the outside represent the superview. The dashes represent spacing, in this case, the system-standard spacing. Views are described with a name in brackets.
When parsed, this would create 2 new constraints: 1 leading and 1 trailing. It would keep the header the standard spacing away from the left and right edges of the superview. To let the system know our views by name we put them in a dictionary:
let views = ["header": header]
If we wanted to use a specific spacing amount we could describe that within dashes on the edges of our Visual Format Language:
"|-20-[header]-20-|"
That's fine for simple spacing, but what if we needed to reference lots of numeric values? We can name those and put them in a dictionary just like our views:
"|-edgeSpacing-[header]-edgeSpacing-|"
Now that we have a basic understanding, we can add our first set of constraints:
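Here's a hedged sketch of what that might look like, using hypothetical `header` and `containerView` names and the named-metric format string from above:

```swift
import UIKit

let containerView = UIView()
let header = UIView()
header.translatesAutoresizingMaskIntoConstraints = false
containerView.addSubview(header)

// Names for the format string's views and metrics.
let views = ["header": header]
let metrics = ["edgeSpacing": 20]

// Parse the VFL into NSLayoutConstraint objects, then activate them.
let constraints = NSLayoutConstraint.constraints(
    withVisualFormat: "|-edgeSpacing-[header]-edgeSpacing-|",
    options: [], metrics: metrics, views: views)
NSLayoutConstraint.activate(constraints)
```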
We've only scratched the surface of what's possible with Visual Format Language. In the future we'll look at configuring inequalities, constraint priorities, and more.
Auto Layout works great in Interface Builder, but it's often helpful to have the flexibility and clarity of wiring up constraints in code. Let's dive in.
We'll add a view and set translatesAutoresizingMaskIntoConstraints to false. Normally Interface Builder does this automatically under the hood, but since we're working in code we'll need to set it ourselves. Don't want any funky autoresizing constraints in there meddling around.
Whew! That's a long constructor. The neat part though, is from left-to-right it almost reads like the equations from before. Lastly, we'll assign a fixed size to our logo view. Since there's only 1 view involved, the equation is much simpler:
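As a sketch (current Swift; `logoView` and the 200-point size are example values), a fixed-size constraint relates a single view to nothing at all:

```swift
import UIKit

let logoView = UIImageView()
logoView.translatesAutoresizingMaskIntoConstraints = false

// logoView.width = (nothing × 1.0) + 200
let width = NSLayoutConstraint(
    item: logoView, attribute: .width,
    relatedBy: .equal,
    toItem: nil, attribute: .notAnAttribute,
    multiplier: 1.0, constant: 200.0)
logoView.addConstraint(width)
```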
EventKit Alarms are how we configure notifications and alerts that will be triggered to remind the user about calendar events or reminders.
Like in Bite #96, we'll use Timepiece (Bite #3) to help us compose dates. We'll create an event for when our cookies will be done baking. The event will start in 12 minutes, and we'll add a new EKAlarm to it that triggers 5 seconds before the event. Alarms can be created at absolute or relative times.
let e = EKEvent(eventStore: eventStore)
e.startDate = NSDate() + 12.minutes
e.endDate = e.startDate + 30.seconds
e.calendar = eventStore.defaultCalendarForNewEvents
e.title = "Cookies are Done! 🍪"
e.addAlarm(EKAlarm(relativeOffset: -5.0))
try eventStore.saveEvent(e, span: .ThisEvent)
Interestingly, OS X is actually ahead of iOS here. EKAlarm on OS X has more properties for configuring a sound to play, an email address to notify, and more.
Now when we get to Disneyland, the alarm will remind us to head over to . (Just an example, in real life we'd never need the reminder).
Things get more fun when adding alarms to reminders. Let's finish by adding an alarm to a new reminder that triggers when the user arrives somewhere:
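A rough sketch of what that could look like in current Swift (the reminder title, coordinates, and radius are made-up example values, with Disneyland's approximate location standing in for "somewhere"):

```swift
import EventKit
import CoreLocation

let eventStore = EKEventStore()

let reminder = EKReminder(eventStore: eventStore)
reminder.title = "Grab a churro" // hypothetical example
reminder.calendar = eventStore.defaultCalendarForNewReminders()

// Fire the alarm when the user arrives near a location.
let location = EKStructuredLocation(title: "Disneyland")
location.geoLocation = CLLocation(latitude: 33.8121, longitude: -117.9190)
location.radius = 100 // meters

let alarm = EKAlarm()
alarm.structuredLocation = location
alarm.proximity = .enter

reminder.addAlarm(alarm)
try? eventStore.save(reminder, commit: true)
```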
EventKit is how we access and interact with a user's calendars and events. It has APIs to manage events, reminders, alarms and even participants. Today we'll take a look at the basics. Let's get started.
Authorization
Before doing anything fun, we'll need to check the current authorization status, then request permission if needed.
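A sketch of that check (current Swift, matching the era of the post; newer SDKs add full/write-only access variants):

```swift
import EventKit

let eventStore = EKEventStore()

switch EKEventStore.authorizationStatus(for: .event) {
case .authorized:
    break // good to go
case .notDetermined:
    eventStore.requestAccess(to: .event) { granted, _ in
        // Only touch the user's calendars if `granted` is true.
    }
default:
    break // denied/restricted: explain, and point the user at Settings
}
```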
Lastly, we'll use the handy defaultCalendarForNewEvents property to put new events where the user expects:
We'll use Timepiece (Bite #3) to help us compose dates. Then we'll create a predicate to look for events in the last week, across all the user's calendars.
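The fetch might look like this sketch (the post composes the dates with Timepiece; plain Calendar math is shown here instead):

```swift
import EventKit

let eventStore = EKEventStore()
let now = Date()
let oneWeekAgo = Calendar.current.date(byAdding: .day, value: -7, to: now)!

// nil calendars = search across all the user's calendars.
let predicate = eventStore.predicateForEvents(withStart: oneWeekAgo,
                                              end: now,
                                              calendars: nil)
let events = eventStore.events(matching: predicate)
```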
Adding an Event
let event = EKEvent(eventStore: eventStore)
event.title = "Drop off Carbonite Shipment"
event.startDate = "3031-02-15".dateFromFormat("yyyy-MM-dd")
event.calendar = eventStore.defaultCalendarForNewEvents
try eventStore.saveEvent(event, span: .ThisEvent)