Parallax Scrolling as a Category on UICollectionView

iOS 7 introduced a layered design language where the system creates the illusion of three-dimensionality by moving views on screen in response to the movement of the device. The parallax on the lock screen is probably the best known example of this effect. Some system views such as alert dialogs use this effect as well, and as a developer you can use it in your own apps with the UIMotionEffect classes. Ash Furrow wrote a really nice tutorial on how to accomplish this back in 2013.

Another place where a similar parallax effect can be quite useful is scroll views, where the contents of the cells scroll slightly differently than the main scroll view to give the user the illusion that the cells are cut-outs showing an object which sits deeper in the screen. Probably the best known example of this effect is the images in WhatsApp chats. Below you can see this effect in action in a simple demo app:

A collection view with image cells with parallax scrolling

I recently wanted to implement something similar for a collection view with a custom layout. The most straightforward way to achieve this effect is to implement scrollViewDidScroll: and adjust the contents of the cells accordingly, as in this project. Another, more elegant approach, which Ole Begemann described in his blog post, is to do it as part of a UICollectionViewLayout.

Since I was already using a custom quilt layout, I decided to subclass and extend it with parallax scrolling as Ole described.

Which worked great, as expected, but only in the simulator…

Since the layout gets invalidated at each scroll event, it needs to be recomputed constantly. And in the case of the quilt layout this turned out to be very expensive: a collection with about 50 photos could not be scrolled at 60fps on an iPhone 6.

This could be resolved by implementing some smart caching of the layout attributes in the layout subclass and returning them with only their parallax offsets adjusted. Depending on the nature of the layout, this could introduce significant complexity. Complexity is usually a pretty good sign of our code trying to tell us that we’re swimming against the flow.
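For illustration, such a caching layout might look roughly like the following. This is a hedged sketch only: QuiltLayout and all names below are hypothetical, and a real implementation would hand the parallax value to the cell through a UICollectionViewLayoutAttributes subclass, as Ole describes, rather than moving the whole cell with a transform.

```objc
// Hypothetical sketch: cache the expensive base attributes once and
// re-apply only the cheap parallax adjustment on each scroll event.
@interface CachedParallaxQuiltLayout : QuiltLayout // hypothetical base class
@property (nonatomic, strong) NSArray *cachedAttributes;
@end

@implementation CachedParallaxQuiltLayout

- (void)prepareLayout {
    [super prepareLayout];
    if (self.cachedAttributes == nil) {
        // Expensive pass: let the superclass compute the full quilt once.
        CGRect contentRect = (CGRect){CGPointZero, self.collectionViewContentSize};
        self.cachedAttributes = [super layoutAttributesForElementsInRect:contentRect];
    }
}

- (NSArray *)layoutAttributesForElementsInRect:(CGRect)rect {
    CGRect bounds = self.collectionView.bounds;
    CGFloat boundsMidY = CGRectGetMidY(bounds);
    NSMutableArray *result = [NSMutableArray array];
    for (UICollectionViewLayoutAttributes *cached in self.cachedAttributes) {
        if (!CGRectIntersectsRect(cached.frame, rect)) continue;
        // Cheap pass: copy the cached attributes and adjust only the
        // parallax-dependent part for the current scroll position.
        UICollectionViewLayoutAttributes *attributes = [cached copy];
        CGFloat maxOffset = bounds.size.height / 2 + cached.size.height / 2;
        CGFloat parallax = -(boundsMidY - cached.center.y) * (30.0 / maxOffset);
        attributes.transform = CGAffineTransformMakeTranslation(0, parallax);
        [result addObject:attributes];
    }
    return result;
}

- (BOOL)shouldInvalidateLayoutForBoundsChange:(CGRect)newBounds {
    return YES; // invalidate on every scroll so the parallax updates
}

@end
```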

So, I decided to listen to my code and implement parallax scrolling by implementing scrollViewDidScroll:, without touching the layout attributes. Since the app I’m developing has multiple collection views full of photos already implemented, I could either introduce a common superclass and implement scrollViewDidScroll: there, or go the composition route. Due to its flexibility, I chose composition and decided to implement parallax scrolling as a category on UICollectionViewController.

Here is the header for the ParallaxScroll category:

@import UIKit;

@protocol UICollectionViewCellParallax <NSObject>

- (void)updateWithParallaxOffset:(CGPoint)offset;

@end

@interface UICollectionViewController (ParallaxScroll)

@end


It declares an informal protocol with a single method, updateWithParallaxOffset:, which we will later implement in the collection view cells that should have parallax scrolling.

And the implementation contains a single method:

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    NSArray *visibleCells = self.collectionView.visibleCells;
    for (UICollectionViewCell *cell in visibleCells) {
        if ([cell respondsToSelector:@selector(updateWithParallaxOffset:)]) {
            CGRect bounds = self.collectionView.bounds;
            CGPoint boundsCenter = CGPointMake(CGRectGetMidX(bounds),
                                               CGRectGetMidY(bounds));
            CGPoint cellCenter = cell.center;
            CGPoint offsetFromCenter =
                CGPointMake(boundsCenter.x - cellCenter.x,
                            boundsCenter.y - cellCenter.y);
            CGSize cellSize = cell.bounds.size;
            CGFloat maxVerticalOffset =
                (bounds.size.height / 2) + (cellSize.height / 2);
            CGFloat scaleFactor = 30.0 / maxVerticalOffset;
            CGPoint parallaxOffset = CGPointMake(0.0, -offsetFromCenter.y * scaleFactor);
            [(id)cell updateWithParallaxOffset:parallaxOffset];
        }
    }
}

Since scrollViewDidScroll: is implemented at the UICollectionViewController level, all of its subclasses in our app are able to handle parallax scrolling. At the same time, we can implement scrollViewDidScroll: in the subclasses and still retain the parallax scrolling, as long as the child implementations call [super scrollViewDidScroll:]. The informal protocol UICollectionViewCellParallax is central to how this approach works: parallax scrolling can be selectively turned on and off by implementing updateWithParallaxOffset: or not in a UICollectionViewCell subclass. For a simple photo cell, updateWithParallaxOffset: can be as simple as moving the image view accordingly:

- (void)updateWithParallaxOffset:(CGPoint)offset {
    self.imageViewCenterYConstraint.constant = offset.y;
}
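As an example of the subclassing point above, a subclass that needs its own scroll handling can keep the parallax behaviour by calling through to super; PhotoGridViewController here is a hypothetical subclass, not part of the category:

```objc
// Hypothetical subclass with its own scroll logic on top of the
// ParallaxScroll category.
@interface PhotoGridViewController : UICollectionViewController
@end

@implementation PhotoGridViewController

- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    // Let the category implementation on UICollectionViewController
    // update the visible cells first.
    [super scrollViewDidScroll:scrollView];

    // ...subclass-specific scroll handling goes here...
}

@end
```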

By implementing parallax scrolling with a category and an informal protocol, we get the best of both approaches: the scrolling stays very smooth, because the layout does not get recomputed at every scroll position, and we get to cleanly separate the parallax scrolling from the rest of our view controller code.

Android Diary #3: Status Report

I started exploring the Android development environment as I wrote in the last two posts.

After installing the latest Android Studio beta, I jumped head first into implementing simple things. The whole issue #11 of objc.io was very helpful, but the Android 101 tutorial for iOS developers by Stephen Barnes was my guiding light.

Initially, I built a simple skeleton app with a list view as the home screen and a button which takes the user to a map view. The list view tutorial by Lars Vogel is a great resource for learning about list views on Android.

Showing a map view on Android turned out to be significantly more involved than using an MKMapView on iOS, due to the peculiar way the Android ecosystem works. There is no map view in the Android core; developers need to import a separate closed-source SDK from Google called Google Play Services, get the certificate for their app using a command line tool (at least for the debug mode), sign up for the Google API Console and create a key for the Maps API before they can use a map view in their apps.

On the iOS side, the only comparable complexity is the provisioning profile dance, and that has been getting better and better in recent years; for the most common functionality, it is as simple as logging into your Apple Developer account through Xcode these days.

As a matter of fact, this issue with the map view on Android is a great example of the general feeling of the development experience when coming over from the iOS world: life as an Android developer is a hard one.

All in all, things are going well, and I was able to record a short walk today. But that will be the subject of another post, about the differences in how iOS and Android handle background services.

Android Diary #2: The Goal: A Flight Logger for Glider Pilots

Yesterday I posted that I would start experimenting with Android development.

Now I would like to explain what app I will be building and why.

I started learning how to fly a glider last summer. It is great fun and very exciting but also quite regulated because of safety. One of those regulations states that each pilot needs to keep a logbook in order to document all of their flights. Being a left-handed geek who despises pen and paper, I started looking around for an app that would help me with that.

Here are the features that I would like to have:

  • The user should be able to record a flight including:
    • The basic information like the name of the pilot, the time in UTC, because it’s legally required
    • Geeky information like the flight path and altitude profile, because it’s cool
  • The user should be able to see a list of previous flights
  • The user should be able to see details of a single flight
  • The user should be able to edit the basic details of a flight after the recording
  • The user should be able to see the basic information about the flight
  • The user should be able to see a map with the recorded flight path
  • The user should be able to see an altitude profile
  • The user should be able to filter their past flights according to:
    • With or without instructor
    • Guest name
    • Plane model
  • The user should be able to export a list of flights (potentially filtered) as a PDF
  • The user should be able to export a flight as an IGC file

I couldn’t find an app which fulfills my requirements. Most of the apps that I could find were either designed to be an auxiliary instrument in the cockpit or to be a recorder to submit flights to online gliding competitions like OLC. The obvious solution of quickly developing the app I have in mind on iOS doesn’t work because the GPS altitude is not accurate enough and none of the iOS devices have a barometric pressure sensor to measure the altitude.

A flight path recorded by AndroFlight

More recent Android devices do contain a pressure sensor, and there are some Android apps which come closer to what I’d like to have, like AndroFlight. But during my testing last weekend, it wasn’t able to record the barometric altitude: the recorded flight log only contains the GPS altitude. Moreover, both the design and the usability of the app seem to be from another era: the app couldn’t display the data it recorded, it wasn’t even clear that it had recorded anything, and it took me two days to figure out how to get the recorded flight log out of the app.

AndroFlight Main Menu

The only solution left was to develop the app on Android myself, since I already wanted to look at the grass on the other side and experiment with Android.

My plan to tackle this problem is as follows:

  1. Learn about the basics of Android development and automated testing
  2. Start implementing the basic view controllers and the navigation inside the app
  3. Figure out how to record GPS coordinates and barometric altitude at the same time
  4. Find out how to serialize the recorded data
  5. Display a list of past flights
  6. Display detailed information about a particular flight
  7. Filter flights according to user-selected criteria
  8. Fly a lot during the development to gather test data…

Me in the cockpit of an ASK 21

I will be blogging about my experience during the development of my Flight Logger here. Please drop me a line if you have any ideas or comments.

Android Diary #1: Prologue

I have been developing for iOS since the first iPhone SDK came out in 2008. Originally I was on a completely different career path, and learning a bit of the SDK and getting my own little apps out was my internal rationalization for the viscerally desired iPhone. So the apps would pay for the iPhone, right?

Fast forward 6 years and my first Android phone (a Nexus 5 in white) was delivered yesterday. This time there was no particular desire to get the Nexus. There was no real rationalization required. I just wanted to see with my own eyes what the ‘other side’ was like. I wanted to experience the system and the apps, compare and contrast, make up my own mind instead of reading about it here and there…

Being a developer, I fully intend to explore the system, libraries and the development tools. I already have an idea for a small Android app that I want for myself: a simple flight logger that doesn’t suck.

Following Brent Simmons’ example of publishing development notes, I will be publishing my thoughts and work on my little flight logger here.

How to Implement the Frosted Glass Effect in iOS

The frosted glass effect is a central design element in iOS 7 and there are various ways how you can add it to views in your app. I have been working a lot with blur effects lately and decided to write up what I’ve learned in a series of posts. Let’s start with the simple case of creating a static frosted glass effect before moving on to blurred backgrounds with animations.

A UIToolbar instance can be used as a superview instead of a normal UIView to provide a frosted glass effect; however, with iOS 7.1 it has become much more difficult to configure the desired effect correctly. There are also reports of Apple rejecting apps going this route.

A better approach is to take advantage of the drawViewHierarchyInRect:afterScreenUpdates: method in UIView to efficiently capture an image of the background and use the UIImage+ImageEffects category by Apple to obtain a customized blur effect to achieve the frosted glass effect. This is best achieved in the willMoveToSuperview: method of a UIView subclass which implements the frosted glass effect:

- (void)willMoveToSuperview:(UIView *)newSuperview {
    [super willMoveToSuperview:newSuperview];
    if (newSuperview == nil) {
        return;
    }
    UIGraphicsBeginImageContextWithOptions(newSuperview.bounds.size, YES, 0.0);
    [newSuperview drawViewHierarchyInRect:newSuperview.bounds afterScreenUpdates:YES];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // CGImageCreateWithImageInRect works in pixels, so scale the frame from points.
    CGRect cropRect = CGRectApplyAffineTransform(self.frame, CGAffineTransformMakeScale(img.scale, img.scale));
    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(img.CGImage, cropRect);
    UIImage *croppedImage = [UIImage imageWithCGImage:croppedCGImage scale:img.scale orientation:UIImageOrientationUp];
    CGImageRelease(croppedCGImage);
    self.backgroundImage = [croppedImage applyBlurWithRadius:11
                                                   tintColor:[UIColor colorWithWhite:1 alpha:0.3]
                                       saturationDeltaFactor:1.8
                                                   maskImage:nil];
}

Since the background is computed once the frosted glass view moves into a superview, changing the frame of the frosted glass view will destroy the illusion. So make sure that you use this trick only for views where you know the frame or the background will not be changing.

The source code for this simple frosted glass view is on Github. In the next post in this series, I’ll show how this view can be extended to support changing the frame over a static background.

Speech Synthesis on iOS 7

One of the new APIs introduced with iOS 7 is the new speech synthesizer in the AVFoundation framework. It went relatively unnoticed among the flood of more significant changes in iOS 7.

I recently received a review for my app The Seven Minute Workout asking for spoken announcements of the exercises before they start. What a nice idea, I thought; let me record somebody (read: a native speaker, preferably female) saying the names of the exercises, add the audio files to the app bundle and play them when appropriate. Then a eureka moment came: speech synthesis!

After a bit of googling and finding out about the new Apple speech synthesizer, the documentation for the API turned out to be quite straightforward. There are two classes involved in making your iPhone speak: AVSpeechUtterance, which represents what is being said, and AVSpeechSynthesizer, which does the speaking itself.

Here is some simple example code:

    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Hello World!"];
    utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];
    utterance.rate = 0.70*AVSpeechUtteranceDefaultSpeechRate;

    AVSpeechSynthesizer *synth = [[AVSpeechSynthesizer alloc] init];
    [synth speakUtterance:utterance];

The code itself is pretty straightforward. The only important details that you should pay attention to are the voice, which defaults to the locale-specific language, and the rate of the utterance: the default rate is probably optimized for turn-by-turn navigation and was way too fast for the purposes of The Seven Minute Workout.

View Controller Containers and Status Bar Style

One of the changes in iOS 7 is the new method of determining the status bar style through the preferredStatusBarStyle method.

The basic idea is quite simple: the system asks the current view controller for the preferred style by calling preferredStatusBarStyle whenever a status bar update is triggered. In the case of a container view controller, the container can forward the call to a child view controller by implementing the childViewControllerForStatusBarStyle method and returning the child view controller which should receive the preferredStatusBarStyle message. If childViewControllerForStatusBarStyle returns nil or is not implemented at all, the container view controller itself is expected to return a preferred status bar style.

This is all well and good, but Apple decided not to implement childViewControllerForStatusBarStyle for UINavigationController and UITabBarController, preventing the view controllers contained in these containers from determining the preferred status bar style.

However, this can easily be added by creating categories which implement childViewControllerForStatusBarStyle.

Here is how this can be implemented for UITabBarController:

@implementation UITabBarController (StatusBarStyle)

- (UIViewController *)childViewControllerForStatusBarStyle {
    return self.selectedViewController;
}

@end

And UINavigationController:

@implementation UINavigationController (StatusBarStyle)

- (UIViewController *)childViewControllerForStatusBarStyle {
    return self.topViewController;
}

@end

When you include these categories in your project, your navigation and tab bar controllers will let their top and selected view controllers control the status bar style. If you use this approach, it is a good idea to call [self setNeedsStatusBarAppearanceUpdate] in your view controller’s viewDidAppear: to make sure that your preferredStatusBarStyle method gets called.
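Inside a contained view controller, this could look like the following sketch; MyChildViewController is a hypothetical view controller living inside a navigation or tab bar controller:

```objc
// Hypothetical child view controller that wants a light status bar.
@interface MyChildViewController : UIViewController
@end

@implementation MyChildViewController

- (UIStatusBarStyle)preferredStatusBarStyle {
    return UIStatusBarStyleLightContent;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Trigger a status bar update so preferredStatusBarStyle gets queried.
    [self setNeedsStatusBarAppearanceUpdate];
}

@end
```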

Adding AirDrop Support to Your App

One of the most interesting new features of iOS 7 is the AirDrop functionality. AirDrop allows users to send files between devices without the need for an Internet connection by establishing a direct connection between two devices using the built-in Bluetooth and WiFi protocols. This makes it very easy and very fast to exchange data between devices and can be a valuable competitive advantage for the apps that need to transfer files between iOS devices.

While sending files over AirDrop is quite easy, receiving them is a bit more involved but nothing to be afraid of. Let’s look at some example code to see how AirDrop support can be added to your app:

Sending Files

To give your own app the option to share files over AirDrop, you need to use the system-provided class UIActivityViewController:

- (void)presentActivities:(id)sender {
    UIActivityViewController *activities =
        [[UIActivityViewController alloc] initWithActivityItems:@[_url]
                                          applicationActivities:nil];
    [self presentViewController:activities animated:YES completion:nil];
}

Here _url is the file URL of the file that you would like to send. After initialization, the activities controller is presented modally. If there are any other devices with AirDrop turned on nearby, the activity controller will automatically show them and handle the file transfer for you.

That’s it! No, really, you don’t need to do more to send files over AirDrop from your app!

Receiving Files

In order to be able to receive files over AirDrop there are two steps that you need to implement:

  • Declare your app as a handler for the desired file types.
  • Handle the received AirDrop file in your app delegate.

If we want our app to be recognized as a handler for a certain file type, we have to declare it as such by adding the corresponding UTI to our Info.plist. In Xcode 5, this can be achieved by going to the Info tab of our build target and adding the desired UTI under “Document Types”. One important caveat here is that some UTIs are reserved for the system. For instance, if you declare your app as a handler for public.image, public.jpeg or public.png, this will not be honoured by the system at run time, because those UTIs seem to be reserved for Apple’s own Camera and Photos apps.
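As a sketch, a document type declaration in Info.plist could look like the following; the type name and UTI below are placeholders for your own:

```xml
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>My Custom Document</string>
        <key>LSHandlerRank</key>
        <string>Owner</string>
        <key>LSItemContentTypes</key>
        <array>
            <string>com.example.mydocument</string>
        </array>
    </dict>
</array>
```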

After you have declared your desired UTIs, you need to implement the method application:openURL:sourceApplication:annotation: in your application delegate. When the system routes an AirDrop file to your app, this method gets called and the URL points to a file in your app’s Documents/Inbox directory. This is a special directory where your app only has read permissions; if you need to modify the file, you must move it to another directory first. Another point to pay attention to is that the files in this directory can be encrypted using data protection. Normally the files inside your app’s Documents directory are freely accessible, but in some cases, by the time this method gets called, the user could have already locked the device. Therefore it is a good idea to check first whether the file is readable, using the protectedDataAvailable property of the application object which gets passed to this delegate method as the first parameter. The Apple documentation recommends saving the AirDrop URL for later and returning YES from this method even if the file could not be accessed due to data protection.
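Put together, the receiving side could look roughly like this sketch; the pendingAirDropURL property is a hypothetical placeholder for however your app remembers the file for later:

```objc
- (BOOL)application:(UIApplication *)application
            openURL:(NSURL *)url
  sourceApplication:(NSString *)sourceApplication
         annotation:(id)annotation {
    if (![application isProtectedDataAvailable]) {
        // The device is locked and the file may be encrypted;
        // remember the URL and process it later, as Apple recommends.
        self.pendingAirDropURL = url; // hypothetical property
        return YES;
    }

    // Files in Documents/Inbox are read-only, so move the file out
    // before modifying it.
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSURL *documentsURL = [[fileManager URLsForDirectory:NSDocumentDirectory
                                               inDomains:NSUserDomainMask] firstObject];
    NSURL *destinationURL = [documentsURL URLByAppendingPathComponent:url.lastPathComponent];
    NSError *error = nil;
    [fileManager moveItemAtURL:url toURL:destinationURL error:&error];
    return (error == nil);
}
```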

If you would like to get up to speed on further new technologies introduced in iOS 7 such as UIKit Dynamics, Multipeer Connectivity, Custom View Controller Transitions and much more, check out our book Developing an iOS 7 Edge, which I wrote with an amazing group of top-notch iOS developers.

Make Your Custom UITableViewCells Scroll Smoothly

We all need custom UITableViewCells with many subviews from time to time.

The lowest-hanging fruit for making them scroll more smoothly is to reduce the amount of drawing and layout taking place in the subviews:

cell.layer.shouldRasterize = YES;
cell.layer.rasterizationScale = [UIScreen mainScreen].scale;

These two lines made Moped’s scroll frame rate go from about 30fps to about 45fps on an old iPod touch! 60fps is much closer now.

There is a small caveat though: this method causes all the views inside the table cell to be rendered into a bitmap, which is then drawn to the screen quickly as the cell scrolls. If you have cells with an animating progress bar or similar, however, this won’t give you any improvement and might even make things worse, because the rasterized bitmap needs to be redrawn every time your progress bar updates.
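One way to deal with such cells is to rasterize only while the cell is static; in this sketch, hasAnimatingContent is a hypothetical flag on your cell subclass:

```objc
// In a UITableViewCell subclass: rasterize only static cells.
- (void)setHasAnimatingContent:(BOOL)hasAnimatingContent {
    _hasAnimatingContent = hasAnimatingContent;
    // A rasterized layer must redraw its bitmap on every animation
    // frame, so turn rasterization off while the cell animates.
    self.layer.shouldRasterize = !hasAnimatingContent;
    self.layer.rasterizationScale = [UIScreen mainScreen].scale;
}
```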