Be Cool with CIFilter Animations πŸ”₯


The Problem

SO. This week I was tasked to look into a very difficult animation. This was the kind of animation that I dread even though I LOVE doing animations.

This animation was the kind that made me raise my eyebrow. The kind that made you wish you knew OpenGL. The kind that made you regret not looking into OpenGL when you had the chance a couple of years ago. The kind that made you kick yourself for being too lazy to learn it when you had the time.

That's right. I had to somehow make an image ripple. Or a wave. Whatever. As long as it looked cool.

Of course, as is expected, no one knew OpenGL on my team. And worse still, I told people I could do it if given enough time. Way to dig yourself a hole, Hector.

So sure enough, I did what programmers do best. I did some stretches and prepared for some major Google-Fu.

Here's where Google brought me first:

//Solution #1
[UIView beginAnimations:nil context:NULL];
[UIView setAnimationDuration:1.0];
[UIView setAnimationTransition:(UIViewAnimationTransition)110 forView:view cache:NO];
[UIView commitAnimations];

//Solution #2
CATransition *animation = [CATransition animation];
[animation setDelegate:self];
[animation setDuration:2.0f];
[animation setTimingFunction:[CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut]];
[animation setType:@"rippleEffect"];
[myView.layer addAnimation:animation forKey:NULL];

Of course, you can imagine my surprise when that didn't work. Who would've thunk undocumented APIs wouldn't work? (yes, there is a heavy hint of sarcasm in there) And besides, when you really need something in iOS to work, never assume 4 lines of code will work for you. Also, it goes without saying that this code violates the App Store's policy on using undocumented APIs.

After that initial search I basically gave up. All the other results Google yielded were over 3 years old or too difficult to understand. That's when I stumbled into a marvelous, overly-complicated, barely-documented class: CIFilter. In its documentation, I saw something pretty promising - CIRippleTransition. After looking into Apple's Core Image Programming Guide, I can confidently say one thing: it was pretty apparent that I was too dumb to understand how to do anything. The closest thing that helped was this list of 11 steps:

  1. Create Core Image images (CIImage objects) to use for the transition.
  2. Set up and schedule a timer.
  3. Create a CIContext object.
  4. Create a CIFilter object for the filter to apply to the image.
  5. On OS X, set the default values for the filter.
  6. Set the filter parameters.
  7. Set the source and the target images to process.
  8. Calculate the time.
  9. Apply the filter.
  10. Draw the result.
  11. Repeat steps 8–10 until the transition is complete.

Of course, it wasn't readily apparent what most of these steps meant, so I figured it out for you! I even condensed it down to 3 steps (you're welcome):

  1. Get yourself acquainted with a filter's attributes.
  2. Apply those attributes to your filter.
  3. Set a timer using CADisplayLink & update your image view in your timer function.

The steps above are actually pretty simple once you understand what's going on, but there are a lot of pitfalls along the way. Also, these steps aren't only applicable to CIFilter transitions - they can also be used to perform incremental animations on regular CIFilters. Personally, I like to think of these steps as what you would do if you were creating a programmatic version of a gif. Let's take them step by step to create a CIFilter animation/transition in iOS 9 (mainly because CIRippleTransition is only available starting in iOS 9, but this will work for supported filters in earlier iOS versions).

Alright, so kick back, relax, and grab a glass of wine or something.

The Solution

Step 1 - Get yourself acquainted with a filter's attributes.

CIFilter instances have a dictionary property called attributes. When creating a CIFilter, you should always make it a habit to print out your filter's attributes. If you aren't the printing type, then Apple's Core Image Filter Reference should be an invaluable asset to you. That documentation lists the value requirements for every filter supported by CIFilter.
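If you are the printing type, dumping those attributes only takes a couple of lines. Here's a minimal sketch using the CIRippleTransition filter we'll be working with:

//A minimal sketch: print every attribute (and its requirements) for the filter you care about.
if let rippleTransitionFilter = CIFilter(name: "CIRippleTransition") {
    print(rippleTransitionFilter.attributes)
}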

Pitfall #1

πŸ’€πŸ’€πŸ’€

Unfortunately, the attributes dictionary is read-only. This is because CIFilters rely heavily on KVC (key-value coding). To set these attributes, use NSObject's setValue:forKey: instance method. When determining which values to set, remember one thing: the only values you HAVE to set are the ones that aren't marked as having a default value. That being said, your application will crash if you set a value for an unlisted key. For our CIRippleTransition example, this line will crash when executed:

rippleTransitionFilter?.setValue(CIColor(color: UIColor.redColor()), forKey: kCIInputColorKey)

and the crash will say something like this:

"...setValue:forUndefinedKey:]: this class is not key value coding-compliant for the key inputColor."

AUTHOR'S NOTE: For those wondering, this is similar to how Interface Builder works when setting IBOutlets. This also may be why the crash would look familiar to you if you've forgotten to connect your IBOutlets in your storyboards.
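One easy way to dodge this crash is to ask the filter which keys it actually supports before calling setValue:forKey:. Here's a minimal sketch; the inputKeys property is just an array of the filter's supported input key names:

//Before setting a value, check the filter's inputKeys to see if the key is actually supported.
if let filter = rippleTransitionFilter where filter.inputKeys.contains(kCIInputColorKey) {
    filter.setValue(CIColor(color: UIColor.redColor()), forKey: kCIInputColorKey)
} else {
    //CIRippleTransition doesn't list inputColor, so we skip it instead of crashing.
    print("inputColor isn't supported by this filter")
}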

Step 2 - Apply those attributes to your filter

For our CIRippleTransition example, the documentation shows that the keys we need to set are the inputImage, inputTargetImage, and inputShadingImage keys. According to the docs, the inputImage is the original image to filter, and the inputTargetImage is the image you want to transition to. When printing out the attributes, the inputShadingImage attribute has a description that says, "An image that looks like a shaded sphere enclosed in a square image". Since I don't want to supply a custom shading image for this example, I'm going to feed it an empty CIImage instance because I just want a ripple to animate across my image. We can fake a ripple across our image by setting our inputTargetImage to the same image as our starting inputImage. This means the transition filter will transition from the original image to itself. Here's what setting these attributes looks like:

let coreImage = CIImage(image: UIImage(named: "TheKraken")!)
let rippleTransitionFilter = CIFilter(name: "CIRippleTransition")
rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputImageKey)

//If you want to transition to another image, you would supply a different image value here.
rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputTargetImageKey)
rippleTransitionFilter?.setValue(CIImage(), forKey: kCIInputShadingImageKey)

Pitfall #2

πŸ’€πŸ’€πŸ’€

What isn't readily apparent at this point is why I'm using CIImage instead of UIImage. The Core Image library can't work with UIKit's UIImage, so we need to convert any UIImages we use into CIImages. Fondly reminded of UIColor's CGColor property, I initially tried using UIImage's CIImage property. Unfortunately, this property can return nil more often than you'd think. Especially if you created an image using a CGImageRef in drawRect: like I did. This is because your UIImage can be backed by two different image types. It can be backed by a CGImageRef or it can be backed by a CIImage. In the UIImage header, this is what we see:

var CGImage: CGImage? { get } // returns underlying CGImageRef or nil if CIImage based

@available(iOS 5.0, *)
var CIImage: CIImage? { get } // returns underlying CIImage or nil if CGImageRef based

Long story short, if your UIImage is backed by a CGImageRef, calling image.CIImage won't work for you. CIImage does have an initializer that takes a UIImage, though, and it worked every time for me. I suggest using that initializer to create your CIImages:

let coreImage = CIImage(image: UIImage(named: "TheKraken")!)

The reason we need to use CIImage instead of UIImage is explained perfectly by the CIImage documentation online:

Although a CIImage object has image data associated with it, it is not an image. You can think of a CIImage object as an image β€œrecipe.” A CIImage object has all the information necessary to produce an image, but Core Image doesn’t actually render an image until it is told to do so. This β€œlazy evaluation” method allows Core Image to operate as efficiently as possible.
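If you want to see that CGImage-vs-CIImage backing distinction for yourself, a quick sanity check might look like this (assuming "TheKraken" lives in your asset catalog, it will come back CGImage-backed):

let assetImage = UIImage(named: "TheKraken")!
print(assetImage.CGImage) //Non-nil: images loaded from files or asset catalogs are CGImage-backed.
print(assetImage.CIImage) //nil - which is exactly why relying on this property bit me.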

Step 3 - Set a timer using CADisplayLink & update your image view in your timer function.

I looked at this step and I wondered - why? After looking at a few examples online, I saw that animating CIFilters means incrementally updating the filter, grabbing the filtered image, and redrawing it as many times as we can per second. That means the smart thing to do is plug into the display's refresh rate.

So this sounds easy enough. Most of us are aware of NSTimer, but for situations where I'm drawing to the display, I prefer to use CADisplayLink. With an NSTimer, a custom animation can stutter because you can't always determine how long a frame will take to render. CADisplayLink, however, calls your timer method every time the screen refreshes and redraws its contents. So let's set that up and plug it into what we already have so far:

//For the sake of bookkeeping, I am retaining a reference to our filter since we need to adjust it in our timer function
private lazy var filter: CIFilter? = {
    let coreImage = CIImage(image: UIImage(named: "TheKraken")!)

    let rippleTransitionFilter = CIFilter(name: "CIRippleTransition")
    rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputImageKey)
    rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputTargetImageKey)
    rippleTransitionFilter?.setValue(CIImage(), forKey: kCIInputShadingImageKey)

    return rippleTransitionFilter
}()

//Setting up a duration value and our transition's startTime so we can leverage it in our calculations later.
private var duration = 2.0
private var transitionStartTime = CACurrentMediaTime()

@IBOutlet private var imageView: UIImageView!

func rippleImage(duration: Double) {
    guard let filter = filter else {
        return
    }

    //Don't forget to keep track of your duration for calculations later.
    self.duration = duration

    //Update our start time since we immediately fire off our display link after this line.
    transitionStartTime = CACurrentMediaTime()

    let displayLink = CADisplayLink(target: self, selector: "timerFired:")
    displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSDefaultRunLoopMode)
}

As you can see, we have set up our display link to fire off and call a timerFired: function every time the display refreshes. In our timer function, we are going to update our filter's inputTime value (via kCIInputTimeKey) so the filter knows how far along it is in the transition, leveraging the transitionStartTime and duration variables we just set up in the code above. When printing out the attributes of our ripple transition filter, we also see that the description of inputTime states: "The parametric time of the transition. This value drives the transition from start (at time 0) to end (at time 1)." Knowing that, we can now drive the rest of our transition home.

func timerFired(displayLink: CADisplayLink) {
    guard let filter = filter else {
        //If the filter is nil, invalidate our display link.
        displayLink.invalidate()
        return
    }

    //Grab the difference of the current time and our transitionStartTime and see the percentage of that against our duration. Using min(), we guarantee that our percentage doesn't go over 1.0.
    let progress = min((CACurrentMediaTime() - transitionStartTime) / duration, 1.0)
    filter.setValue(progress, forKey: kCIInputTimeKey)

    //After we set a value on our filter, the filter applies that value to the image and filters it accordingly so we get a new outputImage immediately after the setValue finishes running.
    imageView.image = UIImage(CIImage: filter.outputImage!)

    if progress == 1.0 {
        //Reset to the original image (originalCoreImage is a stored reference to the CIImage we created for the filter).
        imageView.image = UIImage(CIImage: originalCoreImage)
        displayLink.invalidate()
    }
}

What we do here is set our inputTime incrementally from 0.0 to 1.0. Each time the display refreshes, this function gets called. We then grab the current system time and do some math to figure out how far along we are in our animation and set that percentage as our inputTime. Each time we call setValue:forKey: on our filter, the filter's outputImage updates and we can then set our imageView's image from there. Once we reach 1.0 in our inputTime, we invalidate the link and stop the animation. And there you have it! Now you can go forth and animate all the transition filters you want.

HOWEVER, there is a little-known issue here with certain filter animations. If the image you are animating isn't the size of the container it's being drawn into, you might notice that the result doesn't respect your imageView's contentMode. You may even notice that you can see behind your image's edges. So yep! You guessed it. We have arrived at our final (extra) pitfalls.

Pitfall #3

πŸ’€πŸ’€πŸ’€

Let's talk about why you can see past your image's edges. In case you don't know what I'm talking about, it can look a little something like this:

This took me a while to figure out, but WWDC 2015's What's New In Core Image video finally shed some light on how to fix this problem. CIImage has two instance methods called imageByClampingToExtent() and imageByCroppingToRect(). Used together with CIImage's extent property, they solve it.

First, you need to use imageByClampingToExtent() when dealing with your CIFilter instance. This function returns a CIImage whose edges extend indefinitely by repeating the last pixels found along each edge of your image. If you call this function on each image you pass into setValue:forKey: on your filter, your filtered outputImage won't have this ugly issue. In our example above, we would call this function everywhere we feed an image into our filter via kCIInputImageKey or kCIInputTargetImageKey:

private lazy var filter: CIFilter? = {
    let coreImage = CIImage(image: UIImage(named: "TheKraken")!)?.imageByClampingToExtent()

    let rippleTransitionFilter = CIFilter(name: "CIRippleTransition")
    rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputImageKey)
    rippleTransitionFilter?.setValue(coreImage, forKey: kCIInputTargetImageKey)
    rippleTransitionFilter?.setValue(CIImage(), forKey: kCIInputShadingImageKey)

    return rippleTransitionFilter
}()

Pitfall #4

πŸ’€πŸ’€πŸ’€

Unfortunately, our animation at this point will now crash at this line inside our timer function:

imageView.image = UIImage(CIImage: filter.outputImage!)

This is due to the fact that the image still has an infinite rect. You'll get a weird Auto Layout crash, but the real reason is that UIImageView doesn't know how to map an image to its bounds if the image has an infinite rect. Luckily for us, we have CIImage's imageByCroppingToRect(), which finally gives our image a valid dimension. When used in conjunction with imageByClampingToExtent(), this function doesn't actually crop our infinite rect. Instead, it tells the CIImage's "recipe" to apply that rect when drawing to the screen. The rect you pass into imageByCroppingToRect() should be your original CIImage's extent property. We can now FINALLY edit our code to look like this by keeping an extra reference to our original image's extent:

private lazy var originalImageExtent: CGRect? = {
    return CIImage(image: UIImage(named: "TheKraken")!)?.extent
}()

func timerFired(displayLink: CADisplayLink) {
    //Make sure our extent and filter aren't nil
    guard let filter = filter, extent = originalImageExtent else {
        //If the filter or the extent is nil, invalidate our display link.
        displayLink.invalidate()
        return
    }

    //...
    //Let's use our fancy new cropping function here.
    imageView.image = UIImage(CIImage: filter.outputImage!.imageByCroppingToRect(extent))
    //...
}

At this point, you should have a successful-ish CIImage animation.

Pitfall #5

πŸ’€πŸ’€πŸ’€

UNFORTUNATELY, (depending on whether your images match the size of your imageView) you will come across two more issues. One issue is that UIImages created from a CIImage using UIImage(CIImage:) do not respect the aspect ratio or content mode of the imageView they're drawn into. A way around this is to convert your CIImage to a CGImage first using a CIContext, like so:

//originalInputCIImage here is the CIImage we originally fed into the filter.
let context = CIContext(options: nil) //You can also create a context from an OpenGL context bee-tee-dubs.
let cgImage = context.createCGImage(filter.outputImage!, fromRect: originalInputCIImage.extent)
imageView.image = UIImage(CGImage: cgImage)

Pitfall #6

πŸ’€πŸ’€πŸ’€

The second issue occurs if your image is a high-resolution image. What you get there is a VERY choppy animation. I'm talking like 5 FPS. The only workaround I can seem to find is to shrink the image programmatically before filtering it (a rough sketch of that is below), or to shrink the original image using Photoshop or Preview. If any of you have a way to fix this, PLEASE let me know.
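For what it's worth, here's a minimal sketch of the programmatic shrink, assuming plain UIKit drawing is good enough. The shrinkImage helper and the 500-point target width are just illustrative values I made up, not magic numbers:

//A rough downscale helper - we just want a smaller bitmap to feed into the filter.
func shrinkImage(image: UIImage, toWidth width: CGFloat) -> UIImage {
    let scaleFactor = width / image.size.width
    let newSize = CGSize(width: width, height: image.size.height * scaleFactor)

    //Pass 1.0 instead of the screen scale if you want to shrink the bitmap even further.
    UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.mainScreen().scale)
    image.drawInRect(CGRect(origin: CGPointZero, size: newSize))
    let shrunkenImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return shrunkenImage
}

//Feed the smaller image into the filter instead of the full-resolution one.
let smallImage = shrinkImage(UIImage(named: "TheKraken")!, toWidth: 500.0)
let coreImage = CIImage(image: smallImage)?.imageByClampingToExtent()

Since the display link redraws the image every frame, a smaller bitmap means less work per frame, which is what buys back the frame rate.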

Pitfall #7

πŸ’€πŸ’€πŸ’€

Last pitfall, I swear. This one is super short too!

So you may notice that your image kind of pixelates when the animation starts. To make your image as crisp as possible, you need to specify a scale when creating a UIImage from your CIImage so it knows to scale your image up or down depending on which Retina display your device has. This is similar to drawing images with Core Graphics. The simple fix is to specify the scale when you are making a UIImage out of your outputImage, like so:

let transitionImage = filter.outputImage!.imageByCroppingToRect(extent)
imageView.image = UIImage(CIImage: transitionImage, scale: UIScreen.mainScreen().scale, orientation: .Up)

After that, your images should be rendering at the correct resolution!

Secret Step #4 - Be happy because you did it!

The good thing about the steps listed here is that you can apply these same paradigms to all CIFilters. For Transition Filters, you update the kCIInputTimeKey value. However, you can update other keys incrementally (if it makes sense - like kCIInputCenterKey) for all other CIFilter types and it'll work the same. Just remember to make sure that whatever key you update is supported by that filter! Using the same paradigms listed above, I was able to achieve this effect with CIBumpDistortion:
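For reference, here's a rough sketch of how a CIBumpDistortion version of this might be wired up with the same ingredients. The bumpFilter property, the center and radius values, and the idea of driving kCIInputScaleKey from 0 to 1 are all illustrative choices on my part, not the exact code behind the effect above:

//Same recipe, different filter - we animate CIBumpDistortion's inputScale instead of inputTime.
private lazy var bumpFilter: CIFilter? = {
    let coreImage = CIImage(image: UIImage(named: "TheKraken")!)?.imageByClampingToExtent()

    let bumpDistortionFilter = CIFilter(name: "CIBumpDistortion")
    bumpDistortionFilter?.setValue(coreImage, forKey: kCIInputImageKey)

    //Placeholder values - print the filter's attributes to see the valid ranges for these keys.
    bumpDistortionFilter?.setValue(CIVector(x: 150.0, y: 150.0), forKey: kCIInputCenterKey)
    bumpDistortionFilter?.setValue(300.0, forKey: kCIInputRadiusKey)

    return bumpDistortionFilter
}()

func bumpTimerFired(displayLink: CADisplayLink) {
    guard let filter = bumpFilter, extent = originalImageExtent else {
        displayLink.invalidate()
        return
    }

    let progress = min((CACurrentMediaTime() - transitionStartTime) / duration, 1.0)

    //Drive inputScale from 0.0 to 1.0 so the bump grows as the animation progresses.
    filter.setValue(progress, forKey: kCIInputScaleKey)

    let bumpedImage = filter.outputImage!.imageByCroppingToRect(extent)
    imageView.image = UIImage(CIImage: bumpedImage, scale: UIScreen.mainScreen().scale, orientation: .Up)

    if progress == 1.0 {
        displayLink.invalidate()
    }
}

I picked kCIInputScaleKey because it's what makes sense to ramp for a bump; for other filters, pick whichever numeric key makes sense to animate (the kCIInputCenterKey example from above works too, you'd just interpolate a CIVector instead of a Double).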

Conclusion

CIFilters are hard to work with to begin with. Animating them, as you can see, is even harder. I'm no expert at this since I've only been playing with it since last week. As always, feel free to comment on this post and learn me a little something! I'm always open to learning new things. πŸ‘» With this post, I hope to save all of you a lot of time animating your filters, since there are so many issues you can come across doing so. This kind of thing is especially frustrating when the documentation is too complicated to follow, the video is too long and you don't have the time, or you're like me and you hate jumping around a thousand pages (like Stack Overflow) to get the entire picture. I honestly mean this next part too: I wish you the BEST of LUCK animating your filters. πŸ˜– And as always -

Happy coding fellow nerds!