Drawing Images From Pixel Data - In Swift

This can be read either as a follow-up to my last post about speeding up a Cellular Automata demo created by Simon Gladman (aka FlexMonkey), or as a standalone post with simple example code for creating images (UIImage or CGImage) from raw pixel values.

Background

I had reached the point where the rendering code was the bottleneck in the Gray Scott Cellular Automata app that I was optimising. The existing code was drawing a set of one-point rectangles into a UIGraphicsImageContext.

I could see in the profiler that the execution time was dominated by the drawRect calls, which didn't surprise me, and I knew that there must be a better way to draw pixel data.

Solution - CGDataProvider and CGImageCreate

This is the core, generally applicable function that anyone can use to create images quickly from pixel data.
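A minimal sketch of that function, written against current UIKit and Core Graphics APIs, looks like the following; the imageFromARGB32Bitmap name, the exact signature and the ARGB field order of PixelData are illustrative rather than copied from the app:

```swift
import UIKit

// One struct per pixel, laid out to match the ARGB format described to Core Graphics.
public struct PixelData {
    var a: UInt8
    var r: UInt8
    var g: UInt8
    var b: UInt8
}

// Wraps the raw pixel bytes in a CGDataProvider and hands them to CGImage,
// then wraps the result in a UIImage. Returns nil if the array size doesn't
// match the dimensions or Core Graphics rejects the parameters.
func imageFromARGB32Bitmap(pixels: [PixelData], width: Int, height: Int) -> UIImage? {
    guard pixels.count == width * height else { return nil }

    let bytesPerPixel = MemoryLayout<PixelData>.stride   // 4 bytes: A, R, G, B
    let data = pixels.withUnsafeBufferPointer { Data(buffer: $0) }

    guard let provider = CGDataProvider(data: data as CFData),
          let cgImage = CGImage(
              width: width,
              height: height,
              bitsPerComponent: 8,
              bitsPerPixel: bytesPerPixel * 8,
              bytesPerRow: width * bytesPerPixel,
              space: CGColorSpaceCreateDeviceRGB(),
              bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue),
              provider: provider,
              decode: nil,
              shouldInterpolate: true,
              intent: .defaultIntent)
    else { return nil }

    return UIImage(cgImage: cgImage)
}
```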

There are other options for how you structure and pass in the pixel data, but this was sufficient for my needs and gives a simple basis to customise from if you need another pixel format. Swift structs just work for creating raw data buffers for images.

Simple Usage Example

With the PixelData structure it is obvious how to set new values.
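For example, something along these lines fills a small image with a horizontal red gradient (the field names and the helper from the sketch above are assumed):

```swift
let width = 255
let height = 255

// Create the whole array in one go, then overwrite pixels by index.
var pixels = [PixelData](repeating: PixelData(a: 255, r: 0, g: 0, b: 0),
                         count: width * height)

for y in 0..<height {
    for x in 0..<width {
        pixels[y * width + x].r = UInt8(x)   // red ramps up from left to right
    }
}

let image = imageFromARGB32Bitmap(pixels: pixels, width: width, height: height)
```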

Gray Scott Renderer Now
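The updated renderer is now essentially a thin wrapper around that function. The sketch below shows the shape of it; treating the solver state as a flat [Double] and mapping it to greyscale are assumptions rather than the app's exact code:

```swift
func renderGrayScott(solverState: [Double], width: Int, height: Int) -> UIImage? {
    // Allocate every pixel up front; writing by index beat appending in a loop.
    var pixels = [PixelData](repeating: PixelData(a: 255, r: 0, g: 0, b: 0),
                             count: width * height)

    for i in 0..<min(solverState.count, pixels.count) {
        // Scale the 0.0...1.0 concentration up to the 0...255 range of a UInt8.
        let value = UInt8(max(0.0, min(1.0, solverState[i])) * 255.0)
        pixels[i] = PixelData(a: 255, r: value, g: value, b: value)
    }

    return imageFromARGB32Bitmap(pixels: pixels, width: width, height: height)
}
```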

As you can see, I made a simple calculation to convert the Double values (between 0.0 and 1.0) to UInt8s. The array is created in one go and then modified, as appending to the array seems to be slower than creating the array up front and then writing values to particular indexes. Even reserveCapacity didn't seem to help (at least it didn't earlier; I haven't retested with this code).

The performance improved massively (greater than 10x faster), and the next change I made was to log the time spent in the function only on a sample basis (1 in 1024). There may be further improvements possible, but I haven't tried yet as the rendering is no longer dominating the runtime of the system.
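The sampling itself is simple; a hypothetical sketch (the counter and timing calls here are illustrative, not the app's code):

```swift
var renderCount = 0

func timedRender(solverState: [Double], width: Int, height: Int) -> UIImage? {
    renderCount += 1
    let start = CFAbsoluteTimeGetCurrent()

    let image = renderGrayScott(solverState: solverState, width: width, height: height)

    // Only log roughly one call in every 1024 to keep logging off the hot path.
    if renderCount % 1024 == 0 {
        print("Render took \(CFAbsoluteTimeGetCurrent() - start) seconds")
    }
    return image
}
```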

Results

After making this change, the code reached the point where I needed to slow it down to avoid writing hundreds of images per second to the image view. To keep it busy while still writing about 62 frames per second, I found that I could run the solver 13 times each frame on my iPad Retina Mini. I am now experimenting with increasing the image size (currently updating at about 56fps at 245x245 pixels rather than the original 70x70), and the solver is back to being the bottleneck.

Credits

I need to credit this page with providing the Objective-C code that this code is based on. While the Apple documentation is accurate, it doesn't really guide you to the solution. Thanks also, of course, to Simon Gladman for the original project and the talk that inspired it.