Advanced Imaging On iOS: @rsebbe
Foreword
• Don’t make any assumptions about imaging on iOS. Why?
AVFoundation
Imaging 101
• On iOS, you typically use either PNGs or JPEGs.
Unified Memory Architecture (iOS/Mac)
• GPU & CPU share the same memory.
• Going back & forth between them is cheap.
• Pipeline: Decode → Transfer → Display.
Comparisons
• Draw w/ transform, CGContextDrawImage: Decode → Process → Transfer (to GPU) → Display.
• Draw w/ transform, CALayer/UIView setTransform: Decode → Transfer → Display.
• Pure GPU, CALayer.contents (or UIImageView.image): Decode → Display. (Sketch below.)
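A minimal sketch of two of these paths, assuming a UIKit view controller and a bundled image named "photo.jpg" (name, sizes, and rotation angle are placeholders): the pure-GPU path hands the image to a UIImageView and transforms the layer, while the CPU path redraws the pixels through CGContextDrawImage (context.draw in Swift).

import UIKit

final class ComparisonViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Pure GPU path: hand the image straight to a view/layer.
        // Decoding happens once; the transform is applied by the compositor.
        let gpuView = UIImageView(image: UIImage(named: "photo.jpg"))
        gpuView.frame = CGRect(x: 0, y: 0, width: 200, height: 150)
        gpuView.transform = CGAffineTransform(rotationAngle: .pi / 8)   // CALayer/UIView setTransform
        view.addSubview(gpuView)

        // CPU path: redraw the pixels with CGContextDrawImage (context.draw),
        // i.e. Decode -> Process -> Transfer -> Display.
        if let cgImage = UIImage(named: "photo.jpg")?.cgImage {
            let size = CGSize(width: 200, height: 150)
            UIGraphicsBeginImageContextWithOptions(size, false, 0)
            if let ctx = UIGraphicsGetCurrentContext() {
                ctx.rotate(by: .pi / 8)                                  // processing on the CPU
                ctx.draw(cgImage, in: CGRect(origin: .zero, size: size))
            }
            let redrawn = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()

            let cpuView = UIImageView(image: redrawn)
            cpuView.frame = CGRect(x: 0, y: 200, width: 200, height: 150)
            view.addSubview(cpuView)
        }
    }
}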
Demo 1
The Strong, the Weak, & the Ugly
• How do I do that?
Core Graphics / ImageIO
• Need access to pixel values: use CGBitmapContext
• Draw small from large image (GPU), CALayer.contents: Decode → Display. Decodes the entire image, large memory footprint.
• Draw small from large image (CPU), CGContextDrawImage with a small target size: Decode → Transfer → Display. Small memory footprint. (Sketch below.)
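A sketch of that CPU path, assuming a file URL named largeImageURL and a 256x192 target size (both placeholders): the large image is drawn into a small CGBitmapContext, so only the reduced bitmap stays resident, and its pixel values are directly addressable.

import Foundation
import CoreGraphics
import ImageIO

// Draws a large image into a small CGBitmapContext: only the reduced bitmap
// stays resident, and its pixel values can be read directly.
// largeImageURL and the 256x192 target size are placeholder values.
func drawSmallFromLarge(largeImageURL: URL) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(largeImageURL as CFURL, nil),
          let fullImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else { return nil }

    let width = 256, height = 192
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return nil }

    // CGContextDrawImage with a small target rect: decode + downscale in one step.
    context.interpolationQuality = .high
    context.draw(fullImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    // Pixel access: the context's backing store holds the scaled RGBA pixels.
    if let data = context.data {
        let firstPixel = data.load(as: UInt32.self)
        print("first RGBA pixel: 0x\(String(firstPixel, radix: 16))")
    }
    return context.makeImage()
}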
Demo 2
The Strong & Idiot vs. the Weak & Smart
11MP, 10x
Show that the GPU is slower: the GPU version decodes the entire image, while the CPU version does smarter, reduced drawing.
Show the Time Profiler function trace.
Show the VM Tracker instrument and the Dirty size.
Change the code to show the influence of draw size on speed (+ function trace).
Core Image
• CPU or GPU rendering: ~20x speed difference on a recent iPhone.
• Example: Source Image → Perspective Crop.
• Pure GPU: Decode → Process → Display, starting from a CIImage (imageWithContentsOfURL, or a CGImage). (Sketch below.)
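A sketch of that pure-GPU chain in Core Image, with an assumed imageURL and hard-coded corner points (all placeholder values); the CIPerspectiveCorrection filter stands in here for the perspective crop shown on the slide.

import CoreImage
import UIKit

// Builds a perspective-cropped image entirely through Core Image.
// imageURL and the four corner points are placeholder values.
func perspectiveCrop(imageURL: URL) -> UIImage? {
    guard let input = CIImage(contentsOf: imageURL) else { return nil }     // Decode (lazily)

    // Process: perspective crop, here via the CIPerspectiveCorrection filter.
    guard let filter = CIFilter(name: "CIPerspectiveCorrection") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: 100, y: 900), forKey: "inputTopLeft")
    filter.setValue(CIVector(x: 800, y: 950), forKey: "inputTopRight")
    filter.setValue(CIVector(x: 850, y: 100), forKey: "inputBottomRight")
    filter.setValue(CIVector(x: 120, y: 80), forKey: "inputBottomLeft")
    guard let output = filter.outputImage else { return nil }

    // Display: a UIImage backed by the CIImage; rendering happens when it is drawn
    // (use a GPU-backed CIContext / GLKView / MTKView to keep the whole chain on the GPU).
    return UIImage(ciImage: output)
}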
Core Image
• Live processing or not? Depends.
• Atomic refresh: faster computation.
• Visible tiled rendering: slower computation overall, but only the visible region needs to be rendered at a time. (Sketch below.)
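One way to sketch the tiled variant, assuming the caller supplies the CIImage, the visible rect, and a GPU-backed CIContext (all assumed names): render only the visible region instead of the full extent.

import CoreGraphics
import CoreImage

// Renders only the currently visible tile of a (possibly huge) CIImage.
// ciImage, visibleRect, and context are assumed to be supplied by the caller.
func renderVisibleTile(of ciImage: CIImage, visibleRect: CGRect, context: CIContext) -> CGImage? {
    // Clamp the requested tile to the image's extent, then render just that part.
    let tileRect = visibleRect.intersection(ciImage.extent)
    guard !tileRect.isEmpty else { return nil }
    return context.createCGImage(ciImage, from: tileRect)
}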
• Compute thumbnails from a large image: use CGImageSourceCreateThumbnailAtIndex (or CGBitmapContext / CGContextDrawImage). (Sketch below.)
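A sketch of the ImageIO route, assuming a largeImageURL and a 512 px maximum dimension (placeholder values):

import Foundation
import ImageIO

// Creates a thumbnail from a large image file without keeping the full-size
// bitmap around. largeImageURL and maxPixelSize are placeholder values.
func makeThumbnail(largeImageURL: URL, maxPixelSize: Int = 512) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(largeImageURL as CFURL, nil) else { return nil }

    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,  // build one even if none is embedded
        kCGImageSourceCreateThumbnailWithTransform: true,    // respect EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}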