How To Combine Two Images In iOS

A question that has been asked a lot on my iOS Still Image Capture With AVCaptureSession post is how to combine the overlay image shown on screen with the image being captured by the camera. The more general question is how to combine two images, so that is what I will show first. Then I will give you my updated captureStillImage method, which you can add to the AROverlayImageCapture example project, available here.

Let’s say we have two images we want to combine: an image of a person, onto which we want to overlay an image of a funny hat. Here are the two UIImages:

UIImage *personImage = [UIImage imageNamed:@"person.jpg"];
UIImage *hatImage = [UIImage imageNamed:@"hat.png"];

In this case we want the resulting image to be the same size as personImage. Let’s get the size that we want for the final image:

CGSize finalSize = [personImage size];

We also get the size of the hat image, which is probably much smaller:

CGSize hatSize = [hatImage size];

Now we need to create a graphics context in which we will do our drawing:

UIGraphicsBeginImageContext(finalSize);

The graphics context is kind of like our piece of paper that we will draw on. The first thing we want to draw on it is the photo of the person:

[personImage drawInRect:CGRectMake(0,0,finalSize.width,finalSize.height)];

Now we draw the hat at the position where we want it to sit on top of the other image (HAT_X_POS and HAT_Y_POS are whatever coordinates you choose):

[hatImage drawInRect:CGRectMake(HAT_X_POS,HAT_Y_POS,hatSize.width,hatSize.height)];

Next we create the new UIImage with:

UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();

Finally we need to clean up and close the context as we no longer need it:

UIGraphicsEndImageContext();
That is all there is to it.
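Putting the steps above together, here is how the whole thing might look as a single helper method. The method name and the position argument are my own for illustration, not part of the example project:

// Sketch of a helper that draws overlayImage on top of baseImage at the
// given position and returns the combined result. The method name and
// signature are hypothetical.
- (UIImage *)imageByCombining:(UIImage *)baseImage
                  withOverlay:(UIImage *)overlayImage
                      atPoint:(CGPoint)position {
    CGSize finalSize = [baseImage size];
    CGSize overlaySize = [overlayImage size];

    UIGraphicsBeginImageContext(finalSize);
    [baseImage drawInRect:CGRectMake(0, 0, finalSize.width, finalSize.height)];
    [overlayImage drawInRect:CGRectMake(position.x, position.y,
                                        overlaySize.width, overlaySize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return newImage;
}

You would call it like this:

UIImage *combined = [self imageByCombining:personImage withOverlay:hatImage atPoint:CGPointMake(HAT_X_POS, HAT_Y_POS)];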

Now, for all of you who have asked how to modify the AROverlayImageCapture project to include the overlay image, here is a new version of captureStillImage, which has now become captureStillImageWithOverlay: and takes a UIImage as an argument.

NOTE: One thing you will notice that is different from what we did above is that, to correctly position the overlay in this particular case, we need to apply some scaling. This is because the photos taken with the camera are 480×640 whereas the screen is 320×480, so if you don’t scale the position and size of your overlay it will not be positioned or sized the way it was on your iPhone screen.
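To make the scaling concrete, here is a small sketch of how the on-screen overlay rect from AROverlayViewController maps into photo coordinates. The numbers assume the 480×640 photo and 320×480 screen mentioned above:

// Assuming the captured photo is 480x640 and the screen is 320x480.
CGFloat xScaleFactor = 480.0f / 320.0f;  // 1.5
CGFloat yScaleFactor = 640.0f / 480.0f;  // ~1.33
// The on-screen overlay rect (30, 100, 260, 200) maps to roughly
// (45, 133, 390, 267) in photo coordinates.
CGRect scaledRect = CGRectMake(30.0f * xScaleFactor,
                               100.0f * yScaleFactor,
                               260.0f * xScaleFactor,
                               200.0f * yScaleFactor);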

- (void)captureStillImageWithOverlay:(UIImage*)overlay {
	AVCaptureConnection *videoConnection = nil;
	for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
		for (AVCaptureInputPort *port in [connection inputPorts]) {
			if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
				videoConnection = connection;
				break;
			}
		}
		if (videoConnection) {
			break;
		}
	}

	NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
	[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
	                                                     completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
		CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
		if (exifAttachments) {
			NSLog(@"attachments: %@", exifAttachments);
		} else {
			NSLog(@"no attachments");
		}

		NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
		UIImage *image = [[UIImage alloc] initWithData:imageData];
		CGSize imageSize = [image size];
		CGSize overlaySize = [overlay size];

		UIGraphicsBeginImageContext(imageSize);
		[image drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];

		CGFloat xScaleFactor = imageSize.width / 320;
		CGFloat yScaleFactor = imageSize.height / 480;
		// rect used in AROverlayViewController was (30, 100, 260, 200)
		[overlay drawInRect:CGRectMake(30 * xScaleFactor, 100 * yScaleFactor, overlaySize.width * xScaleFactor, overlaySize.height * yScaleFactor)];

		UIImage *combinedImage = UIGraphicsGetImageFromCurrentImageContext();
		[self setStillImage:combinedImage];
		UIGraphicsEndImageContext();

		[image release];

		[[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
	}];
}

You will also need to declare the method in your CaptureSessionManager header file:

- (void)captureStillImageWithOverlay:(UIImage*)overlay;

Finally you will need to update the scanButtonPressed method in AROverlayViewController.m with:

- (void)scanButtonPressed {
	[[self scanningLabel] setHidden:NO];
	[[self captureManager] captureStillImageWithOverlay:[UIImage imageNamed:@"overlaygraphic.png"]];
}

There you have it, folks: how to combine two images in iOS, and the particulars of updating the AROverlayImageCapture project to include the overlay image in your capture output.
