iOS Still Image Capture Using AVCaptureSession

I had a request to show how to capture a still image from the live video feed in the AROverlayExample project. This is probably the simplest possible use of an AVCaptureSession’s output, so I have created a new project, based on the AROverlayExample, that uses the scan button to capture an image and save it to your device’s photo album.

You can get the source code for AROverlayImageCapture here.

Here are the instructions for how to do this, starting from the AROverlayExample. If you haven’t already, it might be a good idea to read through my last post to understand how we got to where we are. I am using Xcode 4.0.

Add Frameworks

The first thing we need to do is add a few frameworks to the project. Click on your project in the Groups & Files pane, then on the right choose your target and then the Build Phases tab. Click on Link Binary With Libraries so that it expands. Click the plus sign and add the ImageIO, CoreMedia and CoreVideo frameworks to your target.

Modify CaptureManager

In your CaptureManager.h file, add the following #define at the top, and then these two properties and public methods:


#define kImageCapturedSuccessfully @"imageCapturedSuccessfully"
.
.
.
@property (nonatomic, retain) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, retain) UIImage *stillImage;

- (void)addStillImageOutput;
- (void)captureStillImage;

The first property is an AVCaptureStillImageOutput and, as its verbose name implies, is what we need to capture a still image. The UIImage property will keep a reference to the image once we have captured it.

At the top of CaptureManager.m import the ImageIO framework with:

#import <ImageIO/ImageIO.h>

I’m not going to walk through the boilerplate here (it is in the source code), but remember to synthesize the two new properties and to release and nil them in the dealloc method.
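
If you need a refresher, those additions might look something like this (a minimal sketch; add the release lines to the dealloc you already have from the AROverlayExample):

@synthesize stillImageOutput;
@synthesize stillImage;

// In your existing dealloc, before [super dealloc]:
[stillImageOutput release];
stillImageOutput = nil;
[stillImage release];
stillImage = nil;

Then add the following two methods: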

- (void)addStillImageOutput 
{
  // Create the still image output and configure it to hand back JPEG data
  [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
  NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
  [[self stillImageOutput] setOutputSettings:outputSettings];
  [outputSettings release];
  
  // Attach the output to our existing capture session
  [[self captureSession] addOutput:[self stillImageOutput]];
}

- (void)captureStillImage
{  
  // Find the video connection on the still image output; connections only
  // exist once the output has been added to the capture session
  AVCaptureConnection *videoConnection = nil;
  for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
      if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
        videoConnection = connection;
        break;
      }
    }
    if (videoConnection) { 
      break; 
    }
  }
  
  NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
  [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection 
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) { 
                                                         // Log any EXIF metadata attached to the sample buffer
                                                         CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
                                                         if (exifAttachments) {
                                                           NSLog(@"attachments: %@", exifAttachments);
                                                         } else { 
                                                           NSLog(@"no attachments");
                                                         }
                                                         // Turn the JPEG sample buffer into a UIImage and hold on to it
                                                         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                                                         UIImage *image = [[UIImage alloc] initWithData:imageData];
                                                         [self setStillImage:image];
                                                         [image release];
                                                         // Tell anyone listening that the capture has finished
                                                         [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
                                                       }];
}

The first method prepares the stillImageOutput and specifies its output settings. The JPEG settings I have used work fine for our sample app, but you may want to explore the other options; I haven’t really looked into them.
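
For example, if you wanted the output as an uncompressed pixel buffer instead of JPEG data, the settings might look something like this (a sketch only; check the output’s availableImageDataCVPixelFormatTypes before relying on a particular format):

// Ask for 32-bit BGRA pixel buffers instead of JPEG data (hypothetical variant)
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                (id)kCVPixelBufferPixelFormatTypeKey,
                                nil];
[[self stillImageOutput] setOutputSettings:outputSettings];
[outputSettings release];

Note that with a pixel-buffer format you could no longer use jpegStillImageNSDataRepresentation: in the completion handler; you would read the CVPixelBuffer out of the sample buffer instead.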

The second method actually captures a still image from the live video feed and stores it in the stillImage property we set up earlier. It then posts a notification that the image has been captured, which we will listen for in our AROverlayViewController, coming up next.
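
One thing worth knowing: the completion handler is not guaranteed to run on the main thread, and the observer of this notification will be doing UI work. The code above works in this sample, but if you ever see UI updates misbehaving, a defensive variant (my suggestion, not something the original project does) is to post the notification on the main queue:

dispatch_async(dispatch_get_main_queue(), ^{
  // Post from the main thread so observers can safely touch UIKit
  [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
});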

Modify AROverlayViewController

Now we need to add a few things to AROverlayViewController.m. The first thing to do is to declare a new private method in a class extension at the top of the file, like so:

@interface AROverlayViewController ()
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;
@end

Then add the following two methods to the file:

- (void)saveImageToPhotoAlbum 
{
  // Hand the captured image to the system photo album; the selector below is called when the save finishes
  UIImageWriteToSavedPhotosAlbum([[self captureManager] stillImage], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
  if (error != nil) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error!" message:@"Image couldn't be saved" delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [alert show];
    [alert release];
  }
  else {
    // The save succeeded, so hide the "Saving..." label
    [[self scanningLabel] setHidden:YES];
  }
}

The first method saves the captured image to your Photo Album, and the second is the callback that runs once the save has finished. I have modified the project so that while the image is being saved the scanning label says “Saving…”; once the save is complete the label is hidden. There is some basic error checking as well.
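
If you are building this up by hand rather than using the sample project, one way to get that label behavior is to set the text right before kicking off the save (a sketch; the sample project may handle this differently, for example in the nib):

- (void)saveImageToPhotoAlbum 
{
  // Show "Saving..." while the write to the photo album is in flight
  [[self scanningLabel] setText:@"Saving..."];
  UIImageWriteToSavedPhotosAlbum([[self captureManager] stillImage], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}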

Now we need to wire all of these things together. First of all, let’s make sure we initialize the stillImageOutput by calling

[[self captureManager] addStillImageOutput];

in our viewDidLoad. Also add an observer for the notification that will be sent when the image has been captured with:

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
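
Putting those together, the relevant parts of the view controller might look something like this (a sketch; the "..." lines stand in for the existing AROverlayExample code, and remember to remove the observer when the controller goes away):

- (void)viewDidLoad 
{
  [super viewDidLoad];
  // ... existing capture session setup from the AROverlayExample ...
  [[self captureManager] addStillImageOutput];
  // Listen for the notification posted by captureStillImage's completion handler
  [[NSNotificationCenter defaultCenter] addObserver:self 
                                           selector:@selector(saveImageToPhotoAlbum) 
                                               name:kImageCapturedSuccessfully 
                                             object:nil];
}

- (void)dealloc 
{
  [[NSNotificationCenter defaultCenter] removeObserver:self];
  // ... existing releases ...
  [super dealloc];
}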

Finally change your button method to be:

- (void)scanButtonPressed 
{
  [[self scanningLabel] setHidden:NO];
  [[self captureManager] captureStillImage];
}

You should now be able to run the app; when you tap the scan button, a still of the live video feed will be captured and saved to your photo album. Let me know if you have any issues with this implementation.
