Quick-Tip: Detecting Smiles Using iOS 7 (Objective-C)

Torrey Betts / Monday, October 14, 2013


iOS 7 introduced two new face detection capabilities: smile detection and eye-blink detection. This functionality is really useful for developers writing apps that use the camera or process image files and want to detect these characteristics.


Detecting Smiles and Eye Blinks

Detecting faces, smiles and eye blinks requires very little code. In the snippet below, we first determine the image's EXIF orientation, which is later passed into the CIDetector featuresInImage:options: method. Then we configure the detection accuracy in an NSDictionary and initialize the CIDetector with that options dictionary. Finally, we call featuresInImage:options: to grab an NSArray of all features found in the image and process them in a for loop.

    int exifOrientation = 0;
    NSMutableString *detectionString = [[NSMutableString alloc] init];

    switch (_imageView.image.imageOrientation) {
        case UIImageOrientationUp:
            exifOrientation = 1;
            break;
        case UIImageOrientationDown:
            exifOrientation = 3;
            break;
        case UIImageOrientationLeft:
            exifOrientation = 8;
            break;
        case UIImageOrientationRight:
            exifOrientation = 6;
            break;
        case UIImageOrientationUpMirrored:
            exifOrientation = 2;
            break;
        case UIImageOrientationDownMirrored:
            exifOrientation = 4;
            break;
        case UIImageOrientationLeftMirrored:
            exifOrientation = 5;
            break;
        case UIImageOrientationRightMirrored:
            exifOrientation = 7;
            break;
    }

    NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
    CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

    NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:_imageView.image.CGImage]
                                              options:@{ CIDetectorSmile : @YES,
                                                         CIDetectorEyeBlink : @YES,
                                                         CIDetectorImageOrientation :[NSNumber numberWithInt:exifOrientation] }];

    for (CIFaceFeature *faceFeature in features)
    {
        NSString *hasSmile = faceFeature.hasSmile ? @"Yes" : @"No";
        NSString *hasLeftEye = faceFeature.hasLeftEyePosition ? @"Yes" : @"No";
        NSString *hasRightEye = faceFeature.hasRightEyePosition ? @"Yes" : @"No";
        NSString *hasLeftEyeBlink = faceFeature.leftEyeClosed ? @"Yes" : @"No";
        NSString *hasRightEyeBlink = faceFeature.rightEyeClosed ? @"Yes" : @"No";
        [detectionString appendFormat:@" SMILING: %@\n LEFT EYE: %@\n LEFT EYE BLINKING: %@\n RIGHT EYE: %@\n RIGHT EYE BLINKING: %@",
                        hasSmile, hasLeftEye, hasLeftEyeBlink, hasRightEye, hasRightEyeBlink];
    }
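
Each CIFaceFeature also exposes a bounds rectangle, but Core Image reports it in a coordinate space whose origin is the bottom-left corner, while UIKit's origin is the top-left. If you want to overlay a highlight on the detected face, a common follow-up step is flipping the rect. A minimal sketch, assuming the image is displayed at its natural size (the flip below uses only the documented CIFaceFeature.bounds property):

    CGFloat imageHeight = _imageView.image.size.height;
    for (CIFaceFeature *faceFeature in features)
    {
        // Convert from Core Image's bottom-left origin to UIKit's top-left origin.
        CGRect faceRect = faceFeature.bounds;
        faceRect.origin.y = imageHeight - faceRect.origin.y - faceRect.size.height;

        // faceRect is now in UIKit coordinates and could be used to position
        // a highlight view over the face.
        NSLog(@"Face at %@", NSStringFromCGRect(faceRect));
    }

If the image is scaled to fit the image view, you would also need to scale the rect by the ratio between the view's size and the image's size before positioning any overlay.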

Note: From my own testing, I find smile detection at the time of this posting to be inconsistent. In some photos where a person has a big smile or a half smile, the detection fails to pick up the smile regardless of whether the accuracy setting is high or low.

Download the Example Project

The Xcode project source code for this quick tip can be downloaded by clicking this link. Additionally, this project uses the NucliOS IGGridView control; if you don't own a copy of NucliOS, feel free to download the trial at the following link by clicking the Download Trial button.

By Torrey Betts