Interpreting XMP metadata in an ALAssetRepresentation

When the user makes changes (cropping, red-eye removal, ...) to a photo in the built-in Photos.app on iOS, these changes are not applied to the fullResolutionImage returned by the corresponding ALAssetRepresentation.

However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation. In addition, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary under the key @"AdjustmentXMP".
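As a quick illustration (a sketch only, assuming asset is an ALAsset and the AssetsLibrary/UIKit headers are imported), the already-adjusted but screen-sized version can be obtained directly, and the metadata reveals whether adjustments exist at all:

ALAssetRepresentation *rep = [asset defaultRepresentation];

// fullScreenImage already has the user's edits applied, but only at screen resolution.
UIImage *editedPreview = [UIImage imageWithCGImage:rep.fullScreenImage];

// The presence of the XMP adjustment data indicates whether the photo was edited at all.
BOOL hasAdjustments = (rep.metadata[@"AdjustmentXMP"] != nil);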

I would like to apply these changes to the fullResolutionImage myself in order to keep things consistent. I found that on iOS 6+, [CIFilter filterArrayFromSerializedXMP:inputImageExtent:error:] can convert this XMP metadata into an array of CIFilter objects:

ALAssetRepresentation *rep; // the asset's defaultRepresentation
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

// Convert the serialized XMP adjustment data into an array of CIFilters.
NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData
                                             inputImageExtent:image.extent
                                                        error:&error];
if (error) {
    NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}

CIContext *context = [CIContext contextWithOptions:nil];

// Chain the filters: feed the output of each filter into the next one.
for (CIFilter *filter in filterArray) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}
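For completeness, a minimal sketch of how the filtered CIImage could then be rendered back into a CGImage with the CIContext created above (the caller owns the returned image):

// Render the chained filter output at full resolution.
CGImageRef editedImage = [context createCGImage:image fromRect:image.extent];

// ... use editedImage, e.g. wrap it in a UIImage or write it to disk ...

CGImageRelease(editedImage);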

However, this only works for some of the filters (cropping, auto-enhance) but not for others, such as red-eye removal. In those cases the CIFilters have no visible effect. Hence my questions:

  • Does anyone know a way to create a red-eye removal CIFilter? (One that is reasonably consistent with Photos.app; see the sketch after this list. A filter obtained via the key kCIImageAutoAdjustRedEye is not enough, since, for example, it takes no parameters for the eye positions.)
  • Is it possible to generate and apply these filters under iOS 5?
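To illustrate the first question, this is roughly how the automatic red-eye filters are obtained with that key (a sketch, assuming image is the CIImage from the code above; note that the API accepts no eye-position parameters):

// Ask only for the automatic red-eye filters, not the enhancement filters.
NSDictionary *options = @{ kCIImageAutoAdjustEnhance : @NO,
                           kCIImageAutoAdjustRedEye  : @YES };
NSArray *redEyeFilters = [image autoAdjustmentFiltersWithOptions:options];

// Chain them like the XMP-derived filters above.
for (CIFilter *filter in redEyeFilters) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}
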
ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];

// Create a buffer to hold the data for the asset's image.
uint8_t *buffer = (uint8_t *)malloc(representation.size);

// Copy the data from the asset into the buffer.
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];

if (length == 0) {
    free(buffer);
    return nil;
}

// Convert the buffer into an NSData object; the buffer is freed when the data is deallocated.
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or a possible
// RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];

// Create a CGImageSource with the NSData. An image source can
// contain any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);
[adata release];

// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);

// Read the pixel dimensions from the property dictionary.
CFNumberRef imageWidth  = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

int w = 0;
int h = 0;
CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

// Clean up memory.
CFRelease(imagePropertiesDictionary);
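The snippet above stops after reading the pixel dimensions; a minimal sketch of how the image source might then be turned into a full-size CGImage (and the source released) could look like this:

// Create the full-resolution image from the image source.
CGImageRef fullImage = CGImageSourceCreateImageAtIndex(sourceRef, 0, NULL);

// ... use fullImage, for example wrap it in a CIImage and apply the XMP filters from the question ...

CGImageRelease(fullImage);
CFRelease(sourceRef);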