iOS image orientation has strange behavior

For the past few weeks I have been working with images in Objective-C and have noticed a lot of strange behavior. First, like many other people, I have had the problem where images taken with the camera (or taken with someone else's camera and sent to me via MMS) are rotated 90 degrees. I wasn't sure why in the world this was happening (hence my question), but I was able to come up with a cheap workaround.

My question this time is: why is this happening? Why is Apple rotating images? When I take a photo with the camera, unless I run the workaround code mentioned above, the photo gets saved rotated. Until a few days ago, my workaround was acceptable.

My app modifies individual pixels of an image, specifically the alpha channel of a PNG (so any JPEG conversion is thrown out the window in my scenario). A few days ago I noticed that, although the image displays correctly in my app thanks to my workaround code, when my algorithm modifies the individual pixels of the image, it thinks the image is rotated. So instead of modifying pixels at the top of the image, it modifies pixels along the side (because it thinks the image should be rotated)! I cannot figure out how to rotate the image in memory; ideally, I would rather just wipe out the imageOrientation flag altogether.

Here is something else that has been puzzling me... When I take the photo, the imageOrientation is set to 3. My workaround code is smart enough to realize this and flip it, so the user never notices. In addition, my code that saves the image to the library realizes this, flips it, and then saves it so that it appears correctly in the camera roll.

That code looks like this:

NSData* pngdata = UIImagePNGRepresentation (self.workingImage); //PNG wrap
UIImage* img = [self rotateImageAppropriately:[UIImage imageWithData:pngdata]];
UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);

When I load this newly saved image back into my app, the imageOrientation is 0, which is exactly what I want to see, and my rotation workaround doesn't even need to run (note: when loading images from the internet, as opposed to images taken with the camera, the imageOrientation is always 0, resulting in perfect behavior). For some reason, my save code seems to wipe out this imageOrientation flag. I was hoping to steal that code and use it to erase the imageOrientation as soon as the user takes a photo and adds it to the app, but it doesn't seem to work. Does UIImageWriteToSavedPhotosAlbum do something special with imageOrientation?

The best solution to this problem would be to blow away the imageOrientation as soon as the user finishes taking the image. I assume Apple has a reason for the rotation behavior, right? Some people have suggested it is a defect on Apple's part.

(...in case you're not lost already... Note 2: when I take a photo in landscape, everything looks perfect, just like photos taken from the web.)

EDIT:

Here are some pictures of what the scenario actually looks like. Based on the comments so far, this strange behavior seems to be more than just an iPhone thing, which I consider a good thing.

This is the photo I took with my phone (note the correct orientation); it looks exactly as it did on my phone when I took it:

Actual Photo taken on iPhone

Here is what the image looks like in Gmail after I emailed it to myself (it looks like Gmail handles it properly):

Photo as it appears in Gmail

Here is what the image looks like as a thumbnail in Windows (it does not appear to be handled properly):

Windows Thumbnail

And here is what the actual image looks like when opened with Windows Photo Viewer (still not handled properly):

Windows Photo Viewer Version

After all the comments on this question, here is what I am thinking... The iPhone takes a picture and says "in order to display this properly, it needs to be rotated 90 degrees". This information would be in the EXIF data. (Why it needs to be rotated 90 degrees, rather than being vertical by default, I don't know.) From there, Gmail is smart enough to read and analyze the EXIF data and display the image properly. Windows, however, is not smart enough to read the EXIF data and therefore displays the picture improperly. Are my assumptions correct?


I did some research on this and discovered that every image file has a metadata property. The metadata can specify the orientation of the image, which is generally ignored by other operating systems but honored by the Mac. Most images taken with the camera have their metadata orientation set to a right angle, so the Mac shows them in a 90-degree-rotated manner, while you can see the same image in its unrotated form on Windows.

For more detail, read this: http://graphicssoft.about.com/od/digitalphotography/f/sideways-pictures.htm

Try reading your image's EXIF data here: http://www.exifviewer.org/, http://regex.info/exif.cgi, or http://www.addictivetips.com/internet-tips/view-complete-exif-metadata-information-of-any-jpeg-image-online/
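
If you would rather check it in code, here is a hedged ImageIO sketch (Swift; `photoData` is assumed to be the raw JPEG bytes, for example the MMS attachment from the question) that reads the same Orientation tag the web viewers show:

import Foundation
import ImageIO

// Reads the EXIF/TIFF Orientation value from raw image data, or nil if the
// data cannot be parsed or carries no orientation entry.
func exifOrientation(of photoData: Data) -> Int? {
    guard let source = CGImageSourceCreateWithData(photoData as CFData, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
        return nil
    }
    // 1 = upright, 3 = rotated 180 degrees, 6 = needs a 90-degree CW rotation
    // to display (the usual value for a portrait camera shot), 8 = needs 90 CCW.
    return properties[kCGImagePropertyOrientation as String] as? Int
}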

I had the same problem when getting the image from the camera, and I used the following code to fix it. I added the method scaleAndRotateImage from here:

- (void)imagePickerController:(UIImagePickerController *)thePicker didFinishPickingMediaWithInfo:(NSDictionary *)imageInfo {
    // Images from the camera are always in landscape, so rotate
    UIImage *image = [self scaleAndRotateImage:[imageInfo objectForKey:UIImagePickerControllerOriginalImage]];
    // then save the image to the photo gallery or wherever
}

- (UIImage *)scaleAndRotateImage:(UIImage *)image {
    int kMaxResolution = 320;

    CGImageRef imgRef = image.CGImage;

    CGFloat width = CGImageGetWidth(imgRef);
    CGFloat height = CGImageGetHeight(imgRef);

    CGAffineTransform transform = CGAffineTransformIdentity;
    CGRect bounds = CGRectMake(0, 0, width, height);
    if (width > kMaxResolution || height > kMaxResolution) {
        CGFloat ratio = width/height;
        if (ratio > 1) {
            bounds.size.width = kMaxResolution;
            bounds.size.height = bounds.size.width / ratio;
        }
        else {
            bounds.size.height = kMaxResolution;
            bounds.size.width = bounds.size.height * ratio;
        }
    }

    CGFloat scaleRatio = bounds.size.width / width;
    CGSize imageSize = CGSizeMake(CGImageGetWidth(imgRef), CGImageGetHeight(imgRef));
    CGFloat boundHeight;
    UIImageOrientation orient = image.imageOrientation;
    switch (orient) {

        case UIImageOrientationUp: //EXIF = 1
            transform = CGAffineTransformIdentity;
            break;

        case UIImageOrientationUpMirrored: //EXIF = 2
            transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            break;

        case UIImageOrientationDown: //EXIF = 3
            transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationDownMirrored: //EXIF = 4
            transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
            transform = CGAffineTransformScale(transform, 1.0, -1.0);
            break;

        case UIImageOrientationLeftMirrored: //EXIF = 5
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
            transform = CGAffineTransformScale(transform, -1.0, 1.0);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
            break;

        case UIImageOrientationLeft: //EXIF = 6
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            transform = CGAffineTransformRotate(transform, 3.0 * M_PI / 2.0);
            break;

        case UIImageOrientationRightMirrored: //EXIF = 7
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeScale(-1.0, 1.0);
            transform = CGAffineTransformRotate(transform, M_PI / 2.0);
            break;

        case UIImageOrientationRight: //EXIF = 8
            boundHeight = bounds.size.height;
            bounds.size.height = bounds.size.width;
            bounds.size.width = boundHeight;
            transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            transform = CGAffineTransformRotate(transform, M_PI / 2.0);
            break;

        default:
            [NSException raise:NSInternalInconsistencyException format:@"Invalid image orientation"];
    }

    UIGraphicsBeginImageContext(bounds.size);

    CGContextRef context = UIGraphicsGetCurrentContext();

    if (orient == UIImageOrientationRight || orient == UIImageOrientationLeft) {
        CGContextScaleCTM(context, -scaleRatio, scaleRatio);
        CGContextTranslateCTM(context, -height, 0);
    }
    else {
        CGContextScaleCTM(context, scaleRatio, -scaleRatio);
        CGContextTranslateCTM(context, 0, -height);
    }

    CGContextConcatCTM(context, transform);

    CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);
    UIImage *imageCopy = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return imageCopy;
}

Any image generated by an iPhone/iPad is saved as Landscape Left, with the EXIF Orientation tag (Exif.Image.Orientation) specifying the actual orientation.

It has the following values: 1 = Landscape Left, 6 = Portrait Normal, 3 = Landscape Right, 8 = Portrait Upside Down.

In iOS, the EXIF info is properly read and the images are displayed the same way they were taken. But in Windows, the EXIF info is NOT used.

If you open one of these images in GIMP, it will say that the image has rotation info.
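
Following on from that (a sketch, not part of the original answer): if a photo has to look right in viewers that ignore EXIF, like the Windows tools mentioned above, the rotation has to be baked into the pixels before exporting. UIImage's draw(in:) honors imageOrientation, so redrawing into a new bitmap and re-encoding does exactly that:

import UIKit

// Returns a copy of `image` whose pixel data is already upright, so the
// orientation flag is no longer needed (the result comes back as .up).
func normalizedImage(_ image: UIImage) -> UIImage {
    guard image.imageOrientation != .up else { return image }
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: image.size))
    return UIGraphicsGetImageFromCurrentImageContext() ?? image
}

The trade-off is a full re-render and re-encode of the image data, which is exactly the work the orientation tag was designed to avoid.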

I know exactly what your problem is. You are using UIImagePicker, which is weird in every sense. I would suggest you use AVFoundation for the camera, which gives you flexibility in orientation as well as quality. Use AVCaptureSession. You can get the code here: How to save photos taken using AVFoundation to Photo Album?
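
For reference, a minimal capture-session skeleton (a sketch: this uses AVCapturePhotoOutput, the current replacement for the AVCaptureStillImageOutput API used in answers from that era; session configuration details, permissions, and the delegate implementation are omitted):

import AVFoundation

// Minimal AVCaptureSession setup; the owner is assumed to keep a strong
// reference to `session` and to conform to AVCapturePhotoCaptureDelegate.
let session = AVCaptureSession()
session.sessionPreset = .photo

if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

session.startRunning() // better done off the main thread

// Later, to take a picture:
// photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)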

I came across this question because I was having a similar problem, but using Swift. Just wanted to link to the answer that worked for me for any other Swift developers: https://stackoverflow.com/a/26676578/3904581

Here's a Swift snippet that fixes the problem efficiently:

let orientedImage = UIImage(CGImage: initialImage.CGImage, scale: 1, orientation: initialImage.imageOrientation)!

Super simple. One line of code. Problem solved.
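
For reference, the same call in current Swift syntax (the snippet above is pre-Swift 3, and the initializer is non-failable now). Note that this only re-tags the image with an orientation; the underlying pixel data is untouched:

let orientedImage = UIImage(cgImage: initialImage.cgImage!,
                            scale: 1,
                            orientation: initialImage.imageOrientation)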

Quick copy/paste Swift translation of Dilip's excellent answer.

import Darwin


class func rotateCameraImageToProperOrientation(imageSource : UIImage, maxResolution : CGFloat) -> UIImage {


let imgRef = imageSource.CGImage;


let width = CGFloat(CGImageGetWidth(imgRef));
let height = CGFloat(CGImageGetHeight(imgRef));


var bounds = CGRectMake(0, 0, width, height)


var scaleRatio : CGFloat = 1
if (width > maxResolution || height > maxResolution) {


scaleRatio = min(maxResolution / bounds.size.width, maxResolution / bounds.size.height)
bounds.size.height = bounds.size.height * scaleRatio
bounds.size.width = bounds.size.width * scaleRatio
}


var transform = CGAffineTransformIdentity
let orient = imageSource.imageOrientation
let imageSize = CGSizeMake(CGFloat(CGImageGetWidth(imgRef)), CGFloat(CGImageGetHeight(imgRef)))




switch(imageSource.imageOrientation) {
case .Up :
transform = CGAffineTransformIdentity


case .UpMirrored :
transform = CGAffineTransformMakeTranslation(imageSize.width, 0.0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);


case .Down :
transform = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI));


case .DownMirrored :
transform = CGAffineTransformMakeTranslation(0.0, imageSize.height);
transform = CGAffineTransformScale(transform, 1.0, -1.0);


case .Left :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransformMakeTranslation(0.0, imageSize.width);
transform = CGAffineTransformRotate(transform, 3.0 * CGFloat(M_PI) / 2.0);


case .LeftMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, imageSize.width);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
transform = CGAffineTransformRotate(transform, 3.0 * CGFloat(M_PI) / 2.0);


case .Right :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI) / 2.0);


case .RightMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransformMakeScale(-1.0, 1.0);
transform = CGAffineTransformRotate(transform, CGFloat(M_PI) / 2.0);


default : ()
}


UIGraphicsBeginImageContext(bounds.size)
let context = UIGraphicsGetCurrentContext()


if orient == .Right || orient == .Left {
CGContextScaleCTM(context, -scaleRatio, scaleRatio);
CGContextTranslateCTM(context, -height, 0);
} else {
CGContextScaleCTM(context, scaleRatio, -scaleRatio);
CGContextTranslateCTM(context, 0, -height);
}


CGContextConcatCTM(context, transform);
CGContextDrawImage(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, width, height), imgRef);


let imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();


return imageCopy;
}

"My question this time is why is this happening? Why is Apple rotating images?"

The answer to this is very simple. Apple is NOT rotating the image. That's where the confusion lies.

The CCD camera doesn't rotate, so it's always taking the photo in landscape mode.

Apple did a very smart thing - instead of spending all the time to rotate the image - shuffling megabytes of data around - just tag it with HOW the picture was taken.

OpenGL does translations very easily, so the DATA never gets shuffled - just HOW IT'S DRAWN.

Hence the orientation meta data.

This becomes a problem if you want to crop, resize etc - but once you know what's happening, you just define your matrix and everything works out.
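
A small illustration of that point (a sketch; `photo` stands for a portrait shot straight from the camera, and the numbers are examples): the bitmap itself stays landscape, only the flag changes, which also matches the "imageOrientation is 3" observation in the question.

// photo: UIImage taken in portrait with the rear camera (hypothetical values).
print(photo.imageOrientation.rawValue)              // 3 (.right) for a typical portrait shot
print(photo.size)                                   // oriented size in points, e.g. (3024.0, 4032.0)
print(photo.cgImage!.width, photo.cgImage!.height)  // raw bitmap, e.g. 4032 3024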

Try changing the image format to .jpeg. This worked for me
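
For context (an assumption worth checking, not stated in the answer above): UIImage's JPEG export writes the orientation into the file's EXIF, whereas PNG round-trips have historically dropped it, which may be why switching formats helped. A minimal sketch, assuming `image` comes straight from UIImagePickerController:

import UIKit

let url = FileManager.default.temporaryDirectory.appendingPathComponent("photo.jpg")
if let jpegData = image.jpegData(compressionQuality: 0.9) {
    // The JPEG keeps the EXIF orientation tag, so EXIF-aware viewers still
    // display it upright without any redrawing.
    try? jpegData.write(to: url)
}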

For anyone else using Xamarin, here's a C# translation of Dilip's great answer, and a thanks to thattyson for the Swift translation.

public static UIImage RotateCameraImageToProperOrientation(UIImage imageSource, nfloat maxResolution) {


var imgRef = imageSource.CGImage;


var width = (nfloat)imgRef.Width;
var height = (nfloat)imgRef.Height;


var bounds = new CGRect(0, 0, width, height);


nfloat scaleRatio = 1;


if (width > maxResolution || height > maxResolution)
{
scaleRatio = (nfloat)Math.Min(maxResolution / bounds.Width, maxResolution / bounds.Height);
bounds.Height = bounds.Height * scaleRatio;
bounds.Width = bounds.Width * scaleRatio;
}


var transform = CGAffineTransform.MakeIdentity();
var orient = imageSource.Orientation;
var imageSize = new CGSize(imgRef.Width, imgRef.Height);
nfloat storedHeight;


switch(imageSource.Orientation) {
case UIImageOrientation.Up:
transform = CGAffineTransform.MakeIdentity();
break;


case UIImageOrientation.UpMirrored :
transform = CGAffineTransform.MakeTranslation(imageSize.Width, 0.0f);
transform = CGAffineTransform.Scale(transform, -1.0f, 1.0f);
break;


case UIImageOrientation.Down :
transform = CGAffineTransform.MakeTranslation(imageSize.Width, imageSize.Height);
transform = CGAffineTransform.Rotate(transform, (nfloat)Math.PI);
break;


case UIImageOrientation.DownMirrored :
transform = CGAffineTransform.MakeTranslation(0.0f, imageSize.Height);
transform = CGAffineTransform.Scale(transform, 1.0f, -1.0f);
break;


case UIImageOrientation.Left:
storedHeight = bounds.Height;
bounds.Height = bounds.Width;
bounds.Width = storedHeight;
transform = CGAffineTransform.MakeTranslation(0.0f, imageSize.Width);
transform = CGAffineTransform.Rotate(transform, 3.0f * (nfloat)Math.PI / 2.0f);
break;


case UIImageOrientation.LeftMirrored :
storedHeight = bounds.Height;
bounds.Height = bounds.Width;
bounds.Width = storedHeight;
transform = CGAffineTransform.MakeTranslation(imageSize.Height, imageSize.Width);
transform = CGAffineTransform.Scale(transform, -1.0f, 1.0f);
transform = CGAffineTransform.Rotate(transform, 3.0f * (nfloat)Math.PI / 2.0f);
break;


case UIImageOrientation.Right :
storedHeight = bounds.Height;
bounds.Height = bounds.Width;
bounds.Width = storedHeight;
transform = CGAffineTransform.MakeTranslation(imageSize.Height, 0.0f);
transform = CGAffineTransform.Rotate(transform, (nfloat)Math.PI / 2.0f);
break;


case UIImageOrientation.RightMirrored :
storedHeight = bounds.Height;
bounds.Height = bounds.Width;
bounds.Width = storedHeight;
transform = CGAffineTransform.MakeScale(-1.0f, 1.0f);
transform = CGAffineTransform.Rotate(transform, (nfloat)Math.PI / 2.0f);
break;


default :
break;
}


UIGraphics.BeginImageContext(bounds.Size);
var context = UIGraphics.GetCurrentContext();


if (orient == UIImageOrientation.Right || orient == UIImageOrientation.Left) {
context.ScaleCTM(-scaleRatio, scaleRatio);
context.TranslateCTM(-height, 0);
} else {
context.ScaleCTM(scaleRatio, -scaleRatio);
context.TranslateCTM(0, -height);
}


context.ConcatCTM(transform);
context.DrawImage(new CGRect(0, 0, width, height), imgRef);


var imageCopy = UIGraphics.GetImageFromCurrentImageContext();
UIGraphics.EndImageContext();


return imageCopy;
}

Quickly refactored for Swift 3 (can someone test it and confirm everything works ok?):

static func rotateCameraImageToProperOrientation(imageSource : UIImage, maxResolution : CGFloat) -> UIImage {
let imgRef = imageSource.cgImage


let width = CGFloat(imgRef!.width)
let height = CGFloat(imgRef!.height)


var bounds = CGRect(x: 0, y: 0, width: width, height: height)


var scaleRatio : CGFloat = 1
if width > maxResolution || height > maxResolution {


scaleRatio = min(maxResolution / bounds.size.width, maxResolution / bounds.size.height)
bounds.size.height = bounds.size.height * scaleRatio
bounds.size.width = bounds.size.width * scaleRatio
}


var transform = CGAffineTransform.identity
let orient = imageSource.imageOrientation
let imageSize = CGSize(width: imgRef!.width, height: imgRef!.height)


switch imageSource.imageOrientation {
case .up :
transform = CGAffineTransform.identity


case .upMirrored :
transform = CGAffineTransform(translationX: imageSize.width, y: 0)
transform = transform.scaledBy(x: -1, y: 1)


case .down :
transform = CGAffineTransform(translationX: imageSize.width, y: imageSize.height)
transform = transform.rotated(by: CGFloat.pi)


case .downMirrored :
transform = CGAffineTransform(translationX: 0, y: imageSize.height)
transform = transform.scaledBy(x: 1, y: -1)


case .left :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width
bounds.size.width = storedHeight
transform = CGAffineTransform(translationX: 0, y: imageSize.width)
transform = transform.rotated(by: 3.0 * CGFloat.pi / 2.0)


case .leftMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width
bounds.size.width = storedHeight
transform = CGAffineTransform(translationX: imageSize.height, y: imageSize.width)
transform = transform.scaledBy(x: -1, y: 1)
transform = transform.rotated(by: 3.0 * CGFloat.pi / 2.0)


case .right :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width
bounds.size.width = storedHeight
transform = CGAffineTransform(translationX: imageSize.height, y: 0)
transform = transform.rotated(by: CGFloat.pi / 2.0)


case .rightMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width
bounds.size.width = storedHeight
transform = CGAffineTransform(scaleX: -1, y: 1)
transform = transform.rotated(by: CGFloat.pi / 2.0)


}


UIGraphicsBeginImageContext(bounds.size)
let context = UIGraphicsGetCurrentContext()


if orient == .right || orient == .left {


context!.scaleBy(x: -scaleRatio, y: scaleRatio)
context!.translateBy(x: -height, y: 0)
} else {
context!.scaleBy(x: scaleRatio, y: -scaleRatio)
context!.translateBy(x: 0, y: -height)
}


context!.concatenate(transform)
context!.draw(imgRef!, in: CGRect(x: 0, y: 0, width: width, height: height))


let imageCopy = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()


return imageCopy!
}

Swift 4 version of Dilip's answer, with safety checks.

public static func rotateCameraImageToProperOrientation(imageSource : UIImage, maxResolution : CGFloat = 320) -> UIImage? {


guard let imgRef = imageSource.cgImage else {
return nil
}


let width = CGFloat(imgRef.width)
let height = CGFloat(imgRef.height)


var bounds = CGRect(x: 0, y: 0, width: width, height: height)


var scaleRatio : CGFloat = 1
if (width > maxResolution || height > maxResolution) {


scaleRatio = min(maxResolution / bounds.size.width, maxResolution / bounds.size.height)
bounds.size.height = bounds.size.height * scaleRatio
bounds.size.width = bounds.size.width * scaleRatio
}


var transform = CGAffineTransform.identity
let orient = imageSource.imageOrientation
let imageSize = CGSize(width: CGFloat(imgRef.width), height: CGFloat(imgRef.height))


switch(imageSource.imageOrientation) {
case .up:
transform = .identity
case .upMirrored:
transform = CGAffineTransform
.init(translationX: imageSize.width, y: 0)
.scaledBy(x: -1.0, y: 1.0)
case .down:
transform = CGAffineTransform
.init(translationX: imageSize.width, y: imageSize.height)
.rotated(by: CGFloat.pi)
case .downMirrored:
transform = CGAffineTransform
.init(translationX: 0, y: imageSize.height)
.scaledBy(x: 1.0, y: -1.0)
case .left:
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform
.init(translationX: 0, y: imageSize.width)
.rotated(by: 3.0 * CGFloat.pi / 2.0)
case .leftMirrored:
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform
.init(translationX: imageSize.height, y: imageSize.width)
.scaledBy(x: -1.0, y: 1.0)
.rotated(by: 3.0 * CGFloat.pi / 2.0)
case .right :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform
.init(translationX: imageSize.height, y: 0)
.rotated(by: CGFloat.pi / 2.0)
case .rightMirrored:
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform
.init(scaleX: -1.0, y: 1.0)
.rotated(by: CGFloat.pi / 2.0)
}


UIGraphicsBeginImageContext(bounds.size)
if let context = UIGraphicsGetCurrentContext() {
if orient == .right || orient == .left {
context.scaleBy(x: -scaleRatio, y: scaleRatio)
context.translateBy(x: -height, y: 0)
} else {
context.scaleBy(x: scaleRatio, y: -scaleRatio)
context.translateBy(x: 0, y: -height)
}


context.concatenate(transform)
context.draw(imgRef, in: CGRect(x: 0, y: 0, width: width, height: height))
}


let imageCopy = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()


return imageCopy
}
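
A hypothetical usage example for the static helper above, from a UIImagePickerController delegate. `ImageHelper` and `PhotoPickerDelegate` are placeholder names (the function can live on whatever type you prefer), and the delegate signature is the current UIKit one:

import UIKit

final class PhotoPickerDelegate: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        defer { picker.dismiss(animated: true) }
        guard let original = info[.originalImage] as? UIImage else { return }
        // Redraws the photo upright (and scales it to fit maxResolution)
        // before any pixel-level work or saving.
        let upright = ImageHelper.rotateCameraImageToProperOrientation(imageSource: original,
                                                                       maxResolution: 1024) ?? original
        _ = upright // use `upright` from here on
    }
}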

Here is a Swift 3 version of Dilip's awesome answer:

func rotateCameraImageToProperOrientation(imageSource : UIImage, maxResolution : CGFloat) -> UIImage {
let imgRef = imageSource.cgImage!;


let width = CGFloat(imgRef.width);
let height = CGFloat(imgRef.height);


var bounds = CGRect(x: 0, y: 0, width: width, height: height)


var scaleRatio : CGFloat = 1
if (width > maxResolution || height > maxResolution) {
scaleRatio = min(maxResolution / bounds.size.width, maxResolution / bounds.size.height)
bounds.size.height = bounds.size.height * scaleRatio
bounds.size.width = bounds.size.width * scaleRatio
}


var transform = CGAffineTransform.identity
let orient = imageSource.imageOrientation
let imageSize = CGSize(width: width, height: height)




switch(imageSource.imageOrientation) {
case .up :
transform = CGAffineTransform.identity


case .upMirrored :
transform = CGAffineTransform(translationX: imageSize.width, y: 0.0);
transform = transform.scaledBy(x: -1, y: 1);


case .down :
transform = CGAffineTransform(translationX: imageSize.width, y: imageSize.height);
transform = transform.rotated(by: CGFloat(Double.pi));


case .downMirrored :
transform = CGAffineTransform(translationX: 0.0, y: imageSize.height);
transform = transform.scaledBy(x: 1, y: -1);


case .left :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform(translationX: 0.0, y: imageSize.width);
transform = transform.rotated(by: 3.0 * CGFloat(Double.pi) / 2.0);


case .leftMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform(translationX: imageSize.height, y: imageSize.width);
transform = transform.scaledBy(x: -1, y: 1);
transform = transform.rotated(by: 3.0 * CGFloat(Double.pi) / 2.0);


case .right :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform(translationX: imageSize.height, y: 0.0);
transform = transform.rotated(by: CGFloat(Double.pi) / 2.0);


case .rightMirrored :
let storedHeight = bounds.size.height
bounds.size.height = bounds.size.width;
bounds.size.width = storedHeight;
transform = CGAffineTransform(scaleX: -1.0, y: 1.0);
transform = transform.rotated(by: CGFloat(Double.pi) / 2.0);
}


UIGraphicsBeginImageContext(bounds.size)
let context = UIGraphicsGetCurrentContext()!


if orient == .right || orient == .left {
context.scaleBy(x: -scaleRatio, y: scaleRatio);
context.translateBy(x: -height, y: 0);
} else {
context.scaleBy(x: scaleRatio, y: -scaleRatio);
context.translateBy(x: 0, y: -height);
}


context.concatenate(transform);
context.draw(imgRef, in: CGRect(x: 0, y: 0, width: width, height: height))


let imageCopy = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();


return imageCopy!;
}

You can fix this rotated HEIC image with the following code.

Swift 5 Extension:

extension UIImage {
/// Fix image orientation to portrait up
func fixedOrientation() -> UIImage? {
guard imageOrientation != UIImage.Orientation.up else {
// This is default orientation, don't need to do anything
return self.copy() as? UIImage
}


guard let cgImage = self.cgImage else {
// CGImage is not available
return nil
}


guard let colorSpace = cgImage.colorSpace, let ctx = CGContext(data: nil, width: Int(size.width), height: Int(size.height), bitsPerComponent: cgImage.bitsPerComponent, bytesPerRow: 0, space: colorSpace, bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else {
return nil // Not able to create CGContext
}


var transform: CGAffineTransform = CGAffineTransform.identity


switch imageOrientation {
case .down, .downMirrored:
transform = transform.translatedBy(x: size.width, y: size.height)
transform = transform.rotated(by: CGFloat.pi)
case .left, .leftMirrored:
transform = transform.translatedBy(x: size.width, y: 0)
transform = transform.rotated(by: CGFloat.pi / 2.0)
case .right, .rightMirrored:
transform = transform.translatedBy(x: 0, y: size.height)
transform = transform.rotated(by: CGFloat.pi / -2.0)
case .up, .upMirrored:
break
@unknown default:
fatalError("Missing...")
break
}


// Flip image one more time if needed to, this is to prevent flipped image
switch imageOrientation {
case .upMirrored, .downMirrored:
transform = transform.translatedBy(x: size.width, y: 0)
transform = transform.scaledBy(x: -1, y: 1)
case .leftMirrored, .rightMirrored:
transform = transform.translatedBy(x: size.height, y: 0)
transform = transform.scaledBy(x: -1, y: 1)
case .up, .down, .left, .right:
break
@unknown default:
fatalError("Missing...")
break
}


ctx.concatenate(transform)


switch imageOrientation {
case .left, .leftMirrored, .right, .rightMirrored:
ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: size.height, height: size.width))
default:
ctx.draw(cgImage, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
break
}


guard let newCGImage = ctx.makeImage() else { return nil }
return UIImage.init(cgImage: newCGImage, scale: 1, orientation: .up)
}
}
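
Hypothetical usage right after picking (`picked` stands for the UIImage you got back), falling back to the original if the redraw fails:

let upright = picked.fixedOrientation() ?? picked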

Objective-C code:

- (UIImage *)scaleAndRotateImage:(UIImage *)image {
    // No-op if the orientation is already correct
    if (image.imageOrientation == UIImageOrientationUp) return image;

    // We need to calculate the proper transformation to make the image upright.
    // We do it in 2 steps: Rotate if Left/Right/Down, and then flip if Mirrored.
    CGAffineTransform transform = CGAffineTransformIdentity;

    switch (image.imageOrientation) {
        case UIImageOrientationDown:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, image.size.height);
            transform = CGAffineTransformRotate(transform, M_PI);
            break;

        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, 0);
            transform = CGAffineTransformRotate(transform, M_PI_2);
            break;

        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, 0, image.size.height);
            transform = CGAffineTransformRotate(transform, -M_PI_2);
            break;

        case UIImageOrientationUp:
        case UIImageOrientationUpMirrored:
            break;
    }

    switch (image.imageOrientation) {
        case UIImageOrientationUpMirrored:
        case UIImageOrientationDownMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.width, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRightMirrored:
            transform = CGAffineTransformTranslate(transform, image.size.height, 0);
            transform = CGAffineTransformScale(transform, -1, 1);
            break;

        case UIImageOrientationUp:
        case UIImageOrientationDown:
        case UIImageOrientationLeft:
        case UIImageOrientationRight:
            break;
    }

    // Now we draw the underlying CGImage into a new context, applying the transform
    // calculated above.
    CGContextRef ctx = CGBitmapContextCreate(NULL, image.size.width, image.size.height,
                                             CGImageGetBitsPerComponent(image.CGImage), 0,
                                             CGImageGetColorSpace(image.CGImage),
                                             CGImageGetBitmapInfo(image.CGImage));
    CGContextConcatCTM(ctx, transform);
    switch (image.imageOrientation) {
        case UIImageOrientationLeft:
        case UIImageOrientationLeftMirrored:
        case UIImageOrientationRight:
        case UIImageOrientationRightMirrored:
            // Grr...
            CGContextDrawImage(ctx, CGRectMake(0, 0, image.size.height, image.size.width), image.CGImage);
            break;

        default:
            CGContextDrawImage(ctx, CGRectMake(0, 0, image.size.width, image.size.height), image.CGImage);
            break;
    }

    // And now we just create a new UIImage from the drawing context
    CGImageRef cgimg = CGBitmapContextCreateImage(ctx);
    UIImage *img = [UIImage imageWithCGImage:cgimg];
    CGContextRelease(ctx);
    CGImageRelease(cgimg);
    return img;
}

Usage of the code:

UIImage *img = [info objectForKey:UIImagePickerControllerOriginalImage];
img = [self scaleAndRotateImage:img];
NSData *image = UIImageJPEGRepresentation(img, 0.1);