Problem setting the video frame rate with AVAssetWriter/AVAssetReader

Situation:

I am trying to export a video with a set of parameters such as video bit rate, audio bit rate, frame rate, a changed video resolution, and so on. Note that I let the user set the video frame rate as a fraction; for example, the user can set the frame rate to 23.98.

I use AVAssetWriter and AVAssetReader to perform this operation, and I write the sample buffers through a pixel buffer adaptor (AVAssetWriterInputPixelBufferAdaptor).

Everything works well except the video frame rate.
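For context, the pipeline looks roughly like this (a simplified sketch, not my full code: sourceURL, destinationURL and the hard-coded settings are placeholders, and audio and error handling are omitted):

import AVFoundation
import Foundation

// Simplified sketch of the reader -> pixel-buffer-adaptor -> writer pipeline.
func export(from sourceURL: URL, to destinationURL: URL) throws {
    let asset = AVAsset(url: sourceURL)
    let videoTrack = asset.tracks(withMediaType: .video)[0]

    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    let writer = try AVAssetWriter(outputURL: destinationURL, fileType: .mp4)
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,          // placeholder; taken from the user's settings in my code
        AVVideoHeightKey: 720,
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 6_000_000]
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(writerInput)

    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: .zero)

    // Every decoded frame is copied through; this loop is where the frame rate
    // would have to be enforced.
    while let sample = readerOutput.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        while !writerInput.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
        _ = adaptor.append(pixelBuffer, withPresentationTime: pts)
    }

    writerInput.markAsFinished()
    writer.finishWriting { }
}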

What I have tried:

  1. Setting AVAssetWriter.movieTimeScale as suggested here. This does change the video frame rate, but it also makes the video playback sluggish.

  2. Setting AVVideoExpectedSourceFrameRateKey. This did not help.

  3. Setting AVAssetWriterInput.mediaTimeScale. Again, it changes the video frame rate but makes the video sluggish, just like AVAssetWriter.movieTimeScale does. The video shows different frames at some points, and sometimes it stalls and then resumes.

  4. Using a video composition output and setting frameDuration on the video composition, the way SDAVAssetExportSession does; see the sketch after this list. Ironically, with SDAVAssetExportSession's code the video is exported at exactly the frame rate I want, but the same approach does not work in my code.

I am not sure why it does not work in my code. The problem with this approach is that copyNextSampleBuffer() always returns nil.


  5. Manually changing the timestamps of the frames with CMSampleTimingInfo, as suggested here:
var sampleTimingInfo = CMSampleTimingInfo()
var sampleBufferToWrite: CMSampleBuffer?

CMSampleBufferGetSampleTimingInfo(vBuffer, at: 0, timingInfoOut: &sampleTimingInfo)

sampleTimingInfo.duration = CMTimeMake(value: 100, timescale: Int32(videoConfig.videoFrameRate * 100))
sampleTimingInfo.presentationTimeStamp = CMTimeAdd(previousPresentationTimeStamp, sampleTimingInfo.duration)
previousPresentationTimeStamp = sampleTimingInfo.presentationTimeStamp

let status = CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                                   sampleBuffer: vBuffer,
                                                   sampleTimingEntryCount: 1,
                                                   sampleTimingArray: &sampleTimingInfo,
                                                   sampleBufferOut: &sampleBufferToWrite)

With this approach I do get the frame rate set properly, but it increases the duration of the video (as mentioned in the comments on that answer). I think in some cases I will have to drop frames (when the target frame rate is lower, which is what I need most of the time).
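For reference, the video-composition route from attempt 4 above (the one that works inside SDAVAssetExportSession but not in my code) is set up roughly like this; this is only a sketch with placeholder names:

import AVFoundation

// Sketch of attempt 4: read frames through an AVAssetReaderVideoCompositionOutput
// whose video composition carries the desired frameDuration.
func makeVideoCompositionOutput(for asset: AVAsset, targetFPS: Double) -> AVAssetReaderVideoCompositionOutput {
    let output = AVAssetReaderVideoCompositionOutput(
        videoTracks: asset.tracks(withMediaType: .video),
        videoSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    // For 23.98 fps this is 100/2398; rounding avoids ending up with 2397 from floating-point error.
    composition.frameDuration = CMTime(value: 100, timescale: CMTimeScale((targetFPS * 100).rounded()))
    output.videoComposition = composition
    output.alwaysCopiesSampleData = false
    return output
}

This output is added to the AVAssetReader in place of the plain track output; in my code this is the configuration whose copyNextSampleBuffer() always returns nil.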

If I know that I want 30 fps and my current frame rate is 60 fps, it is easy to drop every second frame and set the sample-buffer timing accordingly.

If I go with this approach (i.e. setting 23.98 fps), how do I decide which frames to drop, and, if the target frame rate is higher, which frames to duplicate? Reminder: the frame rate can be fractional.



Here is an idea for selecting frames. Suppose the fps of the source video is F and the target fps is TF, and let rate = TF/F.

Initialize a variable n to -rate and add rate to it once per source frame; whenever the integer part of n changes, select that frame.

For example, with rate = 0.3:

frame index:    0     1     2     3     4     5     6     7
n before:     -0.3   0.0   0.3   0.6   0.9   1.2   1.5   1.8
n after:       0.0   0.3   0.6   0.9   1.2   1.5   1.8   2.1

The integer part of n changes at frames 0, 4 and 7, so frames 0, 4 and 7 are selected.
#include <math.h>
#include <stdio.h>

int main(void) {
    float rate = 0.39999f;  // rate = TF / F
    float n = -rate;        // start below zero so that the first frame is always selected
    for (int i = 0; i < 100; ++i, n += rate) {  // i is the source frame index; 100 frames as an example
        int before = (int)floorf(n);            // integer part before this frame
        int after  = (int)floorf(n + rate);     // integer part after this frame
        // after - before == 0 -> drop the frame (happens when rate < 1.0)
        // after - before == 1 -> use the frame once
        // after - before  > 1 -> repeat the frame (happens when rate > 1.0)
        for (int j = before; j < after; ++j) {
            printf("%d ", i);                   // use source frame i
        }
    }
    printf("\n");
    return 0;
}
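Applied to the AVAssetReader/AVAssetWriter loop from the question, the same selection rule could look roughly like this in Swift. This is only a sketch: readerOutput, writerInput, sourceFPS and targetFPS are assumed to be set up already, the retimed sample buffers are appended directly to the writer input rather than through a pixel buffer adaptor, and audio and error handling are omitted.

import AVFoundation
import Foundation

// Drop or duplicate frames with the accumulator rule above while copying from
// an AVAssetReaderTrackOutput to an AVAssetWriterInput, retiming every written
// frame to a constant targetFPS cadence.
func copyFrames(readerOutput: AVAssetReaderTrackOutput,
                writerInput: AVAssetWriterInput,
                sourceFPS: Double,
                targetFPS: Double) {
    let rate = targetFPS / sourceFPS                        // rate = TF / F
    var n = -rate                                           // so the first frame is always selected
    // Rounded so that 23.98 fps becomes timescale 2398, not 2397.
    let timescale = CMTimeScale((targetFPS * 100).rounded())
    let frameDuration = CMTime(value: 100, timescale: timescale)
    var outputIndex: CMTimeValue = 0

    while let sample = readerOutput.copyNextSampleBuffer() {
        // 0 = drop, 1 = keep once, >1 = duplicate (when targetFPS > sourceFPS).
        let repeats = Int(floor(n + rate)) - Int(floor(n))
        n += rate

        for _ in 0..<max(repeats, 0) {
            var timing = CMSampleTimingInfo(
                duration: frameDuration,
                presentationTimeStamp: CMTime(value: outputIndex * 100, timescale: timescale),
                decodeTimeStamp: .invalid)
            var retimed: CMSampleBuffer?
            let status = CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                                               sampleBuffer: sample,
                                                               sampleTimingEntryCount: 1,
                                                               sampleTimingArray: &timing,
                                                               sampleBufferOut: &retimed)
            if status == noErr, let retimed = retimed {
                while !writerInput.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }
                _ = writerInput.append(retimed)
            }
            outputIndex += 1
        }
    }
}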
NSMutableDictionary *writerInputParams = [[NSMutableDictionary alloc] init];
[writerInputParams setObject:AVVideoCodecTypeH264 forKey:AVVideoCodecKey];
[writerInputParams setObject:[NSNumber numberWithInt:width] forKey:AVVideoWidthKey];
[writerInputParams setObject:[NSNumber numberWithInt:height] forKey:AVVideoHeightKey];
[writerInputParams setObject:AVVideoScalingModeResizeAspectFill forKey:AVVideoScalingModeKey];

NSMutableDictionary *compressionProperties = [[NSMutableDictionary alloc] init];
[compressionProperties setObject:[NSNumber numberWithInt:20] forKey:AVVideoExpectedSourceFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithInt:20] forKey:AVVideoAverageNonDroppableFrameRateKey];
[compressionProperties setObject:[NSNumber numberWithDouble:0.0] forKey:AVVideoMaxKeyFrameIntervalDurationKey];
[compressionProperties setObject:[NSNumber numberWithInt:1] forKey:AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject:[NSNumber numberWithBool:YES] forKey:AVVideoAllowFrameReorderingKey];
[compressionProperties setObject:AVVideoProfileLevelH264BaselineAutoLevel forKey:AVVideoProfileLevelKey];
[writerInputParams setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;

It has been verified that SCNView refreshes at 60 frames per second, but I want AVAssetWriter to save only 20 frames per second. What should be done?

Neither AVVideoExpectedSourceFrameRateKey nor AVVideoAverageNonDroppableFrameRateKey above affects the fps; configuring the fps through these keys simply does not work.

// Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
self.videoWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);
self.videoWriter.shouldOptimizeForNetworkUse = YES;
self.videoWriter.movieTimeScale = 20;

The configuration above does not affect the fps either.

self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerInputParams];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
// This setting changes the frames' presentation times to fit the fps, but it also changes the video duration:
// self.assetWriterInput.mediaTimeScale = 20;

Setting self.assetWriterInput.mediaTimeScale does affect the fps, but it stretches the video duration by a factor of 3, because the presentation time passed in

BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];

is re-mapped for the appended frame. So configuring self.assetWriterInput.mediaTimeScale gives a result that is seriously inconsistent with expectations: the video duration should not be stretched.

So if you want to control the fps of the video that AVAssetWriter finally saves, you must take control of the presentation times yourself and make sure that you append exactly 20 frames per second:

CMTime presentationTime = CMTimeMake(_writeCount * (1.0/20.0) * 1000, 1000);
BOOL isSUc = [self.writerAdaptor appendPixelBuffer:cvBuffer withPresentationTime:presentationTime];
_writeCount += 1;
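For comparison, the same frame-counter timestamping in Swift (a minimal sketch; adaptor, pixelBuffer and writeCount are assumed to exist, and 20 is the target fps):

// Drive the presentation time from a frame counter: frame i is shown at i/20 s.
let targetFPS: Int32 = 20
let presentationTime = CMTimeMake(value: Int64(writeCount), timescale: targetFPS)
if adaptor.assetWriterInput.isReadyForMoreMediaData {
    _ = adaptor.append(pixelBuffer, withPresentationTime: presentationTime)
    writeCount += 1
}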