Keeping good scrolling performance while using AVPlayer

I'm working on an application where there is a collection view, and cells of the collection view can contain video. Right now I'm displaying the video using AVPlayer and AVPlayerLayer. Unfortunately, the scrolling performance is terrible. It seems like AVPlayer, AVPlayerItem, and AVPlayerLayer do a lot of their work on the main thread. They are constantly taking out locks, waiting on semaphores, etc., which is blocking the main thread and causing severe frame drops.

Is there any way to tell AVPlayer to stop doing so many things on the main thread? So far nothing I've tried has solved the problem.

I also tried building a simple video player using AVSampleBufferDisplayLayer. Using that I can make sure that everything happens off the main thread, and I can achieve ~60 fps while scrolling and playing video. Unfortunately that method is much lower level, and it doesn't provide things like audio playback and time scrubbing out of the box. Is there any way to get similar performance with AVPlayer? I'd much rather use that.
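For reference, the AVSampleBufferDisplayLayer approach can be sketched roughly like this (a minimal illustration only, not my actual player — it assumes a local asset and ignores audio, timing control, and error handling, which is exactly the limitation mentioned above):

```swift
import AVFoundation

// Rough sketch: decode a local asset with AVAssetReader and feed the frames
// to an AVSampleBufferDisplayLayer entirely off the main thread.
// No audio and no scrubbing - the trade-offs mentioned above.
func attachVideo(url: URL, to displayLayer: AVSampleBufferDisplayLayer) {
    let decodeQueue = DispatchQueue(label: "video.decode")
    decodeQueue.async {
        let asset = AVURLAsset(url: url)
        guard let track = asset.tracks(withMediaType: .video).first,
              let reader = try? AVAssetReader(asset: asset) else { return }
        let output = AVAssetReaderTrackOutput(
            track: track,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                             kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange])
        reader.add(output)
        reader.startReading()
        // The layer pulls sample buffers on the decode queue whenever it
        // can accept more, so the main thread is never touched.
        displayLayer.requestMediaDataWhenReady(on: decodeQueue) {
            while displayLayer.isReadyForMoreMediaData {
                guard let sample = output.copyNextSampleBuffer() else {
                    displayLayer.stopRequestingMediaData()
                    reader.cancelReading()
                    return
                }
                displayLayer.enqueue(sample)
            }
        }
    }
}
```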

Edit: After looking into this more, it doesn't look like it's possible to achieve good scrolling performance when using AVPlayer. Creating an AVPlayer and associating it with an AVPlayerItem instance kicks off a bunch of work that trampolines onto the main thread, where it then waits on semaphores and tries to acquire a bunch of locks. The amount of time the main thread stalls increases dramatically as the number of videos in the scrollview increases.

AVPlayer deallocation also seems to be a huge problem. Deallocating an AVPlayer also tries to synchronize a bunch of stuff. Again, this gets extremely bad as you create more players.

This is pretty depressing, and it makes AVPlayer almost unusable for what I'm trying to do. Blocking the main thread like this is such an amateur thing to do, so it's hard to believe Apple engineers would make this kind of mistake. Anyway, hopefully they can fix this soon.


Build your AVPlayerItem in a background queue as much as possible (some operations have to happen on the main thread, but you can do setup operations and wait for the video properties to load on background queues — read the docs very carefully). This involves a voodoo dance with KVO and is really not fun.

AVPlayer等待 AVPlayerItem状态变成 AVPlayerItemStatusReadyToPlay时,就会发生打嗝。为了减少打嗝的长度,你要尽可能多地让 AVPlayerItem在背景线程上更接近于 AVPlayerItemStatusReadyToPlay,然后再把它分配给 AVPlayer

It's been a while since I actually implemented this, but IIRC the main thread blocks are caused because the underlying AVURLAsset's properties are lazy-loaded, and if you don't load them yourself, they get busy-loaded on the main thread when the AVPlayer wants to play.

Check out the AVAsset documentation, especially the stuff around AVAsynchronousKeyValueLoading. I think we needed to load the values for duration and tracks before using the asset on an AVPlayer in order to minimize the main thread blocks. It's possible we also had to walk through each of the tracks and do AVAsynchronousKeyValueLoading on each of the segments, but I don't remember 100%.

Don't know if this will help — but here's some code I use to load videos on a background queue that definitely helps with main thread blocking (apologies if it doesn't compile 1:1; I abstracted it from a larger code base):

func loadSource() {
    self.status = .Unknown

    let operation = NSBlockOperation()
    operation.addExecutionBlock { () -> Void in
        // create the asset
        let asset = AVURLAsset(URL: self.mediaUrl, options: nil)
        // load values for track keys
        let keys = ["tracks", "duration"]
        asset.loadValuesAsynchronouslyForKeys(keys, completionHandler: { () -> Void in
            // loop through and check to make sure the keys loaded
            var keyStatusError: NSError?
            for key in keys {
                var error: NSError?
                let keyStatus: AVKeyValueStatus = asset.statusOfValueForKey(key, error: &error)
                if keyStatus == .Failed {
                    let userInfo = [NSUnderlyingErrorKey : key]
                    keyStatusError = NSError(domain: MovieSourceErrorDomain, code: MovieSourceAssetFailedToLoadKeyValueErrorCode, userInfo: userInfo)
                    println("Failed to load key: \(key), error: \(error)")
                }
                else if keyStatus != .Loaded {
                    println("Warning: Ignoring key status: \(keyStatus), for key: \(key), error: \(error)")
                }
            }
            if keyStatusError == nil {
                if operation.cancelled == false {
                    let composition = self.createCompositionFromAsset(asset)
                    // register notifications
                    let playerItem = AVPlayerItem(asset: composition)
                    self.registerNotificationsForItem(playerItem)
                    self.playerItem = playerItem
                    // create the player
                    let player = AVPlayer(playerItem: playerItem)
                    self.player = player
                }
            }
            else {
                println("Failed to load asset: \(keyStatusError)")
            }
        })
    }

    // add the operation to the queue
    SomeBackgroundQueue.addOperation(operation)
}


func createCompositionFromAsset(asset: AVAsset, repeatCount: UInt8 = 16) -> AVMutableComposition {
    let composition = AVMutableComposition()
    let timescale = asset.duration.timescale
    let duration = asset.duration.value
    let editRange = CMTimeRangeMake(CMTimeMake(0, timescale), CMTimeMake(duration, timescale))
    var error: NSError?
    let success = composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
    if success {
        for _ in 0 ..< repeatCount - 1 {
            composition.insertTimeRange(editRange, ofAsset: asset, atTime: composition.duration, error: &error)
        }
    }
    return composition
}

I managed to create a horizontal feed-like view with video cells like this:

  1. Buffering — create a manager so you can preload (buffer) the videos. The number of AVPlayers you want to buffer depends on the experience you are looking for. In my app I manage only 3 AVPlayers, so one player is playing right now while the previous and next players are buffering. All the buffering manager does is make sure that the correct video is being buffered at any given point.

  2. Reused cells — let the TableView/CollectionView reuse the cells in cellForRowAtIndexPath:; all you have to do after dequeuing the cell is hand it the correct player (I just give the buffering manager an indexPath from the cell and it returns the correct one).

  3. Whenever the buffering manager gets a call to load a new video for buffering, create the AVPlayer with all of its assets and notifications, like this:

// Player

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
    self.videoContainer.playerLayer.player = self.videoPlayer;
    self.asset = [AVURLAsset assetWithURL:[NSURL URLWithString:self.videoUrl]];
    NSString *tracksKey = @"tracks";
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.asset loadValuesAsynchronouslyForKeys:@[tracksKey] completionHandler:^{
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
                NSError *error;
                AVKeyValueStatus status = [self.asset statusOfValueForKey:tracksKey error:&error];

                if (status == AVKeyValueStatusLoaded) {
                    self.playerItem = [AVPlayerItem playerItemWithAsset:self.asset];
                    // set the notifications we need at run time on the player & item
                    // a notification if the current item state has changed
                    [self.playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:contextItemStatus];
                    // a notification if the playing item has not yet started to buffer
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferEmpty];
                    // a notification if the playing item has fully buffered
                    [self.playerItem addObserver:self forKeyPath:@"playbackBufferFull" options:NSKeyValueObservingOptionNew context:contextPlaybackBufferFull];
                    // a notification if the playing item is likely to keep up with the current buffering rate
                    [self.playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:contextPlaybackLikelyToKeepUp];
                    // a notification to get information about the duration of the playing item
                    [self.playerItem addObserver:self forKeyPath:@"duration" options:NSKeyValueObservingOptionNew context:contextDurationUpdate];
                    // a notification to get informed when the video has finished playing
                    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(itemDidFinishedPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.playerItem];
                    self.didRegisterWhenLoad = YES;

                    self.videoPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];

                    // a notification if the player has changed its rate (play/pause)
                    [self.videoPlayer addObserver:self forKeyPath:@"rate" options:NSKeyValueObservingOptionNew context:contextRateDidChange];
                    // a notification to get the buffering rate of the current playing item
                    [self.videoPlayer addObserver:self forKeyPath:@"currentItem.loadedTimeRanges" options:NSKeyValueObservingOptionNew context:contextTimeRanges];
                }
            });
        }];
    });
});

Where videoContainer is the view you want to add the player to.

Let me know if you need any help or more explanation.

Good luck!
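The three-player buffering idea from steps 1–3 above can be sketched roughly like this (a minimal illustration under my own assumptions — the VideoBufferManager name and its API are placeholders, not part of the answer's actual code):

```swift
import AVFoundation

/// Sketch of a buffering manager that keeps ~3 AVPlayers alive:
/// one for the visible cell, one for the previous, one for the next.
final class VideoBufferManager {
    private var players = [Int: AVPlayer]()   // index -> buffered player
    private let urls: [URL]

    init(urls: [URL]) { self.urls = urls }

    /// Called by the cell (step 2): returns the player for this index
    /// and re-centers the buffered window around it.
    func player(for index: Int) -> AVPlayer? {
        preload(around: index)
        return players[index]
    }

    private func preload(around center: Int) {
        let window = Set([center - 1, center, center + 1].filter { urls.indices.contains($0) })
        // drop players that fell out of the window so only ~3 stay alive
        players = players.filter { window.contains($0.key) }
        for index in window where players[index] == nil {
            let player = AVPlayer()
            players[index] = player
            // load the asset's keys asynchronously (step 3), then attach
            // the item off the main thread
            let asset = AVURLAsset(url: urls[index])
            asset.loadValuesAsynchronously(forKeys: ["tracks", "duration"]) {
                guard asset.statusOfValue(forKey: "tracks", error: nil) == .loaded else { return }
                DispatchQueue.global(qos: .background).async {
                    player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
                }
            }
        }
    }
}
```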

If you check out Facebook's AsyncDisplayKit (the engine behind Facebook's and Instagram's feeds), you can render video mostly on background threads using their ASVideoNode. If you subnode that into an ASDisplayNode and add displayNode.view to whatever view you are scrolling (table/collection/scroll), you can achieve perfectly smooth scrolling (just make sure they create the node and assets and so on on background threads). The only issue is when changing the video item, as this forces itself onto the main thread. If you only have a few videos on that particular view, you are fine to use this method!

dispatch_async(dispatch_get_global_queue(QOS_CLASS_BACKGROUND, 0), {
    self.mainNode = ASDisplayNode()
    self.videoNode = ASVideoNode()
    self.videoNode!.asset = AVAsset(URL: self.videoUrl!)
    self.videoNode!.frame = CGRectMake(0.0, 0.0, self.bounds.width, self.bounds.height)
    self.videoNode!.gravity = AVLayerVideoGravityResizeAspectFill
    self.videoNode!.shouldAutoplay = true
    self.videoNode!.shouldAutorepeat = true
    self.videoNode!.muted = true
    self.videoNode!.playButton.hidden = true

    dispatch_async(dispatch_get_main_queue(), {
        self.mainNode!.addSubnode(self.videoNode!)
        self.addSubview(self.mainNode!.view)
    })
})

Here's a working solution for displaying a "video wall" in a UICollectionView:

1) Store all of your cells in an NSMapTable (from this point forward, you will only access a cell object from the NSMapTable):

self.cellCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsWeakMemory
                                           valueOptions:NSPointerFunctionsStrongMemory
                                               capacity:AppDelegate.sharedAppDelegate.assetsFetchResults.count];
for (NSInteger i = 0; i < AppDelegate.sharedAppDelegate.assetsFetchResults.count; i++) {
    [self.cellCache setObject:(AssetPickerCollectionViewCell *)[self.collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:[NSIndexPath indexPathForItem:i inSection:0]]
                       forKey:[NSIndexPath indexPathForItem:i inSection:0]];
}

2) Add this method to your UICollectionViewCell subclass:

- (void)setupPlayer:(PHAsset *)phAsset {
    typedef void (^player) (void);
    player play = ^{
        NSString __autoreleasing *serialDispatchCellQueueDescription = ([NSString stringWithFormat:@"%@ serial cell queue", self]);
        dispatch_queue_t __autoreleasing serialDispatchCellQueue = dispatch_queue_create([serialDispatchCellQueueDescription UTF8String], DISPATCH_QUEUE_SERIAL);
        dispatch_async(serialDispatchCellQueue, ^{
            __weak typeof(self) weakSelf = self;
            __weak typeof(PHAsset) *weakPhAsset = phAsset;
            [[PHImageManager defaultManager] requestPlayerItemForVideo:weakPhAsset options:nil
                                                         resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
                if (![[info objectForKey:PHImageResultIsInCloudKey] boolValue]) {
                    AVPlayer __autoreleasing *player = [AVPlayer playerWithPlayerItem:playerItem];
                    __block typeof(AVPlayerLayer) *weakPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
                    [weakPlayerLayer setFrame:weakSelf.contentView.bounds]; //CGRectMake(self.contentView.bounds.origin.x, self.contentView.bounds.origin.y, [[UIScreen mainScreen] bounds].size.width, [[UIScreen mainScreen] bounds].size.height * (9.0/16.0))];
                    [weakPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                    [weakPlayerLayer setBorderWidth:0.25f];
                    [weakPlayerLayer setBorderColor:[UIColor whiteColor].CGColor];
                    [player play];
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [weakSelf.contentView.layer addSublayer:weakPlayerLayer];
                    });
                }
            }];
        });
    };
    play();
}

3) Call the above method from your UICollectionView delegate like this:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if ([[self.cellCache objectForKey:indexPath] isKindOfClass:[AssetPickerCollectionViewCell class]])
        [self.cellCache setObject:(AssetPickerCollectionViewCell *)[collectionView dequeueReusableCellWithReuseIdentifier:CellReuseIdentifier forIndexPath:indexPath] forKey:indexPath];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSInvocationOperation *invOp = [[NSInvocationOperation alloc]
            initWithTarget:(AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath]
                  selector:@selector(setupPlayer:)
                    object:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]];
        [[NSOperationQueue mainQueue] addOperation:invOp];
    });

    return (AssetPickerCollectionViewCell *)[self.cellCache objectForKey:indexPath];
}

By the way, here's how you can populate a PHFetchResult collection with all the videos in the Videos folder of the Photos app:

// Collect all videos in the Videos folder of the Photos app
- (PHFetchResult *)assetsFetchResults {
    __block PHFetchResult *i = self->_assetsFetchResults;
    if (!i) {
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumVideos options:nil];
            PHAssetCollection *collection = smartAlbums.firstObject;
            if (![collection isKindOfClass:[PHAssetCollection class]]) collection = nil;
            PHFetchOptions *allPhotosOptions = [[PHFetchOptions alloc] init];
            allPhotosOptions.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
            i = [PHAsset fetchAssetsInAssetCollection:collection options:allPhotosOptions];
            self->_assetsFetchResults = i;
        });
    }
    NSLog(@"assetsFetchResults (%ld)", self->_assetsFetchResults.count);

    return i;
}

And if you want to filter for videos that are stored locally (not in iCloud), which I assume you do since you're going for smooth scrolling:

// Filter out videos that are stored in iCloud
- (NSArray *)phAssets {
    NSMutableArray *assets = [NSMutableArray arrayWithCapacity:self.assetsFetchResults.count];
    [[self assetsFetchResults] enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
        if (asset.sourceType == PHAssetSourceTypeUserLibrary)
            [assets addObject:asset];
    }];

    return [NSArray arrayWithArray:(NSArray *)assets];
}

I played around with all of the answers above and found that they hold true only up to a certain limit.

The easiest and simplest way that has worked for me so far is to run the code that assigns the AVPlayerItem to your AVPlayer instance on a background thread. I noticed that assigning the AVPlayerItem to the player on the main thread (even after the AVPlayerItem object is ready) always takes a toll on your performance and frame rate.

Swift 4

Example:

let mediaUrl = ... // your media URL
let player = AVPlayer()
let playerItem = AVPlayerItem(url: mediaUrl)

DispatchQueue.global(qos: .default).async {
    player.replaceCurrentItem(with: playerItem)
}