Video Rendering
Overview
Video rendering takes decoded video frames and presents them on the display device (the screen). This document covers rendering approaches, optimization strategies, and the concrete implementations on each platform, with particular attention to Picture in Picture (PiP) rendering on iOS.
Rendering Approaches
OpenGL/OpenGL ES
- Cross-platform graphics API with broad support
- OpenGL ES is the mobile variant
- Hardware accelerated, with good performance
- Suitable for most platforms and scenarios
DirectX/Vulkan
- DirectX: the Windows graphics API
- Vulkan: a modern, low-overhead, cross-platform graphics API
- Excellent performance, suited to demanding applications
- Vulkan offers better multithreading support and lower driver overhead
Platform-Native Rendering
- Metal: the graphics API for Apple platforms (macOS/iOS)
- EGL: the window-system interface for OpenGL ES
- Uses platform-specific features for the best performance
- Optimized per platform to fully exploit the hardware
Rendering Optimization
Multithreaded Rendering
- Decode and render on separate threads (a minimal sketch follows this list)
- Avoid blocking the main thread
- Improves smoothness and responsiveness
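A minimal sketch of this separation in Swift, assuming GCD-based scheduling; decodeNext and render are placeholder callbacks standing in for the real decode and render stages:
import CoreVideo
import Dispatch
final class PipelinedPlayer {
private let decodeQueue = DispatchQueue(label: "player.decode")
private let renderQueue = DispatchQueue(label: "player.render", qos: .userInteractive)
private var isRunning = false
func start(decodeNext: @escaping () -> CVPixelBuffer?,
render: @escaping (CVPixelBuffer) -> Void) {
isRunning = true
decodeQueue.async { [weak self] in
// The decode loop runs off the main thread
while self?.isRunning == true, let frame = decodeNext() {
// Hand each decoded frame to the render queue so decoding
// never blocks on presentation, and vice versa
self?.renderQueue.async { render(frame) }
}
}
}
func stop() { isRunning = false }
}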
Frame Synchronization (VSync)
- Synchronize with the display refresh rate (sketched below)
- Avoids screen tearing
- Lowers power consumption
- Gives a smoother visual experience
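On iOS, a CADisplayLink callback is the usual way to align rendering with the display's refresh; a minimal sketch, where renderNextFrame(for:) is a placeholder:
import UIKit
final class VSyncedRenderer: NSObject {
private var displayLink: CADisplayLink?
func start() {
let link = CADisplayLink(target: self, selector: #selector(onVSync(_:)))
link.add(to: .main, forMode: .common)
displayLink = link
}
@objc private func onVSync(_ link: CADisplayLink) {
// targetTimestamp is when the next frame will be shown on screen;
// render the frame whose presentation time is closest to it
renderNextFrame(for: link.targetTimestamp)
}
private func renderNextFrame(for hostTime: CFTimeInterval) { /* placeholder */ }
func stop() {
displayLink?.invalidate()
displayLink = nil
}
}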
Scaling and Cropping
- Efficient scaling algorithms: bilinear and bicubic interpolation
- Hardware-accelerated scaling: perform the scaling on the GPU
- Supports cropping and rotation for flexible frame processing (aspect-fit math sketched below)
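The placement math is the same regardless of graphics API; a small sketch that computes an aspect-fit ("letterbox") rectangle for the GPU to scale into (aspect-fill cropping is the same computation with max in place of min):
import CoreGraphics
func aspectFitRect(content: CGSize, bounds: CGRect) -> CGRect {
let scale = min(bounds.width / content.width, bounds.height / content.height)
let size = CGSize(width: content.width * scale, height: content.height * scale)
// Center the scaled frame inside the view bounds
return CGRect(x: bounds.midX - size.width / 2,
y: bounds.midY - size.height / 2,
width: size.width,
height: size.height)
}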
Playback Control
Play/Pause/Stop
- Drives the decode and render pipeline
- State management (sketched below)
- Resource release
- Provides the basic playback controls
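A sketch of explicit state handling; the guards gate what the pipeline may do, and stop() is the natural place to release decoder and GPU resources:
enum PlayerState { case idle, playing, paused, stopped }
final class PlaybackController {
private(set) var state: PlayerState = .idle
func play() {
guard state != .playing else { return }
state = .playing // resume the decode and render loops
}
func pause() {
guard state == .playing else { return }
state = .paused // keep the decoder alive, stop the playback clock
}
func stop() {
state = .stopped // flush queues, release decoder and GPU resources
}
}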
Seeking
- Jump quickly to a given timestamp
- Locate the nearest keyframe
- Preload data around the target position
- Optimizes seek latency and reduces waiting (sketched below)
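With AVPlayer, the seek tolerances express exactly this keyframe trade-off; a sketch:
import AVFoundation
func seek(player: AVPlayer, to seconds: Double, accurate: Bool) {
let time = CMTime(seconds: seconds, preferredTimescale: 600)
if accurate {
// Decode from the previous keyframe up to the exact time (slower)
player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero)
} else {
// Land on the nearest keyframe (fast)
player.seek(to: time,
toleranceBefore: .positiveInfinity,
toleranceAfter: .positiveInfinity)
}
}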
Playback Speed Control
- Supports rate changes (0.5x, 1.5x, 2x, and so on)
- Adjusts decode and render pacing
- Keeps audio and video in sync (sketched below)
- Provides flexible speed adjustment
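With AVPlayer, the rate property scales the shared playback clock for audio and video together, and the pitch algorithm keeps speech natural at non-1x rates. A sketch:
import AVFoundation
func setPlaybackSpeed(_ player: AVPlayer, rate: Float) {
// .timeDomain preserves pitch well for speech in the 0.5x–2x range
player.currentItem?.audioTimePitchAlgorithm = .timeDomain
player.rate = rate // 0.5, 1.5, 2.0, ...
}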
Rendering Implementation Details
iOS Video Rendering Architecture
graph TB
subgraph "视频源"
VideoDecoder[视频解码器]
CVPixelBuffer[CVPixelBuffer]
end
subgraph "渲染层"
AVSampleBufferDisplayLayer[AVSampleBufferDisplayLayer]
CALayer[CALayer]
Metal[Metal渲染]
OpenGLES[OpenGL ES渲染]
end
subgraph "显示层"
UIView[UIView]
AVPlayerLayer[AVPlayerLayer]
PiPController[画中画控制器]
end
VideoDecoder --> CVPixelBuffer
CVPixelBuffer --> AVSampleBufferDisplayLayer
CVPixelBuffer --> Metal
CVPixelBuffer --> OpenGLES
AVSampleBufferDisplayLayer --> CALayer
Metal --> CALayer
OpenGLES --> CALayer
CALayer --> UIView
CALayer --> AVPlayerLayer
AVPlayerLayer --> PiPController
style PiPController fill:#e74c3c,stroke:#c0392b,stroke-width:2px,color:#fff
iOS Picture in Picture (PiP) Rendering
1. PiP Architecture
graph TD
subgraph "应用层"
ViewController[ViewController]
VideoView[视频播放视图]
end
subgraph "AVFoundation"
AVPictureInPictureController[AVPictureInPictureController]
AVPlayerLayer[AVPlayerLayer]
AVPlayer[AVPlayer]
AVSampleBufferDisplayLayer[AVSampleBufferDisplayLayer]
end
subgraph "系统层"
PiPWindow[系统画中画窗口]
SystemUI[系统UI控制]
end
ViewController --> AVPictureInPictureController
VideoView --> AVPlayerLayer
AVPlayerLayer --> AVPlayer
AVPictureInPictureController -.->|控制| AVPlayerLayer
AVPictureInPictureController -.->|或控制| AVSampleBufferDisplayLayer
AVPictureInPictureController --> PiPWindow
PiPWindow --> SystemUI
style AVPictureInPictureController fill:#3498db,stroke:#2980b9,stroke-width:2px,color:#fff
2. iOS PiP Implementation
2.1 PiP with AVPlayer
import AVKit
import AVFoundation
class VideoPlayerViewController: UIViewController {
// MARK: - Properties
private var player: AVPlayer!
private var playerLayer: AVPlayerLayer!
private var pipController: AVPictureInPictureController?
// MARK: - Setup
override func viewDidLoad() {
super.viewDidLoad()
setupPlayer()
setupPictureInPicture()
}
private func setupPlayer() {
// Create the player
let url = URL(string: "https://example.com/video.mp4")!
player = AVPlayer(url: url)
// Create the player layer
playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = view.bounds
playerLayer.videoGravity = .resizeAspect
view.layer.addSublayer(playerLayer)
}
private func setupPictureInPicture() {
// Check whether the device supports Picture in Picture
guard AVPictureInPictureController.isPictureInPictureSupported() else {
print("Picture in Picture is not supported")
return
}
// Create the PiP controller
pipController = AVPictureInPictureController(playerLayer: playerLayer)
pipController?.delegate = self
// Configure the audio session (required for PiP playback)
try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback, options: [])
try? AVAudioSession.sharedInstance().setActive(true)
}
// MARK: - PiP Control
@IBAction func togglePictureInPicture(_ sender: UIButton) {
if pipController?.isPictureInPictureActive == true {
// Stop Picture in Picture
pipController?.stopPictureInPicture()
} else {
// Start Picture in Picture
pipController?.startPictureInPicture()
}
}
}
// MARK: - AVPictureInPictureControllerDelegate
extension VideoPlayerViewController: AVPictureInPictureControllerDelegate {
func pictureInPictureControllerWillStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("PiP will start")
}
func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("PiP did start")
}
func pictureInPictureControllerWillStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("PiP will stop")
}
func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("PiP did stop")
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
failedToStartPictureInPictureWithError error: Error) {
print("PiP failed to start: \(error.localizedDescription)")
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
restoreUserInterfaceForPictureInPictureStopWithCompletionHandler completionHandler: @escaping (Bool) -> Void) {
// Restore the user interface,
// e.g. re-present the player view controller
completionHandler(true)
}
}
2.2 PiP with AVSampleBufferDisplayLayer (Custom Decoding)
import AVKit
import AVFoundation
// The stored delegate property uses an iOS 15-only protocol,
// so the whole class is marked as requiring iOS 15.
@available(iOS 15.0, *)
class CustomVideoRenderer: NSObject {
// MARK: - Properties
private var sampleBufferDisplayLayer: AVSampleBufferDisplayLayer!
private var pipController: AVPictureInPictureController?
private var videoCallDelegate: AVPictureInPictureSampleBufferPlaybackDelegate!
// MARK: - Setup
func setupCustomPictureInPicture() {
// Create the AVSampleBufferDisplayLayer
sampleBufferDisplayLayer = AVSampleBufferDisplayLayer()
sampleBufferDisplayLayer.videoGravity = .resizeAspect
// Create the playback delegate
videoCallDelegate = VideoCallPlaybackDelegate()
// Create the PiP controller (iOS 15+)
if #available(iOS 15.0, *) {
let contentSource = AVPictureInPictureController.ContentSource(
sampleBufferDisplayLayer: sampleBufferDisplayLayer,
playbackDelegate: videoCallDelegate
)
pipController = AVPictureInPictureController(contentSource: contentSource)
pipController?.delegate = self
pipController?.canStartPictureInPictureAutomaticallyFromInline = true
}
}
// MARK: - Render Frame
func renderVideoFrame(_ sampleBuffer: CMSampleBuffer) {
guard sampleBufferDisplayLayer.isReadyForMoreMediaData else {
print("Display layer is not ready for more media data")
return
}
// Enqueue the frame for display
sampleBufferDisplayLayer.enqueue(sampleBuffer)
}
func renderPixelBuffer(_ pixelBuffer: CVPixelBuffer, presentationTime: CMTime) {
// Wrap the CVPixelBuffer in a CMSampleBuffer
var sampleBuffer: CMSampleBuffer?
var formatDescription: CMFormatDescription?
let fdStatus = CMVideoFormatDescriptionCreateForImageBuffer(
allocator: kCFAllocatorDefault,
imageBuffer: pixelBuffer,
formatDescriptionOut: &formatDescription
)
guard fdStatus == noErr, let formatDesc = formatDescription else { return }
var timingInfo = CMSampleTimingInfo(
duration: .invalid,
presentationTimeStamp: presentationTime,
decodeTimeStamp: .invalid
)
CMSampleBufferCreateReadyWithImageBuffer(
allocator: kCFAllocatorDefault,
imageBuffer: pixelBuffer,
formatDescription: formatDesc,
sampleTiming: &timingInfo,
sampleBufferOut: &sampleBuffer
)
if let buffer = sampleBuffer {
renderVideoFrame(buffer)
}
}
}
// MARK: - Playback Delegate
@available(iOS 15.0, *)
class VideoCallPlaybackDelegate: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
var isPlaying = false
// .zero suits file-based content; for live content, return an indefinite range,
// e.g. CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
var timeRange: CMTimeRange = .zero
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
setPlaying playing: Bool) {
isPlaying = playing
// Notify the app layer of the playback-state change
NotificationCenter.default.post(name: .pipPlaybackStateChanged, object: playing)
}
func pictureInPictureControllerTimeRangeForPlayback(_ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
return timeRange
}
func pictureInPictureControllerIsPlaybackPaused(_ pictureInPictureController: AVPictureInPictureController) -> Bool {
return !isPlaying
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
print("画中画窗口大小变化: \(newRenderSize.width)x\(newRenderSize.height)")
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController,
skipByInterval skipInterval: CMTime) async {
// Handle skip forward/backward
print("Skip by \(skipInterval.seconds) seconds")
}
}
extension Notification.Name {
static let pipPlaybackStateChanged = Notification.Name("pipPlaybackStateChanged")
}
// MARK: - PiP Delegate
@available(iOS 15.0, *)
extension CustomVideoRenderer: AVPictureInPictureControllerDelegate {
func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("Custom PiP did start")
}
func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
print("Custom PiP did stop")
}
}
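A usage sketch for the renderer above: start PiP only once the controller reports it is possible (layer attached to a window, frames flowing, background mode configured):
@available(iOS 15.0, *)
extension CustomVideoRenderer {
func startCustomPiP() {
guard let pip = pipController, pip.isPictureInPicturePossible else {
print("PiP is not possible yet")
return
}
pip.startPictureInPicture()
}
func stopCustomPiP() {
pipController?.stopPictureInPicture()
}
}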
3. PiP State Transitions
stateDiagram-v2
[*] --> Idle: initialize
Idle --> Starting: startPictureInPicture()
Starting --> Active: start succeeded
Starting --> Failed: start failed
Failed --> Idle: reset
Active --> Stopping: stopPictureInPicture()
Active --> Active: user interaction
Stopping --> Idle: stop completed
Active --> Restoring: user taps return
Restoring --> Idle: main UI restored
note right of Active
PiP window is visible
Supports dragging and resizing
Shows playback controls
end note
4. PiP Entitlement Configuration
Add to Info.plist:
<key>UIBackgroundModes</key>
<array>
<string>audio</string>
</array>
Enable in the Xcode project settings:
- Signing & Capabilities → + Capability → Background Modes
- Check "Audio, AirPlay, and Picture in Picture"
Android Video Rendering
1. Android Rendering Architecture
graph TB
subgraph "视频源"
MediaCodec[MediaCodec]
Surface[Surface]
end
subgraph "渲染方式"
SurfaceView[SurfaceView]
TextureView[TextureView]
GLSurfaceView[GLSurfaceView]
end
subgraph "PiP支持"
PictureInPictureParams[PiP参数配置]
Activity[Activity.enterPictureInPictureMode]
end
MediaCodec --> Surface
Surface --> SurfaceView
Surface --> TextureView
Surface --> GLSurfaceView
Activity --> PictureInPictureParams
style Activity fill:#4CAF50,stroke:#388E3C,stroke-width:2px,color:#fff
2. Android PiP Implementation
import android.app.PictureInPictureParams
import android.content.res.Configuration
import android.os.Build
import android.os.Bundle
import android.util.Rational
import android.widget.VideoView
import androidx.appcompat.app.AppCompatActivity
class VideoPlayerActivity : AppCompatActivity() {
private lateinit var videoView: VideoView
private var isInPipMode = false
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_video_player)
videoView = findViewById(R.id.videoView)
setupVideoPlayer()
}
private fun setupVideoPlayer() {
val videoUrl = "https://example.com/video.mp4"
videoView.setVideoPath(videoUrl)
videoView.start()
}
// Enter picture-in-picture mode (named enterPipMode to avoid an
// accidental override of Activity.enterPictureInPictureMode())
fun enterPipMode() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val aspectRatio = Rational(16, 9)
val params = PictureInPictureParams.Builder()
.setAspectRatio(aspectRatio)
.build()
enterPictureInPictureMode(params)
}
}
override fun onUserLeaveHint() {
super.onUserLeaveHint()
// Automatically enter PiP when the user presses Home
if (videoView.isPlaying) {
enterPipMode()
}
}
override fun onPictureInPictureModeChanged(
isInPictureInPictureMode: Boolean,
newConfig: Configuration
) {
super.onPictureInPictureModeChanged(isInPictureInPictureMode, newConfig)
isInPipMode = isInPictureInPictureMode
if (isInPictureInPictureMode) {
// Entering PiP: hide the controls UI
hideControls()
} else {
// Exiting PiP: show the controls UI
showControls()
}
}
private fun hideControls() {
// Hide playback controls, action bar, etc.
supportActionBar?.hide()
}
private fun showControls() {
// Show playback controls, action bar, etc.
supportActionBar?.show()
}
}
Metal Rendering (iOS/macOS)
Metal Render Pipeline
graph LR
subgraph "渲染管线"
CVPixelBuffer[CVPixelBuffer]
MTLTexture[Metal纹理]
VertexShader[顶点着色器]
FragmentShader[片段着色器]
MTLRenderCommandEncoder[渲染命令编码器]
MTLCommandBuffer[命令缓冲区]
Display[显示]
end
CVPixelBuffer -->|纹理映射| MTLTexture
MTLTexture --> VertexShader
VertexShader --> FragmentShader
FragmentShader --> MTLRenderCommandEncoder
MTLRenderCommandEncoder --> MTLCommandBuffer
MTLCommandBuffer -->|提交| Display
style MTLTexture fill:#9b59b6,stroke:#8e44ad,stroke-width:2px,color:#fff
Metal Rendering Example
import MetalKit
class MetalVideoRenderer: NSObject, MTKViewDelegate {
var device: MTLDevice!
var commandQueue: MTLCommandQueue!
var pipelineState: MTLRenderPipelineState!
var textureCache: CVMetalTextureCache!
// MARK: - Setup
func setupMetal(metalView: MTKView) {
// Create the device
device = MTLCreateSystemDefaultDevice()
metalView.device = device
metalView.delegate = self
// Create the command queue
commandQueue = device.makeCommandQueue()
// Create the texture cache
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
// Create the render pipeline
setupPipeline()
}
private func setupPipeline() {
let library = device.makeDefaultLibrary()
let vertexFunction = library?.makeFunction(name: "vertexShader")
let fragmentFunction = library?.makeFunction(name: "fragmentShader")
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.vertexFunction = vertexFunction
pipelineDescriptor.fragmentFunction = fragmentFunction
pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
pipelineState = try? device.makeRenderPipelineState(descriptor: pipelineDescriptor)
}
// MARK: - Render
func renderPixelBuffer(_ pixelBuffer: CVPixelBuffer, in view: MTKView) {
// Create a Metal texture from the CVPixelBuffer (assumes a BGRA pixel buffer)
var textureRef: CVMetalTexture?
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)
CVMetalTextureCacheCreateTextureFromImage(
kCFAllocatorDefault,
textureCache,
pixelBuffer,
nil,
.bgra8Unorm,
width,
height,
0,
&textureRef
)
guard let cvTexture = textureRef,
let texture = CVMetalTextureGetTexture(cvTexture) else {
return
}
// Create the command buffer and render encoder
guard let commandBuffer = commandQueue.makeCommandBuffer(),
let renderPassDescriptor = view.currentRenderPassDescriptor,
let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
return
}
// Set the render state
renderEncoder.setRenderPipelineState(pipelineState)
renderEncoder.setFragmentTexture(texture, index: 0)
// Draw a full-screen quad
renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
renderEncoder.endEncoding()
// Present
if let drawable = view.currentDrawable {
commandBuffer.present(drawable)
}
commandBuffer.commit()
}
// MARK: - MTKViewDelegate
func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
// Handle view size changes
}
func draw(in view: MTKView) {
// Called once per display frame; a typical implementation dequeues
// the latest frame here and calls renderPixelBuffer(_:in:)
}
}
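A minimal wiring sketch for the renderer above, driven by the decoder rather than the MTKView's internal loop (the display loop is paused and each decoded frame triggers one render pass):
import CoreVideo
import MetalKit
final class MetalPlaybackView {
let metalView = MTKView()
private let renderer = MetalVideoRenderer()
init(frame: CGRect) {
metalView.frame = frame
metalView.isPaused = true // we drive drawing ourselves
metalView.enableSetNeedsDisplay = false
renderer.setupMetal(metalView: metalView)
}
// Call from the decoder's output callback with each decoded frame
func display(_ pixelBuffer: CVPixelBuffer) {
renderer.renderPixelBuffer(pixelBuffer, in: metalView)
}
}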
Metal Shader Code
// Shaders.metal
#include <metal_stdlib>
using namespace metal;
struct VertexOut {
float4 position [[position]];
float2 texCoord;
};
// Vertex shader: emits a full-screen quad from the vertex ID
vertex VertexOut vertexShader(uint vertexID [[vertex_id]]) {
float2 positions[4] = {
float2(-1.0, -1.0),
float2( 1.0, -1.0),
float2(-1.0, 1.0),
float2( 1.0, 1.0)
};
float2 texCoords[4] = {
float2(0.0, 1.0),
float2(1.0, 1.0),
float2(0.0, 0.0),
float2(1.0, 0.0)
};
VertexOut out;
out.position = float4(positions[vertexID], 0.0, 1.0);
out.texCoord = texCoords[vertexID];
return out;
}
// Fragment shader: samples the video texture
fragment float4 fragmentShader(VertexOut in [[stage_in]],
texture2d<float> texture [[texture(0)]]) {
constexpr sampler textureSampler(mag_filter::linear, min_filter::linear);
return texture.sample(textureSampler, in.texCoord);
}
Windows Video Rendering
1. Windows Rendering Architecture
graph TB
subgraph "视频源"
Decoder[视频解码器<br/>FFmpeg/MediaFoundation]
VideoFrame[视频帧数据]
end
subgraph "渲染API"
D3D11[Direct3D 11]
D3D12[Direct3D 12]
OpenGL[OpenGL]
GDI[GDI/GDI+]
end
subgraph "窗口系统"
HWND[窗口句柄]
SwapChain[交换链]
Present[显示]
end
Decoder --> VideoFrame
VideoFrame --> D3D11
VideoFrame --> D3D12
VideoFrame --> OpenGL
VideoFrame --> GDI
D3D11 --> SwapChain
D3D12 --> SwapChain
OpenGL --> HWND
GDI --> HWND
SwapChain --> Present
HWND --> Present
style D3D11 fill:#0078d4,stroke:#005a9e,stroke-width:2px,color:#fff
style D3D12 fill:#0078d4,stroke:#005a9e,stroke-width:2px,color:#fff
2. Direct3D 11 Rendering
2.1 D3D11 Renderer Class
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <DirectXMath.h>
using Microsoft::WRL::ComPtr;
using namespace DirectX;
class D3D11VideoRenderer {
public:
D3D11VideoRenderer(HWND hwnd);
~D3D11VideoRenderer();
bool Initialize();
bool RenderFrame(uint8_t* yuvData, int width, int height);
void Resize(int width, int height);
void Cleanup();
private:
bool CreateDeviceAndSwapChain();
bool CreateRenderTarget();
bool CreateShaders();
bool CreateTextures(int width, int height);
void UpdateYUVTextures(uint8_t* yData, uint8_t* uData, uint8_t* vData,
int width, int height);
private:
HWND m_hwnd;
// Core D3D11 objects
ComPtr<ID3D11Device> m_device;
ComPtr<ID3D11DeviceContext> m_context;
ComPtr<IDXGISwapChain> m_swapChain;
ComPtr<ID3D11RenderTargetView> m_renderTargetView;
// Shader resources
ComPtr<ID3D11VertexShader> m_vertexShader;
ComPtr<ID3D11PixelShader> m_pixelShader;
ComPtr<ID3D11InputLayout> m_inputLayout;
// Texture resources (YUV planes)
ComPtr<ID3D11Texture2D> m_textureY;
ComPtr<ID3D11Texture2D> m_textureU;
ComPtr<ID3D11Texture2D> m_textureV;
ComPtr<ID3D11ShaderResourceView> m_srvY;
ComPtr<ID3D11ShaderResourceView> m_srvU;
ComPtr<ID3D11ShaderResourceView> m_srvV;
// Sampler
ComPtr<ID3D11SamplerState> m_sampler;
// Vertex buffer
ComPtr<ID3D11Buffer> m_vertexBuffer;
int m_width = 0;
int m_height = 0;
};
// Constructor
D3D11VideoRenderer::D3D11VideoRenderer(HWND hwnd)
: m_hwnd(hwnd) {
}
// Destructor
D3D11VideoRenderer::~D3D11VideoRenderer() {
Cleanup();
}
// Initialization
bool D3D11VideoRenderer::Initialize() {
if (!CreateDeviceAndSwapChain()) return false;
if (!CreateRenderTarget()) return false;
if (!CreateShaders()) return false;
// Create the sampler state
D3D11_SAMPLER_DESC samplerDesc = {};
samplerDesc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
samplerDesc.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;
samplerDesc.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
samplerDesc.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
samplerDesc.MaxAnisotropy = 1;
samplerDesc.ComparisonFunc = D3D11_COMPARISON_ALWAYS;
samplerDesc.MaxLOD = D3D11_FLOAT32_MAX;
HRESULT hr = m_device->CreateSamplerState(&samplerDesc, &m_sampler);
if (FAILED(hr)) return false;
// Create the vertex buffer
struct Vertex {
XMFLOAT3 position;
XMFLOAT2 texCoord;
};
Vertex vertices[] = {
{ XMFLOAT3(-1.0f, -1.0f, 0.0f), XMFLOAT2(0.0f, 1.0f) },
{ XMFLOAT3(-1.0f, 1.0f, 0.0f), XMFLOAT2(0.0f, 0.0f) },
{ XMFLOAT3( 1.0f, -1.0f, 0.0f), XMFLOAT2(1.0f, 1.0f) },
{ XMFLOAT3( 1.0f, 1.0f, 0.0f), XMFLOAT2(1.0f, 0.0f) }
};
D3D11_BUFFER_DESC bufferDesc = {};
bufferDesc.Usage = D3D11_USAGE_DEFAULT;
bufferDesc.ByteWidth = sizeof(vertices);
bufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = vertices;
hr = m_device->CreateBuffer(&bufferDesc, &initData, &m_vertexBuffer);
if (FAILED(hr)) return false;
return true;
}
// Create the device and swap chain
bool D3D11VideoRenderer::CreateDeviceAndSwapChain() {
DXGI_SWAP_CHAIN_DESC swapChainDesc = {};
swapChainDesc.BufferCount = 2;
swapChainDesc.BufferDesc.Width = 0;
swapChainDesc.BufferDesc.Height = 0;
swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
swapChainDesc.BufferDesc.RefreshRate.Numerator = 60;
swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;
swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
swapChainDesc.OutputWindow = m_hwnd;
swapChainDesc.SampleDesc.Count = 1;
swapChainDesc.SampleDesc.Quality = 0;
swapChainDesc.Windowed = TRUE;
swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;
D3D_FEATURE_LEVEL featureLevels[] = {
D3D_FEATURE_LEVEL_11_1,
D3D_FEATURE_LEVEL_11_0
};
HRESULT hr = D3D11CreateDeviceAndSwapChain(
nullptr,
D3D_DRIVER_TYPE_HARDWARE,
nullptr,
D3D11_CREATE_DEVICE_BGRA_SUPPORT,
featureLevels,
ARRAYSIZE(featureLevels),
D3D11_SDK_VERSION,
&swapChainDesc,
&m_swapChain,
&m_device,
nullptr,
&m_context
);
return SUCCEEDED(hr);
}
// Create the render target
bool D3D11VideoRenderer::CreateRenderTarget() {
ComPtr<ID3D11Texture2D> backBuffer;
HRESULT hr = m_swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D),
(void**)&backBuffer);
if (FAILED(hr)) return false;
hr = m_device->CreateRenderTargetView(backBuffer.Get(), nullptr,
&m_renderTargetView);
return SUCCEEDED(hr);
}
// Render a frame
bool D3D11VideoRenderer::RenderFrame(uint8_t* yuvData, int width, int height) {
// Recreate the textures when the frame size changes
if (width != m_width || height != m_height) {
if (!CreateTextures(width, height)) return false;
m_width = width;
m_height = height;
}
// Update the YUV textures (assumes tightly packed I420 data)
int ySize = width * height;
int uvSize = (width / 2) * (height / 2);
uint8_t* yData = yuvData;
uint8_t* uData = yuvData + ySize;
uint8_t* vData = yuvData + ySize + uvSize;
UpdateYUVTextures(yData, uData, vData, width, height);
// Clear the render target
float clearColor[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
m_context->ClearRenderTargetView(m_renderTargetView.Get(), clearColor);
// Bind the render target
m_context->OMSetRenderTargets(1, m_renderTargetView.GetAddressOf(), nullptr);
// Set the viewport
RECT clientRect;
GetClientRect(m_hwnd, &clientRect);
D3D11_VIEWPORT viewport = {};
viewport.Width = static_cast<float>(clientRect.right - clientRect.left);
viewport.Height = static_cast<float>(clientRect.bottom - clientRect.top);
viewport.MinDepth = 0.0f;
viewport.MaxDepth = 1.0f;
m_context->RSSetViewports(1, &viewport);
// Bind shaders and resources
m_context->VSSetShader(m_vertexShader.Get(), nullptr, 0);
m_context->PSSetShader(m_pixelShader.Get(), nullptr, 0);
ID3D11ShaderResourceView* srvs[] = {
m_srvY.Get(), m_srvU.Get(), m_srvV.Get()
};
m_context->PSSetShaderResources(0, 3, srvs);
m_context->PSSetSamplers(0, 1, m_sampler.GetAddressOf());
// Bind the vertex buffer
UINT stride = sizeof(float) * 5; // position(3) + texCoord(2)
UINT offset = 0;
m_context->IASetVertexBuffers(0, 1, m_vertexBuffer.GetAddressOf(),
&stride, &offset);
m_context->IASetInputLayout(m_inputLayout.Get());
m_context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
// Draw
m_context->Draw(4, 0);
// Present
m_swapChain->Present(1, 0); // VSync
return true;
}
// Update the YUV plane textures
void D3D11VideoRenderer::UpdateYUVTextures(uint8_t* yData, uint8_t* uData,
uint8_t* vData, int width, int height) {
// Y plane
m_context->UpdateSubresource(m_textureY.Get(), 0, nullptr,
yData, width, 0);
// U plane
m_context->UpdateSubresource(m_textureU.Get(), 0, nullptr,
uData, width / 2, 0);
// V plane
m_context->UpdateSubresource(m_textureV.Get(), 0, nullptr,
vData, width / 2, 0);
}
// Create the textures
bool D3D11VideoRenderer::CreateTextures(int width, int height) {
// Y texture (full resolution, single channel)
D3D11_TEXTURE2D_DESC texDesc = {};
texDesc.Width = width;
texDesc.Height = height;
texDesc.MipLevels = 1;
texDesc.ArraySize = 1;
texDesc.Format = DXGI_FORMAT_R8_UNORM;
texDesc.SampleDesc.Count = 1;
texDesc.Usage = D3D11_USAGE_DEFAULT;
texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
HRESULT hr = m_device->CreateTexture2D(&texDesc, nullptr, &m_textureY);
if (FAILED(hr)) return false;
hr = m_device->CreateShaderResourceView(m_textureY.Get(), nullptr, &m_srvY);
if (FAILED(hr)) return false;
// U texture (half resolution for 4:2:0)
texDesc.Width = width / 2;
texDesc.Height = height / 2;
hr = m_device->CreateTexture2D(&texDesc, nullptr, &m_textureU);
if (FAILED(hr)) return false;
hr = m_device->CreateShaderResourceView(m_textureU.Get(), nullptr, &m_srvU);
if (FAILED(hr)) return false;
// V texture
hr = m_device->CreateTexture2D(&texDesc, nullptr, &m_textureV);
if (FAILED(hr)) return false;
hr = m_device->CreateShaderResourceView(m_textureV.Get(), nullptr, &m_srvV);
if (FAILED(hr)) return false;
return true;
}
// Release resources
void D3D11VideoRenderer::Cleanup() {
if (m_context) {
m_context->ClearState();
m_context->Flush();
}
}
2.2 HLSL Shader Code
// VertexShader.hlsl
struct VS_INPUT {
float3 position : POSITION;
float2 texCoord : TEXCOORD0;
};
struct PS_INPUT {
float4 position : SV_POSITION;
float2 texCoord : TEXCOORD0;
};
PS_INPUT main(VS_INPUT input) {
PS_INPUT output;
output.position = float4(input.position, 1.0f);
output.texCoord = input.texCoord;
return output;
}
// PixelShader.hlsl (YUV to RGB conversion)
Texture2D textureY : register(t0);
Texture2D textureU : register(t1);
Texture2D textureV : register(t2);
SamplerState samplerState : register(s0);
struct PS_INPUT {
float4 position : SV_POSITION;
float2 texCoord : TEXCOORD0;
};
float4 main(PS_INPUT input) : SV_TARGET {
float y = textureY.Sample(samplerState, input.texCoord).r;
float u = textureU.Sample(samplerState, input.texCoord).r - 0.5f;
float v = textureV.Sample(samplerState, input.texCoord).r - 0.5f;
// YUV to RGB conversion (BT.709 coefficients; assumes full-range YUV)
float r = y + 1.5748f * v;
float g = y - 0.1873f * u - 0.4681f * v;
float b = y + 1.8556f * u;
return float4(r, g, b, 1.0f);
}
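Written as a matrix, the conversion this pixel shader performs (full-range BT.709, with U and V already centered at zero) is:
$$
\begin{pmatrix} R \\ G \\ B \end{pmatrix}
=
\begin{pmatrix}
1 & 0 & 1.5748 \\
1 & -0.1873 & -0.4681 \\
1 & 1.8556 & 0
\end{pmatrix}
\begin{pmatrix} Y \\ U \\ V \end{pmatrix}
$$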
3. Media Foundation Video Rendering
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <evr.h>
class MediaFoundationRenderer {
public:
MediaFoundationRenderer(HWND hwnd);
~MediaFoundationRenderer();
bool Initialize();
bool LoadVideo(const wchar_t* filePath);
bool Play();
bool Pause();
void Stop();
private:
HWND m_hwnd;
ComPtr<IMFMediaSession> m_session;
ComPtr<IMFMediaSource> m_mediaSource;
ComPtr<IMFTopology> m_topology;
ComPtr<IMFVideoDisplayControl> m_videoDisplay;
};
MediaFoundationRenderer::MediaFoundationRenderer(HWND hwnd)
: m_hwnd(hwnd) {
}
bool MediaFoundationRenderer::Initialize() {
// Initialize Media Foundation
HRESULT hr = MFStartup(MF_VERSION);
if (FAILED(hr)) return false;
// Create the media session
hr = MFCreateMediaSession(nullptr, &m_session);
if (FAILED(hr)) return false;
return true;
}
bool MediaFoundationRenderer::LoadVideo(const wchar_t* filePath) {
// Create the media source
ComPtr<IMFSourceResolver> sourceResolver;
HRESULT hr = MFCreateSourceResolver(&sourceResolver);
if (FAILED(hr)) return false;
MF_OBJECT_TYPE objectType = MF_OBJECT_INVALID;
ComPtr<IUnknown> source;
hr = sourceResolver->CreateObjectFromURL(
filePath,
MF_RESOLUTION_MEDIASOURCE,
nullptr,
&objectType,
&source
);
if (FAILED(hr)) return false;
hr = source.As(&m_mediaSource);
if (FAILED(hr)) return false;
// Create the topology
hr = MFCreateTopology(&m_topology);
if (FAILED(hr)) return false;
// Create the EVR (Enhanced Video Renderer)
ComPtr<IMFActivate> evrActivate;
hr = MFCreateVideoRendererActivate(m_hwnd, &evrActivate);
if (FAILED(hr)) return false;
// Build the topology (simplified; a real player needs a fuller configuration)
// ... add source nodes, output nodes, etc.
// Set the topology on the session
hr = m_session->SetTopology(0, m_topology.Get());
if (FAILED(hr)) return false;
return true;
}
bool MediaFoundationRenderer::Play() {
PROPVARIANT varStart;
PropVariantInit(&varStart);
HRESULT hr = m_session->Start(&GUID_NULL, &varStart);
PropVariantClear(&varStart);
return SUCCEEDED(hr);
}
4. Windows Desktop Duplication API (Screen Capture)
#include <dxgi1_2.h>
class DesktopDuplicationRenderer {
public:
bool Initialize();
bool CaptureFrame(ID3D11Texture2D** outTexture);
private:
ComPtr<ID3D11Device> m_device;
ComPtr<ID3D11DeviceContext> m_context;
ComPtr<IDXGIOutputDuplication> m_duplication;
};
bool DesktopDuplicationRenderer::Initialize() {
// Create the D3D11 device
D3D_FEATURE_LEVEL featureLevel;
HRESULT hr = D3D11CreateDevice(
nullptr,
D3D_DRIVER_TYPE_HARDWARE,
nullptr,
0,
nullptr,
0,
D3D11_SDK_VERSION,
&m_device,
&featureLevel,
&m_context
);
if (FAILED(hr)) return false;
// Get the DXGI device
ComPtr<IDXGIDevice> dxgiDevice;
hr = m_device.As(&dxgiDevice);
if (FAILED(hr)) return false;
// Get the adapter
ComPtr<IDXGIAdapter> dxgiAdapter;
hr = dxgiDevice->GetAdapter(&dxgiAdapter);
if (FAILED(hr)) return false;
// Get the output
ComPtr<IDXGIOutput> dxgiOutput;
hr = dxgiAdapter->EnumOutputs(0, &dxgiOutput);
if (FAILED(hr)) return false;
// Get IDXGIOutput1
ComPtr<IDXGIOutput1> dxgiOutput1;
hr = dxgiOutput.As(&dxgiOutput1);
if (FAILED(hr)) return false;
// Create the desktop duplication interface
hr = dxgiOutput1->DuplicateOutput(m_device.Get(), &m_duplication);
if (FAILED(hr)) return false;
return true;
}
bool DesktopDuplicationRenderer::CaptureFrame(ID3D11Texture2D** outTexture) {
ComPtr<IDXGIResource> desktopResource;
DXGI_OUTDUPL_FRAME_INFO frameInfo;
// Acquire the next frame
HRESULT hr = m_duplication->AcquireNextFrame(
500, // timeout in milliseconds
&frameInfo,
&desktopResource
);
if (FAILED(hr)) {
// DXGI_ERROR_WAIT_TIMEOUT simply means no new frame arrived in time
return false;
}
// Query the ID3D11Texture2D interface
ComPtr<ID3D11Texture2D> texture;
hr = desktopResource.As(&texture);
if (FAILED(hr)) {
m_duplication->ReleaseFrame();
return false;
}
*outTexture = texture.Detach();
// Release the frame
m_duplication->ReleaseFrame();
return true;
}
5. Windows Rendering Flow
sequenceDiagram
participant App as Application
participant Decoder as Decoder
participant D3D11 as D3D11 Renderer
participant GPU as GPU
participant Display as Display
App->>Decoder: Send video data
Decoder->>Decoder: Decode video frame
Decoder->>D3D11: Pass YUV data
D3D11->>D3D11: CreateTextures(YUV)
D3D11->>D3D11: UpdateSubresource
D3D11->>GPU: Set up render pipeline
D3D11->>GPU: Bind textures and shaders
D3D11->>GPU: Draw call
GPU->>GPU: YUV to RGB conversion
GPU->>GPU: Render to back buffer
D3D11->>Display: SwapChain->Present()
Display->>Display: Show the frame
Note over D3D11,Display: VSync synchronization
6. WPF Video Rendering
using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Media.Imaging;
public class WpfVideoRenderer {
private Image imageControl;
private WriteableBitmap bitmap;
public WpfVideoRenderer(Image image) {
imageControl = image;
}
public void Initialize(int width, int height) {
// Create the WriteableBitmap
bitmap = new WriteableBitmap(
width,
height,
96, 96,
PixelFormats.Bgra32,
null
);
imageControl.Source = bitmap;
}
public void RenderFrame(byte[] rgbData, int width, int height) {
if (bitmap == null ||
bitmap.PixelWidth != width ||
bitmap.PixelHeight != height) {
Initialize(width, height);
}
// WritePixels copies into the back buffer and invalidates the written
// rect itself, so no Lock/Unlock/AddDirtyRect is needed here
// (AddDirtyRect outside a Lock/Unlock pair would throw)
int stride = width * 4; // BGRA32: 4 bytes per pixel
Int32Rect rect = new Int32Rect(0, 0, width, height);
bitmap.WritePixels(rect, rgbData, stride, 0);
}
// Convert from YUV and render
public void RenderYUVFrame(byte[] yuvData, int width, int height) {
byte[] rgbData = ConvertYUVToRGB(yuvData, width, height);
RenderFrame(rgbData, width, height);
}
private byte[] ConvertYUVToRGB(byte[] yuv, int width, int height) {
// CPU-side YUV (I420) to BGRA conversion
byte[] rgb = new byte[width * height * 4];
int ySize = width * height;
int uvSize = ySize / 4;
for (int i = 0; i < height; i++) {
for (int j = 0; j < width; j++) {
int yIndex = i * width + j;
int uvIndex = (i / 2) * (width / 2) + (j / 2);
float y = yuv[yIndex];
float u = yuv[ySize + uvIndex] - 128;
float v = yuv[ySize + uvSize + uvIndex] - 128;
// BT.709 conversion
int r = (int)(y + 1.5748f * v);
int g = (int)(y - 0.1873f * u - 0.4681f * v);
int b = (int)(y + 1.8556f * u);
// Clamp to [0, 255]
r = Math.Max(0, Math.Min(255, r));
g = Math.Max(0, Math.Min(255, g));
b = Math.Max(0, Math.Min(255, b));
int rgbIndex = (i * width + j) * 4;
rgb[rgbIndex] = (byte)b; // B
rgb[rgbIndex + 1] = (byte)g; // G
rgb[rgbIndex + 2] = (byte)r; // R
rgb[rgbIndex + 3] = 255; // A
}
}
return rgb;
}
}
7. Windows Picture in Picture (Compact Overlay)
using System;
using System.Threading.Tasks;
using Windows.UI.ViewManagement;
using Windows.Foundation;
public class WindowsPipController {
private ApplicationView appView;
public async Task<bool> EnterCompactOverlay() {
appView = ApplicationView.GetForCurrentView();
// Check whether Compact Overlay is supported
if (!appView.IsViewModeSupported(ApplicationViewMode.CompactOverlay)) {
return false;
}
// Set the preferred window size
var preferences = ViewModePreferences.CreateDefault(
ApplicationViewMode.CompactOverlay
);
preferences.CustomSize = new Size(400, 300);
// Enter Compact Overlay mode
bool success = await appView.TryEnterViewModeAsync(
ApplicationViewMode.CompactOverlay,
preferences
);
return success;
}
public async Task<bool> ExitCompactOverlay() {
if (appView == null) return false;
// Exit Compact Overlay and return to the normal view
bool success = await appView.TryEnterViewModeAsync(
ApplicationViewMode.Default
);
return success;
}
// Event handling
public void Initialize() {
appView = ApplicationView.GetForCurrentView();
// Listen for visible-bounds changes
appView.VisibleBoundsChanged += (sender, args) => {
// React to window size changes
if (appView.ViewMode == ApplicationViewMode.CompactOverlay) {
// Handle the compact (PiP-like) mode
OnCompactOverlayModeChanged(true);
} else {
OnCompactOverlayModeChanged(false);
}
};
}
private void OnCompactOverlayModeChanged(bool isCompactOverlay) {
// Adjust the UI for the current mode
if (isCompactOverlay) {
// Hide complex controls and show only the video
} else {
// Show the full UI
}
}
}
8. Windows Rendering Performance Optimization
class RenderOptimizer {
public:
// Sketch only: m_device is assumed to be created during initialization,
// and VideoFrame/RenderFrame() are assumed to be defined elsewhere.
ComPtr<ID3D11Device> m_device;
// 1. Use hardware-accelerated video decoding
bool EnableHardwareDecoding() {
ComPtr<ID3D11VideoDevice> videoDevice;
HRESULT hr = m_device.As(&videoDevice);
if (FAILED(hr)) return false;
// Describe the decoder (a production implementation must also query and
// pass a valid D3D11_VIDEO_DECODER_CONFIG instead of nullptr below)
D3D11_VIDEO_DECODER_DESC decoderDesc = {};
decoderDesc.Guid = D3D11_DECODER_PROFILE_H264_VLD_NOFGT;
decoderDesc.SampleWidth = 1920;
decoderDesc.SampleHeight = 1080;
decoderDesc.OutputFormat = DXGI_FORMAT_NV12;
ComPtr<ID3D11VideoDecoder> decoder;
hr = videoDevice->CreateVideoDecoder(&decoderDesc, nullptr, &decoder);
return SUCCEEDED(hr);
}
// 2. Use the flip presentation model to reduce latency
void EnableFlipModel() {
DXGI_SWAP_CHAIN_DESC1 swapChainDesc = {};
swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL;
swapChainDesc.BufferCount = 2;
swapChainDesc.Scaling = DXGI_SCALING_STRETCH;
swapChainDesc.AlphaMode = DXGI_ALPHA_MODE_UNSPECIFIED;
// ... remaining configuration
}
// 3. Asynchronous rendering
std::thread renderThread;
std::queue<VideoFrame> frameQueue;
std::mutex queueMutex;
std::condition_variable queueCV;
bool isRunning = true;
void StartRenderThread() {
renderThread = std::thread([this]() {
while (isRunning) {
VideoFrame frame;
{
std::unique_lock<std::mutex> lock(queueMutex);
queueCV.wait(lock, [this] {
return !frameQueue.empty() || !isRunning;
});
if (!isRunning) break;
frame = frameQueue.front();
frameQueue.pop();
}
// Render the frame (RenderFrame is assumed to exist on the renderer)
RenderFrame(frame);
}
});
}
// 4. Texture pool management
class TexturePool {
public:
ComPtr<ID3D11Texture2D> AcquireTexture(int width, int height) {
std::lock_guard<std::mutex> lock(mutex_);
for (auto& tex : textures_) {
if (!tex.inUse && tex.width == width && tex.height == height) {
tex.inUse = true;
return tex.texture;
}
}
// No reusable texture found; create a new one
TextureInfo info;
info.width = width;
info.height = height;
info.inUse = true;
// ... create the D3D11 texture here
textures_.push_back(info);
return info.texture;
}
void ReleaseTexture(ComPtr<ID3D11Texture2D> texture) {
std::lock_guard<std::mutex> lock(mutex_);
for (auto& tex : textures_) {
if (tex.texture == texture) {
tex.inUse = false;
break;
}
}
}
private:
struct TextureInfo {
ComPtr<ID3D11Texture2D> texture;
int width;
int height;
bool inUse;
};
std::vector<TextureInfo> textures_;
std::mutex mutex_;
};
};
Rendering Performance Comparison
| Rendering Method | Platform | CPU Usage | GPU Usage | Latency | Complexity | Typical Use |
|---|---|---|---|---|---|---|
| AVPlayerLayer | iOS/macOS | Low | Low | Low | Low | Standard video playback |
| AVSampleBufferDisplayLayer | iOS/macOS | Low | Low | Low | Medium | Custom decoding + PiP |
| Metal | iOS/macOS | Low | Medium | Very low | High | High performance, effects |
| OpenGL ES | iOS/Android | Medium | Medium | Low | Medium | Cross-platform, general rendering |
| SurfaceView | Android | Low | Low | Low | Low | Standard video playback |
| TextureView | Android | Medium | Medium | Medium | Medium | Animations/transforms needed |
| Direct3D 11 | Windows | Low | Low | Low | Medium | Mainstream Windows rendering |
| Direct3D 12 | Windows | Low | Low | Very low | High | High-performance games/professional apps |
| Media Foundation | Windows | Low | Low | Low | Low | Windows media playback |
| WPF WriteableBitmap | Windows | High | Low | Medium | Low | WPF application integration |
Windows vs iOS vs Android Rendering Comparison
graph TD
subgraph Windows["Windows 渲染方案"]
WinD3D11["Direct3D 11<br/>✓ 硬件加速<br/>✓ 主流方案<br/>✓ 易用性高"]
WinD3D12["Direct3D 12<br/>✓ 极致性能<br/>✓ 低延迟<br/>⚠️ 复杂度高"]
WinMF["Media Foundation<br/>✓ 系统集成<br/>✓ 编解码一体<br/>✓ 开箱即用"]
WinWPF["WPF<br/>✓ UI集成好<br/>⚠️ 性能一般<br/>✓ 开发效率高"]
end
subgraph iOS["iOS 渲染方案"]
iOSAVP["AVPlayer<br/>✓ 简单易用<br/>✓ 系统优化<br/>✓ 自动PiP"]
iOSAVS["AVSampleBuffer<br/>✓ 自定义解码<br/>✓ 实时流<br/>✓ PiP支持"]
iOSMetal["Metal<br/>✓ 极致性能<br/>✓ 低功耗<br/>⚠️ 复杂度高"]
end
subgraph Android["Android 渲染方案"]
AndSurface["SurfaceView<br/>✓ 标准方案<br/>✓ 性能好<br/>⚠️ 动画支持弱"]
AndTexture["TextureView<br/>✓ 灵活性高<br/>✓ 支持动画<br/>⚠️ 延迟稍高"]
AndGL["OpenGL ES<br/>✓ 跨平台<br/>✓ 特效丰富<br/>⚠️ 需手动管理"]
end
style WinD3D11 fill:#0078d4,stroke:#005a9e,stroke-width:2px,color:#fff
style iOSMetal fill:#9b59b6,stroke:#8e44ad,stroke-width:2px,color:#fff
style AndSurface fill:#4CAF50,stroke:#388E3C,stroke-width:2px,color:#fff
Best Practices
1. Choose the Right Rendering Approach
- iOS PiP: prefer AVSampleBufferDisplayLayer, which supports custom decoding
- High-performance scenarios: use Metal (iOS) or Vulkan (Android)
- Cross-platform: use OpenGL ES
2. Performance Optimization
// Optimization tips
class RenderOptimization {
// 1. Reuse texture objects via a texture cache
private var textureCache: CVMetalTextureCache?
// 2. Use an object pool to reduce allocations
private var bufferPool: [CMSampleBuffer] = []
// 3. Cap the render frame rate
private let targetFPS = 30
private var lastRenderTime: CFTimeInterval = 0
func shouldRenderFrame() -> Bool {
let currentTime = CACurrentMediaTime()
let elapsed = currentTime - lastRenderTime
let frameInterval = 1.0 / Double(targetFPS)
if elapsed >= frameInterval {
lastRenderTime = currentTime
return true
}
return false
}
// 4. Render asynchronously on a dedicated queue
private let renderQueue = DispatchQueue(label: "com.video.render", qos: .userInteractive)
func renderAsync(_ buffer: CMSampleBuffer) {
renderQueue.async { [weak self] in
self?.doRender(buffer)
}
}
private func doRender(_ buffer: CMSampleBuffer) {
// Actual rendering work goes here
}
}
3. Memory Management
class MemoryManagement {
// Sketch: these properties are assumed to live on the real renderer
private var textureCache: CVMetalTextureCache?
private var commandBuffer: MTLCommandBuffer?
private var bufferPool: [CMSampleBuffer] = []
// Release resources that are no longer needed
func cleanup() {
// Flush the texture cache
if let cache = textureCache {
CVMetalTextureCacheFlush(cache, 0)
}
// Drop the command buffer
commandBuffer = nil
// Empty the buffer pool
bufferPool.removeAll()
}
// Observe memory warnings
func setupMemoryWarning() {
NotificationCenter.default.addObserver(
self,
selector: #selector(handleMemoryWarning),
name: UIApplication.didReceiveMemoryWarningNotification,
object: nil
)
}
@objc private func handleMemoryWarning() {
cleanup()
}
}
Troubleshooting
| Symptom | Likely Cause | Fix |
|---|---|---|
| PiP fails to start | Entitlements not configured | Check Info.plist and Background Modes |
| Screen tearing | VSync disabled | Enable vertical sync |
| Stuttering playback | Main thread blocked | Render asynchronously on a dedicated queue |
| Memory spikes | Textures not released | Release textures promptly; use a texture cache |
| Black screen | Unsupported decode format | Check the pixel format and use a matching texture format |
