A Cloud Rendering Demo Based on Unity


A quick note: this month I left the game company where I had worked for almost four years and joined a cloud rendering company as a graphics programmer. Right after joining I helped out with features such as planar reflections and a texture-copy pass under URP and HDRP.

The project assigned to me is cloud rendering related, with the goal of getting good visuals and performance on the host side. So before work kicked off, I read through Unity's rendering-streaming material, and with some downtime I'm writing up a brief summary of what I learned here.

I. Unity officially provides a demo with cloud rendering support, linked here: About Unity Render Streaming | Unity Render Streaming | 3.1.0-exp.6

The code discussed below is from exp.3. I won't go into how to install and use the package itself.

II. The demo includes the Broadcast.unity scene, which hosts the server-side logic, as well as client-side scenes such as Receiver and RenderPipeline.

III. Communication is built on the WebRTC package, which is pulled in automatically as a dependency when you install the plugin. To actually use the demo you also need to start the external signaling server, webserver.exe, as described in the documentation.
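For reference, launching the signaling server from the command line looks roughly like this. This is a sketch based on the Render Streaming web app's documented options; the exact flag names may differ in your package version, so check its README:

```shell
# Start the signaling web server on the default port (80).
./webserver.exe

# Assumed options from the package docs: -p sets the listen port,
# -w switches signaling to WebSocket instead of HTTP polling.
./webserver.exe -p 8080 -w
```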

IV. Two things made me curious when I first looked at this package: how the server transmits the rendered frame to the client, and how client input (keystrokes, for example) is transmitted back to the server. Based on the code I've read over the past few days, I'll walk through both in some detail.

1. How are frames transmitted?

a. Frame transmission is the server sending the rendered image to the client. Open the Broadcast.unity scene and you'll find ScreenStreamSender.cs attached under the camera. This script creates a RenderTexture and hands it to a VideoStreamTrack, which is responsible for transmitting that RT to the remote peer:

protected override MediaStreamTrack CreateTrack()
{
    RenderTexture rt;
    if (m_sendTexture != null)
    {
        rt = m_sendTexture;
        RenderTextureFormat supportFormat =
            WebRTC.WebRTC.GetSupportedRenderTextureFormat(SystemInfo.graphicsDeviceType);
        GraphicsFormat graphicsFormat =
            GraphicsFormatUtility.GetGraphicsFormat(supportFormat, RenderTextureReadWrite.Default);
        GraphicsFormat compatibleFormat = SystemInfo.GetCompatibleFormat(graphicsFormat, FormatUsage.Render);
        GraphicsFormat format = graphicsFormat == compatibleFormat ? graphicsFormat : compatibleFormat;

        if (rt.graphicsFormat != format)
        {
            Debug.LogWarning(
                $"This color format:{rt.graphicsFormat} not support in unity.webrtc. Change to supported color format:{format}.");
            rt.Release();
            rt.graphicsFormat = format;
            rt.Create();
        }

        m_sendTexture = rt;
    }
    else
    {
        RenderTextureFormat format =
            WebRTC.WebRTC.GetSupportedRenderTextureFormat(SystemInfo.graphicsDeviceType);
        rt = new RenderTexture(streamingSize.x, streamingSize.y, depth, format) { antiAliasing = antiAliasing };
        rt.Create();
        m_sendTexture = rt;
    }

    // The texture obtained by ScreenCapture.CaptureScreenshotIntoRenderTexture is different
    // between OpenGL and other graphics APIs. In OpenGL we get a texture that is not
    // inverted, so it needs to be flipped when sending.
    var isOpenGl = SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLCore ||
                   SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES2 ||
                   SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES3;

    //return new VideoStreamTrack(rt, isOpenGl);
    m_curTrack = new VideoStreamTrack(rt, isOpenGl);
    return m_curTrack;
}

b. Knowing that this RT is what gets transmitted, the next question is how the RT is maintained and updated. Every frame, the demo calls ScreenCapture.CaptureScreenshotIntoRenderTexture(m_screenTexture) to read the screen contents into that RT, then blits it to m_sendTexture:
IEnumerator RecordScreenFrame()
{
    while (true)
    {
        OnUpdateFramed?.Invoke();
        yield return new WaitForEndOfFrame();

        if (!connections.Any() || m_sendTexture == null || !m_sendTexture.IsCreated())
        {
            continue;
        }

        if (m_precount == m_curcount)
        {
            continue;
        }
        else
        {
            m_precount = m_curcount;
        }

        m_curTrack.setCameraRT(m_Camera.transform.rotation.eulerAngles, m_Camera.transform.position, Convert.ToInt64(timestamp), renderTime);
        ScreenCapture.CaptureScreenshotIntoRenderTexture(m_screenTexture);
        Graphics.Blit(m_screenTexture, m_sendTexture, material);
    }
}
c. While profiling earlier, I noticed one large cost. At first I wrongly blamed it on CaptureScreenshotIntoRenderTexture, so I added a pass in the render pipeline to read the RT directly and removed that call. The cost was still there, however, and after digging deeper into the code I found that the stall comes from the underlying WebRTC layer: when it actually sends the RT, it encodes it by inserting an instruction into a CommandBuffer:
public static void Encode(IntPtr callback, IntPtr track)
{
    _command.IssuePluginEventAndData(callback, (int)VideoStreamRenderEventId.Encode, track);
    Graphics.ExecuteCommandBuffer(_command);
    _command.Clear();
}

Optimizing this would be fairly involved. (In fact, later versions did optimize it; in my tests the time dropped to around 2+ ms.)

2. The input system

a. The input system is built on Unity's InputSystem.

b. On the server, InputReceiver.cs (attached to the main camera in the Broadcast.unity scene) is the component the gameplay layer interacts with; you can freely bind handlers that fire when client input is received.

c. On the client, the counterpart is InputSender; you can find the component in the corresponding client scenes.

d. The underlying implementation of InputReceiver and InputSender is the interesting part. A few notes based on what I've read:

d.1: InputSender subscribes to InputSystem events, packs each received event into a message, and sends it to the server:

public Sender()
{
    InputSystem.onEvent += OnEvent;
    InputSystem.onDeviceChange += OnDeviceChange;
    InputSystem.onLayoutChange += OnLayoutChange;
    _onEvent = (InputEventPtr ptr, InputDevice device) => { onEvent?.Invoke(ptr, device); };
    _corrector = new InputPositionCorrector(_onEvent);
}

    

private unsafe void SendEvent(InputEventPtr eventPtr, InputDevice device)
{
    if (m_Subscribers == null)
        return;

    // REVIEW: we probably want to have better control over this and allow producing
    //         local events against remote devices which *are* indeed sent across the wire.
    // Don't send events that came in from remote devices.
    if (device != null && device.remote)
        return;

    var message = NewEventsMsg.Create(eventPtr.data, 1);
    if (first_send)
    {
        TimeSpan ts = DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, 0);
        Debug.LogError("time---lxf first_send:" + ts.TotalMilliseconds.ToString());
        first_send = false;
    }
    Send(message);
}

d.2: When InputReceiver receives a message, it unpacks it, reconstructs the event the InputSystem expects, and feeds it to the InputSystem to trigger the input:

void IObserver<Message>.OnNext(Message msg)
{
    switch (msg.type)
    {
        case MessageType.Connect:
            ConnectMsg.Process(this);
            break;
        case MessageType.Disconnect:
            DisconnectMsg.Process(this, msg);
            break;
        case MessageType.NewLayout:
            NewLayoutMsg.Process(this, msg);
            break;
        case MessageType.RemoveLayout:
            RemoveLayoutMsg.Process(this, msg);
            break;
        case MessageType.NewDevice:
            NewDeviceMsg.Process(this, msg);
            break;
        case MessageType.NewEvents:
            InputStaticData.OnCameraPosValueChange?.Invoke(msg.posX, msg.posY, msg.posZ, msg.rotateX, msg.rotateY, msg.rotateZ, msg.fx, msg.fy, msg.cx, msg.cy, msg.timestamp);
            NewEventsMsg.Process(this, msg);
            break;
        case MessageType.ChangeUsages:
            ChangeUsageMsg.Process(this, msg);
            break;
        case MessageType.RemoveDevice:
            RemoveDeviceMsg.Process(this, msg);
            break;
        case MessageType.StartSending:
            StartSendingMsg.Process(this);
            break;
        case MessageType.StopSending:
            StopSendingMsg.Process(this);
            break;
    }
}

public static unsafe void Process(InputRemoting Receiver, Message msg)
{
    var manager = Receiver.m_LocalManager;
    fixed (byte* dataPtr = msg.data)
    {
        var dataEndPtr = new IntPtr(dataPtr + msg.data.Length);
        var eventCount = 0;
        var eventPtr = new InputEventPtr((InputEvent*)dataPtr);
        var senderIndex = Receiver.FindOrCreateSenderRecord(msg.participantId);

        while ((Int64)eventPtr.data < dataEndPtr.ToInt64())
        {
            // Patch up device ID to refer to local device and send event.
            var remoteDeviceId = eventPtr.deviceId;
            var localDeviceId = Receiver.FindLocalDeviceId(remoteDeviceId, senderIndex);
            eventPtr.deviceId = localDeviceId;

            if (localDeviceId != InputDevice.InvalidDeviceId)
            {
                // TODO: add API to send events in bulk rather than one by one
                manager.QueueEvent(eventPtr);
            }

            ++eventCount;
            eventPtr = eventPtr.Next();
        }
    }
}

public override void QueueEvent(InputEventPtr ptr)
{
    InputDevice device = InputSystem.GetDeviceById(ptr.deviceId);
    // Mapping the sender coordinate system to the receiver one.
    if (EnableInputPositionCorrection && device is Pointer && ptr.IsA<StateEvent>())
    {
        _corrector.Invoke(ptr, device);
    }
    else
    {
        base.QueueEvent(ptr);
    }
}

public virtual void QueueEvent(InputEventPtr eventPtr)
{
    InputSystem.QueueEvent(eventPtr);
}
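The position correction step matters because the client and server can run at different resolutions, so a pointer position captured on the client has to be rescaled before being queued into the server's InputSystem. As a minimal, self-contained sketch of the idea (the class and method names here are hypothetical, not the package's API): normalize the position against the sender's frame size, then scale it by the receiver's screen size.

```csharp
using System;

// Hypothetical sketch of the idea behind InputPositionCorrector:
// remap a pointer position from the sender's (client) coordinate
// space into the receiver's (server) coordinate space.
static class PointerRemapSketch
{
    // Normalize by the sender's frame size, then scale by the receiver's.
    public static (float x, float y) Remap(
        float x, float y,
        float senderW, float senderH,
        float receiverW, float receiverH)
    {
        float nx = x / senderW;
        float ny = y / senderH;
        return (nx * receiverW, ny * receiverH);
    }

    static void Main()
    {
        // A pointer at the center of a 1920x1080 client stream maps to the
        // center of a 1280x720 server screen.
        var (rx, ry) = Remap(960f, 540f, 1920f, 1080f, 1280f, 720f);
        Console.WriteLine($"{rx},{ry}"); // 640,360
    }
}
```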

d.3: On the server, BroadcastSample.cs binds the relevant InputSystem events to SimpleCameraControllerV2.cs, which maps the client's input onto the scene camera's position, rotation, and so on:

private void Start()
{
    SyncDisplayVideoSenderParameters();

    if (renderStreaming.runOnAwake)
        return;

    if (settings != null)
        renderStreaming.useDefaultSettings = settings.UseDefaultSettings;
    if (settings?.SignalingSettings != null)
        renderStreaming.SetSignalingSettings(settings.SignalingSettings);
    renderStreaming.Run();

    inputReceiver.OnStartedChannel += OnStartedChannel;
    var map = inputReceiver.currentActionMap;
    map["Movement"].AddListener(cameraController.OnMovement);
    map["Look"].AddListener(cameraController.OnLook);
    map["ResetCamera"].AddListener(cameraController.OnResetCamera);
    map["Rotate"].AddListener(cameraController.OnRotate);
    map["Position"].AddListener(cameraController.OnPosition);
    map["Point"].AddListener(uiController.OnPoint);
    map["Press"].AddListener(uiController.OnPress);
    map["PressAnyKey"].AddListener(uiController.OnPressAnyKey);
}
