
Friday, June 21, 2019


How to Pathtrace in VR


Before I start, I want to thank David Kuri of the VW Virtual Engineering Lab in Germany for the example project and tutorial that my modifications are based on.

The following is his tutorial:
http://blog.three-eyed-games.com/2018/05/03/gpu-ray-tracing-in-unity-part-1/

This link is to the Bitbucket Git repo:
https://bitbucket.org/Daerst/gpu-ray-tracing-in-unity/src/Tutorial_Pt3/

My project can be found at https://github.com/Will-VW/Pathtracing



To start, this is a project I worked on as an experiment while I was part of the Volkswagen Virtual Engineering Lab in the Bay Area.

While searching for interesting raytracing implementations to challenge our 2080 Ti, we were coming up fairly short. We found a couple of lackluster and simple DirectX projects, poor Unity and Unreal support, and marginally impressive demos on rails. That is, until we found our colleague's project using simple compute shaders in regular Unity.

Obviously we forgo RTX and DXR acceleration, and for the moment we do not have denoising or proper mesh support, but we gain near-unlimited flexibility and control.

As far as we are aware (as of June 2019), there are no raytracing or pathtracing applications, demos, games, roadmaps, plans, or anything else with any sort of VR support or connection. We decided to change that.

I took David's existing project, which worked in "simple" single-camera pancake views, and modified it until it worked properly with SteamVR.

Temporal accumulation was the biggest issue: because the left and right eyes render from different perspectives, and because the VR camera is constantly moving, it was effectively impossible to keep more than a single frame of accumulation in the headset. I first split the left and right eye textures into separate instances so that each eye accumulated on its own. I then used a shader to account for head rotation and reproject the accumulated frames so they line up with the newest perspective.
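
In code, the per-eye split boils down to something like the sketch below. It follows the structure of David's tutorial, but the field names, the per-eye arrays, and the omitted compute dispatch are illustrative rather than the exact project code.

    using UnityEngine;

    public class StereoAccumulation : MonoBehaviour
    {
        public ComputeShader RayTracingShader;  // the pathtracing compute shader
        public Material AddMaterial;            // blends by 1/(_Sample + 1), as in the tutorial

        private Camera _camera;
        private RenderTexture _target;                               // current frame output
        private RenderTexture[] _converged = new RenderTexture[2];  // accumulation history, one per eye
        private uint[] _samples = new uint[2];                       // sample count, one per eye

        private void Awake() { _camera = GetComponent<Camera>(); }

        private void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // 0 = left eye, 1 = right eye; mono rendering falls back to slot 0.
            int eye = _camera.stereoActiveEye == Camera.MonoOrStereoscopicEye.Right ? 1 : 0;

            // ... dispatch RayTracingShader into _target here, as in the tutorial ...
            // (_target and _converged[eye] are assumed to be created at eye resolution)

            // Accumulate only into the history buffer that belongs to this eye.
            AddMaterial.SetFloat("_Sample", _samples[eye]);
            Graphics.Blit(_target, _converged[eye], AddMaterial);
            Graphics.Blit(_converged[eye], destination);
            _samples[eye]++;
        }
    }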

Both efforts were effective, the former far more so than the latter, which still suffered from artifacts due to the distortion at the high fields of view that VR headsets use.

Next I targeted dynamic resolution so that the number of rays cast each frame could be controlled to suit our needs. I changed the RenderTexture resolution accordingly and let the program render at the VR headset's resolution rather than the resolution of Unity's mirrored view.
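
As a rough sketch of what that looks like (using Unity's built-in XR settings; the class and field names here are illustrative), the render target can be sized from the headset's eye buffer and a single scale knob:

    using UnityEngine;
    using UnityEngine.XR;

    public class DynamicTarget : MonoBehaviour
    {
        [Range(0.25f, 2.0f)]
        public float ResolutionScale = 1.0f;   // rays per frame scale roughly with width * height

        private RenderTexture _target;

        private void Update()
        {
            // Size the target from the headset's eye buffer, not the mirrored Game view.
            int width  = Mathf.Max(1, (int)(XRSettings.eyeTextureWidth  * ResolutionScale));
            int height = Mathf.Max(1, (int)(XRSettings.eyeTextureHeight * ResolutionScale));

            if (_target != null && _target.width == width && _target.height == height)
                return;

            if (_target != null)
                _target.Release();

            _target = new RenderTexture(width, height, 0,
                RenderTextureFormat.ARGBFloat, RenderTextureReadWrite.Linear);
            _target.enableRandomWrite = true;   // needed as a compute shader output
            _target.Create();
        }
    }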

Later, I attempted foveated rendering to improve performance, which was still lackluster. I learned how much bandwidth the Graphics.Blit() and Graphics.CopyTexture() methods consume and found that while foveated rendering was possible, it hurt performance more than it helped.
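
For reference, the composition step I tested looked roughly like the sketch below (all names are illustrative): the periphery is traced small and upscaled, then a full-resolution foveal patch is pasted over the centre. Every Blit() and CopyTexture() here is an extra pass over texture memory, which is where the bandwidth goes.

    using UnityEngine;

    public static class FoveatedCompose
    {
        // periphery: low-resolution trace of the whole view
        // fovea:     full-resolution trace of a small central region
        // output:    final eye-sized texture
        public static void Compose(RenderTexture periphery, RenderTexture fovea, RenderTexture output)
        {
            // Upscale the low-resolution periphery to the full output size.
            Graphics.Blit(periphery, output);

            // Paste the high-resolution foveal region into the centre.
            // Note: CopyTexture needs matching texture formats on both sides.
            int x = (output.width  - fovea.width)  / 2;
            int y = (output.height - fovea.height) / 2;
            Graphics.CopyTexture(fovea, 0, 0, 0, 0, fovea.width, fovea.height,
                                 output, 0, 0, x, y);
        }
    }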

I then added support for moving the imported mesh objects around at runtime based on their GameObject Transform. This allowed me to put a block in the scene that moved and rotated around. I also learned about the inefficiencies of the existing mesh pathtracing technique and began to explore implicit surface calculations instead.
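
The per-frame update is roughly the sketch below. The MeshObject layout mirrors the one in David's tutorial; the surrounding class and field names are illustrative. The point is that only the small per-object buffer is re-uploaded each frame, while the vertex and index buffers stay untouched on the GPU.

    using UnityEngine;

    public class MovingMeshUpdater : MonoBehaviour
    {
        // Per-object data the compute shader reads: a transform matrix plus
        // offsets into the shared index buffer (layout as in David's tutorial).
        struct MeshObject
        {
            public Matrix4x4 localToWorldMatrix;
            public int indicesOffset;
            public int indicesCount;
        }

        private Transform[] _trackedTransforms;     // the GameObjects being traced
        private MeshObject[] _meshObjects;          // CPU-side copy of the buffer contents
        private ComputeBuffer _meshObjectBuffer;    // bound to the compute shader

        private void Update()
        {
            // Refresh each object's matrix from its Transform so moving or
            // rotating GameObjects are traced at their current pose.
            for (int i = 0; i < _trackedTransforms.Length; i++)
            {
                _meshObjects[i].localToWorldMatrix = _trackedTransforms[i].localToWorldMatrix;
            }
            _meshObjectBuffer.SetData(_meshObjects);
        }
    }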

I looked at SLI support to parallelize the pathtracing, but found that compute shaders do not support multi-GPU and that CUDA/OpenCL options were unviable for our purposes.

Thus we managed to get a fully pathtraced program running at a full 90 FPS in a Vive Pro at native resolution and 1 ray per pixel, with perfectly accurate reflections, lighting, ambient occlusion, and shadows. The caveats: a lot of noise, no RTX acceleration, limited mesh usage due to performance, no standard materials or shaders, and no effective foveated rendering.

4 comments:

  1. Did you account for lens distortion through the camera shader or did you use a standard rectilinear projection and then warp it after?

    1. The camera projection is extracted from the Unity camera, so whatever default settings the Unity project has are applied.

      It reads 'camera.cameraToWorldMatrix'
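
      For reference, the matrices are handed to the compute shader in the tutorial code roughly like this:

        RayTracingShader.SetMatrix("_CameraToWorld", _camera.cameraToWorldMatrix);
        RayTracingShader.SetMatrix("_CameraInverseProjection", _camera.projectionMatrix.inverse);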

    2. Ah, so you're using the default camera matrix. If you prewarp your projection, you can actually avoid oversampling at the edges to better match the output image/lenses. Nvidia implemented a similar technique called Lens Matched Shading for rasterized rendering.

  2. Hi, I am currently doing some research about VR and path tracing. I am wondering how to get this project working on a Vive Pro as you mentioned in the article. When I press the play button, nothing happens in my SteamVR. Did I miss something? I will be grateful if you help me with this :)
