Project Details
Description
Rendering in recent graphics applications has approached photorealistic quality. Physically-Based Rendering (PBR) methods are now commonly deployed in computer games, animation, movie special effects, virtual or augmented reality (VR/AR), and building information modeling. In the foreseeable future, the line between the virtual world and the real world may gradually blur, and displaying extremely realistic virtual characters or objects in VR/AR is no longer an unfathomable goal. Furthermore, the detailed city scenery and natural landscapes in recent games and movie special effects suggest that very large scene datasets will be a future trend. These observations indicate that photorealistic rendering of large scenes will play a key role, which naturally calls for cloud computing resources. However, it is still difficult for real-time VR rendering to reach PBR quality. A more practical solution is to pre-render PBR-quality results into 360-degree images or videos for viewing on VR headsets; its drawback is the lack of motion parallax when the user moves. Many researchers are therefore reinvestigating the 20-year-old Image-Based Rendering methods, especially Plenoptic Modeling and Light Field Rendering. However, four-dimensional light field data imposes an even heavier computation load on VR applications running on cloud platforms. We will therefore investigate a relatively unexplored research direction: separating the common, user-independent computation from the user-dependent computation in PBR algorithms on a multi-user cloud computing platform. We hope this will allow offloading certain computation to users' local devices and avoid network latency.
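The split between user-independent and user-dependent computation can be illustrated with a toy shading model. The sketch below is a hypothetical illustration, not the project's actual pipeline: it uses Blinn-Phong shading, where the diffuse term depends only on the scene (so a cloud server could compute and cache it once for all users), while the specular term depends on each viewer's direction (so it is a candidate for offloading to the user's local device).

```python
import math

def diffuse_term(normal, light_dir, albedo):
    """User-independent: depends only on scene geometry, lighting, and
    material, so the cloud can precompute it once and share it."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return albedo * n_dot_l

def specular_term(normal, light_dir, view_dir, shininess=32):
    """User-dependent: depends on the viewer's direction, so it must be
    evaluated per user -- e.g. on the user's local device."""
    # Blinn-Phong half vector between light and view directions
    h = [l + v for l, v in zip(light_dir, view_dir)]
    norm = math.sqrt(sum(c * c for c in h)) or 1.0
    h = [c / norm for c in h]
    n_dot_h = max(0.0, sum(n * c for n, c in zip(normal, h)))
    return n_dot_h ** shininess

# Cloud side: computed once, shared across all connected users.
normal, light_dir, albedo = (0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 0.8
shared = diffuse_term(normal, light_dir, albedo)

# Client side: each user adds its own view-dependent contribution.
for view_dir in [(0.0, 0.0, 1.0), (0.6, 0.0, 0.8)]:
    color = shared + specular_term(normal, light_dir, view_dir)
```

In a real PBR or light-field pipeline the shared portion would be far larger (e.g. irradiance or visibility data), but the partitioning principle is the same.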
| Field | Value |
| --- | --- |
| Status | Finished |
| Effective start/end date | 2017/08/01 → 2020/07/31 |
Keywords
- Physically-Based Rendering
- Virtual or Augmented Reality
- Rendering on Cloud Platforms