Summary of ‘Reality Engine | An In-depth Look at AR Compositing’

This summary of the video was created by an AI. It might contain some inaccuracies.

00:00:00 – 00:12:33

The video covers advanced Augmented Reality (AR) production techniques, integrating Reality Engine with Unreal Engine to achieve hyper-realistic results. The process involves setting up scenes with an AR skylight component, using HDR textures, and building node-based composites with projection and camera nodes. The narrator demonstrates how to use mixer nodes to view different camera perspectives, align dummy cubes for realistic projections, and manage lighting and shadows in real time through various parameters. They also explain how to configure shadow and ambient occlusion channels for finer control over lighting effects, covering techniques such as composite passes and node linking. Throughout, emphasis is placed on debugging, real-time adjustments, and accurate lighting and shadow representation to enhance the realism of AR projects.

00:00:00

In this part of the video, the focus is on using Augmented Reality (AR) production techniques to create hyper-realistic results through Reality Engine and Unreal Engine integration. Key steps include:

1. **Scene Setup**:
– Dragging and dropping the AR skylight component from the Place Actors tab.
– Assigning an HDR texture to the Cubemap slot in the Details panel and setting the component's mobility to Movable (see the scripting sketch below).

2. **Node-based Compositing**:
– Adding a projection cube node and a camera node to the node graph, connecting the track input to the tracking data.
– Making real-time changes to the projection cube, such as enabling/disabling sides and transforming meshes.

3. **Projection Setup**:
– Adding a projection node, connecting track input to the track node, actor input to the projection mesh, and video input to the video source.
– Adding a post-process node to control exposure and other parameters in real time.

The segment also emphasizes the importance of connecting track input pins on each node to the track node and suggests creating multiple cameras to get different perspectives.
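
Step 1 of the scene setup can also be scripted. The following is a minimal sketch using Unreal's Editor Python API, assuming a standard SkyLight actor stands in for Reality Engine's AR skylight component; the HDR cubemap asset path is hypothetical:

```python
import unreal

# Spawn a SkyLight actor (a stand-in for Reality Engine's AR skylight component).
skylight = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.SkyLight, unreal.Vector(0.0, 0.0, 0.0))

# Load an HDR cubemap asset (path is hypothetical).
hdr_cubemap = unreal.EditorAssetLibrary.load_asset("/Game/HDR/StudioCubemap")

# Point the SkyLight at the cubemap and make it movable so the lighting
# can change at run time.
comp = skylight.get_component_by_class(unreal.SkyLightComponent)
comp.set_editor_property("mobility", unreal.ComponentMobility.MOVABLE)
comp.set_editor_property("source_type", unreal.SkyLightSourceType.SLS_SPECIFIED_CUBEMAP)
comp.set_editor_property("cubemap", hdr_cubemap)
comp.recapture_sky()
```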

00:03:00

In this part of the video, the speaker demonstrates how to add a mixer node to view different passes and outputs in separate channels. They connect two cameras to different channels to check outputs from various perspectives and show how to move the user track using keyboard keys. The speaker explains the debugging process for the camera node and projection node to get visual feedback and modifications. They discuss the effect of moving a physical studio camera on the projection node, mimicking an optical projection device, and the impact of changing the light intensity and orientation on projection, emphasizing the difference between unlit and lit material properties. The demonstration includes adding a light to observe shadows in real-time, and moving the projection cube to study shadow transformations. Finally, the speaker resets the projection cube and creates a dummy cube to place behind an AR object.
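
As a rough analogue to adding that light, here is a hedged Unreal Editor Python sketch that spawns a movable spot light whose intensity and orientation can then be adjusted while watching the shadows update; the numeric values and the use of a stock SpotLight (rather than a Reality Engine light node) are assumptions:

```python
import unreal

# Spawn a spot light above the projection cube so it casts real-time shadows.
light = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.SpotLight, unreal.Vector(0.0, 0.0, 300.0))
light.set_actor_rotation(unreal.Rotator(pitch=-60.0, yaw=0.0, roll=0.0),
                         teleport_physics=False)

comp = light.get_component_by_class(unreal.SpotLightComponent)
comp.set_editor_property("mobility", unreal.ComponentMobility.MOVABLE)
comp.set_editor_property("intensity", 5000.0)   # example value; match the scene
comp.set_editor_property("cast_shadows", True)  # shadows move as the light or cube moves
```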

00:06:00

In this part of the video, the process of aligning and scaling a dummy cube with a wall is explained. A projection actor is created and connected to the dummy cube, showing how the projection appears on the mesh surface. The video details how to use the projection node as a material and configure the projection cube to reflect the projection, adjusting roughness and metallic parameters to achieve the desired reflection. To focus reflections on specific parts, the visibility settings are adjusted, separating the reflections by connecting different projection cubes to separate nodes and setting their metallic values accordingly.
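
The roughness/metallic tuning can be mirrored in script. This is a minimal sketch using Unreal's MaterialEditingLibrary, assuming the projection cube uses a material instance at a hypothetical path whose scalar parameters are named Roughness and Metallic:

```python
import unreal

# Load the material instance applied to the projection cube (path is hypothetical).
mi = unreal.EditorAssetLibrary.load_asset("/Game/AR/MI_ProjectionCube")

# Low roughness + high metallic gives a mirror-like reflection of the projection;
# raise roughness or lower metallic for a softer, more diffuse response.
unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
    mi, "Roughness", 0.05)
unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
    mi, "Metallic", 1.0)
unreal.MaterialEditingLibrary.update_material_instance(mi)
```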

AR shadows are introduced by creating and aligning virtual spotlights with real light sources, enabling debug mode for visual feedback. The necessity of setting the projection material to “lit mask” for shadow casting is highlighted. Various spotlight parameters, including cone angles and intensity, are adjusted to fit the scene requirements. Finally, camera connections are set up on the mixer node to manage the scene’s brightness and lighting according to the environment.
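
Aligning a virtual spotlight to a real fixture largely comes down to matching position, orientation, cone angles, and intensity. Below is a hedged Editor Python sketch; the actor label and all numeric values are placeholders to be replaced with measurements from the studio:

```python
import unreal

# Find the virtual spot light that mirrors the real studio fixture by label.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() == "AR_KeyLight":  # label is hypothetical
        actor.set_actor_location(unreal.Vector(120.0, -80.0, 260.0),
                                 sweep=False, teleport=True)
        actor.set_actor_rotation(unreal.Rotator(pitch=-55.0, yaw=30.0, roll=0.0),
                                 teleport_physics=False)
        comp = actor.get_component_by_class(unreal.SpotLightComponent)
        comp.set_editor_property("inner_cone_angle", 20.0)
        comp.set_editor_property("outer_cone_angle", 35.0)
        comp.set_editor_property("intensity", 8000.0)
        break
```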

00:09:00

In this part of the video, the presenter discusses how to manage and adjust shadows and ambient occlusion in a rendering engine. First, they explain how to assign shadows to different channels, such as red and green for various light sources, and demonstrate using a channel viewer to inspect these shadow channels. They proceed to create a post-process material to gain better control over lighting, including adjusting shadow channel intensity and opacity separately for red, green, and blue. For ambient occlusion (AO), they show how to control its application using the AO mask parameter, highlighting areas where AO is applied and adjusting its intensity with AO gamma and opacity settings. Finally, the presenter introduces the Reality Engine Composite Passes, briefly mentioning different methods like merge nodes and Reality materials that will be covered in future videos, focusing on creating a composite passes node and linking it correctly within the graph.
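
The per-channel shadow and AO controls described here reduce to fairly simple compositing math. The sketch below is not Reality Engine's actual post-process material, just a NumPy illustration of how per-channel shadow opacity, an AO mask, and AO gamma/opacity might combine; all parameter names are assumptions:

```python
import numpy as np

def apply_shadow_and_ao(base_rgb, shadow_mask, ao, ao_mask,
                        shadow_opacity=(1.0, 1.0, 1.0),
                        ao_gamma=1.0, ao_opacity=1.0):
    """Illustrative per-channel shadow/AO darkening (not the engine's exact material).

    base_rgb:    H x W x 3 linear color.
    shadow_mask: H x W x 3, one light's shadow per channel (R, G, B); 1 = fully shadowed.
    ao:          H x W ambient occlusion; 1 = fully occluded.
    ao_mask:     H x W, 1 where AO should be applied, 0 where it should be ignored.
    """
    # Combine the per-light shadow channels, each scaled by its own opacity.
    opacity = np.asarray(shadow_opacity).reshape(1, 1, 3)
    shadow = np.clip((shadow_mask * opacity).sum(axis=-1, keepdims=True), 0.0, 1.0)

    # Shape the AO falloff with a gamma curve, restrict it with the mask,
    # then scale it by its opacity.
    ao_term = (np.clip(ao, 0.0, 1.0) ** ao_gamma) * ao_mask * ao_opacity

    # Darken the base image by both terms.
    return base_rgb * (1.0 - shadow) * (1.0 - ao_term[..., None])
```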

00:12:00

In this part of the video, the presenter connects the video input to the video source and demonstrates how the AR objects and the video source can be viewed together. They then connect the camera's shadow pin to the lighting pin of the composite passes node, enabling the shadows. The presenter adjusts the AR object's position to observe the shadow effects, explaining that the shadow passes can be controlled by selecting the shadow channel of the created light.
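
As a rough picture of what that final hookup produces, here is a small NumPy sketch of layering the AR render and its shadows over the video source; it illustrates the idea only and is not the engine's actual composite-passes node:

```python
import numpy as np

def composite_ar_over_video(video_rgb, ar_rgb, ar_alpha, shadow):
    """Darken the live video where AR shadows fall, then layer the AR render on top."""
    shadowed_video = video_rgb * (1.0 - np.clip(shadow, 0.0, 1.0))[..., None]
    a = ar_alpha[..., None]
    return ar_rgb * a + shadowed_video * (1.0 - a)
```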
