This summary of the video was created by an AI. It might contain some inaccuracies.
00:00:00 – 00:26:10
The video focuses on improving the graphics of Final Fantasy XIV using custom shaders and advanced post-processing techniques. Despite the game's high-quality content, its visuals are dated because its engine is built on DirectX 11. To address this, the presenter uses GShade, a fork of ReShade tailored for the game that does not disable depth information, allowing for effects like screen space fog.
Key techniques covered include fog enhancement, camera exposure simulation, color correction, and bloom effects. The presenter describes using tone mapping shaders, initially Narkowicz ACES for vibrant colors, but switching to a custom filmic curve to avoid issues like "mega bloom." A sharpness filter, Photoshop blend modes, and Contrast Adaptive Sharpening were also employed to bring out texture detail without introducing artifacts.
Also discussed are auto exposure based on a luminance histogram and a high dynamic range buffer that overcomes color precision limitations. Screen space ambient occlusion (SSAO), using Intel's XeGTAO algorithm, is implemented to enhance the lighting and realism of 3D objects despite higher GPU demands. The shader passes applied include SSAO, a color dodge blend, main color corrections, auto exposure, and custom tone mapping, all customizable by users.
Ultimately, the video concludes with a comprehensive user guide, downloadable shaders and presets, and an invitation to share results under a dedicated hashtag, while hinting at further enhancements to come.
00:00:00
In this part of the video, the presenter discusses the current state of Final Fantasy XIV's graphics and the steps they took to enhance them. They highlight that, despite the game's high-quality content, it suffers from outdated visuals because its engine is built on DirectX 11. To address this, the presenter explains their use of ReShade, a post-processing injector that enhances the game's graphical output by intercepting and modifying graphics API calls. Specifically, they use GShade, a fork of ReShade tailored for Final Fantasy XIV that does not disable depth information, allowing for more advanced effects. The first effect they implemented was screen space fog, illustrating the process and its limitations, such as the lack of access to the game camera's properties.
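The summary does not include the shader itself, but a minimal per-channel sketch of depth-based screen space fog, written here in C++ for illustration, might look like this (the exponential falloff and the density parameter are assumptions, not details from the video):

```cpp
#include <cmath>

// Exponential fog: more distance through the atmosphere means more fog.
// 'dist' is the eye-space distance reconstructed from the depth buffer;
// 'density' is an assumed tunable.
float fog_amount(float dist, float density) {
    return 1.0f - std::exp(-density * dist);   // 0 = clear, 1 = fully fogged
}

// Blend one channel of the scene color toward the fog color.
float apply_fog(float scene, float fogColor, float amount) {
    return scene + (fogColor - scene) * amount;
}
```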
00:03:00
In this part of the video, the speaker discusses enhancing fog effects and implementing advanced post-processing techniques for color correction in the game. They explain how fog can enhance weather effects and the sense of distance, but its color may not always match the game environment, so it is kept disabled during gameplay. The speaker then describes simulating camera exposure by scaling pixel brightness, white balancing for color temperature, and controlling contrast, brightness, and saturation. The new color correction method allows more powerful adjustments to individual color channels, enabling advanced effects like selective desaturation.
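As an illustration of this kind of color correction, here is a minimal C++ sketch of two of the named adjustments, exposure in photographic stops and saturation via a luma lerp (the Rec. 709 luma weights and the function names are assumptions for illustration, not the video's code):

```cpp
#include <cmath>

// Exposure in photographic stops: each stop doubles the brightness.
float expose(float channel, float stops) {
    return channel * std::exp2(stops);
}

// Saturation control: lerp each channel toward the pixel's luma.
// 'sat' = 0 gives greyscale, 1 leaves the color unchanged, >1 boosts it.
void saturate_rgb(float& r, float& g, float& b, float sat) {
    float luma = 0.2126f * r + 0.7152f * g + 0.0722f * b;  // Rec. 709 weights
    r = luma + (r - luma) * sat;
    g = luma + (g - luma) * sat;
    b = luma + (b - luma) * sat;
}
```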
Furthermore, the speaker upgrades their bloom effect, which makes bright areas appear as if they overwhelm the camera lens. This is achieved by filtering and blurring bright pixels, then reintegrating them into the scene. The upgraded bloom shader now includes color correction, allowing for adjustments to the bloom highlights to better match the scene. However, they note that the bloom implementation isn’t finished and discuss the challenges of managing bright pixel details in a low dynamic range color space.
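A common way to implement the "filter bright pixels" step is a thresholded prefilter like the following sketch (the soft normalization by luma is an assumption; the video's exact filter is not specified). The filtered result is then blurred and added back onto the scene.

```cpp
#include <algorithm>

// Bloom prefilter: keep only the energy above a brightness threshold,
// so that blurring it and reintegrating it affects highlights only.
// 'threshold' is an assumed tunable.
float bloom_prefilter(float channel, float luma, float threshold) {
    float excess = std::max(luma - threshold, 0.0f);
    return channel * (excess / std::max(luma, 1e-4f));
}
```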
00:06:00
In this part of the video, the speaker discusses the need to map high-range bloomed pixels down to low-range values with a tone mapping operator so that image detail is preserved. They implement a tone mapping shader that supports multiple operators, choosing Narkowicz ACES for its vibrant colors. However, they realize that the render has already been clamped, making the tone mapping ineffective. The issue stems from GShade using a low dynamic range back buffer texture, which limits color precision. After contemplating the project's challenges, the speaker proposes a solution: creating a dedicated high dynamic range buffer. This requires additional shader passes to transfer data between the new buffer and the GShade back buffer, allowing the shaders to properly read and write high dynamic range values.
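For reference, Narkowicz's published ACES approximation is compact enough to quote in full; this per-channel C++ version follows his formula (applying it per channel rather than in a color-space-aware way is a simplification):

```cpp
#include <algorithm>

// Krzysztof Narkowicz's ACES filmic approximation, per channel.
// Input is linear HDR; output is clamped to [0, 1].
float aces_narkowicz(float x) {
    const float a = 2.51f, b = 0.03f, c = 2.43f, d = 0.59f, e = 0.14f;
    return std::clamp(x * (a * x + b) / (x * (c * x + d) + e), 0.0f, 1.0f);
}
```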
00:09:00
In this part of the video, the speaker discusses the technical trade-offs of their custom shaders for Final Fantasy XIV. The shaders add milliseconds to render times and are incompatible with other GShade shaders, but they enable a legitimate HDR rendering pipeline, enhancing image quality through proper tone mapping and detail preservation. They also explain the implementation of a sharpness filter to counteract the game's low-quality textures; it improves visual detail but needs depth-based filtering to avoid noise on distant objects. Despite these enhancements, the speaker acknowledges the remaining imperfections, including brightness issues and bloom and sharpness artifacts that ultimately hinder gameplay, setting the stage for further improvements.
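The exact sharpness filter is not given in the summary, but a simple depth-aware unsharp mask illustrates the idea (the linear depth falloff here is an assumption):

```cpp
// Unsharp-mask sharpening with a depth fade: boost the difference
// between the pixel and a blurred copy, but scale the boost down with
// distance so far-away pixels are not sharpened into noise.
// 'depth01' is assumed normalized to [0, 1].
float sharpen(float color, float blurred, float strength, float depth01) {
    float fade = 1.0f - depth01;   // assumed linear falloff with depth
    return color + strength * fade * (color - blurred);
}
```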
00:12:00
In this part of the video, the speaker discusses how to implement auto exposure in a game camera to simulate real camera behavior. They explain that auto exposure adjusts the brightness based on lighting conditions. The first step is determining the average luminance of the image using a luminance histogram, which is preferred over down-sampling because it offers better control over luminance extremes.
The speaker describes creating a histogram with 256 bins, where each bin covers a slice of the luminance range from minimum to maximum. The image is divided into 16-by-16-pixel chunks, each processed by a thread group that calculates a local luminance distribution and stores the results in a shared array. These local distributions are written to a texture and then flattened by summing the rows into a single 1D histogram.
This 1×256 texture representing the luminance distribution is used to calculate an average luminance, which dynamically adjusts the camera's exposure. The image brightness then adapts automatically: dark scenes are brightened and overly bright ones are dimmed. This post-processing step is usually done before tone mapping. Finally, the speaker mentions moving on to address the bloom effect.
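Putting the steps together, here is a CPU-side C++ sketch of histogram-based auto exposure (the 256-bin count and log-luminance binning follow the description above; the 18% middle-grey target and the epsilon values are assumptions):

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdint>
#include <vector>

// Bin each pixel's log2 luminance into one of 256 bins spanning
// [minLogLum, maxLogLum]. The GPU version described above does this
// per 16x16 tile in thread-group shared memory before merging.
std::array<uint32_t, 256> build_histogram(const std::vector<float>& luminance,
                                          float minLogLum, float maxLogLum) {
    std::array<uint32_t, 256> bins{};
    const float invRange = 1.0f / (maxLogLum - minLogLum);
    for (float lum : luminance) {
        float t = (std::log2(std::max(lum, 1e-5f)) - minLogLum) * invRange;
        ++bins[static_cast<int>(std::clamp(t, 0.0f, 1.0f) * 255.0f)];
    }
    return bins;
}

// Weighted mean of the bin centers gives the average log2 luminance;
// exponentiating returns it to linear space. 'pixelCount' must be > 0.
float average_luminance(const std::array<uint32_t, 256>& bins,
                        float minLogLum, float maxLogLum, uint32_t pixelCount) {
    double sumLog = 0.0;
    for (int i = 0; i < 256; ++i)
        sumLog += double(bins[i]) *
                  (minLogLum + (i + 0.5f) / 256.0f * (maxLogLum - minLogLum));
    return std::exp2(float(sumLog / pixelCount));
}

// Exposure multiplier mapping the average toward an assumed 18% grey;
// the scene is multiplied by this before tone mapping.
float exposure_scale(float avgLum) {
    return 0.18f / std::max(avgLum, 1e-4f);
}
```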
00:15:00
In this part of the video, the creator discusses various issues with and improvements to their gameplay shaders. They initially disabled their bloom effect because it often ruined the visuals, opting to use it only in specific scenarios. They implemented Photoshop blend modes and replaced the bloom with a mild color dodge that enhances the game's existing bloom without overdoing it. The creator then addresses sharpness, stating that their initial simple sharpness filter produced poor results. To improve it, they adopted Contrast Adaptive Sharpening, which sharpens low-contrast areas while pulling back in high-contrast ones to avoid artifacts. After fine-tuning these effects, they arrive at nearly perfect gameplay shaders, eliminating earlier problems like unplayable areas, bloom artifacts, and sharpness artifacts, and they conclude by acknowledging how challenging effective gameplay shaders are to create.
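The Photoshop color dodge formula itself is simple: per channel, it divides the base by the inverse of the blend layer, which is why a "mild" dodge uses a dialed-down blend input. A minimal C++ sketch:

```cpp
#include <algorithm>

// Photoshop-style color dodge, per channel: brightens the base by the
// blend layer, blowing out only where the blend is already bright.
float color_dodge(float base, float blend) {
    if (blend >= 1.0f) return 1.0f;
    return std::min(base / (1.0f - blend), 1.0f);
}
```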
00:18:00
In this part of the video, the focus is on fixing graphical issues in Final Fantasy XIV caused by the interaction of the shaders and the tone mapping. The speaker explains how the current shaders create an undesirable "mega bloom" effect because the auto exposure and the Narkowicz ACES tone mapper interact poorly, especially in scenes with stark lighting contrasts. To address this, the speaker replaces the Narkowicz tone mapper with a custom filmic curve based on the Hable tone mapping formula. The adjustment solves the mega bloom issue but leaves the image looking dull, which is mitigated by enhancing the color correction, tweaking the exposure, contrast, and saturation settings to achieve a vibrant yet balanced output. The segment concludes with these successful adjustments and hints at the upcoming addition of screen space ambient occlusion for further improvement.
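For reference, the Hable (Uncharted 2) filmic curve that the custom curve is based on looks like this per channel; the constants shown are Hable's commonly published defaults, not the presenter's tuned values:

```cpp
// John Hable's filmic tone mapping curve, per channel.
float hable(float x) {
    const float A = 0.15f, B = 0.50f, C = 0.10f,
                D = 0.20f, E = 0.02f, F = 0.30f;
    return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F;
}

// Normalize so the chosen white point maps to 1.0.
float tonemap_filmic(float x, float whitePoint = 11.2f) {
    return hable(x) / hable(whitePoint);
}
```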
00:21:00
In this part of the video, the speaker explains the concept of ambient occlusion, focusing on how it affects the lighting of a 3D object like a ball. They describe screen space ambient occlusion (SSAO), which simulates realistic lighting using the depth buffer and screen space normals. The process involves reconstructing normal vectors from depth values: the positions of neighboring pixels are used to obtain view space normals via a cross product, as sketched below. SSAO then determines visibility by projecting a hemisphere kernel oriented along the surface normal and sampling depth values within it. The speaker notes that an SSAO implementation such as Intel's XeGTAO significantly impacts GPU performance but greatly enhances visual realism. After extensive development, they announce the completion of their gameplay preset.
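A minimal sketch of that normal reconstruction step, assuming the view space positions have already been unprojected from depth (the sign of the result depends on the coordinate convention):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Reconstruct a view-space normal from the view-space positions of a
// pixel and its right/down neighbours: the cross product of the two
// screen-space tangents gives the surface normal.
Vec3 reconstruct_normal(Vec3 p, Vec3 pRight, Vec3 pDown) {
    return normalize(cross(sub(pRight, p), sub(pDown, p)));
}
```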
00:24:00
In this segment of the video, the presenter explains the sequence of shader passes applied to enhance the visual quality of a scene. They begin with SSAO for scene lighting, followed by a mild color dodge blend for highlights and bloom. Main color corrections such as temperature, tint, contrast, brightness, saturation, and exposure are applied next. Auto exposure and a custom tone mapping curve are then used to preserve image detail. A special color blend enhances contrast and mid-tones, and the image is sharpened and gamma corrected to improve texture and bring out shadows. All shader effects are fully customizable, allowing users to fine-tune or rearrange them as desired. Additional effects, such as dithering and depth of field, are also available. The presenter offers a download link for the shaders and presets, encourages sharing of images using the hashtag #acerolafx, and concludes by thanking viewers and pointing to the user guide for the remaining effects.
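As a rough illustration of that ordering, here is a single-channel C++ sketch in which every pass is a stand-in (Reinhard in place of the custom tone map, a bare contrast tweak in place of the full color correction), not the preset's actual shaders:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical single-channel pipeline mirroring the pass order above.
float process(float c, float ao, float bloom, float exposure) {
    c *= ao;                                  // 1. SSAO darkens occluded areas
    c = c / std::max(1.0f - bloom, 1e-4f);    // 2. mild color dodge with bloom
    c = (c - 0.18f) * 1.1f + 0.18f;           // 3. color correction (contrast only)
    c *= exposure;                            // 4. auto exposure scale
    c = c / (c + 1.0f);                       // 5. tone map (Reinhard stand-in)
    return std::pow(std::max(c, 0.0f), 1.0f / 2.2f);  // 6. gamma correction
}
```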
