WebGPURenderer: Add HDR Support #29573
Conversation
📦 Bundle size: full ESM build, minified and gzipped.
🌳 Bundle size after tree-shaking: minimal build including a renderer, camera, empty scene, and dependencies.
Related: https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md

So I guess when using:

```js
renderer = new THREE.WebGPURenderer( { antialias: true, hdr: true } );
renderer.toneMapping = THREE.ACESFilmicToneMapping; // -> produces warning
```

What about …? To clarify: …
I agree with the proposed solution. @CodyJasonBennett warned me about the toneMapping issue, and we could indeed probably just warn the developer about the fact that both …
Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits). The display does the rest, including conversion into the electric signal used by the display, which is what …
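To illustrate the point about physical units (a sketch only, not three.js API; the 203-nit value for SDR reference white is an assumption borrowed from common HDR signal practice):

```javascript
// Sketch: mapping scene-linear values to absolute luminance (nits).
// Assumption: 1.0 in the drawing buffer corresponds to SDR reference
// white at 203 nits; anything above it needs HDR headroom on the display.
const SDR_WHITE_NITS = 203;

function linearToNits( value ) {
  return value * SDR_WHITE_NITS;
}

function isInSDRRange( value ) {
  return linearToNits( value ) <= SDR_WHITE_NITS;
}

console.log( linearToNits( 1.0 ) ); // 203 (SDR white)
console.log( linearToNits( 4.0 ) ); // 812 (needs HDR headroom)
```

How much of that range the display can actually reproduce is exactly the "headroom" question discussed below.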
I've chatted with Don about this since I'm eager to see a real comparison with tone mapping in LDR and HDR (simply disabling tone mapping doesn't compare), but it's still a lot of work to implement. I'm happy to upstream tone mappers here if we can figure out an API. There are just a lot of unknowns on top of historical problems and inconsistencies from display manufacturers, which complicate this. I'd be more confident with an API once we have direction here.
Some caution: our output to the drawing buffer must still be a formed image. We cannot send the stimulus (i.e. scene lighting / luminance) directly to the drawing buffer; the browser/device/OS does not provide image formation any more in "HDR" than in "SDR". A lot of recent HDR demos on social media have made this mistake by omitting tone mapping.

We do still want to consider output units (likely nits), as they relate to the viewing device/display. WebGPU HDR, as currently shipped in Chrome, tells us nothing about the display, so we are guessing heavily. The amount of HDR headroom available may vary: turn your laptop brightness up and the headroom reduces, and different color management may be required. This is a major gap in the current WebGPU implementation in Chrome, and something we may need to keep tabs on for changes. And as @CodyJasonBennett says well, "a lot of unknowns on top of historical problems and inconsistencies" exist outside of Chrome's control.

I have a general idea of how to adapt AgX or ACES Filmic for use with "HDR" output, and I'll look into that a bit. Desaturation is fortunately orthogonal: representation as "SDR" vs. "HDR" does not imply any large difference in saturation. If the comparison does diverge, then something is likely wrong.¹
A quick test here would be to render a solid MeshBasicMaterial (example: …).

Footnotes:

¹ …
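As a sketch of what "tone mapping with headroom" could look like (an illustration only, not AgX/ACES and not what this PR ships): a Reinhard-style curve whose asymptote is the display headroom rather than 1.0, so image formation still happens but highlights may exceed SDR white.

```javascript
// Sketch: Reinhard-style tone map that rolls off toward `headroom` instead
// of 1.0. The headroom value itself is an assumption: as noted above,
// WebGPU currently gives us no way to query it from the display.
function toneMapWithHeadroom( x, headroom = 2.0 ) {
  // Near-identity for small x (SDR midtones mostly untouched),
  // asymptotically approaches `headroom` for large x.
  return x / ( 1 + x / headroom );
}
```

With `headroom = 1` this reduces to plain Reinhard `x / (1 + x)`, which is one way to sanity-check the curve.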
Possible API:

```js
import { ExtendedSRGBColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```

My main concern: Extended sRGB (the only option WebGPU currently allows) is not really an appropriate output color space for lit rendering; we need to render an "HDR" image in reference to a well-defined display/medium. I'll file an issue on the WebGPU spec repo about this (EDIT: gpuweb/gpuweb#4919); perhaps there are plans to enable other output color spaces. I would prefer to have this:

```js
import { Rec2100HLGColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = Rec2100HLGColorSpace;
```

Adaptations to tone mapping are also needed, though they depend on information we do not have, and which may not be well-defined at all when using Extended sRGB. I know others are excited about the "HDR" features, though. Would it be possible to start with a PR that exposes …?
Happy new year everyone! @donmccurdy From a user perspective, as you said, it's such an exciting feature, but … Do you think it would be possible to make a simple API like: …?
I don't feel that a boolean HDR on/off flag would be the right approach. We should really provide the option to use a high-precision drawing buffer, whether or not the device supports HDR. And we may want to provide the option to render into an HDR canvas for export or baking purposes, on devices that support an HDR canvas but don't currently have an HDR display connected. So I do prefer the API suggested in #29573 (comment). This has the added benefit of not locking us into Extended sRGB, which I feel would be a huge mistake.
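On the "HDR canvas without an HDR display connected" case: detecting the display side is possible today with a standard CSS media query. A hedged sketch (the query is real; how a renderer would use it is only an assumption here):

```javascript
// Sketch: detecting whether the *current display* offers HDR headroom,
// separately from whether the canvas itself can be configured as HDR.
// Uses the standard '(dynamic-range: high)' CSS media query; returns
// false in non-browser environments such as Node.js.
function displaySupportsHDR() {
  if ( typeof matchMedia !== 'function' ) return false; // no DOM available
  return matchMedia( '(dynamic-range: high)' ).matches;
}
```

A renderer could then use a high-precision drawing buffer unconditionally, and treat HDR presentation as a separate, display-dependent concern.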
Here's a small PR to provide an …
@donmccurdy I had to also implement … From what I understand, I think that we can't simply resolve the final render to …
Hm, there's at least one pre-existing trouble spot, which isn't this PR's fault, but I'm not sure about the new "ExtendedLinearSRGBColorSpace" constant:
The use of an 8-bit or 16-bit drawing buffer should only depend on …
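A sketch of that mapping (an illustrative helper, not three.js internals; the string stand-ins replace three.js's numeric type constants):

```javascript
// Sketch: choosing a WebGPU canvas format from the renderer's output type.
// 'half-float' stands in for THREE.HalfFloatType; in a browser the default
// would come from navigator.gpu.getPreferredCanvasFormat() instead of the
// 'bgra8unorm' placeholder used here.
function canvasFormatFor( outputType, preferredFormat = 'bgra8unorm' ) {
  // A 16-bit float drawing buffer is what enables extended-range output.
  if ( outputType === 'half-float' ) return 'rgba16float';
  // Otherwise the usual 8-bit preferred canvas format is fine.
  return preferredFormat;
}
```

The point of the comment above is that this choice should be driven only by the output type, not by an HDR on/off flag.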
I see, makes sense! I removed … Users will need to be aware of the fact that they need to set …

```js
// updated example usage:
import { ColorManagement } from 'three';
import { ExtendedSRGBColorSpace, ExtendedSRGBColorSpaceImpl } from 'three/addons/math/ColorSpaces.js';

ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

renderer = new THREE.WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```

In another PR, I suggest that we make a very basic example to demonstrate the difference with HDR, such as a basic material plane or a cube on a transparent canvas with a white or black background. Something like: …
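For intuition on what an `ExtendedSRGBColorSpaceImpl` transfer function involves, here is a sketch of the common extended-sRGB convention (the sRGB curve applied to magnitudes with sign mirroring; this is an assumption about the convention, not code from the PR):

```javascript
// Sketch: extended sRGB encoding. Unlike plain sRGB, inputs are not
// clamped to [0, 1]: negatives are mirrored through the origin, and
// values > 1 follow the same power curve, so HDR values stay encodable.
function extendedSRGBEncode( x ) {
  const sign = x < 0 ? -1 : 1;
  const v = Math.abs( x );
  const encoded = v <= 0.0031308
    ? v * 12.92                               // linear toe segment
    : 1.055 * Math.pow( v, 1 / 2.4 ) - 0.055; // power curve segment
  return sign * encoded;
}

console.log( extendedSRGBEncode( 0 ) ); // 0
```

This unboundedness is exactly why, per the concern above, Extended sRGB does not by itself reference a well-defined display.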
Thanks @RenaudRohlinger! I'll do some testing this week but the implementation looks good to me now. I'm hoping that the WebGPU spec will also add support for the Rec. 2100 HLG color space as an alternative to Extended sRGB, which could address my concerns in #29573 (comment). In that case we'd potentially have a choice of float16 or rgb10a2unorm as the output type when using HDR-capable displays. And of course using float16 without an HDR output color space is also an option, and would improve image quality in some cases. |
Let's merge this shortly after the r179 release, so it's available in r180? Just want to make sure we have time to include tone mapping. |
@donmccurdy While doing your tests this week, if you have a chance, could you look into an issue I'm running into regarding Display P3 support in …? When configuring the WebGPU context via …

As far as I can tell, WebGPU doesn't have (nor need) the equivalent of:

```js
_gl.pixelStorei( _gl.UNPACK_COLORSPACE_CONVERSION_WEBGL, unpackConversion );
```

which WebGL uses to handle color space conversion for textures. So I'm wondering:

Is full P3 texture support something we need to explicitly handle in …?

Could this limitation be related to how color transforms or EOTF/OETF are currently applied in the TSL pipeline?

I tried adding the following in WGSLNodeBuilder:

```js
needsToWorkingColorSpace( texture ) {

	// Skip if the texture has no color space.
	if ( texture.colorSpace === NoColorSpace ) return false;

	// Skip if the texture is already in the working color space.
	if ( texture.colorSpace === ColorManagement.workingColorSpace ) return false;

	// Otherwise, convert.
	return true;

}
```

which is used to call … Thanks! 🙏
@RenaudRohlinger I'll try to take a closer look, but on first reading this might be related to WebGPU limitations in: … Ideally we'd configure … The situation is inverted in WebGL: WebGL handles unpacking for image textures but not video textures, which we have to deal with in the shader. I think …
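For context on what the texture conversion above has to do: a sketch of a linear Display P3 → linear sRGB conversion using a 3×3 matrix. The values are rounded approximations of the commonly published transform, not taken from three.js's ColorManagement:

```javascript
// Sketch: converting a linear Display P3 color to linear sRGB. Matrix
// values are rounded approximations of the standard P3-to-sRGB transform;
// each row sums to ~1 so white is preserved.
const P3_TO_SRGB = [
  [  1.2249, -0.2249,  0.0000 ],
  [ -0.0421,  1.0421,  0.0000 ],
  [ -0.0196, -0.0786,  1.0983 ],
];

function p3ToSRGBLinear( [ r, g, b ] ) {
  return P3_TO_SRGB.map( ( row ) => row[ 0 ] * r + row[ 1 ] * g + row[ 2 ] * b );
}

// A saturated P3 red lands outside the sRGB gamut (r > 1, negative g/b),
// which is why where this conversion happens, and in what precision, matters.
console.log( p3ToSRGBLinear( [ 1, 0, 0 ] ) );
```

Whether this happens at unpack time (as in WebGL's image path) or in the shader (as for video textures) is the open question in the comments above.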
Minor — I might prefer to keep the 'linkedparticle' example as-is for now. Because we don't have tone mapping available yet, there's some boundary structure (blobbiness?) on denser particles that I don't think is an intended look in the final image. Unsure how to screenshot that.
LGTM to merge though, with or without reverting the example. I can follow up with another example and an attempt at the tone mapping.
Thanks again @RenaudRohlinger!!
Thanks for the review! I noticed that if the Renderer …
While trying to get my older HDR example to work, I wondered: …
The following approach should still be valid: #28957 (comment)
Thanks @Mugen87. For me that's not really patching, though; it's replacing a bigger part of the pipeline and hoping to match whatever the previous behaviour was. I don't think that's ideal: something that previously was simple (like patching AgX to not clamp the output, just for testing) is now really complex. Are there alternative approaches? I could imagine that a callback, which somehow gets the nodes and a context of where in parsing we are, might be useful for many use cases.
The previous approach was hardly ideal, since patching raw shader code was error-prone and fragile. To me, it's a clearer workflow to create a custom node for a custom tone mapping and then apply it to the renderer pipeline. If the post-processing approach is too complex for you, we can maybe investigate making …
It's not just about tone mapping. For example, imagine wanting to patch something into the lighting equations, camera matrix handling, a new environment mapping method, or other details. I do agree that patching is error-prone and fragile, but it works at all without having to modify three.js itself. It would be great to be able to say "I know what I'm doing, override this node implementation with that".
Description
Enjoy three.js in HDR 🚀 (requires WebGPU support and an HDR-capable monitor).
Check out the difference:
HDR:
https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr/examples/webgpu_tsl_vfx_linkedparticles.html
SDR:
https://threejs.org/examples/webgpu_tsl_vfx_linkedparticles.html
This contribution is funded by Utsubo