Conversation

@RenaudRohlinger RenaudRohlinger commented Oct 7, 2024

Description

Enjoy Three.js in HDR 🚀 (requires WebGPU support and an HDR-capable monitor).

Check out the difference:

HDR:
https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr/examples/webgpu_tsl_vfx_linkedparticles.html

SDR:
https://threejs.org/examples/webgpu_tsl_vfx_linkedparticles.html

This contribution is funded by Utsubo


github-actions bot commented Oct 7, 2024

📦 Bundle size

Full ESM build, minified and gzipped.

|  | Before | After | Diff |
| --- | --- | --- | --- |
| WebGL | 338.39 kB (78.91 kB gzip) | 338.39 kB (78.91 kB gzip) | +0 B (+0 B gzip) |
| WebGPU | 566.01 kB (156.49 kB gzip) | 566.08 kB (156.52 kB gzip) | +76 B (+29 B gzip) |
| WebGPU Nodes | 564.61 kB (156.25 kB gzip) | 564.69 kB (156.27 kB gzip) | +76 B (+26 B gzip) |

🌳 Bundle size after tree-shaking

Minimal build including a renderer, camera, empty scene, and dependencies.

|  | Before | After | Diff |
| --- | --- | --- | --- |
| WebGL | 469.82 kB (113.62 kB gzip) | 469.92 kB (113.65 kB gzip) | +105 B (+25 B gzip) |
| WebGPU | 637.26 kB (172.46 kB gzip) | 637.44 kB (172.51 kB gzip) | +182 B (+55 B gzip) |
| WebGPU Nodes | 591.90 kB (161.69 kB gzip) | 592.09 kB (161.75 kB gzip) | +182 B (+57 B gzip) |

@sunag sunag added this to the r170 milestone Oct 7, 2024
Mugen87 commented Oct 7, 2024

Related: https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md

So I guess when using hdr: true, the developer should not configure any tone mapping, right? Should we check that and provide a warning if the developer attempts to do that? Meaning:

```js
renderer = new THREE.WebGPURenderer( { antialias: true, hdr: true } );
renderer.toneMapping = THREE.ACESFilmicToneMapping; // -> produces warning
```
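A minimal sketch of such a check as a standalone helper (the helper name and inlined constant are illustrative only, not actual three.js API; `NoToneMapping` mirrors the value of `THREE.NoToneMapping`):

```javascript
// Hypothetical sketch of the proposed warning; not actual three.js code.
const NoToneMapping = 0; // mirrors the value of THREE.NoToneMapping

function checkHDRToneMapping( hdr, toneMapping ) {

	if ( hdr && toneMapping !== NoToneMapping ) {

		// HDR output and built-in tone mapping cannot coexist yet.
		console.warn( 'THREE.WebGPURenderer: Tone mapping is not supported yet when HDR output is enabled.' );
		return false;

	}

	return true;

}
```

The renderer could run a check like this whenever `toneMapping` is assigned while the HDR option is enabled.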

What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.

To clarify: webgpu_tsl_vfx_linkedparticles should have used tone mapping in the first place.

@RenaudRohlinger (author) commented:
I agree with the proposed solution.

@CodyJasonBennett warned me about the toneMapping issue and we could indeed probably just warn the developer about the fact that both HDR and tonemapping cannot coexist yet.
As for outputColorSpace, I'd value @donmccurdy's input on this matter.

CodyJasonBennett commented Oct 7, 2024

> What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.

Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits). The display does the rest, including conversion into the electric signal used by the display, which is what sRGBTransferOETF does in LDR. We can still do a view transform though and preserve the (de)saturation of a tonemapper, but I haven't seen an implementation that outputs HDR. As they are fitted, they would require a custom implementation and switch depending on output parameters (not enough precision to do both in one).

> CodyJasonBennett warned me about the toneMapping issue and we could indeed probably just warn the developer about the fact that both HDR and tonemapping cannot coexist yet.
> As for outputColorSpace, I'd value donmccurdy's input on this matter.

I've chatted with Don about this since I'm eager to see a real comparison with tonemapping in LDR and HDR (simply disabling tonemapping doesn't compare), but it's a lot of work to implement still. I'm happy to upstream tonemappers here if we can figure out an API. Just a lot of unknowns on top of historical problems and inconsistencies from display manufacturers, which complicate this. I'd be more confident with an API once we have direction here.

donmccurdy commented Oct 7, 2024

Some caution — our output to the drawing buffer must still be a formed image, we cannot send the stimulus (i.e. scene lighting / luminance) directly to the drawing buffer; the browser/device/OS does not provide image formation any more in “HDR” than in “SDR”. A lot of recent HDR demos on social media have made this mistake by omitting tone mapping. We do still want to consider output units (likely nits), as they relate to the viewing device/display.

WebGPU HDR, as currently shipped in Chrome, tells us nothing about the display, so we are guessing heavily. The amount of HDR headroom available may vary: turn your laptop brightness up, headroom reduces, different color management may be required. This is a major gap in the current WebGPU implementation in Chrome, and something we may need to keep tabs on for changes. And as @CodyJasonBennett says well, “a lot of unknowns on top of historical problems and inconsistencies” exist outside of Chrome's control.

I have a general idea of how to adapt AgX or ACES Filmic for use with “HDR” output, and I'll look into that a bit. Desaturation is fortunately orthogonal: representation as “SDR” vs. “HDR” does not imply any large difference in saturation. If the comparison does diverge then something is likely wrong.1

> What about outputColorSpace? Is it correct to use SRGBColorSpace in the demo? It seems not, since the renderer would attempt to convert unbound HDR texels to sRGB which isn't right.

> Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits).

A quick test here would be to render a solid MeshBasicMaterial (example: #ff8c69) with HDR mode enabled. I believe we'll get the expected result when keeping .outputColorSpace = SRGBColorSpace. There is no rule that an sRGB value cannot exceed 1... and I think the WebGPU explainer is indeed saying that we must allow that, but the explainer is not as clear as I'd prefer. I wish they'd offered rec2100-hlg too; that is easier to reason about for our purposes, and hopefully it's coming.
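The "values above 1 are allowed" point is easiest to see from the extended sRGB transfer function itself: the usual piecewise sRGB encoding, mirrored for negatives and left unclamped above 1.0. A small sketch:

```javascript
// Extended sRGB OETF sketch: the standard piecewise sRGB encoding,
// mirrored for negative inputs and not clamped above 1.0.
function extendedSRGBEncode( linear ) {

	const sign = linear < 0 ? - 1 : 1;
	const v = Math.abs( linear );

	const encoded = ( v <= 0.0031308 )
		? v * 12.92
		: 1.055 * Math.pow( v, 1 / 2.4 ) - 0.055;

	return sign * encoded;

}

console.log( extendedSRGBEncode( 1.0 ) ); // ≈ 1.0
console.log( extendedSRGBEncode( 2.0 ) ); // ≈ 1.353, a valid extended-sRGB value
```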

Footnotes

  1. Historically we've made tone mapping very easy for users, and color grading we've left as advanced/DIY ... and I think this has led to some misconceptions. Adjusting saturation above/below tone mapping defaults is reasonable — and beneficial more often than not!

donmccurdy commented Oct 14, 2024

Possible API:

```js
import { ExtendedSRGBColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );

renderer.outputColorSpace = ExtendedSRGBColorSpace;
```

My main concern: Extended sRGB (the only option WebGPU currently allows) is not really an appropriate output color space for lit rendering; we need to render an “HDR” image in reference to a well-defined display/medium. I'll file an issue on the WebGPU spec repo about this (EDIT: gpuweb/gpuweb#4919); perhaps there are plans to enable other output color spaces. I would prefer to have this:

```js
import { Rec2100HLGColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );

renderer.outputColorSpace = Rec2100HLGColorSpace;
```

Adaptations to tone mapping are also needed, though they depend on information we do not have, and which may not be well-defined at all when using Extended sRGB.

I know others are excited about the “HDR” features though — would it be possible to start with a PR that exposes outputType: HalfFloatType and confirm that everything still works as expected, before we continue with next steps? I'm hoping to avoid major color changes for existing scenes like r152, which I'm afraid will be necessary to transition from WebGPU's current HDR API to a correct image formation pipeline.

@mrdoob mrdoob modified the milestones: r170, r171 Oct 31, 2024
@mrdoob mrdoob modified the milestones: r171, r172 Nov 29, 2024
@mrdoob mrdoob modified the milestones: r172, r173 Dec 31, 2024
Makio64 commented Jan 5, 2025

Happy new year everyone !

@donmccurdy From a user's perspective, as you said, it's such an exciting feature, but the color space change was a hard one (and it still involves extra code/debugging for me sometimes)...

Do you think it would be possible to offer a simple API like:

```js
const renderer = new WebGPURenderer( { hdr: true } );
```

and default to SDR if HDR isn't supported, like the automatic fallback to WebGL 2?
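For what it's worth, a fallback like that could plausibly key off the standard `dynamic-range` CSS media query. A minimal detection sketch (not actual three.js API):

```javascript
// Sketch of HDR display detection via the standard CSS media query
// from Media Queries Level 5; returns false outside a browser context.
function displaySupportsHDR() {

	if ( typeof window === 'undefined' || typeof window.matchMedia !== 'function' ) return false;

	return window.matchMedia( '(dynamic-range: high)' ).matches;

}
```

A hypothetical `hdr: true` option could then silently fall back to an SDR configuration when this returns false, much like the automatic WebGL 2 fallback.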

@donmccurdy commented:
I don't feel that a boolean HDR on/off flag would be the right approach. We should really provide the option to use a high-precision drawing buffer, whether or not the device supports HDR. And we may want to provide the option to render into an HDR canvas for export or baking purposes, on devices that support HDR canvas but don't currently have an HDR display connected. So I do prefer the API suggested in #29573 (comment).

This has the added benefit of not locking us into Extended sRGB, which I feel would be a huge mistake.

@donmccurdy commented:
Here's a small PR to provide an outputType parameter, which can be used independently of HDR rendering:

#30320

@mrdoob mrdoob modified the milestones: r173, r174 Jan 31, 2025
@mrdoob mrdoob modified the milestones: r174, r175 Feb 27, 2025
@mrdoob mrdoob modified the milestones: r175, r176 Mar 28, 2025
@mrdoob mrdoob removed this from the r176 milestone Apr 24, 2025
RenaudRohlinger commented Jul 26, 2025

@donmccurdy I also had to implement ExtendedLinearSRGBColorSpace to support PostProcessing, because WebGPU requires a float16 format for HDR. The render method of PostProcessing was forcing a LinearSRGBColorSpace, so the canvas would fall back to the default BGRA8Unorm, which breaks compatibility with WebGPU.

From what I understand I think that we can't simply resolve the final render to float16 while keeping framebuffers in BGRA8Unorm, since that conflicts with the initially configured WebGPU context, which now uses float16 with extended colorSpace (and would throw an error). Also, reconfiguring the context at runtime is not recommended and potentially problematic.

donmccurdy commented Jul 27, 2025

Hm, there's at least one pre-existing trouble spot, which isn't this PR's fault, but I'm not sure if we can do the new "ExtendedLinearSRGBColorSpace" constant:

  1. We shouldn't be hard-coding LinearSRGBColorSpace in PostProcessing.js, the default should be ColorManagement.workingColorSpace (which is configurable)
  2. Having output color space configuration on our working color space is conceptually a bit backwards, LinearSRGBColorSpace should be a valid working color space even with an HDR output color space

> The render method of PostProcessing was forcing a LinearSRGBColorSpace and so the canvas would fallback to a default BGRA8Unorm...

The use of an 8-bit or 16-bit drawing buffer should only depend on renderer.outputType at this point — output to LinearSRGBColorSpace with a 16-bit drawing buffer should be fine, even without any of these HDR changes. Is that not working as expected? Or is the issue more that we're changing between two outputColorSpace settings with different toneMapping.mode configurations?
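The rule being described ("buffer depth follows outputType only, not the color space") can be sketched as a pure decision function. This is purely illustrative, not three.js internals; `1016` is the value of `THREE.HalfFloatType` in current builds (an assumption here), and the strings are standard WebGPU texture formats:

```javascript
// Illustrative only: maps renderer.outputType to a canvas format.
// 1016 mirrors the value of THREE.HalfFloatType (assumption).
const HalfFloatType = 1016;

function drawingBufferFormat( outputType ) {

	// A 16-bit float buffer is used only when explicitly requested;
	// the outputColorSpace setting has no influence on buffer depth.
	return outputType === HalfFloatType ? 'rgba16float' : 'bgra8unorm';

}
```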

RenaudRohlinger commented Jul 28, 2025

I see, makes sense! I removed LinearSRGBColorSpace and made use of outputType instead. Should be good now.

Users will need to be aware that they must set renderer.outputType = THREE.HalfFloatType in order for HDR to work in that case.
I was trying to handle outputType automatically based on the toneMappingMode, but with your comment I understand that those are two separate things.

```js
// Updated example usage:
import { ColorManagement } from 'three';
import { ExtendedSRGBColorSpace, ExtendedSRGBColorSpaceImpl } from 'three/addons/math/ColorSpaces.js';

ColorManagement.define( { [ ExtendedSRGBColorSpace ]: ExtendedSRGBColorSpaceImpl } );

renderer = new THREE.WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```

In another PR I suggest we add a very basic example to demonstrate the difference with HDR, such as a plane with a basic material, or a cube on a transparent canvas with a white or black background. Something like:
https://ccameron-chromium.github.io/webgpu-hdr/example.html

donmccurdy commented Jul 28, 2025

Thanks @RenaudRohlinger! I'll do some testing this week but the implementation looks good to me now. I'm hoping that the WebGPU spec will also add support for the Rec. 2100 HLG color space as an alternative to Extended sRGB, which could address my concerns in #29573 (comment). In that case we'd potentially have a choice of float16 or rgb10a2unorm as the output type when using HDR-capable displays.

And of course using float16 without an HDR output color space is also an option, and would improve image quality in some cases.

@donmccurdy donmccurdy modified the milestones: r179, r180 Jul 28, 2025
@donmccurdy commented:
Let's merge this shortly after the r179 release, so it's available in r180? Just want to make sure we have time to include tone mapping.

RenaudRohlinger commented Jul 29, 2025

@donmccurdy While doing your tests this week, if you have a chance, could you look into an issue I'm running into regarding Display P3 support in WebGPUBackend?

When configuring the WebGPU context via this.context.configure(...), I expected colorSpace: this.renderer.outputColorSpace to be sufficient but P3 output doesn't seem to work correctly for the textures, unlike in WebGLBackend where everything works fine.

As far as I can tell, WebGPU doesn't have (nor need) the equivalent of

```js
_gl.pixelStorei( _gl.UNPACK_COLORSPACE_CONVERSION_WEBGL, unpackConversion );
```

which WebGL uses to handle colorspace conversion for textures. So I’m wondering:

Is full P3 texture support something we need to explicitly handle in ColorSpaceNode or another TSL-side abstraction?

Could this limitation be related to how color transforms or EOTF/OETF are currently applied in the TSL pipeline?

I tried adding the following in WGSLNodeBuilder:

```js
needsToWorkingColorSpace( texture ) {

	// Skip if texture has no color space
	if ( texture.colorSpace === NoColorSpace ) return false;

	// Skip if already in working color space
	if ( texture.colorSpace === ColorManagement.workingColorSpace ) return false;

	// Otherwise convert
	return true;

}
```

which is used to call convertColorSpace for textures but this doesn’t seem to be enough. Maybe I’m missing a step in how P3 textures should be handled on upload or in shader output in WebGPU? My understanding of color management is still pretty limited, so any insights on how this should be wired up in WebGPU vs WebGL would be much appreciated.

Thanks! 🙏

donmccurdy commented Jul 29, 2025

@RenaudRohlinger I'll try to take a closer look, but on first reading this might be related to WebGPU limitations in:

Ideally we'd configure GPUTextureDescriptor resources to unpack into THREE.ColorManagement.workingColorSpace, but I believe WebGPU only supports that for video textures currently, with more on the roadmap. Failing that, we could do conversions on CPU (somewhat slow) or in TSL (somewhat inaccurate for mipmaps and interpolation).

The situation is inverted in WebGL: WebGL handles unpacking for image textures but not video textures, which we have to deal with in the shader. I think needsToWorkingColorSpace is currently handling only the WebGL scenario, with conversion disabled for non-video textures. Since the code is already there, I suppose we might as well handle WebGPU textures similarly (if only temporarily).
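The inverted responsibilities described above can be summarized in a small decision function (purely illustrative, not actual three.js code):

```javascript
// Which backend needs an in-shader color space conversion for a given
// texture kind, per the discussion above: WebGL unpacks image textures
// natively but not video; WebGPU currently handles only video.
function needsShaderConversion( backend, isVideoTexture ) {

	if ( backend === 'webgl' ) return isVideoTexture;
	if ( backend === 'webgpu' ) return ! isVideoTexture;

	throw new Error( `Unknown backend: ${ backend }` );

}
```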

@donmccurdy left a comment:

Minor — I might prefer to keep the 'linkedparticle' example as-is for now. Because we don't have tone mapping available yet, there's some boundary structure (blobbiness?) on denser particles that I don't think is an intended look in the final image. Unsure how to screenshot that.

LGTM to merge though, with or without reverting the example. I can follow up with another example and an attempt at the tone mapping.

Thanks again @RenaudRohlinger!!

@sunag sunag merged commit 84aa697 into mrdoob:dev Aug 6, 2025
9 checks passed
@RenaudRohlinger (author) commented:
Thanks for the review! I noticed that if the renderer's outputType is not set to HalfFloatType, WebGPU currently fails silently.
I suggest adding a warning to make this issue more explicit. I'll address it in a separate PR.

@hybridherbst commented:
While trying to get my older HDR example to work, I wondered:
How, in the new TSL/WebGPU world, do I do the equivalent of "patching the tonemapping ShaderChunk"?

Mugen87 commented Aug 20, 2025

The following approach should be still valid: #28957 (comment)

hybridherbst commented Aug 20, 2025

Thanks @Mugen87. For me that's not really patching, though; it's replacing a bigger part of the pipeline and hoping to match whatever the previous behaviour was. I don't think that's ideal: something that previously was simple (like patching AgX to not clamp the output, just for testing) is now really complex... Are there alternative approaches? I could imagine that a callback, which somehow receives the nodes and a context of where in parsing we are, might be useful for many use cases.

Mugen87 commented Aug 20, 2025

The previous approach was hardly ideal, since patching raw shader code was error-prone and fragile.

To me, it's a clearer workflow to create a custom node for custom tone mapping and then apply it to the renderer pipeline. If the post-processing approach is too complex for you, we could investigate making ToneMappingNode more flexible and allowing custom tone mapping functions at that level.

hybridherbst commented Aug 20, 2025

It's not just about tone mapping: imagine wanting to patch something into the lighting equations, camera matrix handling, a new environment mapping method, or other details. I do agree that patching is error-prone and fragile, but it works at all without having to modify three.js itself. It would be great to be able to say "I know what I'm doing, override this node implementation with that one".

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

10 participants