
WebGLRenderer: LuminanceFormat does not seem to work #30928

@gkjohnson

Description

Using LuminanceFormat with a texture results in a WebGL warning when trying to render, whether with a Float or Uint8 data type. Swapping in RedFormat, however, works.

Is there a non-standard way to use LuminanceFormat? I don't see it used anywhere in the examples.

Reproduction steps

  1. Create a data texture with LuminanceFormat and set "needsUpdate" to true
  2. Try to render with the texture
  3. See that the texture is not rendered, with the following warnings in the console:
[.WebGL-0x10800a89600] GL_INVALID_ENUM: Invalid internal format 0x1909.
[.WebGL-0x10800a89600] GL_INVALID_OPERATION: Level of detail outside of range.
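For reference, the `0x1909` in the first warning is the standard `GL_LUMINANCE` enum value, which suggests the format is being passed through as the internal format. A quick check of the enum values from the WebGL spec:

```javascript
// Standard WebGL enum values (from the WebGL specification).
const GL_LUMINANCE = 0x1909; // the "invalid internal format" in the warning
const GL_RED = 0x1903;       // the format that works when swapped in

console.log( GL_LUMINANCE.toString( 16 ) ); // → 1909, matching the warning
console.log( GL_RED.toString( 16 ) );       // → 1903
```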

Code

const map = new THREE.DataTexture(
  new Uint8Array( [ 255 ] ), 1, 1,
  THREE.LuminanceFormat, THREE.UnsignedByteType,
);
map.needsUpdate = true;

Live example

https://jsfiddle.net/u3zjxk1p/3/

Screenshots

When rendering with LuminanceFormat (this should be white)

Image

When rendering with RedFormat

Image

Version

r175

Device

Desktop

Browser

Chrome

OS

MacOS
