32 changes: 32 additions & 0 deletions explainer/index.bs

## Bitflags ## {#bitflags}

WebGPU uses C-style bitflags in several places.
(Search `GPUFlagsConstant` in the spec for instances.)
A typical bitflag definition looks like this:

<xmp highlight=idl>
typedef [EnforceRange] unsigned long GPUColorWriteFlags;
[Exposed=Window]
interface GPUColorWrite {
const GPUFlagsConstant RED = 0x1;
const GPUFlagsConstant GREEN = 0x2;
const GPUFlagsConstant BLUE = 0x4;
const GPUFlagsConstant ALPHA = 0x8;
const GPUFlagsConstant ALL = 0xF;
};
</xmp>

This was chosen because there is no other particularly ergonomic way to describe
"enum sets" in JavaScript today.
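In JavaScript, such flags combine with the standard bitwise operators. A minimal sketch (the constants are re-declared here as a plain object so the snippet runs outside a browser; in a page they would come from the `GPUColorWrite` interface above):

```javascript
// Mirror of the GPUColorWrite constants from the IDL above,
// re-declared so this runs without a WebGPU implementation.
const GPUColorWrite = { RED: 0x1, GREEN: 0x2, BLUE: 0x4, ALPHA: 0x8, ALL: 0xF };

// Combine flags with bitwise OR...
const writeMask = GPUColorWrite.RED | GPUColorWrite.ALPHA; // 0x9

// ...and test membership with bitwise AND.
const writesGreen = (writeMask & GPUColorWrite.GREEN) !== 0; // false
```

The resulting value is an ordinary `unsigned long`, which is what makes this shape cheap to pass across the API boundary.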

Bitflags are used in WebGL, which many WebGPU developers will be familiar with.
They also match closely with the API shape that would be used by many native-language bindings.

The closest option is `sequence<enum type>`, but it doesn't naturally describe
an unordered set of unique items and doesn't easily allow things like
`GPUColorWrite.ALL` above.
Additionally, `sequence<enum type>` has significant overhead, so we would have to avoid it in any
APIs that are expected to be "hot paths" (like command encoder methods), causing inconsistency with
parts of the API that *do* use it.
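To illustrate the contrast (this is a hypothetical alternative shape, not part of the actual API): a sequence of enum strings can represent duplicates and ordering, which are meaningless for a set, while the bitflag form is inherently a set and needs no per-call allocation.

```javascript
// Hypothetical sequence<enum>-style mask (NOT the real API shape):
// duplicates and ordering are representable even though a write mask
// is conceptually an unordered set, and each call allocates an array.
const maskAsSequence = ["red", "alpha", "red"];

// Actual bitflag shape: a single integer, set semantics built in.
const maskAsFlags = 0x1 | 0x8; // GPUColorWrite.RED | GPUColorWrite.ALPHA
```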

See also issue [#747](https://github.com/gpuweb/gpuweb/issues/747)
which mentions that strongly-typed bitflags in JavaScript would be useful.


# WebGPU Shading Language # {#wgsl}