## Bitflags ## {#bitflags}

WebGPU uses C-style bitflags in several places.
(Search `GPUFlagsConstant` in the spec for instances.)
A typical bitflag definition looks like this:

<xmp highlight=idl>
typedef [EnforceRange] unsigned long GPUColorWriteFlags;
[Exposed=Window]
interface GPUColorWrite {
    const GPUFlagsConstant RED   = 0x1;
    const GPUFlagsConstant GREEN = 0x2;
    const GPUFlagsConstant BLUE  = 0x4;
    const GPUFlagsConstant ALPHA = 0x8;
    const GPUFlagsConstant ALL   = 0xF;
};
</xmp>

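Because the constants are plain numbers, flag values combine and test with ordinary bitwise operators, just as in C. A minimal sketch (the `GPUColorWrite` object below is a stand-in defined for illustration, since the real interface only exists where WebGPU is available):

```javascript
// Stand-in for the GPUColorWrite interface's constants,
// matching the IDL definition above.
const GPUColorWrite = {
  RED:   0x1,
  GREEN: 0x2,
  BLUE:  0x4,
  ALPHA: 0x8,
  ALL:   0xF,
};

// Combine flags with bitwise OR.
const writeMask = GPUColorWrite.RED | GPUColorWrite.BLUE; // 0x5

// Test for a flag with bitwise AND.
const writesRed = (writeMask & GPUColorWrite.RED) !== 0; // true
```

A mask like this is what would be passed where the API expects a `GPUColorWriteFlags` value.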
This was chosen because there is no other particularly ergonomic way to describe
"enum sets" in JavaScript today.

Bitflags are used in WebGL, which many WebGPU developers will be familiar with.
They also match closely with the API shape that would be used by many native-language bindings.

The closest option is `sequence<enum type>`, but it doesn't naturally describe
an unordered set of unique items and doesn't easily allow things like
`GPUColorWrite.ALL` above.
Additionally, `sequence<enum type>` has significant overhead, so we would have to avoid it in any
APIs that are expected to be "hot paths" (like command encoder methods), causing inconsistency with
parts of the API that *do* use it.

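To make the comparison concrete, here is a hypothetical `sequence<enum>`-style write mask (not a real WebGPU API shape), shown purely to illustrate the ergonomic differences:

```javascript
// Hypothetical sequence<enum>-style alternative, for comparison only;
// WebGPU does not actually expose this shape.
const writeMask = ["red", "blue"];

// Membership becomes a linear array scan rather than a single AND:
const writesRed = writeMask.includes("red");

// Nothing in the type prevents duplicates or ordering differences,
// and there is no natural spelling for a constant like ALL:
const duplicates = ["blue", "red", "red"]; // still a valid sequence
```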
See also issue [#747](https://github.com/gpuweb/gpuweb/issues/747),
which mentions that strongly-typed bitflags in JavaScript would be useful.

# WebGPU Shading Language # {#wgsl}