Currently it is not possible to display images by using a Texture from an array larger than [8192, 8192], which seems to be a WGPU limitation:
```
Uncaught WGPU error (Unknown):
InvalidDimension(LimitExceeded { dim: Y, given: 9000, limit: 8192 })
```
I see a few potential ways to get around this:
- Create a `LargeImage` class that stitches together multiple `pygfx.Image` objects
- Within `pygfx.Image`, use a single grid geometry with tiled textures, if that's possible?
- Get the indices of the data that are currently visible in the canvas; if the visible region is larger than 8192 in any dimension, subsample the Texture data. If it is a zoomed-in view under 8192, update the Texture data with the full-resolution data.
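The third option could be sketched roughly like this. The helper name and signature are hypothetical (not part of pygfx); it just strides the visible region so the resulting array stays under the limit in both dimensions:

```python
import numpy as np

MAX_TEX = 8192  # WGPU's default max_texture_dimension_2d limit

def texture_data_for_view(data, row_range, col_range):
    """Return an array for the visible region, subsampled to fit MAX_TEX.

    data: full 2D image array
    row_range, col_range: (start, stop) indices currently visible
    Hypothetical helper, not a pygfx API.
    """
    r0, r1 = row_range
    c0, c1 = col_range
    view = data[r0:r1, c0:c1]
    # Stride just enough so each dimension fits under the texture limit;
    # step is 1 (full resolution) when the view is already small enough.
    step_r = max(1, int(np.ceil(view.shape[0] / MAX_TEX)))
    step_c = max(1, int(np.ceil(view.shape[1] / MAX_TEX)))
    return view[::step_r, ::step_c]

full = np.zeros((9000, 9000), dtype=np.float32)

# Zoomed out: the whole 9000x9000 array is strided under the limit
sub = texture_data_for_view(full, (0, 9000), (0, 9000))
assert max(sub.shape) <= MAX_TEX

# Zoomed in under 8192: full-resolution data is returned unchanged
zoom = texture_data_for_view(full, (1000, 3000), (2000, 4000))
assert zoom.shape == (2000, 2000)
```

On pan/zoom events you would recompute the visible indices and write the result back into the Texture data, so the displayed resolution improves as the view narrows.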
Is there a better way?