Hi there, great project and the 🚀 is impressive!
I have one question related to masked layers. Basically, I would like to retrieve an image buffer for the mask data, similar to retrieving the composite of the Psd file or the Layer as a Uint8ClampedArray.

However, it's not clear how to consume this data. I understand that I can retrieve the mask's bounding box by reading the left, right, top, and bottom values of the layer.maskData object, and I can see that there is an ArrayBuffer inside the layer.layerFrame.userMask.data object.

However, the length of that buffer does not match the size of the mask (I expected it to be width * height * 4). My guess was that the buffer contains only the alpha channel or grayscale data, but after expanding it into an RGBA-sized buffer, the rendered result does not reflect the original mask's shape; the pixels are basically all over the place (I've included a rough sketch of this attempt at the end of the issue). I also noticed that the Layer class has the async methods getUserMask and getRealUserMask, which seem to return RGBA-sized values, but once more the rendered data does not reflect the mask's shape.

As such, I'm not sure how to read and render the mask buffer. Is there something obvious I'm overlooking with regard to the mask size?
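For reference, here is roughly what I'm doing. This is only a sketch of my attempt, and it assumes the raw buffer is uncompressed, single-channel, 8-bit grayscale data in row-major order covering exactly the maskData bounding box, which may well be the wrong assumption. layer here stands for a layer object parsed by the library that exposes the maskData and layerFrame.userMask fields mentioned above.

```ts
// `layer` is a parsed layer exposing the fields described above; typed loosely here.
declare const layer: any;

// Mask dimensions derived from the bounding box on maskData.
const maskData = layer.maskData; // expected to carry top, left, bottom, right
const width = maskData.right - maskData.left;
const height = maskData.bottom - maskData.top;

// Raw mask bytes as exposed on the layer frame (assumed single-channel grayscale).
const gray = new Uint8Array(layer.layerFrame.userMask.data);
console.log(gray.length, width * height); // these do not match for me

// Expand the single-channel data into an RGBA buffer so it can be rendered.
const rgba = new Uint8ClampedArray(width * height * 4);
for (let i = 0; i < width * height; i++) {
  const v = gray[i];
  rgba[i * 4 + 0] = v; // R
  rgba[i * 4 + 1] = v; // G
  rgba[i * 4 + 2] = v; // B
  rgba[i * 4 + 3] = 255; // fully opaque so the mask values are visible
}

// Draw the result to a canvas for inspection.
const canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext("2d")!;
ctx.putImageData(new ImageData(rgba, width, height), 0, 0);
document.body.appendChild(canvas);
```

With getUserMask / getRealUserMask I skip the expansion loop and pass the returned buffer straight into ImageData, but the rendered image is equally scrambled.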