User Frame's data array is 1280*960 even though the depth resolution is 640*480

Hello,
I can get all the data needed to construct and display a user frame from the first 640x480 values, without iterating over the rest of the array. Why is the array size of a user frame 1280x960 if the depth and RGB streams of my Astra are only 640x480? Please help, @olga.kuzminykh

Never mind. The data is stored as int16, so each pixel takes two bytes and the byte array is twice the pixel count. Stupid me.
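For anyone else who trips over this, here is a minimal sketch (the buffer, names, and 640x480 size are assumptions for illustration, not the Astra SDK API) of why a 16-bit-per-pixel frame shows up as twice as many bytes as pixels:

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical dimensions matching the stream discussed above.
    const std::size_t width  = 640;
    const std::size_t height = 480;

    // A raw byte buffer as a sensor SDK might hand it over:
    // two bytes per pixel, so twice as many bytes as pixels.
    std::vector<std::uint8_t> raw(width * height * 2, 0);

    // The same memory viewed as one 16-bit value per pixel.
    const auto* pixels = reinterpret_cast<const std::int16_t*>(raw.data());
    (void)pixels;

    std::cout << "bytes:  " << raw.size() << "\n"       // 614400
              << "pixels: " << width * height << "\n";  // 307200
    return 0;
}
```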

Yes, depth is 16-bit :slight_smile:

He is referring to the User/Index frame, not the depth frame. It is indeed 16-bit.

This is something that puzzled me: why use 16-bit instead of 8-bit like other sensors/SDKs do? Does it store additional information in the upper bits?

We have to run a loop to render the user image instead of doing a simple Array.Copy. An 8-bit format could save a lot of processing time and power, since the user frame is rendered persistently throughout most games.
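To make the cost concrete, this is roughly the per-pixel loop I mean (a sketch under my own assumptions, not SDK code): every value has to be narrowed from 16 to 8 bits individually, where a same-width buffer could just be block-copied.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Narrow a 16-bit user/index frame to an 8-bit buffer, one pixel at a time.
// A memcpy/Array.Copy is not possible because the element widths differ.
std::vector<std::uint8_t> narrowUserFrame(const std::vector<std::int16_t>& userFrame) {
    std::vector<std::uint8_t> out(userFrame.size());
    for (std::size_t i = 0; i < userFrame.size(); ++i) {
        // User IDs are small, so the low byte carries all the information.
        out[i] = static_cast<std::uint8_t>(userFrame[i] & 0xFF);
    }
    return out;
}
```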

The standard format for user frames in most APIs is 8-bit.

It is true that if you want to display the user frame on screen, you need to convert it to a 24- or 32-bit RGB format.
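Something like this sketch, for instance (the palette and RGBA layout are illustrative choices of mine, not anything a particular SDK mandates):

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

// Expand each user ID into a 32-bit RGBA pixel via a small palette.
// ID 0 = background (transparent black); other IDs each get a tint.
std::vector<std::uint8_t> userFrameToRgba(const std::vector<std::int16_t>& ids) {
    static const std::array<std::array<std::uint8_t, 4>, 4> palette{{
        {0, 0, 0, 0}, {255, 0, 0, 255}, {0, 255, 0, 255}, {0, 0, 255, 255}
    }};
    std::vector<std::uint8_t> rgba(ids.size() * 4);
    for (std::size_t i = 0; i < ids.size(); ++i) {
        const auto& c = palette[static_cast<std::size_t>(ids[i]) % palette.size()];
        rgba[i * 4 + 0] = c[0];  // R
        rgba[i * 4 + 1] = c[1];  // G
        rgba[i * 4 + 2] = c[2];  // B
        rgba[i * 4 + 3] = c[3];  // A
    }
    return rgba;
}
```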

But the main use of user frames is as a mask for depth analysis, and for that you really want to keep the format at 8 bits.
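For example, a minimal masking sketch (buffer names, the target-user parameter, and the matching-resolution assumption are mine, not part of any SDK): keep depth only where the user frame reports the user you care about.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Zero out every depth pixel that does not belong to targetUser.
// Assumes the depth and user frames share the same resolution.
std::vector<std::int16_t> maskDepthByUser(const std::vector<std::int16_t>& depth,
                                          const std::vector<std::int16_t>& userIds,
                                          std::int16_t targetUser) {
    std::vector<std::int16_t> masked(depth.size(), 0);
    for (std::size_t i = 0; i < depth.size(); ++i) {
        if (userIds[i] == targetUser) {
            masked[i] = depth[i];  // inside the user silhouette: keep depth
        }
    }
    return masked;
}
```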