Get device's dedicated GPU memory

Hey,

For performance purposes, I would like to determine the scene content based on the client machine the application is running on. To do that, I would like to get/detect the machine's dedicated GPU memory size prior to loading the scene and hide/show some of the features.

How can I implement this idea?

Thanks

I do not think you can (it is blocked by the browser to avoid security/privacy issues).

@Deltakosh thanks for your fast reply.

Any suggestions on how I can handle this logic of hiding or showing features based on the client machine's capacity? My scene has features that need more memory, so I'm thinking of disabling them when a client accesses the application from a lower-memory device.

I have not used this myself but am considering it in the future: GitHub - pmndrs/detect-gpu: Classifies GPUs based on their 3D rendering benchmark score, allowing the developer to provide sensible default settings for graphically intensive applications. It's more like a compiled list of performance/cards rather than actual testing at runtime.
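For what it's worth, here is a minimal sketch of how detect-gpu could feed that decision. The `getGPUTier` call is the library's documented entry point, but the tier cutoff and the `enableHeavyFeatures` flag below are just placeholders I made up, not anything the library prescribes:

```ts
import { getGPUTier } from 'detect-gpu';

// Hypothetical app-level quality flag; rename to fit your own scene setup.
let enableHeavyFeatures = false;

async function chooseQualityTier(): Promise<void> {
  // getGPUTier() matches the detected GPU against benchmark data and
  // resolves to a result like { tier: 2, isMobile: false, gpu: '...', fps: 60 }.
  const gpuTier = await getGPUTier();

  // Tier 0 = blocklisted/very weak, 1 = low, 2 = medium, 3 = high.
  // Treating tier >= 2 on non-mobile as "capable enough" is an arbitrary
  // cutoff for this sketch; tune it to your own scene's requirements.
  enableHeavyFeatures = gpuTier.tier >= 2 && !gpuTier.isMobile;
}

// Run the check before building the scene, then branch on the flag
// when deciding which heavy features/meshes to load.
chooseQualityTier().then(() => {
  if (enableHeavyFeatures) {
    // load the memory-hungry features
  } else {
    // load the lightweight fallback
  }
});
```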


Here is a quite detailed answer already - How To Test Device Performance to Select Low/Mid/High Resolution Models - #2 by PirateJC
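The general shape of a runtime check like the one discussed in that thread can be sketched library-agnostically: render for a short warm-up window, count frames with `requestAnimationFrame`, and map the average FPS to a quality level. The thresholds and the `QualityLevel` names below are illustrative only, not taken from the linked post:

```ts
type QualityLevel = 'low' | 'mid' | 'high';

// Count frames for a short window and return the average FPS.
function measureAverageFps(durationMs = 2000): Promise<number> {
  return new Promise((resolve) => {
    let frames = 0;
    const start = performance.now();

    const tick = (now: number) => {
      frames++;
      if (now - start < durationMs) {
        requestAnimationFrame(tick);
      } else {
        resolve((frames * 1000) / (now - start));
      }
    };

    requestAnimationFrame(tick);
  });
}

// Map the measured FPS to a quality bucket; the cutoffs are arbitrary.
async function pickQualityLevel(): Promise<QualityLevel> {
  const fps = await measureAverageFps();
  if (fps >= 55) return 'high';
  if (fps >= 30) return 'mid';
  return 'low';
}
```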


@br-matt @labris Thanks for the info.