As benchmarking is topical right now with Andyroo's findings, could we get some built-in benchmarking tools?
Something that gives us rough scores for the hardware/OS/settings we're using.
Perhaps some synthetic tests that score each stage of each workflow task, giving separate scores for CPU, GPU, storage, etc.
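Just to illustrate the idea, a minimal sketch of per-stage timing in Python (Metashape's scripting language). The harness itself is generic; the Metashape calls in the comment at the bottom are only an example workflow, not a proposed API for the feature:

```python
import time

def run_benchmark(stages):
    """Time each named workflow stage and return per-stage scores.

    `stages` is a list of (name, callable) pairs; the "score" here is
    simply elapsed wall-clock seconds (lower is better). A real tool
    could also sample CPU/GPU/storage utilisation per stage.
    """
    results = {}
    for name, fn in stages:
        start = time.perf_counter()
        fn()
        results[name] = time.perf_counter() - start
    return results

# Hypothetical usage against a Metashape chunk (illustrative only):
# stages = [
#     ("align",      lambda: (chunk.matchPhotos(), chunk.alignCameras())),
#     ("depth_maps", lambda: chunk.buildDepthMaps()),
#     ("model",      lambda: chunk.buildModel()),
# ]
# print(run_benchmark(stages))
```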
These could be quick tests that take just a few minutes to run, perhaps downloading a 'small' set of sample data to process.
And perhaps a set of 'real-world' datasets too, either ones you already have or ones we could donate data to.
Then we could test a drone scan, a lidar scan, low-res scans, very high-res scans, an underwater scan (with no georef data), a turntable scan, a body scan, and so on.
These all tax Metashape and the hardware in different ways and to differing degrees, so we'd be able to run whichever benchmark best fits our own work.
This would let us see which hardware configs to go with, and also track how Metashape versions differ from each other.