
Author Topic: Performance issues, not using resources properly  (Read 3726 times)

michaldolnik

  • Newbie
  • *
  • Posts: 13
Performance issues, not using resources properly
« on: May 09, 2024, 05:16:31 PM »
We run our Metashape processing pipeline in the AWS cloud on g4dn.4xlarge, g4dn.8xlarge, and g4dn.16xlarge instances.

It looks like the script does not use the resources efficiently...

For example, a g4dn.16xlarge instance is twice as powerful as a g4dn.8xlarge, but the difference in processing time is only about one hour.


We only noticed these differences recently, because we started working with some huge datasets where computation times are very long.

Do you have any hints on performance optimisation? Are we missing something?
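
For context, a minimal sketch of how GPU usage can be configured from a Metashape Python script, using the documented Metashape.app.gpu_mask and Metashape.app.cpu_enable attributes. This is only an illustration of the settings in question, not the actual pipeline script:

import Metashape

# List the GPU devices Metashape detects on this instance
gpus = Metashape.app.enumGPUDevices()
for index, gpu in enumerate(gpus):
    print(index, gpu)

# Enable every detected GPU (gpu_mask is a bitmask: bit i = device i)
Metashape.app.gpu_mask = (1 << len(gpus)) - 1

# With discrete GPUs enabled, keep the CPU out of GPU-accelerated stages
Metashape.app.cpu_enable = False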


Thanks

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 15320
Re: Performance issues, not using resources properly
« Reply #1 on: May 10, 2024, 03:14:26 PM »
Hello michaldolnik,

Can you share the complete contents of the Show Info dialog for the chunk processed on each configuration (or the Processing Parameters pages from the report files), so that we could compare the processing parameters used and the timing?
Best regards,
Alexey Pasumansky,
Agisoft LLC

Neb0skreb

  • Newbie
  • *
  • Posts: 5
Re: Performance issues, not using resources properly
« Reply #2 on: May 22, 2024, 10:12:57 AM »
We have been processing scans this way for several years now, and the best balance between time and cost is g4dn.12xlarge. This holds for datasets of 100 to 10,000 photos, provided the photos are of reasonable resolution (no more than 30 megapixels). If there is a large number of photographs at a higher resolution, for example 61 megapixels, that instance is not enough and all the processes will crash; you need a more powerful machine. For the export step we switch to a cheaper r6id.2xlarge, since a GPU is no longer required at that stage.
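
For illustration, a minimal sketch of what such a GPU-free, export-only step can look like with the Metashape Python API. The paths, products and formats below are placeholders, not actual production settings:

import Metashape

# Export-only step: no GPU required, so it can run on a cheaper CPU instance
doc = Metashape.Document()
doc.open("project.psx", read_only=True)  # placeholder project path
chunk = doc.chunk  # active chunk

# Export whichever products the pipeline builds (placeholder file names)
chunk.exportRaster("orthomosaic.tif", source_data=Metashape.OrthomosaicData)
chunk.exportRaster("dem.tif", source_data=Metashape.ElevationData)
chunk.exportModel("model.obj")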