Author Topic: Flush memory between batch steps  (Read 3540 times)

Chris_321

  • Newbie
  • Posts: 25
Flush memory between batch steps
« on: June 29, 2017, 07:19:06 AM »
Hello,

Is there a way in the non-Pro version to trigger some kind of garbage collection or flush the memory between batch processing steps, e.g. delete the undo buffer or cache?

PhotoScan seems to eat memory over time, or at least isn't releasing part of what it used for previous steps. That makes sense if I'm going to access those parts again shortly, but when I have a file with several chunks and process them in a row, the baseline RAM usage rises continuously, job after job, ultimately leading to failed jobs. If I close and reopen the file, it uses far less RAM again.

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 14846
Re: Flush memory between batch steps
« Reply #1 on: June 29, 2017, 08:40:05 AM »
Hello Chris_321,

Memory should be properly handled by PhotoScan.

Note that some memory is required to store the current project if you are working with the PSZ project format. To reduce the memory required for the project itself, you can use PSX (PhotoScan Archive).
Best regards,
Alexey Pasumansky,
Agisoft LLC

Chris_321

  • Newbie
  • Posts: 25
Re: Flush memory between batch steps
« Reply #2 on: June 29, 2017, 10:01:26 AM »
Thanks a lot for the quick reply, Alexey!

I'm already using PSX (PhotoScan 1.3.2).

It might well be that this behavior is as intended, but I definitely see the memory footprint grow while the batch process works through the chunks, and that memory is only partially released when the next chunk is processed. For example, after a while I have a 10 GB RAM footprint (just the open scene with the chunks, no job running), while after saving the file and reopening it I have a 3 GB footprint (including Windows etc.). I'll take another look at it today to see whether it happens generally or only with certain jobs.

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 14846
Re: Flush memory between batch steps
« Reply #3 on: June 29, 2017, 10:29:43 AM »
Hello Chris_321,

Can you specify the steps that you are using in the Batch mode, so that we could try to reproduce the problem?

I can also suggest switching the default view in the General Preferences tab to Point Cloud, so that dense clouds and mesh models won't be loaded if they are in the batch processing pipeline.
Best regards,
Alexey Pasumansky,
Agisoft LLC

Chris_321

  • Newbie
  • Posts: 25
Re: Flush memory between batch steps
« Reply #4 on: June 29, 2017, 10:53:59 AM »
I'm building a dense cloud (Ultra High; High would probably be good enough and save some RAM, but that's not the point here) from a pre-existing sparse cloud, then a high-detail mesh and an 8K texture, for 16 chunks (so even small increases add up). I had no problems with the dense clouds, but that could simply be because the overall RAM footprint was still relatively low. Problems then occurred during mesh generation, with some jobs running out of memory that don't fail when I clear the RAM by saving, closing and reopening the scene.

Quote from: Alexey Pasumansky
I can also suggest to switch the default view in the General Preferences tab to Point Cloud, so that dense cloud and mesh models wouldn't be loaded, if they are in the batch processing pipeline.

I'll give that a try!


Chris_321

  • Newbie
  • Posts: 25
Re: Flush memory between batch steps
« Reply #5 on: June 30, 2017, 12:16:39 PM »
It might have been a bit better with the default view switched to sparse cloud and making sure to start in sparse cloud view, but I don't have precise data over time for the older runs to compare against.

I let another batch run with new data/chunks and recorded the RAM usage for the mesh creation phase this time, and less and less RAM is freed with each consecutive job/chunk. It again grew to about a 10 GB footprint from a 3.5 GB one.
I didn't record the other steps since they went through without problems, but that might be because they require less RAM in general.
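
The check described above can be sketched in a few lines of Python; the readings below are placeholders for illustration, not the actual measurements from the run:

```python
# Minimal sketch: given the idle RAM footprint recorded after each
# batch job, check whether the baseline keeps rising, i.e. memory
# is not being fully released between jobs.
# The readings below are placeholders, not actual measurements.

def baseline_rising(idle_footprints_gb):
    """True if every idle reading is higher than the previous one."""
    return all(b > a for a, b in zip(idle_footprints_gb,
                                     idle_footprints_gb[1:]))

readings = [3.5, 5.2, 6.8, 8.4, 10.1]  # GB after jobs 1..5 (placeholder)
print(baseline_rising(readings))  # → True: the baseline keeps growing
```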