Forum

Author Topic: Memory leak? in 1.8 and 1.8.1 on linux  (Read 496 times)

stephan

  • Full Member
  • ***
  • Posts: 129
Memory leak? in 1.8 and 1.8.1 on linux
« on: February 15, 2022, 05:07:18 PM »
Hey there,

I am running a couple of big projects right now with >50000 cameras. To complete them I first do a rough alignment of the complete system, then split the work area into smaller pieces and run my own processing on each chunk. This is all fully automated.

I've been running into an issue, however: my script crashes once it reaches a certain number of iterations / processed chunks, and that threshold seems to be related to the amount of RAM available.

When I run a project on a 32 GB RAM machine, it gradually fills up the available RAM after maybe 20 to 30 chunks, and Photoscan crashes. On a machine with 128 GB of RAM I can do more than 100 chunks, but the RAM still fills up gradually: the first chunk uses about 20 GB while processing the dense cloud, rising to >100 GB in the later chunks.

I am not storing any large arrays or values in RAM for my own processing, and there is nothing else running on this machine... so I suspect Photoscan is not clearing something. I will report here if I find out more.
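For anyone hitting something similar: one stdlib-only way to confirm where the memory goes is to log the process's peak RSS after every chunk and watch whether it keeps climbing. A rough sketch (`process_chunk` is just a placeholder for the real per-chunk pipeline; note `ru_maxrss` is in kilobytes on Linux, bytes on macOS):

```python
import resource

def log_peak_rss(label):
    # Peak resident set size of this process; kilobytes on Linux.
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"{label}: peak RSS {peak_kb / 1024:.1f} MB")
    return peak_kb

def process_chunk(index):
    # Placeholder for the real per-chunk processing.
    pass

for i in range(3):
    process_chunk(i)
    log_peak_rss(f"after chunk {i}")
```

If the peak keeps growing by a similar amount per chunk, the memory is being retained somewhere between iterations rather than spiking transiently.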

Cheers,
Stephan

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 13910
Re: Memory leak? in 1.8 and 1.8.1 on linux
« Reply #1 on: February 15, 2022, 09:20:34 PM »
Hello Stephan,

A script example may be helpful for analysis. If it contains any sensitive information, please send the file to support@agisoft.com for our internal investigation.
Best regards,
Alexey Pasumansky,
Agisoft LLC

stephan

  • Full Member
  • ***
  • Posts: 129
Re: Memory leak? in 1.8 and 1.8.1 on linux
« Reply #2 on: February 17, 2022, 10:49:52 AM »
Hi Alexey,

Unfortunately I cannot share the code, but these are the basic steps:

The first active chunk has a sparse point cloud with 50K cameras aligned.

1. duplicate first active chunk and switch active chunk to the new duplicate
2. split work area into X parts on one axis and Y parts on the other, jump to area Z, and resize the work area to fit this subsection of the main chunk
3. calculate matches
4. build dense PC (Colorized)
5. Make mesh from complete PC
6. calculate camera distances based on GPS coordinates and keep only the X closest cameras (usually 500) -> is this filling up memory?
7. reduce overlap (3)
8. build UV(generic mapping, page count 1)
9. build texture (mosaic blending, fill holes True, ghosting filter True)
10. Classify dense point cloud (based on classes) / export classes to separate PLY files on disk
11. remove keypoints outside of the work area from the chunk (based on distance calculation)
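For what it's worth, step 2 boils down to simple bounding-box arithmetic before the result is applied to the chunk's region. A minimal sketch with a hypothetical helper (row-major tile index, names are mine, not from the actual script):

```python
def tile_bounds(xmin, ymin, xmax, ymax, nx, ny, z):
    """Return (xmin, ymin, xmax, ymax) of tile z in an nx-by-ny
    row-major grid laid over the full work area."""
    col, row = z % nx, z // nx
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    return (xmin + col * dx,
            ymin + row * dy,
            xmin + (col + 1) * dx,
            ymin + (row + 1) * dy)
```

E.g. a 100x100 area split 4x5: tile 0 covers (0, 0, 25, 20), tile 6 covers (50, 20, 75, 40). In the real script these bounds would then be applied via the chunk's region, but the arithmetic is the part worth showing.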

+ there are a number of optional steps that can happen: meshing the exported PC, remerging the tiles, basically building a semantic map at the same time as the main map. But I don't use these all the time, and in this case I have deactivated them all to keep just the basic steps above.
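Step 6 (the one I suspect) is essentially just a sort by distance. A stripped-down version of the idea, with plain tuples standing in for camera objects and straight Euclidean distance instead of proper geodesic distance:

```python
import math

def closest_cameras(cameras, target, n=500):
    """cameras: list of (name, x, y, z) tuples; target: (x, y, z).
    Returns the n cameras nearest to target."""
    def dist(cam):
        _, x, y, z = cam
        return math.dist((x, y, z), target)
    return sorted(cameras, key=dist)[:n]
```

Nothing here should retain memory between chunks on its own; the question is whether the objects the real version touches keep references alive inside the application.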


--> All of this actually runs fine, but I need to restart the process every 2 or 3 days on a computer with 128 GB of RAM because memory gradually fills up as processing continues (specifying in my script to start at chunk X + Z to carry on where it left off).

It's not the end of the world, but it's annoying: the complete process with 50K cameras would take about 10 days if I could just let it run by itself without restarting, but because of these restarts I am losing a few days.
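Until the leak itself is found, the restart can at least be automated: a small driver that launches each batch of chunks in a fresh process, so the OS reclaims whatever the previous batch held when it exits. A rough sketch (the worker script name and the batch arguments are placeholders; in practice the command would be whatever launches the headless processing script, e.g. Metashape's own interpreter rather than plain `sys.executable`):

```python
import subprocess
import sys

def run_batches(total_chunks, batch_size, worker="process_chunks.py"):
    """Run `worker` once per batch in a fresh interpreter, passing it
    the start/stop chunk indices, so leaked memory is returned to the
    OS when each batch's process exits."""
    for start in range(0, total_chunks, batch_size):
        stop = min(start + batch_size, total_chunks)
        result = subprocess.run(
            [sys.executable, worker, str(start), str(stop)])
        if result.returncode != 0:
            raise RuntimeError(f"batch {start}-{stop} failed")
```

This doesn't fix anything, but it turns the manual restarts into part of the pipeline so the 10-day run doesn't need babysitting.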

Cheers,
Stephan
« Last Edit: February 17, 2022, 10:51:31 AM by stephan »