
Author Topic: Cluster Processing - RAM consumption of nodes  (Read 3544 times)

HyperFox

  • Newbie
  • *
  • Posts: 18
Cluster Processing - RAM consumption of nodes
« on: August 31, 2015, 05:59:08 PM »
Hello,

I have a project that ends up in 60 packages (tasks), and the processing on each single node consumes 54 GB of RAM. So the nodes start swapping (Windows or Linux, the behaviour is the same).

Is it possible to define the maximum amount of RAM that a node in the cluster is allowed to use?
Fine-level task distribution is already activated for all steps...
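
What I have in mind is something like a hard cap set around the node process at the OS level, so that allocations fail instead of the node swapping. A rough Python sketch of such a launcher for a Linux node (the photoscan.sh command line, the IP and the 30 GB figure are only placeholders):

Code:
# Hypothetical Linux node launcher: cap the address space of the child
# process so that it fails to allocate instead of pushing the machine into swap.
# The command line below is a placeholder for the actual node start command.
import resource
import subprocess

LIMIT_BYTES = 30 * 1024**3  # assumed per-node cap, adjust to the machine

def apply_cap():
    # RLIMIT_AS limits the total virtual address space of the process
    resource.setrlimit(resource.RLIMIT_AS, (LIMIT_BYTES, LIMIT_BYTES))

subprocess.call(
    ["./photoscan.sh", "--node", "--dispatch", "192.168.0.1"],  # placeholder
    preexec_fn=apply_cap,  # runs in the child right before exec
)

Of course such a cap only turns swapping into a clean failure of the task; it doesn't make the task fit into the available RAM. Is there a setting inside PhotoScan that achieves something similar?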

Best regards
Ansgar

Wishgranter

  • Hero Member
  • *****
  • Posts: 1202
    • Museum of Historic Buildings
Re: Cluster Processing - RAM consumption of nodes
« Reply #1 on: September 01, 2015, 01:00:34 AM »
How many images per project?
----------------
www.mhb.sk

HyperFox

  • Newbie
  • *
  • Posts: 18
Re: Cluster Processing - RAM consumption of nodes
« Reply #2 on: September 01, 2015, 08:00:13 AM »
The project has 1000 oblique images, each 15 MPix.
Splitting into chunks is not an option because of the oblique imagery.
As I mentioned, I tried to process it on the cluster.
With fine-level task distribution, the server creates 60 tasks.
However, each task needs about 54 GB of RAM to process its data.
My cluster nodes have only 32 GB each, so they start swapping.
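
For anyone who wants to check the per-task peak on a node, a small Python/psutil sketch like the one below sums the resident memory of the node processes while a task runs (the process name fragment is an assumption, adjust it to the actual binary name):

Code:
# Rough monitor: sum the resident set size of every process whose name
# contains PROCESS_HINT and report the running peak.  Sketch only.
import time
import psutil

PROCESS_HINT = "photoscan"   # assumed name fragment of the node binary
peak = 0

while True:
    rss = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = proc.info["name"] or ""
        mem = proc.info["memory_info"]
        if PROCESS_HINT in name.lower() and mem is not None:
            rss += mem.rss
    peak = max(peak, rss)
    print("current %.1f GiB, peak %.1f GiB" % (rss / 2**30, peak / 2**30))
    time.sleep(5)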

I also tried switching off the swap file. The result is the same, except that the nodes now crash instead of swapping (which is to be expected).

So my questions are: Does the server take a node's memory into account when it defines the tasks? Can I increase or decrease the number of tasks? Or can I specify a "max memory" per task in cluster processing?

It's not a Windows issue; a Linux setup of my cluster showed the same behaviour...

Wishgranter

  • Hero Member
  • *****
  • Posts: 1202
    • Museum of Historic Buildings
Re: Cluster Processing - RAM consumption of nodes
« Reply #3 on: September 01, 2015, 01:47:15 PM »
Hmm, check your PM or email, we will talk about it...
----------------
www.mhb.sk

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 14840
Re: Cluster Processing - RAM consumption of nodes
« Reply #4 on: September 01, 2015, 01:58:38 PM »
Hello Ansgar,

PhotoScan doesn't take the hardware specification of the individual nodes into account during task distribution. It is also not possible to limit the memory usage on the nodes.
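
If the nodes differ in the amount of RAM, any safeguard has to sit outside PhotoScan, for example a small pre-flight check in whatever script starts the node. A minimal sketch, where the 54 GB threshold, the script name and the dispatch address are only placeholders:

Code:
# Sketch of a pre-flight check: only join the cluster if this machine has
# enough physical RAM for the largest expected task.  Values are placeholders.
import subprocess
import sys
import psutil

REQUIRED_GIB = 54  # expected per-task peak for this project

total_gib = psutil.virtual_memory().total / 2**30
if total_gib < REQUIRED_GIB:
    sys.exit("only %.0f GiB of RAM installed, not joining the cluster" % total_gib)

subprocess.call(["./photoscan.sh", "--node", "--dispatch", "192.168.0.1"])  # placeholder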

At what processing step do you have these problems?
Best regards,
Alexey Pasumansky,
Agisoft LLC

HyperFox

  • Newbie
  • *
  • Posts: 18
Re: Cluster Processing - RAM consumption of nodes
« Reply #5 on: September 01, 2015, 03:06:30 PM »
Hello Alexey,

It is the "filtering depth images" step of the "Build Dense Cloud" command.
The first step is done on the GPU and works pretty well on all nodes.
Afterwards, the filtering is done on the CPU.
Right at the beginning of this process, each node loads its data into RAM (the depth images, I think) and starts swapping due to the high memory consumption (54 GB needed, 32 GB available).
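
A simple way to pinpoint the moment this happens is to sample memory and swap on one node while the filtering stage runs; a small psutil sketch (the 10-second interval is arbitrary):

Code:
# Sample physical memory and swap usage on a node to see exactly when the
# depth image filtering stage exhausts RAM and swapping begins.
import time
import psutil

while True:
    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    print("RAM used %5.1f GiB | available %5.1f GiB | swap used %5.1f GiB"
          % (vm.used / 2**30, vm.available / 2**30, sw.used / 2**30))
    time.sleep(10)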