Author Topic: "Recycling" depth map blocks from interrupted session (~35/50 groups complete)  (Read 1604 times)

skogar

  • Newbie
  • Posts: 10
Hello all,

First, I am a long-time lurker & first-time poster, so I want to say thank you to everyone for your posts, which I have closely followed over the last 4 years. I wouldn't have made it this far without you guys!

I have a question regarding depth map generation. I use Metashape to calculate volume changes due to slight changes in surface elevation (<1 m) over fairly large survey areas situated in complex mountain topography. I typically run a low-resolution model on my PC to evaluate model output using "medium" settings before using a dedicated supercomputing node w/ many cores & GPU support to produce output on "ultra high" settings. Most of the time, this works very well & I am extremely satisfied with the Metashape product!

However, one issue I often run into is hard wall-times on the supercomputer node. For example, depending upon demand in our lab, I may only get 72 hrs or 96 hrs of processing time before my session is terminated. Most of the time, this isn't an issue. However, on some projects, the depth map & dense cloud generation step requires >96 hrs (see attached image). In these cases, my session is terminated before the depth maps can finish generating & I have to start over.

So, my question is: is there any way that I can "recycle" the existing output or begin where I left off previously? If I look at my Metashape directories, I can see where the depth map groups/blocks (not sure about the terminology here...) are stored. If it took me 72 hrs, for example, to reach 80% completion on depth map generation & I have 35/50 depth map groups/blocks complete, is there any way that I can import & reuse those depth maps to save processing time? Technically speaking, I don't see why I should have to start over if the partitions will be the same & the block/group output has already been successfully resolved & "stored" in the project directories. I hope I have explained myself well.

Additional info:
1) I haven't relied upon scripting my projects with Python yet, but I am open to this idea if that is the only way forward.
2) I know I can split up my project into different chunks. I'd like to avoid this if possible.

Thank you for your time & assistance!

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 14813
Hello skogar,

As there's currently no option to pause the processing, close Metashape, and then resume the operation from the same point later, I think you may need to try the network processing feature.

In case there's a stable machine in your lab that wouldn't be rebooted randomly, you can run Metashape Pro as a server there (this doesn't require license activation) and use the supercomputer as a processing node instance. Then, when the time comes for the node machine to be stopped, you just pause the job on the server (via Agisoft Network Monitor) and terminate the node; the next time you have access to the processing machine, run Metashape as a node on it again and resume the operation on the server.

Note that such an approach is applicable only to the tasks that support fine-level task subdivision. Although most of the tasks do, there are still some exceptions, like point-cloud-based (sparse/dense) mesh generation, building the UV layout, and blending texture.

If you cannot rely on a second machine as a server, you can run the server on the very same computer as the processing node, but to terminate it properly you should pause the job, stop the node instance, save the batch list from the server (via Network Monitor) to a JSON file, and only after that stop the server. To resume: run the server, load the JSON file, re-connect the node, and resume the job.
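
For reference, the same submit/pause/resume workflow can also be driven from a script instead of the GUI. The sketch below is only a rough illustration assuming the Metashape 1.6+ Python API (NetworkClient, createBatch/resumeBatch); the server address and the shared project path are placeholders, so please check the API reference for your version before relying on it:

Code:
import Metashape

# Open the project from storage that both the server and the processing nodes can reach.
doc = Metashape.Document()
doc.open("/shared/storage/project.psx")  # hypothetical shared path
chunk = doc.chunk

# Describe the depth maps stage; downscale=1 corresponds to the highest quality setting.
task = Metashape.Tasks.BuildDepthMaps()
task.downscale = 1

# Submit the task to the server as a network batch and start it. The batch can later be
# paused and resumed from Agisoft Network Monitor while the node (the supercomputer
# session) is stopped and restarted.
client = Metashape.NetworkClient()
client.connect("metashape-server.example.org")  # hypothetical server host
# Depending on the network root setting, the project path may need to be given
# relative to that root rather than as an absolute path.
batch_id = client.createBatch(doc.path, [task.toNetworkTask(chunk)])
client.resumeBatch(batch_id)
print("Submitted network batch", batch_id)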

Hope the description is not too complicated, but feel free to comment if you have any difficulties following the instructions.

P.S. Just out of interest, what is the resolution of the images you are working with?
Best regards,
Alexey Pasumansky,
Agisoft LLC

skogar

  • Newbie
  • Posts: 10
Hey Alexey,

Thanks for your response & information!  I will definitely investigate the network processing feature for future jobs, but I'm afraid that won't help me on my timeline right now.

So, are you saying that each of the files in...

__projectname___.files > 0 > 0 > depthmaps.5 > data0.zip (or any in data0:48.zip)

... is useless? There's no way they can be used or re-integrated in future processing runs? They are each >90 MB... It seems like such a waste!

I am mentally preparing myself to run into this issue again in 44 hours and 45 minutes when my supercomputing node walltime expires. Metashape is 40% done with generating depth maps after 1 day, 21:46 elapsed and currently estimates 2 days, 20:25 left.

I am only using 969 cameras with an image resolution of 5464 x 3640.

Thanks again for your time & knowledge on these matters.


#######################################################################################################

Depth reconstruction devices performance:
 - 100%    done by Tesla P100-PCIE-16GB
Total time: 7942.63 seconds

20 depth maps generated in 7957.84 sec
saved depth maps block in 1.25953 sec
loaded depth map partition in 0.312993 sec
Initializing...
Found 1 GPUs in 0.005188 sec (CUDA: 0.000298 sec, OpenCL: 0.004864 sec)
Using device: Tesla P100-PCIE-16GB, 56 compute units, free memory: 16011/16280 MB, compute capability 6.0
  driver/runtime CUDA: 10020/6050
  max work group size 1024
  max work item sizes [1024, 1024, 64]
Using CUDA device 'Tesla P100-PCIE-16GB' in concurrent. (2 times)

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 14813
Hello skogar,

You can re-use existing depth maps if they are shown in the chunk contents of the Workspace pane, but you wouldn't be able to resume the depth map generation and reconstruct only the missing depth maps while preserving those already available in the chunk.
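
In script terms this amounts to something like the sketch below, assuming the 1.6-era Python API where already-computed depth maps are exposed as chunk.depth_maps and buildDenseCloud() consumes them without regenerating (the project path is a placeholder):

Code:
import Metashape

doc = Metashape.Document()
doc.open("project.psx")  # placeholder project path
chunk = doc.chunk

if chunk.depth_maps is not None:
    # A complete depth map set from the previous run is stored in the chunk,
    # so the dense cloud can be built directly from it.
    chunk.buildDenseCloud()
else:
    # Partially finished depth map blocks cannot be resumed; in this case the
    # depth map stage has to be re-run from scratch.
    chunk.buildDepthMaps(downscale=1, filter_mode=Metashape.MildFiltering)
    chunk.buildDenseCloud()

doc.save()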
Best regards,
Alexey Pasumansky,
Agisoft LLC

skogar

  • Newbie
  • Posts: 10
Okay, thank you for confirming! I think this would be a good feature to add in the future.

It would also be nice to be able to only generate depth maps without immediately launching into the dense cloud generation afterwards. I believe I have seen other folks request this as well.
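
For what it's worth, the depth map and dense cloud stages are already separate calls in the Python API, so a scripted job can stop after the depth maps and pick the dense cloud up in a later session. A minimal sketch, again assuming the 1.6-era API, with downscale=1 standing in for the "ultra high" setting and the project path as a placeholder:

Code:
import Metashape

doc = Metashape.Document()
doc.open("project.psx")  # placeholder project path

# Session 1: generate the depth maps only and save the project
# before the walltime limit is reached.
doc.chunk.buildDepthMaps(downscale=1, filter_mode=Metashape.MildFiltering)
doc.save()

# Session 2 (a later job on the node): build the dense cloud from the stored depth maps.
doc.chunk.buildDenseCloud()
doc.save()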

Anyway, I'm grateful for your replies. Take care & stay healthy!