
Author Topic: Network processed jobs not behaving as expected

eastonmbio

  • Newbie
  • Posts: 13
Network processed jobs not behaving as expected
« on: May 24, 2022, 02:23:05 AM »
Hi,

I am trying to use network processing to repeat the same tasks over 4 different chunks, all living in the same project. Clearly there is an error in how my code is structured, and I'm looking to see where I have gone wrong. I am building meshes using depth maps as source data. All 8 tasks are sent off and processed, but on reopening the project, only the depth maps and mesh from the final chunk are there.

Code:
import Metashape as ms

app = ms.app
docpath = ms.app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

doc.open(docpath, read_only=False, ignore_lock=True)
doc.save()

network_server = 'metashape-qmgr.myinstitution.gov.au'

ms.app.settings.network_path = 'Z:/'

client = ms.NetworkClient()

tasks = []  # create task list

chunk = doc.chunks[0]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size = 100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[1]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size = 100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[2]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size = 100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[3]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size = 100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)
# ------------------------------------------------------------------------------
# DONE

# convert task list to network tasks
network_tasks = []
for task in tasks:
    if task.target == ms.Tasks.DocumentTarget:
        network_tasks.append(task.toNetworkTask(doc))
    else:
        network_tasks.append(task.toNetworkTask(chunk))


client = ms.NetworkClient()
client.connect(app.settings.network_host)  # server ip
batch_id = client.createBatch(docpath, network_tasks)
client.resumeBatch(batch_id)

I am assuming there is an issue with how I am saving the document, but I'm not sure how to correct it.

Thanks

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 15145
Re: Network processed jobs not behaving as expected
« Reply #1 on: May 24, 2022, 02:13:47 PM »
Hello eastonmbio,

At the end of the script, when you create the network_tasks list, the same chunk value (the last one assigned) is used for every task, so all the jobs end up applied to the final chunk only.

So you should restructure your code to iterate through the chunks: inside the loop, create the required tasks (they are the same for every chunk) and append the corresponding network task elements right away, so that each network task is bound to its own chunk. See the sketch below.
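
Something along these lines should work (a minimal sketch that reuses only the calls from your script; only a few parameters are shown, keep the rest as in your original):

Code:
import Metashape as ms

# open the project from the path of the document currently loaded in the application
docpath = ms.app.document.path
doc = ms.Document()
doc.open(docpath, read_only=False, ignore_lock=True)
doc.save()

network_tasks = []

for chunk in doc.chunks:
    # depth maps for this chunk
    task = ms.Tasks.BuildDepthMaps()
    task.downscale = 4
    task.filter_mode = ms.MildFiltering
    task.max_workgroup_size = 100
    network_tasks.append(task.toNetworkTask(chunk))

    # mesh from the depth maps of this chunk
    task = ms.Tasks.BuildModel()
    task.source_data = ms.DepthMapsData
    task.surface_type = ms.Arbitrary
    task.face_count = ms.FaceCount.HighFaceCount
    task.subdivide_task = True
    task.max_workgroup_size = 100
    network_tasks.append(task.toNetworkTask(chunk))

# send the whole batch to the network server
client = ms.NetworkClient()
client.connect(ms.app.settings.network_host)
batch_id = client.createBatch(docpath, network_tasks)
client.resumeBatch(batch_id)

Since every task here is converted with toNetworkTask(chunk) inside the loop, each pair of depth maps and model jobs is tied to its own chunk rather than to the last one assigned.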
Best regards,
Alexey Pasumansky,
Agisoft LLC

eastonmbio

  • Newbie
  • Posts: 13
Re: Network processed jobs not behaving as expected
« Reply #2 on: May 25, 2022, 11:36:01 AM »
Thank you, we will look into this.

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 15145
Re: Network processed jobs not behaving as expected
« Reply #3 on: May 25, 2022, 11:55:25 AM »
If you have any issues with it, please let me know.
Best regards,
Alexey Pasumansky,
Agisoft LLC