Forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - eastonmbio

Pages: [1]
1
Python and Java API / Aligning regions unexpectedly changes the original
« on: September 13, 2023, 11:06:15 AM »
Hi there,

I'm using Metashape v1.7.6. We do repeat surveys of the same area each year. Permanent markers are used to create a transformation matrix that helps us align the models, and the following code has been implemented to align the bounding boxes so that the models cover an identical area from year to year:

Code: [Select]

import Metashape

doc = Metashape.app.document
chunk = doc.chunks[0]
region = chunk.region


T0 = Metashape.Matrix.diag((1, 1, 1, 1))
if chunk.transform != None:
    T0 = chunk.transform

R0 = region.rot
C0 = region.center
s0 = region.size

C0.size = 4
C0.w = 1

chunk = doc.chunks[1]

if chunk.transform != None:
    T = chunk.transform.inv() * T0
else:
    T = T0

R = Metashape.Matrix( [[T[0,0],T[0,1],T[0,2]], [T[1,0],T[1,1],T[1,2]], [T[2,0],T[2,1],T[2,2]]])

scale = R.row(0).norm()
R = R * (1/scale)

region.rot = R * R0
c = T * C0
c = c / c[3] / 1.
c.size = 3
region.center = c
region.size = s0 * scale / 1.

chunk.region = region
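
For context on the scale step in the code above: in a similarity transform, the norm of any row of the 3x3 rotation block equals the uniform scale, which is what the script recovers before normalising the rotation. A standalone sketch using numpy in place of Metashape.Matrix:

```python
import numpy as np

# Build a 4x4 similarity transform: uniform scale * rotation, plus translation.
theta = 0.3
scale_true = 2.5
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.eye(4)
T[:3, :3] = scale_true * R
T[:3, 3] = [10.0, -5.0, 2.0]

# Recover the scale as the norm of the first row of the 3x3 block,
# then divide it out to get back a pure rotation.
scale = np.linalg.norm(T[:3, :3][0])
R_recovered = T[:3, :3] / scale

print(round(scale, 6))  # 2.5
print(np.allclose(R_recovered @ R_recovered.T, np.eye(3)))  # True
```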


The only issue appears to be that the reference chunk's region changes dramatically on execution of the code. The region for time point 2 aligns perfectly to the dimensions of time point 1's region, but then the original region changes. Is there something in the code that may be causing chunk[0]'s region to change? This code was originally designed to work with PhotoScan.
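
For reference, the symptom is consistent with plain Python aliasing, where two names refer to the same mutable object; a minimal illustration (generic Python, not the Metashape API):

```python
import copy

class Region:
    """Stand-in for a mutable region object with a center attribute."""
    def __init__(self, center):
        self.center = center

chunk0_region = Region(center=[0.0, 0.0, 0.0])

# 'region' is another name for the SAME object, not a copy...
region = chunk0_region
region.center = [5.0, 5.0, 5.0]

# ...so the "original" region has changed too.
print(chunk0_region.center)  # [5.0, 5.0, 5.0]

# Copying into a fresh object first avoids the aliasing.
independent = copy.deepcopy(chunk0_region)
independent.center = [9.0, 9.0, 9.0]
print(chunk0_region.center)  # still [5.0, 5.0, 5.0]
```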

Thanks

2
We are having issues integrating the RunScript command into the workflow, since it does not appear to mix well with the tasks supported by the usual network processing Python commands. I can perform camera alignment with this syntax without any issue, but when I add code to the end that opens the project, performs a required action, and then saves the project, the code executes correctly but on the original project, as if the cameras had never been aligned (i.e. there are no tie points). Can you assist with the correct syntax for integrating RunScript into the workflow?

We are using the following syntax to point to the correct script:

Code: [Select]
task = Metashape.Tasks.RunScript()
task.path = "Z:/scripts/PythonScripts/Test_scripts/Perform_action.py"
tasks.append(task)

The code pointed to by RunScript imports the project in the following way before performing the required action:

Code: [Select]

import Metashape as ms


app = ms.app
docpath = app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

chunk#.dosomething

doc.save()
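
The behaviour is as if the script works on a stale in-memory copy rather than the state the previous task saved to disk. A generic illustration of that difference (plain Python with a temporary file, nothing Metashape-specific):

```python
import os
import pathlib
import tempfile

# Create a file representing the project's on-disk state.
fd, name = tempfile.mkstemp(suffix=".txt")
os.close(fd)
path = pathlib.Path(name)
path.write_text("unaligned")

# Read (and hold) the state before another process updates the file.
stale = path.read_text()

# Meanwhile the file on disk changes (here: simulated alignment results).
path.write_text("aligned")

# A fresh open sees the new state; the held copy does not.
fresh = path.read_text()
print(stale, fresh)  # unaligned aligned
path.unlink()
```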




3
Hi,

I am trying to use network processing to repeat the same tasks over 4 different chunks, all living in the same project. Clearly there is an error in how my code is structured, and I'm looking to see where I have gone wrong. I am building meshes using depth maps as the source data. All 8 tasks are sent off and processed, but on reopening the project, only the depth maps and mesh from the final chunk are there.

Code: [Select]
import Metashape as ms

app = ms.app
docpath = ms.app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

doc.open(docpath, read_only = False, ignore_lock=True)


doc.save()

network_server = 'metashape-qmgr.myinstitution.gov.au'

ms.app.settings.network_path = 'Z:/'



client = ms.NetworkClient()

tasks = []  # create task list

chunk = doc.chunks[0]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size=100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.source_data = ms.DepthMapsData
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[1]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size=100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.source_data = ms.DepthMapsData
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[2]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size=100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.source_data = ms.DepthMapsData
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)

chunk = doc.chunks[3]

task = ms.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = ms.MildFiltering
task.reuse_depth = False
task.max_neighbors = 40
task.max_workgroup_size=100
tasks.append(task)

task = ms.Tasks.BuildModel()
task.source_data = ms.DepthMapsData
task.surface_type = ms.Arbitrary
task.interpolation = ms.EnabledInterpolation
task.face_count = ms.FaceCount.HighFaceCount
task.source_data = ms.DepthMapsData
task.vertex_colors = True
task.vertex_confidence = True
task.volumetric_masks = False
task.keep_depth = True
task.trimming_radius = 10
task.subdivide_task = True
task.workitem_size_cameras = 20
task.max_workgroup_size = 100
tasks.append(task)
# DONE

# convert task list to network tasks
network_tasks = []
for task in tasks:
    if task.target == ms.Tasks.DocumentTarget:
        network_tasks.append(task.toNetworkTask(doc))
    else:
        network_tasks.append(task.toNetworkTask(chunk))


client = ms.NetworkClient()
client.connect(app.settings.network_host)  # server ip
batch_id = client.createBatch(docpath, network_tasks)
client.resumeBatch(batch_id)

I am assuming there is an issue with how I am saving the document, but I'm not sure how to correct it.
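
One restructuring I'm considering, sketched here with plain dictionaries and strings standing in for Metashape task and chunk objects: record which chunk each task belongs to at creation time, so the final conversion loop doesn't silently use whatever the chunk variable last pointed at after the blocks above run:

```python
# Hypothetical stand-ins: dicts for tasks, strings for chunks.
chunks = ["chunk0", "chunk1", "chunk2", "chunk3"]

pairs = []  # (task, chunk) recorded together at creation time
for chunk in chunks:
    pairs.append(({"name": "BuildDepthMaps"}, chunk))
    pairs.append(({"name": "BuildModel"}, chunk))

# The conversion step then targets the chunk saved with each task,
# not whatever 'chunk' happens to be after the loop finishes.
network_tasks = [(task["name"], chunk) for task, chunk in pairs]

print(network_tasks[0])   # ('BuildDepthMaps', 'chunk0')
print(network_tasks[-1])  # ('BuildModel', 'chunk3')
```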

Thanks

4
Python and Java API / Large holes in model after updating to 1.8.1
« on: February 09, 2022, 04:01:19 AM »
We are using large photosets (2,000-4,000 photos) to build scans of coral reefs in situ. When processing on v1.7.5 we had no issues (we were aware of the approximate maximum project sizes before processing would fail and adjusted the workflow accordingly). After updating to Metashape 1.8.1, with the goal of using the split-in-chunks script to process larger photosets more quickly, we found that regardless of whether we used this script or processed plots as one large chunk, large holes appeared in both the dense cloud and the mesh; the reconstructed scene has no resemblance to the true scene. None of the settings were changed (alignment quality, number of photos, dense cloud settings, and mesh face count were identical to before).

Is there a list of settings that were automatically changed between versions to speed up processing?

5
Python and Java API / Rotation of object to plane of best fit
« on: December 02, 2021, 12:52:51 PM »
Our group is monitoring sections of coral reef using photogrammetry scans, part of which requires orthomosaics to track 2D coral growth over time. We are trying to work out how best to ensure that the top XY view of the reef is as consistent as possible across the time series. First of all, I'd like to know how the top-down view of the object is automatically calculated. Our data acquisition is primarily nadiral, which I think is important, but there are some oblique camera angles too, which I assume changes the automatic projection. My assumption is that the top of the object is calculated using the average viewing angle of all the cameras in a scene.

Ideally, the top-down view should be perpendicular to the plane of best fit of the reef slope - is there any way to rotate the object to this view using Python? Our coded markers are not permanent features and are not always parallel to the slope, which I think rules out a marker-based projection for the orthomosaic.
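
For reference, the plane of best fit I mean can be computed from a set of points with an SVD, and a rotation built that takes the plane normal to +Z; a numpy sketch on synthetic points (not the Metashape API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic points scattered on a tilted plane z = 0.5x + 0.2y.
xy = rng.uniform(-1, 1, size=(200, 2))
pts = np.column_stack([xy, 0.5 * xy[:, 0] + 0.2 * xy[:, 1]])

# Plane of best fit: the normal is the right singular vector with the
# smallest singular value of the centred point cloud.
centred = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centred)
normal = vt[-1]
if normal[2] < 0:
    normal = -normal  # orient the normal upwards

# Rotation taking the normal to +Z (Rodrigues' formula).
z = np.array([0.0, 0.0, 1.0])
v = np.cross(normal, z)
c = float(normal @ z)
K = np.array([[0.0, -v[2], v[1]],
              [v[2], 0.0, -v[0]],
              [-v[1], v[0], 0.0]])
R = np.eye(3) + K + K @ K / (1.0 + c)

rotated = centred @ R.T
print(np.allclose(rotated[:, 2], 0, atol=1e-8))  # True: plane is now horizontal
```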

6
I'm looking to execute an external script as a network task using the following test code:

Code: [Select]
import Metashape

path = Metashape.app.getOpenFileName("Specify path to the PSX document:")
root = Metashape.app.getExistingDirectory("Specify network root path:")

doc = Metashape.Document()
doc.open(path)
chunk = doc.chunk

client=Metashape.NetworkClient()

network_tasks = list()

task = Metashape.Tasks.RunScript()
task.path = "C:/path/to/myscript.py"

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)


client.connect('my.server.name')
batch_id = client.createBatch(path[len(root):], [network_tasks] )
client.resumeBatch(batch_id)
print("Job Started...")


The script referred to by the path just imports Metashape. However, when running the script, I get the error message "TypeError: Task object expected". Clearly what I am passing to the server is not recognised as a task. I am using version 1.5.1. I'm not sure whether my code structure needs to change to accommodate network processing of the external script.
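
For reference, a minimal illustration of the kind of mismatch that error could point at (plain Python, hypothetical class name):

```python
class NetworkTask:
    """Stand-in for a task object the server expects."""

network_tasks = [NetworkTask(), NetworkTask()]

# Passing the list itself: every element is a task.
print(all(isinstance(t, NetworkTask) for t in network_tasks))  # True

# Accidentally wrapping it in another list: the single element is
# a list, not a task, which would trip an "object expected" check.
wrapped = [network_tasks]
print(all(isinstance(t, NetworkTask) for t in wrapped))  # False
```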

7
I'm looking to develop a script for network processing whereby tasks that can be run via network processing are mixed with those that do not fit into the "tasks" classification.

For example, photo alignment (network), then dense cloud generation (network), then dense cloud filtering based on point colour (non-network), and so on. I'm unsure whether I am able to do this. I think I would need to send the cleaning tasks to the network as queued jobs, but I'm unsure how to go about this. If I simply inserted the point cloud cleaning code, it would run immediately and execute on a point cloud that did not yet exist. The problem would persist, I think, if I loaded several scripts into the batch processing window to be executed one after the other.

The only solution I can think of is embedding any non-network tasks in a loop routine that checks whether the necessary point cloud exists before executing the code, waiting an arbitrary amount of time before trying again.
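
The loop routine I have in mind would look something like this sketch, where the condition callable and the poll interval are placeholders for the real "does the dense cloud exist yet" check:

```python
import time

def wait_until(condition, poll_seconds=30.0, timeout_seconds=3600.0):
    """Poll 'condition' until it returns True, or give up after a timeout."""
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_seconds)
    return False

# Hypothetical usage: block until the dense cloud exists, then clean it.
attempts = iter([False, False, True])
ready = wait_until(lambda: next(attempts), poll_seconds=0.01, timeout_seconds=5.0)
print(ready)  # True
```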

An example of two sections of code I'd like to integrate is:

Code: [Select]

import Metashape

path = Metashape.app.getOpenFileName("Specify path to the PSX document:")
root = Metashape.app.getExistingDirectory("Specify network root path:")

doc = Metashape.Document()
doc.open(path)
chunk = doc.chunk

client=Metashape.NetworkClient()

network_tasks = list()

task = Metashape.Tasks.MatchPhotos()
task.network_distribute = True
task.downscale = 1
task.keypoint_limit = 40000
task.tiepoint_limit = 4000

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)

task = Metashape.Tasks.AlignCameras()
task.adaptive_fitting = False
task.network_distribute = True

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key,0))
network_tasks.append(n_task)

task = Metashape.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = Metashape.FilterMode.MildFiltering
task.network_distribute = True

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)

task = Metashape.Tasks.BuildDenseCloud()
task.network_distribute = True
n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)


client.connect('metashape-qmgr.aims.gov.au')
batch_id = client.createBatch(path[len(root):], network_tasks)
client.resumeBatch(batch_id)
print("Job Started...")


I would like to integrate this with:

Code: [Select]

dense_cloud = chunk.dense_cloud

dense_cloud.selectPointsByColor(color=[85,170,255], tolerance=35, channels='RGB')
dense_cloud.removeSelectedPoints()


I would like this to be followed by mesh building.

I have seen this question asked before, but no answers were given.

Thanks.
