Forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - eastonmbio

Pages: [1]
1
Python and Java API / Re: Aligning regions unexpectedly changes the original
« on: September 20, 2023, 12:26:17 PM »
Thank you for your reply, Alexey. I'm not sure why the original appeared to change, but it seems to have been some kind of visual glitch; the region for the reference chunk was in fact preserved.

2
Python and Java API / Aligning regions unexpectedly changes the original
« on: September 13, 2023, 11:06:15 AM »
Hi there,

I'm using Metashape v1.7.6. We do repeat surveys of the same area each year. Permanent markers are used to create a transformation matrix that helps us align models, and the following code aligns the bounding boxes to ensure that the models cover an identical area from year to year:

Code: [Select]

import Metashape

doc = Metashape.app.document
chunk = doc.chunks[0]
region = chunk.region


T0 = Metashape.Matrix.diag((1, 1, 1, 1))
if chunk.transform != None:
    T0 = chunk.transform

R0 = region.rot
C0 = region.center
s0 = region.size

C0.size = 4
C0.w = 1

chunk = doc.chunks[1]

if chunk.transform != None:
    T = chunk.transform.inv() * T0
else:
    T = T0

R = Metashape.Matrix( [[T[0,0],T[0,1],T[0,2]], [T[1,0],T[1,1],T[1,2]], [T[2,0],T[2,1],T[2,2]]])

scale = R.row(0).norm()
R = R * (1/scale)

region.rot = R * R0
c = T * C0
c = c / c[3] / 1.
c.size = 3
region.center = c
region.size = s0 * scale / 1.

chunk.region = region
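As an aside, the scale-recovery step in the script (taking the norm of the first row of the rotation part) can be sanity-checked in plain Python. This is a standalone illustration, not Metashape code:

```python
import math

# The 3x3 upper-left block of a similarity transform is (scale * rotation),
# so the scale can be recovered as the norm of any row (here row 0),
# mirroring the R.row(0).norm() step in the script above.
R = [[0.0, 2.0, 0.0],
     [-2.0, 0.0, 0.0],
     [0.0, 0.0, 2.0]]   # rotation by 90 degrees about z, scaled by 2

scale = math.sqrt(sum(v * v for v in R[0]))
print(scale)  # 2.0

# dividing by the scale leaves a pure rotation
Rpure = [[v / scale for v in row] for row in R]
```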


The only issue appears to be that the reference chunk's region changes dramatically when the code runs. The region for time point 2 aligns perfectly to the dimensions of time point 1's region, but then the original region changes. Is there something in the code that may be causing chunks[0]'s region to change? This code was originally designed to work with PhotoScan.

Thanks

3
Hi Matt,

If you are using a version of Metashape before 1.7.4, I would highly recommend downloading a newer one. In my experience, 1.7.6 (still downloadable now) produces more than satisfactory results. The syntax differs somewhat from what you posted, which is why I ask. In 1.7.4 and later, some of the code structure was altered to reflect changes in how paths are handled, and it seems to deal with these tasks much better now. Below is an example of some code that duplicates the chunk via network processing.

Code for sending the job to the network is:

Code: [Select]
import Metashape

app = Metashape.app
docpath = app.document.path
doc = Metashape.Document()
chunk = Metashape.app.document.chunk


network_server = 'my_server' #The same name as the host name in the network monitor

Metashape.app.settings.network_path = 'Z:/'  #The drive that is mapped to the network

client = Metashape.NetworkClient()

doc.open(docpath, read_only=False, ignore_lock=True)
# save latest changes
doc.save()

tasks = []  # create task list

chunk = doc.chunk


task = Metashape.Tasks.RunScript()
task.path = "Z:/scripts/EcoRRAP/PythonScripts/Test_scripts/Resize_Dupe.py"
tasks.append(task)



# convert task list to network tasks
network_tasks = []
for task in tasks:
    if task.target == Metashape.Tasks.DocumentTarget:
        network_tasks.append(task.toNetworkTask(doc))
    else:
        network_tasks.append(task.toNetworkTask(chunk))

client = Metashape.NetworkClient()
client.connect(app.settings.network_host)  # server ip
batch_id = client.createBatch(docpath, network_tasks)
client.resumeBatch(batch_id)

Metashape.app.messageBox("Tasks have been sent to the network. Please reopen this project without saving, and it will display the progress of the jobs you have just sent.")


The script that this opens is as follows:

Code: [Select]
# Set duplicate to False if you don't want to back up the sparse cloud

############
duplicate = True
############


import Metashape as ms
import math
import sys


app = ms.app
docpath = app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

doc.open(docpath, read_only=False, ignore_lock=True)
chunk = ms.app.document.chunk



chunk = doc.chunks[0]


# duplicate chunk to preserve source
if duplicate is True:
    chunk_label = chunk.label  # create reference to source chunk
    chunk.copy()  # now duplicate the source chunk
    chunks = ms.app.document.chunks  # update reference to chunks
    # set reference to the duplicated chunks since we will be working with this chunk:
    dupeChunk = chunks[-1]  # the duplicate is appended last
    if dupeChunk in chunks:
        dupeChunk.label = str(chunk_label) + " -PreCleanUnFiltered"  # rename dupe chunk


doc.save()

Hope this helps.

4
Hi Alexey,

Thank you for your helpful response. I adjusted some of the syntax in the code and it worked.

For my own record, am I right in thinking the following? If my code opens the current document with the above syntax, and a script that also opens the current document runs as part of a network process after some other network-processing action has already been performed, it should open that document AFTER those actions have been performed, and not in its "old" state, even though there is no explicit 'save' command in the network processing syntax? The path of the project does not vary between steps; it opens and saves the document without any change.

I believe the problem before was that RunScript was opening the original version of the .psx file, overwriting any previous network processing that had happened earlier in the script. I am not sure exactly which change in my syntax or workflow made it work correctly; I'm just looking to understand how Metashape deals with these sorts of tasks.

Thank you.

5
We are having issues integrating the RunScript command into the workflow, since it does not appear to mix well with tasks that are supported by the usual network processing Python commands. I am able to perform camera alignment using this syntax without any issue, but when I add code to the end that opens the project, performs some required action, then saves the project, the code executes correctly, but against the original project, as if the cameras had never been aligned (i.e. there are no tie points). Can you assist with the correct syntax for integrating RunScript into the workflow?

We are using the following syntax to point to the correct script:

Code: [Select]
task = Metashape.Tasks.RunScript()
task.path = "Z:/scripts/PythonScripts/Test_scripts/Perform_action.py"
tasks.append(task)

The code pointed to by RunScript imports the project in the following way before performing the required action:

Code: [Select]

import Metashape as ms


app = ms.app
docpath = app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

chunk#.dosomething

doc.save()




6
Thank you, we will look in to this.

7
Hi,

I am trying to use network processing to repeat the same tasks over 4 different chunks all living in the same project. Clearly there is an error in how my code is structured, and I'm looking to see where I have gone wrong. I am building meshes using depth maps as source data. All 8 of the tasks are sent off and processed, but on reopening the project, only the depth maps and mesh from the final chunk are there.

Code: [Select]
import Metashape as ms

app = ms.app
docpath = ms.app.document.path
doc = ms.Document()
chunk = ms.app.document.chunk

doc.open(docpath, read_only = False, ignore_lock=True)


doc.save()

network_server = 'metashape-qmgr.myinstitution.gov.au'

ms.app.settings.network_path = 'Z:/'



client = ms.NetworkClient()

tasks = []  # create task list

# repeat the same two tasks for each of the four chunks
for chunk in doc.chunks[:4]:

    task = ms.Tasks.BuildDepthMaps()
    task.downscale = 4
    task.filter_mode = ms.MildFiltering
    task.reuse_depth = False
    task.max_neighbors = 40
    task.max_workgroup_size = 100
    tasks.append(task)

    task = ms.Tasks.BuildModel()
    task.source_data = ms.DepthMapsData
    task.surface_type = ms.Arbitrary
    task.interpolation = ms.EnabledInterpolation
    task.face_count = ms.FaceCount.HighFaceCount
    task.vertex_colors = True
    task.vertex_confidence = True
    task.volumetric_masks = False
    task.keep_depth = True
    task.trimming_radius = 10
    task.subdivide_task = True
    task.workitem_size_cameras = 20
    task.max_workgroup_size = 100
    tasks.append(task)
# --------------------------
# --------------------------
# ------------------------------------------------------------------------------

# DONE

# convert task list to network tasks
network_tasks = []
for task in tasks:
    if task.target == ms.Tasks.DocumentTarget:
        network_tasks.append(task.toNetworkTask(doc))
    else:
        network_tasks.append(task.toNetworkTask(chunk))


client = ms.NetworkClient()
client.connect(app.settings.network_host)  # server ip
batch_id = client.createBatch(docpath, network_tasks)
client.resumeBatch(batch_id)

I am assuming there is an issue with how I am saving the document, but I'm not sure how to correct it.
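In case it is relevant, I also wonder whether the conversion loop is the problem: every chunk-level task is converted with whatever `chunk` holds at conversion time, which by that point is the last chunk. Keeping each task paired with its chunk while building the list would avoid that. A stand-in sketch of the pairing pattern I mean (plain Python with strings in place of real Metashape task and chunk objects):

```python
# Pair each task with the chunk it targets, so the later conversion
# step binds the network task to the correct chunk (hypothetical
# stand-ins; real code would store (task, chunk) tuples the same way).
tasks = []
for chunk in ["chunk0", "chunk1", "chunk2", "chunk3"]:
    for task in ("BuildDepthMaps", "BuildModel"):
        tasks.append((task, chunk))

# conversion: use the stored chunk, not a single loop variable
network_tasks = [f"{task}@{chunk}" for task, chunk in tasks]
print(network_tasks[0])   # BuildDepthMaps@chunk0
print(network_tasks[-1])  # BuildModel@chunk3
```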

Thanks

8
Python and Java API / Re: Large holes in model after updating to 1.8.1
« on: February 13, 2022, 06:30:12 AM »
I think it may yield the results we are looking for. We don't require dense clouds for our purposes, so my research group is currently trialling depth-map-based meshing. Thanks for the response. It would be nice if we could figure out the holes using the dense clouds, though, just from the perspective of people who want to understand the software as much as possible. I have a feeling it's the max neighbours setting, but I'm looking to have this confirmed, as it's the only thing that seems to have changed by default after the update.

9
Python and Java API / Large holes in model after updating to 1.8.1
« on: February 09, 2022, 04:01:19 AM »
We are using large photosets (2000-4000) to build scans of coral reefs in situ. When processing on v1.7.5 we had no issue (we were aware of the approximate maximum project sizes before processing would fail and adjusted the workflow accordingly). After updating to Metashape 1.8.1, with the goal of using the split-in-chunks script to process larger photosets more quickly, we found that regardless of whether we used this script or processed plots as one large chunk, large holes appeared in both the dense cloud and mesh - the reconstructed scene bears no resemblance to the true scene. None of the settings were changed (alignment quality, number of photos, dense cloud settings, and mesh face count were identical to before).

Is there a list of settings that have been automatically changed to speed up processing between versions?

10
Python and Java API / Rotation of object to plane of best fit
« on: December 02, 2021, 12:52:51 PM »
Our group is monitoring sections of coral reef using photogrammetry scans, part of which requires orthomosaics to track 2D coral growth over time. We are trying to figure out how best to ensure that the top XY view of the reef is as consistent as possible across the time series. I'd like to know, first of all, how the top-down view of the object is automatically calculated. Our data acquisition is primarily nadiral, which I think is important, but there are some oblique camera angles too, which I assume change the automatic projection. My assumption is that the top of the object is calculated using the average viewing angle of all the cameras in a scene.

Ideally, the top-down view should be perpendicular to the plane of best fit of the reef slope - is there any way to rotate the object to this view using Python? Our coded markers are not permanent features and are not always parallel to the slope, which I think would prevent us from using a marker-based projection in the orthomosaic.
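In case it helps frame the question, this is the sort of plane fit I have in mind, sketched in plain Python (a least-squares plane z = ax + by + c; in practice the points would come from the model vertices or dense cloud, and this is only an illustration, not Metashape code):

```python
# Least-squares plane z = a*x + b*y + c through a set of (x, y, z) points.
# The plane normal, proportional to (-a, -b, 1), would be the target "up"
# direction for a top-down view.
def fit_plane(points):
    # accumulate the normal-equation sums for the unknowns [a, b, c]
    sxx = sxy = syy = sx = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1
        sxz += x * z; syz += y * z; sz += z

    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # solve the 3x3 system by Cramer's rule
    d = det3(A)
    coeffs = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = rhs[r]
        coeffs.append(det3(Ai) / d)
    return coeffs  # [a, b, c]

pts = [(0, 0, 0.0), (1, 0, 0.5), (0, 1, 1.0), (1, 1, 1.5)]
a, b, c = fit_plane(pts)
print(round(a, 6), round(b, 6), round(c, 6))  # 0.5 1.0 0.0
```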

11
Thank you Alexey. I have attempted to integrate the RunScript task into a test version of my code, shown here:

Code: [Select]
import Metashape, sys

root = "Z:/"
path = sys.argv[1]

doc = Metashape.Document()
doc.open(path) #loading existing project using relative path from the root
chunk = doc.chunk #active chunk of the project

client=Metashape.NetworkClient()

tasks = []


task = Metashape.Tasks.RunScript() #Reducing overlap as network task
task.code = 'import Metashape\ndoc = Metashape.Document()\ndoc.open("Z:/projects/pilot_testing/PythonTesting.psx", ignore_lock = True)\nchunk = doc.chunk\nchunk.reduceOverlap(overlap=4)\ndoc.save()\n'
tasks.append(task)

#converting tasks to network tasks
network_tasks = []
for task in tasks:
    if task.target == Metashape.Tasks.DocumentTarget:
        network_tasks.append(task.toNetworkTask(doc))
    else:
        network_tasks.append(task.toNetworkTask(chunk))



client.connect('my.server.here')
batch_id = client.createBatch(path[len(root):], network_tasks)
client.resumeBatch(batch_id)
print("Reducing overlap ... Check the network monitor now")


However, the document fails to open and I am presented with the error message :

2021-07-06 21:23:17   File "<string>", line 3, in <module>
2021-07-06 21:23:17 OSError: Can't open file: No such file or directory (2): Z:/projects/pilot_testing/PythonTesting.psx
2021-07-06 21:23:17 Error: Can't open file: No such file or directory (2): Z:/projects/pilot_testing/PythonTesting.psx

I know that the path to the file in the RunScript command is correct, since I copied it directly from the Metashape GUI.

I am aware that in the example you mentioned, the user was directed to a slightly different code structure in the RunScript task, involving the command

doc.open(Metashape.app.settings.network_path + "' + project_path + '", ignore_lock = True)

to open the document. However, when I alter my code structure to be reflective of this, I am presented with a similar error message, and the document also fails to open.

I am unsure why, and any help would be appreciated, as this would significantly reduce the time a user needs to spend at a desk for processing. I see in your previous post that the code is intended to work with the 1.7.4 pre-release. I am using 1.7.2 build 12070, so I am certain there are just some issues with specifying the file paths correctly. Within the current structure, it is no issue for me to send off the "normal" network processing jobs (analyze photos, align photos etc.). I just cannot make the RunScript process open the file correctly, whether I specify the file location directly or define the path earlier in the script.
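For reference, the relative-path computation used when submitting the batch (`path[len(root):]` in my script above) can be sketched in plain Python; the server is expected to resolve the result against its own network root, which is my assumption here:

```python
root = "Z:/"
path = "Z:/projects/pilot_testing/PythonTesting.psx"

# strip the mapped-drive root so the project path is relative to the
# network root the server knows about
relative = path[len(root):]
print(relative)  # projects/pilot_testing/PythonTesting.psx
```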

12
I'm looking to execute an external script as a network task using the following tester code:

Code: [Select]
import Metashape

path = Metashape.app.getOpenFileName("Specify path to the PSX document:")
root = Metashape.app.getExistingDirectory("Specify network root path:")

doc = Metashape.Document()
doc.open(path)
chunk = doc.chunk

client=Metashape.NetworkClient()

network_tasks = list()

task = Metashape.Tasks.RunScript()
task.path = "C:/path/to/myscript.py"

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)


client.connect('my.server.name')
batch_id = client.createBatch(path[len(root):], [network_tasks] )
client.resumeBatch(batch_id)
print("Job Started...")


The code referred to in the path just imports Metashape. However, when running the script, I get the error message "TypeError: Task object expected". Clearly what I am passing to the server is not recognised as a task. I am using version 1.5.1. I'm not sure if my code structure needs to change to accommodate network processing of the external script.
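One thing I notice (this is only a guess, illustrated with stand-in code rather than the real Metashape API): `network_tasks` is already a list, and `createBatch(path[len(root):], [network_tasks])` wraps it in another list, so the server would see a list where it expects a task. A minimal sketch of how that produces a "Task object expected" style error:

```python
network_tasks = ["task_a", "task_b"]   # stand-ins for NetworkTask objects

def create_batch(path, tasks):
    # a server-side check of the sort that could raise "Task object expected"
    # (hypothetical; strings stand in for NetworkTask instances)
    for t in tasks:
        if not isinstance(t, str):
            raise TypeError("Task object expected")
    return 1  # stand-in batch id

create_batch("proj.psx", network_tasks)        # fine: flat list of tasks
try:
    create_batch("proj.psx", [network_tasks])  # list nested inside a list
except TypeError as e:
    print(e)  # Task object expected
```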

13
I'm looking to develop a script for network processing whereby tasks that can be done using network processing are mixed with those that do not fit into the "tasks" classification.

For example, photo alignment (network), then dense cloud generation (network), then dense cloud filtering based on point colour (non-network), and so on. I'm unsure if I am able to do this. I think I would need to send the cleaning tasks to the network as queued jobs, but am unsure how I would go about this. If I were simply to insert code for point cloud cleaning, it would run immediately and execute on a point cloud that did not yet exist. The problem would persist, I think, if I loaded several scripts into the batch processing window to be executed one after the other.

The only solution I can think of is embedding any non-network tasks in a loop that checks whether the necessary point cloud exists before executing the code, waiting an arbitrary amount of time between checks.
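The loop I have in mind would look something like this (plain Python; the commented-out predicate is a hypothetical example of the "does the dense cloud exist yet" check, not tested against the Metashape API):

```python
import time

def wait_for(predicate, poll_seconds=60, timeout_seconds=24 * 3600):
    """Poll until predicate() returns True, sleeping between checks."""
    waited = 0
    while not predicate():
        if waited >= timeout_seconds:
            raise TimeoutError("gave up waiting for the upstream job")
        time.sleep(poll_seconds)
        waited += poll_seconds
    return True

# hypothetical usage: block until the dense cloud exists, then clean it
# wait_for(lambda: doc.chunk.dense_cloud is not None, poll_seconds=300)
wait_for(lambda: True, poll_seconds=0)  # stub predicate for illustration
```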

An example of two sections of code I'd like to integrate is:

Code: [Select]

import Metashape

path = Metashape.app.getOpenFileName("Specify path to the PSX document:")
root = Metashape.app.getExistingDirectory("Specify network root path:")

doc = Metashape.Document()
doc.open(path)
chunk = doc.chunk

client=Metashape.NetworkClient()

network_tasks = list()

task = Metashape.Tasks.MatchPhotos()
task.network_distribute = True
task.downscale = 1
task.keypoint_limit = 40000
task.tiepoint_limit = 4000

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)

task = Metashape.Tasks.AlignCameras()
task.adaptive_fitting = False
task.network_distribute = True

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key,0))
network_tasks.append(n_task)

task = Metashape.Tasks.BuildDepthMaps()
task.downscale = 4
task.filter_mode = Metashape.FilterMode.MildFiltering
task.network_distribute = True

n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)

task = Metashape.Tasks.BuildDenseCloud()
task.network_distribute = True
n_task = Metashape.NetworkTask()
n_task.name = task.name
n_task.params = task.encode()
n_task.frames.append((chunk.key, 0))
network_tasks.append(n_task)


client.connect('metashape-qmgr.aims.gov.au')
batch_id = client.createBatch(path[len(root):], network_tasks)
client.resumeBatch(batch_id)
print("Job Started...")


I would like to integrate this with:

Code: [Select]

dense_cloud = chunk.dense_cloud

dense_cloud.selectPointsByColor(color=[85,170,255], tolerance=35, channels='RGB')
dense_cloud.removeSelectedPoints()


I would like this to be followed by mesh building.

I have seen this question asked before, but no answers were given.

Thanks.
