I'm relatively new to scripting with the Agisoft/Python API. Ultimately I'm trying to loop through multiple projects and build a dense cloud from each of them. My script, as it's written, keeps failing at the depth-maps step. I've tried taking that step out of the loop and it still fails at the same point. I've been through a number of forum posts on this error and can't figure it out. I'm using Agisoft Metashape version 1.8.1:
2022-05-31 08:29:23 Agisoft Metashape Professional Version: 1.8.1 build 13915 (64 bit)
2022-05-31 08:29:23 Platform: Windows
2022-05-31 08:29:23 CPU: Intel(R) Xeon(R) CPU E5-2690 v4 @ 2.60GHz (server)
2022-05-31 08:29:23 CPU family: 6 model: 79 signature: 406F1h
2022-05-31 08:29:23 RAM: 448.0 GB
2022-05-31 08:29:25 OpenGL Vendor: NVIDIA Corporation
2022-05-31 08:29:25 OpenGL Renderer: Tesla M60/PCIe/SSE2
2022-05-31 08:29:25 OpenGL Version: 4.6.0 NVIDIA 472.39
2022-05-31 08:29:25 Maximum Texture Size: 16384
2022-05-31 08:29:25 Quad Buffered Stereo: not enabled
2022-05-31 08:29:25 ARB_vertex_buffer_object: supported
2022-05-31 08:29:25 ARB_texture_non_power_of_two: supported
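For context, the outer loop I'm aiming for is roughly this shape (the folder discovery and the `process_project` callable here are placeholders sketching the idea, not my actual batch code):

```python
import os

def find_image_folders(base_dir):
    """Return subfolders of base_dir that contain at least one .JPG file."""
    folders = []
    for entry in sorted(os.scandir(base_dir), key=lambda e: e.path):
        if entry.is_dir() and any(f.endswith('.JPG') for f in os.listdir(entry.path)):
            folders.append(entry.path)
    return folders

def run_all(base_dir, process_project):
    """Run the per-project pipeline on each image folder, continuing past
    failures so one bad project doesn't kill the whole batch."""
    failures = []
    for folder in find_image_folders(base_dir):
        try:
            process_project(folder)
        except OSError as err:
            failures.append((folder, str(err)))
    return failures
```

The idea is that each iteration would run the full markers-to-dense pipeline (the script below) on one folder, and any folder that errors out gets logged instead of aborting the rest.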
Does anyone have any suggestions?
Here is a snippet of the console output where it fails:
2022-05-29 23:05:21 Depth filtering devices performance:
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 26.727 s = [0% IO ref depths + 8% IO ref masks+ 0% ref size + 0% IO depths + 3% size + 61% IO masks + 0% morpho + 8% voting + 0% final + 10% speckles + 3% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 27.313 s = [0% IO ref depths + 7% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 60% IO masks + 0% morpho + 9% voting + 0% final + 8% speckles + 5% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 29.015 s = [0% IO ref depths + 3% IO ref masks+ 0% ref size + 0% IO depths + 2% size + 54% IO masks + 0% morpho + 12% voting + 0% final + 10% speckles + 13% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 31.794 s = [0% IO ref depths + 6% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 54% IO masks + 0% morpho + 11% voting + 2% final + 11% speckles + 7% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 26.594 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 49% IO masks + 0% morpho + 16% voting + 0% final + 9% speckles + 5% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 26.852 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 2% size + 63% IO masks + 0% morpho + 9% voting + 0% final + 8% speckles + 9% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 32.556 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 4% size + 63% IO masks + 0% morpho + 8% voting + 0% final + 8% speckles + 10% IO]
2022-05-29 23:05:21 - 13% done by GPU Tesla M60 in 29.593 s = [0% IO ref depths + 3% IO ref masks+ 0% ref size + 0% IO depths + 4% size + 61% IO masks + 0% morpho + 10% voting + 0% final + 9% speckles + 10% IO]
2022-05-29 23:05:21 filtering done in 42.996 s = 11% caching + 1% pre-unpacking + 78% processing
2022-05-29 23:05:21 39 depth maps filtered in 43.074 sec
2022-05-29 23:05:24 Finished processing in 9103.56 sec (exit code 0)
2022-05-29 23:05:25 Traceback (most recent call last):
2022-05-29 23:05:25 File "V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/MIRmetashapeprocessing_PartII-MarkersToDense_v4.py", line 266, in <module>
2022-05-29 23:05:25 chunk.buildDepthMaps(downscale = 4, filter_mode = Metashape.ModerateFiltering)
2022-05-29 23:05:25 OSError: Can't replace file or directory: The process cannot access the file because it is being used by another process (32): V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/LOOE_R4-2_2022-05-05/LOOE_R4-2_2022-05-05.files/1/0/depth_maps/data40.zip
2022-05-29 23:05:25 Error: Can't replace file or directory: The process cannot access the file because it is being used by another process (32): V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/LOOE_R4-2_2022-05-05/LOOE_R4-2_2022-05-05.files/1/0/depth_maps/data40.zip
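Since the traceback points at a locked `depth_maps/data40.zip`, one thing I've considered is wrapping the failing call in a short retry. This `retry_call` helper is my own sketch, not anything from the Metashape API:

```python
import time

def retry_call(fn, attempts=3, delay=5.0, exceptions=(OSError,)):
    """Call fn(); if it raises one of `exceptions`, wait and retry,
    re-raising after the last attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise
            time.sleep(delay)

# In the script below it would wrap the failing line, e.g.:
# retry_call(lambda: chunk.buildDepthMaps(downscale=4,
#                                         filter_mode=Metashape.ModerateFiltering))
```

I'm not sure a retry actually helps if something like a virus scanner or backup agent is holding the zip open, but it would at least tell me whether the lock is transient.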
Here is the code used to generate the above error:
import Metashape
import os, sys
import time, datetime
import math
import json
if len(sys.argv) > 1:
    # Define image path from user input
    img_path = sys.argv[1]
else:
    # Quit if no path given
    print("Please put the path to your image folder in as an argument")
    sys.exit(1)
proj_dir, jpeg_name = os.path.split(img_path)
base_dir, img_folder = os.path.split(proj_dir)
print("proj_dir: " + str(proj_dir))
print("base_dir: " + str(base_dir))
print("jpeg_name: " + str(jpeg_name))
proj_name = str(jpeg_name.rsplit('_',1)[0])
print("proj_name: " + str(proj_name))
export_folder = proj_name
agisoft_files = "Agisoft_Project_Data_Exports"
export_path = os.path.join(base_dir, agisoft_files, export_folder)
print("export_path: " + str(export_path))
# Make sure the export workspace exists; if not, make it.
if not os.path.exists(export_path):
    os.makedirs(export_path)
##Selection Percentages
RU_Percent = 50
PA_Percent = 50
RE_Percent = 10
## Selection Thresholds
RU_Threshold = 12
PA_Threshold = 3.5
RE_Threshold = 0.9
##Define photos list
photos = [os.path.join(img_path, photo)
          for photo in os.listdir(img_path)
          if photo.endswith('.JPG')]
#Define Start Time
start_time = time.time()
print_time = time.ctime(start_time)
print("Start Time: ", print_time)
# Use the document that is currently open in the GUI, so that later
# doc.save() calls save the same project the chunk belongs to
# (Metashape.app.Document() would create a new, empty document instead)
doc = Metashape.app.document
# Active chunk in the GUI
chunk = doc.chunk
#Note:
#For matching accuracy the downscale correspondence should be the following:
#Highest = 0
#High = 1
#Medium = 2
#Low = 4
#Lowest = 8
#
#For depth maps quality the downscale correspondence should be the following:
#Ultra = 1
#High = 2
#Medium = 4
#Low = 8
#Lowest = 16
# Error Reduction
# https://agisoft.freshdesk.com/support/solutions/articles/31000154629-how-to-select-fixed-percent-of-the-points-gradual-selection-using-python
## Reconstruction Uncertainty
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True, fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 1/5 completed")
doc.save()
points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion=Metashape.PointCloud.Filter.ReconstructionUncertainty)  # Reconstruction Uncertainty
list_values = f.values
list_values_valid = list()
for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RU_Percent / 100)
StartPoints = int(len(list_values_valid))
AlignmentPoints = StartPoints
threshold = list_values_valid[target]
if threshold < RU_Threshold:
    threshold = RU_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)
print("")
print("Error Reduction Report:")
RU_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(StartPoints - target) + " points removed (approx.)")
print(proj_name + " Reconstruction Uncertainty filter completed")
print("")
## Projection Accuracy
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True, fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 2/5 completed")
doc.save()
points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ProjectionAccuracy) #Projection Accuracy
list_values = f.values
list_values_valid = list()
for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * PA_Percent / 100)
StartPoints = int(len(list_values_valid))
threshold = list_values_valid[target]
if threshold < PA_Threshold:
    threshold = PA_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)
print("")
print("Error Reduction Report:")
PA_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(StartPoints - target) + " points removed (approx.)")
print(proj_name + " Projection Accuracy filter completed")
print("")
## Reprojection Error
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True, fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 3/5 completed")
doc.save()
points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ReprojectionError) #Reprojection Error
list_values = f.values
list_values_valid = list()
for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RE_Percent / 100)
StartPoints = int(len(list_values_valid))
# set threshold
threshold = list_values_valid[target]
if threshold < RE_Threshold:
    threshold = RE_Threshold
# remove points
f.selectPoints(threshold)
f.removePoints(threshold)
print("")
print("Error Reduction Report:")
RE_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(StartPoints - target) + " points removed (approx.)")
print(proj_name + " Reprojection Error filter 1 completed")
print("")
## Reprojection Error 2
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True, fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=True, fit_p1=True, fit_p2=True, fit_b1=True, fit_b2=True,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 4/5 completed")
doc.save()
points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ReprojectionError) #Reprojection Error
list_values = f.values
list_values_valid = list()
for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RE_Percent / 100)
StartPoints = int(len(list_values_valid))
threshold = list_values_valid[target]
if threshold < RE_Threshold:
    threshold = RE_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)
print("")
print("Error Reduction Report:")
RE_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(StartPoints - target) + " points removed (approx.)")
print(proj_name + " Reprojection Error filter 2 completed")
print("")
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True, fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=True, fit_p1=True, fit_p2=True, fit_b1=True, fit_b2=True,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 5/5 completed")
doc.save()
# Resize region to fit the remaining points
region = chunk.region
T = chunk.transform.matrix
m = Metashape.Vector([10E+10, 10E+10, 10E+10])
M = -m
for point in chunk.point_cloud.points:
    if not point.valid:
        continue
    coord = T * point.coord
    for i in range(3):
        m[i] = min(m[i], coord[i])
        M[i] = max(M[i], coord[i])
# pad the bounding box once, after the loop
# (padding inside the per-point loop would accumulate with every point)
for i in range(3):
    m[i] -= 0.000015
    M[i] += 0.000015
center = (M + m) / 2
size = M - m
region.center = T.inv().mulp(center)
region.size = size * (1 / T.scale())
region.rot = T.rotation().t()
chunk.region = region
# Build Depth Maps
print(proj_name + " Building Depth Maps")
chunk.buildDepthMaps(downscale = 4, filter_mode = Metashape.ModerateFiltering)
doc.save()
# Build Dense Cloud
print(proj_name + " Building Dense Cloud")
chunk.buildDenseCloud(point_colors=True, point_confidence=True, keep_depth=True,
                      #max_neighbors=100,
                      #subdivide_task=True,
                      #workitem_size_cameras=20,
                      #max_workgroup_size=100
                      )
doc.save()
print(proj_name + " Dense Cloud built")
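Side note on my own script: the four gradual-selection blocks above repeat the same threshold arithmetic, so I've been meaning to factor it out. A pure-Python sketch of just the threshold math (function name is mine; the Metashape filter calls would stay as they are):

```python
def selection_threshold(values, percent, floor):
    """Value at the `percent`-th percentile of `values`, clamped so it
    never falls below the fixed `floor` (same arithmetic as the
    gradual-selection blocks above)."""
    ordered = sorted(values)
    target = int(len(ordered) * percent / 100)
    return max(ordered[target], floor)
```

Each block would then reduce to `threshold = selection_threshold(list_values_valid, RU_Percent, RU_Threshold)` followed by the `selectPoints`/`removePoints` pair.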