Topics - Bolli

1
Hi All,

I'm exporting dense cloud points for viewing in another application. The other half of my workflow requires point confidence to be included in the exported PLY file. Exports from the GUI of the API-generated dense cloud are fine when I have all the boxes checked, but exports from the API lack the required point confidence field in the PLY, despite my passing point_confidence=True when building the dense cloud (below). The API-exported PLYs look fine otherwise.

Code: [Select]
    chunk.buildDenseCloud(point_colors=True, point_confidence=True, keep_depth=True)
    doc.save()
    print(proj_name + " Dense Cloud built")

    #Export Local points
    if chunk.dense_cloud:
        chunk.exportPoints(export_path + '/' + proj_name + '.ply',
                           source_data = Metashape.DenseCloudData,
                           save_normals = True,
                           save_colors = True,
                           save_classes = True,
                           save_confidence = True)
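
For what it's worth, this is the quick check I've been using to see whether the confidence field actually made it into the exported file, by scanning the PLY header (plain Python, no extra libraries; the property name "confidence" is my assumption of what Metashape writes and may differ by version):

Code: [Select]
def ply_has_confidence(ply_path):
    """Scan the ASCII header of a (possibly binary) PLY file for a confidence property."""
    with open(ply_path, 'rb') as f:
        for raw in f:
            line = raw.decode('ascii', errors='ignore').strip()
            if line.startswith('property') and 'confidence' in line.lower():
                return True
            if line == 'end_header':
                break
    return False

print(ply_has_confidence(export_path + '/' + proj_name + '.ply'))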


Any advice on this issue?
Bolli

2
Hi all,
I'm trying to work a detectMarkers call into a batch process via Python. I'm using Metashape 1.8.3 and the 1.8.3 Python API guide. I've basically copied this line of code from the API guide, and the target type CircularTarget12bit comes up as undefined ("NameError: name 'CircularTarget12bit' is not defined"). Is there another target type for the 12-bit circular targets that I'm missing?

Code: [Select]
import Metashape
import os, sys
import time, datetime
import math
import json

#Define which document (export_path and proj_name are set earlier in my batch script)
doc = Metashape.app.document
doc.open(export_path + '/' + proj_name + '.psx')


#Define which chunk
chunk = Metashape.app.document.chunk

#Detect markers
chunk.detectMarkers(target_type = CircularTarget12bit,
                    tolerance = 50,
                    filter_mask = False,
                    inverted = False,
                    noparity=False)

doc.save()
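
For reference, here is the variant I plan to try next, with the target type referenced through the Metashape module rather than as a bare name (my own untested sketch, assuming Metashape.CircularTarget12bit is the constant the API expects):

Code: [Select]
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

#Detect markers, qualifying the target type with the Metashape module
chunk.detectMarkers(target_type = Metashape.CircularTarget12bit,
                    tolerance = 50,
                    filter_mask = False,
                    inverted = False,
                    noparity = False)

doc.save()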

Thanks for any and all advice!
Bolli

3
Python and Java API / Script fails to create depth maps, errors out
« on: May 31, 2022, 03:31:20 PM »
I am relatively new to scripting with the Agisoft/Python API. Ultimately I'm trying to loop through multiple projects and build a dense cloud from each of them. My script, as it's written, keeps failing at the depth maps step. I've tried taking that step out of the loop and it still fails at the same point. I've been through a number of forum posts on this error and can't figure it out. I'm using Agisoft Metashape version 1.8.1:
Code: [Select]
2022-05-31 08:29:23 Agisoft Metashape Professional Version: 1.8.1 build 13915 (64 bit)
2022-05-31 08:29:23 Platform: Windows
2022-05-31 08:29:23 CPU: Intel(R) Xeon(R) CPU E5-2690 v4 @ 2.60GHz (server)
2022-05-31 08:29:23 CPU family: 6 model: 79 signature: 406F1h
2022-05-31 08:29:23 RAM: 448.0 GB
2022-05-31 08:29:25 OpenGL Vendor: NVIDIA Corporation
2022-05-31 08:29:25 OpenGL Renderer: Tesla M60/PCIe/SSE2
2022-05-31 08:29:25 OpenGL Version: 4.6.0 NVIDIA 472.39
2022-05-31 08:29:25 Maximum Texture Size: 16384
2022-05-31 08:29:25 Quad Buffered Stereo: not enabled
2022-05-31 08:29:25 ARB_vertex_buffer_object: supported
2022-05-31 08:29:25 ARB_texture_non_power_of_two: supported
Does anyone have any suggestions?

Here is a snippet of the console where it ends.
Code: [Select]
2022-05-29 23:05:21 Depth filtering devices performance:
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 26.727 s = [0% IO ref depths + 8% IO ref masks+ 0% ref size + 0% IO depths + 3% size + 61% IO masks + 0% morpho + 8% voting + 0% final + 10% speckles + 3% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 27.313 s = [0% IO ref depths + 7% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 60% IO masks + 0% morpho + 9% voting + 0% final + 8% speckles + 5% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 29.015 s = [0% IO ref depths + 3% IO ref masks+ 0% ref size + 0% IO depths + 2% size + 54% IO masks + 0% morpho + 12% voting + 0% final + 10% speckles + 13% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 31.794 s = [0% IO ref depths + 6% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 54% IO masks + 0% morpho + 11% voting + 2% final + 11% speckles + 7% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 26.594 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 5% size + 49% IO masks + 0% morpho + 16% voting + 0% final + 9% speckles + 5% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 26.852 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 2% size + 63% IO masks + 0% morpho + 9% voting + 0% final + 8% speckles + 9% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 32.556 s = [0% IO ref depths + 5% IO ref masks+ 0% ref size + 0% IO depths + 4% size + 63% IO masks + 0% morpho + 8% voting + 0% final + 8% speckles + 10% IO]
2022-05-29 23:05:21  - 13% done by GPU Tesla M60 in 29.593 s = [0% IO ref depths + 3% IO ref masks+ 0% ref size + 0% IO depths + 4% size + 61% IO masks + 0% morpho + 10% voting + 0% final + 9% speckles + 10% IO]
2022-05-29 23:05:21 filtering done in 42.996 s = 11% caching + 1% pre-unpacking + 78% processing
2022-05-29 23:05:21 39 depth maps filtered in 43.074 sec
2022-05-29 23:05:24 Finished processing in 9103.56 sec (exit code 0)
2022-05-29 23:05:25 Traceback (most recent call last):
2022-05-29 23:05:25   File "V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/MIRmetashapeprocessing_PartII-MarkersToDense_v4.py", line 266, in <module>
2022-05-29 23:05:25     chunk.buildDepthMaps(downscale = 4, filter_mode = Metashape.ModerateFiltering)
2022-05-29 23:05:25 OSError: Can't replace file or directory: The process cannot access the file because it is being used by another process (32): V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/LOOE_R4-2_2022-05-05/LOOE_R4-2_2022-05-05.files/1/0/depth_maps/data40.zip
2022-05-29 23:05:25 Error: Can't replace file or directory: The process cannot access the file because it is being used by another process (32): V:/projects/Iconic_Reef/Agisoft_Project_Data_Exports/LOOE_R4-2_2022-05-05/LOOE_R4-2_2022-05-05.files/1/0/depth_maps/data40.zip

Here is the code used to generate the above error:
Code: [Select]
import Metashape
import os, sys
import time, datetime
import math
import json


if len(sys.argv) > 1:
    # Define image path from user input
    img_path = sys.argv[1]
else:
    # Quit if no path was given
    print("Please provide the path to your image folder as an argument")
    sys.exit(1)


proj_dir, jpeg_name = os.path.split(img_path)
base_dir, img_folder = os.path.split(proj_dir)
print("proj_dir: " + str(proj_dir))
print("base_dir: " + str(base_dir))
print("jpeg_name: " + str(jpeg_name))

proj_name = str(jpeg_name.rsplit('_',1)[0])
print("proj_name: " + str(proj_name))

export_folder = proj_name
agisoft_files = "Agisoft_Project_Data_Exports"
export_path = os.path.join(base_dir, agisoft_files, export_folder)
print("export_path: " + str(export_path))
# Make sure the export workspace exists; if not, create it.
if not os.path.exists(export_path):
    os.makedirs(export_path)

##Selection Percentages
RU_Percent = 50
PA_Percent = 50
RE_Percent = 10

## Selection Thresholds
RU_Threshold = 12
PA_Threshold = 3.5
RE_Threshold = 0.9

##Define photos list
photos = [os.path.join(img_path, photo)
        for photo in os.listdir(img_path)
        if photo.endswith('.JPG')]

#Define Start Time
start_time = time.time()
print_time = time.ctime(start_time)
print("Start Time: ", print_time)

#Define which document (use the document currently open in the GUI)
doc = Metashape.app.document
#doc.save(export_path + '/' + proj_name + '.psx')

#Define which chunk (active chunk in the GUI)
#chunk = doc.addChunk()
chunk = Metashape.app.document.chunk

#Note:
#For matching accuracy the downscale correspondence should be the following:
#Highest = 0
#High = 1
#Medium = 2
#Low = 4
#Lowest = 8
#
#For depth maps quality the downscale correspondence should be the following:
#Ultra = 1
#High = 2
#Medium = 4
#Low = 8
#Lowest = 16


# Error Reduction
# https://agisoft.freshdesk.com/support/solutions/articles/31000154629-how-to-select-fixed-percent-of-the-points-gradual-selection-using-python

## Reconstruction Uncertainty
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True,  fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 1/5 completed")
doc.save()

points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ReconstructionUncertainty) #Reconstruction Uncertainty
list_values = f.values
list_values_valid = list()

for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RU_Percent / 100)
StartPoints = int(len(list_values_valid))
AlignmentPoints = StartPoints
threshold = list_values_valid[target]
if (threshold < RU_Threshold):
    threshold = RU_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)

print("")
print("Error Reduction Report:")
RU_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(target) + " points removed")
print(proj_name + " Reconstruction Uncertainty filter completed")
print("")

## Projection Accuracy
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True,  fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 2/5 completed")
doc.save()

points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ProjectionAccuracy) #Projection Accuracy
list_values = f.values
list_values_valid = list()

for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * PA_Percent / 100)
StartPoints = int(len(list_values_valid))
threshold = list_values_valid[target]
if (threshold < PA_Threshold):
    threshold = PA_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)

print("")
print("Error Reduction Report:")
PA_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(target) + " points removed")
print(proj_name + " Projection Accuracy filter completed")
print("")

## Reprojection Error
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True,  fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=False, fit_p1=True, fit_p2=True, fit_b1=False, fit_b2=False,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 3/5 completed")
doc.save()

points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ReprojectionError) #Reprojection Error
list_values = f.values
list_values_valid = list()

for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RE_Percent / 100)
StartPoints = int(len(list_values_valid))
#set threshold
threshold = list_values_valid[target]
if (threshold < RE_Threshold):
    threshold = RE_Threshold
#remove points
f.selectPoints(threshold)
f.removePoints(threshold)

print("")
print("Error Reduction Report:")
RE_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(target) + " points removed")
print(proj_name + " Reprojection Error filter 1 completed")
print("")

## Reprojection Error 2
chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True,  fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=True, fit_p1=True, fit_p2=True, fit_b1=True, fit_b2=True,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 4/5 completed")
doc.save()

points = chunk.point_cloud.points
f = Metashape.PointCloud.Filter()
f.init(chunk, criterion = Metashape.PointCloud.Filter.ReprojectionError) #Reprojection Error
list_values = f.values
list_values_valid = list()

for i in range(len(list_values)):
    if points[i].valid:
        list_values_valid.append(list_values[i])
list_values_valid.sort()
target = int(len(list_values_valid) * RE_Percent / 100)
StartPoints = int(len(list_values_valid))
threshold = list_values_valid[target]
if (threshold < RE_Threshold):
    threshold = RE_Threshold
f.selectPoints(threshold)
f.removePoints(threshold)
EndPoints = int(len(list_values_valid))

print("")
print("Error Reduction Report:")
RE_actual_threshold = threshold
print(str(threshold) + " threshold reached")
print(str(StartPoints) + " points at start")
print(str(target) + " points removed")
print(proj_name + " Reprojection Error filter 2 completed")
print("")

chunk.optimizeCameras(fit_f=True, fit_cx=True, fit_cy=True,  fit_k1=True, fit_k2=True, fit_k3=True,
                      fit_k4=True, fit_p1=True, fit_p2=True, fit_b1=True, fit_b2=True,
                      fit_corrections=False, adaptive_fitting=False, tiepoint_covariance=False)
print(proj_name + " optimization 5/5 completed")
doc.save()

# Resize the reconstruction region to fit the filtered sparse cloud
region = chunk.region
T = chunk.transform.matrix  # chunk (internal) coordinates -> world coordinates

m = Metashape.Vector([10E+10, 10E+10, 10E+10])  # running minimum of point coordinates
M = -m                                           # running maximum of point coordinates

for point in chunk.point_cloud.points:
    if not point.valid:
        continue
    coord = T * point.coord
    for i in range(3):
        m[i] = min(m[i], coord[i]) - 0.000015
        M[i] = max(M[i], coord[i]) + 0.000015

center = (M + m) / 2
size = M - m
region.center = T.inv().mulp(center)
region.size = size * (1 / T.scale())

region.rot = T.rotation().t()

chunk.region = region

# Build Depth Maps
print(proj_name + " Building Depth Maps")
#Note
#For depth maps quality the downscale correspondence should be the following:
#Ultra = 1
#High = 2
#Medium = 4
#Low = 8
#Lowest = 16
chunk.buildDepthMaps(downscale = 4, filter_mode = Metashape.ModerateFiltering)
doc.save()

# Build Dense Cloud
print(proj_name + " Building Dense Cloud")
chunk.buildDenseCloud(point_colors=True, point_confidence=True, keep_depth=True,
                #max_neighbors=100,
                #subdivide_task=True,
                #workitem_size_cameras=20,
                #max_workgroup_size=100
                )
doc.save()
print(proj_name + " Dense Cloud built")
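
In the meantime, as a stopgap while I track down whatever is briefly holding the depth-maps files open (a backup or antivirus scan of the network share is my guess), I've been considering wrapping that step in a simple retry; a rough sketch on my part, not a fix:

Code: [Select]
# Retry the depth maps step a few times in case the project files are briefly locked
for attempt in range(3):
    try:
        chunk.buildDepthMaps(downscale = 4, filter_mode = Metashape.ModerateFiltering)
        doc.save()
        break
    except OSError as e:
        print("buildDepthMaps attempt " + str(attempt + 1) + " failed: " + str(e))
        time.sleep(60)  # wait a minute before trying again
else:
    raise RuntimeError(proj_name + ": depth maps failed after 3 attempts")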

4
Python and Java API / Python Doc Save doesn't save as processing progresses
« on: February 15, 2022, 10:43:09 PM »
Hello Forum,
I'm using Metashape 1.7.0 and am trying to piece together some Python code to process photos into dense clouds. I've gotten the code to run all the way through, but despite using doc.save(path) at the beginning and doc.save() throughout the code, the project is virtually blank when I open it. I've seen some other forum posts like https://www.agisoft.com/forum/index.php?topic=12003.0 and https://www.agisoft.com/forum/index.php?topic=8662.0, but am struggling to figure this out.

Here is the top part of my code, before the processing and doc.save() steps, where I would already expect to end up with a .psx that has my photos loaded into it:
Code: [Select]
import Metashape
import os, sys, time
import math
import json

#Definitions
img_path = sys.argv[1]
export_folder = "Exports"

proj_dir, jpeg_name = os.path.split(img_path)
base_dir, proj_name = os.path.split(proj_dir)
print()

export_path = os.path.join(proj_dir,export_folder)
# Make sure the export workspace exists; if not, create it.
if not os.path.exists(export_path):
    os.makedirs(export_path)

##Define photos list
photos = [os.path.join(img_path, file)
        for file in os.listdir(img_path)
        if file.endswith('.JPG')]

#Define which document
doc = Metashape.Document()
#Define which chunk
chunk = Metashape.app.document.chunk

doc.save(export_path + '/' + proj_name + '.psx')


#Add Photos
chunk.addPhotos(photos)
doc.save()

print(str(len(chunk.cameras)) + " images loaded")
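
One thing I've started to wonder about is the mismatch in my own code between doc = Metashape.Document() and chunk = Metashape.app.document.chunk: the photos may be going into the GUI's document while the empty standalone document is what gets saved. Here is the variant I plan to test, where the chunk is created in the same document that gets saved (untested sketch):

Code: [Select]
#Define which document and create a chunk inside that same document
doc = Metashape.Document()
doc.save(export_path + '/' + proj_name + '.psx')
chunk = doc.addChunk()

#Add Photos to that chunk, then save again
chunk.addPhotos(photos)
doc.save()

print(str(len(chunk.cameras)) + " images loaded")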

5
General / Orthomosaic Export Color Corrected
« on: February 05, 2020, 12:01:07 AM »
I am generating a series of orthomosaics from underwater imagery. I am able to generate the ortho just fine in Metashape (1.6.1, but really all recent versions). However, the BigTIFF that I export comes out with a color correction applied; to be clear, the ortho in Metashape and the BigTIFF ortho look different. This corrected ortho is causing issues in my analyses. Am I missing a setting that could solve this? Settings attached.
Thanks,
Bolli

6
General / tiled model altitude values export
« on: January 08, 2020, 04:18:02 PM »
Hi all,
I am working with Metashape v1.5.2.7838 and am running into issues exporting a tiled model as an SLPK with correct altitude values. I am working on seafloor models and thus expect negative altitudes (depths). The export seems to come out with inverted altitudes (i.e. 30 m below the sea surface shows up as 30 feet above the sea surface), but the model itself looks good; it is just depth-referenced incorrectly.

In the attached image, the blue bit is the model at +30 feet and the red line is where I expect it to be at -30 meters.

I appreciate any and all help,
Mike
