« on: February 01, 2024, 03:43:42 PM »
I am doing a test project to establish a methodology for an upcoming project that will have about 40,000–50,000 photos, which I will not be able to process all at once in one chunk.
The test setup of the project is as follows:
2057 photos.
1) Align all of them together in one chunk.
2) Use gradual selection to delete points I do not need, then optimise.
3) Use the "Split in Chunks" script to split the project into 3 chunks. (I keep all initial tie points from step 1 and all 2057 photos in every new chunk.)
4) Process the 3 new chunks to produce the point cloud, DEM and orthophoto.
5) Merge the 3 new chunks, merging all the assets (depth maps, point clouds, DEMs, orthophotos).
The new merged chunk has:
6171 photos
2085 depth maps (Chunk 1: 642 + Chunk 2: 813 + Chunk 3: 630)
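To verify those numbers I run a small snippet in the Metashape console (a minimal sketch; it assumes the merged chunk is the active one and that chunk.depth_maps exposes the active depth-map set with dict-style keys(), as in recent 2.x Python API builds):

import Metashape

chunk = Metashape.app.document.chunk  # the merged chunk must be active
cameras = [c for c in chunk.cameras if c.type == Metashape.Camera.Type.Regular]
n_depth = len(chunk.depth_maps.keys()) if chunk.depth_maps else 0
print('{} photos, {} depth maps'.format(len(cameras), n_depth))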
I want to delete ALL duplicate photos.
If I use the following script:
import Metashape

"""
Script for removing duplicated photos, Metashape (v 1.8)
Matjaz Mori, CPA, May 2022
The script will remove all duplicated photos (photos referencing the same file) from a Metashape project.
"""

compatible_major_version = "2.1"
found_major_version = ".".join(Metashape.app.version.split('.')[:2])
if found_major_version != compatible_major_version:
    raise Exception("Incompatible Metashape version: {} != {}".format(found_major_version, compatible_major_version))

def remove_duplicated_photos():
    doc = Metashape.app.document
    chunk = doc.chunk
    length = len(chunk.cameras)
    print('Removing duplicates...')

    paths = set()
    photos = list()
    for camera in list(chunk.cameras):
        if not camera.type == Metashape.Camera.Type.Regular:  # skip camera tracks, if any
            continue
        if camera.photo.path in paths:
            photos.append(camera)  # duplicate: this file has already been seen
        else:
            paths.add(camera.photo.path)  # the first camera referencing a file is kept
    chunk.remove(photos)

    length_after = len(chunk.cameras)
    nr_removed = length - length_after
    print('Success, ' + str(nr_removed) + ' cameras removed.')

label = "Scripts/Remove duplicated photos"
Metashape.app.addMenuItem(label, remove_duplicated_photos)
print("To execute this script press {}".format(label))
it deletes the duplicates effectively at random with respect to depth maps (it simply keeps whichever copy it encounters first), and I lose too many depth maps: only about 694 of the 2057 remaining photos still have one.
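This is how I count the survivors that still have a depth map after running that script (again a sketch, assuming dict-style access on chunk.depth_maps and the integer Camera.key attribute from the Python API):

import Metashape

chunk = Metashape.app.document.chunk
depth_keys = set(c.key for c in chunk.depth_maps.keys()) if chunk.depth_maps else set()
with_depth = sum(1 for c in chunk.cameras if c.key in depth_keys)
print('{} of {} remaining cameras have a depth map'.format(with_depth, len(chunk.cameras)))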
So what I need is:
1) The script must check for ALL duplicate photos.
2) Keep only 1 photo from each set of duplicates, preferring the one that has a depth map and deleting the ones that do not.
3) If a set of duplicates has more than one photo with a depth map, choose one of them to keep and delete the rest.
4) If a set of duplicates has no photo with a depth map, choose one to keep and delete the rest.
So I need to keep ONLY ONE photo from each duplicate set.
As a result, the number of photos will be no more than the initial total (2057), but all of them will have depth maps (or at least most of them).
The merged chunk should then be as if I had processed all the data in one chunk.
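Something along these lines is what I have in mind, as a minimal, untested sketch. It assumes the merged depth maps all live in the single active set returned by chunk.depth_maps, and it uses Camera.key for the membership test rather than the camera objects themselves:

import Metashape
from collections import defaultdict

def remove_duplicates_keep_depth_maps():
    chunk = Metashape.app.document.chunk
    depth_maps = chunk.depth_maps  # active depth-map set, may be None
    depth_keys = set(c.key for c in depth_maps.keys()) if depth_maps else set()

    # group regular cameras by the photo file they reference
    groups = defaultdict(list)
    for camera in chunk.cameras:
        if camera.type != Metashape.Camera.Type.Regular:  # skip camera tracks, if any
            continue
        groups[camera.photo.path].append(camera)

    to_remove = []
    for path, cameras in groups.items():
        # prefer a duplicate that has a depth map; otherwise keep the first copy
        keeper = next((c for c in cameras if c.key in depth_keys), cameras[0])
        to_remove.extend(c for c in cameras if c is not keeper)

    chunk.remove(to_remove)
    print('Removed {} duplicate cameras.'.format(len(to_remove)))

remove_duplicates_keep_depth_maps()

If a duplicate set has several copies with depth maps, this simply keeps the first such copy (requirement 3), and the cameras[0] fallback covers the case where no copy has a depth map (requirement 4).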