Forum

Recent Posts

Pages: 1 ... 8 9 [10]
91
Python and Java API / Re: Adjusting marker projection errors via Python
« Last post by Alexey Pasumansky on April 12, 2024, 01:37:48 PM »
Hello Wizyza,

Do you have an exact plan of what the script should do?

Here is an example of how to loop through the marker projections:
Code: [Select]
marker = chunk.markers[-1]

for camera in list(marker.projections.keys()):
    proj = marker.projections[camera].coord  # returns a three-component vector: Metashape.Vector([x, y, 0.0])
    # here you can apply some checks to decide whether the projection should be removed or not
    # to remove the projection: marker.projections[camera] = None
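Outside Metashape, the remove-while-iterating pattern above can be exercised with a plain dict standing in for marker.projections (the camera labels, coordinates, and the x-coordinate check below are made up purely for illustration):

```python
# Hypothetical stand-in for marker.projections: camera label -> pixel coordinates.
projections = {"cam1": (10.0, 20.0), "cam2": (5000.0, 20.0), "cam3": (30.0, 40.0)}

# Iterating over a copied key list, as list(marker.projections.keys()) does above,
# makes it safe to remove entries while the loop is running.
for camera in list(projections.keys()):
    x, y = projections[camera]
    if x > 4000:  # example criterion; replace with your own projection check
        del projections[camera]  # in Metashape: marker.projections[camera] = None
```

The copy via list() is the important part: deleting from a dict (or from marker.projections) while iterating its live keys would raise an error or skip entries.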

92
Hello Wizyza,

If your script is run from the command line, then it should automatically "unlock" the Metashape project when the script finishes. Or is the script running constantly and accessing the projects from time to time?
And do you apply any changes to the project and save it via the script?
93
Python and Java API / Re: Remove polygons from mesh
« Last post by Alexey Pasumansky on April 12, 2024, 01:11:24 PM »
Hello apicca,

To delete faces from the model you need to select them and then call:
Code: [Select]
chunk.model.removeSelection()
In your code the assignment of the model variable seems to be missing, for example: model = chunk.model

To select the polygons you can do the following, according to your code:
Code: [Select]
for face_index in range(len(chunk.model.faces)):
    if face_index in polygons_to_remove:
        chunk.model.faces[face_index].selected = True
Or just apply the face selection directly in the loop where you check whether the face fits the threshold or not.
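As a rough, untested sketch of that single-loop approach (pure Python, with the Metashape-specific calls shown only as comments; the sample triangles are made up):

```python
import math

def triangle_area(p1, p2, p3):
    # Heron's formula, the same computation as in apicca's script.
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p3, p1)
    s = (a + b + c) / 2.0
    return math.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))

# Made-up triangles standing in for model faces: each entry is three 3D vertices.
faces = [
    ((0, 0, 0), (1, 0, 0), (0, 1, 0)),    # area 0.5
    ((0, 0, 0), (10, 0, 0), (0, 10, 0)),  # area 50.0
]

max_area_threshold = 5.0
selected = [i for i, (p1, p2, p3) in enumerate(faces)
            if triangle_area(p1, p2, p3) > max_area_threshold]
# In Metashape you would instead set chunk.model.faces[i].selected = True here,
# and then a single chunk.model.removeSelection() deletes all selected faces.
```

The max(..., 0.0) guard avoids a math domain error when floating-point rounding makes Heron's expression slightly negative for degenerate (near-zero-area) triangles.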
94
Python and Java API / Remove polygons from mesh
« Last post by apicca on April 12, 2024, 07:58:13 AM »
Hello,

I am trying to remove all polygons that exceed the selected size threshold. What is the best way to delete the selected faces: model.cropSelection() or model.removeSelection()?

I keep on getting Error: 'Metashape.Model' object has no attribute 'selectFaces'

Using Agisoft Metashape Professional 1.7.6


def calculate_triangle_area(vertex1, vertex2, vertex3):
    # Calculate the lengths of the sides of the triangle
    a = (vertex1 - vertex2).norm()
    b = (vertex2 - vertex3).norm()
    c = (vertex3 - vertex1).norm()
   
    s = (a + b + c) / 2.0
   
    # Calculate the area using Heron's formula
    area = (s * (s - a) * (s - b) * (s - c)) ** 0.5
   
    return area

# Function to filter polygons based on their size (area)

def filter_polygons(chunk, max_area_threshold):
 
    polygons_to_remove = []
   
    # Iterate over each face (polygon) in the model
    for face_index in range(len(model.faces)):
        # Get the vertices of the current polygon
        vertices = model.faces[face_index].vertices
       
        # Ensure the polygon is a triangle
        if len(vertices) == 3:
            # Extract the vertices' coordinates
            vertex1 = model.vertices[vertices[0]].coord
            vertex2 = model.vertices[vertices[1]].coord
            vertex3 = model.vertices[vertices[2]].coord
           
            # Calculate the area of the triangle
            area = calculate_triangle_area(vertex1, vertex2, vertex3)
           
            # Check if the area exceeds the specified maximum threshold
            if area > max_area_threshold:
                # Add the face index to the list of polygons to remove
                polygons_to_remove.append(face_index)

    # Remove polygons based on the indices collected
    model.eraseFaces(polygons_to_remove)


max_area_threshold = 5  # Specify your maximum area threshold here

for chunk in chunks:
    # Filter polygons for each chunk
    filter_polygons(chunk, max_area_threshold)

doc.removeSelected()


Thank you!
95
Thank you for the detailed explanation. According to the code above, if I change the block size from 250 to 450, this will reduce the number of blocks from 300 to about 80-90. Without reducing the pixel size of the texture, this should make it possible to generate the texture within 128 GB of memory.

I'll try again, but a model at this scale takes a very long time to generate.

And another suggestion: when building the texture for a blocked model, I think it would be better to show a memory requirement estimate in the dialog.
96
Hello steve3d,

Reworking the texturing procedure in general is already in progress, but it will likely take quite a while to reach the release version. So currently, in order to reduce the memory consumption for block model texture generation, you need either to use a lower texture resolution (so a smaller number of pages per block is generated) or to reduce the size of the initial model blocks.

For a very rough estimate of the number of texture pages and RAM consumption you can use the following code, adjusting the input values: ghosting filter enabled/disabled, block size, texture page size, and output resolution:

Code: [Select]
K = 3 #surface complexity and atlas filling coefficient
K_ghost = 60 #60 - with ghosting filter, 36 - without ghosting filter
texture_size = 16384 #pixels
block_size = 25 #meters
resolution = 0.00075 #m/pix resolution
N_pages = int((block_size / resolution / texture_size) ** 2 * K) + 1
req_memory = texture_size ** 2 * K_ghost * N_pages
print(N_pages, "texture pages, ", req_memory / 1024 ** 3, "GB")
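Wrapped as a function (the same formula as above, just parameterized; the function name is my own), this makes it easy to compare different block sizes and resolutions:

```python
def estimate_texture_memory(block_size, resolution, texture_size=16384, ghosting=True):
    """Rough estimate of texture page count and peak RAM (GB) for block model texturing."""
    K = 3                              # surface complexity / atlas filling coefficient
    K_ghost = 60 if ghosting else 36   # per-pixel cost with / without the ghosting filter
    n_pages = int((block_size / resolution / texture_size) ** 2 * K) + 1
    req_gb = texture_size ** 2 * K_ghost * n_pages / 1024 ** 3
    return n_pages, req_gb

# With the values from the snippet above: 25 m blocks at 0.75 mm/pix.
pages, gb = estimate_texture_memory(25, 0.00075)  # -> 13 pages, 195.0 GB
```

As the numbers show, halving the block size roughly quarters the page count (and with it the peak RAM), which is why shrinking the initial blocks is the most effective lever.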
97
Python and Java API / Re: Access script path from inside GUI?
« Last post by Wizyza on April 11, 2024, 05:24:27 PM »
Hi everyone,

You can disregard this post. This turns out to not be a Metashape issue.

I wasn't aware of the __file__ variable in Python. My script can now access files outside of itself.
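For anyone finding this thread later: the usual pattern is to resolve paths relative to the script itself via __file__ (the config.json file name below is only an illustration):

```python
import os

# Directory containing this script, regardless of the current working directory
# (which, when a script is launched from the Metashape GUI, may be anywhere).
script_dir = os.path.dirname(os.path.abspath(__file__))

# A file that sits next to the script -- the name here is hypothetical.
config_path = os.path.join(script_dir, "config.json")
```

os.path.abspath is needed because __file__ can be a relative path depending on how the script was invoked.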
98
Agisoft Cloud / Re: Agisoft Cloud Release Notes
« Last post by Ilya Shevelev on April 11, 2024, 05:14:43 PM »
Released on 2024-04-11

New features
  • Added limit box tool for Tiled Models and Point Clouds.

A tutorial on the limit box tool is available in the following article in our knowledge base.


The full changelog is available in the following article in our knowledge base.
99
General / Re: RTK Positioning accuracy with Mavic 3E
« Last post by Dieter on April 11, 2024, 09:04:06 AM »


This is my point exactly; this will not allow others to participate by reading or commenting. Saying you use a translator on a forum where English is used is pretty rude.


Did you see the winking smiley at the end of my sentence?
Do I really need to explain that smiley's meaning to you?

Just for your information: all my statements here come 1:1 from Google Translate, because otherwise I couldn't write anything here at all; my English is too bad for that.

And that's all there is to it from my side.


Dieter
100
And another thing: is it possible to make these steps parallel?

Code: [Select]
2024-04-11 09:27:18 saved group #259/384: 182.68 MB cubes, 259.858 MB/s, 39.0955 compressed MB - i.e. 21% compression
2024-04-11 09:27:18 2 cameras done in 13.596 s
2024-04-11 09:27:18 loading 2 cameras...
2024-04-11 09:27:19 generating cubes...
2024-04-11 09:27:30 total: 30438815 samples, 12176217 image cubes, 12.8872 avg level
2024-04-11 09:27:31 saving 10985790 merged group cubes ~36%...
2024-04-11 09:27:31 saved group #260/384: 167.63 MB cubes, 261.106 MB/s, 34.4178 compressed MB - i.e. 21% compression
2024-04-11 09:27:31 2 cameras done in 12.859 s
2024-04-11 09:27:31 loading 2 cameras...
2024-04-11 09:27:32 generating cubes...
2024-04-11 09:27:40 total: 13249102 samples, 5170086 image cubes, 12.6206 avg level
2024-04-11 09:27:40 saving 4776529 merged group cubes ~36%...
2024-04-11 09:27:41 saved group #261/384: 72.884 MB cubes, 114.598 MB/s, 14.7257 compressed MB - i.e. 20% compression
2024-04-11 09:27:41 2 cameras done in 9.416 s
2024-04-11 09:27:41 loading 2 cameras...
2024-04-11 09:27:42 generating cubes...
2024-04-11 09:27:49 total: 12203757 samples, 5281054 image cubes, 13.1447 avg level
2024-04-11 09:27:49 saving 4356566 merged group cubes ~36%...
2024-04-11 09:27:49 saved group #262/384: 66.4759 MB cubes, 226.109 MB/s, 15.0156 compressed MB - i.e. 23% compression
2024-04-11 09:27:49 2 cameras done in 8.577 s
2024-04-11 09:27:49 loading 2 cameras...
2024-04-11 09:27:50 generating cubes...
2024-04-11 09:27:59 total: 23498082 samples, 9017552 image cubes, 12.5799 avg level
2024-04-11 09:28:00 saving 7712995 merged group cubes ~33%...
2024-04-11 09:28:01 saved group #263/384: 117.691 MB cubes, 248.818 MB/s, 24.9984 compressed MB - i.e. 21% compression
2024-04-11 09:28:01 2 cameras done in 11.161 s
2024-04-11 09:28:01 loading 2 cameras...
2024-04-11 09:28:02 generating cubes...
2024-04-11 09:28:12 total: 19114522 samples, 7384658 image cubes, 12.678 avg level
2024-04-11 09:28:12 saving 7384658 merged group cubes ~39%...
2024-04-11 09:28:13 saved group #264/384: 112.681 MB cubes, 148.264 MB/s, 21.9049 compressed MB - i.e. 19% compression
2024-04-11 09:28:13 2 cameras done in 12.321 s
2024-04-11 09:28:13 loading 2 cameras...
2024-04-11 09:28:14 generating cubes...
2024-04-11 09:28:26 total: 30670067 samples, 12238388 image cubes, 12.6715 avg level
2024-04-11 09:28:27 saving 11891791 merged group cubes ~39%...

Each step only processes 2 cameras, and each step only uses one core. When building large models, this also wastes too much time.