

Messages - mitchgalea

1
General / Re: Cleaning Tiled Model (Remove Floating blobs)
« on: May 30, 2023, 03:53:43 AM »
Hi Oscar,

Thanks a lot for your response, I will look into these options! Cheers

2
General / Cleaning Tiled Model (Remove Floating blobs)
« on: May 29, 2023, 01:53:15 AM »
Hello,

We have internally imaged a large water tank (80 m in diameter). Due to the close proximity to the ceiling we have a lot of images in the dataset (we are using around 12,000 images). We want a high-resolution model, so I am experimenting with tiled models for this (I haven't had any prior use cases for tiled models).
The settings we used were:
- Source Data: Depth Maps
- Quality: High
- Pixel resolution: 0.004m
- Surface count: High
- Ghosting filter: False

There are a lot of black noise blobs under and behind the tiled model, and regular cleaning approaches such as gradual selection are not available for tiled models.

Questions:
1. Is there any way to clean a tiled model?
2. Would enabling ghosting filter reduce the noise?
3. Would producing a regular mesh, cleaning it, and then using that mesh as the source data for the tiled model work? (See the sketch below.)
4. I am also considering dividing the model into chunks and processing them separately, but I would like to deliver a single model at the end. Is this possible?

I have attached a screenshot.
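
On question 3, a rough sketch of the mesh-first workflow, assuming the Metashape 2.0 Python API; the parameter values simply mirror the settings listed above, and argument names should be checked against your API version:

Code:
import Metashape

chunk = Metashape.app.document.chunk

# build a regular mesh from the already-computed depth maps
chunk.buildModel(source_data=Metashape.DataSource.DepthMapsData,
                 surface_type=Metashape.SurfaceType.Arbitrary)

# clean the mesh at this point (e.g. select and delete the floating blobs,
# or filter small connected components in the GUI)

# then build the tiled model from the cleaned mesh instead of the depth maps
chunk.buildTiledModel(source_data=Metashape.DataSource.ModelData,
                      pixel_size=0.004)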

3
General / Dense Cloud / Laser Scan Rendering quality
« on: May 07, 2023, 04:19:00 AM »
Hello,

For our workflows we have a step where we import quite a lot of laser scans into a chunk. We then select laser scans to remove, as we don't need that many. The problem is that when trying to view the chunk to select which laser scans to remove, Metashape is very slow, to the point of being unusable. We have to hide the laser scans and select scans based on the camera locations. Is there a way to reduce the rendering quality of laser scans so that Metashape is smoother to use?

4
General / Re: Metashape 2.0 Laser Scan Referencing
« on: May 07, 2023, 04:03:03 AM »
Hello,

I'm not too sure about the issue you are facing. It seems that the function get_camera is returning None for either non_aligned_camera or aligned_camera. Please ensure that the label is correct; in my example I was using "Scan_001", which is the name of the scan image.

5
Bug Reports / Re: Assertion "3478972301441" failed at line 720
« on: March 23, 2023, 03:07:25 AM »
Metashape 2.0.2 fixed this error for us

6
Bug Reports / Re: Assertion "3478972301441" failed at line 720
« on: March 11, 2023, 07:54:44 AM »
I get the same error on my project; I've tested it on both Metashape 2.0.0 and Metashape 2.0.1.

7
Python and Java API / Metashape 2.0 python laser scan depth map
« on: February 23, 2023, 06:58:27 AM »
Hello,

Is there a way to access the depth maps for the terrestrial laser scans in Metashape 2.0? I have attached a screenshot of what the depth map looks like when opening the TLS spherical image. When accessing `Metashape.app.document.chunk.depth_maps`, though, it is None. I assume that the depth maps are computed from the laser scan and not stored? If so, what would be the calculation per pixel, as my goal is to access the depth of certain pixels?

Thanks

8
General / Re: Metashape 2.0 Laser Scan Referencing
« on: February 21, 2023, 08:12:16 AM »
I figured out how to fix this. I had to compute the chunk transform that ensures the laser scans are in the correct position. To do this, prior to alignment I duplicated the chunk and then aligned the original chunk. From there I used the Python console.

Code:
import Metashape

# get the chunks to use: chunk 1 is the duplicated (non-aligned) copy, chunk 0 is the aligned original
non_aligned_chunk = Metashape.app.document.chunks[1]
aligned_chunk = Metashape.app.document.chunks[0]

# function to get camera based on label
def get_camera(chunk,label):
    for camera in chunk.cameras:
        if camera.label == label:
            return camera
    return None

# gets a spherical from a laser scan for both chunks
non_aligned_camera = get_camera(non_aligned_chunk, "Scan_001")
aligned_camera = get_camera(aligned_chunk, "Scan_001")

# the non-aligned camera is in the correct position, so set the aligned chunk's transform
# such that the aligned camera ends up at the non-aligned camera's position
aligned_chunk.transform.matrix = non_aligned_camera.transform * aligned_camera.transform.inv()
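
Note: the one-liner above assumes the duplicated (non-aligned) chunk still has an identity chunk transform, which is the case right after duplication. If it doesn't, its transform would need to be premultiplied, something like the following (untested generalization):

Code:
# account for a non-identity transform on the duplicated (non-aligned) chunk
aligned_chunk.transform.matrix = non_aligned_chunk.transform.matrix * non_aligned_camera.transform * aligned_camera.transform.inv()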

9
General / Metashape 2.0 Laser Scan Referencing
« on: February 21, 2023, 02:32:49 AM »
Hello,

I am using the Metashape 2.0 terrestrial laser scan feature to align TLS scans captured with the Leica RTC360 with images captured from other sources. I have followed the procedure listed here: https://agisoft.freshdesk.com/support/solutions/articles/31000168474-terrestrial-laser-scans-processing-in-metashape-2-0-0 . It works and I can align the TLS scans with the images. My issue, though, is that the position of the laser scans after alignment is different. This makes sense because we reset alignment, but is there a way to add reference information to the laser scans so that after alignment they keep their initial position? I have tried adding reference information to the laser scan images based on the initial position, but this doesn't do anything (it seems that images attached to laser scans are ignored when it comes to reference).

Any help would be appreciated

10
Can this be answered? It seems like it's either a yes or no on whether this is currently possible.

11
Hello,

I've experimented with the Metashape 2.0 laser scan features and really like how powerful they can be. We would like to use this in a pipeline that uses the Python API.

The steps are:
 - importing the point clouds: Can be done in Python with chunk.importPointCloud(...)
 - adding laser scan group: ?
 - setting laser scan group to fixed: ?
 - adding laser scans to group: ?
 - unlocking laser scan group transform: ?

I have attached screenshots of these steps in the GUI, but I would like to know how to do all of these steps in Python, if this is actually possible.
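
For what it's worth, a partial sketch of the steps that map onto documented API calls; the group label and camera selection below are placeholders, and the fixed/unlock steps are exactly the ones still marked "?" above:

Code:
import Metashape

chunk = Metashape.app.document.chunk

# import the laser scan as a point cloud (path is a placeholder)
# chunk.importPointCloud("Scan_001.e57", ...)

# create a camera group and assign the laser scan cameras to it
group = chunk.addCameraGroup()
group.label = "Laser Scans"
for camera in chunk.cameras:
    camera.group = group

# setting the group to fixed and later unlocking its transform are the steps
# I could not find documented API calls for; they may be GUI-only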



12
Hello,

I have a question about the 'cameras' argument of the matchPhotos function.

In a project where I am using the 'cameras' argument, key points are only detected and matches generated for these cameras, but matches are also generated between these cameras and other cameras already in the project that have key points detected. I would expect the argument to limit matching to only the cameras in the list.
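
To illustrate, a minimal sketch of the call in question; the camera subset here is hypothetical, and the 'pairs' argument is only an assumption about how matching could be restricted, so check it against your API reference:

Code:
import Metashape

chunk = Metashape.app.document.chunk

# hypothetical subset: the cameras passed via the 'cameras' argument
new_cameras = chunk.cameras[-10:]

# key points are detected only for these cameras, but matches can still be
# generated against previously processed cameras; explicitly listing the
# allowed pairs (if 'pairs' is supported in your version) restricts matching
pairs = [(a.key, b.key) for i, a in enumerate(new_cameras) for b in new_cameras[i + 1:]]

chunk.matchPhotos(cameras=[camera.key for camera in new_cameras],  # or Camera objects, depending on version
                  pairs=pairs,
                  reset_matches=False)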

13
I am trying to access the estimated camera yaw, pitch and roll values (as they appear in the GUI Reference pane).

To get the transformed camera position I am using the following code:
Code:
import Metashape

chunk = Metashape.app.document.chunk
camera = chunk.cameras[2]
transform = chunk.transform.matrix * camera.transform
# this works as expected, values are matching those in reference pane
pos = transform.translation()

rot = transform.rotation()
# this is not working, expected values are 93.583, -0.937, -44.785, outputted values are 273.58297001960716, 1.784351814162492, -135.21509554607456
ypr = Metashape.Utils.mat2ypr(rot)
 

I can do this using external libraries, but I thought there would be a way to do this inside of Metashape. This issue persists across different documents.
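
For reference, a conversion along these lines is often suggested for reproducing the Reference-pane angles: the pane reports yaw/pitch/roll relative to the local tangent frame at the camera position, with the camera axes flipped to the photogrammetric convention. The sketch below assumes the chunk is georeferenced (chunk.crs is set), and the exact convention may vary between versions:

Code:
import Metashape

chunk = Metashape.app.document.chunk
camera = chunk.cameras[2]

T = chunk.transform.matrix
# camera position in the chunk's world (geocentric) coordinates
location = T.mulp(camera.center)
# rotation of the local tangent frame at that position
local = chunk.crs.localframe(location)
# flip the Y/Z camera axes to match the yaw/pitch/roll convention of the Reference pane
R = local * T * camera.transform * Metashape.Matrix.Diag([1, -1, -1, 1])
ypr = Metashape.Utils.mat2ypr(R.rotation())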

14
Bug Reports / Unexpected Error on Startup
« on: June 16, 2022, 05:34:40 PM »
Hello,

When opening Metashape (this error started a couple of weeks ago), I have been getting this error:
Code:
Unexpected error occurred.
Details: System clock has been set back (-40)
This prevents Metashape from being activated, and I cannot activate it manually either. I have done some research and found that it is an issue with license managers where a file somewhere in the filesystem is corrupted or has had its timestamp modified; however, I haven't had any luck fixing this. I have tried searching for files with modification dates in the future or far in the past and correcting the dates, and I have also tried repairing my Ubuntu install, but nothing has been successful. I have only noticed errors with Metashape. Other than reinstalling Ubuntu, does anyone have suggestions for fixing this error?

Thanks

Mitchell

15
General / Re: Cylindrical Orthomosaic Generation for long thin wall scan
« on: January 26, 2022, 05:48:25 PM »
Thanks for the feedback. I manually found an appropriate value for f and it flattened out the model. Tried on multiple datasets and it worked a treat. Thank you!
