
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Alexandros21

1
General / Re: Change in hardware leads to sparser depth maps
« on: March 07, 2021, 03:37:08 PM »
Hi Alexey,

No, I had the "Use CPU" option disabled. I don't have the log of the run that I sent you, so I rebuilt the depth maps on the Cloud and saved the log. Please find it attached to this post.

Best,
Alexandros

2
General / Re: Change in hardware leads to sparser depth maps
« on: March 05, 2021, 06:58:40 PM »
Hi Alexey,

Thank you for your answer. I sent the sample project at the provided address.

Best regards,
Alexandros

3
The difference in point count is huge. Are you sure you didn't use High quality settings in the old Agisoft? Also, which version is the old Agisoft, and which Metashape version did you try?

4
General / Re: Weird distorted mesh from point cloud
« on: March 01, 2021, 09:17:37 PM »
Hi Mat,

Did you also try building the mesh with the Interpolation setting set to Enabled (default)?
The weird shapes that you get could be caused by the Extrapolated setting.
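
If you are scripting the workflow, here is a minimal Python sketch of what I mean (assuming the mesh is built from the dense cloud; adjust source_data to your case):

import Metashape

chunk = Metashape.app.document.chunk

# Build the mesh with interpolation enabled (the default) instead of
# Extrapolated, which can produce odd shapes at the model borders.
chunk.buildModel(surface_type=Metashape.Arbitrary,
                 interpolation=Metashape.EnabledInterpolation,
                 source_data=Metashape.DenseCloudData)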

Best,
Alex

5
General / Change in hardware leads to sparser depth maps
« on: March 01, 2021, 12:43:39 PM »
Hi all,

I've started using a Cloud service for my work, and I also installed Metashape there. I ran a dataset both on the Cloud and on my local machine, using the exact same Metashape version (1.7.1 build 11797) and the exact same tweaks (BuildDenseCloud/max_neighbors = -1).

However, the depth maps generated on the Cloud are sparser than the depth maps built on the local machine and, consequently, they lead to a much sparser point cloud. Could you please help me understand what the issue might be? Could it be, e.g., because of the different GPU? Here is the hardware that I've used:

Local machine:
Processor: Intel Core i7-7700K @ 4.2 GHz
RAM: 64 GB
GPU: GeForce GTX 1080 (20 compute units @ 1733 MHz, 8192 MB)

Cloud:
Processor: AMD EPYC 7V12 64-core @ 2.44 GHz
RAM: 112 GB
GPU: Radeon Instinct MI25 MxGPU (gfx900) (64 compute units @ 1000 MHz, 16064 MB)
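
For completeness, when run as a script the same step looks roughly like this on both machines (a sketch; the downscale and filter mode here are placeholders, while max_neighbors=-1 mirrors the tweak above):

import Metashape

chunk = Metashape.app.document.chunk

# Identical settings on both machines; max_neighbors = -1 removes
# the limit on the number of neighbor images used per depth map.
chunk.buildDepthMaps(downscale=2,
                     filter_mode=Metashape.MildFiltering,
                     max_neighbors=-1)
chunk.buildDenseCloud()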

Thank you in advance,
Alexandros


6
Indeed, I had to build the dense point cloud with aggressive filtering and then build the mesh with the reuse depth maps option enabled.
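
In script form, the workaround is roughly the following (a sketch; I'm assuming defaults elsewhere and that buildModel picks up the existing depth maps when source_data is set to DepthMapsData):

import Metashape

chunk = Metashape.app.document.chunk

# Aggressive depth map filtering reconstructs the glass much better.
chunk.buildDepthMaps(filter_mode=Metashape.AggressiveFiltering)
chunk.buildDenseCloud()

# Build the mesh from the already-computed, aggressively filtered
# depth maps instead of recomputing them.
chunk.buildModel(source_data=Metashape.DepthMapsData)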

7
Quote: "Can you show screenshots from 1.6.1 and 1.6.2/3?"

Unfortunately, I can only share the attached zoomed-in screenshots.


Quote: "Hello Alex,

If there's a possibility to provide the project with the alignment results only (saved in PSZ format) and the original images, we will generate the models in different 1.6.x versions and check the difference. You can send the download link to support@agisoft.com."

Thank you, Alexey, but I cannot send the data because I am bound by confidentiality. I hope the screenshots that I provide can be helpful.

Best regards,
Alex

8
Hi Mak,

Thank you for your answer. Unfortunately, the Extrapolated setting doesn't change much, since the holes are too large. I understand that glass reconstruction is hard, but to my understanding, mild filtering should perform better, not worse, than aggressive filtering.

Best,
Alex

9
Hi everyone,

I've created a high-quality mesh model of a building with big glass structures, using Metashape 1.6. I know that the recommended way to build the mesh is from mild-filtered depth maps; however, I get massive holes wherever there is glass. Oddly, if I use aggressive depth map filtering the glass gets reconstructed much better, but the rest of the model is not as smooth.
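
For reference, the mild-filtering workflow I'm running is roughly this (a Python sketch; the quality settings are illustrative):

import Metashape

chunk = Metashape.app.document.chunk

# Recommended route: mesh built directly from mild-filtered depth maps.
chunk.buildDepthMaps(filter_mode=Metashape.MildFiltering)
chunk.buildModel(source_data=Metashape.DepthMapsData,
                 surface_type=Metashape.Arbitrary)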

Between 1.6.1 and 1.6.2 I also noticed a worsening of my mesh model: the surface is less smooth and the lines are less straight.

Is there any way of improving the mesh model from depth maps with mild filtering? I even tried 1.6.3, but I still get holes in the glass.

If not, is there at least a way to go back to 1.6.1's mesh model generation?

I can get slightly better results by using the visibility-consistent method tweak, but I'd prefer to avoid it, since it's very RAM-intensive.

Any help is much appreciated!

Thank you in advance,
Alex
