Forum

Recent Posts

Pages: 1 ... 7 8 [9] 10
81
Feature Requests / Re: Latvia Geoid
« Last post by Paulo on April 17, 2019, 11:21:41 AM »
Hello,

You can create a compound coordinate system by adding the following TIFF, which represents the LV'14 geoid:

Compound Coordinate System: LKS92 / Latvia TM + geoid LV 14
Projected Coordinate System: LKS92 / Latvia TM (EPSG::3059)
Linear Units: metre (EPSG::9001)
Projection Method: Transverse Mercator
    Latitude of natural origin: 0
    Longitude of natural origin: 24
    Scale factor at natural origin: 0.9996
    False easting: 500000
    False northing: -6000000
Geographic Coordinate System: LKS92 (EPSG::4661)
Angular Units: degree (EPSG::9102)
Geodetic Datum: Latvia 1992 (EPSG::6661)
Ellipsoid: GRS 1980 (EPSG::7019)
Prime Meridian: Greenwich (EPSG::8901)
Vertical Coordinate System: Latvia 2014
Vertical Units: metre (EPSG::9001)
Vertical Datum: Latvia 2014

Attached is the TIFF file.
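If you prefer to set it from the Python console, here is a minimal sketch. The WKT is hand-assembled from the parameters listed above (so adjust names and codes to match your geoid registration), and it assumes the LV'14 TIFF has already been placed where Metashape looks for geoid grids.

Code:
import Metashape

# Compound CS assembled from the definition above:
# LKS92 / Latvia TM (EPSG::3059) + Latvia 2014 vertical datum.
wkt = """COMPD_CS["LKS92 / Latvia TM + geoid LV 14",
    PROJCS["LKS92 / Latvia TM",
        GEOGCS["LKS92",
            DATUM["Latvia 1992",
                SPHEROID["GRS 1980", 6378137, 298.257222101,
                    AUTHORITY["EPSG","7019"]],
                AUTHORITY["EPSG","6661"]],
            PRIMEM["Greenwich", 0, AUTHORITY["EPSG","8901"]],
            UNIT["degree", 0.0174532925199433, AUTHORITY["EPSG","9102"]],
            AUTHORITY["EPSG","4661"]],
        PROJECTION["Transverse_Mercator"],
        PARAMETER["latitude_of_origin", 0],
        PARAMETER["central_meridian", 24],
        PARAMETER["scale_factor", 0.9996],
        PARAMETER["false_easting", 500000],
        PARAMETER["false_northing", -6000000],
        UNIT["metre", 1, AUTHORITY["EPSG","9001"]],
        AUTHORITY["EPSG","3059"]],
    VERT_CS["Latvia 2014",
        VERT_DATUM["Latvia 2014", 2005],
        UNIT["metre", 1, AUTHORITY["EPSG","9001"]]]]"""

chunk = Metashape.app.document.chunk
chunk.crs = Metashape.CoordinateSystem(wkt)  # apply the compound CS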
82
General / Re: Dense Point Cloud generation // far slower in Metashape?
« Last post by saduka on April 17, 2019, 10:42:39 AM »
Hello toxicmag,

Are you using the latest Metashape release version? And do you have the processing logs, or at least the timing information from the Chunk Info dialog, for the 1.4 and 1.5 processing?

Usually a long dense cloud filtering process is related to excessive overlap. If that is the case, the processing time can be reduced by limiting the number of image pairs estimated for each camera during filtering. This can be done by creating the main/dense_cloud_max_neighbors tweak and setting its value to 60, for example.

I apologize for being a noob... can you enlighten me as to how to do this? Thanks!

If you are using Python, just add the max_neighbors argument to the function calls:
Code:
# Limit the number of neighbor images considered per camera in both
# depth map generation and dense cloud filtering via max_neighbors:
chunk.buildDepthMaps(quality=Metashape.Quality.HighQuality,
                     filter=Metashape.FilterMode.MildFiltering,
                     reuse_depth=True,
                     max_neighbors=50)
doc.save()
chunk.buildDenseCloud(point_colors=False, keep_depth=True, max_neighbors=50)
doc.save()
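If you would rather use the tweak route mentioned above instead of the API arguments, the same key can probably be set from the console as well. A sketch, assuming Metashape.app.settings.setValue accepts tweak keys (the key name comes from the quoted reply):

Code:
import Metashape

# Set the dense cloud filtering tweak programmatically; this should be
# equivalent to adding main/dense_cloud_max_neighbors = 60 via the
# Preferences dialog (Advanced tab -> Tweaks).
Metashape.app.settings.setValue("main/dense_cloud_max_neighbors", "60")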
83
Python Scripting / setImage() function in DepthMap
« Last post by sara.zivkovic on April 17, 2019, 09:45:36 AM »
I tried to assign a different image to a DepthMap object using the setImage() function, but this only added the image to the object - both depth maps are shown in PhotoScan. Can you please tell me whether there is a way to actually replace the Image object of a DepthMap, rather than add a new one?
84
Feature Requests / Latvia Geoid
« Last post by ginnnts on April 17, 2019, 09:36:58 AM »
The LKS92 / Latvia TM coordinate system (EPSG::3059) has a new geoid, LV'14.
How do I add it to the Metashape coordinate transformation?

Added LV'14 grid files:
85
Python Scripting / Re: Meshing (buildModel command) from Depth Maps ignores face_count
« Last post by kaz on April 17, 2019, 08:35:14 AM »
Hi,

I encountered the same issue.
chunk.buildModel in a script (~3 min) is slower than in the GUI (1 min 9 sec).
Also, face_count does not seem to be applied properly:
computation time came out as LowFaceCount > MediumFaceCount > HighFaceCount.
I have attached screenshots of the GUI runs.
I would appreciate your help.

Additional information:
Ubuntu 18.04
Metashape Pro (trial)
GPU: GeForce GTX 1080 Ti
10 images (3024x4032 pixels)

Code:
>>> chunk.buildDepthMaps(quality=Metashape.MediumQuality, filter=Metashape.MildFiltering)
>>> chunk.buildDenseCloud()
>>> print(chunk.dense_cloud)
<DenseCloud '1612181 points'>  <- The point count is similar to the GUI result (1623256).

>>> start_time = time.time()
>>> chunk.buildModel(surface=Metashape.Arbitrary, interpolation=Metashape.EnabledInterpolation, face_count=Metashape.FaceCount.LowFaceCount, source=Metashape.DenseCloudData)
BuildModel: source data = Dense cloud, surface type = Arbitrary, face count = Low, interpolation = Enabled, vertex colors = 1
Grid size: 1144 x 612 x 1308
Tree depth: 11
Tree set in 6.78561s (1612153 points)
Leaves/Nodes: 11130834/12720953
Laplacian constraints set in 5.88478s
Depth[0/11]: 1
Evaluated / Got / Solved in: 0 / 0.030369 / 0.0768158
Depth[1/11]: 8
Evaluated / Got / Solved in: 0 / 2.90871e-05 / 0.000530958
Depth[2/11]: 64
Evaluated / Got / Solved in: 0 / 0.000210762 / 0.00101233
Depth[3/11]: 512
Evaluated / Got / Solved in: 0 / 0.0013907 / 0.00366759
Depth[4/11]: 4096
Evaluated / Got / Solved in: 0 / 0.00387549 / 0.0124726
Depth[5/11]: 32768
Evaluated / Got / Solved in: 0 / 0.0701094 / 2.77984
Depth[6/11]: 26992
Evaluated / Got / Solved in: 0 / 0.0104582 / 0.0301316
Depth[7/11]: 93656
Evaluated / Got / Solved in: 0 / 0.111399 / 2.30109
Depth[8/11]: 342224
Evaluated / Got / Solved in: 0 / 0.484556 / 7.34254
Depth[9/11]: 1165992
Evaluated / Got / Solved in: 0 / 0.930498 / 14.05
Depth[10/11]: 3567904
Evaluated / Got / Solved in: 0 / 2.6048 / 42.6426
Depth[11/11]: 7486736
Evaluated / Got / Solved in: 0 / 4.94877 / 72.4493
Linear system solved in 151.357s
Got Iso-value in 0.990996s
Iso-Value -0.415341
3964200 faces extracted in 84.4937s
decimating mesh (3945237 -> 35825)
processing nodes...  done in 0.022816 sec
calculating colors...  done in 0.208744 sec
>>> print(int(time.time()-start_time), 'sec')
264 sec
>>> print(chunk.model)
<Model '35824 faces, 18567 vertices'>

>>> start_time = time.time()
>>> chunk.buildModel(surface=Metashape.Arbitrary, interpolation=Metashape.EnabledInterpolation, face_count=Metashape.FaceCount.MediumFaceCount, source=Metashape.DenseCloudData)
BuildModel: source data = Dense cloud, surface type = Arbitrary, face count = Medium, interpolation = Enabled, vertex colors = 1
Grid size: 1144 x 612 x 1308
Tree depth: 11
Tree set in 6.83281s (1612153 points)
Leaves/Nodes: 11130834/12720953
Laplacian constraints set in 5.73218s
Depth[0/11]: 1
Evaluated / Got / Solved in: 0 / 0.032742 / 0.04515
Depth[1/11]: 8
Evaluated / Got / Solved in: 0 / 2.19345e-05 / 0.000575066
Depth[2/11]: 64
Evaluated / Got / Solved in: 0 / 0.000210047 / 0.00163198
Depth[3/11]: 512
Evaluated / Got / Solved in: 0 / 0.00681806 / 0.00395107
Depth[4/11]: 4096
Evaluated / Got / Solved in: 0 / 0.0469992 / 0.871671
Depth[5/11]: 32768
Evaluated / Got / Solved in: 0 / 0.0666571 / 0.0435393
Depth[6/11]: 26992
Evaluated / Got / Solved in: 0 / 0.0109968 / 0.213359
Depth[7/11]: 93656
Evaluated / Got / Solved in: 0 / 0.0704165 / 1.3744
Depth[8/11]: 342224
Evaluated / Got / Solved in: 0 / 0.27553 / 2.68768
Depth[9/11]: 1165992
Evaluated / Got / Solved in: 0 / 0.856132 / 14.7494
Depth[10/11]: 3567904
Evaluated / Got / Solved in: 0 / 2.73343 / 35.9586
Depth[11/11]: 7486736
Evaluated / Got / Solved in: 0 / 5.14864 / 51.2006
Linear system solved in 116.842s
Got Iso-value in 0.906756s
Iso-Value -0.415338
3964204 faces extracted in 54.9056s
decimating mesh (3945247 -> 107476)
processing nodes...  done in 0.022639 sec
calculating colors...  done in 0.512526 sec
>>> print(int(time.time()-start_time), 'sec')
200 sec
>>> print(chunk.model)
<Model '107475 faces, 54669 vertices'>

>>> start_time = time.time()
>>> chunk.buildModel(surface=Metashape.Arbitrary, interpolation=Metashape.EnabledInterpolation, face_count=Metashape.FaceCount.HighFaceCount, source=Metashape.DenseCloudData)
BuildModel: source data = Dense cloud, surface type = Arbitrary, face count = High, interpolation = Enabled, vertex colors = 1
Grid size: 1144 x 612 x 1308
Tree depth: 11
Tree set in 6.72557s (1612153 points)
Leaves/Nodes: 11130834/12720953
Laplacian constraints set in 5.30451s
Depth[0/11]: 1
Evaluated / Got / Solved in: 0 / 0.0217791 / 0.025856
Depth[1/11]: 8
Evaluated / Got / Solved in: 0 / 2.12193e-05 / 0.000516891
Depth[2/11]: 64
Evaluated / Got / Solved in: 0 / 0.000212193 / 0.00101972
Depth[3/11]: 512
Evaluated / Got / Solved in: 0 / 0.000766039 / 0.00295591
Depth[4/11]: 4096
Evaluated / Got / Solved in: 0 / 0.00368476 / 0.033103
Depth[5/11]: 32768
Evaluated / Got / Solved in: 0 / 0.0562215 / 1.25815
Depth[6/11]: 26992
Evaluated / Got / Solved in: 0 / 0.0289958 / 0.079834
Depth[7/11]: 93656
Evaluated / Got / Solved in: 0 / 0.0720582 / 1.37767
Depth[8/11]: 342224
Evaluated / Got / Solved in: 0 / 0.213139 / 2.5729
Depth[9/11]: 1165992
Evaluated / Got / Solved in: 0 / 0.805761 / 9.47408
Depth[10/11]: 3567904
Evaluated / Got / Solved in: 0 / 2.12819 / 15.3353
Depth[11/11]: 7486736
Evaluated / Got / Solved in: 0 / 4.12108 / 29.3142
Linear system solved in 67.3245s
Got Iso-value in 0.932s
Iso-Value -0.41534
3964200 faces extracted in 45.246s
decimating mesh (3945254 -> 322430)
processing nodes...  done in 0.023334 sec
calculating colors...  done in 1.3526 sec
>>> print(int(time.time()-start_time), 'sec')
140 sec
>>> print(chunk.model)
<Model '322430 faces, 162390 vertices'>

>>> start_time = time.time()
>>> chunk.buildModel(surface=Metashape.Arbitrary, interpolation=Metashape.EnabledInterpolation, face_count=108204, source=Metashape.DenseCloudData)
BuildModel: source data = Dense cloud, surface type = Arbitrary, face count = 108204, interpolation = Enabled, vertex colors = 1
Grid size: 1144 x 612 x 1308
Tree depth: 11
Tree set in 6.79055s (1612153 points)
Leaves/Nodes: 11130834/12720953
Laplacian constraints set in 5.40077s
Depth[0/11]: 1
Evaluated / Got / Solved in: 0 / 0.0289469 / 0.147501
Depth[1/11]: 8
Evaluated / Got / Solved in: 0 / 0.0517442 / 0.656491
Depth[2/11]: 64
Evaluated / Got / Solved in: 0 / 0.000213146 / 0.00120711
Depth[3/11]: 512
Evaluated / Got / Solved in: 0 / 0.000746012 / 0.167142
Depth[4/11]: 4096
Evaluated / Got / Solved in: 0 / 0.0278745 / 0.0296922
Depth[5/11]: 32768
Evaluated / Got / Solved in: 0 / 0.0380363 / 0.177285
Depth[6/11]: 26992
Evaluated / Got / Solved in: 0 / 0.0731647 / 1.32664
Depth[7/11]: 93656
Evaluated / Got / Solved in: 0 / 0.179184 / 2.59301
Depth[8/11]: 342224
Evaluated / Got / Solved in: 0 / 0.271808 / 3.8783
Depth[9/11]: 1165992
Evaluated / Got / Solved in: 0 / 0.698951 / 6.66303
Depth[10/11]: 3567904
Evaluated / Got / Solved in: 0 / 2.34101 / 16.8316
Depth[11/11]: 7486736
Evaluated / Got / Solved in: 0 / 4.87187 / 67.0118
Linear system solved in 108.562s
Got Iso-value in 1.05083s
Iso-Value -0.41534
3964198 faces extracted in 86.3172s
decimating mesh (3945230 -> 108204)
processing nodes...  done in 0.022791 sec
calculating colors...  done in 0.540445 sec
>>> print(int(time.time()-start_time), 'sec')
223 sec
>>> print(chunk.model)
<Model '108204 faces, 55024 vertices'>
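
A possible workaround I have not tried yet: decimate the mesh to an explicit target after building it, assuming decimateModel honors its face_count argument:

Code:
# Untested workaround: decimate the existing mesh to an explicit
# face count instead of relying on the face_count presets above.
chunk.decimateModel(face_count=108204)
print(chunk.model)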

Thank you.
Kaz
86
General / Re: size of the 3d model
« Last post by pbourke on April 17, 2019, 06:47:50 AM »
My approaches:
1. Use scale rules (rulers of known length) and thereby create a scaled model in Metashape. See attached, and the Python sketch after this list.
2. Export as is, then scale in external software to a known/measured dimension.
3. Export as is, scale to a unit bounding box, and apply the real scale in the viewing software.
I use MeshLab for options 2 and 3.
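For approach 1, the scale bars can also be created from the Python console. A minimal sketch, assuming two markers have already been placed on the end points of a ruler of known length (the 0.5 m distance is a placeholder):

Code:
import Metashape

chunk = Metashape.app.document.chunk

# Define a scale bar between two existing markers, assign the measured
# real-world distance (in metres), and re-scale the chunk.
scalebar = chunk.addScalebar(chunk.markers[0], chunk.markers[1])
scalebar.reference.distance = 0.5  # placeholder measured length
chunk.updateTransform()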
87
General / Comment/bug about forum attachements
« Last post by pbourke on April 17, 2019, 06:44:53 AM »
I mentioned this before, but it hasn't been fixed.
When replying to a forum question, if I attach an image that is too large I get the error "attachment greater than 768kb". If I scale the image to under 768 KB and try again, I then get the error "attachment greater than 512".
This is rather annoying and a waste of time when one is only trying to assist others on the forums.
88
General / Re: How to delete part of the dense model
« Last post by pbourke on April 17, 2019, 06:31:37 AM »
The standard way to delete part of the dense point cloud is to use the selection tools, as shown in the attached screenshot.
The common pipeline is to clean the dense cloud before meshing.
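If the cleanup needs to be scripted, the currently selected points can also be removed from the Python console. A minimal sketch, assuming the selection was already made with the GUI tools:

Code:
import Metashape

chunk = Metashape.app.document.chunk

# Delete whatever dense cloud points are currently selected
# (select them first with the rectangle/free-form tools).
chunk.dense_cloud.removeSelectedPoints()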
89
General / Lens question
« Last post by pbourke on April 17, 2019, 06:12:29 AM »
I have a large number of small objects to be reconstructed.
I've done a few using the Canon 100mm, but that is a 1:1 lens and only works well for objects down to about 2 cm long on a full-frame sensor.
I would like to know whether 1:2 or 1:5 lenses can be used, for example the Laowa 60mm or the Canon MP-E 65mm; see
     https://www.venuslens.net/product/laowa-60mm-f2/
and
     https://www.canon.com.au/camera-lenses/mp-e65mm-f-2-8-1-5x-macro
This is where my knowledge of optics, and of its relation to Metashape, is deficient. They are both fixed focal length lenses, but they focus from infinity to close range, and I believe the image size changes significantly as focus changes.
Does anyone understand this stuff?
90
General / Re: Dense Point Cloud generation // far slower in Metashape?
« Last post by dnb118 on April 17, 2019, 04:00:51 AM »
Hello toxicmag,

Are you using the latest Metashape release version? And do you have the processing logs, or at least the timing information from the Chunk Info dialog, for the 1.4 and 1.5 processing?

Usually a long dense cloud filtering process is related to excessive overlap. If that is the case, the processing time can be reduced by limiting the number of image pairs estimated for each camera during filtering. This can be done by creating the main/dense_cloud_max_neighbors tweak and setting its value to 60, for example.

I apologize for being a noob... can you enlighten me as to how to do this? Thanks!