Forum

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - itzhalpepe

Pages: [1]
1
General / Re: Little holes in snow surface
« on: May 14, 2021, 10:21:06 AM »
Hello Alexey,
yes, that could be the last possibility.

Is it possible that the depth map filtering in version 1.7.3 is noticeably more aggressive than in 1.6.5?

I have the feeling that I get noticeably worse results with UltraCam images in 1.7.3 than in 1.6.5. Additionally, the processing time for the depth maps increased by a factor of 3 and for the dense cloud by a factor of 1.5 to 2.

At the moment I am processing the same project for which I already have very good results; those were produced with version 1.6.5. Then I can see the differences in detail.

I will send you an email, maybe you can calculate the results as well.

Probably only my working group is processing UltraCam images, so this could be a rather special problem.

/Leon


2
General / Re: Little holes in snow surface
« on: May 13, 2021, 02:40:46 PM »
Hello Alexey,
I didn't try it because we need a DSM of the snow surface with very high resolution and accuracy.


/Leon

3
General / Re: Checkered pattern in hillshade of DSM
« on: May 13, 2021, 02:35:45 PM »
Hello Alexey, no, the low-resolution image is not included.


I think reproducing the issue should be possible without the use of GCPs.

Maybe you can write me a personal message.

/Leon

4
General / Re: Little holes in snow surface
« on: May 13, 2021, 02:32:34 PM »
Hello Probert,
thanks for your reply, and of course I can upload the report of this project; you can find it in the attachment. Unfortunately, the size of each post is limited to 512 KB.

Thanks
Leon

5
General / Little holes in snow surface
« on: May 07, 2021, 02:48:37 PM »
Hello everyone,
I am producing high-resolution DSMs in a high-mountain area with snow.

When I optimize the model with some GCPs, the accuracy of the DSM increases. However, I then get some small holes in the DSM in places where the confidence of the dense cloud points is also low.

If I produce the DSM without GCPs, the number of holes is noticeably lower.

Does it make sense to increase the key point limit (40,000) and the tie point limit (4,000) to get better coverage of the holes?

Filling the holes with the interpolation option is not possible for us, because we need real z-values in those areas.

/Leon

6
General / Re: Checkered pattern in hillshade of DSM
« on: May 07, 2021, 01:49:22 PM »
Hello Alexey,
the source of the DSM is the dense cloud and the interpolation option is disabled.

The camera calibration is in the attachment. The model was optimized with 3 GCPs, but the calibration without GCPs is very similar.

I also tried to generate the DSM with the calibration from the calibration report fixed, but this didn't work. The research area is complex terrain in a high-mountain region with snow.

The checkered pattern appears especially in shadowed areas, where the NIR values are low. There I also have an offset of 0.3 to 0.5 meters, especially in steep terrain.
In flat or sunny regions, the offset is noticeably lower.

The DSM has a resolution of 0.5 meters.

/Leon


7
General / Checkered pattern in hillshade of DSM
« on: May 06, 2021, 10:28:23 PM »
Hello everyone,
I produced a DSM with the Vexcel Ultracam X.

In it I have a difference of 0.3 to 0.4 meters in the z-values, especially in steeper areas.

Therefore I generated a hillshade of the DSM to check whether there is an offset in the x/y direction, which would explain the problem in steep areas.
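To illustrate why a planimetric shift would show up mainly as a height error in steep terrain, here is a quick back-of-the-envelope check (my own sketch with hypothetical slope values, not measured ones): a horizontal offset dx on a slope of angle theta produces a vertical error of dx * tan(theta).

```python
import math

def vertical_error(horizontal_offset_m: float, slope_deg: float) -> float:
    """Vertical error caused by a planimetric (x/y) shift on sloping terrain."""
    return horizontal_offset_m * math.tan(math.radians(slope_deg))

# A 0.5 m horizontal shift on a 35-degree slope already produces ~0.35 m of
# apparent z-error, while the same shift on a 5-degree slope gives only ~0.04 m.
steep = vertical_error(0.5, 35)
flat = vertical_error(0.5, 5)
print(round(steep, 2), round(flat, 2))
```

So the same small x/y misalignment that is invisible on flat ground could explain a z-difference of a few decimeters in the steep parts.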

The hillshade shows a checkered pattern, which you can see in the attachment.

Maybe somebody has already seen the same pattern and can explain the problem?

Thanks

/leon

8
Hello Alexey,
that would be great.

I am working with around 600 images with a resolution of 26000 x 17000 pixels. Each image has 4 bands at 16 bit.
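For context, the raw pixel data per frame at these dimensions is substantial (a simple estimate from the numbers above, ignoring compression and file headers):

```python
# Rough raw-size estimate for the UltraCam frames described above
# (26000 x 17000 px, 4 bands, 16 bit = 2 bytes per band).
width, height, bands, bytes_per_band = 26000, 17000, 4, 2

bytes_per_image = width * height * bands * bytes_per_band
gib_per_image = bytes_per_image / 1024**3

print(round(gib_per_image, 2))  # roughly 3.3 GiB of raw pixel data per image
```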

9
Hello Alexey,
we have the same issue: Metashape ignores our GPUs during depth map processing.

We are working with six RTX 2080 Ti cards.

Warning: Device GeForce RTX 2080 Ti ignored because it has 8177/11264 MB available, but 56739 MB required
Warning: Device GeForce RTX 2080 Ti ignored because it has 7992/11264 MB available, but 56739 MB required
Warning: Device GeForce RTX 2080 Ti ignored because it has 8177/11264 MB available, but 56739 MB required
Warning: Device GeForce RTX 2080 Ti ignored because it has 8177/11264 MB available, but 56739 MB required
Warning: Device GeForce RTX 2080 Ti ignored because it has 8177/11264 MB available, but 56739 MB required
Warning: Device GeForce RTX 2080 Ti ignored because it has 8177/11264 MB available, but 56739 MB required

How can we fix this problem? Without the GPUs, the processing time increases by a factor of 5.
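In case it helps anyone hitting the same warnings, here is a rough sanity check on the reported numbers. It rests on my own assumption (not a documented formula) that the depth map GPU memory requirement scales with the processed pixel count, so each lower quality step, which halves the image width and height, cuts the requirement by roughly a factor of 4:

```python
# Required memory from the warning vs. the free VRAM reported for one card.
required_mb = 56739
available_mb = 8177

# Each lower depth-map quality step halves width and height,
# shrinking the footprint by roughly a factor of 4 per step.
steps = 0
est = required_mb
while est > available_mb:
    est /= 4
    steps += 1

print(steps, round(est))  # two quality steps down -> ~3546 MB, which would fit
```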

/Leon

10
General / Re: Problems with UltraCam
« on: January 08, 2021, 03:40:27 PM »
Hello Paul,
thanks for your detailed answer. Theoretically, your calculation and your explanation make sense, but how do you calculate the scale value?

Maybe the correct values for the overlap, the GSD and the flying altitude can help you (see attachment 1):

Forward Overlap: 80%
Side Overlap: 50%
Average GSD: 12.9 cm
Average flying altitude above ground: 3960 m
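These values are at least internally consistent: the image scale number is the flying height divided by the focal length, and the GSD follows from the 4 µm pixel size (a small check using the calibration values from my earlier post):

```python
# Photogrammetric scale and GSD from the flight parameters above
# (focal length 122.7 mm and 4 um pixel size from the calibration report).
focal_length_m = 0.1227
pixel_size_m = 4e-6
altitude_m = 3960

scale = altitude_m / focal_length_m   # image scale number, i.e. 1:scale
gsd_cm = pixel_size_m * scale * 100   # ground sampling distance in cm

print(round(scale), round(gsd_cm, 1))  # ~1:32274 and ~12.9 cm, matching the report
```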

I think for the next flight we will use some GCPs (maybe 3) to calibrate the f-value.

Thanks,
Leon

11
General / Problems with UltraCam
« on: January 06, 2021, 02:21:31 PM »
Hello everyone,
For my master's thesis I am trying to produce digital surface models of a large (400 km²), high-mountain, snow-covered area around Davos.



I am using around 400 aerial images from a Vexcel UltraCam Eagle M3 camera. I have the calibration report with a focal length of 122.7 mm and a pixel size of 4 x 4 µm (see attachment 1). I have data from three different years (March 2018, April 2019 and March 2020) and, furthermore, the IMU coordinates from the airplane and the angles (Omega, Phi, Kappa).


For the most recent flight, in 2020, I also have 39 GCPs from GNSS with high accuracy.



The coordinate system I am using is CH1903+ / LV95 (EPSG:2056) with the Swiss 2004 geoid. I am working with Agisoft Metashape Professional 1.6.5.



The aim of this research is to produce a DSM with high resolution and high accuracy (Z RMSE of 20 cm), but without control points. The reason is that the effort of measuring many GCPs is too high, and the research area is also often endangered by avalanches. The problem is that offsets occur in the elevation depending on the settings. I can see the differences on the 39 GCPs, which are used as check points.



If I produce a DSM with these settings (no precalibration, with IMU and with the angles), I get an offset of around 1 meter (see attachment 2).


When using a precalibration with an f-value of 30675 (corresponding to the focal length of 122.7 mm and the pixel size of 4 x 4 µm) and all other values set to zero, I get an offset of around 2.2 meters (see attachment 3).



After that, I optimized the first model (no precalibration, with IMU and angles) using the 39 GCPs as control points. The accuracy of this model was very high (Z RMSE of 15 cm). Notably, the f-value changed to 30658, which differs from the calibration report by 17 pixels.
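A 17-pixel focal length error at this flying height is plausibly enough to explain meter-level vertical offsets. To first order, a relative focal length error maps into a relative height error of the same size (this is my own rough linearization, not something Metashape reports):

```python
# First-order effect of a focal length error on reconstructed height:
# delta_z ~ H * delta_f / f (a rough approximation, using the values above).
altitude_m = 3960
f_report_px = 30675     # calibration report: 122.7 mm / 4 um
f_adjusted_px = 30658   # value after optimizing on the 39 GCPs

delta_f = f_report_px - f_adjusted_px         # 17 px
delta_z = altitude_m * delta_f / f_report_px  # roughly 2.2 m

print(delta_f, round(delta_z, 1))
```

Interestingly, this matches the ~2.2 m offset I observed when fixing the calibration to the report values.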



In the next step I produced a new model with a fixed f-value of 30658, the IMU coordinates and the angles, but without any control points. The results were nearly perfect, with a Z RMSE of 9 cm and a total RMSE of 29 cm (see attachment 4).



I then used the same settings (f-value 30658) for the year 2019. There it didn't work, and I got a Z offset of 35 cm. For that year I do not have many GCPs, so I cannot optimize the model to figure out the right value for f. However, my supervisor produced the 2019 model without precalibration, but with IMU and the angles, and that worked better, with a Z RMSE of 15 cm. We checked the accuracy on snow-free areas such as streets against a DSM from a UAV and an airborne laser scan.



With the 2018 model it did not work either. When working only with the IMU and the angles, there was an offset of several meters as well. With a precalibration as given in the calibration report, the difference increased. For that year we had some GCPs, which we could use as control points to optimize the model.



In summary, we used three different procedures for the same process. Normally it should be sufficient to use only the IMU, the angles and maybe the precalibration to get highly accurate results. One solution for next year could be to use only 3 GCPs as control points to optimize the model, but it would be even better if no GCPs were necessary.



My guess is that the focal length changes during the flight because of temperature differences. As far as I know, Metashape works with a constant focal length that never changes, so maybe these small focal length variations, in combination with the high flying altitude, are the reason for this error.



Or do you have some other ideas?



Best regards,

Leon

12
Bug Reports / Generate Ortho - libtiff error
« on: December 14, 2020, 02:08:50 PM »
Hello everyone,
I have a problem producing the orthomosaic.

Every time I try to produce the orthomosaic, I receive a libtiff error message.

It looks like this: libtiff error: Write error at scanline 19456

The images are from an UltraCam Eagle Mark 3 f120. The resolution of the ortho is approximately 0.12 x 0.12 meters. The surface source is the DEM.
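One thing that might be relevant (my own rough estimate, assuming the ~400 km² area from my related project and 4 bands at 16 bit): at 0.12 m resolution, the uncompressed orthomosaic would be far larger than the 4 GB limit of classic (non-Big) TIFF, and exceeding that limit is one common cause of libtiff write errors at a specific scanline.

```python
# Rough size estimate for the orthomosaic (hypothetical numbers: assuming a
# ~400 km2 area and 4 bands at 16 bit = 2 bytes per band).
area_m2 = 400e6
gsd_m = 0.12
bands, bytes_per_band = 4, 2

pixels = area_m2 / gsd_m**2
size_gb = pixels * bands * bytes_per_band / 1e9

print(round(size_gb))  # hundreds of GB uncompressed, well beyond 4 GB classic TIFF
```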

I am working with Agisoft Metashape Professional version 1.6.5 build 11249 (64 bit) and a workstation with an Intel(R) Xeon(R) Platinum 8268 CPU @ 2.90 GHz, 382.7 GB of RAM and six RTX 2080 Ti cards.

The processing log is in the attachment.

I hope someone has had the same problem and can help me.

Best regards
Leon

13
General / Problems to work with the ready align model
« on: August 29, 2017, 05:22:02 PM »
Hello,
I have a project in PhotoScan with about 700 photos, and I aligned them on the High setting with a key point limit of 160,000 and a tie point limit of 0. The alignment itself did not take long, around 1 hour. But now the PC has problems displaying the finished project: it takes a very long time to delete some tie points or to zoom into the view. The project currently has 7,100,000 tie points. The PC has 200 GB of RAM, and the processor is not used to 100%. My idea is that the graphics card is not good enough to display the project.

I often get the message that the program is not responding, but after a few minutes it works again. However, it is so slow that I cannot work with it.

Do you have any solutions or an idea why it is so slow?

Best regards
Leon
