Messages - mauroB

Pages: [1] 2
1
General / image error while marker placing
« on: September 01, 2023, 08:01:45 PM »
Dear All,

I'm working on a project that must be georeferenced. For that purpose, during image capture we placed non-coded cross-type targets within the study area.

I performed a preliminary alignment using a fixed reference calibration (I planned to release the camera parameters only at the end of the workflow, during the final block optimization). Then I used the tool for automatic marker detection and matching. At the end of the computation, all detected markers were displayed in the Reference pane along with an error (pix) different from zero...

Considering that, after applying the detection tool, I did not perform any least-squares adjustment or manual refinement of the (automatically) identified projections, I was wondering what the errors actually represent. At most, I would expect an image error equal to 0, since the projections are exactly those identified by the software; in any case, "an error metric not in the least-squares sense"...

After that, I performed the same test using the guided approach to marker placing (i.e., the "add marker" command). Also in this case, after manually adding markers on aligned photos, I get projection errors different from zero...

Not convinced by the results, I finally added a single marker projection through the manual approach (i.e., the "place marker" command). With only one projection placed, the software still returned a non-zero image error.

This makes absolutely no sense to me...

Any idea?
Does the software automatically perform a least-squares-based triangulation (i.e., forward intersection) after placing markers on the images?
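For what it's worth, a non-zero residual is exactly what a least-squares intersection produces: the triangulated point minimizes the sum of squared reprojection errors over all rays, so the individual residuals are generally non-zero whenever the observations contain any noise. A minimal toy example (plain NumPy, hypothetical pinhole cameras and numbers, not the Metashape API):

```python
import numpy as np

def projection_matrix(K, R, t):
    """Build a 3x4 pinhole projection matrix P = K [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

def project(P, X):
    """Project a 3D point X (shape (3,)) with P; return pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

rng = np.random.default_rng(0)
K = np.array([[1000.0, 0, 500], [0, 1000.0, 500], [0, 0, 1]])
P1 = projection_matrix(K, np.eye(3), np.zeros(3))
# second camera translated 1 unit along x
P2 = projection_matrix(K, np.eye(3), np.array([-1.0, 0.0, 0.0]))

X_true = np.array([0.2, 0.1, 5.0])
# observed projections = true projections + 0.3 px of measurement noise
x1 = project(P1, X_true) + rng.normal(0, 0.3, 2)
x2 = project(P2, X_true) + rng.normal(0, 0.3, 2)

X_hat = triangulate(P1, P2, x1, x2)
res1 = np.linalg.norm(project(P1, X_hat) - x1)
res2 = np.linalg.norm(project(P2, X_hat) - x2)
print(res1, res2)  # small but non-zero reprojection errors
```

The intersected point cannot pass exactly through both noisy rays, so both residuals stay above zero.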

P.s. I'm working with Metashape v.1.5.1

Regards,
Mauro


2
General / Re: Import Calibration data from DJI XMP data
« on: August 11, 2023, 08:45:21 PM »
Dear Paulo,

You are right.

Now it seems ok.

Please find attached the correct code, if you need it.

Thanks as usual.

Mauro

3
General / Re: Import Calibration data from DJI XMP data
« on: August 11, 2023, 12:29:23 PM »
Dear Paulo,

Yes. Overall, I found a little bit of confusion between the different formulas and conversion conventions.

The formula from Brown is:
D_tang_x = 2P1xy + P2(r^2 + 2x^2)
D_tang_y = 2P2xy + P1(r^2 + 2y^2)

The formula from Agisoft simply has P1 and P2 swapped:
D_tang_x = 2P2xy + P1(r^2 + 2x^2)
D_tang_y = 2P1xy + P2(r^2 + 2y^2)

So, since I'm using the formula from Agisoft, the P1 and P2 values do not have to be swapped in the parameter conversion.
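A quick numerical check of the swap (x, y are normalized image coordinates; the coefficient values are made up):

```python
def tang_brown(x, y, p1, p2):
    """Tangential distortion with Brown's ordering of P1/P2."""
    r2 = x * x + y * y
    dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    dy = 2 * p2 * x * y + p1 * (r2 + 2 * y * y)
    return dx, dy

def tang_agisoft(x, y, p1, p2):
    """Tangential distortion with P1 and P2 swapped (Agisoft-style ordering)."""
    r2 = x * x + y * y
    dx = 2 * p2 * x * y + p1 * (r2 + 2 * x * x)
    dy = 2 * p1 * x * y + p2 * (r2 + 2 * y * y)
    return dx, dy

# swapping the coefficients makes the two conventions agree exactly
print(tang_brown(0.3, -0.2, 1e-4, -5e-5))
print(tang_agisoft(0.3, -0.2, -5e-5, 1e-4))
```

So a (P1, P2) pair quoted in one convention is the (P2, P1) pair of the other; the formulas themselves are otherwise identical.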

I still don't understand where I'm going wrong in this simple task.




4
General / Re: Import Calibration data from DJI XMP data
« on: August 10, 2023, 06:39:34 PM »
Dear Paulo,

A quick question, drawing on your wide knowledge and experience in photogrammetry and Agisoft.

Do you know how Agisoft actually computes the distortion plots in the camera calibration section? I want to be able to directly compare different distortion profiles "at the pixel scale"..

I tried to write a script that computes the distortion values along the upper-right semi-diagonal of the frame, starting from the frame center (see attached .py file).

After converting the K1, K2, K3 and P1, P2 values (see below for the convention used), the plots computed according to the equations in the user manual appendix fit well only for the radial distortion. Conversely, the tangential distortion plot differs from the one reported in the camera calibration section (see attached image).

Could you help me?

Thanks in advance,
Mauro

For the parameter value conversion I used the following formulas (from Luhmann et al., 2019):
k1 pixel units = k1 focal units / focal length^2
k2 pixel units = k2 focal units / focal length^4
k3 pixel units = k3 focal units / focal length^6
p1 pixel units = p1 focal units / focal length
p2 pixel units = -p2 focal units / focal length
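A small helper implementing exactly these conversions (f is the focal length in pixels; the values in the example call are made up):

```python
def to_pixel_units(k1, k2, k3, p1, p2, f):
    """Rescale distortion coefficients from focal-length-normalized units
    to pixel units, following the conversions quoted above
    (Luhmann et al., 2019), including the sign flip on p2."""
    return (k1 / f**2, k2 / f**4, k3 / f**6, p1 / f, -p2 / f)

# hypothetical coefficients in focal units, f = 3600 px
k1p, k2p, k3p, p1p, p2p = to_pixel_units(0.1, -0.05, 0.01, 1e-4, 2e-4, 3600.0)
```

Note that the sign flip on p2 is itself part of the convention change, so getting it wrong shows up only in the tangential term.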


5
General / distorsion profiles computing
« on: August 10, 2023, 11:03:11 AM »
Dear All,

I'm wondering how Metashape computes the distortion profiles in the camera calibration section.

I tried to implement a script to compute the distortion profiles along the (upper-right) semi-diagonal of the frame, starting from the frame center.
However, I noticed some discrepancies in the tangential distortion profiles (see attachments for the plots and code).

For the parameter value conversion I used the following formulas (from Luhmann et al., 2019):
k1 pixel units = k1 focal units / focal length^2
k2 pixel units = k2 focal units / focal length^4
k3 pixel units = k3 focal units / focal length^6
p1 pixel units = p1 focal units / focal length
p2 pixel units = -p2 focal units / focal length

For the distortion computation I used the formulas in Appendix C of the user manual.
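To make the comparison reproducible, here is a sketch of the profile computation (plain NumPy; the K/P terms follow the Appendix C-style frame camera model with coordinates normalized by the focal length; frame size, focal length and coefficients below are made-up values):

```python
import numpy as np

def distortion_profile(w, h, f, K1, K2, K3, P1, P2, n=100):
    """Sample radial and tangential distortion magnitudes (in pixels) along
    the upper-right semi-diagonal, from the frame center to the corner.
    x, y are image coordinates normalized by the focal length f (all in px)."""
    x = np.linspace(0.0, (w / 2) / f, n)
    y = np.linspace(0.0, (h / 2) / f, n)
    r2 = x**2 + y**2
    # radial displacement factor: K1*r^2 + K2*r^4 + K3*r^6
    radial = K1 * r2 + K2 * r2**2 + K3 * r2**3
    # tangential terms, manual-style ordering (P1 with x, P2 with y)
    dx_t = P1 * (r2 + 2 * x**2) + 2 * P2 * x * y
    dy_t = P2 * (r2 + 2 * y**2) + 2 * P1 * x * y
    r_pix = np.sqrt(r2) * f
    rad_pix = np.hypot(x * radial, y * radial) * f
    tan_pix = np.hypot(dx_t, dy_t) * f
    return r_pix, rad_pix, tan_pix

# hypothetical 5472x3648 frame with f = 3648 px and made-up coefficients
r, rad, tan = distortion_profile(5472, 3648, 3648.0,
                                 -0.01, 0.002, 0.0, 1e-4, -5e-5)
```

Both profiles are zero at the frame center by construction, which is a quick sanity check on any implementation.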

Any feedback or suggestions on that?
Thanks in advance,
Mauro

6
General / Re: BA-derived correlations between intrinsic and extrinsic
« on: November 17, 2022, 05:32:06 PM »
Dear Paulo,
Many thanks for your kind contribution.
MB

7
General / Re: BA-derived correlations between intrinsic and extrinsic
« on: November 17, 2022, 05:21:20 PM »
Dear Paulo,
As usual, many thanks for your kind reply.
If possible, I would like to take advantage of your expertise for an issue I'm currently facing.
In detail, I'm trying to calibrate an off-the-shelf camera (a DJI Phantom 4 Pro) through a targetless approach within Agisoft. From the analytical point of view, the problem is approached via bundle adjustment in free-network mode (i.e., without GCPs).
The object test field basically consists of several textured houses spread over a number of vertical planes. This should ensure reliable feature-based matching (FBM) performance and a good range of depths along the viewing direction (see attachment).
I tried different survey strategies (e.g., nadir plus convergent, convergent at 30° or 45°, central-POI-based, circular volumetric pattern, etc.) to find the best solution for in-flight camera calibration.
Overall, the circular pattern with different radii and heights, plus nadir rolled images at the scene center, results in the lowest correlations among the intrinsic parameters (see attachment). However, I think the correlation between the principal distance and the radial distortion parameters could be lower (based on my previous experience with DSLRs). Do you have any tips for reducing these correlations?
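For clarity, the correlations I'm referring to come from normalizing the BA covariance matrix of the intrinsics, rho_ij = cov_ij / (sigma_i * sigma_j). A small sketch with made-up numbers:

```python
import numpy as np

def correlation_matrix(cov):
    """Convert a covariance matrix into a correlation matrix:
    rho_ij = cov_ij / (sigma_i * sigma_j)."""
    sigma = np.sqrt(np.diag(cov))
    return cov / np.outer(sigma, sigma)

# hypothetical 3x3 covariance for, say, (f, cx, K1) -- illustration only
cov = np.array([
    [4.0, 1.2, -0.6],
    [1.2, 1.0,  0.1],
    [-0.6, 0.1, 0.25],
])
rho = correlation_matrix(cov)
```

The diagonal is 1 by construction, and the off-diagonal entries are the parameter correlations I'm trying to drive down with the network geometry.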

8
General / Re: BA-derived correlations between intrinsic and extrinsic
« on: November 15, 2022, 09:23:27 PM »
Dear user and agisoft team,
Any tips on the topic?
Regards,
MB

9
General / Re: BA-derived correlations between intrinsic and extrinsic
« on: November 03, 2022, 12:35:01 PM »
Dear All,
Any suggestions?
Regards
MB

10
Python and Java API / Re: Exporting TPs with RMS error in xyz
« on: November 02, 2022, 11:40:32 PM »
Dear Paulo,
I'm working on adjusting a script by adding the BA-derived tie point precisions.
In the previous discussion and posted code snippet you introduced the transformation matrix T, which basically transforms the model (or "internal") coordinates into a local Cartesian coordinate system. I was wondering why you used this reference system for the precision computation instead of the geocentric (or world) one? Am I missing some detail?
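To make the question concrete: whichever frame is chosen, the tie-point covariance has to be propagated through the linear part of the transform, Sigma_local = A * Sigma_model * A^T. A minimal sketch (plain NumPy, made-up values, not the actual snippet from the previous thread):

```python
import numpy as np

def propagate_covariance(T, cov_model):
    """Propagate a 3x3 tie-point covariance through a 4x4 similarity
    transform T. Only the linear 3x3 part affects the covariance; the
    translation cancels out."""
    A = T[:3, :3]
    return A @ cov_model @ A.T

# hypothetical transform: uniform scale s = 2, identity rotation, translation
T = np.diag([2.0, 2.0, 2.0, 1.0])
T[:3, 3] = [100.0, 200.0, 50.0]

# hypothetical model-frame precisions (variances) for one tie point
cov_local = propagate_covariance(T, np.diag([0.01, 0.01, 0.04]))
```

With a uniform scale s, the variances simply pick up a factor s^2, which is why the choice of frame changes the numerical precisions but not their relative structure.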
Many thanks.
Mauro

11
Bug Reports / Re: average tie point multiplicity
« on: October 31, 2022, 05:48:44 PM »
Dear Paulo,
What do you think about the implemented algorithm for the angle computation?
I performed some cross-check within CloudCompare and I found a general agreement.

Overall, I had some doubts about the different reference systems used within Metashape.
Basically, it uses three different reference systems (the chunk or "model/block" reference system, the geocentric reference system, and the "world" reference system, photogrammetrically speaking), which are related through similarity transforms (via chunk.transform and chunk.crs.project in Metashape's Python API, respectively). Therefore, given this shape-invariant geometric link, the computed intersection angles should be the same regardless of the reference system considered (as they are, based on the script outputs).
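The invariance is easy to verify numerically: a similarity transform is a rotation plus a uniform scale (plus a translation, which cancels for direction vectors), so angles are preserved. A small self-contained sketch:

```python
import numpy as np

def angle_deg(u, v):
    """Angle between two direction vectors, in degrees."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# linear part of a similarity transform: uniform scale times a rotation
theta = np.radians(30.0)
R = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
A = 3.7 * R

u = np.array([1.0, 0.2, 0.5])
v = np.array([-0.3, 1.0, 0.1])
a_before = angle_deg(u, v)
a_after = angle_deg(A @ u, A @ v)
```

The angle comes out identical (up to floating-point error) before and after the transform, consistent with the script outputs.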

12
General / Re: question about camera model variance-covariance matrix
« on: October 31, 2022, 05:15:29 PM »
Dear All,
Does anyone have an idea?
Regards,
MB

13
Bug Reports / Re: average tie point multiplicity
« on: October 31, 2022, 03:29:40 PM »
Dear Paulo,
First of all, thank you very much.
Honestly, I don't use the Metashape console for data analysis and computation...
Usually I work within a Python-based IDE (i.e., Spyder)...
The script for computing basic statistics on the intersection angles might take a while since, in its current basic configuration, it is not computationally efficient (there are three nested for loops). However, it should finish (at worst) in 10-15 minutes, I think. Attached you can find a new version of the script and a screen capture of its basic output in a generic IDE.
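As a possible speed-up for the inner loops, the pairwise angles for one tie point can be vectorized (a sketch assuming the rays are already stored as an (n, 3) array of unit direction vectors; this is not the attached script itself):

```python
import numpy as np

def intersection_angles(rays):
    """Pairwise intersection angles (degrees) among the image rays of one
    tie point. `rays` is an (n, 3) array of unit direction vectors; taking
    the upper-triangular pairs replaces two of the nested loops."""
    cos = np.clip(rays @ rays.T, -1.0, 1.0)
    i, j = np.triu_indices(len(rays), k=1)
    return np.degrees(np.arccos(cos[i, j]))

# toy case: two rays 60 degrees apart plus one orthogonal to both
rays = np.array([
    [1.0, 0.0, 0.0],
    [0.5, np.sqrt(3) / 2, 0.0],
    [0.0, 0.0, 1.0],
])
angles = intersection_angles(rays)
```

For n projections this computes all n(n-1)/2 angles in one matrix product, leaving only the outer loop over tie points.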

14
Bug Reports / Re: average tie point multiplicity
« on: October 28, 2022, 11:08:38 AM »
Dear Paulo,
Many thanks for your reply.
I noticed this discrepancy while working on a script aimed at evaluating the intersection angles between corresponding image rays (i.e., projections) for each point of the BA-derived sparse cloud.
Since you are a "guru" of Agisoft Metashape, could you provide me with a feedback on developed scripts?
The first attached script extracts the projections of each tie point within Agisoft (along with the tie point and camera coordinates in the model, geocentric and world coordinate systems, and the reprojection errors), whereas the second one analyses the intersection angles in a generic Python-based IDE (it is based on the output of the first script).
Many thanks in advance.
Regards,
MB

15
Bug Reports / average tie point multiplicity
« on: October 27, 2022, 07:38:31 PM »
Dear all,
I was wondering how the average tie point multiplicity statistic (chunk info) is calculated.
Is it based on the valid correspondences (i.e., those remaining after outlier detection) or on the total number found?
I noticed a discrepancy between the reported statistic and the average number of valid projections per tie point that I computed myself.
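For reference, this is essentially how I computed the average myself (a sketch over a hypothetical data structure, not the actual Metashape API):

```python
def average_multiplicity(projections_per_point):
    """Average number of valid projections per tie point.
    `projections_per_point` maps a tie-point id to a list of booleans,
    one per projection, True if the projection survived outlier filtering."""
    counts = [sum(flags) for flags in projections_per_point.values()]
    return sum(counts) / len(counts)

# hypothetical: three tie points with 3, 2 and 4 valid projections
demo = {0: [True, True, True], 1: [True, False, True], 2: [True] * 4}
print(average_multiplicity(demo))  # 3.0
```

If the chunk-info statistic instead counted the invalid (filtered) projections as well, the two numbers would obviously diverge, which matches the discrepancy I'm seeing.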
Regards,
MB
