Forum

Author Topic: How to convert Euler angles to Yaw, Pitch, & Roll (what are PS's conventions?)  (Read 17444 times)

lena

  • Newbie
  • *
  • Posts: 15
    • View Profile
Sorry, it was my mistake :(
But now PhotoScan.app.document.chunk.importCameras(path = "c:/studing/germany/diplomaITN/RNY.txt", format = "opk") gives the error:
TypeError: argument 2 must be PhotoScan.CamerasFormat, not str

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 13289
    • View Profile
Hello lena,

If you have switched to version 1.3.2, you can just use the Import CSV dialog from the GUI and choose the omega, phi, kappa angle convention.
Best regards,
Alexey Pasumansky,
Agisoft LLC

Gwen_sl

  • Newbie
  • *
  • Posts: 1
    • View Profile
Quote
Hello Vitalijus,

You can use the following Python function from the Console pane to load the reference data using OPK data:
Code: [Select]
PhotoScan.app.document.chunk.importCameras(path = "d:/file.txt", format = "opk")
At the moment, the .importCameras() Python API function assumes that each line contains the following information in this order:
Quote
camera_label x-coord y-coord z-coord omega phi kappa

So using this function (with the proper file path, of course) will automatically convert the OPK angles and load them into the Reference pane as YPR.
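For illustration, a line in that layout can be split with plain Python. The helper name parse_opk_line is just a sketch for this thread, not part of the PhotoScan API; the sample line is taken from a file posted later in this thread:

```python
# Minimal sketch: parse one line of the "label x y z omega phi kappa" layout.
# parse_opk_line is an illustrative helper, not a PhotoScan function.
def parse_opk_line(line):
    label, *values = line.split()
    x, y, z, omega, phi, kappa = (float(v) for v in values)
    return label, (x, y, z), (omega, phi, kappa)

label, xyz, opk = parse_opk_line(
    "DJI_0002.JPG 486862.202652 2626448.858317 332.437667 19.519819 -3.285101 -9.034320"
)
```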


Hello Alexey, does this work in version 1.2.6 of PhotoScan?
I ran the line through the Console; the process finished, but I don't see any result... nothing loads into the Reference pane.
« Last Edit: June 27, 2018, 10:50:32 AM by Gwen_sl »

lfreguete

  • Newbie
  • *
  • Posts: 7
    • View Profile
I am having the same problem here.
I had a file imported from Pix4D with all the coordinates of the calibrated cameras. However, the camera orientations are in OPK.
I tried to use this solution:
yaw, pitch, roll = PhotoScan.utils.mat2ypr(PhotoScan.utils.opk2mat(Metashape.Vector((omega, phi, kappa))).t())

but the resulting angles are far from what is expected. I am trying the solution from this topic now, but I am having issues with it (error message in the attachments).
Is omega a list of all the cameras' omegas or just one value?


Quote
Hello dellagiu,

Using the following script you should be able to generate the camera transformation matrix from OPK data:


Code: [Select]
import math, PhotoScan

# assumes 'chunk' and 'camera' are already defined, e.g.:
# chunk = PhotoScan.app.document.chunk
# omega, phi, kappa - rotation angles in radians
# X, Y, Z - camera position coordinates in units of the corresponding coordinate system


T = chunk.transform.matrix
v_t = T * PhotoScan.Vector( [0, 0, 0, 1] )
v_t.size = 3
m = chunk.crs.localframe(v_t)
m = m * T
s = math.sqrt(m[0, 0] ** 2 + m[0, 1] ** 2 + m[0, 2] ** 2) # scale factor

sina = math.sin(0 - omega)
cosa = math.cos(0 - omega)
Rx = PhotoScan.Matrix([[1, 0, 0], [0, cosa, -sina], [0, sina, cosa]])
sina = math.sin(0 - phi)
cosa = math.cos(0 - phi)
Ry = PhotoScan.Matrix([[cosa, 0, sina], [0, 1, 0], [-sina, 0, cosa]])
sina = math.sin(0 - kappa)
cosa = math.cos(0 - kappa)
Rz = PhotoScan.Matrix([[cosa, -sina, 0], [sina, cosa, 0], [0, 0, 1]])
 

t = PhotoScan.Vector([X, Y, Z])
t = chunk.crs.unproject(t)

m = chunk.crs.localframe(t)
m = PhotoScan.Matrix([ [m[0,0], m[0,1], m[0,2]], [m[1,0], m[1,1], m[1,2]], [m[2,0], m[2,1], m[2,2]] ])


R = m.inv() * (Rz * Ry * Rx).t()  * PhotoScan.Matrix().diag([1, -1, -1])

Tr = PhotoScan.Matrix([ [R[0,0], R[0,1], R[0,2], t.x], [R[1,0], R[1,1], R[1,2], t.y], [R[2,0], R[2,1], R[2,2], t.z], [0, 0, 0, 1]])

camera.transform = chunk.transform.matrix.inv() * Tr * (1. / s)

Paulo

  • Hero Member
  • *****
  • Posts: 804
    • View Profile
Hi lfreguete,

if you have a file from Pix4D with calibrated external camera parameters in the following format (imageName X Y Z Omega Phi Kappa):
Code: [Select]
imageName X Y Z Omega Phi Kappa
DJI_0002.JPG 486862.202652 2626448.858317 332.437667 19.519819 -3.285101 -9.034320
DJI_0003.JPG 486863.093666 2626460.003104 332.686111 19.595224 -4.117747 -10.382848
DJI_0004.JPG 486864.173630 2626477.601380 333.275810 19.698628 -3.958097 -9.997219
DJI_0005.JPG 486865.410026 2626495.172434 333.694031 19.657402 -4.120313 -10.610904
DJI_0006.JPG 486866.539559 2626513.363608 333.610094 19.601182 -4.326155 -11.133216
DJI_0007.JPG 486866.364116 2626516.853950 332.856768 0.254071 -19.187457 -89.557675
DJI_0008.JPG 486874.845835 2626515.362043 332.100779 -3.662028 -19.171621 -99.991631
DJI_0009.JPG 486892.509257 2626513.608562 332.517719 -4.123422 -19.251533 -100.363356
DJI_0010.JPG 486896.622590 2626512.994224 332.621008 -19.643082 2.991298 171.805297
DJI_0011.JPG 486895.363848 2626503.405120 331.978818 -19.963121 3.830133 171.362829
DJI_0012.JPG 486893.844702 2626485.680885 331.834505 -20.021977 3.795313 171.967951

then you can use the following command to set the angle definition to OPK:
Code: [Select]
Metashape.app.document.chunk.euler_angles = Metashape.EulerAngles.EulerAnglesOPK
before using chunk.importReference() to import the above file, and the orientation angles will all be taken into account as OPK...

see following attachment....

« Last Edit: June 25, 2021, 04:11:44 AM by Paulo »
Best Regards,

Paul Pelletier

lfreguete

  • Newbie
  • *
  • Posts: 7
    • View Profile
Good evening, Mr. Pelletier.

Thank you for your suggestion.
My situation is the following: I was able to import the cameras' external parameter file and have the orientation angles as OPK. But for the next steps of my workflow, I need those angles in YPR.
I used the function yaw, pitch, roll = Metashape.utils.mat2ypr(Metashape.utils.opk2mat(Metashape.Vector((omega, phi, kappa))).t()) but I am not very sure the outputs are right.
For example, when importing the images, the first camera shows a yaw of 142, while the output from this conversion gives 124. I know that the first yaw value is from before the camera optimization, but even so it shouldn't make such a difference. Should it?

Is this the correct way to go?

Thanks a lot.

Paulo

  • Hero Member
  • *****
  • Posts: 804
    • View Profile
Ok,

if you have your camera reference rotation in OPK format and want to convert to YPR, then the following code would do this for all cameras in the chunk (including applying grid convergence for a Transverse Mercator projected CS):
Code: [Select]
import math

crs = chunk.crs # CRS ("EPSG::32614") UTM 14
cm = -99 # central meridian of Transverse Mercator projection ("EPSG::32614")
for c in chunk.cameras:
    pg = crs.transform(c.reference.location, crs, crs.geogcs) # camera center in geographic CS
    gc = math.atan(-math.sin(pg.y/180*math.pi)*math.tan((pg.x-(cm))/180*math.pi))/math.pi*180 # grid convergence in degrees
    c.reference.rotation = Metashape.utils.mat2ypr(Metashape.utils.opk2mat(c.reference.rotation)) - Metashape.Vector((gc,0,0))

chunk.euler_angles = Metashape.EulerAngles.EulerAnglesYPR
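As a side note, the grid convergence term in the loop above can be checked in isolation. Here is a standalone sketch of the same expression in pure Python (the helper name is illustrative), with longitude/latitude in degrees and cm the central meridian as in the snippet:

```python
import math

def tm_grid_convergence_deg(lon_deg, lat_deg, cm_deg):
    # Transverse Mercator grid convergence, same expression as in the loop:
    # gc = atan(-sin(lat) * tan(lon - cm)), returned in degrees.
    lat = math.radians(lat_deg)
    dlon = math.radians(lon_deg - cm_deg)
    return math.degrees(math.atan(-math.sin(lat) * math.tan(dlon)))
```

On the central meridian (and anywhere on the equator) the convergence is zero, and its magnitude grows with both latitude and the distance from the central meridian.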

Hope this helps,

PS: you had a .t() (transpose of the Metashape.utils.opk2mat() matrix) in your formula, which produced the error.
« Last Edit: June 26, 2021, 06:10:52 AM by Paulo »
Best Regards,

Paul Pelletier

lfreguete

  • Newbie
  • *
  • Posts: 7
    • View Profile
I actually found another way of solving it. This way I was able to import the camera coordinates and convert the OPK to YPR.
Here I am sharing a snippet of the code:

Code: [Select]
## APPLYING CONVERSION OPK2YPR
chunk.euler_angles = Metashape.EulerAngles.EulerAnglesOPK
# chunk.importCameras(path = os.path.join(path,filename), format = Metashape.CamerasFormatOPK)
crs = Metashape.CoordinateSystem("EPSG::6440")
chunk.crs = crs
chunk.importReference(path=os.path.join(path,filename), format=Metashape.ReferenceFormatCSV, columns='nxyzabc', crs = crs, delimiter=' ', create_markers=False)

for c in chunk.cameras:
    c.reference.rotation = Metashape.utils.mat2ypr(Metashape.utils.opk2mat(c.reference.rotation))   
    c.reference.location = c.reference.location



Paulo

  • Hero Member
  • *****
  • Posts: 804
    • View Profile
Hi lfreguete,

the code you shared does not take into account grid convergence for your LCC CRS... To take this into account, and thus conform with the GUI Reference pane Convert button (which transforms from Omega, Phi, Kappa rotation angles to Yaw, Pitch, Roll), I would use the following snippet:
Code: [Select]
import math

crs = chunk.crs # CRS("EPSG::6440")
cm = -84.5 # central meridian of projection ("EPSG::6440")
phi0 = 29 # latitude of origin of projection ("EPSG::6440")
for c in chunk.cameras:
    pg = crs.transform(c.reference.location, crs, crs.geogcs) # camera center in geographic CS
    if cm < 0: #Western Hemisphere
        gc = (cm - pg.x)*math.sin(phi0/180*math.pi) # grid convergence in degrees
    else: #Eastern Hemisphere
        gc = (pg.x - cm)*math.sin(phi0/180*math.pi) # grid convergence in degrees
    c.reference.rotation = Metashape.utils.mat2ypr(Metashape.utils.opk2mat(c.reference.rotation)) - Metashape.Vector((gc,0,0))
chunk.euler_angles = Metashape.EulerAngles.EulerAnglesYPR
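For reference, the LCC convergence used here is just the longitude offset from the central meridian scaled by the sine of the latitude of origin. A standalone sketch of that arithmetic (helper name illustrative; it mirrors the hemisphere sign handling of the snippet above):

```python
import math

def lcc_grid_convergence_deg(lon_deg, cm_deg, lat0_deg):
    # Lambert Conformal Conic (small-angle form): gc = (lon - cm) * sin(lat0),
    # in degrees, with the sign flipped for a Western Hemisphere central meridian.
    if cm_deg < 0:  # Western Hemisphere
        return (cm_deg - lon_deg) * math.sin(math.radians(lat0_deg))
    return (lon_deg - cm_deg) * math.sin(math.radians(lat0_deg))
```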
« Last Edit: June 27, 2021, 02:52:13 AM by Paulo »
Best Regards,

Paul Pelletier

lfreguete

  • Newbie
  • *
  • Posts: 7
    • View Profile
Do you mind if I share here the files I have gotten so far with the current code?
The YPR output I got seems reasonable. I am not familiar with the Metashape functions, so I am not sure what the math behind Metashape.utils.mat2ypr(Metashape.utils.opk2mat(c.reference.rotation)) is.
From the manual, I understood that opk2mat computes the rotation to world values, but I am not sure whether this takes into account the defined CRS (as is done in the code) or whether it assumes a default CRS.
Would you know more details about it?

Here I am sharing the input,output and code files.



Paulo

  • Hero Member
  • *****
  • Posts: 804
    • View Profile
Yes, lfreguete,

the formula you use basically transforms from OPK to YPR angle format, except for yaw, as it does not account for grid convergence. I took your original input file and adapted it to a set of 46 images I have by changing the camera labels, as:
Code: [Select]
camera_label x-coord y-coord z-coord omega phi kappa
DJI_0002.JPG 565913.846998 74514.171977 -0.657877 -21.028216 21.843247 136.042348
DJI_0003.JPG 565910.080575 74511.003873 -0.288412 -21.380630 21.681882 136.641800
DJI_0004.JPG 565906.363962 74507.794833 -0.358439 -21.306807 21.878936 136.148183
DJI_0005.JPG 565902.260776 74504.865416 -0.481662 -21.160047 22.118499 135.547627
DJI_0006.JPG 565898.224392 74501.820554 -0.392458 -21.689193 21.712659 136.759011
DJI_0007.JPG 565894.457041 74498.661131 -0.470489 -21.349519 22.132705 135.688980
DJI_0008.JPG 565890.589424 74495.650935 -0.676629 -21.379259 22.167888 135.624986
DJI_0009.JPG 565886.460579 74492.673041 -0.595070 -21.602013 21.977819 136.188767
DJI_0010.JPG 565882.538302 74489.481247 -0.242720 -21.550776 22.054879 135.906153
DJI_0011.JPG 565878.567562 74486.685286 -0.290945 -21.697483 21.963246 136.208336
DJI_0012.JPG 565874.602135 74483.630432 -0.447463 -21.911418 21.770636 136.668871
DJI_0013.JPG 565870.826706 74480.461558 -0.777970 -22.060916 21.584828 137.189448
DJI_0014.JPG 565866.834995 74477.410824 -0.952921 -21.848278 21.775524 136.570056
DJI_0015.JPG 565862.834631 74474.253236 -1.008897 -22.011916 21.556954 137.119008
DJI_0016.JPG 565858.818229 74471.158883 -0.909844 -22.181409 21.386093 137.517229
DJI_0017.JPG 565854.935664 74468.116299 -1.103492 -22.170659 21.375316 137.479798
DJI_0018.JPG 565850.566758 74466.541319 -1.311738 -8.081117 28.974924 105.324993
DJI_0020.JPG 565851.104513 74474.579004 -1.740644 22.572490 -19.670639 -37.676510
DJI_0021.JPG 565854.921899 74477.600035 -1.659055 20.860966 -21.775399 -43.064911
DJI_0022.JPG 565858.829298 74480.596110 -1.641437 20.803613 -22.096242 -43.706114
DJI_0023.JPG 565862.858255 74483.614675 -1.689442 20.742128 -22.295139 -44.131851
DJI_0024.JPG 565866.802896 74486.718947 -1.668561 20.700747 -22.477366 -44.330603
DJI_0025.JPG 565870.786147 74489.856036 -1.797701 20.647194 -22.655752 -44.712035
DJI_0026.JPG 565874.715473 74492.982431 -2.001021 20.692205 -22.746005 -44.725261
DJI_0027.JPG 565878.733281 74496.125403 -2.364634 20.735845 -22.793469 -44.714021
DJI_0028.JPG 565882.786222 74499.178371 -2.486617 20.795541 -22.781591 -44.759832
DJI_0029.JPG 565886.758338 74502.315182 -2.536821 20.726457 -22.911718 -44.969943
DJI_0030.JPG 565890.712469 74505.318011 -2.665926 20.817060 -22.915449 -44.835342
DJI_0031.JPG 565894.636050 74508.313332 -2.546445 20.681909 -23.093720 -45.180787
DJI_0032.JPG 565898.559073 74511.576936 -2.551850 20.768272 -23.032816 -45.107745
DJI_0033.JPG 565902.456375 74514.613226 -2.643957 20.895806 -22.949886 -44.869494
DJI_0034.JPG 565906.422417 74517.724222 -2.668356 20.981092 -22.941803 -44.832848
DJI_0035.JPG 565910.427000 74520.885262 -2.540308 20.864855 -22.898073 -44.690226
DJI_0036.JPG 565912.811810 74525.437087 -4.425587 23.245210 -18.137805 -26.325642
DJI_0038.JPG 565905.514698 74526.432652 -2.865125 -17.507351 24.018584 127.861339
DJI_0039.JPG 565901.319210 74523.557086 -2.478111 -21.044487 21.263126 136.480287
DJI_0040.JPG 565897.641480 74520.304116 -2.269319 -21.113689 21.337865 136.309846
DJI_0041.JPG 565893.553275 74517.291487 -2.121621 -21.086772 21.534844 135.981986
DJI_0042.JPG 565889.477279 74514.222810 -2.003707 -21.733681 21.054012 137.490788
DJI_0043.JPG 565885.472875 74511.127472 -1.928966 -21.882600 20.999526 137.746865
DJI_0044.JPG 565881.656016 74508.023781 -1.911717 -22.059524 21.024531 137.848303
DJI_0045.JPG 565877.758096 74504.893756 -1.965217 -22.016090 21.187286 137.690054
DJI_0046.JPG 565873.629848 74501.777767 -1.959388 -22.332511 20.957530 138.270593
DJI_0047.JPG 565869.708421 74498.586067 -1.994690 -22.338843 20.999324 138.314003
DJI_0048.JPG 565865.958894 74495.490147 -2.237834 -22.296054 21.098220 137.986118
DJI_0049.JPG 565861.878677 74492.377363 -2.096247 -21.935394 21.501432 136.807028
Then I used your code (with a few changes) to importReference these 46 cameras as OPK, then transform to YPR and exportReference to a Test_ypr.txt file... see 1st attachment.
But before transforming to YPR and exporting, I used the Convert button from the Reference pane to convert the rotation angles from Omega, Phi, Kappa to Yaw, Pitch, Roll and saved the camera reference as Test_ypr_GUI.txt.
Comparing the 2 files, we see that yaw is slightly different (by about 0.18 degrees), which corresponds to the grid convergence (in a previous post I showed how to calculate convergence)... see 2nd attachment.

Also attached are the 2 exported YPR files...
« Last Edit: June 26, 2021, 07:55:42 PM by Paulo »
Best Regards,

Paul Pelletier

Paulo

  • Hero Member
  • *****
  • Posts: 804
    • View Profile
Hi again,

I think I found a more elegant way to code the OPK to YPR reference rotation transformation without using the explicit grid convergence formula. It matches the rotation angle values obtained in the GUI by transforming from Omega, Phi, Kappa to Yaw, Pitch, Roll with the Convert button in the Reference pane. The code is:
Code: [Select]
chunk = Metashape.app.document.chunk
crs = chunk.crs #  chunk Coordinate System
crsg = crs.geogcs #  chunk Geographic Coordinate System
for c in chunk.cameras:
    p = crs.unproject(c.reference.location) # camera center in geocentric coordinates (ecef)
    m = (crs.localframe(p)).rotation() # rotation transformation from ecef to crs
    m1 = (crsg.localframe(p)).rotation() # rotation transformation from ecef to crsg
    c.reference.rotation = Metashape.utils.mat2ypr(m1*m.inv()*Metashape.utils.opk2mat(c.reference.rotation))   
chunk.euler_angles = Metashape.EulerAngles.EulerAnglesYPR
and to do the YPR to OPK transformation, just swap opk with ypr and m with m1:
Code: [Select]
chunk = Metashape.app.document.chunk
crs = chunk.crs #  chunk Coordinate System
crsg = crs.geogcs #  chunk Geographic Coordinate System
for c in chunk.cameras:
    p = crs.unproject(c.reference.location) # camera center in geocentric coordinates
    m = (crs.localframe(p)).rotation()
    m1 = (crsg.localframe(p)).rotation()
    c.reference.rotation = Metashape.utils.mat2opk(m*m1.inv()*Metashape.utils.ypr2mat(c.reference.rotation))   
chunk.euler_angles = Metashape.EulerAngles.EulerAnglesOPK
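The general pattern behind both snippets is: angles to rotation matrix in one local frame, re-express the matrix in the other local frame, extract angles. As a self-contained illustration of the angles/matrix round trip, here is a pure-Python sketch under one assumed convention (yaw about Z, then pitch about Y, then roll about X); Metashape's own ypr2mat/mat2ypr conventions are not documented in this thread and may differ:

```python
import math

def ypr2mat(yaw, pitch, roll):
    # Assumed convention: R = Rz(yaw) * Ry(pitch) * Rx(roll), angles in degrees.
    y, p, r = (math.radians(a) for a in (yaw, pitch, roll))
    cy, sy = math.cos(y), math.sin(y)
    cp, sp = math.cos(p), math.sin(p)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def mat2ypr(R):
    # Inverse of the convention above, valid for pitch strictly inside (-90, 90).
    yaw = math.degrees(math.atan2(R[1][0], R[0][0]))
    pitch = math.degrees(-math.asin(R[2][0]))
    roll = math.degrees(math.atan2(R[2][1], R[2][2]))
    return yaw, pitch, roll
```

With these two helpers, converting between two angle conventions reduces to multiplying the matrix by the frame-change rotations, exactly as the snippets above do with crs.localframe() and crsg.localframe().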
Included is your original Python script, adapted with the correct OPK2YPR transformation...
Hope this can be useful!
« Last Edit: June 27, 2021, 09:22:57 PM by Paulo »
Best Regards,

Paul Pelletier

lfreguete

  • Newbie
  • *
  • Posts: 7
    • View Profile
Ah! I see what the code is doing.
It now just uses the transformation matrices between the 2 orientation systems.

I very much appreciate your thorough attention to this issue! It was a huge help indeed!