Hello,
I'm attempting to generate (outside of Metashape) additional 2D feature matches between images that Metashape might have missed, triangulate them to 3D points, and then concatenate those points with the ones from Metashape. All the necessary camera and sensor information appears to be available in the exported XML files, but I think I'm missing something, because the generated points don't line up with the Metashape point cloud.
My method for computing the projection matrix is simple:
Eigen::Matrix<float, 3, 4> K; // intrinsics from the sensor information
K << f, 0.f, cx, 0.f,
0.f, f, cy, 0.f,
0.f, 0.f, 1.f, 0.f;
Eigen::Matrix4f M; // the 4x4 transformation matrix of each camera
M = camera.transformMatrix;
Eigen::Matrix<float, 3, 4> P = K * M; // 3x4 * 4x4 = 3x4
This projection matrix is computed for each camera and then used in a singular value decomposition (SVD) to triangulate the 3D points:
Eigen::Matrix<float, 6, 6> A; // renamed from M to avoid clashing with the camera transform above
A << P0(0, 0), P0(0, 1), P0(0, 2), P0(0, 3), -pt0.x(), 0,
P0(1, 0), P0(1, 1), P0(1, 2), P0(1, 3), -pt0.y(), 0,
P0(2, 0), P0(2, 1), P0(2, 2), P0(2, 3), -1.f, 0,
P1(0, 0), P1(0, 1), P1(0, 2), P1(0, 3), 0, -pt1.x(),
P1(1, 0), P1(1, 1), P1(1, 2), P1(1, 3), 0, -pt1.y(),
P1(2, 0), P1(2, 1), P1(2, 2), P1(2, 3), 0, -1.f;
Eigen::JacobiSVD<Eigen::MatrixXf> svd(A, Eigen::ComputeThinV);
// The solution is the right-singular vector for the smallest singular
// value, i.e. the last column of V.
Eigen::Vector4f xyzw = svd.matrixV().col(5).head<4>();
xyzw /= xyzw.w();
However, as I mentioned, the result doesn't line up with the Metashape point cloud. Is there some scaling on the Z axis that I've missed somewhere?
Any suggestion would be appreciated.
Best,