Author Topic: Align coordinate system to single camera  (Read 1184 times)

LukasG

Align coordinate system to single camera
« on: September 13, 2018, 12:18:14 PM »
Hi,

I have an application where I rotate an object in front of my (calibrated) stereo camera rig. I cannot place markers in the scene. Using the known baseline between my stereo cameras, I can scale the object to real-world units.

Reconstruction of the object works fine; however, the final orientation of the model in the local coordinate system is arbitrary (as expected with PhotoScan).

What I need is a fixed coordinate system with the first camera as a reference (e.g. Camera0 sitting at the origin with zero tilt/rotation). Simply assigning a fixed position to the camera in the Reference pane does not help, because I would need at least three referenced cameras/points, which I do not have. The only (and important) information I have is the relative position of the object with respect to the camera in the very first image. I will acquire all scans from the same starting point and need them aligned to this position for further processing.

I already read that you can "fake" a georeferenced coordinate system from the values of the local coordinate system by assigning
chunk.transform.matrix = chunk.transform.matrix

My question is now: how do I rotate this faked georeferenced coordinate system to the position and orientation of the first camera in my chunk via Python scripting?
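To make the idea concrete, here is a minimal numpy sketch of the linear algebra I think is involved, under the assumption that in PhotoScan a camera's pose (`camera.transform`) maps camera space into the chunk's internal coordinates, and world coordinates are obtained as `chunk.transform.matrix @ internal`. Choosing the chunk transform as the inverse of Camera0's pose should then place Camera0 at the origin with zero rotation (the 4x4 matrix `cam0` below is a made-up example pose, not API output):

```python
import numpy as np

# Hypothetical 4x4 pose of Camera0 in the chunk's internal coordinates
# (in PhotoScan this would correspond to chunk.cameras[0].transform).
angle = 0.7
R = np.array([[np.cos(angle), -np.sin(angle), 0.0],
              [np.sin(angle),  np.cos(angle), 0.0],
              [0.0,            0.0,           1.0]])
cam0 = np.eye(4)
cam0[:3, :3] = R            # arbitrary rotation
cam0[:3, 3] = [1.5, -0.3, 2.0]  # arbitrary translation

# world = chunk_transform @ internal, so choosing
# chunk_transform = inv(cam0) maps Camera0's pose to the identity:
# the camera sits at the origin with no rotation in the new frame.
chunk_transform = np.linalg.inv(cam0)

pose_world = chunk_transform @ cam0
print(np.allclose(pose_world, np.eye(4)))  # True
```

If this is the right model, the PhotoScan equivalent would presumably be assigning `chunk.cameras[0].transform.inv()` (times any scale factor I want to preserve) to `chunk.transform.matrix`, but I am not sure about the exact API semantics, hence my question.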