Could anyone shed some light on this topic?
I have seen that the latest release adds the ability to load depth images: Metashape.app.document.chunk.importLaserScans(filenames = ["/path/to/depth/image-1"], color_filenames = ["/path/to/corresponding/color/image-1"], image_path = "/path/to/destination/folder/(unknown).tif")
Now I have a question: the new iPhone 13 Pro saves both RGB images and depth map files from its LiDAR sensor, and my RGB images are geotagged. When I load them using the line of code above, a file is created in the destination folder, but all the geotags and the roll/pitch/yaw orientation from the image are lost. Furthermore, the iPhone's RGB and depth images have different resolutions, so how is that supposed to work? And when I tried to export the dense point cloud, there was no option to export a georeferenced point cloud.
Am I missing something in this process, or is it just an experimental feature?
Thanks