We regularly import meshes back into projects: either photogrammetry meshes that we have cleaned, repaired, or hole-filled in another program (MeshLab, Blender, etc.), or structured light scanning models that we have pre-aligned so the photographs can be used to texture them.
The meshes appear correctly aligned in Metashape, and generating textures works fine, which shows the cameras and the model are oriented correctly relative to each other.
However, when we generate a report, it shows the imported model in a different location, outside the sphere of cameras.
Attached is an image from the report. Again, inside Metashape the two models appear to be in the same place, and generating textures from the cameras works fine on both (showing Metashape knows where the models are relative to the cameras), but the report shows the models in very different places.
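In case it helps, the scripted equivalent of what we do through the GUI would look roughly like the sketch below (file paths are placeholders, and exact keyword arguments may vary between Metashape versions):

import Metashape

doc = Metashape.app.document   # currently open project
chunk = doc.chunk              # active chunk with the aligned cameras

# Import the externally cleaned / pre-aligned mesh (placeholder path)
chunk.importModel(path="cleaned_mesh.obj")

# Texture the imported mesh from the aligned photos (default mapping/blending)
chunk.buildUV()
chunk.buildTexture()

# Generate the processing report where the imported model appears misplaced
chunk.exportReport(path="report.pdf")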