Hi team - thank you very much for having me here. I'm loving the Metashape trial I'm running at the moment.
I hope that my request for support isn't too technical.
I want to create a simple game where the player runs through a scene. The scene will be created from a photogrammetry-generated GLB model, for example Buckingham Palace, London. I need to create "fake" GPS data for our 3D model of Buckingham Palace, so that our system thinks the real-world object is located somewhere else (it needs to think that Buckingham Palace actually sits on our "gaming set", a big field 50 miles outside London).
The game controls will be driven by the GPS position of the user's handheld device, obtained through the Geolocation API. When the user is standing in our big field, our "game" system should place their location pin on the corresponding position in the Buckingham Palace model. When the user then walks 20 m north, the icon on the model also moves 20 m north. The geolocation is live, so it tracks their movements: wherever they move in our field / "gaming set", their icon moves correspondingly and proportionately in the model.
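To make this concrete, here is a rough sketch of the conversion I'm picturing on the web side (TypeScript), assuming we pick a single anchor coordinate in the field and pin the model's local origin to it. The anchor values and the function name below are just placeholders of mine, not anything coming out of Metashape:

```typescript
// Rough sketch: convert a live GPS fix into metres east/north of a chosen
// anchor point in the field (the point the model's local origin is pinned to).
// A flat-earth approximation like this is accurate to well under a metre over
// a field-sized area. ANCHOR_LAT / ANCHOR_LON are placeholder values.
const ANCHOR_LAT = 51.25;            // placeholder: field anchor latitude (WGS84)
const ANCHOR_LON = -0.75;            // placeholder: field anchor longitude (WGS84)
const METRES_PER_DEG_LAT = 111_320;  // approx. metres per degree of latitude

function gpsToLocalMetres(lat: number, lon: number): { east: number; north: number } {
  const north = (lat - ANCHOR_LAT) * METRES_PER_DEG_LAT;
  const east =
    (lon - ANCHOR_LON) * METRES_PER_DEG_LAT * Math.cos((ANCHOR_LAT * Math.PI) / 180);
  return { east, north };
}

// Walking 20 m due north in the field gives north ≈ 20 and east ≈ 0, which can
// be applied directly to the pin's position in the model (assuming the model is
// exported in metres with its north axis aligned to true north; if it isn't, I
// assume one fixed rotation about the anchor would sort that out).
```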
I understand the Geolocation API fine, and we've got some interesting models online already. I just need to make the connection between our GLB model and the Geolocation API so that the two understand each other.
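In case it helps to see what I mean by "a connection", this is roughly the glue I'm imagining between the Geolocation API and the model. gpsToLocalMetres is the helper from the sketch above (declared here so the snippet stands alone), and updatePinInModel is a made-up stand-in for whatever ends up moving the marker in the rendered GLB scene, not an existing API:

```typescript
// Sketch of the glue: watchPosition fires on every new GPS fix, we convert it
// to metres east/north of the field anchor, and hand the result to whatever
// places the pin in the GLB scene. Both declared functions below are
// assumptions: gpsToLocalMetres comes from the previous sketch, and
// updatePinInModel is a hypothetical hook into our renderer.
declare function gpsToLocalMetres(lat: number, lon: number): { east: number; north: number };
declare function updatePinInModel(east: number, north: number): void;

const watchId = navigator.geolocation.watchPosition(
  (position) => {
    const { latitude, longitude } = position.coords;
    const { east, north } = gpsToLocalMetres(latitude, longitude);
    updatePinInModel(east, north);
  },
  (error) => console.warn("Geolocation error:", error.message),
  { enableHighAccuracy: true, maximumAge: 0, timeout: 10_000 }
);

// navigator.geolocation.clearWatch(watchId) when the session ends.
```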
Is this something I can bake directly into the model when I process it through Metashape?
I look forward to hearing your insight! Many thanks in advance.