I've seen a few references to this bowl effect on pages here, and I've also seen what looks like the bowl effect in data. In fact, this was in images taken with the UAV produced by SmartPlanes, the Swedish re-seller of Photoscan.
What causes this? Since the effect covers a whole scene, in my case with over 200 images, I have always assumed that it can't be lens distortion, since the bowl fits a 2nd-order surface. When is this distortion introduced? Alignment? Build Geometry? Georeferencing? Surely it's in one of the first two, since, if I understand correctly, georeferencing just applies a rigid 7-parameter transform, which means the bowl has to be generated earlier. Correct?
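For reference, this is roughly how I checked that the deformation fits a 2nd-order surface. The data here is synthetic and the `fit_quadratic` helper is just illustrative (not my actual dataset), but the idea is an ordinary least-squares fit of a quadratic in x and y to the elevations:

```python
import numpy as np

def fit_quadratic(x, y, z):
    """Least-squares fit of z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f."""
    A = np.column_stack([x**2, y**2, x * y, x, y, np.ones_like(x)])
    coeffs, _, _, _ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Synthetic example: a shallow bowl over a ~1 km scene, plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-500, 500, 2000)
y = rng.uniform(-500, 500, 2000)
z = 1e-6 * (x**2 + y**2) + rng.normal(0.0, 0.02, x.size)

a, b, c, d, e, f = fit_quadratic(x, y, z)
# a and b recover the bowl curvature (~1e-6 here); c, d, e stay near zero
```

With real data I would fit this to check-point or DEM residuals rather than raw elevations, but the quadratic terms dominating is what I mean by "the bowl fits a 2nd-order surface".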
And what's the maths behind the optimisation? Is it relying on a photogrammetry-style epipolar-constraints approach?
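To make the question concrete, my (possibly wrong) mental model of the alignment optimisation is a bundle adjustment minimising reprojection error. The pinhole model and names below are my assumptions for the sketch, not Photoscan's actual internals:

```python
import numpy as np

def project(point_3d, R, t, focal):
    """Simple pinhole projection: world point -> image coordinates."""
    p_cam = R @ point_3d + t              # world frame -> camera frame
    return focal * p_cam[:2] / p_cam[2]   # perspective divide

def reprojection_error(points_3d, cameras, observations):
    """Sum of squared distances between observed and predicted image points.

    cameras: list of (R, t, focal) tuples;
    observations: list of (camera_index, point_index, observed_xy) tuples.
    """
    total = 0.0
    for cam_idx, pt_idx, obs_xy in observations:
        R, t, focal = cameras[cam_idx]
        pred = project(points_3d[pt_idx], R, t, focal)
        total += np.sum((pred - obs_xy) ** 2)
    return total
```

As I understand it, bundle adjustment would minimise this cost jointly over the camera poses, the intrinsics (including any lens-distortion coefficients) and the 3D points, whereas an epipolar-constraints approach would work pairwise between images. I'd like to know which of these Photoscan is actually doing.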
Patrice