
Author Topic: non destructive sparse point cloud optimisation  (Read 2331 times)

Alvaro L

  • Newbie
  • *
  • Posts: 40
non destructive sparse point cloud optimisation
« on: June 05, 2018, 11:58:55 AM »
hi

It would be great if sparse point cloud optimisation via gradual selection could be turned into a non-destructive step (points are masked out rather than deleted), so we don't have to run a new camera alignment if the optimisation turns out not to work well in the dense cloud result.
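
For reference, this is roughly what the step looks like today from the Python console, where the point removal is the part that cannot be taken back once the optimisation has run (just a sketch, assuming the PhotoScan 1.4 Python API; the 0.5 reprojection-error threshold is only an illustrative value):

Code:
import PhotoScan

chunk = PhotoScan.app.document.chunk

# gradual selection by reprojection error (threshold value is illustrative only)
f = PhotoScan.PointCloud.Filter()
f.init(chunk, PhotoScan.PointCloud.Filter.ReprojectionError)
f.selectPoints(0.5)

# the destructive part: selected points are invalidated and hidden...
chunk.point_cloud.removeSelectedPoints()

# ...and the subsequent optimisation rebuilds the adjustment without them
chunk.optimizeCameras()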

Yoann Courtois

  • Sr. Member
  • ****
  • Posts: 316
  • Engineer in Geodesy, Cartography and Surveying
Re: non destructive sparse point cloud optimisation
« Reply #1 on: June 05, 2018, 12:11:41 PM »
+1
--
Yoann COURTOIS
R&D Engineer in photogrammetric process and mobile application
Lyon, FRANCE
--

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 14847
Re: non destructive sparse point cloud optimisation
« Reply #2 on: June 05, 2018, 12:26:16 PM »
Actually, sparse cloud points are not deleted when you "remove" them from the Model view. They are marked as invalid matching points and therefore become hidden.

But the optimization is a non-reversible operation that cannot be undone, so you should keep a duplicate of the original alignment if you don't want to lose the results. Otherwise you will have to reset the camera alignment and use the Align Selected Cameras option to re-align the set based on the available set of matching points.
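
For example, the duplicate can be created from the Python console right before the destructive steps (just a rough sketch based on the current Python API):

Code:
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.chunk

# keep a duplicate of the original alignment before filtering and optimizing
backup = chunk.copy()
backup.label = chunk.label + " (pre-optimization backup)"

# ...gradual selection and point removal on the working chunk...

chunk.optimizeCameras()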
Best regards,
Alexey Pasumansky,
Agisoft LLC

Yoann Courtois

  • Sr. Member
  • ****
  • Posts: 316
  • Engineer in Geodesy, Cartography and Surveying
Re: non destructive sparse point cloud optimisation
« Reply #3 on: June 05, 2018, 12:38:06 PM »
Hi Alexey !

Indeed, they are not deleted but marked as invalid, yet the result is the same: it's not possible to get invalidated points back after optimization.
Even if the optimization itself is not reversible, it would be good to be able to get the invalidated points back afterwards, so that one could re-optimize and get something very close to an "Undo" button.

Duplicating a chunk is sometimes time-consuming when working with huge chunks, and because filtering and optimization form an iterative pair of steps used to converge on an accurate result, one would nearly need to duplicate the chunk each time some filters are applied, so as not to have to undo two (or more) iterations of the process.
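
As a temporary workaround, the validity flags could be snapshotted from the Python console before an aggressive filtering pass and restored if needed, which gives something close to an undo as long as the optimization has not been run yet (just a sketch, assuming points are invalidated through the writable valid flag rather than removed):

Code:
import PhotoScan

chunk = PhotoScan.app.document.chunk
points = chunk.point_cloud.points

# snapshot the validity flags before an aggressive filtering pass
saved_valid = [p.valid for p in points]

# ... gradual selection here, invalidating points with p.valid = False ...

# if the selection was too aggressive, restore the flags
# (this only helps BEFORE optimizeCameras, which remains the irreversible step)
for p, was_valid in zip(points, saved_valid):
    p.valid = was_valid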

Regards
--
Yoann COURTOIS
R&D Engineer in photogrammetric process and mobile application
Lyon, FRANCE
--