Author Topic: Decimate Point Cloud  (Read 221 times)


Decimate Point Cloud
« on: April 10, 2021, 02:58:42 PM »

My workflow for handling the generated point clouds is:

1. Create the dense cloud in High quality.

2. Classify ground points.

3. Create DEM and Orthophoto.

However, the point cloud generated with the High setting becomes extremely heavy. The solution I found is to decimate the cloud: select the cloud > Filter Point Cloud > and specify the spacing, which I understand to be the horizontal distance kept between points.
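As I understand it, spacing-based filtering behaves roughly like horizontal grid subsampling. A minimal sketch of that idea (my own illustration, not Metashape's actual implementation; `decimate_by_spacing` and the sample points are invented for the example):

```python
import math

def decimate_by_spacing(points, spacing):
    """Keep one point per spacing x spacing horizontal cell (grid subsampling)."""
    cells = {}
    for x, y, z in points:
        key = (math.floor(x / spacing), math.floor(y / spacing))
        # keep only the first point seen in each horizontal cell
        cells.setdefault(key, (x, y, z))
    return list(cells.values())

# four points, two of them inside the same 1 m cell -> three survive
points = [(0.1, 0.1, 5.0), (0.2, 0.2, 5.1), (1.6, 0.1, 5.0), (0.1, 1.7, 4.9)]
print(decimate_by_spacing(points, 1.0))
```

Note how the Z coordinate plays no role in which points survive, which is exactly the problem described next.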

This creates a serious flaw for the product I am working on, which is meant to be a precision model.

If there were a way to specify that, within a horizontal distance "D", points whose Z variation is smaller than "x" meters can be eliminated, up to the specified limit, then the notable edges and break points would still be present even after decimation, which would give the point cloud solidity and precision.
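The requested edge-preserving behavior could be sketched like this: bucket points into D x D horizontal cells, thin out cells that are flat in Z, and keep every point in cells where the Z range exceeds "x". This is a hypothetical illustration of the feature request, not an existing Metashape option; `decimate_preserving_edges`, `spacing`, and `max_dz` are invented names:

```python
import math

def decimate_preserving_edges(points, spacing, max_dz):
    """Thin flat areas to one point per cell; keep all points where Z varies."""
    cells = {}
    for p in points:
        key = (math.floor(p[0] / spacing), math.floor(p[1] / spacing))
        cells.setdefault(key, []).append(p)
    out = []
    for pts in cells.values():
        zs = [p[2] for p in pts]
        if max(zs) - min(zs) <= max_dz:
            out.append(pts[0])   # flat cell: one representative point is enough
        else:
            out.extend(pts)      # edge / break line: preserve every point
    return out
```

With spacing = D = 1.0 and max_dz = x = 0.1, a flat cell collapses to a single point while a cell crossing a wall or curb keeps its full density.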


In the office where I work we use the cluster option for processing. One difficulty we have is managing the 3 machines: for example, clearing the "last error", or defining which machine can or cannot take a piece of the current task. We have one good machine, but the other two are not as capable, so it would be useful to restrict those two from tasks like finishing alignment, finishing the dense cloud build, and other huge processes.

Another difficulty I have noticed is that Classify Ground Points is not divided across the cluster. I have tried Classify Points, but it still does not produce results as good, and running Classify Ground Points without division absurdly sacrifices the processing. Alternatively, a way to define how many GB of RAM a task may use would help, since it always generates an unexpected bad-allocation error; that option might save the day.

Best Regards,
« Last Edit: April 12, 2021, 06:51:08 AM by mrv2020 »