I finally found good settings to separate ground points from buildings and trees in a cloud generated from vertical photos.
Unfortunately, the result is a little bit noisy: within a building, a few points are classified as ground. This is annoying because these misclassified points generate "spikes" in the model.
This is not very different from the noise you get from supervised classification of satellite images. To remove this effect, we usually apply a smoothing filter adapted to a classified image (discrete values, as opposed to the continuous values found in a regular image): assign to each pixel the most frequent class (mode) found in a moving window of 3x3 or 5x5 pixels. Another solution is to apply a sequence of dilation / erosion operations on each class.
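For the raster case, this mode filter can be sketched with `scipy.ndimage.generic_filter` (the helper name and window size below are my own choices, not an established API):

```python
import numpy as np
from scipy import ndimage

def mode_filter(classified, size=3):
    """Replace each pixel by the most frequent class in a size x size window."""
    def local_mode(values):
        vals, counts = np.unique(values.astype(int), return_counts=True)
        return vals[np.argmax(counts)]
    return ndimage.generic_filter(classified, local_mode, size=size)

# Toy classified image: one misclassified pixel inside a block of class 1
img = np.ones((5, 5), dtype=int)
img[2, 2] = 0
cleaned = mode_filter(img, size=3)
print(cleaned[2, 2])  # the isolated pixel is reassigned to class 1
```

`generic_filter` is slow on large rasters since it calls a Python function per pixel, but it shows the idea compactly.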
I was wondering if a similar function exists to clean a classified point cloud: analyse the 3D neighbourhood of each point (or cell) and reclassify it into the dominant class.
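In case it helps, here is a minimal sketch of such a 3D majority vote using SciPy's `cKDTree`; the function name, the `k` value, and the toy data are assumptions of mine, not a standard tool:

```python
import numpy as np
from scipy.spatial import cKDTree

def majority_reclass(points, classes, k=15):
    """Reassign each point to the dominant class among its k nearest neighbours
    (the point itself is included in the vote)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    out = np.empty_like(classes)
    for i, neigh in enumerate(idx):
        vals, counts = np.unique(classes[neigh], return_counts=True)
        out[i] = vals[np.argmax(counts)]
    return out

# Toy cloud: 200 "building" points (class 1) with 5 mislabelled as ground (0)
rng = np.random.default_rng(0)
pts = rng.random((200, 3))
cls = np.ones(200, dtype=int)
cls[:5] = 0
fixed = majority_reclass(pts, cls, k=15)
print(fixed.sum())  # the isolated ground labels are voted away -> 200
```

The Python loop over points is the slow part; on multi-million-point clouds you would want to vectorise the vote or use a library such as PDAL instead.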
I found similar functions in Meshlab (Ball-Pivoting and Poisson Disk), but Meshlab is really not friendly with large datasets, and these seem more oriented towards cleaning meshes, while I rather want to clean a point cloud.
Alternatively, a similar functionality would be great to smooth continuous values: give each point the dominant colour of its surroundings.
And let me dream a little bit further: smoothing not only the class or point colour, but also the point elevation?
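The same k-nearest-neighbours machinery would cover the continuous case too; a sketch (again, names and parameters are my own assumptions) replacing an attribute such as elevation by the neighbourhood median:

```python
import numpy as np
from scipy.spatial import cKDTree

def smooth_attribute(points, values, k=5):
    """Replace each point's attribute (a colour channel, elevation, ...)
    by the median over its k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)  # idx has shape (n_points, k)
    return np.median(values[idx], axis=1)

# Toy example: a flat 5x5 grid of points with one elevation spike
xy = np.array([[i, j] for i in range(5) for j in range(5)], dtype=float)
z = np.zeros(25)
z[12] = 5.0  # spike at the centre point
z_smooth = smooth_attribute(xy, z, k=5)
print(z_smooth[12])  # the spike is flattened by the median -> 0.0
```

A median is more robust to such spikes than a mean; for colour you would run it once per channel.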
I'm convinced some Python magician has already worked on this!
Thanks for your help