To capture complete texture information on a surface with highly varied depth, I take photos at extreme angles. This captures the otherwise-missing data, but introduces unwanted blur unless I use masks...
My current workflow: I leave these photos disabled, texture with just the planar photos, and disable hole filling. Then I mark the remaining holes by hand, filter by each marker, re-enable the extremely angled photos that capture those side and underside surfaces, and mask out everything except the portion I need to fill the hole.
This works perfectly well but is time-consuming. It led me to wonder whether Metashape could calculate the surface normal of each triangle/vertex and its distance from each camera, and then expose filter sliders and automatic masking based on those values.
One slider could filter cameras by distance from the selected faces/points, improving texture resolution by making it easy to gradually disable "distant" cameras by a user-defined amount.
Another slider could instead mask the selected cameras by face or marker selection, using the individual or averaged point normals relative to the camera angle. Metashape would first calculate the normal of every triangle, then calculate the angle between that normal and the direction toward the camera, allowing gradual removal by a user-defined amount. This would mask, per selected camera, every triangle/point whose normal makes an angle with the camera greater than a user-defined threshold.
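To make the angle idea concrete, here is a minimal sketch of the underlying vector math. This is plain Python, not the Metashape API; the function name and parameters are illustrative, assuming coordinates are given as simple 3D tuples in the same coordinate system:

```python
import math


def mask_face_by_angle(face_center, face_normal, camera_pos, max_angle_deg):
    """Return True to keep this face for this camera (illustrative only).

    A face is kept when the angle between its surface normal and the
    direction from the face toward the camera is within the user's
    threshold; larger angles mean the camera sees the face edge-on.
    """
    def unit(v):
        # Normalize a 3D vector to unit length.
        length = math.sqrt(sum(x * x for x in v))
        return [x / length for x in v]

    # Unit vector from the face center toward the camera.
    to_cam = unit([c - f for c, f in zip(camera_pos, face_center)])
    n = unit(face_normal)

    # Angle between normal and view direction via the dot product,
    # clamped to [-1, 1] to guard against floating-point drift.
    cos_a = max(-1.0, min(1.0, sum(a * b for a, b in zip(n, to_cam))))
    return math.degrees(math.acos(cos_a)) <= max_angle_deg
```

For example, a face whose normal points straight at the camera (angle 0°) passes a 60° threshold, while a face seen exactly edge-on (angle 90°) is masked.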
A third slider could mask points by distance from a selected camera, instead of disabling the entire camera. This would be highly useful for complex objects with variable height.
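The distance slider is even simpler math. Again a hedged sketch with illustrative names, not anything from the Metashape API: each face is kept or masked per camera based on its Euclidean distance, rather than disabling the whole photo.

```python
import math


def mask_face_by_distance(face_center, camera_pos, max_distance):
    """Return True to keep this face for this camera (illustrative only).

    Masks only the faces farther than max_distance from the camera,
    so nearby geometry in the same photo still contributes texture.
    """
    # Euclidean distance between face center and camera position.
    d = math.dist(face_center, camera_pos)
    return d <= max_distance
```

Dragging the slider would just re-evaluate this threshold against the precomputed per-face distances, which is why it could feel interactive.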
Here is an album illustrating what I mean; apologies that the illustrations are rough and the coloring was a bit hasty, so they only loosely represent the idea.
https://ibb.co/album/0CCt2p
The stack of magnets represents the surface normal. Green is keep, red is mask, and yellow is where the surface normal angle is within the user's selected range but has also been masked out by distance from the camera.
If you wanted to get really fancy, you could add an exception that allows excessive normal/camera angle differences only when no other suitable camera exists. So instead of simply masking everything above, say, a 90-degree angle for a photo, that camera's points/faces exceeding the threshold would be kept if no other view sees them at a lower angle.
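The fallback rule above can be sketched in a few lines. This is a hypothetical helper, not Metashape code: for one face, given each camera's viewing angle to that face's normal, it keeps every camera within the threshold, and when none qualify it falls back to the single best (lowest-angle) camera rather than leaving a texture hole.

```python
def cameras_with_fallback(angles, max_angle_deg):
    """Return the set of cameras allowed to texture one face.

    angles: dict mapping camera name -> angle in degrees between that
    camera's view direction and the face normal (illustrative input).
    Keeps all cameras within the threshold; if none qualify, keeps
    only the lowest-angle camera as a fallback so the face still
    receives texture from the least-oblique available view.
    """
    within = {cam for cam, a in angles.items() if a <= max_angle_deg}
    if within:
        return within
    # Fallback: no camera sees this face within the threshold,
    # so keep the single best one instead of masking the face everywhere.
    best = min(angles, key=angles.get)
    return {best}
```

So a face seen at 95° and 110° by its only two cameras would keep the 95° view under a 60° threshold, while a face with at least one sub-threshold view would mask the oblique ones as usual.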
Vector math makes my brain go blank, so I understand this isn't a trivial request, but you are all geniuses in my book. Metashape already has the best masking and filtering tools; this would make it the undisputed texturing champion. Thank you for reading.
Share your thoughts or suggestions! Perhaps some element of this is already part of Metashape's texturing algorithm?