Hello everybody. I'm still working on my big project - Realtime 3D Village scanned by UAV - 3300 Images
I had problems getting a model with good textures and geometry at this huge scale, so I settled on a compromise to have something to work with: it was heavy lifting to get from a 2 million polygon model with lots of textures down to a 50k model with a single 8K texture. I tried all kinds of programs to get a good low-poly UV setup and used 3ds Max to project the multi-texture high-poly onto the single-texture low-poly. But it worked.
My client and I are still not satisfied with the quality, but getting good quality from the whole scene in one piece is almost impossible within the budget and with my hardware (I upgraded to a 6-core Xeon with 24 GB RAM).
So I tried to puzzle out the right images for a small piece of the scene and reconstructed just a single building. And voilà: good quality, good texture, small polygon count. The quality factor is now 1:1000. But it was really complicated to find the right images, because the drone flight was planned systematically around the village, not around individual buildings.
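A rough sketch of what I mean by picking the image subset programmatically instead of by eye, for the PhotoScan Python console. The building position, the radius, and the idea of simply disabling far-away cameras are placeholder assumptions of mine, and the distance is measured in the chunk's internal units, so treat it as an illustration only:

```python
# Sketch: disable all aligned cameras whose projection centre lies farther
# than a chosen radius from an approximate building position, so only the
# relevant images are used for the reconstruction of that building.
import PhotoScan

chunk = PhotoScan.app.document.chunk

# Approximate building position in the chunk's internal coordinate system
# (placeholder values - read them off the model view or a marker).
building = PhotoScan.Vector([12.0, -4.5, 3.0])
radius = 15.0  # placeholder, in chunk units

kept = 0
for camera in chunk.cameras:
    if camera.transform is None:      # skip cameras that failed to align
        camera.enabled = False
        continue
    distance = (camera.center - building).norm()
    camera.enabled = distance <= radius
    if camera.enabled:
        kept += 1

print("Cameras kept for this building:", kept)
```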
So I thought it might be a good idea to raise the sparse cloud point count (higher key point / tie point limits during alignment), so I can slice out my parts from the preview rather than scanning through 3000 images myself to find the right ones for specific buildings. A dense cloud for the whole scene takes too much time and is too heavy to work with, so I want to avoid that step and create the dense cloud only for my specific area. I want to save as much time as possible.
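For the "dense cloud only for my specific area" part, shrinking the reconstruction region (the grey bounding box) around the building is the route I have in mind; it can be done in the GUI or, as sketched below, from the Python console. The centre and size values are placeholders:

```python
# Sketch: shrink the reconstruction region around the area of interest
# before building the dense cloud, so only that part of the scene is
# densified. Values are placeholders in the chunk coordinate system.
import PhotoScan

chunk = PhotoScan.app.document.chunk

region = chunk.region
region.center = PhotoScan.Vector([12.0, -4.5, 3.0])   # placeholder centre
region.size = PhotoScan.Vector([20.0, 20.0, 15.0])    # placeholder extents
chunk.region = region

# After resizing the region, run Workflow > Build Dense Cloud as usual;
# only points inside the box are reconstructed.
```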
What would you do? Are there any recommendations or experiences? Just to compare: an image of the old church (from the whole set) and the new one (from the single-building reconstruction).