As we will have large production projects in the future and have just built a powerful server, we want to find a large dataset to stress-test our stack. But it seems quite hard to find one on the Internet; all I have found so far are:
https://www.sensefly.com/drones/example-datasets.html
https://dronemapper.com/sample_data
I want to test with a dataset of 10,000+ images, with geotags and ideally with GCPs.
Is it possible to get a larger dataset for testing and training, or are there other ways to build one, e.g. combining small datasets from around the world into one, duplicating files, or buying datasets from providers?
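For the file-duplication idea, here is a minimal sketch of what I had in mind, assuming a folder of geotagged JPEGs (the function name and folder layout are just my own illustration). Byte-identical copies keep their EXIF geotags, but note that most photogrammetry pipelines will treat exact duplicates as redundant views, so this would mainly stress-test I/O and ingestion rather than reconstruction quality:

```python
# Hypothetical sketch: inflate a small geotagged dataset by copying
# the original images under new names until a target count is reached.
import shutil
from pathlib import Path

def inflate_dataset(src_dir: str, dst_dir: str, target_count: int) -> int:
    """Cycle through the JPEGs in src_dir, copying them into dst_dir
    until dst_dir holds target_count files. Returns the copy count."""
    src = sorted(Path(src_dir).glob("*.jpg")) + sorted(Path(src_dir).glob("*.JPG"))
    if not src:
        raise ValueError(f"no JPEG images found in {src_dir}")
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = 0
    while written < target_count:
        original = src[written % len(src)]
        copy_name = out / f"{original.stem}_{written:05d}{original.suffix}"
        shutil.copy2(original, copy_name)  # copy2 preserves metadata/timestamps
        written += 1
    return written
```

Would duplicated images like this be a valid stress test, or would the pipeline's duplicate detection make the results meaningless?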