Agisoft Metashape
Agisoft Metashape => General => Topic started by: regiser on June 29, 2015, 11:44:16 AM
-
Hi everyone,
Does anybody know the basic principle or algorithm behind the 'Estimate Image Quality' function?
-
It looks at 'border sharpness' (contrast between neighbouring pixels).
Personally I don't find it very useful, because it cannot reliably detect the small directional blur that results from camera shake.
There is some more information in this thread:
http://www.agisoft.com/forum/index.php?topic=1924.msg10245#msg10245
-
There's a very similar thread going at the moment: http://www.agisoft.com/forum/index.php?topic=3981.0
Using the images from the previous post here, I tried the approach I'm currently exploring: applying an edge filter to the image and measuring the standard deviation of the result with ImageMagick (you can get the numbers without actually writing out the filtered images). I split the image in two, below the text, creating "sharp.tif" and "blurry.tif".
ImageMagick batch file:
convert @dir.txt -edge 2 -auto-level -format "%%f %%[fx:standard_deviation]" info: > stats2.txt
... where the input files are listed in "dir.txt" (the doubled %% is batch-file escaping; use single % on the command line).
The result:
sharp.tif 0.135247
blurry.tif 0.0926683
There is no absolute threshold separating sharp from blurry images, but comparing similar images within a sequence may be feasible. Images with locally low values could be flagged for checking or omission.
Attached the filtered images for reference.
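For anyone who wants to experiment without ImageMagick, the same idea can be sketched in a few lines of plain Python. This is only a rough stand-in for -edge (a simple gradient-magnitude filter followed by the standard deviation), run here on synthetic images; the function name and test data are my own, not from either tool.

```python
# Sketch of the edge-filter / standard-deviation blur check described above,
# in pure Python. A gradient-magnitude filter stands in for ImageMagick's
# -edge operator; the images here are synthetic 2D lists of floats in [0, 1].

import math

def edge_stddev(img):
    """Standard deviation of a simple gradient-magnitude map of img."""
    h, w = len(img), len(img[0])
    edges = []
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]   # horizontal difference
            gy = img[y + 1][x] - img[y][x]   # vertical difference
            edges.append(math.hypot(gx, gy))
    mean = sum(edges) / len(edges)
    return math.sqrt(sum((e - mean) ** 2 for e in edges) / len(edges))

# Synthetic test: a hard vertical step edge vs. the same edge as a soft ramp.
sharp  = [[0.0] * 8 + [1.0] * 8 for _ in range(16)]
blurry = [[min(1.0, max(0.0, (x - 4) / 8)) for x in range(16)]
          for _ in range(16)]

print(edge_stddev(sharp) > edge_stddev(blurry))  # sharper image -> higher stdev
```

As in the ImageMagick results above, the number is only meaningful relative to similar images, so this would be run over a whole sequence and the outliers flagged.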