Author Topic: What is the algorithm or principle behind the function of Estimate Image Quality

regiser

Hi everyone,
     Does anybody know the basic principle or algorithm behind the 'Estimate Image Quality' function?

Marcel

It looks at 'border sharpness' (contrast between neighbouring pixels).

Personally I don't find it very useful, because it cannot accurately detect a small directional blur resulting from camera shake.
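Agisoft hasn't published the exact formula, so the following is only a minimal sketch of the general idea: scoring an image by its average local edge contrast. The gradient-magnitude metric and the function name are my illustrative assumptions, not Agisoft's actual implementation.

Code: [Select]
# Sketch of an edge-contrast sharpness score (illustrative assumption,
# not Agisoft's published algorithm).
import numpy as np
from PIL import Image

def sharpness_score(path):
    """Mean gradient magnitude of the grayscale image; higher = sharper."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=float) / 255.0
    gy, gx = np.gradient(gray)               # finite-difference gradients
    return float(np.mean(np.hypot(gx, gy)))  # average local edge contrast

Because a score like this averages contrast over all directions, a small directional blur from camera shake only shifts it slightly, which fits the limitation described above.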

There is some more information in this thread:

http://www.agisoft.com/forum/index.php?topic=1924.msg10245#msg10245

bigben

There's a very similar thread going at the moment: http://www.agisoft.com/forum/index.php?topic=3981.0

Using the images from the previous post here, I tried the approach I'm currently exploring: running an edge filter on the image and measuring the standard deviation of the result with ImageMagick (you can get the numbers without actually writing the filtered images to disk). I split the image in two below the text, creating "sharp.tif" and "blurry.tif".

ImageMagick batch file:
Code: [Select]
convert @dir.txt -edge 2 -auto-level -format "%%f %%[fx:standard_deviation]" info: > stats2.txt

...where the files to process are listed in "dir.txt". (The doubled %% is the batch-file escape for a literal %; at an interactive prompt you'd use single % signs.)

The result:
sharp.tif 0.135247
blurry.tif 0.0926683

There is no absolute value that separates a sharp image from a blurry one, but comparing similar images within a sequence may be feasible. Images with locally low values could be flagged for checking or omission.
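For anyone who'd rather script this than call ImageMagick, here is a rough Python equivalent of the same pipeline (edge filter, level stretch, standard deviation) that also flags locally low values. It assumes Pillow and NumPy; the FIND_EDGES kernel and the 0.7-times-median threshold are illustrative stand-ins, not exact matches for -edge 2 or any proven cut-off.

Code: [Select]
# Rough Python equivalent of the edge-filter + standard-deviation metric.
# FIND_EDGES and the 0.7 * median threshold are illustrative assumptions,
# not exact reproductions of ImageMagick's -edge or a tested cut-off.
import sys
import numpy as np
from PIL import Image, ImageFilter

def edge_stddev(path):
    """Std. dev. of an edge-filtered, contrast-stretched grayscale image."""
    img = Image.open(path).convert("L").filter(ImageFilter.FIND_EDGES)
    arr = np.asarray(img, dtype=float)
    lo, hi = arr.min(), arr.max()
    if hi > lo:                        # crude stand-in for -auto-level
        arr = (arr - lo) / (hi - lo)
    return float(arr.std())

if __name__ == "__main__":
    scores = {p: edge_stddev(p) for p in sys.argv[1:]}
    median = np.median(list(scores.values()))
    for path, s in sorted(scores.items(), key=lambda kv: kv[1]):
        flag = "  <-- check" if s < 0.7 * median else ""
        print(f"{path} {s:.6f}{flag}")

Running it as, e.g., python edge_stddev.py *.tif prints one score per file and flags anything well below the median of the set, mirroring the sharp.tif/blurry.tif comparison above.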

Attached the filtered images for reference.