Author Topic: What is the algorithm or principle behind the function of Estimate Image Quality  (Read 2855 times)


  • Newbie
  • Posts: 18
Does anybody know the basic principle or algorithm behind the 'Estimate Image Quality' function?


  • Sr. Member
  • Posts: 309
It looks at 'border sharpness' (contrast between pixels).

Personally I don't find it very useful, because it cannot accurately detect a small directional blur resulting from camera shake.
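The exact algorithm isn't published, but a common proxy for this kind of border-sharpness score is the variance of a Laplacian (second-derivative) filter response: blur weakens edge contrast, so the response varies less. A minimal pure-Python sketch of that idea (not Metashape's actual implementation):

```python
def laplacian_variance(img):
    """Sharpness proxy: variance of a 3x3 Laplacian response.

    img: 2D list of grayscale values (rows of equal length).
    Higher values suggest stronger edges, i.e. a sharper image.
    """
    h, w = len(img), len(img[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour Laplacian: 4*centre minus the neighbours
            r = (4 * img[y][x]
                 - img[y - 1][x] - img[y + 1][x]
                 - img[y][x - 1] - img[y][x + 1])
            responses.append(r)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# A hard edge scores higher than the same edge after a crude blur.
sharp = [[0, 0, 255, 255]] * 4
soft = [[0, 85, 170, 255]] * 4
assert laplacian_variance(sharp) > laplacian_variance(soft)
```

Note that a global score like this shares the weakness mentioned above: a small directional (motion) blur only weakens edges along one axis, so the overall number can still look acceptable.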

There is some more information in this thread:


  • Sr. Member
  • Posts: 406
There's a very similar thread going at the moment:

Using the images from the previous post here, I tried the approach I'm currently exploring: running an edge filter on the image and measuring the standard deviation of the result with ImageMagick (you can get the numbers without actually producing the filtered images). I split the image in two below the text, creating "sharp.tif" and "blurry.tif".

ImageMagick batch file:
Code: [Select]
convert @dir.txt -edge 2 -auto-level -format "%%f %%[fx:standard_deviation]" info: > stats2.txt
...where the files are listed in "dir.txt".

The result:
sharp.tif 0.135247
blurry.tif 0.0926683

There is no absolute threshold separating sharp from blurry images, but comparing similar images within a sequence may be feasible. Images with locally low values could be flagged for checking or omission.

Attached the filtered images for reference.
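The same idea can be sketched without ImageMagick. This is a rough pure-Python analogue: a simple gradient-magnitude edge filter followed by the standard deviation of the response, normalised to 0..1 as ImageMagick's fx expressions are. It is not ImageMagick's exact `-edge 2` kernel, just an illustration of the same measurement:

```python
import statistics


def edge_stdev(img):
    """Edge-response spread as a relative sharpness score.

    img: 2D list of grayscale values (0..255).
    Applies a forward-difference gradient filter, then returns the
    standard deviation of the normalised gradient magnitudes.
    Blur flattens the edge response, lowering the spread.
    """
    h, w = len(img), len(img[0])
    edges = []
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]  # horizontal gradient
            gy = img[y + 1][x] - img[y][x]  # vertical gradient
            mag = min((gx * gx + gy * gy) ** 0.5, 255)
            edges.append(mag / 255)  # normalise like fx's 0..1 range
    return statistics.pstdev(edges)


# A hard step edge concentrates the response (high stdev); the same
# edge smeared into a ramp spreads it evenly (low stdev), mirroring
# the sharp.tif vs blurry.tif numbers above in relative terms.
sharp = [[0, 0, 255, 255]] * 4
soft = [[0, 85, 170, 255]] * 4
assert edge_stdev(sharp) > edge_stdev(soft)
```

As with the ImageMagick numbers, only the comparison between similar frames is meaningful, not the absolute value.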