
Individual Differences in Image-Quality Estimations: Estimation Rules and Viewing Strategies

Published: 28 May 2016

Abstract

Subjective image-quality estimation with high-quality images is often a preference-estimation task. Preferences are subjective, and individual differences exist; individual differences are also seen in people's eye movements. A task's subjectivity can result from people using different rules as the basis for their estimations. In two studies, we investigated whether different preference-estimation rules are related to individual differences in viewing behaviour by examining the process of preference estimation for high-quality images. The estimation rules were measured from free subjective reports on important quality-related attributes (Study 1) and from ratings of the attributes' importance in preference estimation (Study 2). The free reports showed that the observers used both feature-based image-quality attributes (e.g., sharpness, illumination) and abstract attributes that involve an interpretation of the image features (e.g., atmosphere and naturalness). In addition, in both studies the observers were classified into three viewing-strategy groups differing in fixation durations. These groups also used different estimation rules: in both studies, the group with medium-length fixations differed in its estimation rules from the other groups. In Study 1, the observers in this group used more abstract attributes than those in the other groups; in Study 2, they considered atmosphere a more important image feature. The study shows that individual differences in a quality-estimation task are related to both estimation rules and viewing strategies, and that this difference is tied to the level of abstraction of the estimations.

Supplementary Material

a14-radun-apndx.pdf (radun.zip)
Supplemental movie, appendix, image, and software files for "Individual Differences in Image-Quality Estimations: Estimation Rules and Viewing Strategies"


Cited By

  • (2022) The Fewer Reasons, the More You Like It! How Decision-Making Heuristics of Image Quality Estimation Exploit the Content of Subjective Experience. Frontiers in Psychology 13. DOI: 10.3389/fpsyg.2022.867874. Online publication date: 21-Jun-2022
  • (2017) Learning to Decide with and without Reasoning: How Task Experience Affects Attribute Weighting and Preference Stability. Journal of Behavioral Decision Making 31, 3, 367--379. DOI: 10.1002/bdm.2063. Online publication date: 4-Dec-2017


Published In

ACM Transactions on Applied Perception, Volume 13, Issue 3
May 2016
137 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/2912576

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 28 May 2016
Accepted: 01 January 2016
Revised: 01 January 2016
Received: 01 November 2015
Published in TAP Volume 13, Issue 3


Author Tags

  1. Subjective image-quality estimation
  2. eye movements
  3. image-quality attributes
  4. individual differences

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • Finnish Doctoral Program in User-Centered Information Technology (UCIT)
  • Academy of Finland Research Programme on the Human Mind project ‘Mind, Picture, Image’

