DOI: 10.1145/2207676.2207751

On saliency, affect and focused attention

Published: 05 May 2012

Abstract

We study how the visual catchiness (saliency) of relevant information impacts user engagement metrics such as focused attention and emotion (affect). Participants completed tasks in one of two conditions, where the task-relevant information appeared either salient or non-salient. Our analysis provides insights into the relationships between saliency, focused attention, and affect. Participants reported more distraction in the non-salient condition, and non-salient information was slower to find than salient information. Lack of saliency negatively impacted affect, while saliency maintained positive affect, suggesting that saliency is helpful. Participants reported that it was easier to focus in the salient condition, although there was no significant improvement in the focused attention scale rating. Finally, this study suggests that user interest in the topic is a good predictor of focused attention, which in turn is a good predictor of positive affect. These results suggest that enhancing the saliency of topics that interest users is a good strategy for boosting user engagement.




Published In

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012, 3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. focused attention
    2. news entertainment
    3. positive affect
    4. saliency
    5. user engagement
    6. user interests

    Qualifiers

    • Research-article

    Conference

    CHI '12

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


Article Metrics

    • Downloads (last 12 months): 72
    • Downloads (last 6 weeks): 6

    Reflects downloads up to 24 Nov 2024.

    Cited By

    • (2024) STIVi: Turning Perspective Sketching Videos into Interactive Tutorials. Proceedings of the 50th Graphics Interface Conference, 1-13. DOI: 10.1145/3670947.3670969. Online publication date: 3-Jun-2024.
    • (2024) Exploring Saliency Bias in Manipulation Detection. 2024 IEEE International Conference on Image Processing (ICIP), 3257-3263. DOI: 10.1109/ICIP51287.2024.10648063. Online publication date: 27-Oct-2024.
    • (2024) Following the rich and famous: A daily diary study on adolescents’ cognitive, affective, and physiological engagement with positive and negative celebrity content. Current Psychology, 43:36, 28919-28936. DOI: 10.1007/s12144-024-06437-z. Online publication date: 27-Aug-2024.
    • (2023) Emotional Attention: From Eye Tracking to Computational Modeling. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45:2, 1682-1699. DOI: 10.1109/TPAMI.2022.3169234. Online publication date: 1-Feb-2023.
    • (2022) Spotlights: Designs for Directing Learners' Attention in a Large-Scale Social Annotation Platform. Proceedings of the ACM on Human-Computer Interaction, 6:CSCW2, 1-36. DOI: 10.1145/3555598. Online publication date: 11-Nov-2022.
    • (2022) Adaptive Progressive Image Enhancement for Edge-Assisted Mobile Vision. 2022 18th International Conference on Mobility, Sensing and Networking (MSN), 744-751. DOI: 10.1109/MSN57253.2022.00121. Online publication date: Dec-2022.
    • (2022) What really matters? Characterising and predicting user engagement of news postings using multiple platforms, sentiments and topics. Behaviour & Information Technology, 42:5, 545-568. DOI: 10.1080/0144929X.2022.2030798. Online publication date: 7-Feb-2022.
    • (2021) Investigating the Influence of Ads on User Search Performance, Behaviour, and Experience during Information Seeking. Proceedings of the 2021 Conference on Human Information Interaction and Retrieval, 107-117. DOI: 10.1145/3406522.3446024. Online publication date: 14-Mar-2021.
    • (2021) Engaging interaction and long-term engagement with WhatsApp in an everyday life context: Exploratory study. Journal of Documentation, 77:4, 825-850. DOI: 10.1108/JD-07-2020-0115. Online publication date: 12-Jan-2021.
    • (2020) Learning From Science News via Interactive and Animated Data Visualizations: An Investigation Combining Eye Tracking, Online Survey, and Cued Retrospective Reporting. Science Communication, 42:6, 803-828. DOI: 10.1177/1075547020962100. Online publication date: 13-Oct-2020.
