Displaying all 2 publications

  1. Irwantoro K, Nimsha Nilakshi Lennon N, Mareschal I, Miflah Hussain Ismail A
    Q J Exp Psychol (Hove), 2023 Feb;76(2):450-459.
    PMID: 35360991 DOI: 10.1177/17470218221094296
    The influence of context on facial expression classification is most often investigated using simple cues in static faces portraying basic expressions with a fixed emotional intensity. We examined (1) whether a perceptually rich, dynamic audiovisual context, presented in the form of movie clips (to achieve closer resemblance to real life), affected the subsequent classification of dynamic basic (happy) and non-basic (sarcastic) facial expressions and (2) whether people's susceptibility to contextual cues was related to their ability to classify facial expressions viewed in isolation. Participants classified facial expressions, gradually progressing from neutral to happy/sarcastic in increasing intensity, that followed movie clips. Classification was more accurate and faster when the preceding context predicted the upcoming expression than when it did not. Speeded classifications suggested that predictive contexts reduced the emotional intensity required for accurate classification. More importantly, we show for the first time that participants' accuracy in classifying expressions without an informative context correlated with the magnitude of the contextual effects they experienced: poor classifiers of isolated expressions were more susceptible to a predictive context. Our findings support the emerging view that contextual cues and individual differences must be considered when explaining the mechanisms underlying facial expression classification.
  2. Hussain Ismail AM, Solomon JA, Hansard M, Mareschal I
    Proc Biol Sci, 2019 Nov 06;286(1914):20191492.
    PMID: 31690239 DOI: 10.1098/rspb.2019.1492
    Ambiguous images are widely recognized as a valuable tool for probing human perception. Perceptual biases that arise when people make judgements about ambiguous images reveal their expectations about the environment. While perceptual biases in early visual processing have been well established, their existence in higher-level vision has been explored only for faces, which may be processed differently from other objects. Here we developed a new, highly versatile method of creating ambiguous hybrid images comprising two component objects belonging to distinct categories. We used these hybrids to measure perceptual biases in object classification and found that images of man-made (manufactured) objects dominated those of naturally occurring (non-man-made) ones in hybrids. This dominance generalized to a broad range of object categories, persisted when the horizontal and vertical elements that dominate man-made objects were removed, and increased with the real-world size of the manufactured object. Our findings show for the first time that people have perceptual biases to see man-made objects and suggest that extended exposure to manufactured environments in our urban-living participants has changed the way that they see the world.
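    The abstract names a method for constructing ambiguous hybrid images from two component objects but does not describe how the hybrids are built. Purely as an illustration of the general idea, the Python sketch below (assuming NumPy and SciPy are available) blends the coarse spatial structure of one image with the fine detail of another; the function name make_hybrid, the Gaussian low-pass approach, and the sigma/weight parameters are assumptions for this sketch and do not reflect the authors' actual procedure.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def make_hybrid(img_a, img_b, sigma=6.0, weight=0.5):
        """Blend two grayscale images (float arrays in [0, 1]) into one
        ambiguous hybrid: low spatial frequencies from img_a, high spatial
        frequencies from img_b, mixed according to `weight`."""
        low_a = gaussian_filter(img_a, sigma)           # coarse structure of image A
        high_b = img_b - gaussian_filter(img_b, sigma)  # fine detail of image B (zero-centred)
        hybrid = weight * low_a + (1 - weight) * (high_b + 0.5)
        return np.clip(hybrid, 0.0, 1.0)

    # Usage with random-noise stand-ins for two object photographs
    rng = np.random.default_rng(0)
    img_a = rng.random((256, 256))
    img_b = rng.random((256, 256))
    hybrid = make_hybrid(img_a, img_b)

    In practice, the two component images would be photographs of objects from distinct categories (e.g., one manufactured and one natural object), and the relative weighting determines how strongly each component competes for the observer's classification.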