Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2018
Dyadic perceptual learning of orientation discrimination
Author Affiliations
  • Yifei Zhang
    Peking–Tsinghua Center for Life Sciences, Peking University, Beijing, China; School of Psychological and Cognitive Sciences, Peking University, Beijing, China
  • Fang Fang
    Peking–Tsinghua Center for Life Sciences, Peking University, Beijing, China; School of Psychological and Cognitive Sciences, Peking University, Beijing, China
  • Yizhou Wang
    Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China; National Engineering Laboratory for Video Technology, Cooperative Medianet Innovation Center, and School of Electronics Engineering and Computer Science, Peking University, Beijing, China
Journal of Vision September 2018, Vol.18, 270. doi:https://doi.org/10.1167/18.10.270
Abstract

Unlike traditional perceptual learning studies, in which participants are always trained alone, people in everyday settings usually learn together, and social factors such as collaboration and competition could influence the learning process. Here, we tested whether perceptual learning can be influenced by the presence of a learning partner, hereafter referred to as dyadic perceptual learning. We trained participants on an orientation discrimination task, either alone or with a partner, for 6 days of 1040 trials. In each trial, two ring-shaped gratings centered at fixation (outer radius: 4°; inner radius: 1°; contrast: 1.0; spatial frequency: 2 cycles/°) were presented sequentially, and participants were asked to indicate the orientation change from the first grating to the second. Single learners made only one response per trial and received feedback, whereas paired learners were required to make a second response whenever their first responses were inconsistent (i.e., they had to decide whether to change their original response based on their confidence in themselves and in their partner). Paired learners communicated through computers rather than face to face, and they also received feedback. We measured discrimination thresholds at the trained orientation and at its orthogonal (untrained) orientation for both single and paired learners before and after training. Compared with single learners, paired learners achieved better task performance and learned faster, suggesting that perceptual learning is more effective and efficient when one trains with a partner. Notably, dyadic perceptual learning also exhibited a hallmark of traditional perceptual learning: orientation specificity. Interestingly, we also found that a learner's probability of changing his or her original response was positively correlated with the transfer rate to the untrained orientation. These findings suggest that high-level social processes can enhance low-level perceptual learning, and that dyadic training provides a more powerful way to improve human perceptual abilities than traditional perceptual training.

Meeting abstract presented at VSS 2018
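
To make the stimulus parameters concrete, the following is a minimal Python/NumPy sketch of a ring-shaped grating like the one described above (outer radius 4°, inner radius 1°, contrast 1.0, spatial frequency 2 cycles/°). The pixels-per-degree value (ppd), image size, phase, and example orientations are illustrative assumptions, not values reported by the authors, and the sketch omits display calibration, edge smoothing, and timing.

```python
import numpy as np

def ring_grating(orientation_deg, phase=0.0, ppd=40, size_deg=8.5,
                 sf_cpd=2.0, contrast=1.0, outer_deg=4.0, inner_deg=1.0):
    """Sinusoidal grating windowed by an annulus (ring) centered at fixation.

    Grating parameters follow the abstract (outer radius 4 deg, inner radius 1 deg,
    contrast 1.0, 2 cycles/deg); ppd (pixels per degree) is a display-dependent
    assumption. Returns a luminance image in [0, 1] with mean 0.5.
    """
    n = int(size_deg * ppd)
    # Pixel coordinates in degrees of visual angle, centered on fixation
    coords = (np.arange(n) - n / 2 + 0.5) / ppd
    x, y = np.meshgrid(coords, coords)
    theta = np.deg2rad(orientation_deg)
    # Luminance modulates along `orientation_deg`; stripes run perpendicular to this axis
    grating = np.sin(2 * np.pi * sf_cpd * (x * np.cos(theta) + y * np.sin(theta)) + phase)
    # Hard-edged annular aperture between the inner and outer radii
    r = np.hypot(x, y)
    aperture = (r >= inner_deg) & (r <= outer_deg)
    return 0.5 + 0.5 * contrast * grating * aperture

# Example trial pair: a reference grating and a comparison grating rotated by a
# small (hypothetical) near-threshold step
reference = ring_grating(orientation_deg=45.0)
comparison = ring_grating(orientation_deg=45.0 + 0.5)
```

The hard-edged boolean aperture is the simplest choice; the actual experiment may have smoothed the edges of the ring, which this sketch does not attempt to reproduce.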
