Age-related changes in auditory and visual interactions in temporal rate perception
Journal of Vision, December 2015, Vol. 15(16), 2. doi: https://doi.org/10.1167/15.16.2
Cassandra J. Brooks, Andrew J. Anderson, Neil W. Roach, Paul V. McGraw, Allison M. McKendrick

Abstract

We investigated how aging affects the integration of temporal rate for auditory flutter (amplitude modulation) presented with visual flicker. Since older adults were poorer at detecting auditory amplitude modulation, modulation depth was individually adjusted so that temporal rate was equally discriminable for 10 Hz flutter and flicker, thereby balancing the reliability of rate information available to each sensory modality. With age-related sensory differences normalized in this way, rate asynchrony skewed both auditory and visual rate judgments to the same extent in younger and older adults. Therefore, reliability-based weighting of temporal rate is preserved in older adults. Concurrent presentation of synchronous 10 Hz flicker and flutter improved temporal rate discrimination consistent with statistically optimal integration in younger but not older adults. In a control experiment, younger adults were presented with the same physical auditory stimulus as older adults. This time, rate asynchrony skewed perceived rate with greater auditory weighting rather than balanced integration. Taken together, our results indicate that integration of discrepant auditory and visual rates is not altered by the healthy aging process once sensory deficits are accounted for, but that aging does abolish the minor improvement in discrimination performance seen in younger observers when concordant rates are integrated.

Introduction
Slow periodic modulations in amplitude over time are a feature of both our auditory (Attias & Schreiner, 1997) and visual experience (Dong & Atick, 1995). Light and sound from the same object are likely to oscillate at similar rates, as do vocalizations and lip movements in speech (Chandrasekaran, Trubanova, Stillittano, Caplier, & Ghazanfar, 2009). Integrating redundant rate information across the senses permits more precise discrimination of temporal rate changes (Koene, Arnold, & Johnston, 2007; Recanzone, 2003). While greatly disparate auditory and visual rates are segregated, partial integration of smaller differences can distort the perceived rate (Recanzone, 2003; Roach, Heron, & McGraw, 2006; Shipley, 1964). Initially, perceived rate was viewed as intrinsically dominated by audition (Welch, DuttonHurt, & Warren, 1986). However, vision and audition contribute equally to rate perception when the auditory cue is sufficiently degraded by reducing the depth of amplitude modulation (Roach et al., 2006). It has been previously documented that healthy older adults require greater modulation depth than younger adults in order to perceive the fluctuation inherent in a fluttering pure tone or flickering light (He, Mills, Ahlstrom, & Dubno, 2008; Kim & Mayer, 1994). It is also known that temporal rate discrimination is poorer for less perceptible modulations in amplitude (Roach et al., 2006; Waugh & Hess, 1994). Therefore, if older adults' reduced modulation sensitivity is sufficient to impair sensitivity to the rate of suprathreshold amplitude changes, then their ability to appropriately integrate or segregate auditory and visual stimuli according to rate similarity could be compromised.
Integration itself can also be altered by aging for some aspects of temporal perception. Older adults can tolerate wider temporal gaps between auditory and visual stimuli yet still see them as simultaneous (Chan, Pianta, & McKendrick, 2014a) and they more frequently report illusory doubling of a single flash accompanied by two beeps (De Loss, Pierce, & Andersen, 2013; McGovern, Roudaia, Stapleton, McGinnity, & Newell, 2014; Setti, Burke, Kenny, & Newell, 2011). From this, an increased tendency to integrate conflicting auditory and visual rates might be expected. However, older adults fuse multiple flashes accompanied by a beep in the same way as younger adults (McGovern et al., 2014). Furthermore, flashes and beeps need not be simultaneous to be perceived as corresponding when part of a matching sequence of repeats over time (Denison, Driver, & Ruff, 2013). Over longer time frames, older adults are more susceptible to integrating incongruent speech in the McGurk effect (Sekiyama, Soshi, & Sakamoto, 2014; Setti, Burke, Kenny, & Newell, 2013) but they retain the ability to integrate congruent speech provided that the visual component is clear (Sommers, 2005; Tye-Murray, Spehar, Myerson, Sommers, & Hale, 2011). Though speech contains corresponding auditory and visual amplitude modulations (Chandrasekaran et al., 2009), semantic content also influences how older adults integrate speech (Maguinness, Setti, Burke, Kenny, & Newell, 2011; Stevenson et al., 2014) thereby making it difficult to infer from speech studies what the influence of aging on basic mechanisms of audio-visual integration might be. 
Additionally, age-related changes in perception are often unequal when performance on complementary auditory and visual tasks is directly compared in the same group of older adults. For example, duration judgments show greater age-related visual than auditory impairment (Lustig & Meck, 2011) but older adults are also more vulnerable to distracting visual information during an auditory task than vice versa (Guerreiro, Murphy, & Van Gerven, 2013; Guerreiro & Van Gerven, 2011). Event-related potentials diminish at earlier latencies for visual than for auditory stimuli (Čeponienė, Westerfield, Torki, & Townsend, 2008), and fMRI blood oxygen level dependent signals decline with increasing presentation rates of visual but not auditory stimuli (Cliff et al., 2013). Current theory holds that the brain weights a pair of sensory cues according to their relative reliability (Ernst & Banks, 2002). Therefore, if aging differentially affects the precision of temporal rate estimates in vision and audition, changes in audiovisual perceived rate are possible even if the underlying ability to integrate is unaffected by aging.
In this study, we compared the effect of rate asynchrony on perceived rate, as well as the effect of rate synchrony on rate discrimination, in a group of younger and older adults. We accounted for potential age-related differences in the precision of temporal rate estimates across the senses by first equating auditory and visual temporal rate discriminability (Roach et al., 2006). This approach allowed us to separate the effect of aging on auditory, visual, and integration abilities. 
Experiment 1
Methods
Participants
We recruited 11 young adults (age range: 22–32, mean 26) and 10 older adults (age range: 60–74, mean 68) from the university and the general population. Our recruitment strategy for older adults was similar to that used in previous experiments on aging from our laboratory (Karas & McKendrick, 2011; McKendrick & Battista, 2013; McKendrick et al., 2013; Chan et al., 2014a, 2014b), and typically attracts fit and active members of the community. Our participants had no history of excessive noise exposure, hearing aid use, or diseases or medications known to affect vision or hearing. Consistent with epidemiological research, normal hearing was defined as a pure tone average for 500, 1000, 2000, and 4000 Hz less than or equal to 25 dB (Cruickshanks et al., 2003). Best corrected visual acuity was 6/7.5 or better in both eyes, achieved with a spectacle prescription with spherical error less than five diopters and astigmatism less than two diopters. A clinical examination of the anterior and posterior eye excluded the presence of ocular disease, as well as cortical or nuclear changes in the intraocular lens greater than Grade 2 (Chylack et al., 1993). Participants with intraocular pressure greater than 21 mmHg were excluded because elevated pressure is associated with reduced flicker sensitivity (Tyler, 1981). All protocols were approved by the University of Melbourne Human Research Ethics Committee, and the participants provided written informed consent according to a protocol consistent with the Declaration of Helsinki.
Experimental stimuli and setup
We produced our auditory temporal rate stimulus (a fluttering sound) by sinusoidally amplitude modulating a 65 dB 500 Hz pure tone presented via a speaker (Acoustimass Cube, BOSE, Framingham, MA). Our visual temporal rate stimulus, a flickering light, was a 0.7° diameter LED that sinusoidally varied in luminance over time about a mean of 438 cd/m2. The LED sat on top of the speaker, surrounded by a black panel, so spatial cues facilitated the percept of a unified audiovisual object. A computer soundcard (SoundBlaster Live: Version 5.12) drove both the LED and speaker, enabling synchronous presentation of the visual and auditory stimuli. Luminance was controlled by inputting an amplitude modulated 2000 Hz carrier into the soundcard and subsequently demodulating it (Puts, Pokorny, Quinlan, & Glennie, 2005). The system was calibrated by measuring LED luminance across a range of input voltages using a PR-650 SpectraScan photometer (Photoresearch, Chatsworth, CA). Stimulus generation and calibration software were custom written in Matlab (Version R2008a, Mathworks, Natick, MA). The experiment was conducted in a quiet room with dim illumination. Spectacle correction was appropriate for the testing distance of 80 cm, including a near addition for older participants. Participants fixated on the LED during all tasks. A chin rest stabilized head position, and a computer keyboard was used to collect responses. 
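For illustration, the fluttering stimulus described above can be sketched in a few lines of Python (this is not the authors' Matlab stimulus code); the sample rate and the 20% modulation depth are placeholder values.

```python
# Minimal sketch of the auditory flutter stimulus: a 500 Hz pure tone sinusoidally
# amplitude modulated at 10 Hz for 500 ms. Sample rate and modulation depth are placeholders.
import numpy as np

fs = 44100                      # sample rate (Hz), placeholder
duration = 0.5                  # stimulus duration (s)
carrier_freq = 500.0            # pure-tone carrier (Hz)
mod_rate = 10.0                 # flutter (amplitude modulation) rate (Hz)
mod_depth = 0.2                 # modulation depth (20%), placeholder

t = np.arange(int(fs * duration)) / fs
carrier = np.sin(2 * np.pi * carrier_freq * t)
envelope = 1.0 + mod_depth * np.sin(2 * np.pi * mod_rate * t)   # sinusoidal AM envelope
flutter = envelope * carrier / (1.0 + mod_depth)                # scale to avoid clipping
```

The visual flicker stimulus follows the same form, with the sinusoidal envelope applied to LED luminance about its mean rather than to a tone.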
Procedure
Our procedure was based on the study by Roach et al. (2006). Participants typically completed the experimental tasks over three sessions, each of no more than two hours; if any tasks remained, a fourth session was attended to complete the remaining trials. Task order was counterbalanced. All tasks were two interval forced choice, with the stimulus duration and interstimulus interval each set to 500 ms. A method of constant stimuli was employed with seven stimulus levels, each presented 20 times; presentations were divided into four blocks of five presentations per level. Participants responded to each trial at their own pace and were provided with rest breaks between blocks as needed. Practice trials were provided to ensure that subjects understood the task and to aid the determination of the appropriate stimulus range. Data were fit with a cumulative Gaussian (Equation 1) using maximum likelihood estimation:

\[ \Psi(x) = \gamma + (1 - \gamma - \lambda)\,\Phi\!\left(\frac{x - \mu}{\sigma}\right) \]

Equation 1: The psychometric function (Ψ) with guessing rate (γ), lapsing rate (λ), and a cumulative Gaussian distribution (Φ) with mean (μ) and standard deviation (σ) (Treutwein, 1995).
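For illustration, the sketch below fits Equation 1 to hypothetical two interval forced choice data by maximum likelihood (written in Python rather than the authors' Matlab). The stimulus levels and response counts are invented, and the guess and lapse rates, which the study analyzed as fitted parameters, are held fixed here for brevity.

```python
# Minimal sketch: maximum likelihood fit of the Equation 1 psychometric function.
# All data values, bounds, and the fixed guess/lapse rates are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def psychometric(x, mu, sigma, gamma=0.02, lam=0.02):
    """Equation 1: guessing rate gamma, lapsing rate lam, cumulative Gaussian Phi."""
    return gamma + (1.0 - gamma - lam) * norm.cdf(x, loc=mu, scale=sigma)

def neg_log_likelihood(params, levels, n_faster, n_trials):
    """Binomial negative log-likelihood of 'test faster' responses at each level."""
    mu, sigma = params
    p = np.clip(psychometric(levels, mu, sigma), 1e-6, 1 - 1e-6)
    return -np.sum(n_faster * np.log(p) + (n_trials - n_faster) * np.log(1 - p))

# Seven test rates (Hz) around the 10 Hz standard, 20 presentations each (invented data)
levels = np.array([8.5, 9.0, 9.5, 10.0, 10.5, 11.0, 11.5])
n_faster = np.array([2, 4, 7, 11, 15, 18, 19])
n_trials = np.full_like(n_faster, 20)

fit = minimize(neg_log_likelihood, x0=[10.0, 1.0],
               args=(levels, n_faster, n_trials),
               bounds=[(8.0, 12.0), (0.05, 5.0)])
mu_hat, sigma_hat = fit.x
# The mean gives the point of subjective equality; the standard deviation gives
# the rate discrimination threshold used throughout the study.
print(f"mu = {mu_hat:.2f} Hz, sigma = {sigma_hat:.2f} Hz")
```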
Experiment 1A: Equating flicker and flutter discriminability
We equated thresholds for discriminating a change in the temporal rate of 10 Hz flicker and flutter for each participant. For the visual task, the two interval forced choice format contrasted the standard 10 Hz flicker with one of seven possible test flicker rates, which varied according to a method of constant stimuli. Participants indicated which interval contained the faster flicker rate by key press. Flicker discrimination thresholds were derived from the standard deviation of the psychometric function (Equation 1). Equivalent discriminability for flicker and flutter required a low auditory modulation depth, so we first determined the smallest modulation in amplitude that gave rise to the percept of flutter (Figure 1A). The standard interval contained an unmodulated tone and the test interval 10 Hz flutter of variable modulation depth. Participants judged which interval contained the fluttering sound. The mean of the psychometric function specified the modulation detection threshold. Flutter discrimination thresholds were then measured using the same format as the visual task (Figure 1B). Thresholds were obtained for four different modulation depths, each a multiple of the individual participant's modulation detection threshold. As flutter rate discriminability varies approximately linearly with auditory modulation depth over this restricted range, we used a linear regression fit to the auditory data to approximate the unique modulation for each participant that equated temporal rate discriminability for 10 Hz flicker and flutter (Figure 1C). This modulation was used in all subsequent audiovisual tasks to allow investigation of integrative ability without the confound of individual differences in auditory and visual sensitivity. 
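To make the matching step concrete, a minimal sketch with hypothetical numbers is shown below: a regression line is fitted to flutter rate discrimination thresholds measured at several modulation depths, and the matched depth is taken where that line crosses the observer's flicker threshold (the intersection illustrated in Figure 1C).

```python
# Minimal sketch of the discriminability-matching step in Experiment 1A.
# All threshold values below are hypothetical.
import numpy as np

depths = np.array([5.0, 10.0, 15.0, 20.0])            # % modulation depths tested
flutter_thresholds = np.array([2.6, 1.9, 1.4, 1.0])   # flutter rate discrimination thresholds (Hz)
flicker_threshold = 1.6                                # flicker rate discrimination threshold (Hz)

# Linear regression of flutter threshold against modulation depth
slope, intercept = np.polyfit(depths, flutter_thresholds, deg=1)

# Modulation depth at which flutter and flicker discriminability are equal
matched_depth = (flicker_threshold - intercept) / slope
print(f"Matched modulation depth: {matched_depth:.1f}%")
```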
Figure 1
 
(A) Detection of modulation in a tone. (B) Discrimination of a change in temporal rate in either auditory flutter or visual flicker. (C) Flutter discrimination performed at four different depths of modulation, each a multiple of the participant's detection threshold. The intersection of the linear regression of these data with the flicker temporal rate discrimination threshold (green line) gives the modulation depth for matched discriminability. (D) Auditory asynchronous task: flutter discrimination in presence of task-irrelevant flicker rates of 10 Hz (blue) and 8 Hz (red). Vertical lines indicate temporal rate judged a perceptual match to the 10 Hz flutter standard. (E) Visual asynchronous task: flicker discrimination in presence of task-irrelevant flutter rates of 10 Hz (blue) and 12 Hz (red). Vertical lines indicate temporal rate judged a perceptual match to the 10 Hz flicker standard. (F) Audiovisual synchronous task: temporal rate discrimination for combined flicker and flutter (blue), compared to flicker discrimination alone (green). Horizontal lines indicate thresholds. All psychometric functions are from the same participant.
Experiment 1B: Integration of asynchronous flicker and flutter rates
Perceived auditory temporal rate shifts when a concurrent visual stimulus oscillates at a different rate. Shifts in perceived visual rate can likewise be induced by asynchronous auditory rates (Roach et al., 2006). We measured this shift in perceived rate using a two interval forced choice task with synchronous 10 Hz auditory flutter and visual flicker, presented in phase, as a reference. For the auditory condition, the test interval contained an auditory flutter rate, which varied with a method of constant stimuli, and a fixed task-irrelevant visual flicker rate. Participants indicated which interval fluttered faster, basing their judgments solely on what they heard. This procedure was repeated for seven task-irrelevant rates (8, 9, 10, 11, 12, 14, and 16 Hz), generating a total of seven psychometric functions (Figure 1D, shown with a slow task-irrelevant rate). The mean of each psychometric function corresponded to the point of subjective equality, the physical test rate of flutter that was perceptually equivalent to the 10 Hz reference (Roach et al., 2006). The visual condition was the reverse with a test interval of variable visual flicker rate and a fixed task-irrelevant auditory flutter rate. Participants indicated which interval flickered faster, basing their judgments solely on what they saw (Figure 1E, shown with a fast task-irrelevant rate). This was repeated for the same range of task-irrelevant rates as the auditory condition (8, 9, 10, 11, 12, 14, and 16 Hz flutter). We restricted the range of rates tested to minimize differences in the perceived depth of amplitude modulation with changing temporal rate, and to avoid the fused perception of roughness (rather than flutter) that occurs at higher temporal rates for auditory stimuli (Fastl, 1997; Marks, 1970). 
Experiment 1C: Integration of synchronous flicker and flutter rates
We measured how precisely participants could discriminate between temporal rates when flicker and flutter were presented simultaneously at the same rate and in phase with each other. In a two interval forced choice task, the 10 Hz reference rate was compared to one of seven possible test rates using a method of constant stimuli. Participants indicated which interval contained the faster rate of fluctuation. The standard deviation of the psychometric function gave the audiovisual rate discrimination threshold. This was compared, on an individual basis, to the visual flicker rate discrimination threshold obtained in Experiment 1A (Figure 1F). Combined presentation of synchronous auditory and visual temporal rates is expected to improve discrimination between rates in line with statistically optimal integration (Koene et al., 2007).
Results
Experiment 1A: Equating flicker and flutter discriminability
Younger and older adults showed no significant difference in their ability to discriminate between visual flicker rates, t(19) = 1.2, p = 0.25 (Figure 2A). However, median amplitude modulation detection thresholds were elevated in the older adults (Mann-Whitney, U = 17, p = 0.008; Figure 2B), indicating reduced sensitivity to auditory amplitude modulation with age (note that auditory parameters were not normally distributed). Older adults required greater modulation depth than younger adults to match the discriminability of flicker and flutter temporal rate changes (Mann-Whitney, U = 18, p = 0.01; Figure 2C). This age-dependent difference was not statistically significant when the median modulation for a match was expressed as a multiple of each individual's threshold for detecting modulation (Mann-Whitney, U = 29, p = 0.07). Altogether, these results suggest that an age-related decrease in auditory temporal rate discriminability occurs secondary to reduced sensitivity to auditory amplitude modulation.
Figure 2
 
(A) Box plot of thresholds for discriminating a change in the temporal rate of 10 Hz flicker. (B) Box plot of modulation detection thresholds (%) obtained by discriminating 10 Hz sinusoidal amplitude modulation of a 500 Hz pure tone from the unmodulated tone. (C) Box plot for the flutter modulation (%) required to match the flutter temporal rate discriminability to the flicker temporal rate discriminability. Median (central line), interquartile range (box) and 10th and 90th percentiles (whiskers) are shown in all box plots.
Experiment 1B: Integration of asynchronous flicker and flutter rates
To address the question of whether or not age altered the mechanism of audiovisual integration for temporal rate, we performed two mixed ANOVAs, one for the auditory asynchronous task and one for the visual asynchronous task. Task-irrelevant rate (8, 9, 10, 11, 12, 14, and 16 Hz) was the within-subjects factor, and age group was the between-groups factor. Responses from two participants were excluded from analysis of the auditory task because psychometric functions could not be adequately fitted to all of their data (the 8 and 9 Hz conditions for one participant, the 14 Hz condition for the other).
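A minimal sketch of such a mixed ANOVA is shown below, assuming the pingouin Python package and a long-format data table; the file name and column names are hypothetical. The non-integer degrees of freedom reported for the visual task below suggest a sphericity (Greenhouse-Geisser) correction of the within-subjects factor.

```python
# Illustrative sketch of the mixed ANOVA design described above, assuming the pingouin
# package and a long-format DataFrame; the file name and column names are hypothetical.
import pandas as pd
import pingouin as pg

# One row per participant per task-irrelevant rate:
#   subject (participant ID), age_group ('younger' or 'older'),
#   irrelevant_rate (8, 9, 10, 11, 12, 14, or 16 Hz),
#   pse (test rate judged perceptually equal to the 10 Hz reference)
df = pd.read_csv("auditory_asynchronous_pse.csv")

aov = pg.mixed_anova(data=df, dv="pse", within="irrelevant_rate",
                     subject="subject", between="age_group")
print(aov.round(3))
```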
There was a main effect of task-irrelevant rate on the physical rate perceived as equivalent to 10 Hz whether participants were discriminating changes in flutter rate, F(6, 102) = 30.9, p < 0.001, or flicker rate, F(3.5, 65.6) = 39.8, p < 0.001. As Figure 3 shows, physical rates faster than 10 Hz were required in the presence of slow task-irrelevant rates (8 and 9 Hz) whereas physical rates slower than 10 Hz were required in the presence of fast task-irrelevant rates (11, 12, 14, 16 Hz). This pattern reflects partial integration of conflicting auditory and visual rates, consistent with previous research (Roach et al., 2006). Complete segregation of asynchronous rates was not observed over the same range of task-irrelevant rates, likely reflecting differences in the modulation waveform and participant expertise with psychophysics between this experiment and prior work (Roach et al., 2006). However, a decline in influence is evident for the faster task-irrelevant rates tested, 14 and 16 Hz, where subjective equivalents level off (Figure 3). 
Figure 3
 
(A) Mean temporal modulation rate of auditory flutter that was subjectively equivalent to the 10 Hz reference for each of the task-irrelevant visual flicker rates. (B) Mean temporal modulation of visual flicker that was subjectively equivalent to the 10 Hz reference for each of the task-irrelevant auditory flutter rates. Younger adults are closed circles, and older adults open circles. Error bars are 95% confidence intervals of the mean. The dashed line indicates the physical temporal rate.
With auditory and visual reliability individually balanced, there was no main effect of age on the subjective equivalent to 10 Hz (auditory: F(1, 17) = 0.26, p = 0.62; visual: F(1, 19) = 0.13, p = 0.73). This suggests that aging does not affect the degree to which asynchronous auditory and visual rates are integrated under conditions controlling for age-related sensory decline. There was also no interaction between age and task-irrelevant rate (audition: F(6, 102) = 0.35, p = 0.91; vision: F(3.5, 65.6) = 0.69, p = 0.58), indicating that rate asynchrony alters perceived rate in the same systematic manner in young and old, even for large rate disparities.
It has previously been shown that visual temporal rate discrimination is less precise in the presence of cross-modal rate asynchrony (Roach et al., 2006). In a supplementary analysis, we performed a mixed ANOVA to compare flicker rate discrimination thresholds between younger and older adults across all experimental conditions (vision alone and vision combined with each of the seven task-irrelevant flutter rates). Concurrent but asynchronous flutter rates elevated flicker rate discrimination thresholds (main effect of task-irrelevant rate: F(4.4, 83) = 3.7, p = 0.006), but there was no main effect of age, F(1, 19) = 0.6, p = 0.44, and no interaction between age group and task-irrelevant rate, F(4.4, 83) = 0.880, p = 0.49. This indicates that older adults did not find the asynchronous tasks perceptually more difficult to complete than younger participants, and is consistent with the fact that psychometric function slopes did not differ significantly between younger and older observers in Experiments 1A and 1C.
Experiment 1C: Integration of synchronous flicker and flutter rates
As Figure 4 shows, mean temporal rate discrimination thresholds based on synchronous flicker and flutter are similar in younger and older adults, t(19) = −0.924, p = 0.37. There was no evidence of greater heterogeneity in older adult responses, as the 95% confidence intervals of the mean are of comparable width in the two groups. Supplementary analysis of psychometric functions found no difference between age groups in mean guess rate (Mann-Whitney U = 52.5, p = 0.87) or mean lapse rate (Mann-Whitney U = 50, p = 0.71). However, it is the relative improvement in performance under audiovisual compared to visual-alone or auditory-alone conditions for each participant that provides a measure of multisensory facilitation (Stein, Stanford, Ramachandran, Perrault, & Rowland, 2009). Combined rather than separate presentation of equally reliable auditory and visual rates is known to improve temporal rate discrimination in a statistically optimal fashion (Koene et al., 2007). This entails reliability-based weighting of the individual sensory estimates of a multisensory object through maximum likelihood estimation to achieve a combined estimate with the smallest possible variance (Ernst & Banks, 2002; Equation 2). Since auditory and visual variances were equivalent by experimental design (see Experiment 1A), maximum likelihood estimation predicts a √2 improvement in precision (Equation 3). Predicted temporal rate discrimination thresholds were calculated for each participant based on their previously measured visual temporal rate discrimination thresholds (Experiment 1A). Paired t tests indicated that audiovisual discrimination thresholds were consistent with maximum likelihood predictions in younger but not older adults; younger: t(10) = 0.70, p = 0.50; older: t(9) = 2.7, p = 0.02. This suggests that older adults were less able to benefit from the addition of synchronous flutter to flicker when discriminating temporal rate changes, despite similar overall performance.
\[ \hat{S}_{AV} = w_A\hat{A} + w_V\hat{V}, \qquad w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \quad w_V = \frac{1/\sigma_V^2}{1/\sigma_A^2 + 1/\sigma_V^2} \]

Equation 2: Maximum likelihood estimation predicts that the audiovisual rate estimate results from the sum of the individual auditory (Â) and visual (V̂) rate estimates, each weighted in proportion to its reciprocal variance (1/σ_A² and 1/σ_V², respectively), such that the weights sum to one (Ernst & Banks, 2002).
\[ \sigma_{AV} = \sqrt{\frac{\sigma_A^2\,\sigma_V^2}{\sigma_A^2 + \sigma_V^2}} \]

Equation 3: Temporal rate discrimination threshold, σ_AV, based on the auditory (σ_A) and visual (σ_V) thresholds, as predicted by maximum likelihood estimation (Ernst & Banks, 2002).
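As a numerical illustration of this comparison (the thresholds below are hypothetical), Equation 3 can be evaluated per participant and the measured audiovisual thresholds compared with the prediction using a paired t test.

```python
# Illustrative sketch of the maximum-likelihood prediction tested in Experiment 1C,
# assuming per-participant thresholds are available as NumPy arrays (values hypothetical).
import numpy as np
from scipy import stats

sigma_v = np.array([1.5, 1.8, 1.2, 1.6, 1.4])   # flicker-alone thresholds (Hz), Experiment 1A
sigma_a = sigma_v.copy()                         # flutter thresholds matched to vision by design
sigma_av_measured = np.array([1.1, 1.4, 0.9, 1.2, 1.0])  # audiovisual thresholds (Hz), Experiment 1C

# Equation 3: predicted audiovisual threshold; with sigma_a == sigma_v this is sigma_v / sqrt(2)
sigma_av_predicted = np.sqrt((sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2))

# Paired t test: do measured audiovisual thresholds match the optimal prediction?
t, p = stats.ttest_rel(sigma_av_measured, sigma_av_predicted)
print(f"Predicted improvement factor: {np.mean(sigma_v / sigma_av_predicted):.2f} (sqrt(2) ≈ 1.41)")
print(f"Paired t test vs. prediction: t = {t:.2f}, p = {p:.3f}")
```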
Figure 4
 
Individual and mean temporal rate discrimination thresholds for younger (closed circles) and older adults (open circles). Thresholds for synchronous flicker and flutter (black symbols) are compared with predicted thresholds for maximum likelihood integration (blue symbols). Error bars are 95% confidence intervals of the mean.
Experiment 2
As age unequally compromised auditory and visual temporal rate discriminability, we conducted a supplementary experiment (see below) to determine whether this resulted in any age differences in the weight the brain applies when integrating flicker and flutter temporal rate estimates. A group of younger adults performed the asynchronous task with the same physical auditory stimulus that resulted in balanced integration in the average older adult. Since this modulation depth exceeded the average modulation for matched discriminability in younger adults in our first experiment, we hypothesized an increase in auditory influence on perceived rate relative to performance under matched discriminability.
Methods
Six younger adults (age range: 23–28, mean 26) discriminated changes in the temporal rate of 10 Hz flutter presented simultaneously with asynchronous flicker, and changes in the temporal rate of 10 Hz flicker presented simultaneously with asynchronous flutter, as described in Experiment 1B. A single task-irrelevant rate of 12 Hz was used, as this corresponded to substantial integration for participants in Experiment 1 (Figure 3). Temporal rate discrimination was performed under two conditions that differed in the depth of auditory modulation. For the matched condition, the fluttering stimulus was degraded by reducing its modulation depth to equate temporal rate discriminability across vision and audition on an individual basis (see Experiment 1A). For the unmatched condition, flutter modulation was set at 20%, the average modulation for equated discriminability in older adults in Experiment 1. Four psychometric functions were generated for each participant to determine whether the physical rate subjectively equivalent to 10 Hz for each sensory modality changed with the degree of auditory modulation. The point of subjective equality to the 10 Hz reference was given by the mean of the psychometric function.
Results
Equivalent flicker and flutter temporal rate discriminability was achieved with a modulation of 6.4% (95% CI [3.8, 9.0]), much lower than the average older adult modulation of 20%. As Figure 5A shows, the point of subjective equality was closer to the true physical value of 10 Hz in the unmatched than in the matched condition for flutter. Figure 5B displays the reciprocal relationship, where the subjective equivalent for flicker is further from its true physical rate of 10 Hz in the unmatched condition. Analysis of the difference in the point of subjective equality to 10 Hz between matched and unmatched conditions confirmed greater auditory influence at 20% modulation whether participants responded to flicker or flutter rate changes; flutter: t(5) = −4.7, p = 0.005; flicker: t(5) = 2.7, p = 0.04 (Figure 5C).
Figure 5
 
Individual participant perceptual match for 10 Hz under matched and unmatched, auditory dominated rate discriminability for (A) flutter and (B) flicker asynchronous rate discrimination tasks with a task-irrelevant rate of 12 Hz. (C) Individual (open circles) and mean (closed circles) differences in perceptual match for flicker and flutter asynchronous rate discrimination tasks. Error bars are the 95% confidence intervals of the mean difference.
Discussion
Whereas older age did not impair discrimination of visual flicker rate changes, discrimination of auditory flutter rate changes was less precise in older adults due to impaired amplitude modulation sensitivity. Older adults retained the ability to partially integrate equally reliable but asynchronous temporal rates of flutter and flicker since distortions in perceived rate were similar to those of younger adults. In contrast, integration of synchronous auditory and visual rates did not facilitate temporal rate discrimination in older adults relative to visual alone performance, though group audiovisual rate discrimination thresholds themselves were unaffected by aging. 
Impaired auditory modulation sensitivity is characteristic of physiological aging (He et al., 2008), which causes a wide range of temporal processing deficits even in older adults with normal audiometric thresholds (Fitzgibbons & Gordon-Salant, 1996). Conversely, though temporal processing likewise declines in the aged visual system (Owsley, 2011), older adults retained the ability to discriminate changes in flicker rate. However, this may reflect saturation of temporal contrast responses given the highly modulated flicker used. Deficits in flicker rate discrimination may emerge at lower modulation depths, since decreased sensitivity to flicker modulation also occurs in aging (Kim & Mayer, 1994). Though both audition and vision show impaired coding of temporal information in animal studies (Palombi, Backoff, & Caspary, 2001; Schatteman, Hughes, & Caspary, 2008; Zhang et al., 2008), the vulnerability of temporal rate processing to aging may differ across the senses.
This age-related auditory deficit renders both amplitude modulation and changes in modulation rate less perceptible to older adults. Whereas our experiment demonstrates that the ability to integrate asynchronous auditory and visual rates is not affected by older age, the age-related decline in auditory rate perceptibility is not normalized in the natural world. The supplementary experiment in younger adults showed that the physical difference in auditory amplitude modulation across age groups was sufficient to alter sensory weighting, since audition dominated rate perception in younger adults at a modulation depth that equalized auditory and visual influence in older adults. Changes in sensory weighting with age have been shown to affect audiovisual orienting tasks, where impaired fixation of auditory targets with age leads to vision dominating fixation of audiovisual targets (Dobreva, O'Neill, & Paige, 2012). Since the mechanism of temporal rate integration is reliability based, our results suggest that the auditory contribution to temporal rate perception may be reduced in older adults. This contrasts with the previous supposition that audition is innately better suited to temporal rate judgments (Welch et al., 1986). Reduced facilitation of temporal rate discrimination by rate synchrony in older adults is also likely to be compounded by the age-related impairment in flutter rate discrimination. However, in this case, significant age-related differences in rate perception may not occur in an everyday setting, given that only a small improvement in temporal rate discrimination thresholds was found in younger adults.
Maintenance of asynchronous rate integration in aging contrasts with previous work showing altered integration of temporally offset cues. Older adults integrate auditory and visual cues into a combined percept over a larger window of time than younger adults (Alm & Behne, 2013; Chan et al., 2014a) and the sound-induced flash illusion persists for greater temporal lags between the flash and the sound (McGovern et al., 2014; Setti et al., 2011). 
Our results may be reconciled with this literature by considering whether the integration process changes with stimulus duration. For example, auditory transients enhance visual search but sustained, synchronous sinusoidal modulations do not (Kösem & van Wassenhove, 2012; Van der Burg, Cass, Olivers, Theeuwes, & Alais, 2010), and multisensory interactions have been speculated to operate differently depending on whether magnocellular or parvocellular visual pathways are stimulated (Jaekl, Pérez-Bellido, & Soto-Faraco, 2014). Consequently, comparing the integration of temporally offset, brief auditory and visual stimuli with the integration of longer, coincident stimuli may not be appropriate.
Few age-related impairments in integration have been reported in the literature. For temporal perception, decreased audiovisual integration has previously been demonstrated for apparent motion in older adults (Roudaia, Sekuler, Bennett, & Sekuler, 2013) and for speech syllables in older adults with hearing impairment (Musacchia, 2009). Typically, provision of redundant cues across sensory modalities benefits older adults more than younger adults, for example through greater facilitation of response times (Diederich, Colonius, & Schomburg, 2008; Hugenschmidt, Peiffer, McCoy, Hayasaka, & Laurienti, 2009; Laurienti, Burdette, Maldjian, & Wallace, 2006; Peiffer, 2007). However, measures of temporal rate perception and response times may not be expected to align given the differences between tasks. Indeed, crossmodal stimulation shortens neural response latency and causes amplitude enhancement mostly at the beginning of the response (Rowland, Quessy, Stanford, & Stein, 2007), which is likely not an advantage for temporal rate judgments made over a longer period of time. That an age-related deficit was found for the integration of synchronous but not asynchronous auditory and visual rates suggests a possible dissociation between the mechanisms and their susceptibility to age-related decline. However, integration may be susceptible to any age-related decline in the sensory coding of rate information, such as decreased phase locking to amplitude modulation for a 500 Hz carrier (Leigh-Paffenroth & Fowler, 2006). The intersensory asynchrony in the neural representation of auditory and visual rates resulting from such a loss in phase locking could limit the benefit derived from rate integration despite physical synchrony of the stimuli.
The role of generalized cognitive decline
As noted in our Methods, we did not use any cognitive assessments in the present study, and so it could be asked whether generalized cognitive decline in our older participants may have influenced our results. We believe this is extremely unlikely for several reasons. Firstly, we find no influence of aging in Experiment 1B (integration of asynchronous rates), and so there is no effect to be explained by cognitive decline. Secondly, where an aging effect was present in Experiment 1C (integration of synchronous rates), our analysis of psychometric function guess and lapse rates showed that our older participants could perform our experiments as reliably as younger participants. Specifically, the capacity of older adults to compare successive intervals in the two interval forced choice task was unimpaired, which argues against any significant influence of age-related decline in attentional resources or working memory on the ability of older adults to complete the experimental tasks. The range of performance in our older participants matched that of our younger group (Figure 4), indicating that our older group was as homogeneous as our younger group and that any difference between groups is not driven by a small number of outliers with poor performance. That we find no evidence for attentional or working memory decline is consistent with our observation that our recruitment strategy tends to attract older participants who are fit and active members of the community (see Methods).
Conclusions
Older adults retain the ability to flexibly resolve intersensory conflict in perceived rate through partial integration of asynchronous auditory and visual temporal rates. However, they are not able to benefit from audiovisual rate synchrony like younger adults when discriminating changes in temporal rate. Age-related decline in auditory modulation sensitivity, which affects perceptibility of both amplitude modulation and changes in flutter rate, is expected to further compound the age-related impairment in synchronous rate integration. However, this does not necessarily imply that age-related differences in discrimination performance will be practically significant since audiovisual facilitation produces only small improvements in precision. In contrast, under everyday conditions where age-related losses in audition are not controlled for, we predict that older adults will rely more on vision to achieve a coherent percept of asynchronous rates through their preserved ability to weight sensory information according to relative reliability. 
Acknowledgments
This research was funded by the Australian Research Council FT0990930 (AMM), and the Wellcome Trust WT097387 (NWR). 
Commercial relationships: none. 
Corresponding author: Allison M. McKendrick. 
Email: allisonm@unimelb.edu.au. 
Address: Department of Optometry and Vision Sciences, University of Melbourne, Parkville, Victoria, Australia. 
References
Alm M., Behne D. (2013). Audio-visual speech experience with age influences perceived audio-visual asynchrony in speech. The Journal of the Acoustical Society of America, 134 (4), 3001–3010, doi:10.1121/1.4820798. [PubMed]
Attias H., Schreiner C. E. (1997). Temporal low-order statistics of natural sounds. In Mozer M. C., Jordan M. I., Petsche T. (Eds.), Advances in neural information processing systems 9 (Vol. 9, pp. 27–33). Cambridge, MA: MIT Press.
Čeponienė, R., Westerfield M., Torki M., Townsend J. (2008). Modality-specificity of sensory aging in vision and audition: Evidence from event-related potentials. Brain Research, 1215, 53–68, doi:10.1016/j.brainres.2008.02.010. [PubMed]
Chan Y. M., Pianta M. J., McKendrick A. M. (2014a). Older age results in difficulties separating auditory and visual signals in time. Journal of Vision, 14 (11): 13, 1–11, doi:10.1167/14.11.13. [PubMed] [Article]
Chan Y. M., Pianta M. J., McKendrick A. M. (2014b). Reduced audiovisual recalibration in the elderly. Frontiers in Aging Neuroscience, 6, 226.
Chandrasekaran C., Trubanova A., Stillittano S., Caplier A., Ghazanfar A. A. (2009). The natural statistics of audiovisual speech. PLoS Computational Biology, 5 (7), e1000436, doi:10.1371/journal.pcbi.1000436. [CrossRef]
Chylack L. T., Wolfe J. K., Singer D. M., Leske M. C., Bullimore M. A., Bailey I. L., …Wu S.-Y. (1993). The lens opacities classification system III. Archives of Ophthalmology, 111 (6), 831–836, doi:10.1001/archopht.1993.01090060119035. [PubMed]
Cliff M., Joyce D. W., Lamar M., Dannhauser T., Tracy D. K., Shergill S. S. (2013). Aging effects on functional auditory and visual processing using fMRI with variable sensory loading. Cortex, 49 (5), 1304–1313, doi:10.1016/j.cortex.2012.04.003. [PubMed]
Cruickshanks K. J., Tweed T. S., Wiley T. L., Klein B. E. K., Klein R., Chappell R., Dalton D. S. (2003). The 5-year incidence and progression of hearing loss: The epidemiology of hearing loss study. Archives of Otolaryngology–Head and Neck Surgery, 129 (10), 1041–1046. [PubMed]
De Loss D. J., Pierce R. S., Andersen G. J. (2013). Multisensory integration, aging, and the sound-induced flash illusion. Psychology and Aging, 28 (3), 802–812, doi:10.1037/a0033289. [PubMed]
Denison R. N., Driver J., Ruff C. C. (2013). Temporal structure and complexity affect audio-visual correspondence detection. Frontiers in Psychology, 3, 619, doi:10.3389/fpsyg.2012.00619. [CrossRef]
Diederich A., Colonius H., Schomburg A. (2008). Assessing age-related multisensory enhancement with the time-window-of-integration model. Neuropsychologia, 46 (10), 2556–2562, doi:10.1016/j.neuropsychologia.2008.03.026. [PubMed]
Dobreva M. S., O'Neill W. E., Paige G. D. (2012). Influence of age, spatial memory, and ocular fixation on localization of auditory, visual, and bimodal targets by human subjects. Experimental Brain Research, 223 (4), 441–455, doi:10.1007/s00221-012-3270-x. [PubMed]
Dong D. W., Atick J. J. (1995). Statistics of natural time-varying images. Network: Computation in Neural Systems, 6 (3), 345–358, doi:10.1088/0954-898X_6_3_003. [CrossRef]
Ernst M. O., Banks M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415 (6870), 429–433, doi:10.1038/415429a. [PubMed]
Fastl H. (1997). The psychoacoustics of sound-quality evaluation. Acta Acustica United with Acustica, 83 (5), 754–764.
Fitzgibbons P. J., Gordon-Salant S. (1996). Auditory temporal processing in elderly listeners. Journal of the American Academy of Audiology, 7 (3), 183–189. [PubMed]
Guerreiro M. J. S., Murphy D. R., Van Gerven P. W. (2013). Making sense of age-related distractibility: The critical role of sensory modality. Acta Psychologica, 142 (2), 184–194, doi:10.1016/j.actpsy.2012.11.007. [PubMed]
Guerreiro M. J. S., Van Gerven P. W. M. (2011). Now you see it, now you don't: Evidence for age-dependent and age-independent cross-modal distraction. Psychology and Aging, 26 (2), 415–426, doi:10.1037/a0021507. [PubMed]
He N. J., Mills J. H., Ahlstrom J. B., Dubno J. R. (2008). Age-related differences in the temporal modulation transfer function with pure-tone carriers. The Journal of the Acoustical Society of America, 124 (6), 3841–3849, doi:10.1121/1.2998779. [PubMed]
Hugenschmidt C. E., Peiffer A. M., McCoy T. P., Hayasaka S., Laurienti P. J. (2009). Preservation of crossmodal selective attention in healthy aging. Experimental Brain Research, 198 (2–3), 273–285, doi:10.1007/s00221-009-1816-3. [PubMed]
Jaekl P., Pérez-Bellido A., Soto-Faraco S. (2014). On the 'visual' in 'audio-visual integration': A hypothesis concerning visual pathways. Experimental Brain Research, 232 (6), 1631–1638, doi:10.1007/s00221-014-3927-8. [PubMed]
Karas R., McKendrick A.M. (2011). Increased surround modulation of perceived contrast in the elderly. Optometry and Vision Science, 88 (11), 1298–1308. [PubMed]
Kim C. B., Mayer M. J. (1994). Foveal flicker sensitivity in healthy aging eyes. II. Cross-sectional aging trends from 18 through 77 years of age. Journal of the Optical Society of America A, 11 (7), 1958–1969, doi:10.1364/JOSAA.11.001958. [PubMed]
Koene A., Arnold D., Johnston A. (2007). Bimodal sensory discrimination is finer than dual single modality discrimination. Journal of Vision, 7 (11): 14, 1–11, doi:10.1167/7.11.14. [PubMed] [Article]
Kösem A., van Wassenhove V. (2012). Temporal structure in audiovisual sensory selection. Plos One, 7 (7), e40936, doi:10.1371/journal.pone.0040936. [CrossRef]
Laurienti P. J., Burdette J. H., Maldjian J. A., Wallace M. T. (2006). Enhanced multisensory integration in older adults. Neurobiology of Aging, 27 (8), 1155–1163, doi:10.1016/j.neurobiolaging.2005.05.024. [PubMed]
Leigh-Paffenroth E., Fowler C. G. (2006). Amplitude-modulated auditory steady-state responses in younger and older listeners. Journal of the American Academy of Audiology, 17 (8), 582–597, doi:10.3766/jaaa.17.8.5. [PubMed]
Lustig C., Meck W. H. (2011). Modality differences in timing and temporal memory throughout the lifespan. Brain and Cognition, 77 (2), 298–303, doi:10.1016/j.bandc.2011.07.007. [PubMed]
Maguinness C., Setti A., Burke K. E., Kenny R. A., Newell F. N. (2011). The effect of combined sensory and semantic components on audio-visual speech perception in older adults. Frontiers in Aging Neuroscience, 3, 19, doi:10.3389/fnagi.2011.00019. [CrossRef]
Marks L. E. (1970). Apparent depth of modulation as a function of frequency and amplitude of temporal modulations of luminance. Journal of the Optical Society of America, 60 (7), 970–977, doi:10.1364/JOSA.60.000970. [PubMed]
McGovern D. P., Roudaia E., Stapleton J., McGinnity T. M., Newell F. N. (2014). The sound-induced flash illusion reveals dissociable age-related effects in multisensory integration. Frontiers in Aging Neuroscience, 6, 250, doi:10.3389/fnagi.2014.00250. [CrossRef]
McKendrick A. M., Battista J. (2013). Perceptual learning of contour integration is not compromised in the elderly. Journal of Vision, 13 (1): 5, 1–10, doi:10.1167/13.1.5. [PubMed] [Article]
McKendrick A. M., Weymouth A. E., Battista J. (2013). Visual form perception from 20 through 80 years. Investigative Ophthalmology & Visual Science, 54 (3), 1730–1739. [PubMed] [Article]
Musacchia G. (2009). Audiovisual deficits in older adults with hearing loss: Biological evidence. Ear & Hearing, 30 (5), 505–514, doi:10.1097/AUD.0b013e3181a7f5b7. [PubMed]
Owsley C. (2011). Aging and vision. Vision Research, 51 (13), 1610–1622, doi:10.1016/j.visres.2010.10.020. [CrossRef]
Palombi P. S., Backoff P. M., Caspary D. M. (2001). Responses of young and aged rat inferior colliculus neurons to sinusoidally amplitude modulated stimuli. Hearing Research, 153 (1–2), 174–180, doi:10.1016/S0378-5955(00)00264-1. [PubMed]
Peiffer A. M. (2007). Age-related multisensory enhancement in a simple audiovisual detection task. Neuroreport, 18 (10), 1077–1081, doi:10.1097/WNR.0b013e3281e72ae7. [PubMed]
Puts M. J. H., Pokorny J., Quinlan J., Glennie L. (2005). Audiophile hardware in vision science; the soundcard as a digital to analog converter. Journal of Neuroscience Methods, 142 (1), 77–81, doi:10.1016/j.jneumeth.2004.07.013. [PubMed]
Recanzone G. H. (2003). Auditory influences on visual temporal rate perception. Journal of Neurophysiology, 89 (2), 1078–1093, doi:10.1152/jn.00706.2002. [CrossRef]
Roach N. W., Heron J., McGraw P. V. (2006). Resolving multisensory conflict: A strategy for balancing the costs and benefits of audio-visual integration. Proceedings: Biological Sciences, 273 (1598), 2159–2168, doi:10.1098/rspb.2006.3578. [CrossRef]
Roudaia E., Sekuler A. B., Bennett P. J., Sekuler R. (2013). Aging and audio-visual and multi-cue integration in motion. Frontiers in Psychology, 4, 267, doi:10.3389/fpsyg.2013.00267. [CrossRef]
Rowland B. A., Quessy S., Stanford T. R., Stein B. E. (2007). Multisensory integration shortens physiological response latencies. The Journal of Neuroscience, 27 (22), 5879–5884, doi:10.1523/JNEUROSCI.4986-06.2007. [CrossRef]
Schatteman T. A., Hughes L. F., Caspary D. M. (2008). Aged-related loss of temporal processing: Altered responses to amplitude modulated tones in rat dorsal cochlear nucleus. Neuroscience, 154 (1), 329–337, doi:10.1016/j.neuroscience.2008.02.025. [PubMed]
Sekiyama K., Soshi T., Sakamoto S. (2014). Enhanced audiovisual integration with aging in speech perception: A heightened McGurk effect in older adults. Frontiers in Psychology, 5, 323, doi:10.3389/fpsyg.2014.00323. [CrossRef]
Setti A., Burke K. E., Kenny R., Newell F. N. (2013). Susceptibility to a multisensory speech illusion in older persons is driven by perceptual processes. Frontiers in Psychology, 4, 575, doi:10.3389/fpsyg.2013.00575. [CrossRef]
Setti A., Burke K. E., Kenny R. A., Newell F. N. (2011). Is inefficient multisensory processing associated with falls in older people? Experimental Brain Research, 209 (3), 375–384, doi:10.1007/s00221-011-2560-z. [PubMed]
Shipley T. (1964). Auditory flutter-driving of visual flicker. Science, 145 (3638), 1328–1330, doi:10.1126/science.145.3638.1328. [PubMed]
Sommers M. S. (2005). Auditory-visual speech perception and auditory-visual enhancement in normal-hearing younger and older adults. Ear and Hearing, 26 (3), 263, doi:10.1097/00003446-200506000-00003. [PubMed]
Stein B., Stanford T., Ramachandran R., Perrault T.,Jr., Rowland B. (2009). Challenges in quantifying multisensory integration: Alternative criteria, models, and inverse effectiveness. Experimental Brain Research, 198 (2–3), 113–126, doi:10.1007/s00221-009-1880-8. [PubMed]
Stevenson R. A., Nelms C., Baum S. H., Zurkovsky L., Barense M. D., Newhouse P. A., Wallace M. T. (2014). Deficits in audiovisual speech perception in normal aging emerge at the level of whole-word recognition. Neurobiology of Aging, 36 (1), 283–291, doi:10.1016/j.neurobiolaging.2014.08.003. [PubMed]
Treutwein B. (1995). Adaptive psychophysical procedures. Vision Research, 35 (17), 2503–2522. [PubMed]
Tye-Murray N., Spehar B., Myerson J., Sommers M. S., Hale S. (2011). Cross-modal enhancement of speech detection in young and older adults: Does signal content matter? Ear and Hearing, 32 (5), 650–655, doi:10.1097/AUD.0b013e31821a4578. [PubMed]
Tyler C. W. (1981). Specific deficits of flicker sensitivity in glaucoma and ocular hypertension. Investigative Ophthalmology & Visual Science, 20 (2), 204–212. [PubMed] [Article]
Van der Burg E., Cass J., Olivers C. N. L., Theeuwes J., Alais D. (2010). Efficient visual search from synchronized auditory signals requires transient audiovisual events. PloS One, 5 (5), e10664, doi:10.1371/journal.pone.0010664. [CrossRef]
Waugh S. J., Hess R. F. (1994). Suprathreshold temporal-frequency discrimination in the fovea and the periphery. Journal of the Optical Society of America A, 11 (4), 1199–1212, doi:10.1364/JOSAA.11.001199. [PubMed]
Welch R. B., DuttonHurt L. D., Warren D. H. (1986). Contributions of audition and vision to temporal rate perception. Perception & Psychophysics, 39 (4), 294–300, doi:10.3758/BF03204939. [PubMed]
Zhang J., Wang X., Wang Y., Fu Y., Liang Z., Ma Y., Leventhal A. G. (2008). Spatial and temporal sensitivity degradation of primary visual cortical cells in senescent rhesus monkeys. The European Journal of Neuroscience, 28 (1), 201–207, doi:10.1111/j.1460-9568.2008.06300.x. [PubMed]