Superresolution is an image processing technique by which lower-resolution images captured at slightly different spatial positions are combined to create a higher-resolution image. Previous functional magnetic resonance imaging (fMRI) superresolution studies have relied upon special acquisition protocols to obtain such spatial shifts. However, over the course of a typical fMRI session, subjects' brains are naturally sampled at many different spatial positions due to small inadvertent head movements. By taking advantage of these head movements, it is possible to achieve superresolution using standard fMRI acquisition techniques. The first step of this superresolution procedure involves spatially registering the lower-resolution image volumes, which is routinely done in fMRI analysis to compensate for head movements and scanner drift. The second step involves reconstructing a higher-resolution volume from the registered lower-resolution volumes. Beyond the more sophisticated reconstruction algorithms developed over the past decade, we demonstrate resolution increases simply by averaging multiple lower-resolution volumes resampled into a higher-resolution registered space.
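The two-step procedure above (register, then reconstruct by averaging in a finer common space) can be illustrated with a minimal one-dimensional shift-and-add sketch. This is not the authors' implementation: it assumes the sub-voxel shifts are already known (in practice they would come from the registration step), uses linear interpolation for resampling, and the function names (`upsample_with_shift`, `shift_and_add`) are hypothetical.

```python
import numpy as np

def upsample_with_shift(low, factor, frac_shift):
    """Resample a low-res signal onto a grid `factor` times finer,
    compensating its known sub-voxel acquisition shift (in low-res
    sample units) via linear interpolation. Stands in for the
    registration/resampling step; `frac_shift` is assumed known."""
    n_hi = len(low) * factor
    # Coordinate (in low-res units) of each fine-grid sample,
    # shifted back by the acquisition offset.
    x = np.arange(n_hi) / factor - frac_shift
    idx = np.clip(np.floor(x).astype(int), 0, len(low) - 2)
    w = x - idx
    return (1 - w) * low[idx] + w * low[idx + 1]

def shift_and_add(lows, shifts, factor):
    """Reconstruction step: average the registered low-res signals
    after resampling each into the common high-resolution space."""
    return np.mean([upsample_with_shift(l, factor, s)
                    for l, s in zip(lows, shifts)], axis=0)

# Toy usage: two low-res samplings of the same signal, offset by
# half a low-res voxel, combined at twice the resolution.
hi_truth = np.arange(16.0)
low_a = hi_truth[::2]       # sampled at offset 0
low_b = hi_truth[1::2]      # sampled at offset 0.5 low-res units
recon = shift_and_add([low_a, low_b], [0.0, 0.5], factor=2)
```

On this linear toy signal the two offset samplings interleave perfectly, so the averaged reconstruction recovers the fine grid; real fMRI volumes are 3-D and noisy, where averaging many resampled volumes also preserves signal-to-noise.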
High-resolution imaging is necessary to reveal the topographic details of small subcortical structures such as the lateral geniculate nucleus (LGN) and superior colliculus (SC), but the deep location of these structures precludes the use of the surface coils that increase signal-to-noise for cortical imaging. With a high-resolution scanning sequence on a 3 T scanner, we obtained reliable signals using a 1.5 × 1.5 × 2 mm³ voxel size. Using the superresolution algorithm, we are able to increase this spatial resolution by a factor of 1.5 to 2 in each dimension, achieving voxel volumes of ∼1 mm³ without decreasing signal-to-noise. At this resolution, we report details of the retinotopic structure of the human LGN, SC and nuclei in the pulvinar.
Support has been provided by NIH training grant T32 MH065214 02, NIMH grants R01MH-64043 and P50MH-62196 and the Whitehall Foundation.