Ushma Majmudar, Jillian Nguyen, Elizabeth Torres; The use of graphical user interfaces (GUIs) to analyze motion and temperature. Journal of Vision 2015;15(12):491. doi: https://doi.org/10.1167/15.12.491.
© ARVO (1962-2015); The Authors (2016-present)
The recent revolution in wearable sensors poses a new challenge: developing analytics that can handle large volumes of rapidly accumulating data from continuous recordings. We developed new analytical techniques to process physiological signals, such as motion and temperature, output by wearable sensor technology. Subjects wore sensors on various parts of the body while dancing, walking, sleeping, performing decision-making tasks, and responding to perceptual stimuli. Sensors registering motion and surface skin temperature were worn for up to 15 hours; inertial measurement units (IMUs) recorded acceleration, gyration, and temperature. A graphical user interface (GUI) was designed and implemented to automatically process large amounts of data and to provide longitudinal tracking of the subjects' physiological signals. A GUI is a human-computer interface through which users issue commands to electronic devices via visual indicators such as push buttons and drop-down menus; it can display outputs as text or graphs based on parameters the user supplies. Using GUIs to analyze large volumes of data makes the analysis easier and more efficient. We paired the GUI with 6 IMUs (APDM, 128 Hz, Oregon) and designed a Matlab program to analyze the data they captured. In addition to its use in typical populations, here we demonstrate the GUI in a variety of clinical populations with neurological disorders and mental illnesses. The former included autism, sensory processing disorder, and a genetic disorder involving a deletion of SHANK-3 that results in autistic traits in some cases; the latter included schizophrenia, bipolar disorder, and general executive dysfunction.
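The abstract does not specify the analysis pipeline, so the following is only a minimal sketch of the kind of batch processing such a GUI might automate: loading tri-axial accelerometer samples from an IMU, computing the acceleration magnitude, and reporting summary statistics for longitudinal tracking. The original program was written in Matlab; Python is used here purely for illustration, and the function names, the three-column data layout, and the use of the 128 Hz rate as the only known parameter are assumptions, not the authors' method.

```python
import numpy as np

FS_HZ = 128  # sampling rate of the APDM IMUs, per the abstract


def acceleration_magnitude(acc_xyz):
    """Euclidean norm of tri-axial acceleration, sample by sample."""
    acc_xyz = np.asarray(acc_xyz, dtype=float)
    return np.sqrt((acc_xyz ** 2).sum(axis=1))


def summarize_session(acc_xyz):
    """Summary statistics a GUI panel might display per recording.

    acc_xyz: array of shape (n_samples, 3) with x/y/z acceleration.
    """
    mag = acceleration_magnitude(acc_xyz)
    return {
        "duration_s": len(mag) / FS_HZ,   # recording length in seconds
        "mean_mag": float(mag.mean()),    # average movement intensity
        "peak_mag": float(mag.max()),     # largest excursion
    }


# Example: one second of synthetic tri-axial data
rng = np.random.default_rng(0)
stats = summarize_session(rng.normal(0.0, 1.0, size=(FS_HZ, 3)))
```

In a GUI, a routine like `summarize_session` would be wired to a push button or drop-down selection so that the same computation runs automatically over every recording, which is the efficiency gain the abstract describes.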
We propose using this GUI to characterize bodily responses during the performance of perceptual tasks, providing real-time, objective read-outs of a subject's perceptual state.
Meeting abstract presented at VSS 2015