I pursue new interfaces that break down the silos between auditory perception, neuroscience, and health care. Imagine an environment where the technologies we use every day help us lead fundamentally more rewarding, creative lives while simultaneously contributing to the betterment of our health. My research enables this opportunity by discovering how complex social and creative environments can generate scientifically valid physical and cognitive data. At every step of my research, my goal is to put personalized, neurologically relevant tools in the hands of the general population and those in need.

CogNotes

Having made progress in music-based diagnostic tools for Alzheimer's disease, I am now testing the reliability of the measure by embedding it in a highly social, open-ended music workshop environment. A group of ~20 seniors is being mentored by high school students to become composers through an extensive, multi-month Hyperscore workshop. During the workshop, the CogNotes software simultaneously assesses cognitive functions relevant to Alzheimer's disease diagnosis. The CogNotes project is a collaboration between myself, the Hyperinstruments group, the Lincoln Park School for the Performing Arts, and the Yamaha Corporation.

Sub-acute Hand Rehab via Microsoft Surface

A suspiciously non-music project: together with the Applied Sciences Group of Microsoft and students of Harvey Mudd College, we are testing multi-touch, table-sized surface interfaces within sub-acute rehab clinics. The project wraps finger and wrist flexion/extension exercises in a game environment.
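
As a rough sketch of the interaction loop (not the actual Surface code; every name below is hypothetical), a frame of multi-touch contact points can be reduced to a simple finger-spread score that the game could reward:

```python
import math

def finger_spread(touch_points):
    """Mean distance of each touch point from the hand's centroid.

    touch_points: list of (x, y) screen coordinates, one per detected fingertip.
    A larger value roughly corresponds to greater finger extension.
    """
    if len(touch_points) < 2:
        return 0.0
    cx = sum(x for x, _ in touch_points) / len(touch_points)
    cy = sum(y for _, y in touch_points) / len(touch_points)
    return sum(math.hypot(x - cx, y - cy) for x, y in touch_points) / len(touch_points)

def score_trial(frames):
    """Score a game trial by the largest finger spread achieved across frames."""
    return max(finger_spread(frame) for frame in frames)
```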


EMG-driven instrument control


Developed a real-time classifier to disambiguate forearm control signals during attempted finger presses in a simple, one-handed piano learning environment, and conducted a preliminary study in healthy adult subjects. The system includes an audio interface in lieu of a typical lab DAC. The long-term goal is to classify highly irregular, homogeneous, pathological forearm signals in hemiparetic stroke victims during note sequence exercises, to determine whether non-functional but consistent signaling can be used as reproducible control in a creative environment.
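
The sketch below shows the general shape of such a pipeline, windowed RMS features followed by a toy nearest-centroid classifier, assuming a multi-channel surface EMG array; it is illustrative only, not the classifier used in the study:

```python
import numpy as np

def rms_features(emg, fs=1000, win_ms=200, hop_ms=50):
    """Slide a window over multi-channel EMG and compute per-channel RMS.

    emg: array of shape (n_samples, n_channels); fs is the sampling rate in Hz.
    Returns an array of shape (n_windows, n_channels).
    """
    win = int(fs * win_ms / 1000)
    hop = int(fs * hop_ms / 1000)
    feats = [np.sqrt(np.mean(emg[s:s + win] ** 2, axis=0))
             for s in range(0, len(emg) - win + 1, hop)]
    return np.array(feats)

class NearestCentroidFinger:
    """Toy classifier: one centroid of RMS features per intended finger press."""

    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        dists = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[dists.argmin(axis=1)]
```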

musicGrid


musicGrid is an iPhone application that automatically constructs an auditory-visuospatial working memory assessment from whatever audio media exists on a user's device. This memory assessment is a direct analog of the tests we are currently validating for auditory-visuospatial assessment of Alzheimer's disease. By engaging with assessment as part of technologies situated in our everyday environment, and feeding performance on these tasks over time directly back to the user, we hope to establish cognitive assessment as part of proactive, personalized health management.
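
A minimal sketch of how one such trial could be assembled from a user's own library (the function names and trial structure here are assumptions for illustration, not the shipped app logic):

```python
import random

def build_grid_trial(track_library, grid_size=3, seed=None):
    """Assemble one auditory-visuospatial memory trial from a user's own media.

    track_library: list of track identifiers (e.g., file paths) already on the
    device; it must contain at least grid_size**2 entries.
    """
    rng = random.Random(seed)
    cells = grid_size * grid_size
    tracks = rng.sample(track_library, cells)
    grid = {(row, col): tracks[row * grid_size + col]
            for row in range(grid_size) for col in range(grid_size)}
    target_cell = rng.choice(list(grid))
    return {"grid": grid, "target_track": grid[target_cell], "answer": target_cell}

def score_response(trial, chosen_cell):
    """1 if the user relocated the target track correctly, 0 otherwise."""
    return int(chosen_cell == trial["answer"])
```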

+supported by the Alzheimer's Association and Intel Corporation.

Everyday Technologies For Alzheimer's Care



A grant co-sponsored by the Alzheimer's Association and Intel Corporation has supported the development of new music interfaces for the early detection of Alzheimer's disease. This ongoing research entails:

• validating a first generation of auditory working memory tasks that target the neurological structures at risk in early Alzheimer's pathology.
• determining how varying features of an auditory stimulus advantage or disadvantage different populations based on expertise, preference, age, and education.
• embedding assessment tasks in everyday technologies and situated environments.
• measuring performance on cognitive assessments over the long term.
• studying how assessment results, fed directly back to a patient as part of daily engagement, modulate the patient-doctor relationship.

enabling musical expression in profoundly disabled individuals (EME)

Featured at the TED2008 conference, the interface I developed augments a quadriplegic subject's head movement so that they can expressively control performances of their own compositions, created with the Hyperscore composition software.

After a basic IR input device was established, the majority of the application development entailed working with the subject as a performer, iterating between performance practice and the remapping of input parameters to expressive parameters. We tuned the system so that it was tailored exclusively to the subject's movement, offering control and precision despite the individual's pathological discrepancies and novel sense of which movements should engage certain musical gestures. The result is a personalized instrument. On closer study, this development strategy has implications for how our tools can wrap around any individual, regardless of prior experience, and provide fundamentally creative opportunities.
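
One way to picture the per-subject tuning (a simplified sketch, not the actual mapping code; the sensor readings and MIDI ranges are made up for the example): calibrate the subject's comfortable movement range, then map it onto an expressive parameter with a per-subject response curve.

```python
def calibrate(warmup_samples):
    """Record the subject's comfortable head-position range during a warm-up pass."""
    return min(warmup_samples), max(warmup_samples)

def make_mapping(lo, hi, out_lo, out_hi, curve=1.0):
    """Build a function mapping a head-tracker reading onto an expressive parameter.

    The exponent `curve` is tuned per subject so that small but reliable
    movements can still reach the full expressive range.
    """
    def mapping(x):
        t = (x - lo) / (hi - lo) if hi > lo else 0.0
        t = min(max(t, 0.0), 1.0)
        return out_lo + (out_hi - out_lo) * (t ** curve)
    return mapping

# Example: a subject whose comfortable lateral head travel spans sensor
# readings 135..372 controls MIDI velocity 30..110 with a gentle curve.
lo, hi = calibrate([135, 150, 372, 210, 298])
velocity_of = make_mapping(lo, hi, 30, 110, curve=0.7)
print(velocity_of(300))
```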

auditory games to assess autism prognosis

Central coherence accounts of autism have shown dysfunction in the processing of local versus global information that may be the source of symptoms in social behavior, communication, and repetitive behavior. An application was developed to measure cognitive abilities in central coherence tasks as part of a music composition task. The application was evaluated in collaboration with the Spotlight Program, an interdisciplinary social pragmatics program for children with Asperger's syndrome. This research indicates that it is possible to embed a cognitive measure within a novel music application. Implications for current treatment interventions and longitudinal experimental designs are presented.

dependency structure and harmonic analysis



The following abstract corresponds to research conducted as part of the Computational Cognitive Science course listed below. A portion of my work involves new strategies for music structure analysis. My hope is that this line of work will converge with efforts to develop new music interfaces that can exemplify intuitive music cognition strategies.

Abstract

Many researchers have considered statistical models of music structure representation. However, these models have yet to provide a truly generative framework capable of elucidating the cognitive processes of induction, concept formation, or categorization. This paper examines dependency networks and feature centrality as a possible foundation for this type of inference. I found that features of music structures do not adhere to the same feature centrality principles as artifacts or essentialized categories. Reasons for this discrepancy are discussed. I further propose a hierarchical analysis of the features within a music phrase, through which traditional characteristics of feature centrality may begin to emerge. These findings suggest that music is a unique concept domain, one for which novel strategies must be introduced to infer its cognition or an adequate representation thereof.

music training program for an fMRI study



In collaboration with Amir Lahav, the MusicCure Lab, Harvard Medical School, and the Beth Israel Deaconess Medical Center.

Using creative software for piano learning, we train non-musicians to play short melodies on a piano. We are conducting an fMRI study to look at the neurobehavioral changes associated with listening to, learning and playing music.

rehab sound sculptor

The Rehab Sound Sculptor is a motion tracking interface that allows the researcher to quickly map a subject's movement parameters to any aspect of sound control. The cross-platform software interface requires only a real-time video input. Motion tracking is done by polling the incoming video stream for a user-specified range of color.
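
A minimal sketch of that tracking-and-mapping step, using OpenCV for the color threshold and an arbitrary pitch mapping (the HSV bounds and the mapping are placeholders, not the tool's actual configuration):

```python
import cv2
import numpy as np

LOWER = np.array([100, 120, 70])   # user-specified color range, in HSV (placeholder values)
UPPER = np.array([130, 255, 255])

def track_marker(frame):
    """Return the (x, y) centroid of pixels inside the color range, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def y_to_pitch(y, frame_height, low_midi=48, high_midi=84):
    """Map vertical marker position to a MIDI pitch (one of many possible mappings)."""
    t = 1.0 - y / frame_height
    return int(round(low_midi + t * (high_midi - low_midi)))

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok and (pos := track_marker(frame)) is not None:
    print("marker at", pos, "-> MIDI note", y_to_pitch(pos[1], frame.shape[0]))
cap.release()
```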

The motivation for developing this interface was to implement a rapid prototyping system that would allow us to determine which attributes of sound are potentially more useful than others for providing feedback in motor rehabilitation. A significant body of work, particularly with Parkinson's disease, has shown the rehabilitative benefit of auditory feedback during physical therapy. However, little work has been done to determine exactly which parts of sound are responsible for this benefit. With the Rehab Sound Sculptor interface, we can investigate the relative contributions of specific sound attributes to rehabilitation.

hyperscore at tewksbury state hospital

In the spring of 2004, we introduced the Hyperscore program at Tewksbury State Hospital. Hyperscore, developed by the Hyperinstruments group, is a composition program that allows individuals without music training to compose by manipulating graphical representations of musical motives, harmonic development, and resolution.

The goal of this clinical program was to assess patient change as a result of introducing novel composing technologies in a residential state hospital. We ran weekly Hyperscore sessions with patient groups from the mental health and physical health units. Patient diagnoses included cerebral palsy, various dementias, spina bifida, schizophrenia, manic depression, and others.

As a result of this work, enough patients displayed significant change in functioning across both physical and mental disease that the Hyperscore program has been integrated into physician-prescribed patient treatment at the hospital, where it continues to be used.

Link to Hyperscore at Tewksbury videos and audio here.

analog electronics project in historical synthesis techniques

Following the Laboratory Electronics course, a small group of Media Lab students and I set out to build a vocoder from analog components, without the aid of DSP ICs. We did use an 8051 microcontroller to control an LED tower, laser-cut from acrylic, that displays the power in each passed frequency band.
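
For readers unfamiliar with the technique, here is a digital sketch of what each analog channel was doing and roughly what the LED tower displayed: bandpass each analysis band, estimate its power, and quantize that power to an LED bar height. The band edges, filter order, and LED count below are illustrative, and the original circuit did all of this in hardware rather than code.

```python
import numpy as np
from scipy.signal import butter, lfilter

def band_powers(signal, fs, bands, order=4):
    """Estimate the power in each analysis band of a (digital) channel vocoder.

    The analog build used op-amp bandpass filters and envelope followers; here
    each band is a Butterworth bandpass followed by a mean-square estimate.
    """
    powers = []
    for lo, hi in bands:
        b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
        powers.append(np.mean(lfilter(b, a, signal) ** 2))
    return powers

def led_levels(powers, n_leds=8):
    """Quantize each band power to an LED bar height, as the tower visualized it."""
    p = np.asarray(powers)
    p = p / (p.max() or 1.0)
    return (p * n_leds).round().astype(int)
```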

working computer from scratch

As part of the Laboratory Electronics course, we built a working computer from an 8051 microcontroller with external discrete RAM, PLDs, some low-level visualization, data and address bussing, serial port input, an ADC/DAC, and so on.