VibsNet: A deep spatiotemporal convolutional neural network architecture

This 41-layer CNN was developed to classify human perception of vibration frequencies by applying computer vision approaches to functional brain images.

The brain activation patterns were converted to spatiotemporal images, which were passed through three pairs of convolution and pooling layers. Different convolution filter sizes were used in different layers. Batch normalization, ReLU activation, and max pooling were applied at each of these convolutional layers.

The output of the convolutional layers was fed into a 36-layer residual network. The output of the residual network was flattened and sent to a fully connected layer followed by a softmax output layer to classify 34 classes of vibration perception.
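The three conv/pool stages determine the spatial size of the feature maps entering the residual network. A minimal sketch of that shape bookkeeping, assuming a 64x64 spatiotemporal input and per-stage filter sizes of 7, 5, and 3 (illustrative assumptions, not the actual VibsNet parameters):

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Spatial size after a convolution layer."""
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel, stride=None):
    """Spatial size after max pooling; stride defaults to the kernel size."""
    stride = stride or kernel
    return (size - kernel) // stride + 1

def stem_output_size(size, kernels=(7, 5, 3)):
    """Run an input through three 'same' conv + 2x2 max-pool stages."""
    for k in kernels:
        size = conv_out(size, k, pad=k // 2)  # 'same' padding keeps the size
        size = pool_out(size, 2)              # 2x2 pooling halves each dimension
    return size

print(stem_output_size(64))  # -> 8: each of the three pooling stages halves 64
```

With 'same' convolutions, only the pooling layers shrink the maps, so the spatial size entering the residual network is the input size divided by 2 per stage.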

Status: Active!

Encoding model of vibration frequency representation in human brain

In this research, we developed a vibration frequency encoding model and showed that many neural populations (i.e., voxels) maintain frequency tuning properties. Using a set of basis functions (hypothetical channel tuning curves), we were able to capture the response patterns of many voxels across the brain.
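The basis-function approach can be illustrated by modeling each voxel's response as a weighted sum of frequency-tuned channels. The Gaussian channel shapes, the specific frequencies and centers, and the least-squares fit below are assumptions for illustration, not the study's actual model:

```python
import numpy as np

def channel_basis(freqs, centers, width):
    """Gaussian tuning curves (stimuli x channels) evaluated at the presented frequencies."""
    return np.exp(-0.5 * ((freqs[:, None] - centers[None, :]) / width) ** 2)

def fit_channel_weights(responses, freqs, centers, width):
    """Least-squares channel weights per voxel (responses: stimuli x voxels)."""
    B = channel_basis(freqs, centers, width)
    W, *_ = np.linalg.lstsq(B, responses, rcond=None)
    return W

freqs = np.array([100., 150., 200., 250., 300.])   # presented frequencies (Hz), assumed
centers = np.array([100., 200., 300.])             # assumed channel centers (Hz)
width = 50.0                                       # assumed channel bandwidth (Hz)

# Simulate one voxel dominated by the middle (200 Hz) channel,
# then recover its dominant channel from the fitted weights.
true_w = np.array([0.1, 1.0, 0.1])
voxel = channel_basis(freqs, centers, width) @ true_w
w_hat = fit_channel_weights(voxel[:, None], freqs, centers, width)
print(int(np.argmax(w_hat[:, 0])))  # -> 1, the 200 Hz channel
```

Because the simulated voxel is generated from the same basis, the fit recovers the channel weights exactly; with real fMRI data the weights would be estimated in the presence of noise.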

However, many other voxels showed different response patterns, including responses that increased or decreased monotonically with frequency. Some voxels were tuned to multiple frequencies that were harmonically related. Finally, some voxels did not respond differentially to frequency, so their responses were captured by constant/uniform models.
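One way to sort voxels into such response classes is to compare least-squares fits of simple candidate models. The sketch below assumes three model forms (constant, linear/monotonic, and a single Gaussian tuning curve) and picks the one with the smallest residual error; the study's actual model-selection procedure may differ:

```python
import numpy as np

def best_model(response, freqs):
    """Return the label of the least-squares-best candidate model for one voxel."""
    designs = {
        "constant": np.ones((len(freqs), 1)),
        "monotonic": np.column_stack([np.ones_like(freqs), freqs]),
        "tuned": np.column_stack([
            np.ones_like(freqs),
            np.exp(-0.5 * ((freqs - freqs.mean()) / freqs.std()) ** 2),
        ]),
    }
    sse = {}
    for name, X in designs.items():
        beta, *_ = np.linalg.lstsq(X, response, rcond=None)
        sse[name] = np.sum((response - X @ beta) ** 2)  # residual sum of squares
    return min(sse, key=sse.get)

freqs = np.linspace(100., 300., 5)
bump = np.exp(-0.5 * ((freqs - freqs.mean()) / freqs.std()) ** 2)  # tuned voxel
ramp = 0.01 * freqs                                                # monotonic voxel
print(best_model(bump, freqs), best_model(ramp, freqs))  # -> tuned monotonic
```

In practice one would penalize model complexity (e.g., AIC) rather than compare raw residuals, since richer models never fit worse.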

Status: Active!

Feature and space dependent computations of bimanual touch

In this project, we used psychophysics and computational modeling to understand how touch information over the hands is combined.

We found that the perceived vibration on one hand is influenced by vibration presented on the other hand. Specifically, when participants judged vibration frequencies, the interactions between the hands became stronger as the hands were held closer together.

However, when participants judged vibration intensities, the interactions between the hands did not depend on hand position. These distinct bimanual interaction patterns were supported by distinct computations for frequency and intensity perception.

Status: Completed! Preprint | Poster PDF (Best poster, TCN 2017) | CCN 2017

Multisensory circuits are embedded in sensory cortex hierarchies

Using fMRI adaptation, we found that multiple sensory brain regions respond to auditory and tactile inputs in a frequency-dependent manner. We also showed that frequency-dependent interactions between audition and touch occur in classically defined sensory brain regions. Finally, we showed that the representational geometry of sensory responses, in addition to patterns of spontaneous signal fluctuations, reflects traditional hierarchical organization. Based on these findings, we conclude that temporal frequency responses in the human brain are consistent with both multimodal processing mechanisms and traditional sensory cortex hierarchies.

Status: Completed! Manuscript in prep. Poster PDF (Best poster, NRC 2017)

Transcranial magnetic stimulation of human brains

We tested the hypothesis that, when attention is deployed to vibration frequency, the somatosensory cortex interacts with a distributed cortical network that supports frequency processing for multiple sensory modalities.

We manipulated somatosensory cortex activity with transcranial magnetic stimulation (TMS) while participants performed auditory, tactile, or crossmodal frequency judgments.

We demonstrated that manipulating somatosensory cortex activity impairs auditory frequency perception during specific behavioral states.

Status: Completed! Current Biology paper