The signals delivered by different sensory modalities provide complementary information about the environment. A key component of interacting with the world is directing one's sensors so as to extract task-relevant information and thereby optimize subsequent perceptual decisions, a process often referred to as active sensing. Importantly, processing multisensory information acquired actively from multiple sensory modalities requires the interaction of multiple brain areas over time. Here we investigated the neural underpinnings of active visual-haptic integration during performance of a two-alternative forced choice (2AFC) reaction time (RT) task. We asked human subjects to discriminate the amplitude of two texture stimuli (a) using only visual (V) information, (b) using only haptic (H) information, and (c) combining the two sensory cues (VH), while the electroencephalogram (EEG) was recorded. To quantify multivariate interactions between EEG signals and active sensory experience in the three sensory conditions, we employed a novel information-theoretic methodology. This approach provides a principled way to quantify the contribution of each sensory modality to the perception of the stimulus and to assess whether the respective neural representations interact to form a percept of the stimulus and ultimately drive perceptual decisions. Applying this method to our data, we identified (a) an EEG component (comprising frontal and occipital electrodes) carrying behavioral information common to the two sensory inputs and (b) another EEG component (mainly motor) reflecting a synergistic representational interaction between the two sensory inputs. We suggest that the proposed approach can be used to elucidate the neural mechanisms underlying cross-modal interactions in active multisensory processing and decision-making.