Sensory systems

I am interested in how sensory systems work in ecological environments, and in particular in the perception of space, which is shared across almost all sensory modalities (including pain). I have developed two main lines of work. First, unlike lab environments, ecological environments are never simple (simple organisms do not live in simple environments either). They are complex, but in the sense that they obey complex laws, not in the sense that they are “noisy” (25). I have developed theories of how neural circuits might deal with this complexity. Second, after a long personal reflection on the conceptual foundations of theoretical and computational neuroscience, I have concluded that decisive progress on modeling perception cannot be made without considering sensorimotor systems. The reason, briefly, is that without an integrated sensorimotor approach, one has no choice but to arbitrarily interpret neural activity as behavior (or worse, as “perceptual states”), rather than to model behavior. For this reason, I have started working on the theory of sensorimotor systems. For the same reason, I have also started a project on the “swimming neuron”, which is described elsewhere.

  1. Dealing with complexity:

This work on the complexity of ecological environments starts from the view that perception relies on the identification and manipulation of models of the world, understood as relations between observables (i.e., not necessarily generative models), where the observables are sensory signals. I call these perceiver-oriented models “Subjective physics” (22). This view connects with major theories in psychology (Gestalt psychology, Gibson's ecological approach, O'Regan's sensorimotor theory), philosophy of mind (Poincaré, Merleau-Ponty) and linguistics (Lakoff). We have developed this line of research mostly in the field of sound localization.

You can have a look at a presentation I almost gave at the Champalimaud Institute: “An ecological approach to neural computation”.

Ecological acoustics

Our first goal is to characterize the structure of sounds in natural environments (see also my series of blog posts “What is sound” and my paper on “subjective physics”). Animals and humans use binaural cues to locate sound sources in space, in particular interaural time differences (ITDs). These are often modelled as a fixed difference of propagation delays between the two ears, but this is not accurate: because of diffraction, ITDs depend on sound frequency. We have quantified this property with binaural recordings on stuffed animals, humans and physical models (17, 19, 21). We then found that the frequency dependence of ITDs, measured acoustically, matches the frequency dependence of the preferred ITDs of binaural neurons, measured electrophysiologically (19). This suggests that binaural neurons are tuned to acoustical features of ecological environments, rather than to fixed interaural delays, and we proposed that this tuning is achieved by a combination of cochlear disparities (binaural neurons receiving inputs from mismatched places on the two cochleae) and axonal delay mismatches.
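
For illustration, here is a minimal sketch (toy code with arbitrary values, not our analysis pipeline) of how a frequency-dependent ITD can be read off a pair of binaural impulse responses, as the interaural phase difference divided by 2πf:

```python
import numpy as np

def itd_vs_frequency(ir_left, ir_right, fs):
    """ITD as a function of frequency, taken as the delay of the right ear relative to the
    left ear: interaural phase of the cross-spectrum divided by 2*pi*f."""
    f = np.fft.rfftfreq(len(ir_left), d=1.0 / fs)
    cross = np.fft.rfft(ir_left) * np.conj(np.fft.rfft(ir_right))
    ipd = np.unwrap(np.angle(cross))                      # interaural phase difference (radians)
    return f, ipd / (2 * np.pi * np.maximum(f, 1e-12))    # ITD in seconds (avoid dividing by 0)

# Synthetic example: two pure propagation delays (300 and 500 microseconds) give a nearly
# constant ITD of about 200 microseconds; measured impulse responses, shaped by diffraction,
# give a frequency-dependent curve instead.
fs = 44100
ir_left, ir_right = np.zeros(1024), np.zeros(1024)
ir_left[int(0.0003 * fs)] = 1.0
ir_right[int(0.0005 * fs)] = 1.0
freqs, itd = itd_vs_frequency(ir_left, ir_right, fs)
```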

Another complication of real environments is reflections. It is often assumed that echoes are somehow suppressed by the auditory system. But this is not possible for early reflections, such as reflections on the ground (a type of reflection that is always part of our environment). Instead, these produce interference that changes the binaural cues relative to the anechoic case, which we have quantified in detail (14).
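
A toy illustration of this interference effect (an idealized image-source model with arbitrary delays, not our measurement setup): adding a single attenuated ground reflection to the direct sound at each ear shifts the interaural phase, and therefore the ITD, in a frequency-dependent way.

```python
import numpy as np

f = np.linspace(100.0, 1500.0, 400)                      # frequency (Hz)
g = 0.6                                                  # attenuation of the ground reflection
# (direct-path delay, reflected-path delay) at each ear, in seconds (arbitrary toy values)
tau = {'left': (0.0003, 0.0018), 'right': (0.0005, 0.0021)}

def transfer(direct, reflected):
    # Direct sound plus one delayed, attenuated ground reflection (image-source model)
    return np.exp(-2j * np.pi * f * direct) + g * np.exp(-2j * np.pi * f * reflected)

H_left, H_right = transfer(*tau['left']), transfer(*tau['right'])
ipd = np.unwrap(np.angle(H_left * np.conj(H_right)))     # interaural phase difference
itd_with_ground = ipd / (2 * np.pi * f)
# The result oscillates around the anechoic value (200 microseconds here), because the
# reflection interferes differently with the direct sound at the two ears.
```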

Neural models of sensory systems

My hypothesis is that sensory relations, which constitute subjective models of the world, are identified as temporal invariants in the sensory flow (i.e., relations that are satisfied over a contiguous period of time); this is most directly connected with Gibson's notion of “invariant structure”. Physiologically, I have proposed that relations between sensory signals are reflected in relations between the timings of spikes, i.e., in neural synchrony that is tuned to specific sensory models. I have proposed the concept of “synchrony receptive field” to describe the set of sensory signals that elicit synchronous firing in a given set of neurons, together with neural network models that can identify sensory models based on selective synchrony (13) (see my comments on the difference between correlation and synchrony).
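
A minimal sketch of this idea (generic leaky integrate-and-fire neurons with arbitrary parameters and a hypothetical delay relation, not the network models of (13)): two neurons receive two sensory signals, and a conduction delay on one side embodies a specific hypothesized relation between the signals; the pair fires synchronously only for stimuli that satisfy this relation, i.e., for stimuli inside the pair's synchrony receptive field.

```python
import numpy as np

def lif_spikes(current, dt=1e-4, tau=0.01, threshold=1.0):
    """Leaky integrate-and-fire neuron driven by a time-varying input; returns spike times."""
    v, spikes = 0.0, []
    for i, I in enumerate(current):
        v += dt * (-v + I) / tau
        if v > threshold:
            spikes.append(i * dt)
            v = 0.0
    return np.array(spikes)

def synchrony(spikes_a, spikes_b, window=1e-3):
    """Fraction of neuron A's spikes that have a spike of neuron B within +/- window."""
    if len(spikes_a) == 0 or len(spikes_b) == 0:
        return 0.0
    nearest = np.min(np.abs(spikes_a[:, None] - spikes_b[None, :]), axis=1)
    return float(np.mean(nearest < window))

# Hypothesized relation between the two sensory signals: signal B is signal A delayed by `lag`.
# Neuron A's spikes reach the comparison point with a matching conduction delay, so the pair
# fires synchronously only when the stimulus actually satisfies the relation.
rng = np.random.default_rng(0)
dt, lag = 1e-4, 30                                       # lag = 3 ms
stimulus = 1.2 + 4.0 * np.convolve(rng.standard_normal(20000), np.ones(100) / 100, mode='same')
signal_a = stimulus[lag:]                                # input to neuron A
signal_b_related = stimulus[:-lag]                       # delayed copy: the relation holds
signal_b_unrelated = np.roll(stimulus, 5000)[:-lag]      # unrelated signal: the relation fails

spikes_a = lif_spikes(signal_a) + lag * dt               # A's spikes after the conduction delay
print(synchrony(spikes_a, lif_spikes(signal_b_related)))    # close to 1
print(synchrony(spikes_a, lif_spikes(signal_b_unrelated)))  # much lower (chance level)
```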

This link with neural synchrony is motivated by the observation that neurons are extremely sensitive to coincidences in their inputs (11) and respond to time-varying inputs with precisely timed spikes, which I showed is a general property of spiking models (2) (I had previously developed a mathematical theory of one-dimensional integrate-and-fire models driven by time-varying inputs (3,4), including periodic inputs (1,3)). More generally, I defend the view that neural computation and dynamics cannot be adequately modelled by firing rates, but rather rely on the coordination of spikes (see this review (20) and my blog posts on this debate). See also my presentation (in French) at the Collège de France on time in neural computation.
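
As a toy illustration of this reliability property (a generic leaky integrate-and-fire neuron with arbitrary parameters, not the specific models analyzed in (2)): repeating the same fluctuating input with independent noise on each trial yields spike times that line up across trials, whereas a constant input with the same mean does not.

```python
import numpy as np

def noisy_lif(signal, noise_std=0.5, dt=1e-4, tau=0.01, threshold=1.0, seed=0):
    """Leaky integrate-and-fire neuron driven by a frozen signal plus independent noise."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, []
    for i, s in enumerate(signal):
        v += dt * (-v + s + noise_std * rng.standard_normal()) / tau
        if v > threshold:
            spikes.append(i * dt)
            v = 0.0
    return spikes

rng = np.random.default_rng(42)
n, dt = 20000, 1e-4                                      # 2 seconds at 0.1 ms resolution
fluctuating = 1.5 + 3.0 * np.convolve(rng.standard_normal(n), np.ones(100) / 100, mode='same')
constant = np.full(n, fluctuating.mean())

trials_fluctuating = [noisy_lif(fluctuating, seed=k) for k in range(10)]
trials_constant = [noisy_lif(constant, seed=100 + k) for k in range(10)]
# A raster plot of trials_fluctuating shows spikes aligned in vertical stripes across trials
# (timing locked to the input fluctuations), whereas in trials_constant the spike times
# progressively drift apart from trial to trial.
```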

Sound localization

The synchrony receptive field of two monaural neurons on opposite sides (left/right) is a spatial field, and detecting these location-specific synchrony patterns forms the basis of a simple and accurate model of sound localization (6). In this model, sound location is indicated by the activation of a specific assembly of binaural neurons, and the mapping from assembly to location can be determined by Hebbian learning (7). This hypothesis implies that the filtering properties of monaural inputs to a binaural neuron are precisely matched: this has been confirmed in the barn owl, and can emerge through spike-timing-dependent plasticity (10).
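
A much-simplified sketch of this scheme (pure internal delays and one coincidence detector per candidate location, with arbitrary parameters; the model in (6) instead uses the full location-specific filtering of the two monaural signals and assemblies of many neurons):

```python
import numpy as np

def localize_by_coincidence(left, right, fs, candidate_itds, window=2e-5, tau=0.005, threshold=1.0):
    """One coincidence detector per candidate ITD: each compares the spike trains of two
    monaural neurons, with the left train shifted by the detector's internal delay; the
    detector with the most coincidences signals the estimated location."""
    dt = 1.0 / fs

    def spike_times(signal):
        v, spikes = 0.0, []
        for i, s in enumerate(signal):
            v += dt * (-v + s) / tau
            if v > threshold:
                spikes.append(i * dt)
                v = 0.0
        return np.array(spikes)

    spikes_left, spikes_right = spike_times(left), spike_times(right)
    counts = [np.sum(np.abs((spikes_left + itd)[:, None] - spikes_right[None, :]) < window)
              for itd in candidate_itds]
    return candidate_itds[int(np.argmax(counts))]

# Toy usage: the right signal is the left signal delayed by ~300 microseconds.
rng = np.random.default_rng(0)
fs = 44100
x = 0.8 + 4.0 * np.convolve(rng.standard_normal(fs), np.ones(50) / 50, mode='same')
shift = int(3e-4 * fs)
left, right = x[shift:], x[:-shift]
print(localize_by_coincidence(left, right, fs, np.linspace(-6e-4, 6e-4, 25)))
```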

A prediction of the theory is that binaural neurons are tuned to sound location but not necessarily to interaural time difference (ITD), since ITD depends on sound frequency (17). Specifically, it predicts that, for a given neuron, the preferred ITD should vary with frequency in the same way as the ITD varies with frequency for the preferred location. This is what we found by comparing acoustical and electrophysiological recordings (19). We proposed that this is achieved by coincidence detection between fibers originating from mismatched places on the two cochleae, in combination with axonal delay mismatches. We had previously shown that the dependence of preferred ITD on preferred frequency across neurons may arise during the development of the circuit, because of temporal correlations in their inputs (8).
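
Schematically, a frequency-independent axonal delay mismatch combines with the phase shift contributed by a cochlear place mismatch, so that the best delay of a neuron varies with frequency (a schematic decomposition for illustration, not the exact expressions used in (19)):

$$\mathrm{BD}(f) \;\approx\; \delta_{\mathrm{axon}} \;+\; \frac{\Delta\varphi_{\mathrm{cochlea}}(f)}{2\pi f}$$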

A popular alternative theory postulates that in mammals, sound location is encoded only by the average activity of binaural neurons in each hemisphere. The arguments in favor of this theory are based on analyses of neural sensitivity, that is, of how neural responses change when the ITD changes. But I have argued that in realistic situations, information cannot be equated with sensitivity: to encode auditory information in complex environments, neurons should have diverse properties and this diversity should be exploited (5). Specifically, we have shown that the poor performance of the hemispheric model in complex situations (sound diffraction, variable sound spectra) is incompatible with behavioral performance (16).

Pitch

We have also applied the synchrony receptive field theory (13) to pitch (the perception of how low or high a musical tone is), proposing that pitch is the perceptual correlate of the regularity structure of the basilar membrane vibration (18). The theory explains in particular why there are perceptual differences between resolved and unresolved harmonic sounds. In agreement with the theory, we have shown that the pitch of low-frequency tones decreases when level increases (23).
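
As a toy illustration of what is meant by regularity structure (a crude two-channel “cochlea” made of Butterworth band-pass filters with arbitrary settings, not the model of (18)): for each channel one can list the delays at which its vibration approximately repeats itself; a channel dominated by a single resolved harmonic repeats at multiples of that harmonic's period, whereas a channel containing several unresolved harmonics repeats only at multiples of the fundamental period.

```python
import numpy as np
from scipy.signal import butter, lfilter

def regularity_delays(x, fs, max_delay=0.02, criterion=0.9):
    """Delays (in seconds) at which a channel's vibration approximately repeats itself,
    i.e., delays with a normalized autocorrelation above the criterion."""
    x = x - x.mean()
    lags = np.arange(1, int(max_delay * fs))
    ac = np.array([np.corrcoef(x[:-lag], x[lag:])[0, 1] for lag in lags])
    return lags[ac > criterion] / fs

fs, f0, duration = 44100, 200.0, 0.2                     # a 200 Hz harmonic complex tone
t = np.arange(int(duration * fs)) / fs
tone = sum(np.sin(2 * np.pi * k * f0 * t) for k in range(1, 11))

for low, high in [(500, 700), (1800, 2600)]:             # a resolved and an unresolved region (Hz)
    b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype='band')
    channel = lfilter(b, a, tone)
    delays_ms = regularity_delays(channel, fs) * 1000
    # The (500, 700) channel, dominated by the 600 Hz harmonic, repeats at multiples of ~1.67 ms;
    # the (1800, 2600) channel, containing several unresolved harmonics, repeats only at
    # multiples of the fundamental period, 1/f0 = 5 ms.
    print((low, high), np.round(delays_ms, 2))
```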

Tools

On the practical side, to design auditory models, we developed an auditory toolbox for the Brian simulator (9) (see also the page on simulation).
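
For instance, a gammatone filterbank can be set up in a few lines (a sketch in the style of the Brian Hears documentation; exact class names and arguments may differ between versions):

```python
from brian2 import *
from brian2hears import *

# A 100-channel gammatone filterbank applied to a 1 kHz tone
sound = Sound.tone(1*kHz, 500*ms)
center_frequencies = erbspace(20*Hz, 20*kHz, 100)    # channels spaced on the ERB scale
filterbank = Gammatone(sound, center_frequencies)
basilar_membrane = filterbank.process()              # array of shape (samples, channels)
```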

One practical issue when implementing spiking models of auditory function is that the timing of spikes depends on input level. I showed how to solve this issue with an adaptive threshold (12), and we used this model to predict in vivo responses of bushy cells at various sound levels (15).
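
A generic sketch of the idea (a simple adaptive-threshold integrate-and-fire model with arbitrary parameters, not the specific model of (12)): if the threshold tracks the input, then scaling the input scales the membrane potential and the threshold equally, so spike times do not change with level.

```python
import numpy as np

def adaptive_threshold_spikes(current, dt=1e-4, tau=0.002, tau_theta=0.02, gain=1.0):
    """Integrate-and-fire neuron whose threshold tracks a running average of the input.
    Both v and theta are linear in the input and the reset is to 0, so scaling the input
    scales v and theta equally and leaves the spike times unchanged."""
    v, theta, spikes = 0.0, gain * current[0], []
    for i, I in enumerate(current):
        v += dt * (-v + I) / tau                        # fast membrane integration
        theta += dt * (gain * I - theta) / tau_theta    # slow, input-driven threshold
        if v > theta:
            spikes.append(i * dt)
            v = 0.0
    return spikes

rng = np.random.default_rng(0)
stimulus = np.abs(np.convolve(rng.standard_normal(20000), np.ones(100) / 100, mode='same'))
print(adaptive_threshold_spikes(stimulus)[:5])
print(adaptive_threshold_spikes(10 * stimulus)[:5])     # same spike times at a 10x higher level
```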

  2. Theory of sensorimotor systems

We started working on sensorimotor systems with my former student Charlotte Le Mouel by considering balance, and in particular the stretch reflex, which involves a short neural circuit in the spinal cord. We soon realized that what appears to be a simple local reflex is not that local after all, and not so clearly involved in balance either. In fact, through a theoretical analysis of the global mechanics of posture and a detailed analysis of the experimental literature, C. Le Mouel showed that global changes in posture often anticipate actions or movements, rather than stabilize the body (24). A striking example is the starting position of a runner.

One interesting implication is that motor control is not just about pulling the right strings at the right times. A major aspect of motor control is actually to modify the mechanical properties of the body in anticipation of future events, whether external (perturbations) or internal (actions). For example, when it is important to be stable (e.g., when standing near a cliff), we can stiffen the legs by co-contracting the muscles; or, if we need to manipulate tools, we can change the posture of the arm so as to be more stable in one particular direction (26). This postural adaptiveness explains why standard perturbation tasks fail to detect elderly people at risk of falling: only unexpected perturbations reveal difficulties in balance recovery (27).

We are currently trying to develop neural models of sensorimotor control, and in particular of autonomous learning in such tasks.

 

Relevant publications (chronological order):

  1. Brette, R. (2003). Rotation numbers of discontinuous orientation-preserving circle maps.
  2. Brette, R. and E. Guigon (2003). Reliability of spike timing is a general property of spiking model neurons.
  3. Brette, R. (2004). Dynamics of one-dimensional spiking neuron models.
  4. Brette, R. (2008). The Cauchy problem for one-dimensional spiking neuron models.
  5. Brette R (2010) On the interpretation of sensitivity analyses of neural responses.
  6. Goodman DF and R Brette (2010). Spike-timing-based computation in sound localization.
  7. Goodman DF and R Brette (2010). Learning to localise sounds with spiking neural networks.
  8. Fontaine B and Brette R (2011). Neural development of binaural tuning through Hebbian learning predicts frequency-dependent best delays. (Supplementary material).
  9. Fontaine B, Goodman DFM, Bénichoux V, Brette R (2011). Brian Hears: online auditory processing using vectorisation over channels.
  10. Fischer BJ, Steinberg LJ, Fontaine B, Brette R, Peña JL (2011). Effect of instantaneous frequency glides on ITD processing by auditory coincidence detectors.
  11. Rossant C, Leijon S, Magnusson AK, Brette R (2011). Sensitivity of noisy neurons to coincident inputs.
  12. Brette R (2012). Spiking models for level-invariant encoding.
  13. Brette R (2012). Computing with neural synchrony. (code)
  14. Gourévitch B and Brette R (2012). The impact of early reflections on binaural cues.
  15. Fontaine B, Bénichoux V, Joris PX and Brette R (2013). Predicting spike timing in highly synchronous auditory neurons at different sound levels.
  16. Goodman DFM, Bénichoux V, Brette R (2013). Decoding neural responses to temporal cues for sound localization.
  17. Rébillat M*, Bénichoux V*, Otani M, Keriven R, Brette R (2014). Estimation of the low-frequency components of the head-related transfer functions of animals from photographs.
  18. Laudanski J, Zheng Y, Brette R (2014). A structural theory of pitch.
  19. Bénichoux V, Fontaine B, Karino S, Franken TP, Joris PX*, Brette R* (2015). Neural tuning matches frequency-dependent time differences between the ears.
  20. Brette R (2015). Philosophy of the spike: rate-based vs. spike-based theories of the brain.
  21. Bénichoux V, Rébillat M, Brette R (2016). On the variation of interaural time differences with frequency.
  22. Brette R (2016). Subjective physics. In Closed Loop Neuroscience, El Hady (ed), Academic Press. (Previously as an arXiv paper: Subjective physics (2013)).
  23. Zheng Y and Brette R (2017). On the relation between pitch and level.
  24. Le Mouel C and Brette R (2017). Mobility as the purpose of postural control. (See also the more complete preprint).
  25. Brette R (2018). The world is complex, not just noisy.
  26. Le Mouel C and Brette R (2019). Anticipatory coadaptation of ankle stiffness and neural feedback for standing balance.
  27. Le Mouel C, Tisserand R, Robert T, Brette R (2019). Postural adjustments in anticipation of predictable perturbations allow elderly fallers to achieve a balance recovery performance equivalent to elderly non-fallers.