Dialnet


Temporal processing dynamics in neuronal populations

  • Authors: Javier Alegre Cortés
  • Thesis supervisors: Eduardo Fernández Jover (supervisor), Cristina Soto-Sánchez (co-supervisor)
  • Defended at: Universidad Miguel Hernández de Elche (Spain), 2019
  • Language: Spanish
  • Examination committee: Pedro de la Villa Polo (chair), Ramon Reig Garcia (secretary), Casto Rivadulla (member), Albert Compte Braquets (member), Luis Miguel Martínez Otero (member)
  • Doctoral programme: Doctoral Programme in Bioengineering, Universidad Miguel Hernández de Elche
  • Links
    • Open-access thesis at: RediUMH
  • Abstract
    • Time is crucial to our understanding of the environment and to our behaviour. We need to process time to execute motor actions, to predict future events from past ones, and to anticipate upcoming stimuli. Despite many efforts, the neural basis of temporal processing remains elusive. How the brain perceives time is an open question, given the absence of any dedicated "time receptor" of the kind that exists for vision, audition, or the other senses. In this thesis I focus on the representation of time in the seconds range during sensory stimulation, studied by means of cortical spike oscillations.

      First, I present a novel framework for analysing spike oscillatory activity that is specifically designed to preserve its nonlinear and nonstationary properties. I show that a combination of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD) and the Hilbert transform outperforms traditional time-frequency techniques for analysing neuronal recordings. I demonstrate this by comparing the spectral properties obtained with our framework against previously published results from vibrissal nerve recordings during tactile stimulation. As a second example, I use spike oscillations from neuronal populations in the deep layers of the visual cortex of anaesthetised rats during visual stimulation.
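      The Hilbert-transform step of such a framework can be sketched as follows. This is a minimal illustration, not the thesis' implementation: the NA-MEMD decomposition itself is not shown, and a clean synthetic oscillation stands in for one of its intrinsic mode functions.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                       # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)     # 2 s of data
# Stand-in for one NA-MEMD intrinsic mode function: a pure 10 Hz oscillation
imf = np.sin(2 * np.pi * 10 * t)

analytic = hilbert(imf)                        # analytic signal via the Hilbert transform
amplitude = np.abs(analytic)                   # instantaneous amplitude envelope
phase = np.unwrap(np.angle(analytic))          # unwrapped instantaneous phase
inst_freq = np.diff(phase) / (2 * np.pi) * fs  # instantaneous frequency (Hz)
```

      Because instantaneous amplitude and frequency are defined sample by sample, this representation tracks nonstationary changes that a fixed-window spectrogram would smear out; for the clean 10 Hz mode above, the frequency estimate stays near 10 Hz away from the signal edges.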

      Having established a suitable framework for analysing spike oscillations, I studied how time intervals in the seconds range during sensory stimulation of anaesthetised rats are represented in the deep layers of the visual cortex. I demonstrated that when intervals longer than one second are used, the firing rate of deep-layer neurons in response to a moving grating increases and the response becomes more stable. These effects were most evident with three- or five-second intervals and diminished with seven-second intervals. To better understand interval coding at the seconds scale in the visual cortex, I studied the time-frequency dynamics of the evoked response for the different intervals. Multiple differences at different times and frequencies emerged when the one-second interval was compared with both the three- and five-second intervals. Some of these differences persisted with the seven-second interval, suggesting an optimal interval window of around three to five seconds. There were differences throughout the stimulation in the 6 Hz and 10 Hz bands, as well as transient differences at higher frequencies. Based on these results, I proposed a phase space where interval duration can be discriminated by the trajectories evoked during stimulation.
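      The phase-space idea can be illustrated by projecting a response onto its power in the 6 Hz and 10 Hz bands over time. The sketch below is purely illustrative: the toy signal, the filter settings, and the two-band choice are assumptions, not the thesis' actual analysis.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_envelope(x, fs, lo, hi, order=4):
    """Band-pass filter x and return its instantaneous amplitude envelope."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, x)))

fs = 500.0
t = np.arange(0, 4.0, 1 / fs)
# Toy "evoked response": a strong 6 Hz component plus a weaker 10 Hz one
signal = 1.0 * np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 10 * t)

env6 = band_envelope(signal, fs, 5, 7)     # 6 Hz band envelope
env10 = band_envelope(signal, fs, 9, 11)   # 10 Hz band envelope

# Each time point gives a coordinate (env6[t], env10[t]); the response traces
# a trajectory in this 2-D band-power phase space, and responses to different
# interval lengths would trace distinguishable trajectories.
trajectory = np.column_stack([env6, env10])
```

      Different stimulation intervals would then be discriminated by comparing where their trajectories travel in this space, rather than by any single scalar measure of the response.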

      Altogether, these results suggest multiplexed processing of time intervals by spike oscillations of neuronal populations in the deep layers of the visual cortex. They also suggest an optimal interval length of three to five seconds, at which the evoked response is maximal.

      Finally, I propose a new framework for studying the oscillatory dynamics of single trials in neuroscience. I demonstrated that combining NA-MEMD for extracting time-frequency features with machine-learning classification, both supervised and unsupervised, outperforms classical tools in the characterisation of single-trial dynamics. Given the field's ongoing interest in studying brain activity and behaviour at the single-trial level, this new framework promises to become a useful tool in our quest to understand how the brain works.
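      The supervised half of that pipeline can be sketched as follows. Everything here is a hypothetical stand-in: the synthetic band-power features mimic NA-MEMD time-frequency features for two stimulus conditions, and logistic regression is used as one generic classifier, not the specific method of the thesis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic single-trial features: power in a few frequency bands per trial.
# The two classes differ in mean band power, mimicking two stimulus intervals.
n_trials, n_bands = 200, 4
X0 = rng.normal(loc=1.0, scale=0.5, size=(n_trials, n_bands))
X1 = rng.normal(loc=1.6, scale=0.5, size=(n_trials, n_bands))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n_trials), np.ones(n_trials)]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = LogisticRegression().fit(X_train, y_train)
acc = clf.score(X_test, y_test)   # held-out single-trial decoding accuracy
```

      Held-out accuracy above chance on such features is what "characterising single-trial dynamics" amounts to operationally: individual trials, not trial averages, carry enough time-frequency structure to be classified.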

