Figure 7
Calculation of the rate of information transfer from the digitized voltage response. A, Voltage response in a 50 ms window. B, Digitized version of the voltage response, shown here with four voltage levels. C, Sixteen possible "words" of length T = 2 letters = 0.001 s and v = 4 voltage levels. D, Probability of finding the 16 possible words in the digitized voltage response in B. E, Total and noise entropy rates, R_S and R_N in Equation 4, obtained as the infinite-length limit, T → ∞, from the finite-length values. Black squares indicate the values of the information rates before the infinite-length limit for words of length T = 2 letters = 0.001 s, using the probability density in D. F, Digitized version of the voltage response for 12 repetitions in experiments with 80–200 repetitions, shown for a time window of 15 ms and v = 4 voltage levels. G, Probability of finding the 16 two-letter words with four voltage levels at two different times in a trial, calculated across trials. These probabilities are used in the calculation of the noise rate, R_N, in Equation 4, marked as a black square (T = 0.001 s) in E.


Source publication
Article
Full-text available
Action potentials have been shown to shunt synaptic charge to a degree that depends on their waveform. In this way, they participate in synaptic integration, and thus in the probability of generating succeeding action potentials, in a shape-dependent way. Here we test whether the different action potential waveforms produced during dynamical stimul...

Contexts in source publication

Context 1
... values of the digitized entropies depend on the length of the "words", T, the number of voltage levels, v, and the size of the data file, giving estimates H_{T,v,size}. Take, for example, the case of the voltage changes in a spiking neuron, given in Figure 7A. For only four digitization levels, the voltage changes are represented as in Figure 7B. ...
Context 2
... for example, the case of the voltage changes in a spiking neuron, given in Figure 7A. For only four digitization levels, the voltage changes are represented as in Figure 7B. This digitization can be understood in terms of "words" as follows. ...
Context 3
... digitization can be understood in terms of "words" as follows. Say we consider words of two letters, T = 2, depicted in Figure 7C. Counting the occurrence of these two-letter words in the signal of Figure 7B, we find their probabilities of occurrence (Fig. 7D). ...
Context 4
... we consider words of two letters, T = 2, depicted in Figure 7C. Counting the occurrence of these two-letter words in the signal of Figure 7B, we find their probabilities of occurrence (Fig. 7D). A naive calculation of the total entropy would use this distribution and obtain the value marked as a black square in Figure 7E. ...
Context 5
... 7A. For only four digitization levels, the voltage changes are represented as in Figure 7B. This digitization can be understood in terms of "words" as follows. Say we consider words of two letters, T = 2, depicted in Figure 7C. Counting the occurrence of these two-letter words in the signal of Figure 7B, we find their probabilities of occurrence (Fig. 7D). A naive calculation of the total entropy would use this distribution and obtain the value marked as a black square in Figure 7E. A similar naive calculation for the noise entropy uses repeated trials, as in Figure 7F. From these trials, we calculate the distributions at different times, P_i(·), two of them shown in Figure 7, G and H. ...
Context 6
... the occurrence of these two-letter words in the signal of Figure 7B, we find their probabilities of occurrence (Fig. 7D). A naive calculation of the total entropy would use this distribution and obtain the value marked as a black square in Figure 7E. A similar naive calculation for the noise entropy uses repeated trials, as in Figure 7F. ...
Context 7
... naive calculation of the total entropy would use this distribution and obtain the value marked as a black square in Figure 7E. A similar naive calculation for the noise entropy uses repeated trials, as in Figure 7F. From these trials, we calculate the distributions at different times, P_i(·), two of them shown in Figure 7, G and H. ...
Context 8
... similar naive calculation for the noise entropy uses repeated trials, as in Figure 7F. From these trials, we calculate the distributions at different times, P_i(·), two of them shown in Figure 7, G and H. The corresponding naive noise entropy is given in Figure 7E as a red circle. ...
Context 9
... these trials, we calculate the distributions at different times, P_i(·), two of them shown in Figure 7, G and H. The corresponding naive noise entropy is given in Figure 7E as a red circle. The rate of information transfer can be obtained from the limits of infinite word length T, infinite number of voltage levels v, and infinite size of the data file as the difference between the total entropy, R_S, and noise entropy, R_N, rates: ...
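The equation itself is cut off in this excerpt. Based on the surrounding definitions, Equation 4 of the source should take the form of the entropy-rate difference evaluated in the triple limit (reconstructed here as an assumption, not quoted verbatim):

```latex
\[
  R \;=\; \lim_{\mathrm{size}\,\to\,\infty}\;\lim_{v\,\to\,\infty}\;\lim_{T\,\to\,\infty}\;
  \bigl( R_S - R_N \bigr)
\]
```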
Context 10
... problem for practical calculations is to obtain these limits. Our approach is to extrapolate the values of the naive entropies to the limits in Equation 4. Take, for example, the case of the calculation of the infinite word size limit for the total and noise information rates in Figure 7E. For the highest word lengths considered, the values obtained deviate from the general trend, and the final information rate tends to zero because of the lack of data. ...
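A minimal Python sketch of the naive word-counting step described in these contexts, assuming a voltage trace that has already been digitized into v levels; the function name, the random example trace, and the omission of the extrapolation step are illustrative choices, not the authors' code:

```python
import numpy as np
from collections import Counter

def naive_entropy_rate(levels, word_len, dt):
    """Naive total entropy rate (bits/s) from a digitized voltage trace.

    levels   : 1-D integer array of digitized voltage levels (e.g. 0..3 for v = 4)
    word_len : number of letters per word (T, in letters)
    dt       : duration of one letter in seconds (e.g. 0.001 s)
    """
    # Slide over the trace and collect overlapping words of word_len letters.
    words = [tuple(levels[i:i + word_len])
             for i in range(len(levels) - word_len + 1)]
    counts = Counter(words)                      # occurrence counts (as in Fig. 7D)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()                                 # word probabilities
    entropy_bits = -np.sum(p * np.log2(p))       # Shannon entropy of the words
    return entropy_bits / (word_len * dt)        # convert to a rate (bits/s)

# Example with a random 4-level trace, two-letter words, 1 ms letters.
rng = np.random.default_rng(0)
trace = rng.integers(0, 4, size=50_000)
print(naive_entropy_rate(trace, word_len=2, dt=0.001))
```

The noise entropy is computed the same way, except that the word distribution is taken across repeated trials at each fixed time; both naive values must then be extrapolated to the infinite word-length, voltage-level, and data-size limits before their difference is interpreted as an information rate.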

Similar publications

Article
Full-text available
Multiple pathomechanisms triggered by mutant Huntingtin (mHTT) underlie progressive degeneration of dopaminoceptive striatal neurons in Huntington’s disease (HD). The primary cilium is a membrane compartment that functions as a hub for various pathways that are dysregulated in HD, for example, dopamine (DA) receptor transmission and the mechanistic...
Article
Full-text available
A theoretical model of a network of neuron-like elements was constructed. The network included several subnetworks. The first subnetwork was used to translate a constant-amplitude signal into a spike sequence (conversion of amplitude to frequency). A similar process occurs in the brain when perceiving visual information. With an increase in the flo...
Article
Full-text available
Autism spectrum disorder (ASD) is a neurodevelopmental disorder of unknown cause, although one hypothesis suggests a potential imbalance between excitation and inhibition that leads to changes in neuronal activity and a disturbance in the brain network. However, the mechanisms through which neuronal activity contributes to the development of ASD re...
Article
Full-text available
Proper neuronal circuit function relies on precise dendritic projection, which is established through activity-dependent refinement during early postnatal development. Here we revealed dynamics of dendritic refinement in the mammalian brain by conducting long-term imaging of the neonatal mouse barrel cortex. By “retrospective” analyses, we identifi...
Article
Full-text available
Despite a prominent risk factor for Neurodevelopmental disorders (NDD), it remains unclear how Autism Susceptibility Candidate 2 (AUTS2) controls the neurodevelopmental program. Our studies investigated the role of AUTS2 in neuronal differentiation and discovered that AUTS2, together with WDR68 and SKI, forms a novel protein complex (AWS) specifica...

Citations

... Modulatory systems of the network like neuromodulatory transmitters or levels of inhibition represent another dimension of analog tuning. The amplitude of action potentials can also vary and represent information (Polavieja et al., 2005;Teplov et al., 2018). Therefore, information representation in BNNs is neither digital nor analog; it is hybrid, adapted to the local network environment and requirements. ...
Article
Machine learning tools, particularly artificial neural networks (ANN), have become ubiquitous in many scientific disciplines, and machine learning-based techniques flourish not only because of the expanding computational power and the increasing availability of labeled data sets but also because of the increasingly powerful training algorithms and refined topologies of ANN. Some refined topologies were initially motivated by neuronal network architectures found in the brain, such as convolutional ANN. Later topologies of neuronal networks departed from the biological substrate and began to be developed independently as the biological processing units are not well understood or are not transferable to in silico architectures. In the field of neuroscience, the advent of multichannel recordings has enabled recording the activity of many neurons simultaneously and characterizing complex network activity in biological neural networks (BNN). The unique opportunity to compare large neuronal network topologies, processing, and learning strategies with those that have been developed in state-of-the-art ANN has become a reality. The aim of this review is to introduce certain basic concepts of modern ANN, corresponding training algorithms, and biological counterparts. The selection of these modern ANN is prone to be biased (e.g., spiking neural networks are excluded) but may be sufficient for a concise overview.
... Others have found that spike rate influences AP shape (de Polavieja et al., 2005). Because we found current-evoked spike rate is dependent on cell type (Figure 2), we measured the influence of spike rate on AP half-width for both αON- and αOFF-S RGCs to determine if differences in AP width could be explained solely by firing rate (Figure 3F). ...
Article
Full-text available
Neuronal type-specific physiologic heterogeneity can be driven by both extrinsic and intrinsic mechanisms. In retinal ganglion cells (RGCs), which carry visual information from the retina to central targets, evidence suggests intrinsic properties shaping action potential (AP) generation significantly impact the responses of RGCs to visual stimuli. Here, we explored how differences in intrinsic excitability further distinguish two RGC types with distinct presynaptic circuits, alpha ON-sustained (αON-S) cells and alpha OFF-sustained (αOFF-S) cells. We found that αOFF-S RGCs are more excitable to modest depolarizing currents than αON-S RGCs but excitability plateaued earlier as depolarization increased (i.e., depolarization block). In addition to differences in depolarization block sensitivity, the two cell types also produced distinct AP shapes with increasing stimulation. αOFF-S AP width and variability increased with depolarization magnitude, which correlated with the onset of depolarization block, while αON-S AP width and variability remained stable. We then tested if differences in depolarization block observed in αON-S and αOFF-S RGCs were due to sensitivity to extracellular potassium. We found αOFF-S RGCs more sensitive to increased extracellular potassium concentration, which shifted αON-S RGC excitability to that of αOFF-S cells under baseline potassium conditions. Finally, we investigated the influence of the axon initial segment (AIS) dimensions on RGC spiking. We found that the relationship between AIS length and evoked spike rate varied not only by cell type, but also by the strength of stimulation, suggesting AIS structure alone cannot fully explain the observed differences in RGC excitability. Thus, sensitivity to extracellular potassium contributes to differences in intrinsic excitability, a key factor that shapes how RGCs encode visual information.
... Epilepsy is a debilitating disease affecting some 50 million people worldwide (de Polavieja, 2005;Thurman et al., 2011). Epileptic seizures are thought to result from an imbalance between excitatory and inhibitory neuronal activity (Žiburkus et al., 2013). ...
... We exploited the biophysical relationship between depolarization, voltage-gated Na+ channel inactivation, and spike waveform shape (de Polavieja, 2005) to infer the direction of membrane potential changes underlying the changes in firing rate exhibited by single units recorded extracellularly in patients across time. This was done for a given unit by first computing the local spike rate and average trough-to-peak waveform amplitude in moving time windows of width 1 second unless otherwise specified, moving with a step size of 0.1 seconds. ...
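A minimal sketch of the moving-window computation described in this excerpt, assuming spike times and per-spike trough-to-peak amplitudes have already been extracted from the extracellular recording (function and variable names are illustrative):

```python
import numpy as np

def sliding_rate_and_amplitude(spike_times, amplitudes, t_start, t_end,
                               win=1.0, step=0.1):
    """Local spike rate and mean trough-to-peak amplitude in moving windows.

    spike_times : spike times of a single unit (s)
    amplitudes  : trough-to-peak amplitude of each spike (same length)
    win, step   : window width and step (s); 1.0 and 0.1 as in the text above
    """
    spike_times = np.asarray(spike_times, dtype=float)
    amplitudes = np.asarray(amplitudes, dtype=float)
    centers, rates, mean_amps = [], [], []
    t = t_start
    while t + win <= t_end:
        in_win = (spike_times >= t) & (spike_times < t + win)
        n = int(np.count_nonzero(in_win))
        centers.append(t + win / 2.0)
        rates.append(n / win)                    # spikes per second
        mean_amps.append(amplitudes[in_win].mean() if n else np.nan)
        t += step
    return np.array(centers), np.array(rates), np.array(mean_amps)
```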
Thesis
The dynamics of neural populations must maintain stability while also performing behaviorally useful representation and processing of rapidly changing environmental inputs. How the nervous system accomplishes this flexible coordination of its own dynamics with that of the sensory stream, while also maintaining its own stability, is poorly understood. However, answers to this puzzle promise better understanding of neurological disorders involving dynamic instability such as epilepsy, as well as uncovering evolutionarily-optimized algorithms for processing information that evolves over time, more robustly than our current technology. Given that relevant information can change rapidly in the case of spatial navigation at different running speeds, hippocampal neural circuits that support both externally-tuned spatial cognition and internally stable memory must particularly employ such mechanisms. In Chapter 2, we propose and find empirical support for a novel framework that allows hippocampal cells to maintain stability in the face of rapidly changing external inputs, called the logarithmic theta transform (LTT), as implemented by the nonlinear shape of theta phase precession in individual place cells. Findings suggest how space and time coding mechanisms in hippocampal networks come together to stabilize the internal organization of place cell activity within theta cycles. In Chapter 3, we investigate mechanisms controlling the rapid progression of seizures in epilepsy patients with a focus on fast-spiking inhibitory neuronal activity. These findings suggest that cell-type specific propensities to depolarization block play an important role in the evolution of sequential neural dynamics during focal seizures with secondary generalization, with implications for therapeutic approaches needing the proper timing relative to this sequence. Overall, this work points toward properties on the level of individual cells as supporting stable population dynamics that are nevertheless flexible in responding to changing information.
... On the one hand, it remains to be seen if the application of the FMM approach on multiple signals of the same neuron could generate useful features. However, this question is up in the air as the independence of the AP shape from the applied stimulus is assumed by some authors, such as Raghavan et al. (2019), whereas others like de Polavieja et al. (2005) state that the AP shape is affected by the recent history of applied stimuli. On the other hand, the use of the proposed method in multi-AP signals could be profitable, extracting other interesting features such as the interspike distance or the neuron's firing rate. ...
Article
Full-text available
The complete understanding of the mammalian brain requires exact knowledge of the function of each neuron subpopulation composing its parts. To achieve this goal, an exhaustive, precise, reproducible, and robust neuronal taxonomy should be defined. In this paper, a new circular taxonomy based on transcriptomic features and novel electrophysiological features is proposed. The approach is validated by analysing more than 1850 electrophysiological signals of different mouse visual cortex neurons proceeding from the Allen Cell Types database. The study is conducted on two different levels: neurons and their cell-type aggregation into Cre lines. At the neuronal level, electrophysiological features have been extracted with a promising model that has already proved its worth in neuronal dynamics. At the Cre line level, electrophysiological and transcriptomic features are joined on cell types with available genetic information. A taxonomy with a circular order is revealed by a simple transformation of the first two principal components that allow the characterization of the different Cre lines. Moreover, the proposed methodology locates other Cre lines in the taxonomy that do not have transcriptomic features available. Finally, the taxonomy is validated by Machine Learning methods which are able to discriminate the different neuron types with the proposed electrophysiological features.
... Thankfully, there are multiple ways to use Shannon's formula (Eq. 1) and compensate for this bias, but none of them can be called trivial. There are, for example, the quadratic extrapolations method (Strong et al., 1998; Juusola and de Polavieja, 2003; de Polavieja et al., 2005), the Panzeri-Treves Bayesian estimation (Panzeri and Treves, 1996), the Best Universal Bound estimation (Paninski, 2003), the Nemenman-Shafee-Bialek method (Nemenman et al., 2004), or some more recent methods using statistic copulas (Ince et al., 2017; Safaai et al., 2018). Each method has its advantages and downsides [see Panzeri et al. (2007) for a comparison of some of them], which leaves the experimenter puzzled and in dire need of a mathematician (Borst and Theunissen, 1999; Magri et al., 2009; Timme and Lapish, 2018; Piasini and Panzeri, 2019). ...
... where p(x_i)_τ is the probability of finding the configuration x_i at a time τ over all the acquired trials of an experiment (Strong et al., 1998; Juusola and de Polavieja, 2003; de Polavieja et al., 2005). Finally, we can obtain the information transfer rate, R, by using the quadratic extrapolations method and dividing by the time sampling (Strong et al., 1998; Juusola and de Polavieja, 2003; de Polavieja et al., 2005), as: ...
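Restating the quantities in this excerpt in display form (the symbols H_S, H_N, and Δt are assumptions made for readability, not notation quoted from the cited papers):

```latex
\[
  H_N(\tau) \;=\; -\sum_i p(x_i)_\tau \,\log_2 p(x_i)_\tau ,
  \qquad
  R \;\approx\; \frac{1}{\Delta t}\,\Bigl( H_S - \bigl\langle H_N(\tau) \bigr\rangle_\tau \Bigr),
\]
```

with both entropies extrapolated (for example, quadratically in inverse data size) before the difference is taken.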
Article
Full-text available
Calculations of entropy of a signal or mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture non-linear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments makes their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called "sampling disaster" exist, but require significant expertise, great time and computational costs. As such, there is a need for a simple, unbiased and computationally efficient tool for estimating the level of entropy and mutual information. In this article, we propose that application of entropy-encoding compression algorithms widely used in text and image compression fulfill these requirements. By simply saving the signal in PNG picture format and measuring the size of the file on the hard drive, we can estimate entropy changes through different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of mutual information between a stimulus and the observed responses through different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while this method can be used in all kind of experimental conditions, we provide examples of its application in patch-clamp recordings, detection of place cells and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad use make it a powerful tool for their estimation through experiments.
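An illustrative sketch of the idea described in this abstract, assuming a one-dimensional signal rescaled to 8-bit values and the Pillow library for PNG encoding; the file name and quantization are arbitrary choices, and only relative size changes across conditions are meaningful:

```python
import os
import numpy as np
from PIL import Image

def png_size_bytes(signal, path="entropy_probe.png"):
    """Save a 1-D signal as an 8-bit grayscale PNG and return its file size.

    A larger compressed file suggests higher entropy in the signal; the size
    is only a proxy for entropy changes, not an absolute entropy value.
    """
    x = np.asarray(signal, dtype=float)
    span = x.max() - x.min()
    if span == 0:
        span = 1.0
    x8 = (255 * (x - x.min()) / span).astype(np.uint8)  # rescale to 0..255
    Image.fromarray(x8[np.newaxis, :], mode="L").save(path, optimize=True)
    return os.path.getsize(path)

# White noise should compress worse (larger file) than a constant signal.
rng = np.random.default_rng(1)
print(png_size_bytes(rng.normal(size=10_000)))   # higher-entropy signal
print(png_size_bytes(np.zeros(10_000)))          # lower-entropy signal
```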
... There are, for example, the quadratic extrapolations method (Juusola and de Polavieja, 2003; de Polavieja et al., 2005; Strong et al., 1998), the Panzeri-Treves Bayesian estimation (Panzeri and Treves, 1996), the Best Universal Bound estimation (Paninski, 2003), the Nemenman-Shafee-Bialek method (Nemenman et al., 2004), or some more recent methods using statistic copulas (Ince et al., 2017; Safaai et al., 2018). Each method has its advantages and downsides (see Panzeri et al., 2007 for a comparison of some of them), which leaves the experimenter puzzled and in dire need of a mathematician (Borst and Theunissen, 1999; Magri et al., 2009; Piasini and Panzeri, 2019; Timme and Lapish, 2018). ...
Preprint
Full-text available
A bstract Calculations of entropy of a signal or mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture nonlinear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments makes their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called “sampling disaster” exist, but require significant expertise, great time and computational costs. As such, there is a need for a simple, unbiased and computationally efficient tool for estimating the level of entropy and mutual information. In this paper, we propose that application of entropy-encoding compression algorithms widely used in text and image compression fulfill these requirements. By simply saving the signal in PNG picture format and measuring the size of the file on the hard drive, we can estimate entropy changes through different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of mutual information between a stimulus and the observed responses through different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while this method can be used in all kind of experimental conditions, we provide examples of its application in patch-clamp recordings, detection of place cells and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad use make it a powerful tool for their estimation through experiments.
... Here, the term neural modulation refers to the process by which a neuron translates its synaptic inputs to specific AP pattern. Although such modulation has not been well discussed in the literature, a biological basis is proposed in [8]. ...
Conference Paper
Full-text available
Biological neurons signal each other in a rich and complex manner to perform complex cognitive computing tasks in real-time. The implementation of such capabilities using electronic circuits is a difficult task. Many neuromorphic circuits mimic different aspects of a biological neuron, yet neural modulation has received little focus. Modulation is a key aspect of understanding how neuron processes and interprets information. We posit that each neuron may have a unique built-in modulation function depending on its role. To build rich behavioral neurons, this paper introduces a new technique capable of converting continuous stimulus shapes into modulated spiking/bursting patterns. We test the hypotheses using the simplest modulation function (direct correlation). The results show plausible excitatory spiking patterns consistent with the Izhikevich mathematical model. Implementation of circuits was conducted using the 45nm CMOS technology node, and functional verification is discussed in detail using Cadence Virtuoso Simulator tools.
... In addition, several experimental studies indicate that the precise timing of the first spike carries critical information about the stimulus in sensory responses (34)(35)(36)(37). Our data indicate that ISF conveys presynaptic spike timing information to the postsynaptic cell with high temporal precision and thus provides strong biological support for time coding based on spike latency at cortical synapses (33,38). ISF defines a rule in which highly relevant neuronal information comes early and is transmitted in the circuit, while poorly relevant information comes later and is filtered. ...
Article
Full-text available
Sensory processing requires mechanisms of fast coincidence detection to discriminate synchronous from asynchronous inputs. Spike threshold adaptation enables such a discrimination but is ineffective in transmitting this information to the network. We show here that presynaptic axonal sodium channels read and transmit precise levels of input synchrony to the postsynaptic cell by modulating the presynaptic action potential (AP) amplitude. As a consequence, synaptic transmission is facilitated at cortical synapses when the presynaptic spike is produced by synchronous inputs. Using dual soma-axon recordings, imaging, and modeling, we show that this facilitation results from enhanced AP amplitude in the axon due to minimized inactivation of axonal sodium channels. Quantifying local circuit activity and using network modeling, we found that spikes induced by synchronous inputs produced a larger effect on network activity than spikes induced by asynchronous inputs. Therefore, this input synchrony-dependent facilitation may constitute a powerful mechanism regulating synaptic transmission at proximal synapses.
... On the other hand, the phase slope exhibits the highest relative variation between neurons, with an 18% difference (7.2 ± 1.3 ms⁻¹). In cortical neurons, the stimulus strength can alter the shape of the APs [21]. A difference in AP shape was observed in some of the neurons analyzed here. ...
... Finally, we looked at the correlation between the three methods and the AP amplitude and width. Previous work had shown that in cortical neurons, decreases in AP height are correlated with wider APs and a lower phase slope, which indicates a high degree of VGSC inactivation [21]. Therefore, here we demonstrate that the HWHM method shows lower variation from one neuron to another, while exhibiting high relative variation in neurons that have high AP shape variability. ...
Article
Full-text available
Recent studies show a kink at action potential, AP, onset in some cortical and retinal neurons. Several papers have quantified the rapidity of AP onset, i.e., how kinked the AP initiation is, and found the rapidity to be faster than predicted by the Hodgkin-Huxley model (HH) and conclude the HH model is missing critical biophysics. However, these works typically define AP onset rapidity subjectively, often using an arbitrary value of V̇m, the first time derivative of the membrane potential. Therefore, we propose more systematic methods to analyze the AP initiation using the full width at half maximum (FWHM) and half width at half maximum (HWHM) of the onset peak in V̈m, the second time derivative. The maximum V̈m occurs when the Na+ current changes fastest. Hence, the FWHM and HWHM of V̈m are well-defined and intuitive measures of onset rapidity. To examine the proposed methods, we have varied the rate constant parameters in the HH model and numerically calculated the resultant FWHM and HWHM of V̈m alongside previously published approaches. The results show that both the FWHM and HWHM methods are more sensitive to changes in HH parameters, giving a wider range of onset rapidity while remaining minimally affected by other aspects such as the AP width. Furthermore, we used the methods to analyze the onset rapidity in somatosensory cortical neurons. The results from the FWHM and HWHM methods show a low variation between the neurons analyzed here while exhibiting relatively high variation in neurons with high AP shape variability compared to the widely used phase slope method. Our results indicate that the FWHM and HWHM methods are more sensitive and specific to measure the AP onset dynamics. Thus, quantifying the rapidity of AP onset using the FWHM and HWHM methods provides a systematic tool to analyze the AP onset dynamics and allows direct comparison between experimental data.
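A minimal sketch of the FWHM measure described in this abstract, assuming a single AP onset in the trace and no additional smoothing; the function name and the simple half-maximum search are illustrative, not the authors' implementation:

```python
import numpy as np

def onset_fwhm(vm, dt):
    """FWHM (s) of the onset peak in the second time derivative of Vm.

    vm : 1-D membrane potential trace containing one AP onset
    dt : sampling interval (s)
    """
    d2 = np.gradient(np.gradient(vm, dt), dt)    # second time derivative
    i_peak = int(np.argmax(d2))                  # onset peak of d2Vm/dt2
    half = d2[i_peak] / 2.0
    # Walk outwards from the peak to the half-maximum crossings.
    left = i_peak
    while left > 0 and d2[left] > half:
        left -= 1
    right = i_peak
    while right < len(d2) - 1 and d2[right] > half:
        right += 1
    return (right - left) * dt                   # HWHM would use one side only
```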
... The average dVm/dt during the upstroke of APs (measured from when dVm/dt exceeds the derivative threshold until the peak Vm) can also be used as a proxy measurement of Na+ inactivation [31]. O + E had significantly higher dVm/dt than E + E costimulation (O + E = 75.3
Article
Objective: The performance of neuroprostheses, including cochlear and retinal implants, is currently constrained by the spatial resolution of electrical stimulation. Optogenetics has improved the spatial control of neurons in vivo but lacks the fast-temporal dynamics required for auditory and retinal signalling. The objective of this study is to demonstrate that combining optical and electrical stimulation in vitro could address some of the limitations associated with each of the stimulus modes when used independently. Approach: The response of murine auditory neurons expressing ChR2-H134 to combined optical and electrical stimulation was characterised using whole cell patch clamp electrophysiology. Main results: Optogenetic costimulation produces a three-fold increase in peak firing rate compared to optical stimulation alone and allows spikes to be evoked by combined subthreshold optical and electrical inputs. Subthreshold optical depolarisation also facilitated spiking in auditory neurons for periods of up to 30 ms without evidence of wide-scale Na+ inactivation. Significance These findings may contribute to the development of spatially and temporally selective optogenetic-based neuroprosthetics and complement recent developments in "fast opsins".