The 3D neuroanatomical reconstruction of dentate nucleus gross morphology.

Source publication
Article
Full-text available
Aim: The aim of this study is to find relational connections (interdependence) between the two most general categorical aspects of a neuron, i.e., between its form (morphology) and its function, using dentate nucleus neurons as a model. Furthermore, the configuration of the dentatostriate nucleotopic inter-cluster mapping of the denta...

Contexts in source publication

Context 1
... our previous study [10], we determined two neuromorphotopologically different clusters (types) of dentate nucleus (DN) neurons (Figure 1), namely, the external (EBC) and internal (IBC) border neurons, which differ in both their morphology and topology. Neuromorphologically, the topological cluster of central neurons can be divided into the EBC and IBC histotypes [11]. ...
Context 2
... model is analyzed separately for the EBN and IBN clusters. The models are developed by a specific factor-extraction algorithm (Figure 10). Model development has two main hierarchical steps: PLSR/PCR factor analysis as the first step and RSM modeling as the second, i.e., the most significant factors are first extracted using PLSR/PCR analysis and an RSM model is then built upon them. ...
Context 3
... the AS RSM model it can be seen that the neurofunctional parameters which partly determine neurosoma size are VS, VBD, VDD and, naturally, the transmitter type of a neuron, as already shown by Uusisaari and Knöpfel [6] (Figures 11-13). ...
Context 4
... for example, the AS/VBD correlation is very highly significant (p<0.001) and positive for the EBN cluster, meaning that with a larger depolarizing burst voltage the neuron's perikaryon would increase in size (Figure 13); from the model itself, it can be seen that this relation is nonlinear, i.e., far more pronounced for AS values greater than 100 µm². For the IBN neuron cluster the situation is different, since the VBD correlation with neurosoma size is not significant (p>0.05). ...
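The significance-and-sign reading applied to the AS/VBD correlation can be reproduced with SciPy. The data below are synthetic stand-ins with a built-in positive trend, not measurements from the study.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

# Synthetic stand-ins for one cluster: VBD (burst voltage) and AS (soma area).
vbd = rng.normal(size=80)
as_area = 2.0 * vbd + rng.normal(scale=0.5, size=80)

r, p = pearsonr(vbd, as_area)
if p < 0.001 and r > 0:
    verdict = "very highly significant, positive"  # the EBN-like case
elif p > 0.05:
    verdict = "not significant"                    # the IBN-like case
else:
    verdict = "significant"
print(verdict)
```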
Context 5
... the way, the ordinal character of transmitter type (TT) as a multiplier does not, mathematically, change the behavior of non-interacting terms into interacting ones. Similar relations can be extracted by analyzing the other factors of this and the other models (Tables 5-8, Figures 14-16). Integral RSM models are models whose response parameter is a combined, representative, integral variable of an entire parameter category (morphological/functional). ...
Context 6
... are obtained using a combination of matrix-manipulation techniques such as concatenation, multiplication and squared-deviation summation, applied to all parameters of one deterministic category (morphology/function). They are obtained for each DN neuronal cluster (EBN and IBN) using the following formula. It should be noted that the 3D graphs of the RSM models (Figures 12 and 15) include outliers as well as theoretical values of the response parameter that are practically impossible to obtain by measurement, i.e., beyond the boundary values of the real parameters. This is due to the cumulative effect of error at the boundaries of the region. ...
Context 7
... X̄ represents the mean of each parameter belonging to one category (morphology/function) and to a DN cluster, and k=100 is an arbitrary constant. In fact, they are inverse Fourier transforms of the above-formulated integral variables (Figure 17), but since the suitable RSM models had lower performance on periodic functions, their Fourier transforms are kept as the representative integral variables. The morphological EBN integral model is shown to be valid (Figures 18-20). ...
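The exact formula is not reproduced in this excerpt, so the sketch below is only one plausible reading of the operations named in the text (squared deviations about each parameter mean X̄, the scaling constant k=100, and an inverse Fourier transform); every detail here is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
k = 100  # arbitrary constant, as stated in the text

# Hypothetical parameter matrix: rows = neurons, columns = the parameters
# of one category (morphology or function) for one DN cluster.
P = rng.normal(loc=5.0, scale=1.0, size=(40, 6))
means = P.mean(axis=0)  # X-bar: the mean of each parameter

# ASSUMED reading of the elided formula: a per-neuron integral variable
# built from k-scaled squared deviations about the parameter means.
integral = k * ((P - means) ** 2).sum(axis=1)

# The text relates the integral variables to inverse Fourier transforms;
# NumPy exposes the inverse DFT directly.
spectrum = np.fft.ifft(integral)
print(spectrum.shape)
```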
Context 8
... fact, they are inverse Fourier transforms of the above-formulated integral variables (Figure 17), but since the suitable RSM models had lower performance on periodic functions, their Fourier transforms are kept as the representative integral variables. The morphological EBN integral model is shown to be valid (Figures 18-20). However, the IBN model could not be obtained as a significant one (not shown). ...
Context 9
... PLSR/PCR analysis and correlations are shown in Figures 18 and 20, respectively. From the integrative models, it can be seen that all of the parameters contribute to the model, predominantly through interactions, thus implying very complex, intricate relations (Figures 21-23). ...
Context 10
... is probably the expression of an underlying interplay of a myriad of synaptic inputs bombarding the neurosoma, behaving like neurotrophic factors and simultaneously stimulating it to become larger. Figure 12b shows that an EBN neuroperikaryon increases approximately linearly as the voltage of the depolarizing burst (VBD) increases. This can be interpreted from the aspect of neural support of a suitable output or from the aspect of neural response to a similar neurosynaptic input. ...
Context 11
... in the case of input analysis, an input bursting neurosignal of higher voltage would induce the development of a larger neurosoma. This law tends to a limit, shown as a plateau in Figure 12d for IBN neurons, where VBD is a discretized parameter. This tells us that, although stimulative to some extent, in both cases high bursting voltage eventually leads to the exhaustion of the neuron's electrical abilities, expressed as shrinkage and, in extreme cases, eventual death, which, although not explicitly shown by extrapolation of our results, is a well-known experimental fact [32-34]. ...

Citations

... It represents some sort of a combination of correlative and differential statistics. In our neuromorphological studies dealing with the neuron-classification problem [5][6][7][8][9], it represents the ultimate method that confirms or discards the presupposed classification. Furthermore, it explains the observed neuromorphological differences and describes different consequent cluster behavior. ...
... We have already demonstrated that caudate (CINs) and putaminal (PINs) interneurons are neuromorphofunctionally different [7]. In addition, the caudate interneuron cluster (CCL) itself comprises two types of interneurons, termed cluster I (CCL1) and cluster II (CCL2) interneurons, with significantly different neuromorphofunctional properties [8,9]. ...
[Fig. 7 caption: Graphical presentation of results obtained by applying the CCA to real data. (a) Two correlation matrices; (b) their MEP; (c) MEP contour diagram (the red ROI is the region of mismatched correlations of the first kind); (d) MEP ROI; applying a color map to the MEP focuses the middle ROI (solid red).]
Article
Full-text available
This paper aims to present a way of multidimensional data-mining termed correlation–comparison analysis (CCA). It was applied to neural data to demonstrate its utility in the neuron-classification problem. The CCA represents a semi-quantitative way of inter-sample comparison. The methodology comprises the generation of inter-parametric correlation and alpha-error (p value) matrices. The main step is the p-comparison for the same parametric pair defined between the two samples. This comparison has a semi-quantitative, binary character that does not involve issues such as the false discovery rate (FDR) in multiple comparisons. As a result, the outcomes obtained are: (1) a correlation match, (2) a correlation mismatch of the first kind (the main type of correlation mismatch), and (3) a correlation mismatch of the second kind (the strongest one, but very rarely observed in biological systems and obtained for a very small number of parameters). The correlation mismatch of the first kind is the target mismatch, i.e., the mismatch of tracing interest, and represents the very reason why the study itself is performed. The application of the CCA led to the effective neuromorphofunctional classification of caudate interneurons into appropriate clusters and their feature-based description. The CCA is a multidimensional, bi-sampled classification tool that can be very useful for explaining the differences between similar samples.
Graphical Abstract: Correlation–comparison analysis is based on the following transcorrelative relations between correlations belonging to two similar samples: (1) a particular inter-parameter correlation is matched between the two samples either when it is statistically insignificant in both samples, or when it is statistically significant in both samples and both correlations have the same direction; (2) correlations are mismatched when statistical significance is reached in one of the samples but not in the other (correlation mismatch of the first kind, the main type of correlation mismatch); (3) correlations are mismatched when statistical significance is reached in both samples but the correlations have opposite directions (correlation mismatch of the second kind, the strongest one, but very rarely observed in biological systems and obtained for a very small number of parameters).
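The match/mismatch taxonomy above is concrete enough to write out directly. A minimal sketch, assuming a Pearson correlation and a significance threshold of α = 0.05 (the abstract does not fix either choice):

```python
from scipy.stats import pearsonr

ALPHA = 0.05  # assumed significance threshold

def cca_compare(pair_a, pair_b, alpha=ALPHA):
    """Classify one inter-parameter correlation across two samples.

    pair_a / pair_b: (x, y) data for the same parameter pair measured
    in sample A / sample B.
    """
    ra, pa = pearsonr(*pair_a)
    rb, pb = pearsonr(*pair_b)
    sig_a, sig_b = pa < alpha, pb < alpha
    if not sig_a and not sig_b:
        return "match"  # insignificant in both samples
    if sig_a and sig_b:
        # Significant in both: matched if same direction, otherwise the
        # rare, strongest mismatch (second kind).
        return "match" if ra * rb > 0 else "mismatch_2nd_kind"
    return "mismatch_1st_kind"  # significant in only one sample
```

Applied over every parameter pair, the first-kind mismatches are the "tracing interest" that drives the cluster classification described in the abstract.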
... In contrast with an artificial neural network, a real neural network (RNN) comprises many neurons whose function follows their form and whose morphology and function are interrelated and depend on each other (Ofer et al., 2017; Grbatinić et al., 2019). Detailed analyses of membrane structure and function have been discussed elsewhere (Hodgkin and Huxley, 1952; Johnson and Winlow, 2018a,b, 2019). ...
Article
Full-text available
Here we provide evidence that the fundamental basis of nervous communication is derived from a pressure pulse/soliton capable of computation with sufficient temporal precision to overcome any processing errors. Signalling and computing within the nervous system are complex and different phenomena. Action potentials are plastic and this makes the action potential peak an inappropriate fixed point for neural computation, but the action potential threshold is suitable for this purpose. Furthermore, neural models timed by spiking neurons operate below the rate necessary to overcome processing error. Using retinal processing as our example, we demonstrate that the contemporary theory of nerve conduction based on cable theory is inappropriate to account for the short computational time necessary for the full functioning of the retina and, by implication, the rest of the brain. Moreover, cable theory cannot be instrumental in the propagation of the action potential because at the activation threshold there is insufficient charge at the activation site for successive ion channels to be electrostatically opened. Deconstruction of the brain neural network suggests that it is a member of a group of quantum phase computers of which the Turing machine is the simplest: the brain is another, based upon phase ternary computation. However, attempts to use Turing-based mechanisms cannot resolve the coding of the retina or the computation of intelligence, as the technology of Turing-based computers is fundamentally different. We demonstrate that coding in the brain neural network is quantum based, where the quanta have a temporal variable and a phase-base variable enabling phase ternary computation, as previously demonstrated in the retina.
... Grbatinić et al., 2019). Detailed analyses of membrane structure and function have been discussed elsewhere (Hodgkin and Huxley, 1952; Johnson and Winlow, 2018a,b, 2019). ...
Preprint
Full-text available
... In these models the timing of action potentials and of computation through the network is synchronized by a clock, as in a conventional computer. We know that "Neuronal morphology and function are definitely interrelated" [13], indicating differing latencies and transmission behaviors for morphologically different neurons. In the retina, the timing of an action potential from one point to another depends upon its transmission speed and the length of the neuron, both of which vary from one neuron to the next. ...
Article
Full-text available
Substantial evidence has accumulated to show that the action potential is always accompanied by a synchronized coupled soliton pressure pulse in the cell membrane, the action potential pulse (APPulse). Furthermore, it has been postulated that, in computational terms, the action potential is a compound ternary structure consisting of two digital phases (the resting potential and the action potential) and a third, time-dependent analogue variable, the refractory period. Together with the APPulse, these phases are described as the computational action potential (CAP), which allows computation by phase. The nature of transmission, and thus computation across membranes, is dependent upon their structures, which have similar components from one neuron to another. Because perception and therefore sentience must be defined by the capabilities of the brain computational model, we propose that phase-ternary mathematics (PTM) is the native mathematical process underlying perception, consciousness and sentience. In this review, we take the CAP concept and apply it to the working of a well-defined neural network, the vertebrate retina. We propose an accurate working computational model of the retina, provide an explanation of the computation of the neural transactions within it using PTM, and provide evidence that could form the basis of understanding neural computation within the entire nervous system. Evidence of phase ternary computation (PTC), defined in phase-ternary mathematics, is presented, showing an exact mathematical correlation between the activity of the amacrine cells, the bipolar cells and the ganglion cells of the retina once these cells have been activated by light falling on the cones. In this model, the computation of the luminosity of multiple cones synapsed to a bipolar cell is performed by phase-ternary mathematics at the points of convergence of CAPs.
Redaction by the refractory periods of converging CAPs eliminates all but the leading APPulse, resulting in sampling and averaging. In phase ternary analysis (PTA), the physiology of synapses defines their primary action as latency changers, changing the time taken for impulses to travel between points of convergence. This paper describes a novel type of computation, PTC, with evidence that it is the fundamental computational method used by the retina and, by association, the rest of the brain. By comparing the morphology of neurons it is now possible to explain their function singly and in networks. This has profound consequences both for our understanding of the brain and in clinical practice.