Figure - available from: Computational Intelligence and Neuroscience
(a) A basic ResNet block. (b) A bottleneck block for ResNet-50/101/152. (c) A ResNeXt-50 building block with cardinality 32.
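The three block types in the caption differ mainly in how they spend parameters. The sketch below is an illustrative weight count only (biases and batch norm ignored); the channel widths follow the common ResNet-50 and ResNeXt-50 (32x4d) configurations, which may differ from the exact figure:

```python
# Parameter counts (weights only) for the three block types in the figure,
# assuming 256 input/output channels as in the deeper ResNet stages.

def conv_params(c_in, c_out, k, groups=1):
    """Weights in a k x k convolution, optionally with grouped channels."""
    return (c_in // groups) * c_out * k * k

# (a) Basic block: two 3x3 convs at full width.
basic = 2 * conv_params(256, 256, 3)

# (b) Bottleneck: 1x1 reduce to 64, 3x3 at 64, 1x1 expand back to 256.
bottleneck = (conv_params(256, 64, 1)
              + conv_params(64, 64, 3)
              + conv_params(64, 256, 1))

# (c) ResNeXt block, cardinality 32: the 3x3 conv is split into 32 groups
# of 4 channels each, with a wider 128-channel middle stage.
resnext = (conv_params(256, 128, 1)
           + conv_params(128, 128, 3, groups=32)
           + conv_params(128, 256, 1))

# basic is roughly 1.18M weights; bottleneck and ResNeXt are both ~70K,
# which is why (b) and (c) scale to 50+ layer networks.
print(basic, bottleneck, resnext)
```

The grouped 3x3 conv in (c) costs almost exactly the same as the plain 3x3 in (b) despite the doubled width, which is the trade the cardinality parameter buys.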


Source publication
Article
Full-text available
Breast cancer is a lethal illness with a high mortality rate, and diagnostic accuracy is crucial to treatment. Machine learning and deep learning can assist doctors here. The proposed backbone network is critical to the performance of CNN-based detectors. Integrating dilated convolution, ResNet, and AlexNet increases detection...
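The appeal of dilated convolution in such a backbone is that it enlarges the receptive field without adding weights. Below is the standard receptive-field arithmetic for stride-1 convolution stacks; the function name and layer choices are illustrative, not taken from the paper:

```python
def receptive_field(kernel_sizes, dilations):
    """Receptive field of a stack of stride-1 convolutions: each layer
    adds dilation * (kernel - 1) pixels to the field seen by one output."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += d * (k - 1)
    return rf

# Three plain 3x3 convs vs. three 3x3 convs with dilations 1, 2, 4 --
# same parameter count, more than double the receptive field.
plain = receptive_field([3, 3, 3], [1, 1, 1])    # 7
dilated = receptive_field([3, 3, 3], [1, 2, 4])  # 15
print(plain, dilated)
```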

Citations

... AAAs can degenerate without therapy, eventually leading to a rupture and a potentially deadly haemorrhage. As a result, it is critical to obtain an accurate diagnosis and begin therapy as soon as possible in order to enhance patient outcomes and limit the risk of rupture [1]. Endovascular aneurysm repair (EVAR) and open aneurysm repair (OAR) are two typical techniques for treating AAAs and lowering the risk of rupture. ...
Conference Paper
This study concentrates on a potential strategy for improving abdominal aortic aneurysm (AAA) segmentation in CT imaging. The study's dataset included 19 CT images of healthy AAAs acquired from different patients, together with ground-truth segmentations matching the training data. For enhanced accuracy when segmenting AAAs, the suggested technique integrates pre-processing, deep learning segmentation, and post-processing into a cohesive workflow. Convolutional neural networks (CNNs) with a U-Net architecture are trained for the deep learning segmentation, and noise in the input images is removed during pre-processing. This differs from the common practice of employing linear regression for image segmentation. Results were measured using the Dice similarity coefficient (DSC), sensitivity, specificity, accuracy, recall, and F1 score. With a DSC of 0.81, sensitivity of 0.89, specificity of 0.97, accuracy of 0.86, recall of 0.89, and F1 score of 0.86, the suggested approach proved effective for segmenting AAAs in CT images. The technique shows potential for improving AAA diagnosis and treatment planning in clinical settings.
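The overlap metrics reported above follow directly from the confusion counts of two binary masks. A minimal self-contained sketch, with toy flattened masks invented for illustration:

```python
def segmentation_metrics(pred, truth):
    """Binary-mask overlap metrics of the kind reported above.
    pred and truth are equal-length sequences of 0/1 pixel labels."""
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(pred, truth))
    dice = 2 * tp / (2 * tp + fp + fn)
    sensitivity = tp / (tp + fn)   # identical to recall
    specificity = tn / (tn + fp)
    return dice, sensitivity, specificity

# Toy example: the predicted mask misses one foreground pixel
# and adds one spurious one.
truth = [1, 1, 1, 0, 0, 0, 0, 0]
pred  = [1, 1, 0, 1, 0, 0, 0, 0]
print(segmentation_metrics(pred, truth))
```

Note that sensitivity and recall are the same quantity, which is why the paper's values for the two coincide (both 0.89).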
... This makes them useful for determining time-series relationships. Variational autoencoders (VAEs) detect anomalies effectively, with an accuracy of 0.93 [13]. Generative adversarial networks (GANs) can create realistic normal data for anomaly diagnosis, with a precision of 0.91. ...
Conference Paper
This research introduces a novel anomaly detection framework for IoT-based smart grid cybersecurity systems. Leveraging autoencoders, LSTM networks, GANs, SOMs, and transfer learning, the approach achieves superior precision, recall, and execution time compared to existing methods. Visualizations and an ablation study further validate the method's efficiency, emphasizing the critical roles of attention mechanisms and transfer learning. The ablation and computational analyses show that attention mechanisms and transfer learning enable faster problem solving in a dynamic smart grid, indicating the real-time efficacy of the anomaly detection method. The resulting solution is versatile and adaptive enough to address the dynamic challenges of anomaly detection in IoT-driven smart grid cybersecurity systems in real-world applications.
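A common way autoencoder-based frameworks of this kind score anomalies (the abstract does not publish its exact rule) is to threshold a per-sample reconstruction error and evaluate the resulting flags with precision and recall. A toy sketch with invented error values and labels:

```python
# Hypothetical reconstruction errors from an autoencoder: normal traffic
# reconstructs well (low error), anomalous traffic does not.

def flag_anomalies(errors, threshold):
    """Flag a sample as anomalous when its reconstruction error is high."""
    return [1 if e > threshold else 0 for e in errors]

def precision_recall(flags, labels):
    """Precision and recall of the anomaly flags against ground truth."""
    tp = sum(f and l for f, l in zip(flags, labels))
    fp = sum(f and not l for f, l in zip(flags, labels))
    fn = sum((not f) and l for f, l in zip(flags, labels))
    return tp / (tp + fp), tp / (tp + fn)

errors = [0.1, 0.2, 0.9, 0.15, 0.8, 0.25]   # per-sample reconstruction error
labels = [0,   0,   1,   0,    1,   0]      # 1 = true anomaly
flags = flag_anomalies(errors, threshold=0.5)
print(precision_recall(flags, labels))      # (1.0, 1.0) on this toy data
```

In practice the threshold is tuned on a validation set, trading precision against recall.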
... 3.1.2. Composite Dilated Backbone Network. Vinodkumar Mohanakurup et al. [17] performed breast cancer detection on histopathological images using a composite dilated backbone network. The developed method introduces a composite dilated backbone network (CDPN) that integrates multiple similar backbones into a robust backbone for breast cancer detection in histopathological images. ...
Article
Full-text available
In recent years, breast cancer classification has become a primary subject for biology and health care, given that cancer is the second leading cause of death in women. The medical community has consequently seen advances in research on techniques to screen for and identify threatening diseases such as breast cancer. This survey analyses the various deep learning (DL) approaches for multi-class classification of breast cancer histopathological images, discussing the significant assumptions, limitations, and advantages of the existing DL-based segmentation and classification techniques. The performance of the existing methods is analysed using measures such as accuracy, precision, recall, sensitivity, specificity, and F1-score. The survey concludes that DL-based multi-classification of breast cancer histopathological images can overcome the drawback of inefficient supervised features and enhance efficiency.
... To address this problem, a unique framework is developed. This method [22] combines CNNs with traditional numerical methods to make the modelling of complicated thermal dynamics more accurate, useful, and flexible. Table 3 displays the simulation parameters for the four techniques in this dataset. ...
Article
Full-text available
This research describes a novel technique for anticipating unstable heat transfer in porous media, combining convolutional neural networks (CNNs) and long short-term memory (LSTM) networks with the finite volume method (FVM). Heat transport networks are difficult to characterise with traditional numerical methodologies owing to their nonlinearity and complexity. The proposed solution pairs FVM's precise physical modelling with the superior pattern identification of CNNs and the temporal analysis of LSTMs; this hybrid framework makes simulations of heat transport dynamics in porous materials more accurate, efficient, and adaptable. The experimental setup focused on porous material properties and gathered and processed a large amount of data; the three-dimensional shape, heat transfer, and time were investigated, along with temporal fluctuations. Multiple indicators are used to evaluate the overall performance of the model, including convergence speed, F1 score, accuracy, precision, recall, and computational cost. In the most notable numerical results, the proposed strategy surpasses both the finite element and lattice Boltzmann methods, enabling fast convergence and reduced processing costs: accuracy 0.92, precision 0.93, recall 0.91, and F1 score 0.92. The proposed method is generalizable and adaptable and can address a variety of heat transport simulation problems in porous media. CNNs identify significant spatial patterns, while LSTM cells capture temporal dynamics; both components are required to model heat transfer, a continually changing phenomenon. The methodology enables more complex simulations while lowering processing expenses and improving the accuracy of estimations. Finally, the CNN-FVM-LSTM technique simulates heat transport using complex computational models, and predicting unusually high temperatures in porous materials may improve the model's accuracy, computational efficiency, and flexibility.
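The FVM half of the hybrid can be illustrated with a one-dimensional explicit finite-volume update for heat conduction; the learned CNN/LSTM parts are not reproduced here. The grid size, diffusivity, and time step below are illustrative choices (the explicit scheme is stable here since alpha*dt/dx^2 = 0.25 <= 0.5):

```python
# One explicit finite-volume step for 1-D heat conduction with
# insulated (zero-flux) boundaries. Cell values are cell averages.

def fvm_heat_step(T, alpha, dx, dt):
    """Advance cell-average temperatures one time step.
    Flux between cells i and i+1 is -alpha * (T[i+1] - T[i]) / dx."""
    n = len(T)
    flux = [-alpha * (T[i + 1] - T[i]) / dx for i in range(n - 1)]
    new_T = T[:]
    for i in range(n):
        left = flux[i - 1] if i > 0 else 0.0     # insulated boundary
        right = flux[i] if i < n - 1 else 0.0    # insulated boundary
        new_T[i] -= dt / dx * (right - left)
    return new_T

T = [100.0, 0.0, 0.0, 0.0]   # hot cell at one end of a 4-cell bar
for _ in range(200):
    T = fvm_heat_step(T, alpha=1.0, dx=1.0, dt=0.25)
print(T)   # temperatures equalise toward the mean of 25.0
```

Because each cell loses exactly the flux its neighbour gains, the scheme conserves total heat by construction, which is the property that makes FVM attractive as the physics backbone of such a hybrid.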
... The spreadsheet contained sections and columns including aesthetics, communication, content, functionality, category, and criticality (critical, essential, good to have, avoid). Each entry had to include at least the following, in order: a connection to the study of neurology (theory, technology, or methodology); an analysis of its importance to the neurosciences; NP reference(s) with an illustration, screenshot, or online resource link; and a project reference (theory, approach, or technique). Although each pair kept its own sheet, all participants had unrestricted access to the master sheet and could read the data provided by the other pairs (Mohanakurup et al., 2022). Examples of potential supporting evidence include a platform already in use, the instructional design for an e-learning module currently being built, and the use of various neuropedagogical concepts. ...
... One of the most interesting features of these wearable technologies is their potential to encourage a more preventative approach to healthcare. In the past, healthcare services have been geared toward treating people who are already sick [4]. These tools, however, can be used to catch health problems before they develop, so that they can be treated quickly. ...
Article
Full-text available
The study presents a complete plan for lowering disease burden through the use of ICT in personal healthcare. The approach comprises Health Pattern Recognition (HPR), Dynamic Risk Assessment (DRA), and Personalized Intervention Strategy (PIS) components, which are used to collect, prepare, and use data, with a particular focus on cybersecurity. Together they offer a comprehensive disease prevention approach in personal healthcare that takes advantage of technological advancements. Because they integrate secure data processing with privacy-preserving machine learning, these components assure the safety and validity of health data collected from wearable devices and allow for the assessment of medical records. Analysing the technique's accuracy and its adherence to established security standards helps evaluate its application to disease prediction and preventive health management. The HPR component applies machine learning to each person's health information to find disease trends and other outcomes, supporting early evaluation and preventive healthcare management. DRA keeps a person's risk rating up to date so that it reflects any changes in their health; people are then given choices based on the outcomes and risks that PIS has predicted. The suggested method was compared to industry standards using accuracy, sensitivity, specificity, precision, and the Matthews correlation coefficient, and it performs well, with better predictive power, fewer false positives, and more users involved in preventive health management.
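One plausible reading of the DRA step (the abstract does not give its formula) is an exponential moving average that re-weights a running risk score whenever a new health reading arrives. The function name, weight, and risk values below are all illustrative assumptions:

```python
# Hypothetical dynamic-risk update: blend the risk inferred from the
# latest wearable reading into the person's running risk score.

def update_risk(current_risk, new_reading_risk, weight=0.3):
    """Exponential moving average of risk in [0, 1].
    weight controls how fast the score reacts to new readings."""
    return (1 - weight) * current_risk + weight * new_reading_risk

risk = 0.2                       # baseline risk for this person
for reading in [0.2, 0.6, 0.8]:  # risks inferred from successive readings
    risk = update_risk(risk, reading)
print(round(risk, 4))            # 0.464
```

The weight is the design knob: a high value makes the score responsive to a single bad reading, a low value smooths out sensor noise.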
... for multilayer fusion score-level computations since they can handle a broad range of high-dimensional datasets. They can evaluate the value of characteristics, which is critical for identifying which variables are most significant in complex fusion tasks [13]. Because of their interpretability and resistance to overfitting, random forests are useful in a variety of disciplines, including research and commerce. ...
Article
Full-text available
This research introduces a novel technique for determining multiple fusion score levels that operates effectively across various datasets and purposes. The four components of the system work in harmony: Feature Engineering, Ensemble Learning, deep neural networks (DNNs), and Transfer Learning. In the feature engineering phase, raw data undergoes a complete transformation, emphasizing the significance of PCA and MI for predictive power. AdaBoost is incorporated during ensemble learning, repeatedly instructing weak learners and adjusting weights based on errors to create a robust ensemble model. Weighted input processing, ReLU activation, and dropout layers seamlessly integrate with DNNs, revealing subtle data patterns and correlations. In transfer learning (fine-tuning), a trained model is adapted for the feature-engineered dataset. In comparative testing, the proposed technique demonstrated higher accuracy, precision, recall, F1 score, AUC-ROC, and shorter training duration. Efficiency measures reduce reasoning time, memory usage, parameter count, model size, and energy consumption. Visualizations illustrate resource consumption, method scores, and the distribution of reasoning time in the research. This mathematical framework enhances the computation of multilayer fusion score levels, performs effectively, and proves versatile across various scenarios, making it a reliable choice for large and diverse datasets.
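Score-level fusion of the kind described above is often implemented as per-model min-max normalisation followed by a weighted sum. The sketch below uses invented scores and weights (e.g. weights proportional to each model's validation accuracy), not values from the paper:

```python
# Weighted score-level fusion of two hypothetical models whose raw
# scores live on different scales, so normalisation is essential.

def minmax(scores):
    """Rescale a list of scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def fuse(score_lists, weights):
    """Weighted sum of normalised scores; one fused score per sample."""
    total = sum(weights)
    norm = [minmax(s) for s in score_lists]
    n_samples = len(score_lists[0])
    return [sum(w * m[i] for w, m in zip(weights, norm)) / total
            for i in range(n_samples)]

cnn_scores = [0.2, 0.9, 0.4]   # three samples scored by two models
svm_scores = [10, 80, 30]      # note the different scale
fused = fuse([cnn_scores, svm_scores], weights=[0.6, 0.4])
print([round(f, 3) for f in fused])   # [0.0, 1.0, 0.286]
```

Without the normalisation step, the larger-scale model would dominate the sum regardless of its weight, which is why fusion pipelines normalise before combining.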
... They can evaluate the value of characteristics, which is critical for identifying which variables are most significant in complex fusion tasks. 15 Because of their interpretability and resistance to overfitting, random forests are useful in a variety of disciplines, including research and commerce. A comprehensive knowledge of the model's decision-making process is as vital as having accurate projections. ...
... This is because of the simplicity with which they can store and handle enormous amounts of data. This introduction sets the stage for a subsequent, more in-depth examination of the critical roles that AI and data mining play in the quest for solutions to these problems [4]. In the contemporary healthcare system, a complex interplay of variables characterizes the allocation of resources and the planning of hospital capacity [5]. ...