Conference Paper

Springer-indexed conference publication in the Lecture Notes in Networks and Systems book series (LNNS, volume 385).

Article
This article presents a comparative study of multiple-access schemes for cellular radio systems. Non-orthogonal multiple access (NOMA) is one of the most promising techniques for significantly improving bandwidth efficiency in future wireless cellular systems compared with conventional multiple-access technologies, particularly orthogonal multiple access. NOMA offers greater spectral efficiency and more massive connectivity than orthogonal schemes in fading environments. The Third Generation Partnership Project has recently proposed NOMA for LTE-Advanced (3GPP LTE-A). It is a novel technique aimed at 5G, advanced multimedia applications, and the Internet of Things, in terms of supporting massive heterogeneous data traffic. This review paper focuses on diverse NOMA techniques and provides a detailed outline of the state of the art in NOMA basic principles, power-domain NOMA, and its other variants.
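As a rough illustration of the power-domain NOMA principle this abstract reviews, the sketch below computes achievable two-user downlink rates under superposition coding with successive interference cancellation (SIC); the power split, channel gains, and noise level are assumed example values, not figures from the article.

```python
import numpy as np

def noma_rates(p_total, a_far, g_near, g_far, noise):
    """Achievable rates (bits/s/Hz) for a two-user power-domain NOMA downlink.

    The far (weak-channel) user gets the larger power share a_far and treats
    the near user's signal as interference; the near (strong-channel) user
    removes the far user's signal via SIC before decoding its own.
    """
    a_near = 1.0 - a_far
    r_far = np.log2(1 + (a_far * p_total * g_far) /
                        (a_near * p_total * g_far + noise))
    r_near = np.log2(1 + (a_near * p_total * g_near) / noise)  # after SIC
    return r_near, r_far

# Assumed example: near user has a 10x stronger channel; 80% of power to the far user.
r_n, r_f = noma_rates(p_total=1.0, a_far=0.8, g_near=1.0, g_far=0.1, noise=0.01)
print(f"near-user rate: {r_n:.2f} b/s/Hz, far-user rate: {r_f:.2f} b/s/Hz")
```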
Article
Medical diagnosis using artificial intelligence (AI) is transforming modern medicine. Models built with AI give greater performance in disease prognosis by identifying the risk of future disease onset, which helps achieve significant results in early prediction of diseases. Various machine learning and deep learning models are currently in use for image-processing and disease-prediction problems. Clinical data comprise structured data and imaging modalities that can be used by machine learning classifiers, such as random forests and decision trees, to yield accurate results. This chapter describes the types of clinical image features used in deep learning models and evaluates several image-segmentation functions used in deep learning architectures. Since MRI imaging has gained great significance for disease diagnosis in recent times, deep learning models prove to be the most accurate for disease prediction. Diagnosis deals with predicting whether a given input image is affected by pathology or not, while machine learning models build solutions for disease prognosis with outstanding performance. Prognosis estimates the risk and probability of conversion from early disease symptoms to advanced pathology; when prognosis is performed before the complete symptoms have set in, the disease can be postponed or eliminated, which greatly benefits the human community.
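To make the classical pipeline the chapter mentions concrete, here is a minimal sketch of a random-forest classifier on structured clinical features, using scikit-learn's built-in breast-cancer dataset as a stand-in for the chapter's clinical data.

```python
# Random forest on structured clinical features; the dataset is a stand-in,
# not the chapter's own data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```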
Article
The coronavirus continues to affect people's lives across the globe. Screening infected persons is a vital step because it is fast and low-cost. Chest X-ray images play a significant role here and are used for examination in the detection of coronavirus disease (COVID-19); radiological chest X-rays are readily available at low cost. This survey paper presents a convolutional neural network (CNN)-based solution for detecting COVID-19-positive patients from radiographic chest X-ray images. To test the efficiency of the solution, publicly available datasets of X-ray images of coronavirus-positive and -negative cases are used; images of positive patients and of healthy persons are divided into training and test sets. The solution yields good classification accuracy on the test set. A GUI-based application then supports medical-examination settings: it can run on any computer and be operated by any medical examiner or technician to identify coronavirus-positive patients from radiographic X-ray images. The result is a precise COVID-19 analysis from chest X-ray images, with results retrievable within a few seconds.
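As a hedged illustration of the CNN-based approach the survey describes, the sketch below builds a small binary convolutional classifier for chest X-rays; the architecture, input size, and hyperparameters are assumptions for the example, not the paper's exact network.

```python
# Small binary CNN for X-ray classification (assumed architecture).
from tensorflow.keras import layers, models

def build_cnn(input_shape=(224, 224, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # COVID-positive vs. negative
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn()
model.summary()
```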
Article
Open-source software has been widely used across industries due to its openness and flexibility, but it also brings potential software-security problems. Together with the large-scale growth in the amount and complexity of software, traditional manual methods of dealing with these security issues are inefficient and cannot meet current cyberspace-security requirements. Developing more intelligent technologies to apply to potential security issues in software is therefore an important research topic in the field of software security. The development of deep learning has brought new opportunities to the study of potential security issues in software, and researchers have proposed many automated methods. In this paper, these automation technologies are evaluated and analysed in detail from three aspects: software vulnerability detection, software program repair, and software defect prediction. We also point out some problems with these research methods, give corresponding solutions, and finally look ahead to the application prospects of deep learning in automated vulnerability detection, automated program repair, and automated defect prediction.
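For flavor, a toy sketch of the common baseline such surveys build on: framing vulnerability detection as supervised classification over source-code tokens. The snippets and labels below are fabricated toy data, and the linear model stands in for the deep models (e.g., LSTMs or transformers) the surveyed work actually uses.

```python
# Vulnerability detection as text classification over code tokens (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

snippets = [
    "strcpy(buf, user_input);",                     # unbounded copy -> vulnerable
    "strncpy(buf, user_input, sizeof(buf) - 1);",   # bounded copy   -> safe
    "gets(line);",                                  # unbounded read -> vulnerable
    "fgets(line, sizeof(line), stdin);",            # bounded read   -> safe
]
labels = [1, 0, 1, 0]  # 1 = vulnerable, 0 = safe

clf = make_pipeline(TfidfVectorizer(token_pattern=r"\w+"), LogisticRegression())
clf.fit(snippets, labels)
print(clf.predict(["strcpy(dest, argv[1]);"]))  # likely [1], given "strcpy"
```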
Article
Energy conservation is a significant task in the Internet of Things (IoT) because IoT involves highly resource-constrained devices. Clustering is an effective technique for saving energy by reducing duplicate data. In a clustering protocol, the selection of a cluster head (CH) plays a key role in prolonging the lifetime of a network. However, most cluster-based protocols, including routing protocols for low-power and lossy networks (RPLs), have used fuzzy logic and probabilistic approaches to select the CH node; consequently, nodes near the sink suffer early battery depletion. To overcome this issue, a lion optimization algorithm (LOA) for selecting the CH in RPL is proposed in this study. LOA-RPL comprises three processes: cluster formation, CH selection, and route establishment. A cluster is formed using the Euclidean distance, CH selection is performed using the LOA, and route establishment is implemented using residual-energy information. An extensive simulation is conducted in the network simulator ns-3 on various metrics, namely network lifetime, power consumption, packet delivery ratio (PDR), and throughput. The performance of LOA-RPL is compared with those of RPL, fuzzy rule-based energy-efficient clustering and immune-inspired routing (FEEC-IIR), and the routing scheme for IoT that uses the shuffled frog-leaping optimization algorithm (RISA-RPL). The proposed LOA-RPL increases network lifetime by 20% and PDR by 5%-10% compared with RPL, FEEC-IIR, and RISA-RPL, and is also highly energy-efficient compared with other similar routing protocols.
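A simplified sketch of the clustering idea described here: nodes are grouped by Euclidean distance and a cluster head is chosen per cluster by a fitness score combining residual energy and distance to the sink. The fitness function and the two-way split are hand-rolled stand-ins for the paper's lion optimization algorithm, and all node positions and energies are fabricated.

```python
import math
import random

random.seed(1)
SINK = (50.0, 50.0)

# Fabricated nodes: (x, y, residual_energy_joules)
nodes = [(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0.2, 2.0))
         for _ in range(20)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def fitness(node):
    # Favor high residual energy and proximity to the sink (assumed weighting).
    x, y, energy = node
    return energy - 0.01 * dist((x, y), SINK)

# A naive two-way split by x-coordinate stands in for distance-based clustering.
clusters = [[n for n in nodes if n[0] < 50], [n for n in nodes if n[0] >= 50]]
for i, cluster in enumerate(clusters):
    if not cluster:
        continue
    ch = max(cluster, key=fitness)  # stand-in for LOA-based CH selection
    print(f"cluster {i}: CH at ({ch[0]:.1f}, {ch[1]:.1f}), energy {ch[2]:.2f} J")
```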
Article
Software enhances the working capability of any business, and developing such software entrusts the developing organization with building defect-free software. In this context, we have used the PC1 dataset (a NASA dataset), which has sufficient parameters for analysis. Intelligent techniques using different methodologies have been applied exhaustively to the PC1 data to find the best intelligent technique for software-defect prediction. Because the PC1 data are highly imbalanced, the predictions of the intelligent techniques were biased. To overcome this issue, in this paper we propose the best balancing method, together with an intelligent technique, to predict software defects accurately.
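A minimal sketch of the balancing-plus-classification idea the abstract describes, assuming SMOTE as the oversampling method and a decision tree as the intelligent technique; synthetic data with a PC1-like class imbalance stands in for the actual NASA dataset.

```python
# Oversample the minority (defective) class, then train a classifier.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report
from imblearn.over_sampling import SMOTE  # pip install imbalanced-learn

# ~7% positive class, mimicking the defect-rate imbalance of PC1-like data.
X, y = make_classification(n_samples=1000, n_features=21, weights=[0.93],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
clf = DecisionTreeClassifier(random_state=0).fit(X_bal, y_bal)
print(classification_report(y_te, clf.predict(X_te)))
```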
Chapter
Proof of Work (PoW) protocols, originally proposed to circumvent DoS and email spam attacks, are now at the heart of the majority of recent cryptocurrencies. Current popular PoW protocols are based on hash puzzles. These puzzles are solved via a brute-force search for a hash output with particular properties, such as a certain number of leading zeros. By considering the hash as a random function, and fixing a priori a sufficiently large search space, Grover's search algorithm gives an asymptotic quadratic advantage to quantum machines over classical machines. In this paper, as a step towards a fuller understanding of post-quantum blockchains, we propose a PoW protocol for which quantum machines have a smaller asymptotic advantage. Specifically, for a lattice of rank n sampled from a particular class, our protocol provides as the PoW an instance of the Hermite Shortest Vector Problem (Hermite-SVP) in the Euclidean norm, with a small approximation factor. Asymptotically, the best known classical and quantum algorithms that directly solve SVP-type problems are heuristic lattice sieves, which run in time 2^(0.292n + o(n)) and 2^(0.265n + o(n)) respectively. We discuss recent advances in SVP-type problem solvers and give examples of where the impetus provided by a lattice-based PoW would help explore often complex optimization spaces.
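For context, a sketch of the classical hash-puzzle PoW the abstract takes as its baseline: a brute-force search for a nonce whose SHA-256 digest has a required number of leading zero bits. Grover's algorithm searches such an unstructured space with quadratically fewer queries, which is the quantum advantage the lattice-based proposal aims to shrink. The header bytes and difficulty below are arbitrary example values.

```python
import hashlib
from itertools import count

def solve_pow(header: bytes, difficulty_bits: int) -> int:
    """Return a nonce such that SHA-256(header || nonce) starts with
    `difficulty_bits` zero bits."""
    target = 1 << (256 - difficulty_bits)
    for nonce in count():
        digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

nonce = solve_pow(b"example block header", difficulty_bits=16)
print(f"found nonce {nonce} after ~2^16 expected attempts")
```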