* For correspondence.
Journal of Environmental Protection and Ecology 22, No 5, 1825–1835 (2021)
Atmospheric pollution
NOVEL HYBRID ARTIFICIAL INTELLIGENCE-BASED
ALGORITHM TO DETERMINE THE EFFECTS OF AIR
POLLUTION ON HUMAN ELECTROENCEPHALOGRAM
SIGNALS
M. A. BERLINa*, NIRAJ UPADHAYAYAb, ALI ALGAHTANIc,
VINEET TIRTHc, SAIFUL ISLAMd, K. MURALIe,
PRAVIN R. KSHIRSAGARf*, BUI THANH HUNGg,
PRASUN CHAKRABARTIh, PANKAJ DADHEECHi
aDepartment of Computer Science and Engineering, RMD Engineering College,
Chennai, Tamil Nadu, India
E-mail: mba.cse@rmd.ac.in
bDepartment of Computer Science, J. B. Institute of Engineering and
Technology, 500 075 Telangana, India
cDepartment of Mechanical Engineering, College of Engineering, King Khalid
University, 61 411 Abha, Kingdom of Saudi Arabia; Research Centre for
Advanced Materials Science (RCAMS), King Khalid University, Guraiger,
61 413 Abha, Asir, Kingdom of Saudi Arabia
dDepartment of Civil Engineering College of Engineering, King Khalid
University, 61 413 Abha, Asir, Kingdom of Saudi Arabia
eDepartment of Electronics and Communication Engineering, Vijaya Institute
of Technology for Women, Enikepadu, Andhra Pradesh, Hyderabad, Telangana,
India
fDepartment of Electronics and Communication Engineering, AVN Institute of
Technology & Engineering, Hyderabad, Telangana, India
gData Analytics & Artificial Intelligence Laboratory – DAAI Lab, Artificial
Intelligence and Information System Programme, Institute of Engineering –
Technology, Thu Dau Mot University, 06 Tran Van On Street, Phu Hoa District,
Thu Dau Mot City, Binh Duong Province, Vietnam
hProvost and Institute Endowed Distinguished Senior Chair, Techno India NJR
Institute of Technology, 313 003 Udaipur, Rajasthan, India
iComputer Science and Engineering, Swami Keshvanand Institute of Technology,
Management and Gramothan (SKIT), 302 017 Jaipur, Rajasthan, India
Abstract. Air pollution is a serious global issue. The complicated, heterogeneous, and growing mix of gases, fluids, and particulate matter emitted by cars, factories, and households damages the atmosphere and also harms human health. In preliminary research, the risk of coronary heart disease from short- and long-term exposure to existing levels of ambient particulate matter was consistently elevated. These challenges can be modelled with artificial intelligence (AI), whose benefit is that it can resolve the problem without explicit knowledge of the conceptual link between the input and output data, even under conditions of partial data. The AI approach may be used to accurately identify coronary disease and numerous other conditions in order to prevent heart problems. This paper provides a comprehensive analysis of the data on air pollution and cardiovascular disease for healthcare experts and regulatory authorities and also differentiates individuals with cardiovascular disease from healthy individuals. A decision support system based on AI technology can help physicians better diagnose heart patients. When features were chosen by the correlation-based feature subset selection algorithm, the logistic regression classifier with 20-fold cross-validation exhibited the greatest accuracy of 92%.
Keywords: environmental factors, air pollution, cardiovascular diseases, artificial intelligence (AI), correlation-based feature subset selection, relief feature selection algorithm, logistic regression, Naïve Bayes, k-nearest neighbour algorithm.
AIMS AND BACKGROUND
Air pollution is an important risk factor for the respiratory and cardiovascular systems and for the environmental conditions affecting human health. Different environmental protection authorities in different nations have recently laid down nationwide ambient air quality requirements for the preservation of public health and the environment1,2. Particulate matter (PM), ground-level ozone (O3), carbon monoxide (CO), sulphur oxides (SO2), nitrogen oxides (NOx) and lead are defined in this set as “critical pollutants”. Among these six contaminants, O3 is one of the most toxic emissions, with detrimental impacts on human health and agricultural development3. Several researchers in the field of air pollution prediction have recently turned to the development and implementation of modelling approaches such as statistical models, artificial neural networks (ANN), grey models, and other hybrid techniques4–6. Governments, companies, and households can thereby not only respond more promptly and economically to air pollution issues but also gain more insight before committing to more expensive corrective actions6–8.
Kshirsagar et al.4,9,10 discussed the electrocardiogram (ECG), a bio-electric signal that records the electrical function of the heart over time. In the diagnosis of cardiac illnesses and the choice of suitable therapy for a patient, timely and precise diagnosis is essential. ECG signals are utilised as parameters for cardiac disease identification, with the majority of the data being derived from the MIT-BIH dataset and the PhysioNet data bank. The ECG signal is preprocessed using a wavelet toolbox, which is also utilised to denoise the signal. Hazra et al.11 reviewed some of the existing research on cardiac disease prediction utilising data mining methods, analysed the many combinations of mining algorithms employed, and determined which strategies are beneficial and successful. In addition, certain possible improvements to the prediction model have been discussed. Begum et al.12 illustrated the application of several techniques to heart disease datasets; the random forest classifier machine learning method demonstrated its accuracy and reliability in the suggested system. Skrzypski et al.1 suggested an air pollution prediction system based on artificial neural networks, in which the level of the toxic gases, or a given range of levels (class of air quality), can be predicted2. Comparable results suggest that air pollution is a key avoidable cause of increasing respiratory problems and irregular heartbeats.
EXPERIMENTAL
Hazardous air pollutants cause heart disorders such as heart attack from blockage (arterial occlusion) and cardiac tissue death related to lack of oxygen, resulting in potential loss of heart muscle (infarct formation). Various classification systems for the diagnosis of cardiac disease have been checked for their success on complete and selected feature sets. Feature selection methods such as Relief, maximum relevance – minimum redundancy, and the least absolute shrinkage and selection operator have been used, and the efficiency of the classifiers has been checked on the features thus extracted. Logistic regression, K-NN, and NB were used as common artificial intelligence classifiers11,12. The approach for model validation and analysis of the performance measures is shown in the block diagram in Fig. 1.
A variety of possible mechanisms have been proposed through which air pollution triggers immediate effects on the cardiovascular system, plasma, and lung receptors, as indicated in Fig. 1, as well as negative impacts caused by oxidative stress and chronic inflammation. Altered autonomic regulation may also lead to arterial plaque destabilisation and the onset of cardiac arrhythmias. Severe cardiovascular reactions to PM exposure, such as myocardial infarction, may follow directly from such air pollution conditions1.
Particulate matter (PM) is a collection of solid and liquid particles that influence human health significantly depending on their size: PM10, with diameters < 10 μm, reaches the lungs, while finer particles with diameters < 2.5 μm (PM2.5) penetrate further into the lung (Ref. 6).
[Figure 1: block diagram. The upper part links air pollution (PM10, PM2.5) to heart disease through lung inflammation, blood translocation, and autonomic regulation. The lower part shows the proposed pipeline: heart disease database → pre-processing → feature extraction and reduction (CFS, Relief, LASSO) → AI algorithm (LR, KNN, NB) with the k-fold cross-validation method → model prediction of the existence or absence of heart disease caused by air pollution.]
Fig. 1. Block diagram of the hybrid intelligent system for predicting heart disease caused by air pollution
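As an illustration of the pipeline in Fig. 1, the following minimal Python sketch chains pre-processing, feature reduction, a classifier, and k-fold cross-validation. It assumes a tabular feature matrix X and binary labels y; scikit-learn does not implement CFS or Relief, so a univariate F-test selector stands in for the paper's feature selection stage, and logistic regression stands in for the AI algorithm. The data below are random placeholders.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 13))                         # placeholder data: 13 clinical features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=300) > 0).astype(int)

pipeline = Pipeline([
    ("scale", StandardScaler()),                       # pre-processing
    ("select", SelectKBest(f_classif, k=8)),           # feature extraction and reduction (stand-in)
    ("clf", LogisticRegression(max_iter=1000)),        # AI algorithm (LR)
])

# k-fold cross-validation as in the block diagram (the paper uses 20 folds)
scores = cross_val_score(pipeline, X, y, cv=20, scoring="accuracy")
print("mean CV accuracy: %.3f" % scores.mean())
```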
Database. The largest dataset available from the 2016 PhysioNet Challenge was used to test the proposed technique. The PhysioNet database comprises six datasets (A to F) containing 3240 raw heart sound recordings in total. These samples were obtained separately by various research groups in clinical and non-clinical (i.e. home visit) settings, using heterogeneous sensor devices in different countries9. The dataset includes clean and noisy cardiovascular sounds. The recordings were obtained from both healthy individuals and patients with various cardiac problems, in particular coronary artery disease and heart valve disease. Subjects included children, adults, and elderly people from various age groups. The recorded heart sounds ranged in length between 5 and 120 s (Ref. 11). Typical heart sounds normally consist of a first sound (S1) and a second sound (S2). S1 arises when the mitral and tricuspid valves close owing to the sudden increase in pressure within the ventricles at the start of the isovolumetric ventricular contraction; S2 arises when the aortic and pulmonary valves close at the beginning of diastole12.
Preprocessing of heart sound. The electronic stethoscope recording of the heart sound often contains background noise. The preprocessing of cardiac sound is a critical phase in the automatic analysis of heart beat recordings. It exposes the underlying physiological structure of the cardiac signal through the identification of anomalies in the meaningful PCG signal regions and enables the automated detection of pathological events7. Segmentation is the detection of the precise locations of the first and second heart sounds (S1 and S2). The key objective of this procedure is to ensure that the incoming heartbeat is aligned correctly before classification, because this improves the recognition score significantly. A pattern of ECG beats is the input to this method. A few simple steps are taken to extract ECG beats from a given ECG signal (a code sketch follows the list)4,5:
1. Standardisation of the ECG to the range from zero to one.
2. Finding the R-peaks of the ECG in the Massachusetts Institute of Technology – Beth Israel Hospital (MIT-BIH) arrhythmia database from their corresponding annotation file.
3. Separating the continuous ECG signal into a heartbeat sequence based on the peaks obtained, attaching a marker in the annotation file for every heartbeat.
4. Redimensioning each heartbeat to a fixed length (280 samples).
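The sketch below is a rough illustration of steps 1–4, assuming the ECG samples and the annotated R-peak indices have already been loaded (for example with the wfdb package) into NumPy arrays. The ±0.4 s window around each peak is an assumption for illustration, not a value stated in the paper.

```python
import numpy as np
from scipy.signal import resample

def extract_beats(ecg, r_peaks, fs=360, beat_len=280):
    """Normalise the ECG to [0, 1], cut one window per annotated R-peak and
    resample every beat to a fixed length of `beat_len` samples."""
    ecg = (ecg - ecg.min()) / (ecg.max() - ecg.min())   # step 1: standardisation to [0, 1]
    half = int(0.4 * fs)                                # assumed window of +/-0.4 s around the peak
    beats = []
    for peak in r_peaks:                                # steps 2-3: use the annotated peaks
        if peak - half < 0 or peak + half > len(ecg):
            continue                                    # skip incomplete beats at the edges
        beat = ecg[peak - half: peak + half]
        beats.append(resample(beat, beat_len))          # step 4: fixed length of 280 samples
    return np.asarray(beats)
```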
FEATURE EXTRACTION AND REDUCTION
One of the main steps in classification is feature selection and reduction, since classification cannot be accurate if the right features are not selected. This dimensionality reduction helps to reduce the database size while speeding up the inference procedure, particularly for a large database. For that purpose there are various algorithms: correlation-based feature subset selection (CFS), the Relief feature selection algorithm, and the least absolute shrinkage and selection operator (LASSO).
Correlation-based feature subset selection (CFS). A valuable feature subset contains features that are highly correlated with the class yet uncorrelated with one another13,14. A feature-subset evaluation function, based on test theory assumptions, gives the estimate defined in equation (1):

rfc = k r̄fc / [k + k(k – 1) r̄ff]^1/2, (1)

where rfc is the correlation between the summed features and the class variable; k – the number of features; r̄fc – the average correlation between the features and the class variable; r̄ff – the average inter-correlation between the features.
Continuous features are converted into categorical attributes using supervised discretisation as a preprocessing step before they are used in classification tasks, so that both the nominal or categorical and the continuous or ordinal features can be handled by equation (1). Information gain theory is used to estimate the extent of the association between nominal characteristics. Furthermore, with n initial features there are 2^n possible reduced feature subsets, so, particularly for a large feature set, it would be unrealistic to examine every subset to find the best one. In different applications, both filter-type and wrapper-type feature selection methods use correlation-based approaches.
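For illustration, the sketch below evaluates the CFS merit of equation (1) for a candidate feature subset, assuming numeric features. The absolute Pearson correlation stands in for the feature-class and feature-feature correlations; the paper's CFS uses discretised features and an information-gain-based measure, so this is a simplified stand-in.

```python
import numpy as np

def cfs_merit(X, y):
    """CFS merit of equation (1) for the feature subset in the columns of X."""
    k = X.shape[1]
    # average feature-class correlation (absolute Pearson correlation)
    r_fc = np.mean([abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(k)])
    # average feature-feature inter-correlation
    pairs = [abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
             for i in range(k) for j in range(i + 1, k)]
    r_ff = np.mean(pairs) if pairs else 0.0
    return k * r_fc / np.sqrt(k + k * (k - 1) * r_ff)

# Example: compare the merit of two candidate subsets on placeholder data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
print(cfs_merit(X[:, :3], y), cfs_merit(X[:, 3:], y))
```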
Relief feature selection algorithm. Relief is a distance-based method that assesses the weight of a feature by observing how its value differs between a randomly selected instance and its nearest neighbours: the nearest instance of the same class (the nearest hit) and the nearest instance of a different class (the nearest miss). The greater the probability that the feature takes a different value on the nearest miss, the greater the weight of the feature; conversely, the weight of the feature is inversely proportional to the probability that it takes a different value on the nearest hit. The Relief weight of a feature X, denoted D[X], is given in equation (2):

D[X] = P(different value of X | nearest instance from a different class) –
P(different value of X | nearest instance from the same class). (2)
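The following sketch shows one pass of the basic Relief update, which approximates the probabilities in equation (2) by feature-value differences to the nearest hit and nearest miss. It assumes numeric features scaled to [0, 1] and binary labels; the Manhattan distance and the number of iterations are assumptions for illustration.

```python
import numpy as np

def relief_weights(X, y, n_iter=100, rng=np.random.default_rng(0)):
    """One-pass Relief weights: larger weight = more class-relevant feature."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.abs(X - X[i]).sum(axis=1)        # Manhattan distance to the chosen instance
        dist[i] = np.inf                           # exclude the instance itself
        same, diff = (y == y[i]), (y != y[i])
        hit = np.where(same & (dist == dist[same].min()))[0][0]    # nearest hit
        miss = np.where(diff & (dist == dist[diff].min()))[0][0]   # nearest miss
        # weight grows with the miss difference, shrinks with the hit difference
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_iter
```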
Least absolute shrinkage and selection operator. The least absolute shrinkage and selection operator (LASSO) picks features by penalising the absolute values of the regression coefficients, so that the coefficients of unimportant features shrink towards zero. The LASSO works particularly well when many coefficients are small14. AI classification techniques are then used to distinguish heart patients from healthy individuals. Some popular classification algorithms are discussed briefly together with their theoretical basis: logistic regression; Naïve Bayes; k-nearest neighbour.
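A hedged sketch of LASSO-based feature selection with scikit-learn is given below: features whose coefficients are shrunk to zero are dropped. The regularisation strength alpha is an assumption and would need tuning on the actual data.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_select(X, y, alpha=0.01):
    """Return the indices of features with non-zero LASSO coefficients."""
    model = Lasso(alpha=alpha, max_iter=10000).fit(X, y)
    keep = np.flatnonzero(np.abs(model.coef_) > 1e-8)   # surviving features
    return keep, model.coef_

# Example on placeholder data: features 0 and 2 carry the signal
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = X[:, 0] - 2 * X[:, 2] + 0.1 * rng.normal(size=200)
keep, coefs = lasso_select(X, y)
print("selected feature indices:", keep)
```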
Logistic regression. The logistic regression HSMM is similar to an SVM-based emission probability model and allows better discrimination between the various states. Logistic regression is a binary classifier that uses a logistic function to map the feature space (the prediction variables) to the binary response variable7,9,11. The logistic function σ(α) is defined as

σ(α) = 1/(1 + exp(–α)). (3)

Using the logistic function, it is possible to define the probability of a state or class given the input observation Ot:

P[qt = α | Ot] = σ(wOt), (4)

where w denotes the weights of the framework used for every observation or input. The classifier is constructed and its weights are fitted to the dataset by least squares10. The probability of each observation given the state for the one-versus-all logistic regression, bj(Ot), is found by using Bayes' rule:

bj(Ot) = P[Ot | qt = αj] = P[qt = αj | Ot] P(Ot) / P(αj). (5)

P(Ot) can be estimated from a multivariate normal distribution fitted to the entire training data, and P(αj) is the prior probability of the state.
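A numerical sketch of equations (3)–(5) is shown below. It assumes a trained weight vector w for one state, a multivariate normal fitted to all training observations for P(Ot), and a prior P(αj); all numbers are placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import multivariate_normal

def sigmoid(a):                                   # equation (3)
    return 1.0 / (1.0 + np.exp(-a))

def emission_probability(o_t, w, obs_density, prior_j):
    """b_j(O_t) = P(O_t | q_t = j) via Bayes' rule, equation (5)."""
    posterior = sigmoid(np.dot(w, o_t))           # equation (4): P(q_t = j | O_t)
    return posterior * obs_density.pdf(o_t) / prior_j

# Example usage with placeholder observations of dimension 4
train_obs = np.random.default_rng(1).normal(size=(500, 4))
obs_density = multivariate_normal(train_obs.mean(axis=0), np.cov(train_obs.T))
b = emission_probability(train_obs[0], w=np.ones(4) * 0.1,
                         obs_density=obs_density, prior_j=0.5)
print(b)
```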
Bayesian networks. Naïve Bayes (NB) is a supervised learning technique based on Bayes' theorem, which assigns a new instance to the class with the highest posterior probability. NB uses the dataset to evaluate the likelihood of a feature vector for each class5,13; the class of an individual vector is determined by its likelihood value. NB is widely used for the classification of texts, where the most probable class is assigned to a test text using Bayes' rule:

G(A|X) = G(X|A) G(A) / G(X), (6)

where G(A|X) is the posterior probability of the class; G(A) – the prior probability of the class; G(X|A) – the likelihood, i.e. the probability of the predictor given the class; and G(X) – the prior probability of the predictor.
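The minimal Gaussian Naïve Bayes sketch below illustrates equation (6) on random placeholder data: the classifier estimates the priors G(A) and the class-conditional likelihoods G(X|A) and predicts the class with the highest posterior G(A|X).

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # placeholder continuous features
y = (X[:, 0] > 0).astype(int)                  # placeholder binary labels

nb = GaussianNB().fit(X[:150], y[:150])        # estimates priors G(A) and likelihoods G(X|A)
posteriors = nb.predict_proba(X[150:])         # posterior G(A|X), normalised by the evidence G(X)
predictions = nb.predict(X[150:])              # class with the highest posterior
print(predictions[:10], posteriors[0])
```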
k-Nearest neighbour. The k-nearest neighbour approach (KNN) classifies a data point according to its closest neighbours, with the value of k defining the number of closest neighbours to be examined14,15. This instance-based approach captures the fundamental structure of the data with little explicit model training: all the data are divided into a test set and a training set, the distance between each test sample and all the training samples is calculated, and the class is assigned by the majority of the k nearest training samples.
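As a sketch, the snippet below fits a KNN classifier with k = 12, the value reported to perform best in Fig. 2; the Euclidean distance is scikit-learn's default, and the data are random placeholders.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

knn = KNeighborsClassifier(n_neighbors=12).fit(X[:150], y[:150])   # k = 12
print("hold-out accuracy:", knn.score(X[150:], y[150:]))
```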
RESULT AND DISCUSSION
This section addresses diverse perspectives on the classification process and its consequences. First, the efficiency of different artificial intelligence algorithms, including logistic regression, KNN, and Naïve Bayes, was evaluated on the complete feature set to establish baseline heart disease results. Second, the CFS, Relief, and LASSO selection algorithms were used to extract the essential features. Third, the classifiers were used to measure the usefulness of the chosen features. The k-fold method was used for cross-validation, and the classification algorithms were assessed with several performance evaluation measures. Tests were carried out with different values of k for the KNN classifier; at k = 12, the efficiency of KNN, as shown in Fig. 2, was the best.
Fig. 2. Performance of KNN for different values of k
The relation between temperature, humidity, heart rate, risk level, and human comfort decisions that can be applied to the constructed continuous monitoring system is described in Table 1 (a rule-based sketch is given after the table). The 20-fold CV classification metrics of the heart-disease algorithms are given in Table 2.
Table 1. Boolean decision-making table based on patient status
Temperature sensor (°C) | Humidity (%) | Human perception | Pulse rate sensor (per min) | Action taken | Level of risk
<37 | 31–41 | comfortable | 60–100 | no action | 1
37–38 | 41–46 | comfortable for many people | 40–60 or 100–120 | family members informed | 2
>38 | 46–52 | uncomfortable for many people of upper age | 40–60 or 100–120 | a local doctor informed | 3
>38 | >52 | highly humid, extremely uncomfortable | <40 or >120 | emergency | 4
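The helper function below is a direct translation of the rules in Table 1, as referenced above; the thresholds are exactly those listed in the table, while the handling of readings that fall between or outside the tabulated ranges is an assumption for illustration.

```python
def risk_level(temperature_c, humidity_pct, pulse_per_min):
    """Map sensor readings to the risk level and action of Table 1."""
    normal_pulse = 60 <= pulse_per_min <= 100
    borderline_pulse = (40 <= pulse_per_min < 60) or (100 < pulse_per_min <= 120)
    critical_pulse = pulse_per_min < 40 or pulse_per_min > 120

    if temperature_c > 38 and humidity_pct > 52 and critical_pulse:
        return 4, "emergency"
    if temperature_c > 38 and 46 <= humidity_pct <= 52 and borderline_pulse:
        return 3, "inform a local doctor"
    if 37 <= temperature_c <= 38 and 41 <= humidity_pct <= 46 and borderline_pulse:
        return 2, "inform family members"
    if temperature_c < 37 and 31 <= humidity_pct <= 41 and normal_pulse:
        return 1, "no action"
    return None, "no matching rule"            # readings outside the tabulated ranges

print(risk_level(36.5, 35, 72))                # -> (1, 'no action')
```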
Table 2. 20-fold CV classification metrics of the heart-disease algorithms
Predictive model | Accuracy (%) | Specificity (%) | Sensitivity (%)
Logistic regression | 92.00 | 89.98 | 88.12
Naïve Bayes | 89.34 | 92.45 | 86.45
KNN | 85.57 | 88.01 | 84.56
Fig. 3. Accuracy performance of various classifiers with 20-fold CV
Fig. 4. Specificity performance of various classifiers with 20-fold CV
Fig. 5. Sensitivity performance of various classifiers with 20-fold CV
Fig. 6. Performance of various classifiers with 20-fold CV
The predictive performance of LR was 92% accuracy, as shown in Fig. 3, 88.12% sensitivity, as shown in Fig. 5, and 89.98% specificity, as illustrated in Fig. 4. The second classifier was NB, with 92.45% specificity, 86.45% sensitivity, and 89.34% accuracy. KNN, with 85.57% accuracy, 88.01% specificity, and 84.56% sensitivity, was the third classifier. The efficiency of the classifiers with 20-fold CV across all metrics is shown in Fig. 6. As it shows, the output of LR in terms of accuracy, sensitivity, and specificity beats that of the other two classifiers.
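For reproducibility, the sketch below shows how accuracy, sensitivity, and specificity of the kind reported in Table 2 can be computed from 20-fold cross-validated predictions. The data are random placeholders rather than the study's dataset, so the printed numbers will not match Table 2.

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=400) > 0).astype(int)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "NB": GaussianNB(),
    "KNN": KNeighborsClassifier(n_neighbors=12),
}
for name, model in models.items():
    y_pred = cross_val_predict(model, X, y, cv=20)       # 20-fold CV predictions
    tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)                          # true positive rate
    specificity = tn / (tn + fp)                          # true negative rate
    print(f"{name}: acc={accuracy:.3f} sens={sensitivity:.3f} spec={specificity:.3f}")
```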
CONCLUSIONS
Air pollution is a serious worldwide problem. Growing pollutant emissions from cars, industry, and households are detrimental to the atmosphere and affect human lives as well. The system was tested on the Cleveland cardiac disease dataset. Three popular classifiers, namely logistic regression, KNN, and NB, were used together with three practical feature selection algorithms, Relief, CFS, and LASSO, for selecting the essential characteristics. The k-fold cross-validation approach was used as the validation scheme, and several evaluation measures were employed to verify the efficiency of the classifiers. When the features were chosen using the correlation-based feature subset selection algorithm, logistic regression with 20-fold cross-validation displayed the highest accuracy of 92%.
Acknowledgements. The authors thankfully acknowledge the Deanship of Scientific Research, King Khalid University (KKU), Abha, Asir, Kingdom of Saudi Arabia for funding this research under grant number R.G.P.2/89/41. This work has been done remotely at the DAAI Lab, Thu Dau Mot University, Vietnam, and i3 LABs, Techno India NJR Institute of Technology, India.
REFERENCES
1. P. KSHIRSAGAR et al.: Operational Collection Strategy for Monitoring Smart Waste Management System Using Shortest Path Algorithm. J Environ Prot Ecol, 22 (2), 566 (2021).
2. R. J. LAUMBACH, H. M. KIPEN: Respiratory Health Effects of Air Pollution: Update on Biomass Smoke and Traffic Pollution. J Allergy Clin Immunol, 129, 3 (2012).
3. T. M. CHEN, J. GOKHALE, S. SHOFER, W. G. KUSCHNER: Outdoor Air Pollution: Nitrogen Dioxide, Sulfur Dioxide, and Carbon Monoxide Health Effects. Am J Med Sci, 333 (4), 249 (2007).
4. P. KSHIRSAGAR, S. AKOJWAR: Novel Approach for Classification and Prediction of Non Linear Chaotic Databases. In: Proceedings of the 2016 International Conference on Electrical, Electronics, and Optimization Techniques, 2016, 514–518. DOI: 10.1109/ICEEOT.2016.7755667.
5. N. SATHAWANE, P. KSHIRSAGAR: ECG Signals for Chaotic Diagnosis Using ANN, PSO and Wavelet. International Journal of Electronics, Communication & Instrumentation Engineering Research and Development (IJECIERD), 4 (2), 127 (2014).
6. C. A. POPE, R. T. BURNETT, G. D. THURSTON: Cardiovascular Mortality and Long-term Exposure to Particulate Air Pollution: Epidemiological Evidence of General Pathophysiological Pathways of Disease. Circulation, 109, 71 (2004).
7. S. SANGWAN, T. A. KHAN: Review Paper Automatic Console for Disease Prediction Using Integrated Module of A-priori and k-mean through ECG Signal. Int J Technol Res Eng, 2 (7), 1368 (2015).
8. A. PETERS, D. W. DOCKERY, J. E. MULLER: Increased Particulate Air Pollution and the Triggering of Myocardial Infarction. Circulation, 103, 2810 (2001).
9. P. R. KSHIRSAGAR, S. G. AKOJWAR, R. DHANORIYA: Classification of ECG-signals Using Artificial Neural Networks. In: Proceedings of the International Conference on Intelligent Technologies and Engineering Systems, Lecture Notes in Electrical Engineering, Vol. 345. Springer, Cham, 2014.
10. P. KSHIRSAGAR, S. AKOJWAR: Hybrid Heuristic Optimization for Benchmark Datasets. Int J Comput Appl, 146 (7), 11 (2016).
11. A. HAZRA, S. K. MANDAL: Heart Disease Diagnosis and Prediction Using Machine Learning and Data Mining Techniques: a Review. Adv Comput Sci Technol, 10 (7), 2137 (2017).
12. S. BEGUM, F. A. SIDDIQUE, R. TIWARI: A Study for Predicting Heart Disease Using Machine Learning. Turk J Comput Math Educ, 12 (10), 4584 (2021).
13. P. R. KSHIRSAGAR, S. G. AKOJWAR: Prediction of Neurological Disorders Using Optimized Neural Network. In: Proceedings of the International Conference on Signal Processing, Communication, Power and Embedded System, 3–5 October, Paralakhemundi, India, 2016, 1695–169. DOI: 10.1109/SCOPES.2016.7955731.
14. P. KSHIRSAGAR, S. AKOJWAR: Classification and Detection of Neurological Disorders Using ICA and AR as Feature Extractor. Int J Eng Sci, 1 (1), 1 (2015).
15. P. KSHIRSAGAR, S. AKOJWAR, N. BAJAJ: A Hybridised Neural Network and Optimisation Algorithms for Prediction and Classification of Neurological Disorders. Int J Biomed Eng Technol, 28 (4), 307 (2020). DOI: 10.1504/IJBET.2018.095981.
Received 26 May 2021
Revised 25 June 2021