Figure - available from: Multimedia Tools and Applications
This content is subject to copyright. Terms and conditions apply.
Flowchart of the Android-based mobile application for mung bean pest and disease detection and classification

Source publication
Article
Full-text available
Crop pests and diseases are major threats to food security globally. The mung bean (Vigna radiata) is one of the leading crops in India, and a large part of the Indian population depends on it. High production efficiency for the mung bean is therefore required, but it is not achieved because of excessive damage from pests and diseases...

Similar publications

Preprint
Full-text available
Fairness is a fundamental requirement for trustworthy and human-centered Artificial Intelligence (AI) systems. However, deep neural networks (DNNs) tend to make unfair predictions when the training data are collected from different sub-populations with different attributes (e.g., color, sex, age), leading to biased DNN predictions. We notice that suc...

Citations

... The plant yields produced by agriculture are an important source of food for human beings and various other animals. So, if pests or insects affect these agricultural crops, they have a huge impact on the quantity and quality of the yields produced by causing disease to the plants [1]. Insects and pests are a major threat to plants during their growth period. ...
... In Eq. (3), the terms hns_Yolo, krn_Yolo, and ftr_Yolo represent the tuned hidden neurons, kernels, and filters of the YoloV3, respectively. The hidden neurons of the YoloV3 are optimized in the range [5, 255], the filters in the range [32, 256], and the kernels in the range [1, 3]. The peak signal-to-noise ratio (PSNR) is used to assess the quality of the reconstructed image. ...
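The PSNR mentioned in this snippet has a standard closed form; a minimal sketch, assuming 8-bit images (peak value 255):

```python
import numpy as np

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio between two images, in dB (higher = better)."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((max_val ** 2) / mse)
```

For float images normalized to [0, 1], `max_val` would be 1.0 instead.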
Article
Full-text available
Insects that harm or destroy crops and plants in agricultural fields by infecting the plants or destroying valuables are called pests. When a plant is invaded by pests, the quality of the food it produces decreases drastically, so it is highly essential to detect pests before they attack the plants. However, existing pest detection and categorization techniques need suggestions and decisions from entomologists, and the process is time-consuming. If pests are identified at an early stage, the farmer could eliminate the need for pesticides and also increase food production. Because of their almost similar look, detecting and classifying the pests associated with a crop is complex work for the farmer, especially during the initial stage of plant growth. The sudden and productive growth of Internet-of-Things (IoT) technology also finds application in agriculture, resulting in a transition from statistical to quantitative methods. To alleviate these issues in the agricultural sector, a new framework for an IoT-assisted Automatic Pest Detection and Classification (APDC) model using ensemble transfer learning of convolutional neural networks (CNNs) is developed. First, IoT sensors capture pest images from the agricultural field. These images are stored in a standard database, from which they are taken for experiments. The gathered images are then pre-processed with a median filter (MF). After that, pests are detected from the pre-processed image by means of a Hybrid You Only Look Once (Yolo) v3 and Single Shot multi-box Detector (HYSSD) model. In this model, two algorithms, the Beetle Swarm Optimization (BSO) and the Salp Swarm Algorithm (SSA), are combined to optimize the parameters. An adaptive ensemble transfer CNN (AETC) is used to identify the pests after they have been detected. DenseNet, MobileNet, and ResNet are the three models that constitute this ensemble. Finally, various metrics verify the effectiveness of the proposed classification model; the results show that the recommended method has better classification accuracy.
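The median-filter pre-processing step described in this abstract is straightforward to sketch; a minimal NumPy version with edge padding (in practice one would likely use OpenCV's `cv2.medianBlur`, and note that median filtering is usually applied for impulse-noise removal rather than contrast enhancement as such):

```python
import numpy as np

def median_filter(img, k=3):
    """Replace each pixel with the median of its k x k neighborhood (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out
```

A single bright "salt" pixel surrounded by dark neighbors is suppressed, which is the classic behavior of this filter.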
... [22]. ...
Article
Full-text available
Disease and pest detection is crucial for protecting forest growth, reproduction, and biodiversity. Traditional detection methods face challenges such as limited coverage, excessive time and resource consumption, and poor accuracy, diminishing the effectiveness of forest disease prevention and control. To address these challenges, this study leverages drone remote sensing data combined with deep object detection models, specifically the YOLO-v3 algorithm with an optimized loss function, for the efficient and accurate detection of tree diseases and pests. Using drone-mounted cameras, the study captures insect pest images in pine forest areas, followed by segmentation, merging, and feature extraction. The computing system of the airborne embedded devices is designed to ensure detection efficiency and accuracy. The improved YOLO-v3 algorithm combined with the CIoU loss function was used to detect forest pests and diseases. Compared with the traditional IoU loss function, CIoU takes into account the overlap area, the distance between the centers of the predicted and actual boxes, and the consistency of the aspect ratio. The experimental results demonstrate the proposed model's capability to process pest and disease images quickly, with an average processing time of less than 0.5 s per image, while achieving an accuracy surpassing 95%. The model's effectiveness in identifying tree pests and diseases with high accuracy and comprehensiveness offers significant potential for developing forest inspection, protection, and prevention plans. However, the model's performance is limited in complex forest environments, necessitating further research to improve its universality and adaptability across diverse forest regions. Future directions include exploring advanced deep object detection models to minimize computing resource demands and enhance practical application support for forest protection and pest control.
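The three CIoU components the abstract lists (overlap area, center distance normalized by the enclosing box, and aspect-ratio consistency) follow the standard Complete-IoU formulation; a minimal sketch for axis-aligned (x1, y1, x2, y2) boxes:

```python
import math

def ciou(box_a, box_b):
    """Complete IoU between two boxes given as (x1, y1, x2, y2)."""
    # intersection area
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    iou = inter / (area_a + area_b - inter)
    # squared distance between box centers
    cxa, cya = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    cxb, cyb = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    center_dist2 = (cxa - cxb) ** 2 + (cya - cyb) ** 2
    # squared diagonal of the smallest enclosing box
    ex1, ey1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    ex2, ey2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    diag2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2
    # aspect-ratio consistency term
    wa, ha = box_a[2] - box_a[0], box_a[3] - box_a[1]
    wb, hb = box_b[2] - box_b[0], box_b[3] - box_b[1]
    v = (4 / math.pi ** 2) * (math.atan(wb / hb) - math.atan(wa / ha)) ** 2
    alpha = v / (1 - iou + v) if v > 0 else 0.0
    return iou - center_dist2 / diag2 - alpha * v
```

The loss used in training is then 1 − CiOU; identical boxes score 1.0, while distant boxes go negative, which is what gives CIoU a useful gradient even when boxes do not overlap.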
... Results revealed that Densenet121 outperformed the other models. Mallick et al. [10] used Deep Learning models to classify mung bean leaf diseases in which the experiments revealed an average accuracy of 93.65%. ...
Conference Paper
Crops play a vital role in human nutrition and overall well-being. Beans are an economically significant crop that not only provides a rich source of protein but also offers substantial health benefits. However, beans are susceptible to bacterial and fungal infections, and if left uncontrolled, these diseases can have severe consequences. Diagnosis of affected leaves can be done by plant pathologists, who are not readily available in local and remote areas. Using deep learning algorithms, accurate classification of bean leaf diseases can offset this problem from an early stage. This research introduces a framework for optimizing MobileNetV3 using Particle Swarm Optimization that can accurately identify healthy or diseased bean leaves from images captured in real-world conditions. With only 2.9M parameters, the best model achieved a high accuracy of 98.44% on unseen data, outperforming the models proposed by other researchers on the same classification task. The results suggest that the model can be adopted in the field to increase efficiency in the early identification of bean leaf diseases. The same methodology can also be applied to leaf diseases of other economically significant crops.
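Particle Swarm Optimization itself is simple to sketch; a minimal, self-contained version minimizing a toy objective. In the paper's setting the objective would instead be the validation error of a MobileNetV3 configuration, and the inertia/attraction coefficients (0.7, 1.5, 1.5) here are common defaults, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, bounds, n_particles=20, iters=50):
    """Minimal particle swarm optimization over box-constrained parameters."""
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pos = rng.uniform(lo, hi, (n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                 # per-particle best position
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()           # global best position
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # inertia + cognitive pull (own best) + social pull (swarm best)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())
```

For hyperparameter search, each coordinate of a particle would encode one tunable quantity (e.g., learning rate or dropout), and `f` would train and validate the model.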
... With an accuracy of 98.53%, the meta-DL model performed best on the cotton dataset. Mallick et al. [26] recommended a novel smartphone-based DL model for identifying mung bean pests and diseases. Using a transfer learning approach, the suggested model recognized six distinct forms of mung bean disease and four kinds of pests with an accuracy of 93.65%. ...
Article
Full-text available
Agriculture is important for the economy of any country, and India is considered an agricultural country. One of the primary goals of agriculture is to produce disease-free crops. Since ancient times, farmers and other planting specialists have had to contend with a variety of problems and agricultural constraints, such as widespread cotton diseases. There is a great need for a rapid, efficient, economical, and reliable approach to diagnosing cotton infection in the agri-informatics area, as severe cotton disease may result in the loss of grain crops. This paper presents an advanced method that automates the detection and classification of diseased cotton leaves and plants through deep learning techniques applied to images. To address the challenge of supervised image classification, we employ a bagging ensemble technique consisting of five transfer learning models: InceptionV3, InceptionResNetV2, VGG16, MobileNet, and Xception. This ensemble approach was adopted to significantly improve the performance of each individual model. The ETL-NET framework we introduce was thoroughly evaluated on two publicly accessible datasets. It achieved an impressive accuracy of 99.48% and a sensitivity of 99% on the binary dataset, and an accuracy of 98.52% and a sensitivity of 99% on the multi-class dataset. Our method outperformed state-of-the-art techniques, and even surpassed widely used ensemble techniques generally considered benchmarks in the field.
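The combination step of such an ensemble can be illustrated with simple probability averaging (soft voting); a sketch in which the arrays stand in for the per-model softmax outputs (the paper's exact aggregation rule is not specified in this abstract):

```python
import numpy as np

def ensemble_predict(prob_list):
    """Soft-voting ensemble: average per-model class probabilities,
    then take the argmax as the ensemble label."""
    avg = np.mean(np.stack(prob_list), axis=0)   # (n_samples, n_classes)
    return avg.argmax(axis=1), avg
```

With five base models, each contributes equally; weighting members by their validation accuracy is a common refinement.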
... Proof of this is the appearance of works related to image recognition, especially in agriculture, where various approaches using deep learning (DL) methods to classify the phenology of different food crops around the world have been presented. This provides a record of critical moments in the life cycle of the plant, making it possible to schedule treatments, apply pesticides or fungicides effectively and in a timely manner, and prevent and control pests and diseases; this offers great advantages in precision agriculture in a non-harmful manner and helps minimize damage to crops [13][14][15][16][17]. ...
Article
Full-text available
The early and precise identification of the different phenological stages of the bean (Phaseolus vulgaris L.) allows for the determination of critical and timely moments for the implementation of certain agricultural activities that contribute significantly to the output and quality of the harvest, as well as the actions necessary to prevent and control possible damage caused by pests and diseases. The standard procedure for phenological identification is usually conducted by the farmer, which can lead to important findings being overlooked during the phenological development of the plant and, in turn, to the appearance of pests and diseases. In recent years, deep learning (DL) methods have been used to analyze crop behavior and minimize risk in agricultural decision-making. One of the most used DL methods in image processing is the convolutional neural network (CNN), due to its high capacity for learning relevant features and recognizing objects in images. In this article, a transfer learning approach and a data augmentation method were applied. A station equipped with RGB cameras was used to gather image data during the complete phenological cycle of the bean. The information gathered was used to create a dataset to evaluate the performance of each of the four proposed network models: AlexNet, VGG19, SqueezeNet, and GoogleNet. The metrics used were accuracy, precision, sensitivity, specificity, and F1-score. The best architecture in validation was GoogleNet, which obtained 96.71% accuracy, 96.81% precision, 95.77% sensitivity, 98.73% specificity, and a 96.25% F1-score.
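The five metrics reported in this abstract all derive from confusion-matrix counts; a minimal sketch for the binary case (multi-class evaluation typically averages these per class):

```python
def binary_metrics(tp, fp, fn, tn):
    """Accuracy, precision, sensitivity (recall), specificity, and F1
    from the four confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)      # true positive rate
    specificity = tn / (tn + fp)      # true negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, precision, sensitivity, specificity, f1
```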
... Barboza et al. [18] improved the detection of maize weevils by utilizing X-ray imaging technology in conjunction with deep learning algorithms. Mallick et al. [19] proposed a deep learning-based method for the rapid detection and identification of pests in mung beans. Choi et al. [20] introduced a method for constructing a deep learning partitioned image dataset for pest detection in strawberries, achieving an improved detection accuracy of 91.93%. ...
Article
Full-text available
Moldy corn produces aflatoxin and gibberellin, which can have adverse effects on human health if consumed. Mold is a significant factor affecting the safe storage of corn; if not detected and controlled in a timely manner, it results in substantial food losses. Understanding the infection patterns of mold on corn kernels and the changes in the internal structure of kernels after infection is crucial for guiding innovation and optimizing detection methods for moldy corn. This knowledge also helps maintain corn storage and ensure food safety. This study used X-ray tomography to non-destructively detect changes in the structural characteristics of moldy corn kernels. Image processing and model reconstruction algorithms were used to obtain 3D models of the embryo, the pores and cracks, the endosperm and seed coat, and the whole kernel; to qualitatively analyze the characteristic changes in two-dimensional grayscale slice images and 3D models of moldy kernels; and to quantitatively analyze changes in the volume parameters of the kernel, embryo, endosperm, and seed coat. The study also explored a detection method for moldy corn kernels combining X-ray tomography with deep learning algorithms. The analysis concluded that mold infection in maize begins in the embryo and gradually spreads, and that mold damage to the tissue structure of maize kernels is irregular in nature. The overall volume parameters of the kernels, embryos, endosperm, and seed coats across the four stages (0 d, 5 d, 10 d, and 15 d) showed a trend of first increasing and then decreasing. The ResNet50 model was enhanced for detecting mold on maize kernels, achieving an accuracy of over 93% in identifying mold features in sliced images. This enabled the non-destructive detection and classification of the degree of mold in maize kernel samples. This article characterizes the changes in moldy corn kernels and their detection, providing support for optimizing the monitoring of corn kernel mold and the development of rapid detection equipment.
... With a 97.38% accuracy score, our MixConvNet surpassed deep learning classifiers. Mallick et al. [50] created a DL method for detecting mung bean pests. Transfer learning can identify pests and diseases quickly, but few high-quality mung bean crop photographs are available for training. ...
Article
Full-text available
In the modern era, agriculture is necessary for human existence globally, and it is imperative to work toward increasing agricultural yields. Yet crop production may be affected by the presence of pests, which can injure crops or slow their growth. As a result, pest detection and control in agricultural fields must begin immediately. Traditional pest monitoring methods are labor-intensive, dangerous, and require a lot of physical work. With the newest AI and IoT breakthroughs, specific upkeep tasks can be automated, radically improving the performance and reliability of pest detection in the agricultural field. This research offers a real-time remote pest detection strategy utilizing IoT and DL architectures. The IoT and the DMF-ResNet, which form the integrated pest detection approach, are the primary components of the remote pest detection system. The DMF-ResNet pest detection technique is trained on the sounds made by pests. The findings offer new perspectives on the ambition of IoT and AI for pest monitoring in the field, requiring almost no active human participation to maintain vigilance. The recommended DMF-ResNet system accurately automates the detection of agricultural pests, based on results from experiments in large agricultural fields. It outperformed the DenseNet, VGG-16, YOLOv5, DCNN, ANN, KNN, Faster RCNN, and ResNet-50 approaches for pest detection with 99.75% accuracy, 98.64% sensitivity, 98.48% specificity, 99.08% recall, 99.18% precision, and an F1-score of 99.11%.
... In the context of our study, we are interested in reviewing the use of modern technologies for early disease detection, which is among the top priorities of farm stakeholders since it ensures healthy farms and high-quality crops. In the literature, several works aim to use modern technologies to ensure the early detection of plant infestation [23,19,31,48,50]. However, this research area still needs more investigation, especially for palm trees [15,18,35]. ...
Preprint
Full-text available
The Red Palm Weevil (RPW), also known as the palm weevil, is considered among the world's most damaging insect pests of palms. Current detection techniques include detecting symptoms of RPW by visual or sound inspection and chemically detecting the volatile signatures generated by infested palm trees. However, efficient detection of RPW infestation at an early stage is considered one of the most challenging issues in cultivating date palms. In this paper, an efficient approach to the early detection of RPW is proposed, based on recording and analyzing RPW sound activity. The first step converts the sound data into images based on a selected set of features. The second step combines images from the same sound file, computed with different features, into a single image. The third step applies different Deep Learning (DL) techniques to classify the resulting images into two classes: infested and not infested. Experimental results show good performance of the proposed approach for RPW detection using different DL techniques, namely MobileNetV2, ResNet50V2, ResNet152V2, VGG16, VGG19, DenseNet121, DenseNet201, Xception, and InceptionV3. The proposed approach outperformed existing techniques on public datasets.
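The first step, converting sound into images, is commonly done with a spectrogram; a minimal NumPy sketch in which the frame/hop sizes and log scaling are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def spectrogram_image(signal, frame=256, hop=128):
    """Turn a 1-D audio signal into a 2-D log-magnitude spectrogram scaled
    to 0-255, so it can be saved as an image for a CNN classifier."""
    window = np.hanning(frame)
    n_frames = 1 + (len(signal) - frame) // hop
    frames = np.stack([signal[i * hop:i * hop + frame] * window
                       for i in range(n_frames)])
    mag = np.abs(np.fft.rfft(frames, axis=1))   # (n_frames, frame//2 + 1)
    log_mag = np.log1p(mag).T                   # frequency on the vertical axis
    scaled = 255 * (log_mag - log_mag.min()) / (np.ptp(log_mag) + 1e-12)
    return np.round(scaled).astype(np.uint8)
```

Images computed from several such feature variants (e.g., different window sizes or mel scaling) could then be stacked into one composite image, as the abstract's second step describes.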
... These technologies have proven highly effective in enhancing identification efficiency and lowering the costs of pest prevention and control [7,8]. Computer vision holds promising application prospects in crop pest detection and identification and is currently a research hotspot [9]. Nonetheless, challenges persist: multiple and frequently overlapping pest and disease manifestations, diverse and cluttered pest patterns, the need for scale invariance, and intricate surroundings and backgrounds along with various other sources of interference. ...
Article
Full-text available
Pests and diseases significantly impact the quality and yield of maize. As a result, it is crucial to conduct disease diagnosis and identification for timely intervention and treatment of maize pests and diseases, ultimately enhancing the quality and economic efficiency of maize production. In this study, we present an enhanced maize pest identification model based on ResNet50, with the objective of efficient and accurate identification of maize pests and diseases. Using convolution and pooling operations to extract shallow edge features and compress the data, we introduced additional effective channels (environment-cognition-action) into the residual network module. This step addressed the issue of network degradation, established connections between channels, and facilitated the extraction of crucial deep features. Experimental validation achieved 96.02% recognition accuracy with the ResNet50 model. The study successfully recognized various maize pests and diseases, including maize leaf blight, Helminthosporium maydis, gray leaf spot, rust disease, stem borer, and corn armyworm. These results offer valuable insights for the intelligent control and management of maize pests and diseases.
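The "connections between channels" this abstract describes can be illustrated with a squeeze-and-excitation-style gate, a common way to add channel interactions inside a ResNet block; this NumPy sketch shows the general idea only, not the paper's exact environment-cognition-action design, and the weight matrices w1/w2 are hypothetical learned parameters:

```python
import numpy as np

def channel_attention(features, w1, w2):
    """Squeeze-and-excitation-style channel weighting for a (C, H, W) map:
    global-average-pool each channel, pass it through a small two-layer MLP,
    then rescale the channels with a sigmoid gate."""
    squeeze = features.mean(axis=(1, 2))            # (C,) per-channel summary
    hidden = np.maximum(0.0, w1 @ squeeze)          # ReLU bottleneck
    scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))    # sigmoid gate, (C,)
    return features * scale[:, None, None]
```

In a residual block, the gated output would be added back to the block's input, so the attention reweights channels without discarding the identity path.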