A feather shuttlecock (left) and synthetic shuttlecock (right).

Source publication
Article
Full-text available
With an estimated 220 million people playing badminton regularly, the sport is particularly popular in Asia but is gaining popularity in other regions of the world. Demand for related products, such as shuttlecocks and rackets, is also increasing in the sports industry. Synthetic shuttlecocks, produced to offer similar experien...

Contexts in source publication

Context 1
... is one of the top ten most popular participation sports in the world [2] and made its Olympic debut as an official medal sport at the 1992 Summer Olympics. Although both feather and synthetic shuttlecocks have sixteen leaves and one cork head (shown in Figure 1), feather shuttlecocks are made from goose or duck feathers, while synthetic shuttlecocks are made of plastic or nylon. ...
Context 2
... are two different types of synthetic shuttlecocks: one is the single-piece injection-moulded design (right-hand side of Figure 1) and the other is the two-part skirt design [3]. Since the single-piece injection-moulded synthetic shuttlecock has been the mainstream design and has dominated the market for the past 50 years, the discussion in this paper focuses on this type. ...

Citations

... To check whether the color sequence of a wire is correct, the traditional method is to inspect it manually under a high-resolution digital microscope; relying on the human eye alone for self-inspection still results in many wasted wires. Among existing methods, deep-learning-based approaches [1][2][3] are the mainstream way to solve such automated optical inspection problems. These methods train a designed neural network model on a large amount of data, continuously updating its parameters to achieve higher accuracy. ...
... Finally, by using the designed identification method, the common types of wires are identified and then the color ...
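The training procedure these excerpts describe, fitting a neural network to a large set of labelled inspection images by iteratively updating its parameters, can be illustrated with a minimal sketch. The tiny CNN architecture, input size, and dummy tensors below are assumptions for illustration only, not the cited paper's actual model or data:

```python
# Minimal sketch of a supervised training loop for an inspection classifier.
# Architecture, input size, and data are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class TinyInspectionCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Dummy tensors standing in for labelled 64x64 RGB inspection images.
images = torch.randn(128, 3, 64, 64)
labels = torch.randint(0, 2, (128,))
loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

model = TinyInspectionCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):
    for batch, target in loader:
        optimizer.zero_grad()
        loss = criterion(model(batch), target)
        loss.backward()
        optimizer.step()  # parameters are updated iteratively, as described above
```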
Article
Full-text available
Given the huge demand for wire in today’s society, wire quality requirements are especially stringent. To control the quality of produced wire, the industry has a strong need for automated optical inspection technology. This technology is a high-speed, highly accurate optical image inspection system that replaces the human eye with mechanical sensing equipment and simulates manual operation by means of a robotic arm. In this paper, a high-performance algorithm for the automated optical inspection of wire color sequence is proposed. The paper focuses on the design of a high-speed wire color sequence detection method that automatically adapts to different kinds of wires and recognition situations, such as a single wire with only one color, or one or two wires covered with aluminum foil. To inspect successfully even when a wire occupies only a short span of the frame or two wires lie close together, we calculate the horizontal gradient of the wires by edge detection and morphological operations and identify the types and color sequences of the wires in the frame through a series of discriminative mechanisms. Experimental results show that the method achieves good accuracy while maintaining good computation speed.
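As a rough illustration of the edge-detection and morphology step mentioned in the abstract, the sketch below computes a horizontal gradient and cleans it into a binary mask with OpenCV. The Otsu thresholding, kernel size, and file name are assumptions for illustration, not the authors' actual pipeline parameters:

```python
# Sketch: horizontal gradient via edge detection, followed by a morphological
# closing, as one plausible reading of the step described in the abstract.
import cv2
import numpy as np

def wire_edge_mask(image_bgr: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Horizontal gradient (intensity change along the x-axis) via a Sobel filter.
    grad_x = cv2.convertScaleAbs(cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3))
    # Binarise (Otsu threshold is an assumption) and close small gaps so each
    # wire edge becomes a connected region.
    _, mask = cv2.threshold(grad_x, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 9))
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

# Example usage on a captured frame (file name is hypothetical):
# frame = cv2.imread("wire_frame.png")
# mask = wire_edge_mask(frame)
# contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
```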
Article
Full-text available
Quality assessment in industrial applications is often carried out through visual inspection, usually performed or supported by human domain experts. However, the manual visual inspection of processes and products is error-prone and expensive. It is therefore not surprising that the automation of visual inspection in manufacturing and maintenance is heavily researched and discussed. The use of artificial intelligence as an approach to visual inspection in industrial applications has been considered for decades. Recent successes, driven by advances in deep learning, present a possible paradigm shift and have the potential to facilitate automated visual inspection, even under complex environmental conditions. For this reason, we explore to what extent deep learning is already being used in the field of automated visual inspection and which potential improvements to the state of the art could be realized by utilizing concepts from academic research. By conducting an extensive review of the openly accessible literature, we provide an overview of proposed and in-use deep-learning models presented in recent years. Our survey covers 196 open-access publications, of which 31.7% are manufacturing use cases and 68.3% are maintenance use cases. Furthermore, the survey shows that the majority of the models currently in use are based on convolutional neural networks, the current de facto standard for image classification, object recognition, and object segmentation tasks. Nevertheless, we see the emergence of vision transformer models that seem to outperform convolutional neural networks but require more resources, which also opens up new research opportunities for the future. Another finding is that in 97% of the publications, the authors use supervised learning techniques to train their models. However, with a median dataset size of 2500 samples, deep-learning models cannot realistically be trained from scratch, so it would be beneficial to use other training paradigms, such as self-supervised learning. In addition, we identified a gap of approximately three years between approaches from deep-learning-based computer vision being published and their introduction in industrial visual inspection applications. Based on our findings, we additionally discuss potential future developments in the area of automated visual inspection.
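To illustrate the survey's observation that typical datasets (a median of roughly 2500 samples) are too small for training deep models from scratch, the sketch below fine-tunes an ImageNet-pretrained backbone on a small labelled set. It shows transfer learning rather than the self-supervised paradigm the survey also suggests, and the class labels, batch, and hyperparameters are illustrative assumptions:

```python
# Sketch: reusing a pretrained CNN backbone instead of training from scratch,
# one way to cope with a small inspection dataset.
import torch
import torch.nn as nn
from torchvision import models

num_classes = 2  # e.g. "defect" vs. "no defect" (assumed label set)

# Start from an ImageNet-pretrained ResNet-18 and swap in a new classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False                 # freeze the pretrained features
model.fc = nn.Linear(model.fc.in_features, num_classes)  # only this layer is trained

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative update on a dummy batch of 224x224 RGB inspection images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```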