Figure 3 - uploaded by Sebastian Daniel Rosca
Humanoid robot controlled by BCI System [9] 

Source publication
Conference Paper
Full-text available
The paper presents how two ubiquitous elements of today can be interconnected. On the one hand, drones, which are increasingly present and integrated into more and more fields of activity beyond the military applications they originated from, moving towards entertainment, real estate, delivery and so on. On the other hand, unconventional man-machi...

Contexts in source publication

Context 1
... Materials Science and Engineering 294 (2018) 012048 doi:10.1088/1757-899X/294/1/012048 Figure 13. Simulink 3D Animation for the take-off of the drone ...
Context 2
... Materials Science and Engineering 294 (2018) 012048 doi:10.1088/1757-899X/294/1/012048 Based on the inertial XYZ-world coordinates of the drone's center of mass and the Euler rotation angles (roll about the x axis, pitch about the y axis and yaw about the z axis) obtained from the mathematical model, and in accordance with the simulated model, we also developed a 3D animation of the behavior of the drone in Simulink 3D Animation, as presented in Figure 13. ...
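For illustration, the roll-pitch-yaw convention described in this context can be sketched as a rotation-matrix composition (a minimal pure-Python sketch; the function names and the Z-Y-X composition order are common aerospace assumptions, not details taken from the paper):

```python
import math

def rot_x(a):  # roll: rotation about the x axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # pitch: rotation about the y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # yaw: rotation about the z axis
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    # Plain 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def body_to_world(roll, pitch, yaw):
    # Compose as R = Rz(yaw) * Ry(pitch) * Rx(roll), a common aerospace choice
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))
```

With zero angles this yields the identity, and a 90-degree roll maps the body y axis onto the world z axis, which matches the axis conventions stated in the quoted context.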

Similar publications

Article
Full-text available
Automatic navigation for drones is being developed these days, across a wide variety of drone types and automatic functions. The drone used in this study was an aircraft with four propellers, or quadcopter. In this experiment, image processing was used to recognize the position of an object and an ultrasonic sensor was used to detect obstacle distance. The met...
Article
Full-text available
Abstract Photography with small unmanned aircraft systems (sUAS) offers opportunities for researchers to better understand habitat selection in wildlife, especially for species that select habitat from an aerial perspective (e.g., many bird species). The growing number of commercial sUAS being flown by recreational users represents a potentially va...
Conference Paper
Full-text available
In the past, human proxemics research has poorly predicted human robot interaction distances. This paper presents three studies on drone gestures to acknowledge human presence and clarify suitable acknowledging distances. We evaluated four drone gestures based on non-verbal human greetings. The gestures included orienting towards the counterpart an...
Article
Full-text available
In this paper, we propose a novel haptic device consisting of a Parrot quadcopter AR Drone 2.0 that delivers force-feedback to users when they press on the surface of the drone in the vertical direction. This drone haptic device will free users from any cumbersome devices which were utilized in previous haptics systems and allow them to sense kines...

Citations

... On the other hand, UAV technology has made rapid progress in recent years and has been widely adopted in material distribution, aerial photography, mapping, and rescue scenarios. Therefore, many researchers have combined BCI with UAVs to create various systems that help quadriplegic patients use brain signals to explore the world, among other functions [3][4][5]. ...
Article
Traditional Unmanned Aerial Vehicle (UAV) swarm control mainly adopts the ground-station method, which is too rigid, and its interaction struggles to meet highly dynamic task requirements. New interaction methods are urgently needed to integrate the advantages of human thinking in dealing with uncertain problems. Brain-computer interface (BCI) technology, controlled directly by thought, is one of the most promising next-generation human–computer interaction technologies. Therefore, in this study, we innovatively applied a BCI system based on Virtual Reality (VR) to a group of UAVs and realized a novel and intelligent group-control method, which proposes new ideas and paradigms for the control of swarm UAVs in the future. Specifically, this study takes a quadcopter as an example. A modular and extensible multi-quadcopter system was created, and then a visual-stimulation 3D VR scene system with a digital-twin function was established. On this basis, a BCI system based on the steady-state visual evoked potential (SSVEP) paradigm was adopted for swarm control of the quadcopters. The experimental results show that formation control of multiple quadcopters was successfully realized by the subjects using the proposed VR-based BCI interactive system, with an accuracy rate of 90% and a good information transmission rate. In addition, the immersive VR twin system, established one-to-one for EEG signal acquisition, allows subjects to have a better experience.
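As a rough illustration of the SSVEP principle used in such systems, target selection can be sketched as picking the stimulation frequency whose spectral power dominates the EEG window; the Goertzel routine and the candidate frequencies below are illustrative assumptions, not the authors' actual pipeline:

```python
import math

def goertzel_power(samples, fs, freq):
    """Spectral power of `samples` at `freq` Hz via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * freq / fs)           # nearest DFT bin for this frequency
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

def detect_ssvep_target(eeg, fs, flicker_freqs):
    """Return the flicker frequency whose power dominates the EEG window."""
    return max(flicker_freqs, key=lambda f: goertzel_power(eeg, fs, f))
```

For example, with stimuli flickering at 8, 10 and 12 Hz, a one-second window dominated by a 10 Hz response would select the 10 Hz target and hence the command mapped to it.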
... A direct way of communicating the supervisor's intention to the leader of a robotic swarm in any environment could be achieved through a Brain-Computer Interface (BCI) system, also known as a Brain-Machine Interface (BMI) and, less commonly, a Mind-Machine Interface (MMI) [12]. A noninvasive BCI system [13] uses electroencephalogram sensors to detect brain activity based on the measurement of biopotentials perceived on the user's scalp. These biopotentials, produced as mental patterns during mental tasks, result from voltage oscillations in the flow of ionic current between neurons [14,15]. ...
Chapter
Full-text available
Robots used in the civilian or military field have been inspired from the beginning by the natural biomimetic behavior of insects or vertebrates. Many studies and research activities have recently been devoted to understanding and reproducing the biological swarm effect. This paper builds on the results in this field and develops a brain-computer interface (BCI) control system based on steady-state visually evoked potentials (SSVEP) for a swarm of spider-type robots. Keywords: Brain-computer interface, Neuronal control, Spider-type robot
... The most efficient non-invasive BCI systems involve electroencephalographic (EEG) signals, which can be acquired even with portable commercial headsets. Processing methods and artificial intelligence techniques are applied to the EEG signal to enable the detection of particular patterns associated with the task executed by the user as a command, for example: focusing the attention on something [1], keeping a relaxed state of mind [1], executing voluntary eye-blinks, counting the number of times a specific element is flashing to elicit the P300 evoked potential [2], imagining something, especially a specific movement [3], or triggering slow cortical potentials [4]. The accomplishment of the previously mentioned tasks determines the real-time control of mechatronic devices, such as a robotic arm [5], a robotic hand [6], or an automated wheelchair [7], necessary in the everyday assistance of disabled people who suffer from neuromotor illnesses, for example, locked-in syndrome, amyotrophic lateral sclerosis, cerebral stroke, spinal cord injuries, and tetraplegia. ...
Chapter
The paper proposes a Fuzzy Logic-based LabVIEW application to determine the strength of the voluntary eye-blinks used as commands in a brain-computer interface to control a mobile robot. Relevant statistical features (standard deviation, root mean square, Kurtosis coefficient, and maximum amplitude value) of the raw electroencephalographic signal acquired from the biosensor of a portable NeuroSky headset determine the input linguistic variables. A customized counting algorithm for the voluntary eye-blinks generates the various movement commands (move forward, move backward, turn left, turn right, stop); this algorithm was developed by implementing custom LabVIEW graphical code sequences. Bluetooth-based communication between the LabVIEW application and an Arduino allows commands to be sent to the mobile robot. The proposed BCI experimental system provides an efficient working principle for robust mechatronic systems that support people with neuromotor disabilities in regaining their confidence and independence in performing simple everyday activities.
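The four statistical features named in this abstract can be computed from a raw signal window roughly as follows (a minimal pure-Python sketch; the function name and dictionary keys are illustrative, and population formulas for variance and Pearson kurtosis are assumed):

```python
import math

def blink_features(window):
    """Std. deviation, RMS, kurtosis and max amplitude of one EEG window."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n          # population variance
    rms = math.sqrt(sum(x * x for x in window) / n)         # root mean square
    # Pearson kurtosis: fourth central moment over squared variance
    kurt = (sum((x - mean) ** 4 for x in window) / n) / (var ** 2) if var else 0.0
    return {
        "std": math.sqrt(var),
        "rms": rms,
        "kurtosis": kurt,
        "max_amplitude": max(abs(x) for x in window),
    }
```

In a fuzzy-logic setup like the one described, each of these values would feed one input linguistic variable; the membership functions themselves are not reproduced here.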
... An increasing number of systems allow the control of more sophisticated devices, including orthoses, prostheses, drones, robotic arms or even mobile robots. BCI systems can also be used for neurorehabilitation [42][43][44][45]. ...
Article
Full-text available
In recent years, the control of devices “by the power of the mind” has become a very controversial topic but has also been very well researched in the field of state-of-the-art gadgets, such as smartphones, laptops, tablets and even smart TVs, and also in medicine, to be used by people with disabilities for whom these technologies may be the only way to communicate with the outside world. It is well known that BCI control is a skill and can be improved through practice and training. This paper aims to improve and diversify signal processing methods for the implementation of a brain-computer interface (BCI) based on neurological phenomena recorded during motor tasks using motor imagery (MI). The aim of the research is to extract, select and classify the characteristics of electroencephalogram (EEG) signals, which are based on sensorimotor rhythms, for the implementation of BCI systems. This article investigates systems based on brain-computer interfaces, especially those that use the electroencephalogram as a method of acquisition of MI tasks. The purpose of this article is to allow users to manipulate quadcopter virtual structures (external, robotic objects) simply through brain activity, correlated with certain mental tasks using the undecimated wavelet transform (UWT) to reduce noise, Independent Component Analysis (ICA) together with the determination coefficient (r2) and, for classification, a hybrid neural network consisting of Radial Basis Functions (RBF) and a multilayer perceptron–recurrent network (MLP–RNN), obtaining a classification accuracy of 95.5%. Following the tests performed, it can be stated that the use of biopotentials in human–computer interfaces is a viable method for applications in the field of BCI. The results presented show that BCI training can produce a rapid change in behavioral performance and cognitive properties. If more than one training session is used, the results may be beneficial for increasing poor cognitive performance.
To achieve this goal, three steps were taken: understanding the functioning of BCI systems and the neurological phenomena involved; acquiring EEG signals based on sensorimotor rhythms recorded during MI tasks; applying and optimizing extraction methods, selecting and classifying characteristics using neuronal networks.
... Human-drone interaction has been studied in various contexts, including control interface design [1,2], shared autonomy applications [3], and brain-computer interfaces [4]. ...
... This track was chosen because it mainly required pilots to perform lateral translations at the same altitude, which can be considered more similar to car driving studies. For the second track, referred to as the "Wave" track, gates were alternatingly placed at either 1.75 meter height (for gates 1, 3, 6, 8) or at 3.5 meter height (for gates 0, 2, 4, 5, 7, 9), which required the pilots to perform elevation changes when flying the race track. Thus, pilots had to not only navigate along the lateral direction, but had to transition between different altitudes for passing through the gates. ...
Article
Humans race drones faster and with more agility than algorithms, despite being limited to a fixed camera angle, body-rate control, and response latencies on the order of hundreds of milliseconds. A better understanding of human pilots' ability to select appropriate motor commands from highly dynamic visual information may provide key insights for solving current challenges in vision-based autonomous navigation. The aim of this study was to investigate the relationship between flight performance, control behavior, and eye movements of human pilots in a drone racing task. We collected a multimodal dataset from 21 experienced drone pilots using a highly realistic drone racing simulator, also used to recruit professional pilots. Our results showed task-specific improvements in drone racing performance over time. Gaze fixations not only tracked future waypoints but also anticipated the future flight path. Cross-correlation analysis showed a strong spatio-temporal relationship between eye movements, camera orienting behavior, and thrust vector control. These results highlight the importance of coordinated eye movements in human-piloted drone racing.
... The brain-computer interface (BCI) provides an interlink between the brain and external devices (Vidal, 1973; Wolpaw et al., 2002). The information received from the brain in the form of physiological/magnetic/metabolic signals is decoded and interpreted to determine the user's intentions, and is later utilized for various purposes, such as rehabilitation (Do et al., 2013; Khan R. A. et al., 2018); control of robots (Doud et al., 2011; Bozinovski, 2016; Rosca et al., 2018; Duan et al., 2019) and of prosthetics (Buch et al., 2018; Yanagisawa et al., 2019); and neurogaming (Paszkiel, 2016, 2020; Vasiljevic and de Miranda, 2020). Among the existing non-invasive acquisition methods, arguably EEG (Wolpaw et al., 2002; Pfurtscheller et al., 2006; Choi, 2013; Abiri et al., 2019) and fNIRS (Ferrari et al., 1985; Delpy et al., 1988; Coyle et al., 2004, 2007; Fazli et al., 2012; Naseer and Keum-Shik, 2015; Yin et al., 2015) are considered the most explored. ...
Article
Full-text available
Brain-computer interface (BCI) multi-modal fusion has the potential to generate multiple commands in a highly reliable manner by alleviating the drawbacks associated with a single modality. In the present work, a hybrid EEG-fNIRS BCI system, achieved through a fusion of concurrently recorded electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) signals, is used to overcome the limitations of uni-modality and to achieve higher task classification. Although the hybrid approach enhances the performance of the system, the improvements are still modest due to the lack of computational approaches to fuse the two modalities. To overcome this, a novel approach is proposed using multi-resolution singular value decomposition (MSVD) to achieve system- and feature-based fusion. The two approaches, based upon different feature sets, are compared using the KNN and Tree classifiers. The results obtained through multiple datasets show that the proposed approach can effectively fuse both modalities with an improvement in classification accuracy.
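The MSVD method in this abstract is specific, but the general idea of feature-based fusion it builds on can be sketched with a much simpler scheme: standardize each modality's feature vector separately, then concatenate them into one vector for the classifier (an illustrative simplification, not the MSVD method itself; function names are assumptions):

```python
import math

def zscore(features):
    """Standardize one modality's feature vector to zero mean, unit variance."""
    n = len(features)
    mean = sum(features) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in features) / n) or 1.0
    return [(x - mean) / std for x in features]

def fuse_features(eeg_features, fnirs_features):
    """Feature-level fusion: normalize per modality, then concatenate."""
    return zscore(eeg_features) + zscore(fnirs_features)
```

Per-modality normalization matters because EEG (microvolts) and fNIRS (hemoglobin concentration changes) live on very different scales; without it, one modality would dominate the fused vector.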
... The brain waves corresponding to the winks have been captured using the five-channel Emotiv Insight EEG headset. The Emotiv Insight has also been employed in other studies [14][15][16][17][18][19] to capture brain waves. The remaining part of this paper is organized in the following sections, i.e. ...
Article
Full-text available
Facial expressions may establish communication between physically disabled people and assistive devices. Different types of facial expression, including eye wink, smile, eye blink, looking up and looking down, can be extracted from the brain signal. In this study, the possibility of controlling assistive devices using the individual's wink has been investigated. Brain signals from five subjects have been captured to recognize the left wink, right wink, and no wink. The brain signals were captured using the Emotiv Insight, which consists of five channels. The fast Fourier transform and the sample range have been computed to extract the features. The extracted features have been classified with the help of different machine learning algorithms. Here, support vector machine (SVM), linear discriminant analysis (LDA) and K-nearest neighbor (K-NN) have been employed to classify the feature sets. The performance of the classifiers in terms of accuracy, confusion matrix, true positive and false positive rate, and the area under curve (AUC) of the receiver operating characteristic (ROC) has been evaluated. In the case of the sample range, the highest training and testing accuracies are 98.9% and 96.7% respectively, achieved by two classifiers, namely SVM and K-NN. The achieved results indicate that a person's wink can be utilized to control assistive devices.
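The two feature types described, spectral magnitudes and the sample range, can be sketched as follows (a naive DFT stands in for the FFT for clarity; function names are illustrative, not from the paper):

```python
import cmath

def sample_range(window):
    """Sample-range feature: spread between largest and smallest sample."""
    return max(window) - min(window)

def dft_magnitudes(window):
    """Magnitude spectrum via a naive DFT (an FFT computes the same values)."""
    n = len(window)
    return [abs(sum(window[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]
```

Either feature vector (per channel of the five-channel headset) could then be passed to an SVM, LDA or K-NN classifier as the abstract describes; the classifier step is omitted here to keep the sketch dependency-free.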
Chapter
The brain-computer interface (BCI) is a growing field of technology, and it has become clear that the cybersecurity of BCI systems needs amelioration. When BCI devices are developed with wireless connection capabilities, more often than not this creates more surface area for attackers to concentrate their attacks on. The more invasive the BCI technology used, the greater the threat to the user's physical health. In this paper, we summarize and outline the main cybersecurity threats and challenges that BCI systems may face now and in the future. Furthermore, we present avenues for future BCI systems, including cybersecurity solutions and requirements. We emphasize the importance of treating the health layer as being as important as the technical layers in BCI systems, as people cannot endure life-threatening situations where attackers could cause permanent brain damage to the BCI user. Keywords: Brain-Computer Interface, Deep Brain Stimulation, Cybersecurity, Vulnerability, Privacy