Simulation parameters for generating dataset.

Source publication
Article
With the proliferation of artificial intelligence (AI) technology, AI is likely to come into play on a large scale in sixth-generation (6G) environments. Moreover, with the rapid advancement of AI in recent years, its ethical issues have become a hot topic. In this paper, the ethical concern of AI in wireless netw...

Context in source publication

Context 1
... three-bit combination is obtained by using the one-hot encoding method [33]. The simulation parameters used to create the dataset are given in Table 1. ...
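For illustration only, here is a minimal sketch of one-hot encoding a categorical label into a three-bit vector, as the quoted context describes; the label set below is hypothetical and not taken from the source paper.

```python
import numpy as np

# Minimal sketch: encode a categorical scheduling label as a three-bit one-hot vector.
# The label names are illustrative assumptions, not from the source paper.
LABELS = ["class_a", "class_b", "class_c"]

def one_hot(label: str) -> np.ndarray:
    """Return a three-bit one-hot vector for the given label."""
    vec = np.zeros(len(LABELS), dtype=np.float32)
    vec[LABELS.index(label)] = 1.0
    return vec

print(one_hot("class_b"))  # -> [0. 1. 0.]
```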

Similar publications

Preprint
Critical transmission ranges (or radii) in wireless ad-hoc and sensor networks have been extensively investigated for various performance metrics such as connectivity, coverage, power assignment and energy consumption. However, the regions on which the networks are distributed are typically either squares or disks in existing works, which seriously...
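For context only, the sketch below uses the classical unit-square setting that this preprint moves beyond (it is not the preprint's own result): for n nodes placed uniformly at random in a unit square, the asymptotic critical radius for connectivity is r(n) ≈ sqrt(ln n / (π n)). The code computes that threshold and checks connectivity of a random geometric graph slightly above it.

```python
import math
import networkx as nx

# Classical asymptotic connectivity threshold for a random geometric graph on a unit
# square (illustrative baseline, not the preprint's generalized-region result):
#   r(n) ~ sqrt(ln(n) / (pi * n))
def critical_radius(n: int) -> float:
    return math.sqrt(math.log(n) / (math.pi * n))

n = 2000
r = critical_radius(n)
G = nx.random_geometric_graph(n, 1.2 * r)  # small margin above the threshold
print(f"r(n) = {r:.4f}, connected: {nx.is_connected(G)}")
```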

Citations

... The work of [66] proposes a solution to guarantee low latency and reliability under network slicing. In addition, the work of [101] investigates the fairness of packet scheduling. The 6G wireless network faces tremendous technical challenges, including various design trade-offs and performance optimization. ...
... Consequently, 6G wireless networks tend to be inherently AI-enabled. Bhandari et al. explore the ethical aspects of 6G wireless network design and propose a scheme for packet-scheduling fairness [101]. This scheme is dedicated to avoiding the unfair situation in which nodes with better links tend to be selected to transmit data while nodes with worse links may never be selected. ...
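As an illustration of the fairness issue described above (and not the scheme of [101]), the sketch below implements classical proportional-fair scheduling, which trades a node's instantaneous rate against its historical throughput so that good-link nodes do not monopolize transmissions.

```python
import numpy as np

# Hedged sketch of proportional-fair scheduling: a standard baseline for the fairness
# problem above, NOT the scheme of [101]. Nodes with high instantaneous rates but high
# accumulated throughput lose priority to less-served nodes.
rng = np.random.default_rng(0)
n_nodes, n_slots, beta = 4, 1000, 0.01
avg_throughput = np.full(n_nodes, 1e-6)   # exponentially averaged served rate per node
slot_wins = np.zeros(n_nodes, dtype=int)

for _ in range(n_slots):
    inst_rate = rng.exponential(scale=[1.0, 2.0, 4.0, 8.0])  # node 3 has the best link
    winner = int(np.argmax(inst_rate / avg_throughput))      # proportional-fair metric
    slot_wins[winner] += 1
    served = np.zeros(n_nodes)
    served[winner] = inst_rate[winner]
    avg_throughput = (1 - beta) * avg_throughput + beta * served

print("slots won per node:", slot_wins)  # noticeably more balanced than max-rate scheduling
```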
... Analogously, numerical-simulation-generated data are used to train a CNN that resolves a joint optimization problem in over-the-air computing [69]. Similarly, a CNN-based packet scheduler can be trained on numerical-simulation-generated data [101]. The work of [83] produces training data for a GAN using the TeraSim extension of the ns-3 simulator. ...
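As a hedged sketch of the workflow described in this excerpt (generate data by numerical simulation, then train a CNN scheduler on it), the example below creates synthetic channel gains and trains a small PyTorch CNN to pick one of three nodes. The feature layout, labels, and architecture are illustrative assumptions, not the model of [101].

```python
import torch
import torch.nn as nn

# Synthetic "simulation": random per-node, per-subcarrier gains; the ground-truth label
# is the node with the best average gain. Shapes and labels are assumptions for the sketch.
def simulate_batch(batch_size: int):
    gains = torch.rand(batch_size, 1, 3, 8)        # (batch, channel, nodes, subcarriers)
    labels = gains.mean(dim=(1, 3)).argmax(dim=1)  # index of the best node
    return gains, labels

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 3 * 8, 3),                       # three-way scheduling decision
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    x, y = simulate_batch(64)
    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    x, y = simulate_batch(512)
    acc = (model(x).argmax(dim=1) == y).float().mean()
print(f"held-out accuracy on simulated data: {acc:.2f}")
```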
Article
With the rapid development of enabling technologies such as VR and AR, humanity is on the threshold of an era of ubiquitous human-centric intelligence. 6G is believed to be an indispensable cornerstone for efficient interaction between humans and computers in this vision, and it is expected to boost many human-centric applications thanks to its unprecedented performance improvements over 5G and earlier generations. However, challenges remain to be addressed, including but not limited to the following six aspects: terahertz and millimeter-wave communication, low latency and high reliability, energy efficiency, security, efficient edge computing, and heterogeneity of services. Fitting traditional analytical methods to these problems is daunting because of the complex architecture and highly dynamic behavior of ubiquitous interactive 6G systems. Fortunately, deep learning can sidestep the interpretability issue by training a vast number of neural network parameters that build a mapping from the network input (the status and specific requirements of a 6G application) to the network output (settings that satisfy those requirements). Deep learning methods can thus be an efficient alternative to traditional analytical methods, or even solve problems that analytical methods cannot. We review representative deep learning solutions to the six aspects above, focusing on the principles of fitting a deep learning method to a specific 6G issue. Based on this review, our main contributions are as follows. (i) We investigate representative works from a systematic viewpoint and identify important issues, such as the vital role of deep reinforcement learning in the 6G context. (ii) We point out solutions to the lack of training data in the 6G communication context. (iii) We reveal the relationship between traditional analytical methods and deep learning in 6G applications. (iv) We identify frequently used, efficient techniques in deep-learning-based 6G solutions. Finally, we point out open problems and future directions.
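As a minimal sketch of the "mapping" view in this abstract, the untrained toy model below maps an application's status and requirements to candidate network settings; the input and output fields are assumptions for illustration, not drawn from the surveyed works.

```python
import torch
import torch.nn as nn

# Toy illustration of a neural network mapping application status/requirements to
# network settings. Input fields (latency target, reliability, cell load) and outputs
# (normalized bandwidth share, transmit-power level) are illustrative assumptions.
status = torch.tensor([[5.0, 0.999, 0.7]])   # [latency target (ms), reliability, cell load]
mapper = nn.Sequential(
    nn.Linear(3, 32), nn.ReLU(),
    nn.Linear(32, 2), nn.Sigmoid(),          # normalized [bandwidth share, tx-power level]
)
settings = mapper(status)                    # untrained here; in practice, training data
print(settings)                              # would come from measurements or simulations
```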