Article

Satellite Communications Supporting Internet of Remote Things


Abstract

This paper focuses on the use of satellite communication systems for the support of Internet of Things (IoT). We refer to the IoT paradigm as the means to collect data from sensors or RFID and to send control messages to actuators. In many application scenarios, sensors and actuators are distributed over a very wide area; in some cases, they are located in remote areas where they are not served by terrestrial access networks and, as a consequence, the use of satellite communication systems becomes of paramount importance for the Internet of Remote Things (IoRT). The enabling factors of IoRT through satellite are: 1) the interoperability between satellite systems and sensors/actuators and 2) the support of IPv6 over satellite. Furthermore, radio resource management algorithms are required to enhance the efficiency of IoT over satellite. In this work, we provide an integrated view of satellite-based IoT, handling this topic as a jigsaw puzzle where the pieces to be assembled are represented by the following topics: MAC protocols for satellite routed sensor networks, efficient IPv6 support, heterogeneous networks interoperability, quality of service (QoS) management, and group-based communications.


... 2) Satellite Networks: Since terrestrial networks are inherently limited in coverage, many existing studies consider utilizing satellite-enabled NTNs to provide services in remote and disaster areas. De Sanctis et al. [12] introduced emergency management as well as two other application scenarios where satellites play an important role. Specifically, satellite-enabled incident area networks could support both voice and data transmissions and wireless sensor and actuator communications in disaster areas. ...
... Specifically, we aim to obtain the relationship between the overall latency and the total configured resources B_total, R_S_total, and F_total. We present the following theorems. Theorem 4: Assume two optimization problems, both adopting the form of problem (12). For both problems, the corresponding parameters take the same values except for the total computing capability, which is F_total^(1) in the first problem and F_total^(2) in the second. ...
... The closed-form expressions of the optimal solution to problem (12) are given accordingly. Consider the following resource configuration problem: given that the configured total user-UAV communication bandwidth B_total and total UAV-satellite data rate R_S_total are determined, to ensure that the minimized overall latency does not exceed a threshold T_th, how much total computing capability F_total needs to be configured? ...
Preprint
Quick response to disasters is crucial for saving lives and reducing loss. This requires low-latency uploading of situation information to the remote command center. Since terrestrial infrastructures are often damaged in disaster areas, non-terrestrial networks (NTNs) are preferable to provide network coverage, and mobile edge computing (MEC) could be integrated to improve the latency performance. Nevertheless, the communications and computing in MEC-enabled NTNs are strongly coupled, which complicates the system design. In this paper, an edge information hub (EIH) that incorporates communication, computing and storage capabilities is proposed to synergize communication and computing and enable systematic design. We first address the joint data scheduling and resource orchestration problem to minimize the latency for uploading sensing data. The problem is solved using an optimal resource orchestration algorithm. On that basis, we propose the principles for resource configuration of the EIH considering payload constraints on size, weight and energy supply. Simulation results demonstrate the superiority of our proposed scheme in reducing the overall upload latency, thus enabling quick emergency rescue.
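The resource configuration question posed in the excerpts above (how much total computing capability F_total must be provisioned so that the minimized overall latency stays below a threshold T_th) can be sketched numerically. The toy model below is an illustration only: it assumes a fixed communication delay plus a computation term inversely proportional to F_total, which preserves the monotonicity that Theorem 4 relies on; all constants are hypothetical.

```python
def overall_latency(f_total, d_comm=0.8, workload=2.0e9):
    # Toy model: fixed communication delay (s) plus computation time
    # that shrinks as total computing capability f_total (cycles/s) grows.
    return d_comm + workload / f_total

def min_computing_capability(t_th, lo=1.0, hi=1.0e12, tol=1e-3):
    """Bisection for the smallest f_total with latency <= t_th,
    valid because the toy latency is monotone decreasing in f_total."""
    if overall_latency(hi) > t_th:
        raise ValueError("threshold unreachable even at hi")
    while hi - lo > tol * lo:
        mid = (lo + hi) / 2
        if overall_latency(mid) <= t_th:
            hi = mid
        else:
            lo = mid
    return hi
```

With the hypothetical constants above, meeting T_th = 1 s requires roughly F_total = 1e10 cycles/s; the same bisection applies to B_total or R_S_total under any monotone latency model.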
... Satellite networks are used to provide Internet connectivity in support of 5G technology. The concepts of the Internet of Things (IoT), the Internet of Remote Things (IoRT), and the Internet of Satellites have evolved over the years [5][6][7]. The aim of providing Internet connectivity and accessibility in remote areas can be achieved by integrating satellite connectivity with UAV networks. ...
... Satellite communication is considered a key enabler of the Internet of Remote Things (IoRT); the various research activities in this field are outlined in [6], which explains different remote scenarios where satellite communication can act as a key element. ...
... Now, the desired UAV position is found using the iterative algorithm, and the updated L-UAV position at each iteration is given by (6). In (6), Δx_u^k = x_u^{k+1} − x_u^k; Lo_u^i is the current L-UAV position and Lo_u^{i+1} is the updated L-UAV position after the i-th iteration. ...
Article
Full-text available
Unmanned Aerial Vehicles (UAVs) are becoming one of the most promising technologies for improving quality of life; UAVs are also a reliable means of providing connectivity over remote areas. In this paper, we present a UAV network architecture over the ocean in support of the idea of a connected ocean. The presented model consists of two layers of UAVs for ocean monitoring and communication. The flying ad hoc network (FANET) over the ocean is formed by the lower-layer UAVs, so that the area under these UAVs remains connected. In this article, the performance over the ocean is verified for a scenario considering ocean entities and lower-layer UAVs. A mechanism to improve data delivery in the UAV network over the ocean is presented, which can be incorporated into the idea of the connected ocean. Simulation results show that the presented mechanism outperforms conventional mechanisms.
... Communication infrastructure based on satellite technology has emerged as a cost-effective solution, serving not only isolated regions where the establishment of terrestrial networks proves infeasible but also providing a viable alternative in emergency scenarios. In addition, the versatility of satellite services is exceptional [1], encompassing a diverse array of applications including military purposes, remote sensing, Earth observation, space exploration, maritime operations, agriculture, and communication provision over the Internet [2,3]. ...
... The earliest start time of data transmission, specifically at time 0:58, is taken as the reference point (time 0), corresponding to the initial measurement from Sensor[4]. Subsequently, all the remaining sensors initiate their transmissions by following the timer of Sensor[4]; e.g., Sensor[1] has a startTime of 1 min since its transmission started at 0:59. Furthermore, the number of distinct ground stations was increased to 26 by incorporating MCCs that mirror certain real testbeds located in Europe. ...
... Regarding the received pings, noteworthy observations have surfaced. While some sensors followed the expected pattern, where an increase in planes and satellites in the constellation corresponded to an increase in received pings (e.g., Sensor[0] and Sensor[4]), others exhibited an inversely proportional relationship (e.g., Sensor[1] and Sensor[3]), and some demonstrated mixed results (e.g., Sensor[2]). For the latter case, Sensor[2] received more pings within the 360-satellite constellation than the 600-satellite one, but fewer pings than in the 900-satellite constellation. ...
Article
Full-text available
In recent years, the space industry has witnessed a resurgence, characterized by a notable proliferation of satellites operating at progressively lower altitudes, promising extensive global coverage and terrestrial-level data transfer speeds, while remaining cost-effective solutions. In particular, Wireless Sensor Networks (WSNs) can benefit from the wide coverage of space infrastructure due to their extensive deployment, disrupted communication nature, and the potential absence of terrestrial support. This study explored the utility of Low-Earth Orbit (LEO) satellite constellations as a communication infrastructure for interconnecting “smart” devices via ground stations in Internet of Things (IoT) scenarios. To this end, we designed and implemented a series of experiments conducted within the OMNeT++ simulator, utilizing an updated iteration of the original Open Source Satellite Simulator (OS3) framework. Our research encompassed an IoT Case Study, incorporating authentic sensor data sourced from the Smart Santander testbed. Throughout our experimentation, we investigated the impact of the constellation design parameters such as the number of satellites and orbital planes, as well as the inter-satellite link configuration on the obtained Round-Trip Time (RTT) and packet loss rates.
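The RTT figures studied in such constellation experiments are lower-bounded by propagation delay, which can be estimated from orbital geometry alone. A minimal sketch (pure geometry via the law of cosines; processing, queueing, and inter-satellite hops are deliberately ignored):

```python
import math

R_E = 6371e3       # mean Earth radius, m
C = 299_792_458    # speed of light in vacuum, m/s

def slant_range(alt_m, elev_deg):
    """Ground-station-to-satellite distance for a satellite at orbit
    altitude alt_m seen at elevation angle elev_deg (law of cosines)."""
    e = math.radians(elev_deg)
    r = R_E + alt_m
    return R_E * (math.sqrt((r / R_E) ** 2 - math.cos(e) ** 2) - math.sin(e))

def min_rtt_ms(alt_m, elev_deg):
    """Two propagation hops (up + down); everything else excluded."""
    return 2 * slant_range(alt_m, elev_deg) / C * 1e3
```

At 550 km altitude and 90° elevation the slant range equals the altitude and the propagation-only RTT is about 3.7 ms; at low elevation angles the slant range, and hence the RTT floor, grows severalfold.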
... Global enterprises and governments are increasingly relying on satellite-to-satellite IoT services to monitor and manage assets worldwide. This has led to significant growth in the satellite market, with operators now providing comprehensive IoT solutions utilizing satellite technology [4]. ...
... Furthermore, ensuring Quality of Service (QoS), including factors such as latency and delay, is vital for the performance of satellite systems [4]. Consistently delivering high QoS standards is necessary for the effective operation of these systems. ...
Article
Full-text available
The fusion of satellite technologies with the Internet of Things (IoT) has propelled the evolution of mobile computing, ushering in novel communication paradigms and data management strategies. Within this landscape, the efficient management of computationally intensive tasks in satellite-enabled mist computing environments emerges as a critical challenge. These tasks, spanning from optimizing satellite communication to facilitating blockchain-based IoT processes, necessitate substantial computational resources and timely execution. To address this challenge, we introduce APOLLO, a novel low-layer orchestration algorithm explicitly tailored for satellite mist computing environments. APOLLO leverages proximity-driven decision-making and load balancing to optimize task deployment and performance. We assess APOLLO’s efficacy across various configurations of mist-layer devices while employing a round-robin principle for equitable tasks distribution among the close, low-layer satellites. Our findings underscore APOLLO’s promising outcomes in terms of reduced energy consumption, minimized end-to-end delay, and optimized network resource utilization, particularly in targeted scenarios. However, the evaluation also reveals avenues for refinement, notably in CPU utilization and slightly low task success rates. Our work contributes substantial insights into advancing task orchestration in satellite-enabled mist computing with more focus on energy and end-to-end sensitive applications, paving the way for more efficient, reliable, and sustainable satellite communication systems.
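APOLLO's combination of proximity-driven decisions with round-robin load balancing can be illustrated with a deliberately simplified sketch. The 1-D positions, the proximity threshold, and the nearest-satellite fallback below are assumptions for illustration, not the paper's actual algorithm:

```python
def orchestrate(tasks, satellites, proximity_km=1000.0):
    """Proximity-driven load-balancing sketch: each task goes to a
    satellite within proximity_km of its source, chosen round-robin
    within that proximity set; if no satellite is close enough, the
    task falls back to the globally nearest one.

    tasks:      {task_id: 1-D position}
    satellites: {sat_id: 1-D position}
    """
    assignments = {}
    rr = {}  # round-robin cursor per proximity set (keyed by sat-id tuple)
    for task_id, pos in tasks.items():
        close = [s for s, sp in satellites.items() if abs(sp - pos) <= proximity_km]
        if not close:
            close = [min(satellites, key=lambda s: abs(satellites[s] - pos))]
        key = tuple(sorted(close))
        idx = rr.get(key, 0)
        assignments[task_id] = key[idx % len(key)]
        rr[key] = idx + 1
    return assignments
```

Three co-located tasks with two close satellites thus alternate between them, which is the round-robin equity the paper aims for within the low-layer set.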
... Ongoing and planned HVDC installations are described. ...
... The categorization of installations has evolved from several hundred megawatts to installations with a planned capacity of 1,000 megawatts or more in the last few years. However, there are several important concepts that should be considered in greater detail, specifically regarding some of the offshore networks currently under development that have multiple stations [17]. ...
Article
Full-text available
This article discusses the current state of HVDC (high-voltage DC) technology. It describes the history of HVDC, starting with early converter developments based on voltage sources, and then discusses the introduction of line-commutated designs. The article covers control and coordination of converter stations, as well as the need for a DC breaker to facilitate control of multiple plants. Recent developments in the design of DC circuit breakers are discussed. The importance of reliability is recognized, especially in relation to cables, and the issues surrounding cable design are explained. Ongoing and planned HVDC installations are described.
... In conclusion, ref. [26] examines the use of satellite communication systems to support the IoT. The authors refer to the IoT paradigm as the means to collect data from sensors or RFID (Radio Frequency Identification) tags and to dispatch control messages to actuators. ...
... Combines the speed and coverage of 6G satellites with the control and flexibility of SD-WAN for various applications. [26] Utilizing satellite communication systems for supporting IoT in remote areas. ...
Article
Full-text available
Due to limited infrastructure and remote locations, rural areas often struggle to obtain reliable, high-quality network connectivity. We propose an innovative approach that leverages Software-Defined Wide Area Network (SD-WAN) architecture to enhance reliability in such challenging rural scenarios. Our study focuses on cases in which network resources are limited to solutions such as Long-Term Evolution (LTE) and a Low-Earth-Orbit satellite connection. The SD-WAN implementation compares three tunnel selection algorithms that leverage real-time network performance monitoring: Deterministic, Random, and Deep Q-learning. The results offer valuable insights into the practical implementation of SD-WAN for rural connectivity scenarios, showing its potential to bridge the digital divide in underserved areas.
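The learning-based tunnel selection compared in the study can be approximated, at a much smaller scale, by a bandit-style policy. The sketch below is an epsilon-greedy stand-in for the Deep Q-learning agent, assuming the only feedback available is a per-probe latency measurement; it is not the paper's implementation.

```python
import random

class TunnelSelector:
    """Epsilon-greedy selector over SD-WAN tunnels: rewards are
    negative measured latencies, tracked as running averages."""
    def __init__(self, tunnels, epsilon=0.1, seed=0):
        self.q = {t: 0.0 for t in tunnels}   # running average reward
        self.n = {t: 0 for t in tunnels}     # sample count per tunnel
        self.epsilon = epsilon
        self.rng = random.Random(seed)

    def choose(self):
        # Explore with probability epsilon, otherwise pick the best tunnel.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.q))
        return max(self.q, key=self.q.get)

    def update(self, tunnel, latency_ms):
        # Incremental running-average update with reward = -latency.
        self.n[tunnel] += 1
        reward = -latency_ms
        self.q[tunnel] += (reward - self.q[tunnel]) / self.n[tunnel]
```

After measuring 80 ms on an LTE tunnel and 40 ms on a LEO tunnel, a greedy policy (epsilon = 0) prefers the LEO tunnel until its measured latency deteriorates.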
... The burgeoning demand for satellite-to-satellite IoT services is propelled by global enterprises and governments seeking to monitor and manage assets dispersed worldwide. Consequently, the satellite market experiences significant expansion, with operators now offering comprehensive IoT solutions leveraging satellite technology [6]. ...
... Moreover, the meticulous preservation of Quality of Service (QoS), encompassing essential factors such as latency and various types of delays, is indispensable. This necessity transcends general IT systems [5] and holds particular significance within satellite systems [6]. The seamless operation and performance of these systems hinge on consistently delivering QoS standards, underscoring the importance of maintaining and optimizing these aspects for sustained effectiveness. ...
Preprint
Full-text available
The fusion of satellite technologies with the Internet of Things (IoT) has propelled the evolution of mobile computing, ushering in novel communication paradigms and data management strategies. Within this landscape, the efficient management of computationally intensive tasks in satellite-enabled mist computing environments emerges as a critical challenge. These tasks, spanning from optimizing satellite communication to facilitating blockchain-based IoT processes, necessitate substantial computational resources and timely execution. To address this challenge, we introduce APOLLO, a novel low-layer orchestration algorithm explicitly tailored for satellite mist computing environments. APOLLO leverages proximity-driven decision-making and load balancing to optimize task deployment and performance. We assess APOLLO's efficacy across various configurations of mist-layer devices while employing a round-robin principle for equitable task distribution among the close low-layer satellites. Our findings underscore APOLLO's promising outcomes in terms of reduced energy consumption, minimized end-to-end delay, and optimized network resource utilization, particularly in targeted scenarios. However, the evaluation also reveals avenues for refinement, notably in CPU utilization and slightly low task success rates. Our work contributes substantial insights into advancing task orchestration in satellite-enabled mist computing with more focus on energy and end-to-end sensitive applications, paving the way for more efficient, reliable, and sustainable satellite communication systems.
... Therefore, we omit the processing time at the cloud server and ignore the cloud energy consumption involved in computation task execution and in transmitting the computation results from the cloud server to the GUs. Moreover, by using advanced communication techniques such as multiple-input multiple-output (MIMO), we assume GUs on the ground can communicate directly with the satellites [32]. For the considered scenario, where the terrestrial network area is far away from the cloud server, we rely on multi-hop satellite communication to transmit data related to the offloaded sub-task from each GU to the cloud server. ...
... Solve sub-problem (32) to obtain δ r , β r , and F r ; 11: ...
Article
Full-text available
In this paper, we study the computation offloading problem in hybrid edge-cloud based space-air-ground integrated networks (SAGIN), where joint optimization of partial computation offloading, unmanned aerial vehicle (UAV) trajectory control, user scheduling, edge-cloud computation, radio resource allocation, and admission control is performed. Specifically, the considered SAGIN employs multiple UAV-mounted edge servers with controllable UAV trajectory and a cloud sever which can be reached by ground users (GUs) via multi-hop low-earth-orbit (LEO) satellite communications. This design aims to minimize the weighted energy consumption of the GUs and UAVs while satisfying the maximum delay constraints of underlying computation tasks. To tackle the underlying non-convex mixed integer non-linear optimization problem, we use the alternating optimization approach where we iteratively solve four sub-problems, namely user scheduling, partial offloading control and bit allocation over time slots, computation resource and bandwidth allocation, and multi-UAV trajectory control until convergence. Moreover, feasibility verification and admission control strategies are proposed to handle overloaded network scenarios. Furthermore, the successive convex approximation (SCA) method is employed to convexify and solve the non-convex computation resource and bandwidth allocation and UAV trajectory control sub-problems. Via extensive numerical studies, we illustrate the effectiveness of our proposed design compared to baselines.
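The partial-offloading trade-off at the heart of such formulations can be illustrated with a one-user, one-link toy: a fraction alpha of the task bits is offloaded, the rest is computed locally, and a grid search picks the alpha that minimizes energy subject to a delay bound. The parameter names and the conventional k·f²·cycles local-energy model below are assumptions for illustration, not the paper's exact SAGIN formulation:

```python
def best_offload_fraction(bits, f_local, k_local, rate, p_tx, cycles_per_bit,
                          t_max, steps=1000):
    """Grid search over the offload fraction alpha in [0, 1]:
    alpha*bits are sent to the remote server, the rest computed locally.
    Local and offload phases run in parallel, so delay = max of the two.
    Remote-side delay/energy are ignored, as in the system model above.
    Returns (alpha, energy) or None if no alpha meets the delay bound."""
    best = None
    for i in range(steps + 1):
        a = i / steps
        local_cycles = (1 - a) * bits * cycles_per_bit
        t_local = local_cycles / f_local          # local computation time
        t_off = a * bits / rate                   # transmission time
        if max(t_local, t_off) > t_max:
            continue
        energy = k_local * f_local ** 2 * local_cycles + p_tx * t_off
        if best is None or energy < best[1]:
            best = (a, energy)
    return best
```

With a cheap link and an expensive local CPU the search drives alpha to 1 (full offloading); tightening t_max or slowing the link pushes it back toward local execution, which is exactly the coupling the alternating optimization in the paper exploits.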
... Also, the long communication distance between the terminal and the satellite makes using an acknowledgment (ACK) impractical due to the long response time. To resolve these problems, Slotted ALOHA (SA), a random access method in which each terminal autonomously sends packets, and Contention Resolution Diversity Slotted ALOHA (CRDSA)/Irregular-Repetition Slotted ALOHA (IRSA), which improve throughput by successive interference cancellation, have been used [2], [3], [5]. ...
... Systems collecting data from IoT terminals using communication satellites have been widely studied [1]- [3]. In [2], [8], satellite communications supporting IoT were discussed from a broad perspective, such as applications, access methods, and network technologies. Satellite orbits and compatibility with terrestrial networks were discussed in [1], while resource allocation in LEO satellites was investigated in [9]- [11]. ...
Article
Full-text available
In satellite Internet-of-Things (IoT) systems using the 920-MHz band Low Power Wide Area, available channels for each terminal are limited to avoid interference with terrestrial networks. Since this limitation depends on the location of the terminal, the available channels are different, resulting in throughput degradation. This paper proposes a transmission control scheme that reduces throughput degradation due to different available channels in the Contention Resolution Diversity Slotted ALOHA (CRDSA)/Irregular-Repetition Slotted ALOHA (IRSA) system. The proposed transmission control uses constraints that consider the characteristics of CRDSA/IRSA, as well as an objective function that considers fairness of throughput among terminals. Under these constraints and objective function, the transmission probability was calculated using an objective function that maximized the transmitting load. The computer simulation showed that it could achieve high throughput by preventing the influence of difference in the available channels.
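The throughput gain of CRDSA over plain slotted ALOHA comes from iterative successive interference cancellation, which a small Monte-Carlo sketch can reproduce (assumptions: ideal cancellation, exactly two replicas per packet, no capture effect, and no per-terminal channel restrictions):

```python
import random

def crdsa_throughput(n_users, n_slots, n_frames=200, seed=1):
    """Monte-Carlo sketch of CRDSA: each user sends two replicas per
    frame in distinct random slots; slots holding a single uncancelled
    replica are decoded and their twins removed, iterating until no
    further progress (ideal SIC). Returns decoded packets per slot."""
    rng = random.Random(seed)
    decoded_total = 0
    for _ in range(n_frames):
        slots = [[] for _ in range(n_slots)]
        for u in range(n_users):
            for s in rng.sample(range(n_slots), 2):
                slots[s].append(u)
        decoded = set()
        progress = True
        while progress:
            progress = False
            for s in range(n_slots):
                live = [u for u in slots[s] if u not in decoded]
                if len(live) == 1:          # singleton slot: decode it
                    decoded.add(live[0])    # ... and cancel its twin
                    progress = True
        decoded_total += len(decoded)
    return decoded_total / (n_frames * n_slots)
```

At a load of G = 0.5 packets/slot (50 users, 100 slots) this ideal-SIC model decodes nearly every packet, whereas plain slotted ALOHA peaks at 1/e ≈ 0.37 packets/slot.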
... Satellite networks are a promising solution to provide global IoT services in a flexible and affordable way [3], [4], [5]. The idea consists of connecting IoT devices placed in areas without terrestrial infrastructure directly to a constellation of satellites passing over them. ...
... In addition, as predicted by the stability analysis, the transmission probability converges after a moderate number of visibility periods in all the simulated scenarios. For this, both the p_min and κ parameters have been configured with values that satisfy their respective stability conditions derived in the Appendix: p_min = 1/8 < p*_tx < 0.19 (the lowest p*_tx value obtained with 512 EDs in the unslotted scenario) and 0 < κ = 1/4 < 2G*/G ≈ 0.32 (the lowest threshold obtained when G* = 0.5 and G = 3.13). Finally, note that, if the number of active EDs (i.e., the offered load) is low enough, the EDs always keep their transmission probabilities very close to 1 and, therefore, it is unnecessary to show the results for these scenarios in the graphs. ...
Article
Full-text available
In the next few years, Low Earth Orbit (LEO) constellations will become key enablers for the deployment of global Internet of Things (IoT) services. Due to their proximity to Earth, LEO satellites can directly communicate with ground nodes and, thus, serve as mobile gateways for IoT devices deployed in remote areas lacking terrestrial infrastructure. Within this Direct-to-Satellite IoT (DtS-IoT) context, LoRa (Long Range) technology, capable of providing long range connectivity to power-constrained devices, has received great attention. However, serious scalability issues have been observed in LoRa-based DtS-IoT networks when a high number of LoRa devices perform uplink transmissions driven by the straightforward Aloha protocol during the short visibility periods of the passing-by satellites. In this paper, we evaluate some Aloha-based protocols suitable for this kind of networks and present a new adaptive variant that allows LoRa devices to dynamically adjust their uplink transmission rates in order to make the network work near its optimal operating point. Simulation results show that our proposal is able to significantly improve the network throughput in overloaded scenarios without the need for coordination among LoRa devices, listening to the channel nor gateway support.
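The adaptive idea (devices nudging their transmission probability toward the network's optimal operating point) can be sketched with a simple AIMD-style rule. The p_min and κ names follow the excerpt above, but the exact update below is an illustrative stand-in, not the paper's rule:

```python
def update_tx_prob(p, g_est, g_opt, p_min=1/8, kappa=1/4):
    """One adaptive step per satellite visibility period:
    back off multiplicatively when the estimated load g_est exceeds
    the optimal load g_opt, probe upwards additively otherwise.
    The result is clipped to [p_min, 1]."""
    if g_est > g_opt:
        p = p * (g_opt / g_est)       # multiplicative decrease
    else:
        p = p + kappa * (1 - p)       # additive-style increase
    return min(1.0, max(p_min, p))
```

Under overload the probability falls toward p_min within a few visibility periods, and under light load it climbs back toward 1, mirroring the convergence behaviour reported in the evaluation.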
... Satellite communication technology can extend the coverage of traditional mainstream power communication to remote areas such as oceans and deserts thanks to its high reliability, global coverage, and independence from harsh environmental conditions [6]. This technology supports power services in remote areas [7], ensuring continuous communication services. In power emergency scenarios, ensuring emergency protection, seamless communication, timely warnings, and post-disaster recovery are critical issues for power communication enterprises. ...
Article
Full-text available
Amidst the escalating need for stable power supplies and high-quality communication services in remote regions globally, due to challenges associated with deploying a conventional power communication infrastructure and its susceptibility to natural disasters, LEO satellite networks present a promising solution for broad geographical coverage and the provision of stable and high-speed communication services in remote regions. Given the necessity for frequent handovers to maintain service continuity, due to the high mobility of LEO satellites, a primary technical challenge confronting LEO satellite networks lies in efficiently managing the handover process between satellites, to guarantee the continuity and quality of communication services, particularly for power services. Thus, there is a critical need to explore satellite handover optimization algorithms. This paper presents a handover optimization scheme that integrates deep reinforcement learning (DRL) and graph neural networks (GNN) to dynamically optimize the satellite handover process and adapt to the time-varying satellite network environment. DRL models can effectively detect changes in the topology of satellite handover graphs across different time periods by leveraging the powerful representational capabilities of GNNs to make optimal handover decisions. Simulation experiments confirm that the handover strategy based on the fusion of message-passing neural network and deep Q-network algorithm (MPNN-DQN) outperforms traditional handover mechanisms and DRL-based strategies in reducing handover frequency, lowering communication latency, and achieving network load balancing. Integrating DRL and GNN into the satellite handover mechanism enhances the communication continuity and reliability of power systems in remote areas, while also offering a new direction for the design and optimization of future power system communication networks. 
This research contributes to the advancement of sophisticated satellite communication architectures that facilitate high-speed and reliable internet access in remote regions worldwide.
... Space is an increasingly ripe realm for RF applications including space solar power [1], [2], [3], [4], [5], [6], [7], communications [8], [9], [10], and the traditional remote sensing applications [11], [12]. These applications demand high bandwidth and/or high power efficiency but are currently limited by the aperture size that can be deployed [13]; large apertures in space are challenging because of the crucial requirement that they fit within the fairing of the launch vehicle. ...
Article
Large apertures in space are critical for high-power and high-bandwidth applications spanning wireless power transfer (WPT) and communication; however, progress on this front is stunted by the geometric limitations of rocket flight. We present a light and flexible 10 GHz array, composed of dipole antennas co-cured to a glass-fiber composite. The arrays can dynamically conform to new shapes and are flexible enough to fold completely flat, coil into a rocket payload, and pop back up upon deployment in orbit. The array is amenable to scalable, automated manufacturing, a requirement for the massive production necessary for large apertures. Moreover, the arrays passed the standard gamut of space-qualification testing: the antennas can survive mechanical stress, extreme temperatures, high-frequency temperature cycling, and prolonged stowage in the flattened configuration. The elements exhibit excellent electromagnetic performance: a return loss better than 10 dB (S11 below −10 dB) over ≈1.5 GHz, a single-lobe half-power beamwidth greater than 110° suitable for broad beamforming, >92% efficiency, and excellent manufacturing consistency. Moreover, its mechanical durability vis-à-vis extreme temperatures and protracted stowage lends itself to demanding space applications. This lightweight and scalable array is equipped to serve a host of new space-based radio-frequency technologies and applications which leverage large, stowable, and durable array apertures.
... Avvenuti et al. [6] combined social media capabilities with traditional methods to mitigate the impact of disasters. Furthermore, they have expanded their studies to incorporate remote sensing techniques supported by satellite imagery, along with the implementation of software-defined network technology and IoT on-the-fly gateways [7][8][9][10][11]. Significant efforts have been made to assist regions with severely damaged infrastructure, and virtualization has emerged as a valuable tool for redirecting disastrous risk assessment situations. ...
Article
Full-text available
An earthquake early-warning system (EEWS) is an indispensable tool for mitigating loss of life caused by earthquakes. The ability to rapidly assess the severity of an earthquake is crucial for effectively managing earthquake disasters and implementing successful risk-reduction strategies. In this regard, the utilization of an Internet of Things (IoT) network enables the real-time transmission of on-site intensity measurements. This paper introduces a novel approach based on machine-learning (ML) techniques to accurately and promptly determine earthquake intensity by analyzing the seismic activity 2 s after the onset of the p-wave. The proposed model, referred to as 2S1C1S, leverages data from a single station and a single component to evaluate earthquake intensity. The dataset employed in this study, named “INSTANCE,” comprises data from the Italian National Seismic Network (INSN) via hundreds of stations. The model has been trained on a substantial dataset of 50,000 instances, which corresponds to 150,000 seismic windows of 2 s each, encompassing 3C. By effectively capturing key features from the waveform traces, the proposed model provides a reliable estimation of earthquake intensity, achieving an impressive accuracy rate of 99.05% in forecasting based on any single component from the 3C. The 2S1C1S model can be seamlessly integrated into a centralized IoT system, enabling the swift transmission of alerts to the relevant authorities for prompt response and action. Additionally, a comprehensive comparison is conducted between the results obtained from the 2S1C1S method and those derived from the conventional manual solution method, which is considered the benchmark. The experimental results demonstrate that the proposed 2S1C1S model, employing extreme gradient boosting (XGB), surpasses several ML benchmarks in accurately determining earthquake intensity, thus highlighting the effectiveness of this methodology for earthquake early-warning systems (EEWSs).
... It is crucial to emphasize that LEO satellites rely on store-and-forward transmission to initially store received packets from ground users in buffers before forwarding them to ground gateways or other satellites via ISLs. Moreover, in IoRT applications [1], ground users, such as devices and sensors, may exhibit sporadic activity and send short packets when active. Therefore, in our analysis of packet transmission, we examine the probability of buffer overflow at LEO satellites employing store-and-forward transmission. ...
Preprint
Full-text available
Low Earth orbit (LEO) satellites play a crucial role in providing global connectivity for non-terrestrial networks (NTNs) and supporting various Internet-of-Remote-Things (IoRT) applications. Each LEO satellite functions as a relay node in the sky, employing store-and-forward transmission strategies that necessitate the use of buffers. However, due to the finite size of these buffers, occurrences of buffer overflow leading to packet loss are inevitable. In this paper, we demonstrate how inter-satellite links (ISLs) can mitigate the probability of buffer overflow. Specifically, we propose an approach to reallocate packets among LEO satellites via ISLs to minimize the occurrence of buffer overflow events. Consequently, the implementation of ISLs can lead to a more reliable satellite network, enabling efficient packet reallocation to reduce the probability of buffer overflow.
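The packet-reallocation idea can be sketched as greedy load balancing over ISLs: move packets from the fullest buffer to the emptiest until occupancies are level, then compare overflow before and after. This sketch ignores ISL capacity and constellation topology, which the paper's scheme would have to respect:

```python
def reallocate(buffers, capacity):
    """Greedy sketch of ISL packet reallocation: repeatedly move one
    packet from the fullest satellite buffer to the emptiest one until
    occupancies differ by at most one packet (or the emptiest buffer
    is itself full). Returns (new occupancies, packets moved)."""
    occ = list(buffers)
    moved = 0
    while True:
        hi = max(range(len(occ)), key=lambda i: occ[i])
        lo = min(range(len(occ)), key=lambda i: occ[i])
        if occ[hi] - occ[lo] <= 1 or occ[lo] >= capacity:
            break
        occ[hi] -= 1
        occ[lo] += 1
        moved += 1
    return occ, moved

def overflow(buffers, capacity):
    """Packets lost if each buffer can hold at most `capacity`."""
    return sum(max(0, b - capacity) for b in buffers)
```

For occupancies [9, 1, 2] with capacity 4, five ISL transfers level the buffers to [4, 4, 4] and the overflow loss drops from 5 packets to 0, which is the reliability gain the preprint quantifies.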
... In the literature, several studies on channel allocation and spectrum utilization in satellite IoT have been conducted. For example, De Sanctis et al. [6] emphasized the importance of satellite communication systems for sensors and actuators in remote areas lacking terrestrial network connections. Obata et al. [7] explored interference detection techniques, assuming dynamic spectrum sharing between terrestrial and non-terrestrial communications. ...
Article
Motivated by the scarcity of spectrum resources in Non-orthogonal Multiple Access (NOMA) systems and the insufficient satellite-ground integration in the satellite Internet of Things (IoT), this paper investigates spectrum resource management. We propose a resource allocation method based on Multi-Agent Deep Deterministic Policy Gradient (MADDPG) for NOMA-enabled satellite IoT. We formulate the spectrum allocation problem of the satellite-ground integrated network as a distributed optimization problem and decouple it into two sub-problems. Firstly, a user grouping method based on matching coefficients is defined, and a Linear Programming (LP) method is used to obtain the solution. Secondly, the power allocation problem is transformed into a multi-agent problem, where MADDPG is employed to allocate the power. Through this approach, the system is capable of real-time user association and spectrum resource allocation optimization, achieving optimal user grouping while maximizing the system transmission rate. According to the simulation results, the MADDPG-based method converges within 100 training iterations. The proposed resource management method also achieves a higher system transmission rate and more effective matching outcomes than Deep Deterministic Policy Gradient (DDPG), Orthogonal Multiple Access (OMA), and random allocation baselines.
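The grouping step described above pairs users with channel groups via matching coefficients solved as an LP. As a hedged stand-in (the coefficients and problem size are invented placeholders, and the paper's actual formulation is richer), a tiny one-user-per-group instance reduces to an assignment problem that can be brute-forced:

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
n = 4
# Hypothetical matching coefficients between n users and n NOMA channel
# groups (higher = better pairing); values are random placeholders.
coeff = rng.random((n, n))

# With one user per group, the LP-based grouping step reduces to an
# assignment problem; brute force over permutations for a tiny example.
idx = np.arange(n)
best = max(itertools.permutations(range(n)),
           key=lambda p: coeff[idx, list(p)].sum())
total = coeff[idx, list(best)].sum()
```

For realistic sizes one would use an LP/assignment solver rather than enumeration, since the permutation count grows factorially.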
... Such applications usually require one to transmit data across several miles, often without the availability of Internet connections. A suitable solution for transferring small amounts of data is the deployment of satellite connections [5]. A cheaper but reliable and robust alternative is the deployment of radio devices that exploit the sub-gigahertz bands [6]. ...
Article
Full-text available
This paper focuses on the characterization of radio propagation and data communication in a marine environment. More specifically, we consider signal propagation when three different sub-gigahertz industrial, scientific, and medical (ISM) bands, i.e., 169 MHz, 434 MHz, and 868 MHz, are used. The main focus of the paper is to evaluate the path loss (PL), i.e., the power loss that a propagating radio wave experiences when communication occurs between a sailboat and a buoy. We describe the measurement results obtained by performing three radio power measurement campaigns, one at each of the aforementioned ISM sub-gigahertz bands. We also correlate the radio propagation quality with the weather conditions present in the measurement areas. The obtained results show that higher distances are achieved by transmitting at lower frequencies, i.e., 169 MHz, and that, on average, the propagation directly depends on the dew point index.
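The frequency dependence reported above is consistent with the free-space component of path loss, which grows by 20 dB per decade of frequency. A minimal sketch (the 5 km link distance is an assumed example, and real sea-surface propagation adds terms beyond free space):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Path loss over an assumed 5 km sea link at the paper's three ISM bands.
losses = {f_mhz: fspl_db(5_000, f_mhz * 1e6) for f_mhz in (169, 434, 868)}
```

At equal transmit power, the 169 MHz band sees roughly 14 dB less free-space loss than 868 MHz, which helps explain the longer ranges measured at the lower band.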
... Due to the short connection time between low Earth orbit (LEO) satellites and ground stations, a large amount of data needs to be relayed by geostationary orbit (GEO) satellites. Coherent free-space optical communication, with its advantages of high transmission capacity, good security and confidentiality, and abundant spectrum resources, has been widely used in inter-satellite links for broadband data transmission [1][2][3][4]. However, due to the relative motion between the transmitting and receiving satellites, the Doppler effect causes the carrier frequency of the received signal to change, making it impossible to perform regular carrier synchronization and demodulation and severely impairing the signal quality [5]. ...
Article
Full-text available
This Letter presents a real-time coherent receiver using digital signal processing (DSP)-assisted automatic frequency control (AFC) to compensate for the Doppler frequency shift (DFS). A DFS compensation range of ±8 GHz and a frequency shifting rate of 33 MHz/s are demonstrated in an FPGA-based 2.5 Gbaud QPSK coherent optical system. The experimental results indicate that the scheme achieves a sensitivity of −47 dBm at a bit error rate (BER) of 2E-4. The power penalty induced by the DFS compensation is less than 1 dB.
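The Letter's DSP-assisted AFC is proprietary to the experiment, but the underlying principle — estimating a frequency offset on a QPSK signal — can be sketched with the classic Viterbi&Viterbi approach (this is an illustrative stand-in, not the Letter's algorithm; the offset, noise level, and block size below are invented): raising QPSK samples to the 4th power strips the modulation and leaves a tone at four times the offset.

```python
import numpy as np

rng = np.random.default_rng(2)
BAUD = 2.5e9        # symbol rate matching the 2.5 Gbaud experiment
F_OFFSET = 1.0e6    # simulated residual frequency offset (Hz, invented)
N = 4096

# QPSK symbols with a frequency offset and mild additive noise.
symbols = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, N)))
t = np.arange(N) / BAUD
rx = symbols * np.exp(2j * np.pi * F_OFFSET * t)
rx += 0.05 * (rng.normal(size=N) + 1j * rng.normal(size=N))

# Viterbi&Viterbi-style estimator: the 4th power removes the QPSK
# phase modulation, leaving a spectral line at 4x the offset.
spectrum = np.fft.fft(rx ** 4)
freqs = np.fft.fftfreq(N, d=1 / BAUD)
est_offset = freqs[np.argmax(np.abs(spectrum))] / 4
```

The estimate is quantized to the FFT bin spacing; a real AFC loop would refine it iteratively and track the 33 MHz/s drift.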
... Satellite-based space communication networks (SCNs) play an increasingly vital role in various applications, including marine communications and emergency rescue operations [16]. By overcoming the challenges of constructing IoT ecosystems in remote areas, SCNs offer a potential pathway to realizing a global IoT [17]. Satellite-based IoT services have emerged, with LEO satellites being particularly suitable owing to their wide coverage, low propagation delay, and low power requirements [18]. ...
Article
Full-text available
Vehicular Ad-hoc Networks (VANETs) have enabled intelligent transportation systems by facilitating communication between vehicles and roadside infrastructure. However, the current 5G and 4G networks that support VANETs have certain limitations that hinder the full potential of VANET applications. These limitations include constraints in bandwidth, latency, connectivity, and security. The upcoming 6G network is expected to revolutionize VANETs by introducing several advancements. 6G will provide ultra-fast communication with significantly reduced latency, enabling real-time and high-bandwidth data exchange between vehicles. The network will also offer highly reliable and secure connectivity, ensuring the integrity and privacy of VANET communications. Precise localization and sensing capabilities will be enhanced in 6G-based VANETs, enabling accurate positioning of vehicles and improved situational awareness. This will facilitate collision avoidance, traffic management, and cooperative driving applications. Moreover, integrating edge computing in 6G networks will bring computing resources closer to the edge, lowering response times and facilitating faster decision-making in time-critical scenarios. This paper explores the key features and capabilities of 6G technology and how it can revolutionize intelligent transportation, addressing challenges and opportunities for adopting 6G in VANETs.
... In response to these challenges, low Earth orbit (LEO) satellite-based IoT networks have emerged as a promising solution, serving as a complementary and extended alternative to terrestrial networks. LEO satellite-based IoT networks address the limitations associated with deploying terrestrial stations and establishing communication networks on the ground [2]. Iridium Communications announced Project Stardust, which it describes as "the evolution of its direct-to-device (D2D) strategy with 3GPP 5G standards-based NB-IoT via non-terrestrial network (NTN) systems" [3]. ...
Article
Full-text available
With the development of IoT technology and 5G massive machine-type communication, the 3GPP standardization body considered viable the integration of Narrowband Internet of Things (NB-IoT) into low Earth orbit (LEO) satellite-based architectures. However, the LEO satellite channel brings new challenges for the NB-IoT random access procedure and coverage enhancement mechanism. In this paper, an Adaptive Coverage Enhancement (ACE) method is proposed to meet the requirement of random access parameter configuration for diverse applications. Based on stochastic geometry theory, an expression of the random access channel (RACH) success probability is derived for LEO satellite-based NB-IoT networks. On the basis of a power consumption model of the NB-IoT terminal, a multi-objective optimization problem is formulated to trade off RACH success probability against power consumption. To solve this multi-objective optimization problem, we employ the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) method to obtain the Pareto-front solution set. According to different application requirements, we also design a random access parameter configuration method to minimize the power consumption under the constraints of RACH success probability requirements. Simulation results show that the maximum number of repetitions and the back-off window size have a great influence on the system performance, and their value ranges should be set within [4, 18] and [0, 2048], respectively. The power consumption of coverage enhancement with ACE is about 58% lower than that of the 3GPP proposed model. Together, this research provides a good reference for the large-scale deployment of NB-IoT in LEO satellite networks.
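The NSGA-II step above returns a Pareto front over (RACH success probability, power consumption). The selection criterion itself is simple to state: a configuration survives only if no other configuration has at least as high a success probability and at most as much power. A minimal sketch with invented candidate values:

```python
def pareto_front(points):
    """Keep (success_prob, power) points not dominated by any other
    point: dominated = another point is >= in success and <= in power."""
    front = []
    for p in points:
        dominated = any(
            q[0] >= p[0] and q[1] <= p[1] and q != p for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (RACH success probability, power consumption) pairs
# for candidate parameter configurations.
configs = [(0.90, 5.0), (0.95, 7.0), (0.85, 6.0), (0.99, 9.0), (0.95, 8.0)]
front = pareto_front(configs)
```

An application then picks the cheapest point on the front that meets its success-probability constraint, which mirrors the paper's configuration method.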
... A corresponding heuristic algorithm is proposed to solve the optimization problem. (3) Numerical simulations to validate the effectiveness of the proposed algorithm are conducted. These simulations compare the performance of the proposed algorithm with two conventional methods that focus on maximizing persistence or optimizing received signal strength, respectively. ...
Article
Full-text available
Internet of Remote Things (IoRT) networks utilize the backhaul links between unmanned aerial vehicles (UAVs) and low-earth-orbit (LEO) satellites to transfer the massive data collected by sensors. However, the backhaul links change rapidly due to the fast movement of both the UAVs and the satellites, which is different from conventional wireless networks. Additionally, due to the various requirements of IoRT multiservices, the system performance should be comprehensively considered. Thus, an adjustable wireless backhaul link selection algorithm for a LEO-UAV-sensor-based IoRT network is proposed. Firstly, an optimization model for backhaul link selection is proposed. This model uses Q, which integrates the remaining service time and capacity as the objective function. Then, based on the snapshot method, the dynamic topology is converted into the static topology and a heuristic optimization algorithm is proposed to solve the backhaul link selection problem. Finally, the proposed algorithm is compared with two traditional algorithms, i.e., maximum service time and maximum capacity algorithms. Numerical simulation results show that the proposed model can achieve better system performance, and the overload of the satellites is more balanced. The algorithm can obtain a trade-off between remaining service time and capacity by dynamically adjusting model parameters. Thus, the adjustable backhaul link selection algorithm can apply to multiservice IoRT scenarios.
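The adjustable objective Q described above blends remaining service time and capacity. A hedged sketch of the per-decision selection rule (link values and the normalization are invented for illustration; the paper additionally handles dynamic topology via snapshots):

```python
def select_backhaul(links, alpha=0.5):
    """Pick the link maximizing Q = alpha * normalized remaining service
    time + (1 - alpha) * normalized capacity; alpha tunes the trade-off."""
    t_max = max(l["time"] for l in links)
    c_max = max(l["capacity"] for l in links)

    def q(link):
        return (alpha * link["time"] / t_max
                + (1 - alpha) * link["capacity"] / c_max)

    return max(links, key=q)

# Hypothetical candidate backhaul links (seconds of remaining
# visibility, Mbit/s of capacity).
links = [
    {"id": "sat-A", "time": 120, "capacity": 40},
    {"id": "sat-B", "time": 300, "capacity": 10},
    {"id": "sat-C", "time": 60,  "capacity": 80},
]
balanced = select_backhaul(links, alpha=0.5)   # favors capacity-rich sat-C
persistent = select_backhaul(links, alpha=0.9) # favors long-lived sat-B
```

Sweeping alpha reproduces the paper's observation that the algorithm interpolates between the maximum-service-time and maximum-capacity baselines.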
... SIoT, as a natural expansion to the terrestrial networks, is characterized by the combination of conventional IoT technologies with satellite communication [22]. As such, it can be exploited for both industrial and commercial use, offering new ways for enhancing remote connectivity and availability of services and applications, especially in critical or high-risk settings where there is a need for continuous and uninterrupted monitoring of the underlying conditions [13], such as wildfire detection in forest regions, disaster or crisis management in machine-to-machine communications, military applications in remote tactical geographical areas, collaborative services in industrial systems, etc. ...
Conference Paper
TCP is a well-known protocol for reliable data transfer. Although TCP was originally designed for networks with low Round Trip Time (RTT) and low error rates over the communication channel, in modern networks these characteristics vary drastically, e.g., Long Fat Networks are usually attributed a high Bandwidth Delay Product. When considering satellite communications, which are also characterized by high error rates but are considered a driving force for future networks, such as the Satellite Internet of Things (SIoT), it becomes clear that there exists an ever-growing need to revisit TCP protocol variants and develop new tools to simulate their behavior and optimize their performance. In this paper, a TCP Cubic implementation for the OMNeT++ INET Framework is presented and made publicly available to the research community. Simulation experiments validate its expected behavior in accordance with the theoretical analysis. A performance comparison against the popular TCP NewReno is also performed to evaluate TCP Cubic’s applicability to satellite environments. The obtained results testify to the latter’s superiority in efficiently allocating the bandwidth among the different information flows with vast gains to the overall system throughput, thus, rendering it the better candidate for future SIoT environments.
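TCP Cubic's suitability for high-BDP satellite paths comes from its window growth law, W(t) = C(t − K)³ + W_max (RFC 8312), which grows independently of RTT. A short sketch of that function with the RFC's default constants:

```python
def cubic_window(t, w_max, c=0.4, beta=0.7):
    """CUBIC congestion window t seconds after a loss event:
    W(t) = c*(t - K)^3 + w_max, where K is the time to regrow to w_max
    (RFC 8312 shape; c and beta are the RFC's default constants)."""
    k = ((w_max * (1 - beta)) / c) ** (1 / 3)
    return c * (t - k) ** 3 + w_max

w_max = 100.0  # window size (segments) just before the loss, example value
# Immediately after the loss the window restarts at beta * w_max = 70,
# grows concavely back to w_max, plateaus, then probes convexly beyond it.
```

The long plateau near W_max and the RTT-independent regrowth are what let CUBIC claim bandwidth on long-delay satellite links faster than NewReno's per-RTT linear increase.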
... Many recent surveys and tutorials on future SatNets have focused on communication and networking issues. For example, Radhakrishnan et al. [31] focused on inter-satellite communications in small satellite constellations from the physical to the network layer, and the Internet of Remote Things applications of satellite communication were reviewed in [32]. However, only a few reviews have been published on mobility management issues in next-generation satellite networks. ...
... The study in [39] provides a comprehensive overview of satellite communication systems for the IoT. However, it overlooks two crucial aspects: energy efficiency considerations in NTNs and the integration of Unmanned Aerial Vehicles (UAVs) into the IoRT, particularly energy-efficient strategies that exploit Non-Orthogonal Multiple Access (NOMA) schemes and machine learning techniques such as Deep Reinforcement Learning (DRL). ...
Article
Full-text available
The Internet of Things (IoT) is gaining popularity and market share, driven by its ability to connect devices and systems that were previously siloed, enabling new applications and services in a cost-efficient manner. Thus, the IoT fuels societal transformation and enables groundbreaking innovations like autonomous transport, robotic assistance, and remote healthcare solutions. However, when considering the Internet of Remote Things (IoRT), which refers to the expansion of IoT in remote and geographically isolated areas where neither terrestrial nor cellular networks are available, internet connectivity becomes a challenging issue. Non-Terrestrial Networks (NTNs) are increasingly gaining popularity as a solution to provide connectivity in remote areas due to the growing integration of satellites and Unmanned Aerial Vehicles (UAVs) with cellular networks. In this survey, we provide the technological framework for NTNs and Remote IoT, followed by a classification of the most recent scientific research on NTN-based IoRT systems. Therefore, we provide a comprehensive overview of the current state of research in IoRT and identify emerging research areas with high potential. In conclusion, we present and discuss 3GPP’s roadmap for NTN standardization, which aims to establish an energy-efficient IoRT environment in the 6G era.
... Max pooling is employed to select the strongest features, making the model robust to translational shifts of space objects and further enhancing its generalization [17]. Feature pooling of the convolved matrix is presented as ...
Article
Satellites support many monitoring, prediction, and forecasting applications in space; hence, they must be continuously monitored and controlled against the catastrophic damage caused by collisions with space debris and asteroid debris. Timely and efficient control of a satellite avoids heavy, irreparable damage, since the amount of space debris and asteroid debris in orbit is continuously increasing. Monitoring therefore improves the sustainability and safe navigation of satellites in space operations. To achieve this objective, the deep space network (satellite ground station) has to be designed as a space situational awareness mechanism based on the detection and classification of space objects, providing an optimal path for the satellite trajectory with high processing speed and reduced memory use in data processing. In this work, a new deep learning architecture, an optimized recurrent convolutional neural network, is designed for the satellite ground station to detect and classify space debris and asteroid debris. The architecture is first interfaced with a feature extraction technique, principal component analysis, to extract space features; the extracted features are then detected and classified using the deep learning model. The convolutional neural network comprises a convolution layer to filter discrete features, a max pooling layer to downsize the features and represent the feature map, and a fully connected layer to detect and classify the space debris and predict the optimal path for the satellite trajectory based on the velocity of the debris in its orbit. The proposed model has been evaluated using accuracy and precision metrics, and the design has been implemented and verified using Xilinx software. The performance analysis shows that the proposed model yields better accuracy in classifying the monitored information than conventional approaches such as the support vector machine and the k-nearest neighbour technique. Keywords: Satellite Collision Analysis, Deep Learning, Space Debris, Asteroid Debris, Convolutional Neural Network, Principal Component Analysis, Deep Space Network
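The max pooling layer mentioned above downsizes the feature map by keeping the strongest activation in each window. A minimal NumPy sketch on a toy 4×4 feature map (illustrative only, not the paper's network):

```python
import numpy as np

def max_pool2d(x, size=2):
    """Non-overlapping max pooling: downsizes the feature map by `size`
    in each dimension, keeping the largest activation per window."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

fmap = np.array([[1, 2, 0, 1],
                 [3, 4, 1, 0],
                 [0, 1, 5, 6],
                 [1, 2, 7, 8]], dtype=float)
pooled = max_pool2d(fmap)  # 4x4 -> 2x2
```

Because only the window maximum survives, small translations of a feature within a window leave the pooled output unchanged, which is the translation-robustness property cited in the abstract.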
... In areas without network infrastructure, satellite communication systems (SATCOM) [1] are used to collect and exchange data; such systems are called the Internet of Remote Things (IoRT) [2]. Integrated communication systems and heterogeneous networks for data collection using SATCOM are being intensively developed. ...
... These infrastructure requirements drive cost and restrict device operation in low-resource or remote settings where suitable infrastructure has not been developed (11)(12)(13). While some solutions have aimed to address this through the use of satellite communication, the powering, hardware, and cost requirements for implementation inhibit broad dissemination in limited resource environments (14)(15)(16). To address this, the use of low-power wide area network (LPWAN) protocols such as ultra-narrow band (UNB), LoRa, and SigFox have been realized for long-range, remote IoT applications (17,18). ...
Article
Full-text available
Remote patient monitoring is a critical component of digital medicine, and the COVID-19 pandemic has further highlighted its importance. Wearable sensors aimed at noninvasive extraction and transmission of high-fidelity physiological data provide an avenue toward at-home diagnostics and therapeutics; however, the infrastructure requirements for such devices limit their use to areas with well-established connectivity. This accentuates the socioeconomic and geopolitical gap in digital health technology and points toward a need to provide access in areas that have limited resources. Low-power wide area network (LPWAN) protocols, such as LoRa, may provide an avenue toward connectivity in these settings; however, there has been limited work on realizing wearable devices with this functionality because of power and electromagnetic constraints. In this work, we introduce wearables with electromagnetic, electronic, and mechanical features provided by a biosymbiotic platform to realize high-fidelity biosignals transmission of 15 miles without the need for satellite infrastructure. The platform implements wireless power transfer for interaction-free recharging, enabling long-term and uninterrupted use over weeks without the need for the user to interact with the devices. This work presents demonstration of a continuously wearable device with this long-range capability that has the potential to serve resource-constrained and remote areas, providing equitable access to digital health.
... However, the uniqueness of these systems is reflected in the fact that all communication is carried out via a satellite link, which has its advantages and disadvantages. The main disadvantages of satellite links are long propagation delays, significant delay variation, packet loss, unpredictable interruptions, dependence on meteorological factors, and limited bandwidth that must be shared with other users [1][2][3][4][5]. Among other things, the Vlatacom Institute deals with the development and integration of these types of systems. ...
Conference Paper
Educational platforms are of great importance in educating students of the technical sciences, because experimental work increases their practical knowledge. Such platforms are even more significant for highly specialized systems, such as IoT systems that rely on satellite-link communication, because training courses on real systems are impractical in these cases for many reasons. One such educational platform was developed and implemented at the Vlatacom Institute for technical science students during their mandatory internship. The aim of this educational platform is to let students examine the influence of a simulated satellite link on overall communication within different IoT systems.
Article
Pervasive Computing has become more personal with the widespread adoption of the Internet of Things(IoT) in our day-to-day lives. The emerging domain that encompasses devices, sensors, storage, and computing of personal use and surroundings leads to Personal IoT(PIoT). PIoT offers users high levels of personalization, automation, and convenience. This proliferation of PIoT technology has extended into society, social engagement, and the interconnectivity of PIoT objects, resulting in the emergence of the Social Internet of Things (SIoT). The combination of PIoT and SIoT has spurred the need for autonomous learning, comprehension, and understanding of both the physical and social worlds. Current research on PIoT is dedicated to enabling seamless communication among devices, striking a balance between observation, sensing, and perceiving the extended physical and social environment, and facilitating information exchange. Furthermore, the virtualization of independent learning from the social environment has given rise to Artificial Social Intelligence (ASI) in PIoT systems. However, autonomous data communication between different nodes within a social setup presents various resource management challenges that require careful consideration. This paper provides a comprehensive review of the evolving domains of PIoT, SIoT, and ASI. Moreover, the paper offers insightful modeling and a case study exploring the role of PIoT in post-COVID scenarios. This study contributes to a deeper understanding of the intricacies of PIoT and its various dimensions, paving the way for further advancements in this transformative field.
Chapter
This work presents the design and implementation of a combined photovoltaic and telecommunication system in the Pacaya Samiria Amazon Lodge Private Reserve, located in a buffer zone, the heart of the Amazon Jungle. The study aims to provide a comprehensive solution for sustainable energy generation and reliable communication in remote and environmentally sensitive areas. The paper outlines the design process, component selection, integration of the systems, and their performance evaluation. The results demonstrate the feasibility and effectiveness of the combined system in meeting the lodge’s energy needs and enhancing communication capabilities. The research highlights the potential of integrating renewable energy and communication technologies in promoting sustainability and improving connectivity in remote ecotourism facilities.
Chapter
Sixth-Generation (6G) wireless networks will be required to support unprecedented performance levels in terms of data speed, coverage, quality of service, and energy efficiency. The challenges associated with their design have recently motivated a deep revision of the fundamental approaches to wireless system conception, specifically focusing on the idea that the propagation environment can be included within the design process. The revolutionary "Smart ElectroMagnetic Environment" (SEME) vision arising from this concept will be made possible by the introduction of wave-manipulating devices such as electromagnetic skins (EMSs), and it is expected to enable a variety of opportunities within 6G communication systems. The objective of this work is to review the principles, opportunities, and challenges concerning the development of EMSs for SEME scenarios in 6G networks, as well as the current trends and envisaged developments in terms of their design and implementation.
Article
In this paper, we investigate a dynamic resource allocation problem for remote Internet of Things (IoT) data collection in space-air-ground integrated networks (SAGIN), in which the aerial platforms are deployed to bridge the communications between IoT nodes and satellites. To obtain an efficient resource allocation strategy that accommodates the stochastic data arrivals of IoT nodes and the dynamic network topology due to the high mobility of non-geostationary orbit (NGSO) satellites, we first formulate a resource allocation problem with queue stability constraints. Our objective is to maximize the long-term network utility, ensuring a balance between throughput and fairness among the IoT nodes. The formulated long-term problem is challenging to solve due to the unknown future network states and the coupling between continuous and integer variables. Therefore, we adopt the Lyapunov optimization theory to transform the problem into a deterministic problem in each time slot. Moreover, an online resource allocation algorithm is proposed to dynamically determine data admission, subchannel assignment, and power control in each time slot based on the current network status and data backlog. In addition, theoretical analysis indicates that there is an [O(1/V), O(V)] trade-off between network utility and data backlog with control parameter V. Numerical results demonstrate that the proposed algorithm can greatly enhance the system throughput and reduce data queue backlog as well as preserve queue stability as compared with the benchmarks.
Article
Extended reality-enabled Internet of Things (XRI) provides new user experiences and a sense of immersion by adding virtual elements to the real world through Internet of Things (IoT) devices and emerging sixth-generation (6G) technologies. However, computational-intensive XRI tasks are challenging for energy-constrained small-size XRI devices to cope with, and moreover certain data require centralized computing that needs to be shared among users. To this end, we propose a cache-assisted space-air-ground integrated network mobile edge computing (SAGIN-MEC) system for XRI applications consisting of two types of edge servers mounted on an unmanned aerial vehicle (UAV) and low Earth orbit (LEO) satellite equipped with a cache and multiple ground XRI devices. For system efficiency, four different offloading procedures of XRI data are considered according to the type of information, i.e., shared data and private data, as well as the offloading decision and the caching status. Specifically, private data can be offloaded to either UAV or LEO satellite, while the offloading decision of shared data to the LEO satellite can be determined by the caching status. With the aim of maximizing the energy efficiency of the overall system, we jointly optimize UAV trajectory, resource allocation and offloading decisions under latency constraints and UAV’s operational limitations by using the alternating optimization (AO)-based method along with the Dinkelbach algorithm and successive convex approximation (SCA). Via numerical results, the proposed algorithm is verified to have superior performance compared to conventional partial optimizations or processes without a cache.
Article
New satellite-based sixth-generation (6G) systems for the Internet of Things (IoT) are expected to provide complete global coverage and support fully transparent services. In addition, a large number of low Earth orbit (LEO) satellites are to be deployed to connect IoT devices (actuators and sensors) that are beyond terrestrial network coverage. However, there are two major issues hindering LEO satellites: i) spectrum inefficiency, leading to high cost; and ii) the continuous motion of satellites, which limits the contact time to approximately 10 minutes and results in frequent handovers, link budget limitations, and high Doppler effects. This paper discusses design approaches and principles that allow us to develop a cost-effective intelligent Data-Aided Satellite Communication and Control (DASCC) framework for LEO networks by employing key features of 6G multi-connectivity, distributed sensing, and machine learning algorithms. Our ideas are evaluated with preliminary analytic modeling and simulation results.
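The Doppler issue flagged above can be bounded with a back-of-the-envelope calculation: for a circular orbit, the orbital speed follows from v = sqrt(mu / r), and the worst-case shift is v * f_c / c. The altitude and carrier below are assumed example values, and projecting the full orbital speed onto the line of sight gives an upper bound rather than the shift seen at any real elevation angle.

```python
import math

C = 299_792_458.0      # speed of light, m/s
MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0  # mean Earth radius, m

def max_doppler_hz(altitude_m, carrier_hz):
    """Upper bound on the Doppler shift for a circular LEO orbit,
    assuming the full orbital velocity projects onto the line of sight."""
    v = math.sqrt(MU / (R_EARTH + altitude_m))  # orbital speed
    return v * carrier_hz / C

# Example: a 600 km LEO satellite at a 2 GHz carrier -> tens of kHz.
shift = max_doppler_hz(600e3, 2e9)
```

A shift of this magnitude, sweeping through zero as the satellite passes overhead, is what forces the compensation and tracking schemes discussed in the abstract.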
Article
Full-text available
Due to the limited transmission capabilities of terrestrial intelligent devices within the Internet of Remote Things (IoRT), this paper proposes an optimization scheme aimed at enhancing data transmission rate while ensuring communication reliability. This scheme focuses on multi-unmanned aerial vehicle (UAV)-assisted IoRT data communication within the satellite–aerial–terrestrial integrated network (SATIN), which is one of the key technologies for the sixth generation (6G) networks. To optimize the system’s data transmission rate, we introduce a multi-dimensional coverage and power optimization (CPO) algorithm, rooted in the block coordinate descent (BCD) method. This algorithm concurrently optimizes various parameters, including the number and deployment of UAVs, the correlation between IoRT devices and UAVs, and the transmission power of both devices and UAVs. To ensure comprehensive coverage of a large-scale randomly distributed array of terrestrial devices, combined with machine learning algorithm, we present the Dynamic Deployment based on K-means (DDK) algorithm. Additionally, we address the non-convexity challenge in resource allocation for transmission power through variable substitution and the successive convex approximation technique (SCA). Simulation results substantiate the remarkable efficacy of our CPO algorithm, showcasing a maximum 240% improvement in the uplink transmission rate of IoRT data compared to conventional methods.
Article
With the expanding demands for the space-air-ground integrated Internet of Things (IoT), the low-Earth orbit (LEO) satellite can be regarded as an important complement to IoT networks. Due to the transparency of satellite orbit information and the exposed nature of transmission links, satellite-ground communication is extremely vulnerable to eavesdropping and jamming attacks. To establish a transmission link, the most crucial procedure is signal acquisition. Existing frequency hopping/direct sequence signal acquisition algorithms are either too time-consuming for the short visibility window of the LEO satellite or too resource-intensive for LEO satellite devices. To alleviate this issue, we propose a novel differential coherent accumulation acquisition strategy for the LEO satellite-enabled IoT network to strike a balance between performance and complexity. It is also demonstrated that the proposed acquisition strategy is capable of achieving high-performance signal acquisition in the low-carrier-to-noise-ratio and large dynamic regions. Moreover, we derive and simulate the false alarm probability, detection probability, computational complexity, and mean square error of both the delay and Doppler factors in the additive white Gaussian noise channel. Numerical simulation results show that the proposed acquisition strategy improves performance by 1.6 dB over the noncoherent accumulation strategy, at the expense of a 0.43% complexity increase.
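The intuition behind differential coherent accumulation — products of adjacent segments share the same phase increment, so an unknown Doppler rotation cancels before the sum — can be shown with a small numeric sketch. The function names and the idealized noiseless segments are my assumptions, not the paper's formulation.

```python
def noncoherent(z):
    # noncoherent accumulation: sum of segment energies; phase-robust,
    # but the squaring loss hurts at low carrier-to-noise ratio
    return sum(abs(v) ** 2 for v in z)

def diff_coherent(z):
    # differential coherent accumulation: each product z[k]*conj(z[k-1])
    # carries only the per-segment phase increment, so a constant Doppler
    # rotation aligns all cross-terms and they add coherently
    return abs(sum(z[k] * z[k - 1].conjugate() for k in range(1, len(z))))
```

For 8 unit-energy segments rotating by 0.5 rad each, the plain coherent sum |Σz| collapses to about 3.7 while the differential statistic recovers all 7 cross-terms at full magnitude.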
Article
Space Air Ground Integrated Network (SAGIN), leveraging Low Earth Orbit (LEO) satellites and Unmanned Aerial Vehicles (UAVs), is expected to play a key role in providing services to the Internet of Remote Things (IoRT) in sixth generation (6G) communications. Our considered SAGIN incorporates a cache node on the UAV to cope with the data rate fluctuation in the backhaul link (UAV to satellite), allowing temporary storage of collected data during low data rate periods. In this paper, we aim to minimize the completion time of data collection in SAGIN by optimizing the UAV trajectory, IoRT device association scheme, and data caching policy (whether to store data temporarily or not in the UAV). Since the formulated problem is challenging to solve by using traditional optimization methods due to the unknown number of decision variables and the changing environment, we propose a Deep Reinforcement Learning (DRL)-based algorithm to efficiently solve it. Simulation results demonstrate that our proposed algorithm requires less time to complete data collection compared to both the circular trajectory scheme and the no-cache node scheme under various setups. Moreover, our proposed algorithm can adapt to uneven data distribution by approaching closer to the IoRT nodes with large data sizes, and it can also mitigate the influence of backhaul link fluctuations with the aid of the cache node.
Conference Paper
Full-text available
The aim of this work is to analyze the k-means clustering method, which can be used for processing time series, and to identify its advantages and disadvantages.
Preprint
Full-text available
The broadcast nature of satellite communications makes transmissions susceptible to eavesdropping and even malicious obstruction. Physical Layer Security (PLS) techniques prevent the data sent by the sender from being intercepted by illegal eavesdroppers, ensuring confidentiality and transmission security between the sender and authorized users. The evolution of mobile communications has brought new challenges to physical-layer security research. Data security techniques implemented at the physical layer of satellite communication are proving to be effective on many parameters. Considering the downlink of a satellite communication system, with multiple antennas at the Earth station (ES), the eavesdropper, and the satellite, an eavesdropper attempts to intercept the signal sent by the satellite to the ES. We propose a method by which the signal received by the eavesdropper is in a meaningless form. The proposed method uses a nulling-based approach, in which the signal transmitted by the satellite is multiplied by a beamforming vector selected so that no information can be received by the eavesdropper. The performance of the received data at the ES is analyzed in terms of symbol error rate.
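A minimal sketch of the nulling idea, assuming the transmitter knows both channel vectors: project the legitimate user's channel onto the null space of the eavesdropper's channel (one Gram–Schmidt step), so the beamformed signal arrives at the eavesdropper with zero inner product. The function name and the flat-channel model are assumptions, not the paper's formulation.

```python
def nulling_beamformer(h_d, h_e):
    """h_d: channel vector to the legitimate Earth station,
    h_e: channel vector to the eavesdropper (lists of complex).
    Returns a unit-norm weight vector orthogonal to h_e."""
    # inner product <h_d, h_e> and the eavesdropper channel's energy
    dot = sum(a * b.conjugate() for a, b in zip(h_d, h_e))
    nrm = sum(abs(b) ** 2 for b in h_e)
    # subtract the component of h_d lying along h_e, then normalize
    w = [a - (dot / nrm) * b for a, b in zip(h_d, h_e)]
    scale = sum(abs(v) ** 2 for v in w) ** 0.5
    return [v / scale for v in w]
```

By construction the leakage <w, h_e> is exactly zero, so the eavesdropper receives no useful signal component, while the projection keeps as much gain toward the legitimate receiver as the orthogonality constraint allows.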
Conference Paper
Full-text available
In the Broadband Satellite Multimedia (BSM) architecture, developed by the European Telecommunications Standards Institute (ETSI), the physical layers are isolated from the rest by a Satellite Independent-Service Access Point (SI-SAP). For technological reasons, Satellite Independent (SI) traffic classes, requiring different QoS guarantees, must be aggregated together within the Satellite Dependent (SD) core and may change the encapsulation format. Moreover, SD layers must assure proper fading countermeasures to face satellite channel degradation. In this work, we investigate a novel control algorithm to tackle the envisaged problems. Simulation results validate the proposed approach.
Article
Full-text available
This paper presents and evaluates a new communications architecture to provide services to terrestrial sensor networks using a space delay-tolerant networking (DTN)–based solution. We propose a new multiple access mechanism based on extended unslotted ALOHA that takes into account the priority of gateway traffic, which we call ALOHA multiple access with gateway priority (ALOHAGP). We assume a finite sensor population model and a saturated traffic condition where every sensor always has frames to transmit. The performance was evaluated in terms of effective throughput, delay and system fairness. In addition, a DTN convergence layer (ALOHAGP-CL) has been defined as a subset of the standard TCP-CL (Transmission Control Protocol-Convergence Layer). Through simulations, this paper reveals that ALOHAGP/CL adequately supports the proposed DTN scenario, mainly when reactive fragmentation is used.
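As background for the ALOHAGP evaluation above, the baseline throughput of pure (unslotted) ALOHA follows the classical S = G·e^(−2G), which peaks at 1/(2e) ≈ 0.184 packets per frame time at offered load G = 0.5. The gateway-priority extension itself is not modeled here; this is only the textbook reference point.

```python
import math

def unslotted_aloha_throughput(G):
    """Classical pure ALOHA: a frame of unit length survives only if no
    other frame starts within its two-frame vulnerability window, which
    under Poisson arrivals at offered load G happens with prob. e^(-2G)."""
    return G * math.exp(-2 * G)
```

Sweeping G confirms the familiar curve: throughput rises up to G = 0.5 and then collapses as collisions dominate.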
Article
Full-text available
Many technical communities are vigorously pursuing research topics that contribute to the Internet of Things (IoT). Nowadays, as sensing, actuation, communication, and control become even more sophisticated and ubiquitous, there is a significant overlap in these communities, sometimes from slightly different perspectives. More cooperation between communities is encouraged. To provide a basis for discussing open research problems in IoT, a vision for how IoT could change the world in the distant future is first presented. Then, eight key research topics are enumerated and research problems within these topics are discussed.
Conference Paper
Full-text available
Machine-to-machine (M2M) communications have a very large potential market growth, particularly in the low-end segment. Current satellite systems are not adequate to serve very large populations of low cost devices, with low bandwidth requirements, and severe cost and energy constraints. A satellite system can bring unique advantages in terms of cross-border coverage, availability, and security of the communication. However, it must compete in cost with cellular and unlicensed devices, which are rapidly evolving. The high cost of the space segment can be compensated by the high scalability of the system if the terminal cost can be kept sufficiently low. Moreover, the low bandwidth requirements of M2M systems make reusing current infrastructure possible. In this paper we analyze the feasibility of such M2M satellite system. We define a satellite architecture and multiple access technique appropriate for low-cost, energy-constrained devices, and evaluate its performance in terms of system capacity and energy usage.
Article
Full-text available
Multicasting is emerging as an enabling technology for multimedia transmissions over wireless networks to support several groups of users with flexible quality of service (QoS) requirements. Although multicast has huge potential to push the limits of next-generation communication systems, it remains one of the most challenging issues currently being addressed. In this survey, we explain multicast group formation and various forms of group rate determination approaches. We also provide a systematic review of recent channel-aware multicast scheduling and resource allocation (MSRA) techniques proposed for downlink multicast services in OFDMA-based systems. We study these enabling algorithms, evaluate their core characteristics and limitations, and classify them using a multidimensional matrix. We cohesively review the algorithms in terms of their throughput maximization, fairness considerations, performance complexities, multi-antenna support, optimality and simplifying assumptions. We discuss existing standards employing multicasting and further highlight some potential research opportunities in multicast systems.
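The group rate determination problem described above has a simple single-group special case: transmitting at the rate supportable by the j-th weakest user reaches only the users from j onwards, so the throughput-maximizing rate is a one-dimensional search over the sorted rates. This is an illustrative sketch, not any specific MSRA algorithm from the survey.

```python
def best_multicast_rate(user_rates):
    """user_rates: per-user supportable rate (from channel quality feedback).
    Sending at rate r serves only users whose channel supports >= r, so for
    the j-th lowest rate the aggregate throughput is rates[j] * (n - j)."""
    rates = sorted(user_rates)
    n = len(rates)
    return max(rates[j] * (n - j) for j in range(n))
```

With supportable rates [1, 2, 10] the best choice is to serve only the strong user (aggregate throughput 10); with [2, 2, 2] serving everyone yields 6 — the throughput/fairness tension the survey reviews.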
Article
Full-text available
The problem of bandwidth allocation may be simply stated, independently of the target of the allocation: an amount of bandwidth must be shared among different entities. Each entity receives a portion of the overall bandwidth. Bandwidth allocation may be formalized as a Multi-Objective Programming (MOP) problem where the constraint is the maximum available bandwidth. The objectives of the allocation, such as loss and power, may be modelled through a group of objective functions possibly conflicting with each other. It is quite intuitive that using more bandwidth will reduce losses, but also that transmitted power will increase with the bandwidth. What is the balance among these needs? This letter proposes an extended model for bandwidth allocation and uses a modified version of a MOP-based bandwidth allocation to provide a possible solution to the mentioned balancing problems.
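The balancing question above can be made concrete with a scalarized toy model: take loss proportional to 1/b and power proportional to b, weight them, and search for the minimizing bandwidth under the cap B. The objective functions and parameter names here are illustrative assumptions, not the letter's MOP model.

```python
def allocate_bandwidth(B, alpha, c=1.0, steps=2000):
    """Minimize alpha/b + (1 - alpha)*c*b over b in (0, B]:
    the first term models loss falling with bandwidth, the second models
    transmitted power growing with it; alpha sets the trade-off weight."""
    candidates = (B * i / steps for i in range(1, steps + 1))
    return min(candidates, key=lambda b: alpha / b + (1 - alpha) * c * b)
```

The unconstrained optimum of this toy objective is b* = sqrt(alpha / ((1 − alpha)·c)), so equal weights give b* = 1; more weight on loss pushes the allocation up until the cap B binds.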
Article
Full-text available
Smart embedded objects will become an important part of what is called the Internet of Things. However, the integration of embedded devices into the Internet introduces several challenges, since many of the existing Internet technologies and protocols were not designed for this class of devices. In the past few years, there have been many efforts to enable the extension of Internet technologies to constrained devices. Initially, this resulted in proprietary protocols and architectures. Later, the integration of constrained devices into the Internet was embraced by IETF, moving towards standardized IP-based protocols. In this paper, we will briefly review the history of integrating constrained devices into the Internet, followed by an extensive overview of IETF standardization work in the 6LoWPAN, ROLL and CoRE working groups. This is complemented with a broad overview of related research results that illustrate how this work can be extended or used to tackle other problems and with a discussion on open issues and challenges. As such the aim of this paper is twofold: apart from giving readers solid insights in IETF standardization work on the Internet of Things, it also aims to encourage readers to further explore the world of Internet-connected objects, pointing to future research opportunities.
Article
Full-text available
The Internet of Things (IoT) provides a virtual view, via the Internet Protocol, to a huge variety of real life objects, ranging from a car, to a teacup, to a building, to trees in a forest. Its appeal is the ubiquitous generalized access to the status and location of any "thing" we may be interested in. Wireless sensor networks (WSN) are well suited for long-term environmental data acquisition for IoT representation. This paper presents the functional design and implementation of a complete WSN platform that can be used for a range of long-term environmental monitoring IoT applications. The application requirements for low cost, high number of sensors, fast deployment, long lifetime, low maintenance, and high quality of service are considered in the specification and design of the platform and of all its components. Low-effort platform reuse is also considered starting from the specifications and at all design levels for a wide array of related monitoring applications.
Conference Paper
Full-text available
Long Term Evolution (LTE) is the emerging 4G wireless technology developed to provide high quality services in mobile environments. It is foreseen that multimedia services and mobile TV will assume an important role in the LTE proliferation in the mobile market. However, several issues are still open and meaningful improvements have to be introduced for managing physical resources when group-oriented services are to be supplied. To this end, we propose a Frequency Domain Packet Scheduling (FDPS) algorithm for efficient radio resource management of Multimedia Broadcast/Multicast Services (MBMS) in LTE networks. The proposed scheduler exploits an optimization process to organize multicast subscribers in subgroups according to channel quality feedbacks provided by users. Optimization is driven by the aim of minimizing the "user dissatisfaction" with a consequent improvement in the network capacity. The effectiveness of the proposed scheduling algorithm is evaluated through simulation; obtained results demonstrate significant improvement in the multicast traffic performance.
Conference Paper
Full-text available
This paper explores possible advantages of the Information Centric Networking (ICN) paradigm in a geostationary satellite network. We find that, with respect to plain HTTP services, ICN makes it possible to reduce the downstream bandwidth consumed for Internet access by better exploiting the temporal locality of references within requested streams of Web contents. We present an ICN satellite architecture, describe its peculiar mechanisms and assess our solution through simulations.
Article
Full-text available
We have witnessed the Fixed Internet emerging with virtually every computer being connected today; we are currently witnessing the emergence of the Mobile Internet with the exponential explosion of smart phones, tablets and netbooks. However, both will be dwarfed by the anticipated emergence of the Internet of Things (IoT), in which everyday objects are able to connect to the Internet, tweet or be queried. Whilst the impact on economies and societies around the world is undisputed, the technologies facilitating such a ubiquitous connectivity have struggled so far and only recently commenced to take shape. To this end, this paper introduces in a timely manner and for the first time the wireless communications stack the industry believes to meet the important criteria of power-efficiency, reliability and Internet connectivity. Industrial applications have been the early adopters of this stack, which has become the de-facto standard, thereby bootstrapping early IoT developments with already thousands of wireless nodes deployed. Corroborated throughout this paper and by emerging industry alliances, we believe that a standardized approach, using latest developments in the IEEE 802.15.4 and IETF working groups, is the only way forward. We introduce and relate key embodiments of the power-efficient IEEE 802.15.4-2006 PHY layer, the power-saving and reliable IEEE 802.15.4e MAC layer, the IETF 6LoWPAN adaptation layer enabling universal Internet connectivity, the IETF ROLL routing protocol enabling availability, and finally the IETF CoAP enabling seamless transport and support of Internet applications. The protocol stack proposed in the present work converges towards the standardized notations of the ISO/OSI and TCP/IP stacks. What thus seemed impossible some years back, i.e., building a clearly defined, standards-compliant and Internet-compliant stack given the extreme restrictions of IoT networks, is commencing to become reality.
Conference Paper
Full-text available
With the deployment of high-EIRP and high-G/T L- and/or S-band geosynchronous mobile satellites such as TerreStar, ICO, and SkyTerra, low-power and long-battery-life user terminals (UTs) can be employed to provide data collection and supervisory control and data acquisition (SCADA) services. These services are typically of low data rate, low duty factor, and geographically dispersed, using unattended and battery-operated UTs, which must be low cost and have long battery life. This paper presents a unique spread scrambled coded multiple access (SSCMA) scheme to achieve the low-cost and long-battery-life objectives. This scheme employs low-rate forward error correction (FEC) coding and spectral spreading to achieve a power efficiency better than CDMA over a given bandwidth without requiring complicated power control. System design, UT characteristics, and transmission performance are presented.
Article
Full-text available
This paper considers a packet-based telecommunication network architecture suited to be used as an Environmental Monitoring System (EMS) over wide areas. It can be employed to retrieve the measures of physical quantities, such as temperature, humidity, and vibration intensity (physical information), together with the geographical position where the measures are taken (position information). The telecommunication network supporting the EMS is composed of: a network of sensors, a group of earth stations called Sinks, a satellite backbone, and a destination. Each sensor collects physical and position information, encapsulates it into packets and conveys it towards the sinks, which give access to the satellite backbone that connects the sinks to the destination. A single sensor transmits the information to all sinks but only one sink transmits it over the satellite channel. Even if the redundant transmission of the same data from more than one sink would increase the safety of the system, it would also increase its costs. The selection of the sink which forwards the information of a sensor to the destination is important to increase the performance of the EMS. This paper introduces specific performance metrics to evaluate the functionality of the whole EMS in terms of reliability, reactivity, and spent energy. The reference metrics are packet loss rate, average packet delay, and energy consumption. Then the paper presents an algorithm to select the transmitting sink for each sensor, which is aimed at maximizing the performance in terms of the mentioned metrics. The algorithm is tested through simulation.
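The sink selection over the three reference metrics can be condensed into a weighted-score rule where the lowest combined score wins. This is a hedged illustration of the selection criterion, not the paper's actual algorithm; the function name and weights are free parameters I introduce.

```python
def select_sink(sinks, w=(1.0, 1.0, 1.0)):
    """sinks: {name: (packet_loss_rate, avg_delay, energy_per_packet)};
    all three metrics are lower-is-better, so the sink minimizing the
    weighted sum is selected to forward the sensor's information."""
    return min(sinks, key=lambda s: sum(wi * m for wi, m in zip(w, sinks[s])))
```

Raising the loss weight flips the choice toward the more reliable sink even when it is slower and more energy-hungry — the reliability/reactivity/energy trade-off the paper's metrics capture.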
Conference Paper
Full-text available
This paper identifies key research issues and technologies that we envision will be important to the continued evolution of satellite networking and its integration as a core component of a future Internet that offers reliable, robust and pervasive networking and access to network services.
Article
Full-text available
To enable full mechanical automation where each smart device can play multiple roles among sensor, decision maker, and action executor, it is essential to construct scrupulous connections among all devices. Machine-to-machine communications thus emerge to achieve ubiquitous communications among all devices. With the merit of providing higher-layer connections, scenarios of 3GPP have been regarded as the promising solution facilitating M2M communications, which is being standardized as an emphatic application to be supported by LTE-Advanced. However, distinct features in M2M communications create diverse challenges from those in human-to-human communications. To deeply understand M2M communications in 3GPP, in this article, we provide an overview of the network architecture and features of M2M communications in 3GPP, and identify potential issues on the air interface, including physical layer transmissions, the random access procedure, and radio resources allocation supporting the most critical QoS provisioning. An effective solution is further proposed to provide QoS guarantees to facilitate M2M applications with inviolable hard timing constraints.
Conference Paper
Full-text available
Recent environmental monitoring systems are based on Satellite-based Sensor Networks (SSN) where earth stations (Sinks) gather messages from sensors and use the satellite channel to send sensible information to remote monitoring sites. In these systems, the Sink selection process may play a crucial role and needs to be investigated. The work includes: an introduction of the SSN scenario considered; a brief description of the Sink selection method aimed at guaranteeing the optimization of the energy consumption and, simultaneously, of the message transfer delay; a deep performance investigation, which represents the main contribution of this paper, carried out by simulation, of the studied technique.
Conference Paper
Full-text available
Recent large scale disasters have highlighted the importance of a robust and efficient public safety communication network able to coordinate emergency operations even when existing infrastructures are damaged. The Incident Area Network (IAN) is a self-forming temporary network infrastructure brought to the scene of an incident to support personal and local communications among different public safety end-users. In this work we are interested in investigating how the High Altitude Platform (HAP) can effectively support Multimedia Broadcast/Multicast Service (MBMS) in a scenario wherein the preexisting terrestrial network is not available. To this aim, we propose an efficient policy of Radio Resource Management (RRM) based on a cooperation framework between the HAP and a Mobile Ad-Hoc NETwork (MANET). The proposed solution has been successfully tested through a comprehensive simulation campaign.
Book
Although the Internet of Things (IoT) is a vast and dynamic territory that is evolving rapidly, there has been a need for a book that offers a holistic view of the technologies and applications of the entire IoT spectrum. Filling this void, The Internet of Things in the Cloud: A Middleware Perspective provides a comprehensive introduction to the IoT and its development worldwide. It gives you a panoramic view of the IoT landscape—focusing on the overall technological architecture and design of a tentatively unified IoT framework underpinned by Cloud computing from a middleware perspective. Organized into three sections, it: 1. Describes the many facets of Internet of Things—including the four pillars of IoT and the three layer value chain of IoT 2. Focuses on middleware, the glue and building blocks of a holistic IoT system on every layer of the architecture 3. Explores Cloud computing and IoT as well as their synergy based on the common background of distributed processing The book is based on the author’s two previous bestselling books (in Chinese) on IoT and Cloud computing and more than two decades of hands-on software/middleware programming and architecting experience at organizations such as the Oak Ridge National Laboratory, IBM, BEA Systems, and Silicon Valley startup Doubletwist. Tapping into this wealth of knowledge, the book categorizes the many facets of the IoT and proposes a number of paradigms and classifications about Internet of Things' mass and niche markets and technologies.
Book
"If we had computers that knew everything there was to know about things-using data they gathered without any help from us-we would be able to track and count everything, and greatly reduce waste, loss, and cost. We would know when things needed replacing, repairing or recalling, and whether they were fresh or past their best. The Internet of Things has the potential to change the world, just as the Internet did. Maybe even more so." -Kevin Ashton, originator of the term, Internet of Things. An examination of the concept and unimagined potential unleashed by the Internet of Things (IoT) with IPv6 and MIPv6. What is the Internet of Things? How can it help my organization? What is the cost of deploying such a system? What are the security implications? Building the Internet of Things with IPv6 and MIPv6: The Evolving World of M2M Communications answers these questions and many more. This essential book explains the concept and potential that the IoT presents, from mobile applications that allow home appliances to be programmed remotely, to solutions in manufacturing and energy conservation. It features a tutorial for implementing the IoT using IPv6 and Mobile IPv6 and offers complete chapter coverage that explains: What is the Internet of Things? Internet of Things definitions and frameworks. Internet of Things application examples. Fundamental IoT mechanisms and key technologies. Evolving IoT standards. Layer 1/2 connectivity: wireless technologies for the IoT. Layer 3 connectivity: IPv6 technologies for the IoT. IPv6 over low power WPAN (6lowpan). Easily accessible, applicable, and not overly technical, Building the Internet of Things with IPv6 and MIPv6 is an important resource for Internet and ISP providers, telecommunications companies, wireless providers, logistics professionals, and engineers in equipment development, as well as graduate students in computer science and computer engineering courses.
Article
The importance of quality of service (QoS) has risen with the recent evolution of telecommunication networks, which are characterised by great heterogeneity. While many applications require a specific level of assurance from the network, communication networks are characterized by different service providers, transmission means and implemented solutions such as asynchronous transfer mode (ATM), Internet protocol version 4 (IPv4), IPv6 and MPLS. Providing comprehensive coverage of QoS issues within heterogeneous network environments, "QoS Over Heterogeneous Networks" looks to find solutions to questions such as whether QoS fits within heterogeneous networks and what the impact on performance is if information traverses different network portions that implement specific QoS schemes. Includes: a series of algorithms and protocols to help solve potential QoS problems; state-of-the-art case studies and operative examples to illustrate points made; information on QoS mapping in terms of service-level specification (SLS) and an in-depth discussion of related issues. Chapters cover end-to-end (E2E) QoS, QoS architecture, QoS over heterogeneous networks, and QoS internetworking and mapping. An ideal book for graduate students, researchers and lecturers. System designers, developers and engineers will also find "QoS Over Heterogeneous Networks" a valuable reference.
Chapter
This chapter surveys basic lower-layer wireless technologies to support Internet of Things (IoT)/machine-to-machine (M2M) applications, as it appears that many such implementations will entail wireless connectivity at the physical (PHY)/media access control (MAC) layer. Personal area networks (PANs) can be used to support wireless body area networks (WBANs), but they can also be used to support other applications. The WBAN technologies can satisfy, in various degrees, major requirements that the healthcare industry considers important: (i) very low sensor power consumption, (ii) very low transmitted power, and (iii) high reliability and quality of service (QoS). The chapter focuses predominantly on PANs and 3G/4G technologies. The chapter also makes some general comparisons between some of the key PAN technologies. In the near future, M2M applications are expected to become important sources of traffic (and revenues) for cellular data networks.
Technical Report
The objective of Pre-Congestion Notification (PCN) is to protect the quality of service (QoS) of inelastic flows within a Diffserv domain in a simple, scalable, and robust fashion. This document defines the two metering and marking behaviours of PCN-nodes. Threshold-metering and -marking marks all PCN-packets if the rate of PCN-traffic is greater than a configured rate ("PCN-threshold-rate"). Excess-traffic-metering and -marking marks a proportion of PCN-packets, such that the amount marked equals the rate of PCN-traffic in excess of a configured rate ("PCN-excess-rate"). The level of marking allows PCN-boundary-nodes to make decisions about whether to admit or terminate PCN-flows.
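The excess-traffic-metering behaviour described above can be sketched as a token bucket filling at the configured PCN-excess-rate: packets that find insufficient tokens are marked and do not drain the bucket, so over time the marked amount tracks the traffic in excess of the configured rate. The class and parameter names are illustrative, not taken from the specification.

```python
class ExcessMeter:
    """Token-bucket sketch of excess-traffic-metering: tokens accrue at the
    configured rate up to `depth`; a packet without enough tokens is marked."""

    def __init__(self, rate, depth):
        self.rate, self.depth = rate, depth
        self.tokens, self.last = depth, 0.0

    def packet(self, t, size):
        # refill tokens for the elapsed time, capped at the bucket depth
        self.tokens = min(self.depth, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= size:
            self.tokens -= size
            return False   # within the configured rate: leave unmarked
        return True        # excess traffic: mark, without draining tokens
```

At rate 1 with unit packets arriving every 0.1 s, the first two packets pass on the initial bucket depth and subsequent ones are marked, i.e. roughly the traffic in excess of the configured rate gets marked.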
Conference Paper
In this paper, an extended version of the previously proposed CRDSA protocol is presented. The enhanced CRDSA protocol, dubbed CRDSA++, extends the original CRDSA concept to more than two replicas (e.g., 4 or 5) and exploits power fluctuations in the received signal to further boost the performance of the CRDSA protocol. The paper provides a wide range of performance and sensitivity analyses with respect to the number of replicas and power unbalance. It is shown that a packet loss ratio (PLR) below 10^-4 can be achieved with a normalized throughput in excess of 1 packet per slot. This represents remarkable performance for an open-loop random access (RA) scheme without any kind of access control over a medium shared by a large population of terminals. Finally, the paper also provides a preliminary discussion of the implementation aspects of the CRDSA++ RA scheme, as well as of the integration aspects with a DAMA protocol to support a wider range of service scenarios with improved MAC-layer performance.
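The core of CRDSA's gain is iterative successive interference cancellation (SIC) over a frame: any slot containing exactly one not-yet-decoded replica yields a decode, whose other replicas are then cancelled elsewhere, potentially freeing further slots. The sketch below is idealized (no capture effect, perfect cancellation) and separates the SIC resolver from a random frame generator; the names are mine.

```python
import random

def sic_decode(slots):
    """Iterative SIC: repeatedly decode any slot whose set of still-undecoded
    users has size one, then cancel that user's replicas in every slot."""
    decoded, progress = set(), True
    while progress:
        progress = False
        for s in slots:
            live = s - decoded
            if len(live) == 1:
                decoded |= live
                progress = True
    return decoded

def crdsa_frame(n_users, n_slots, replicas=2, rng=random):
    """Each user transmits `replicas` copies of its packet in distinct
    randomly chosen slots of the frame."""
    slots = [set() for _ in range(n_slots)]
    for u in range(n_users):
        for s in rng.sample(range(n_slots), replicas):
            slots[s].add(u)
    return slots
```

sic_decode resolves the chain {0,1}, {1}, {0,2}, {2} completely, while two users that picked identical slot pairs form a stopping set and nothing is decoded — the collision floor that the extra replicas of CRDSA++ are meant to push down.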
Conference Paper
The work aims at discussing possible engineering solutions to guarantee interoperability between smartphones and WSNs under high mobility levels. The paper discusses an application scenario where some of the most interesting standards for WSNs can be effectively employed together with smartphones. The architectures supporting interoperable smartphone/WSN systems, considering network topology and the functionalities of the nodes, are presented together with current network solutions for interoperability.
Conference Paper
A summary of the main points arising in a study conducted by Beecham Research for the European Space Agency - SAMOS (SAtellite M2m Observatory Study). This provides the basis for the conference presentation slides.
Article
Recently, the concept of the Internet of Things, referred to as IoT, has drawn a great deal of research attention for realizing an intelligent society. The IoT is expected to comprise millions of heterogeneous smart "things" having sensor terminals, which may collect various types of information. By sending this collected information via the IoT, it is possible to construct many smart systems, e.g., automatic prevention of traffic jams and so on. However, the coverage of the IoT and the capacity of its ground networks are still not capable enough to connect the numerous devices and terminals deployed all over the world. Therefore, there is an urgent need for effective data collection in such systems. In this paper, we focus on effective data collection by a satellite-routed sensor system, which makes it possible to gather data from wide areas arbitrarily yet efficiently. However, multiple accesses from the numerous things to a satellite result in data collisions and increase the delay. For effectively resolving the problem of data collisions, we envision a new method, which utilizes a "divide and conquer" approach to collect data from the numerous things based upon demand. Also, we mathematically demonstrate how to optimize the operating time of our proposed method. The effectiveness of our proposal is evaluated through numerical results.
Article
The passive nature of power distribution networks has been changing to an active one in recent years as the number of small-scale Distributed Generators (DGs) connected to them rises. The consensus of recent research is that the current slow central network control based on Supervisory Control and Data Acquisition (SCADA) systems is no longer sufficient, and Distribution Network Operators (DNOs) wish to adopt novel management mechanisms coupled with advanced communication infrastructures to meet the emerging control challenges. In this paper, we address this issue from the communication perspective by examining the effectiveness of using a Low Earth Orbit (LEO) satellite network as the key component of the underlying communication infrastructure to support a recently suggested active network management solution. The key factors that affect communication performance over satellite links are discussed and an analytical LEO network model is presented. The delivery performance of several major data services supporting the management solution is evaluated against a wide range of satellite link delay and loss conditions, under both normal and emergency traffic scenarios, through extensive simulation experiments. Our investigation demonstrates encouraging results, which suggest that a LEO network can be a viable communication solution for managing next-generation power networks.
Article
Global mobile satellite communications (GMSC) are satellite communication systems specific to maritime, land, and aeronautical applications. They enable connections between moving objects such as ships, vehicles, and aircraft and telecommunications subscribers through the medium of communications satellites, ground earth stations, PTTs, or other landline telecommunications providers. Mobile satellite communications and technology have been in use for over two decades. Their initial application was aimed at the maritime market for commercial and distress applications.
Article
This paper presents the vision of the Integral SatCom Initiative (ISI) European Technology Platform about the role that Satellite Networks can play towards a global architecture of Future Internet. The current state-of-the-art in Satellite Networks is briefly discussed, the key elements of Satellite Networks in the realisation of Future Internet objectives are highlighted and specific R&D directions are provided in order to achieve these objectives.
Article
Power system automation technologies are essential for power system operation and control. A variety of communication systems, including optical fiber, microwave, and power line communications, are used to transmit EMS/SCADA data. Among these modes, fiber communication is used to transmit IEC 101/104 protocol data in most power grids due to its large bandwidth and high reliability. However, fiber is very fragile when large natural disasters happen: earthquakes, snow-ice storms, and hurricanes can destroy the infrastructure of fiber systems. In some difficult geographical areas, such as the mountainous West Sichuan Plateau, the construction and operation of fiber communications are very costly and difficult. After the 2008 5.12 Sichuan Earthquake, Sichuan Electric Power Corporation started to investigate alternative communication systems resilient to natural disasters. Compared to traditional fiber communications, VSAT (Very Small Aperture Terminal) satellite communication has three advantages: (1) the setup of a VSAT station is convenient and flexible; (2) the VSAT system is robust against natural disasters since it does not need wired cable or fiber; (3) the setup and operation of a VSAT system are cost effective in some difficult geographic areas. In Sichuan Power, we propose a new data network for EMS/SCADA data transmission based on VSAT systems. The structure of our system is a star topology: we place VSAT communication terminals at substations and power plants in areas that cannot afford fiber infrastructure economically or geographically, and a central VSAT station is set up at the Power Dispatching Center. All data in the VSAT system are carried over TCP/IP. The first problem for such a system is to select a suitable VSAT communication modality.
We chose SCPC (Single Carrier Per Channel) over TDMA (Time-Division Multiple Access) as our preferred modality because SCPC provides a reliable, persistent, and easy-to-manage data link; a detailed analysis is included in the full text. For the IEC 101 protocol, the traditional 2M channel is converted to IP packets to access the VSAT system; for the IEC 104 protocol, data is carried into the VSAT system directly through Ethernet connections. Since the satellite link is not stable, due to a variety of environmental reasons, traditional TCP may suffer large transmission delays or even fail in such environments. To meet the transmission-delay requirements of the IEC protocols, we developed a TCP/IP accelerator dedicated to satellite communication based on SCPS-TP (Space Communications Protocol Standards - Transport Protocol). SCPS-TP is a modified version of the traditional TCP/IP protocol adapted to space communications, and it accounts for packet loss due to satellite link outages. An SCPS-TP-based TCP/IP accelerator is added to each VSAT terminal in order to minimize the transmission delay. Our experience shows that such a VSAT system is very useful and robust for protecting power systems against disasters.
Article
The work presented here describes the key design drivers and performance of a high-efficiency satellite mobile messaging system well adapted to machine-to-machine communication services targeting, in particular, the vehicular market. It is shown that the proposed return-link multiple access solution provides a random access channel (RACH) aggregated spectral efficiency of around 2 bit/s/Hz in the presence of power unbalance, with reliable packet delivery over typical land mobile satellite (LMS) channels.
Article
Contention resolution diversity slotted ALOHA (CRDSA) is a simple but effective improvement of slotted ALOHA. CRDSA relies on MAC burst repetition and on interference cancellation (IC), achieving a peak throughput T ≅ 0.55, whereas for slotted ALOHA T ≅ 0.37. In this paper we show that the IC process of CRDSA can be conveniently described by a bipartite graph, establishing a bridge between the IC process and the iterative erasure decoding of graph-based codes. Exploiting this analogy, we show how a high throughput can be achieved by selecting variable burst repetition rates according to given probability distributions, leading to irregular graphs, and we provide a framework for optimizing the probability distribution. Based on that, we propose a novel scheme, named irregular repetition slotted ALOHA, that can achieve a throughput T ≅ 0.97 for large frames and close to T ≅ 0.8 in practical implementations, a gain of ~45% w.r.t. CRDSA. An analysis of the normalized efficiency is introduced, allowing performance comparisons under the constraint of equal average transmission power. Simulation results, including an IC mechanism described in the paper, substantiate the validity of the analysis and confirm the high efficiency of the proposed approach down to a signal-to-noise ratio as low as Eb/N0 = 2 dB.
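The graph-based IC process described above can be reproduced in a compact Monte Carlo sketch (an idealized illustration assuming perfect interference cancellation, not the paper's analytical framework): each user transmits `d` replica bursts in a frame, and the receiver repeatedly decodes any singleton slot and cancels that user's remaining replicas.

```python
import random

def simulate_irsa(n_slots, load, dist, rng):
    """One frame of repetition slotted ALOHA with iterative IC.
    dist: list of (repetition degree, probability) pairs."""
    n_users = int(load * n_slots)
    degrees = [d for d, _ in dist]
    weights = [p for _, p in dist]
    slots = [set() for _ in range(n_slots)]     # users present in each slot
    user_slots = []
    for u in range(n_users):
        d = rng.choices(degrees, weights=weights)[0]
        chosen = rng.sample(range(n_slots), d)  # distinct slots for replicas
        user_slots.append(chosen)
        for s in chosen:
            slots[s].add(u)
    decoded, progress = set(), True
    while progress:                             # iterative IC sweep
        progress = False
        for s in range(n_slots):
            if len(slots[s]) == 1:              # singleton slot: decodable burst
                u = next(iter(slots[s]))
                decoded.add(u)
                for s2 in user_slots[u]:        # cancel all of u's replicas
                    slots[s2].discard(u)
                progress = True
    return len(decoded) / n_slots               # throughput, packets per slot
```

With `dist=[(2, 1.0)]` this reduces to CRDSA; an irregular distribution such as the paper's Λ(x) = 0.5x² + 0.28x³ + 0.22x⁸ is expressed as `[(2, 0.5), (3, 0.28), (8, 0.22)]`.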
Conference Paper
Past crises all over the world (9/11, tsunamis, etc.) have highlighted the importance of having critical, up-to-date information available in real time at the place where it is required. The key need is, consequently, for reliable communication systems to coordinate emergency operations even when existing infrastructures are damaged. In this case, rapidly deploying ad-hoc networks in the disaster area is a feasible solution. Incident Area Networks (IANs) are self-forming temporary network infrastructures brought to the scene of an incident to support personal and local communications among different public safety end-users. This paper aims at defining the system architecture of IANs in which cooperation among terminals is foreseen. Furthermore, several open issues are identified and possible solutions are proposed.
Conference Paper
In this paper, we investigate the applicability of the 3GPP Long Term Evolution (LTE) standard to mobile satellite systems. In order to counteract satellite-specific propagation conditions, we propose a novel inter-TTI (transmission time interval) interleaving technique that breaks the channel correlation in slowly time-varying channels. Inter-TTI transmission is achieved by reusing the existing HARQ facilities provided by the LTE physical layer. Simulation results show that the proposed technique can successfully improve the performance of the LTE physical layer when applied to transmission over satellite links. The performance improvement depends on the considered parameters, while complexity can be traded off against data rate to achieve the best quality of service for the considered system.
Article
Phasor-based wide-area monitoring and control (WAMC) systems are becoming a reality with increased research, development, and deployments. Many potential control applications based on these systems are being proposed and researched. These applications are either local, using data from one or a few phasor measurement units (PMUs), or centralized, utilizing data from several PMUs. A less well researched aspect of these systems is their dependence on high-performance communication systems. This paper presents the results of research performed to determine transmission system operators' requirements on the performance of WAMC systems in general, as well as the characteristics of communication delays incurred in centralized systems that utilize multiple PMUs distributed over a large geographic area. It summarizes the requirements of transmission system operators with regard to a specific set of applications, together with simulations of communication networks focusing on centralized applications. The simulation results indicate that the configuration of central nodes in centralized WAMC systems needs to be optimized based on the intended WAMC application.
Article
This paper addresses the issue of delivering solutions that will enable the incremental implementation of inter-domain quality of service (QoS) in the multi-provider commercial Internet. The paper first introduces a holistic architecture that describes the key functions required to support inter-domain QoS, and then proceeds to present results from two major components of the architecture. A genetic algorithm for QoS-aware offline inter-domain traffic engineering is first presented, and it is shown through simulation studies how this can optimise the apportionment of QoS provisioning between adjacent domains. Secondly, QoS enhancements to BGP are proposed and the results of a testbed implementation are described, demonstrating how this QoS-enhanced BGP can deliver inter-domain QoS routing.
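A miniature genetic algorithm conveys the flavour of the offline apportionment step (a hypothetical sketch; the paper's encoding and cost model are more elaborate): each chromosome is a split of the end-to-end delay budget across domains, and selection favours low per-domain provisioning cost.

```python
import random

def ga_apportion(n_domains, budget, cost, rng, pop=30, gens=60):
    """Evolve a split of `budget` across `n_domains` that minimizes `cost`."""
    def normalize(x):                 # rescale so allocations sum to budget
        s = sum(x)
        return [budget * v / s for v in x]
    population = [normalize([0.01 + rng.random() for _ in range(n_domains)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=cost)     # lowest provisioning cost first
        elite = population[: pop // 2]                # elitist selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_domains)
            child = a[:cut] + b[cut:]                 # one-point crossover
            i = rng.randrange(n_domains)              # multiplicative mutation
            child[i] *= 1 + rng.uniform(-0.2, 0.2)
            children.append(normalize(child))
        population = elite + children
    return min(population, key=cost)
```

For example, with `cost = lambda a: sum(1.0 / x for x in a)` (a convex cost that punishes starving any one domain) the GA settles near an equal split of the budget.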
Article
Quality of Service (QoS) provision requires the cooperation of all network layers from bottom to top. More specifically, for Broadband Satellite Multimedia (BSM) communications, the physical layers (strictly satellite dependent) are isolated from the rest by a Satellite Independent Service Access Point (SI-SAP), which should offer specific QoS to IP and the upper layers. The structure of the SI-SAP and the BSM protocol model "opens the door" to the problem of mapping the performance requests of Satellite Independent layers onto Satellite Dependent technology. In this context, a QoS mapping problem arises when different encapsulation formats are employed along the protocol stack of the SI-SAP interface, for instance, when IP packets are transferred over an ATM, DVB, or MPLS core network. In this perspective, we investigate a novel control algorithm to estimate the bandwidth shift required to keep the same performance guarantees, independent of the technology change. By exploiting Infinitesimal Perturbation Analysis to capture the network performance sensitivity, we obtain an adaptive control law suitable for on-line control on the basis of traffic samples acquired during the network evolution. Owing to the generality of the mathematical framework under investigation, our control mechanism can be generalized to other network scenarios and functional costs.
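The adaptive control law can be caricatured as a stochastic-approximation loop (a simplified sketch with a synthetic loss model; in the paper the sensitivity term comes from Infinitesimal Perturbation Analysis applied to live traffic samples): the bandwidth allocation is nudged until the measured metric again meets the guarantee that held before the encapsulation change.

```python
import math

def adapt_bandwidth(b, target, measure, sensitivity, gain=1.0, steps=200):
    """Robbins-Monro style loop: drive measure(b) to `target` using an
    estimate of d(metric)/d(bandwidth) (IPA-derived in the paper)."""
    for k in range(1, steps + 1):
        err = measure(b) - target     # deviation from the QoS guarantee
        grad = sensitivity(b)         # sensitivity of the metric to bandwidth
        b -= (gain / k) * err / grad  # decreasing step size for convergence
    return b

# Synthetic example: loss decays exponentially with bandwidth, and the
# old guarantee was loss = exp(-2); the loop should settle near b = 2.
b_star = adapt_bandwidth(1.0, math.exp(-2),
                         measure=lambda b: math.exp(-b),
                         sensitivity=lambda b: -math.exp(-b))
```

The decreasing step size `gain / k` is the standard stochastic-approximation choice: large early corrections, then progressively finer adjustments as the allocation approaches the point where the guarantee holds again.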
Article
In this paper, the main performance characteristics of communication services based on low Earth orbit (LEO) satellites are analyzed in order to evaluate their suitability for a typical set of power system monitoring functionalities. As an experimental test bed, an intelligent electronic device (IED) for remote monitoring and protection of power components, equipped with a bi-directional communication system based on the LEO satellites of the Globalstar® consortium, has been prototyped. Using this facility, the main parameters characterizing the performance of the satellite TCP/IP services have been evaluated: in particular, the connection time, the degradation of service, and the data latency for both packet and asynchronous data services. The experimental results show that LEO-satellite-based communication technologies exhibit a set of intrinsic advantages that could be particularly useful in several fields of power system communication.
Conference Paper
In this paper, an analysis of the use of Adaptive Coding and Modulation (ACM) techniques for EHF satellite communications is presented. In particular, our analysis focuses on W-band channels and includes the main channel impairments in this frequency band, i.e., rain fading and HPA non-linearity. The aim of the analysis is to identify modifications to the ACM mode-switching algorithm that optimize its use at these high frequency bands.
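A typical ingredient of such mode-switching modifications is a margin-plus-hysteresis rule (a generic sketch; the thresholds below are idealized DVB-S2-style values used purely as placeholders, not the paper's W-band design): the terminal only switches to a more efficient ModCod once the estimated SNR clears the threshold by a fade margin plus a hysteresis band, which prevents oscillation on slow rain-fade slopes.

```python
# Hypothetical ModCod table: (name, required SNR in dB, efficiency bit/s/Hz)
MODCODS = [("QPSK 1/4", -2.35, 0.49),
           ("QPSK 1/2", 1.00, 0.99),
           ("8PSK 2/3", 6.62, 1.98),
           ("16APSK 3/4", 10.21, 2.97)]

def select_modcod(snr_db, current, margin_db=0.5, hysteresis_db=1.0):
    """Return the index of the ModCod to use next, given the estimated SNR
    and the index of the currently active ModCod."""
    feasible = [i for i, (_, thr, _) in enumerate(MODCODS)
                if snr_db >= thr + margin_db]
    if not feasible:
        return 0  # deep fade: fall back to the most robust mode
    best = max(feasible)  # most efficient mode whose threshold is met
    if best > current and snr_db < MODCODS[best][1] + margin_db + hysteresis_db:
        return current    # defer the upward switch until SNR clears hysteresis
    return best           # downward switches are applied immediately
```

The asymmetry is deliberate: stepping down is immediate to protect the link during a fade, while stepping up waits out the hysteresis band so a slowly varying SNR does not cause rapid mode toggling.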