Combining Thermal and Visual Imaging in Spacecraft
Proximity Operations
Giovanni B. Palmerini
DIAEE – Astronautics, Electrical Engineering and Energetics Dept.
Sapienza Università di Roma
Rome, Italy
giovanni.palmerini@uniroma1.it
Abstract—Advances in multiple spacecraft operations require precise determination of the relative kinematic state of the platforms involved. Such knowledge – above all in the case of non-cooperative approaches – can be usefully attained by means of imaging. However, the extremely variable lighting conditions in the orbital environment do not allow for extended operations in the visible spectrum. The paper investigates the possible coupling of thermal imagery with visible imagery in order to achieve continued relative tracking even during eclipse. The focus is on simple camera devices, suitable for mounting onboard microsatellites, and on the combined (visual and infrared) image processing and data fusion, helped by purposeful modeling of the spacecraft dynamics and bus signature.
Keywords— microsatellite onboard camera, formation flying
navigation, in-space thermal imaging, infrared/visible data fusion
I. INTRODUCTION
Advances in space operations call for a number of
maneuvers where the accurate knowledge of the relative
kinematic state (including position and velocity as well as
attitude and attitude rates) is required onboard to drive
autonomous guidance, navigation and control (GNC) loops.
Examples of these maneuvers are represented by rendezvous
and docking between spacecraft and to the International Space
Station (ISS), by in situ servicing operations and by grasping of
space debris with the help of robotic manipulators. The
required accuracy depends on the phase of the maneuver, and can be defined – in the frame of proximity operations – as ranging from tens of meters down to the centimeter level or below.
Specifically, in case of intermediate phases of rendezvous [1]
or close formation flying [2], the distance between the involved
spacecraft is in the order of hundreds of meters or less and the
requested accuracy is better than one meter. A decrease of two to three orders of magnitude in both figures should be considered in final rendezvous phases, as well as in grasping operations by means of robotic manipulators.
These accuracies can be difficult and expensive to obtain by means of active sensors, as they would present bounded fields of view and ranges, and should coexist with the power limitations typical of space applications. At the same time, common radio frequency ranging, with transmitting source and receiver located onboard the two platforms, is unsuitable in non-cooperative maneuvers, including debris capture but also operations with older or unprepared spacecraft. Instead, visual
imaging is clearly a suitable option, and the implementation of
closed-loop systems has been studied in depth and then applied
in cooperative rendez-vous (see [3] for a proposed real time
solution process in the ISS case). On the other hand, studies are
currently in progress to better define the characteristics of
visual navigation in non-co-operative formation flying [4].
Visual imaging is of course limited to the fraction of the orbit exposed to sunlight, which ranges between 75% and 80% of the orbital period for low Earth circular orbits, i.e. the ones where the proximity condition is meaningful and expected.
Moreover, the actual suitability of the technique can be further reduced by blinding and outages. Aiding by other
sensors, even with a poorer accuracy, is needed to enlarge this
availability window, in order to attain a continuous tracking of
the relative position. Furthermore, this aiding could provide a
coarse but useful estimate in direction and distance for the
visual imaging system to capture the target as soon as the
spacecraft comes out from the eclipse.
Due to the fact that, in the case of interest, the two platforms involved move along similar orbits, with almost equal light/eclipse cycles, and to the latency proper to thermal phenomena, a significant infrared (IR) signature is granted even in case one of the bodies is a spent satellite or, more generally, debris, i.e. in the non-cooperative case. Therefore,
thermal imaging [5] could provide the required aiding. The
typically poorer accuracy of thermal camera information is not expected to improve the quality of the results, but at least to allow for continuous observation and some ranging data. If a
target’s thermal model should be available, or roughly inferred
from a sequence of thermal and visual images, the
measurements’ scenario would change, and even a continuous
complete kinematic state, including attitude and attitude rate,
could be estimated.
This paper is intended to discuss these topics with specific
reference to simple imaging hardware, fitting the requirements
in terms of volume, mass and power that are typical of
microsatellites, i.e. spacecraft in the order of tens of kilograms,
where these sensors are not considered to be the main payload.
The goal is to evaluate possible advantages of a combination of
measurement techniques in the visible and infrared spectra. Short notes on the issues related to the thermographic camera [Section II] will first be provided, inspired by recently available low-cost, uncooled sensors. Similarly, the
sensing part in the visible band [Section III] will be briefly
introduced. It has to be stressed that the reference to hardware is aimed at handling typical microsatellite limitations as well as at indicating a class of equipment that could be easily adopted in lab experiments. The processing of the images in both bands,
discussed together with the relevant hardware, would differ
depending upon the previous availability of a reasonable model
for the target or the need to build it on-the-fly by means of the
observations. In this paper, the application to spacecraft
formation flying with a previous knowledge of the size of the
chaser satellite will be the main focus. Details of the
mathematical model assumed for the dynamics are included,
and the observation conditions are depicted [Section IV]. The
fusion of the information generated by the two imaging
systems with different accuracy will be performed on the basis
of Kalman filtering [Section V]. Finally, the analysis of the
results obtained by numerical simulations will indicate if the
performance can be deemed of interest to enable close
formation flying or proximity operations by microsatellites.
II. THERMAL IMAGING IN SPACE
A. Hardware
The thermographic device is considered to be based on standard microbolometers, recently made available for consumer applications, operating in the 8 to 14 μm band with a resolution of 160 x 220 pixels. To simplify the design and reduce the constraints on the accommodation, the detector can be uncooled, a reasonable choice given the typical temperature range of satellites and the capability to properly limit – by passive means – the thermal excursion. The lens is also relatively simple, with a limited focal length.
B. Image processing
With respect to the processing, the first constraint is given by the refresh rate of the camera, which is considered capable of delivering readings at a 9 Hz rate (standard performance). Looking at the understanding of the image
content, the two cases of availability/unavailability of a thermal
model for the target should be discussed separately.
In case of a model already available, the idea is to process
first a binary version of the thermal image in order to define the
size mask of the target spacecraft and to evaluate its distance
from the imaging satellite. Such a binary image could be obtained by a quite rough threshold, since within the considered scenario the background is at 4 K, except in the critical condition where the Sun is at local zenith, behind the imager, and the target, at nadir, has the Earth in the background. Even in that case, however, the zenith side of the target should be hotter (by at least 10 K) than the Earth background (290 K), while the thermal resolution of such cameras is markedly better than 1 °C. Then, strictly depending on the characteristics of the specific target, the Hough transform can be applied to images generated with a sequence of different thresholds, to identify geometric features with similar temperature, such as edges in conductive materials.
Basic knowledge in spacecraft thermal control systems [6]
greatly helps in obtaining a sketch of the structure behavior and
of the temperature values attained. This approximate
knowledge, which of course should take into account the
spacecraft’s materials, remains significant even with respect to
poorer detail typical of thermal models. Indeed, the slow
variation of temperature and the difference with the cold 4 K
background are factors that strongly increase the observability.
Relative position and attitude can then be computed. Note that the benchmark images allowing the identification of possible features are represented by a sequence of thermal pictures of the spacecraft at different orientations, stored onboard [a specific approach to limit the size of the benchmark data, considering a number of different rotation angles as well as the intermediate ones, will be considered]. Differences between the features' size/position identified in successive images produce the rate information needed to complete the state estimate.
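The first step above – isolating the warm target against the cold-space background and measuring its apparent size – can be sketched as follows. The frame values, threshold level and synthetic 160 x 220 scene are illustrative assumptions, not flight parameters:

```python
import numpy as np

def target_mask(thermal_frame, threshold_k=150.0):
    """Binary mask separating the (warm) target from the cold-space
    background. threshold_k is an illustrative cut, well above the 4 K
    deep-space background and well below typical bus temperatures."""
    return thermal_frame > threshold_k

def mask_extent_pixels(mask):
    """Apparent size of the target: side of its bounding box, in pixels."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return 0
    return max(rows.max() - rows.min() + 1, cols.max() - cols.min() + 1)

# Synthetic 160x220 frame: 4 K background with a 40-pixel warm square
frame = np.full((160, 220), 4.0)
frame[60:100, 90:130] = 280.0
mask = target_mask(frame)
print(mask_extent_pixels(mask))  # -> 40
```

The pixel extent returned here is what feeds the ranging relation discussed later (Eq.(2)), once converted to a length on the detector.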
On the other hand, the case in which no model is available approaches a blind start. Only the first operation, i.e. the identification of the coarse mask of the target, is attempted, and by differencing successive images the (not exhaustive) information on the range rate can be obtained. More elaborate operations, aiming to compute the relative state, are delayed until a number of visible images – easier to understand – become available, and a rough thermal model can be inferred by operators. Indeed, a completely autonomous relative navigation procedure with respect to a previously unknown orbiting object seems unlikely.
III. IMAGING IN THE VISIBLE BAND
A. Hardware
The visual imaging hardware is represented by
microcamera class devices, which already proved successful
onboard several microsatellites. As in the case of the thermal camera, the focal length is short in order for the camera to be easily accommodated. The available resolution is instead markedly better than that of thermographic cameras, easily attaining 640x480 (320x240 can be considered, in agreement with previous works [4], [7], [8]).
B. Image processing
At least two techniques to identify the target in the captured
images are available and proved to be successful in similar
studies: the Scale Invariant Feature Transform (SIFT) [7] and
the Hough transform [3]. The larger computational burden associated with SIFT, and a preliminary preference for using the same algorithm in the two bands, suggested adopting the Hough transform. In fact, SIFT – with the definition of the descriptors at the key points based on the intensity gradient – has been deemed not robust enough with respect to the poorer quality of the thermal images.
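A minimal version of the Hough voting scheme adopted above can be sketched in a few lines of numpy. The accumulator resolution and the synthetic edge image are illustrative assumptions; a real implementation would typically rely on an optimized library routine:

```python
import numpy as np

def hough_lines(edge_mask, n_theta=180, n_rho=200):
    """Minimal Hough accumulator for straight edges in a binary image.
    Each edge pixel votes for all (rho, theta) lines passing through it."""
    h, w = edge_mask.shape
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.hypot(h, w)
    ys, xs = np.nonzero(edge_mask)
    acc = np.zeros((n_rho, n_theta), dtype=np.int32)
    for theta_idx, theta in enumerate(thetas):
        rho = xs * np.cos(theta) + ys * np.sin(theta)
        rho_idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        np.add.at(acc, (rho_idx, theta_idx), 1)  # accumulate votes
    return acc, thetas, rho_max

# A horizontal edge (y = 20) should produce a strong peak at theta = 90 deg
img = np.zeros((64, 64), dtype=bool)
img[20, :] = True
acc, thetas, rho_max = hough_lines(img)
rho_i, theta_i = np.unravel_index(acc.argmax(), acc.shape)
print(round(np.degrees(thetas[theta_i]), 6))  # -> 90.0
```

Peaks in the accumulator correspond to the dominant straight edges of the bus, which is what makes the cubic-structure features of Fig.1 recoverable in both bands.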
Figure 1. A simple sketch of a side of a cubic microsatellite (side length ≈ 1 m). The inner part, largely covered with solar cells, will have very different optical and thermal properties with respect to the outer part, where the aluminum structure is exposed. Additional components, such as a patch antenna (pink) or a mechanism (grey), also appear.
In some way, the issue of the previously available model turns out to be less relevant thanks to the increased resolution. It is assumed that the identification of the elements will be easier in visible than in infrared captures, and that smaller details (for example the patch antenna in Fig.1), once recognized in a sequence of images, can even help in attitude determination.
IV. APPLICATION TO PROXIMITY FORMATION FLYING
The process to determine the relative kinematic state
between two satellites, one of them being the observer (deputy
or chaser) and the other labeled as chief or target, conveniently
builds upon the available knowledge about their relative
dynamics. The behavior of two spacecraft flying in close
formation is well represented by the linear model known as
Euler-Hill (EH) or Clohessy-Wiltshire equations [9]. With the
assumption of the chief belonging to a circular orbit (Fig.2),
and labeling as x and y the in-plane coordinates of the chaser in
a local vertical, local horizontal frame centered in the chief, the
dynamics allowing for closed relative trajectories is given in
the orbital plane by:
x(t) = (ẋ0/ω) sin ωt − (3x0 + 2ẏ0/ω) cos ωt + 2(2x0 + ẏ0/ω)
y(t) = 2(3x0 + 2ẏ0/ω) sin ωt + (2ẋ0/ω) cos ωt + y0 − 2ẋ0/ω     (1)

where ω = √(μ/r³) is the mean motion along the orbit of radius r and μ = 398600 km³ s⁻² the gravitational parameter, while the index 0 refers to the initial conditions (closure of the relative trajectory requires ẏ0 = −2ωx0, which removes the along-track secular drift). The resulting orbit –
sketched in Fig.3 in the Earth centered inertial (ECI) frame,
with the deputy moving on a slightly eccentric path - is an ideal
one, valid as far as the linearization of the differential Earth
gravitational attraction between the positions of the two
satellites holds.
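The closed relative trajectory of Eq.(1) can be propagated numerically with a few lines of code. The chief orbit radius and the initial offsets below are illustrative assumptions; imposing the closure condition ẏ0 = −2ωx0 removes the along-track drift, so the relative orbit repeats after one period:

```python
import numpy as np

MU = 398600.0  # km^3/s^2, Earth gravitational parameter

def cw_closed(t, r, x0, y0, xdot0):
    """In-plane Clohessy-Wiltshire solution of Eq.(1) under the closed-
    trajectory condition ydot0 = -2*omega*x0 (no along-track drift).
    r is the chief orbit radius in km; x is radial, y along-track."""
    w = np.sqrt(MU / r**3)          # mean motion
    ydot0 = -2.0 * w * x0           # closed-orbit condition
    x = (xdot0 / w) * np.sin(w * t) - (3*x0 + 2*ydot0/w) * np.cos(w * t) \
        + 2 * (2*x0 + ydot0/w)
    y = 2 * (3*x0 + 2*ydot0/w) * np.sin(w * t) + (2*xdot0/w) * np.cos(w * t) \
        + y0 - 2*xdot0/w
    return x, y

r = 6778.0                           # km, roughly a 400 km altitude orbit
T = 2*np.pi / np.sqrt(MU / r**3)     # orbital period
x0, y0, xdot0 = 0.05, 0.0, 0.0       # start 50 m above the target
xT, yT = cw_closed(T, r, x0, y0, xdot0)
print(abs(xT - x0) < 1e-9, abs(yT - y0) < 1e-9)  # -> True True
```

After one full period the deputy returns to its initial offset, which is the closed-trajectory behavior sketched in Fig.3.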
This simple behavior, successfully adopted in the preliminary design of several missions, will actually be affected by orbital perturbations. Fig.4, plotted in the local frame centered in the chief location and obtained by adding to the EH model an approximate, but reasonably evaluated, effect of the differential solar radiation pressure, shows that the real relative trajectory does not close exactly. The knowledge in time of the relative kinematic state is therefore extremely significant to exploit GNC loops performing designed maneuvers, or even to safely maintain the orbital configuration.
With respect to the attitude, the chief is assumed to constantly point the Earth (nadir pointing), while the deputy is commanded to constantly point to the center of mass of the target and to maintain it at the center of its field of view. Furthermore, a special initial condition can be selected to produce easier-to-compare data without losing the general value of the simulations. Such a condition assumes that the deputy starts along the radial direction above the target, and the following simulations will refer to the initial configuration where Earth, target, chaser and Sun are aligned in the given order.
This configuration (position * in Fig.3) corresponds to the
critical condition previously recalled and, due to the common
orbital period of the two spacecraft, ensures that blinding (Sun
and target falling in the sensor’s field of view at the same time)
will be avoided.
V. VISUAL / INFRARED OBSERVABLES
The observables of the two different imaging systems will follow, in this preliminary analysis, separate paths, meaning that each of them will be analyzed to provide distinct information about the target. The observables themselves are equal in nature for the two systems, while they differ in accuracy (the visible one can easily be assumed to be the better) and in availability (the infrared sensor output is continuous, while the visible one is not).
The first observable is the distance to the target, which can be obtained by evaluating the size of the target body in terms of pixels of the digital image, once the value (or a good estimate) of the real chief spacecraft size is known.
The distance to the target will be given by

d = √(x² + y²) = (L_real / L_image) · f     (2)
where f is the focal distance. Due to the (strong) hypothesis of a perfectly centered view, and to the cubic shape assumed for the target (Fig.1), the characteristic dimension, i.e. the length of the square side as represented in the image, L_image, can be measured and compared to the real one, L_real (the overall problem practically becomes two-dimensional).
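Eq.(2) reduces to a one-line computation once the apparent size is converted from pixels to a length on the detector. The pixel pitch and focal length below are illustrative assumptions for a small uncooled camera, not figures taken from the paper:

```python
def range_from_apparent_size(l_real_m, l_image_px, pixel_pitch_m, focal_m):
    """Eq.(2): range to the target from its apparent size in the image.
    L_image = l_image_px * pixel_pitch is the side length on the detector.
    Pixel pitch and focal length are illustrative assumptions."""
    l_image_m = l_image_px * pixel_pitch_m
    return l_real_m / l_image_m * focal_m

# 1 m cube spanning 25 pixels of a 17 um-pitch detector, f = 9 mm
d = range_from_apparent_size(1.0, 25, 17e-6, 9e-3)
print(round(d, 1))  # -> 21.2 (meters)
```

The same relation, read in reverse, also indicates at which range the 160 x 220 thermal detector stops resolving the target with enough pixels for feature extraction.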
There is also a second observable, whose determination is based on the capabilities of the imaging systems. In fact, a generic image will include two sides of the cubic microsatellite structure. As these two sides receive very different illumination and can have different exposure to the surrounding environment, they will have different signatures in both the visible and infrared bands.
Figure 2. A sketch of the orbital configuration analyzed for the dual band image-based navigation.
Figure 3. Chief (target, with constant nadir-pointing attitude) and deputy (in green, pointing constantly to the target) trajectories in the common orbital plane. Note that the chaser is never blinded by the Sun while observing the target, and sees the Earth in the background only when the Sun is behind it, which is also the initial condition labeled as *.
Figure 4. Relative trajectory in the chief-centered frame (tangential vs. radial, km). Small triangles indicate that the observer continuously and perfectly points towards the target (chief).
Figure 5. Observable lengths of sides (effective in the IR spectrum)
Figure 6. The observable length – along the orbit – of the different chief satellite's sides (fully observed orbit, i.e. effective in the IR spectrum only; zenith, nadir and the two lateral sides, lengths between 0 and 1 m over one orbit of ≈6000 s).
Figure 7. The observable length of the different microsatellite's sides considering eclipse (effective in the visible portion of the spectrum; zenith, nadir and the two lateral sides over one orbit of ≈6000 s).
Figure 8. True and reconstructed trajectory components in the local frame (radial in blue, tangential in red, km, over one orbit of ≈6000 s).
Figure 9. True and reconstructed (estimated) relative trajectory in case of a fully observed orbit (view in a local orbital frame, tangential vs. radial, km).
Figure 10. Comparison between true and reconstructed trajectory components (radial in blue, tangential in red, km) without the contribution of observables during the eclipse phase.
Figure 11. Comparison between true and reconstructed relative trajectory (view in a local orbital frame, tangential vs. radial, km) without the contribution of observables during the eclipse phase.
As sketched in Fig.5, the analysis of these two different signatures, which always correspond to

L_real cos δ   (nadir/zenith side)
L_real sin δ   (lateral side)     (3)

with δ equal to the angular position of the chaser along its relative orbit around the target, will provide the additional observable

δ = arctan( L_image,lateral / L_image,nadir or zenith )     (4)
This observable offers insight into the relative attitude, and actually adds information content, as the angle δ is strictly related to the relative position. In fact, once the pointing of the target is fixed and known and that of the chaser is assumed to be correct, δ makes it possible to exploit their relation with the position.
Figures 6 and 7 plot the behavior in time of the length of the
sides listed in Eq.(3) in the two cases where imaging can be
performed all along the orbit (i.e. in the infrared band) or the
eclipse should be considered (i.e. the visible case). From these
values it is possible to compute the different observables
reported in Eqs.(2) and (4).
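A round-trip sketch of the observables of Eqs.(3) and (4): project the two visible sides at a known δ, then recover δ from the ratio of the apparent lengths. The pixel-scale factor is an arbitrary illustrative value; it cancels in the ratio, which is what makes the observable usable without knowing the range:

```python
import numpy as np

def aspect_angle(l_zenith_px, l_lateral_px):
    """Eq.(4): angular position delta of the chaser along its relative
    orbit, recovered from the apparent lengths of the two visible sides
    (Eq.(3): L_real*cos(delta) and L_real*sin(delta)). The common scale
    factor (pixels per meter at the current range) cancels in the ratio."""
    return np.arctan2(l_lateral_px, l_zenith_px)

# Round trip: image the two sides of a cube at delta = 30 deg, recover delta
delta_true = np.radians(30.0)
scale = 47.0  # arbitrary pixels-per-meter factor; it cancels out
l_zenith = scale * np.cos(delta_true)
l_lateral = scale * np.sin(delta_true)
print(round(np.degrees(aspect_angle(l_zenith, l_lateral)), 6))  # -> 30.0
```

Using arctan2 instead of a plain arctan keeps the quadrant information when one of the two sides shrinks to near zero apparent length.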
VI. VISUAL / INFRARED DATA FUSION
The two imaging techniques are, at least in the space environment, totally complementary. In fact, the visible hardware is expected to be more accurate and richer in content, but also blind for a significant fraction of time, while the infrared detectors, even with a poorer resolution and with the well-known limits of thermal images in terms of precise edge definition and temporal variation of the thermal field, always provide
an output [10]. It is interesting to remark that the fact that the two sensors work simultaneously during the sunlit portion of the orbit can be used to adapt the relative weights between their measurements.
The idea to fuse information contained in visible and thermal
images is clearly not new [11], and the issue is basically to
identify the most proficient way to combine them. Advanced
approaches aim to blend the information included in the images
at a primitive level ([12], [13]). The simpler, most immediate and straightforward approach – processing the two synchronous images separately and obtaining from each of them an estimate of the kinematic state [14], or a part of it – even if usually poorer, has been selected for this preliminary analysis. In order to combine the two separate measurements, to take into account the intervals – outages – where one of them is not available, and to duly consider the knowledge about the dynamics, recursive Kalman filtering seems a suitable solution. Due to the nature of the observables, the extended (EKF) formulation is used, while the process dynamics, based on the EH equations described in the previous section, are linear.
The implemented filter is a standard, total-state formulation written referring to the kinematic state variables, i.e. x ≡ [x y ẋ ẏ]ᵀ, and not to their errors [15]. The classical relations hold, with a prediction phase:

x̂⁻ₖ₊₁ = Φ x̂⁺ₖ     (5)

P⁻ₖ₊₁ = Φ P⁺ₖ Φᵀ + Q     (6)
the computation of the Kalman gain K:
Kₖ = P⁻ₖ Hₖᵀ [Hₖ P⁻ₖ Hₖᵀ + Rₖ]⁻¹     (7)
and the updates in the state (when measurements are available) and in the covariance:
x̂⁺ₖ = x̂⁻ₖ + Kₖ (zₖ − ẑₖ)     (8)

P⁺ₖ = [I − Kₖ Hₖ] P⁻ₖ     (9)
where, as usual, Φ is the transition matrix to be obtained by the
Euler-Hill dynamics represented in Eq.(1), z is the
measurements vector, P is the covariance matrix, Q and R are
the noise matrices associated to the process and to the
measurements, the index k relates to the time instant, and the
indexes – and + define predicted or updated quantities. Aside
from these routine formulas, some interest should be devoted to
the measurement matrix H that, following EKF rules, will read
as (^ indicates the result of an evaluation at the current step):
Ĥₖ = ∂z/∂x, evaluated at x̂⁻ₖ     (10)
In the present case, when the two observables, i.e. the
distance and the relative angle, recovered from one (either
visible or infrared) image are considered, H reads as
Ĥₖ (2×4) = ⎡  x/√(x²+y²)    y/√(x²+y²)   0   0 ⎤
           ⎣ −y/(x²+y²)     x/(x²+y²)    0   0 ⎦     (11)
while when both images are considered it becomes the stacking of the two 2×4 blocks, one per image:

Ĥₖ (4×4) = ⎡ Ĥₖ (2×4) ⎤
           ⎣ Ĥₖ (2×4) ⎦     (12)
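The recursion of Eqs.(5)-(11) can be sketched as follows for a single-image measurement. The toy numbers, noise levels, and the identity transition matrix used in place of the EH one are illustrative assumptions, meant only to show the mechanics of one predict/update cycle:

```python
import numpy as np

def h_meas(x):
    """Observables of Eqs.(2) and (4): range d and angle delta."""
    px, py = x[0], x[1]
    return np.array([np.hypot(px, py), np.arctan2(py, px)])

def H_jacobian(x):
    """Eq.(11): 2x4 Jacobian of [d, delta] w.r.t. [x, y, xdot, ydot]."""
    px, py = x[0], x[1]
    r2 = px**2 + py**2
    r = np.sqrt(r2)
    return np.array([[ px / r,   py / r,  0.0, 0.0],
                     [-py / r2,  px / r2, 0.0, 0.0]])

def ekf_step(x, P, z, Phi, Q, R):
    """One predict/update cycle, Eqs.(5)-(9)."""
    x_pred = Phi @ x                            # Eq.(5)
    P_pred = Phi @ P @ Phi.T + Q                # Eq.(6)
    H = H_jacobian(x_pred)                      # Eq.(10)/(11)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)         # Eq.(7)
    x_new = x_pred + K @ (z - h_meas(x_pred))   # Eq.(8)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred   # Eq.(9)
    return x_new, P_new

# Toy check with a static target (Phi = I): the update must pull the
# state estimate toward the measured range/angle.
x = np.array([0.09, 0.01, 0.0, 0.0])           # km, km, km/s, km/s
P = np.eye(4) * 1e-4
z = h_meas(np.array([0.10, 0.0, 0.0, 0.0]))    # "true" observables
x_new, _ = ekf_step(x, P, z, np.eye(4), np.eye(4) * 1e-8, np.eye(2) * 1e-6)
print(np.hypot(x_new[0], x_new[1]) > np.hypot(x[0], x[1]))  # -> True
```

Stacking the visible and infrared measurements as in Eq.(12) amounts to concatenating two z vectors, two Jacobians, and a block-diagonal R whose infrared block carries the larger noise variance.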
VII. FINDINGS AND REMARKS
Preliminary simulations, a sample of which is reported in Figs. 8-11, show that the fusion of visible and infrared images is certainly beneficial, even if the R matrix representing the expected measurement noise is certainly higher for the thermal camera (by one to two orders of magnitude in the performed tests). In detail, comparing Fig. 9 and Fig. 11, the reconstructed trajectory is far closer to the real one when measurements all along the orbit – i.e. including the infrared ones – are used. The second observable, i.e. the angle δ, representing either the angular position along the relative orbit or the relative attitude between chaser and target, has a recognizable, even if not dominant, effect on the estimate quality.
Additional analysis will take into account a larger number of numerical tests to compute statistical indexes for the trajectory reconstruction (so as not to be limited by the effects of the random values involved in the filter runs). Even more important, the infrared image for the simulations will be built on the basis of a simple thermal model of the microsatellite, assuming standard rules for conductivity, absorption and emissivity, able to provide temperature maps for all sides. To take into account the different behavior of the portion of the sides covered by solar cells, such a model should include at least 9 nodes per side.
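Such a 9-node side model could, in a first radiative-only approximation, be sketched as below. The optical properties are typical textbook values assumed for illustration (not measured data), and inter-node conduction is neglected, so the temperatures are only indicative of the contrast between cell-covered and bare-structure nodes:

```python
import numpy as np

SIGMA = 5.670e-8   # W m^-2 K^-4, Stefan-Boltzmann constant
G_SUN = 1361.0     # W m^-2, solar constant

def side_temperature_map(alpha_cell=0.92, eps_cell=0.85,
                         alpha_alu=0.25, eps_alu=0.05):
    """Illustrative 3x3 (9-node) steady-state temperature map of one
    sun-facing side: center node covered by solar cells, frame nodes of
    bare aluminum. Radiative equilibrium only (alpha*G = eps*sigma*T^4);
    conduction between nodes is neglected, so this is only a sketch."""
    def t_eq(alpha, eps):
        return (alpha * G_SUN / (eps * SIGMA)) ** 0.25
    tmap = np.full((3, 3), t_eq(alpha_alu, eps_alu))
    tmap[1, 1] = t_eq(alpha_cell, eps_cell)
    return tmap

tmap = side_temperature_map()
# Low-emissivity bare aluminum equilibrates hotter than the cell area
print(tmap[1, 1] < tmap[0, 0])  # -> True
```

Even this crude map already shows why a sequence of different thresholds (Section II.B) can separate cell-covered areas from structural frame in the thermal image.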
Further improvements can be expected by fully exploiting the attitude information coming from the difference in the sides' imaging (the δ observable). The adoption of more complete filters, including multiple behavioral models [16] and recursive schemes more complex than the EKF, already tested in spacecraft proximity dynamics [17], should work towards this goal. Overall, the combination of visible and infrared imaging in spacecraft proximity operations appears an interesting option. Extension of the analysis to scenarios different from planar formation flying referenced to a circular orbit should also be considered. Relaxing some of the hypotheses, mainly the one on the perfect camera viewing direction or the ideal attitude of the chaser, is required in order to validate the system, estimate its robustness, and prepare for lab tests. At the same time, reference to currently available IR hardware will improve the significance of the study.
REFERENCES
[1] CNES, Mécanique Spatiale, Toulouse: Cépaduès Editions, 1997.
[2] K.T. Alfriend, S.R. Vadali, P. Gurfil, J.P. How, L.S. Breger, Spacecraft
Formation Flying: Dynamics, Control and Navigation, Burlington, MA:
Elsevier, 2010.
[3] G. Casonato, G.B. Palmerini, “Visual techniques applied to the
ATV/ISS rendez-vous monitoring,” 2004 IEEE Aerospace Conference
Proceedings, vol. 1, pp. 613-624.
[4] P. Gasbarri, M. Sabatini, G.B. Palmerini, “Ground tests for vision based
determination and control of formation flying spacecraft trajectories,”
Acta Astronautica, vol.102, pp. 378-391, 2014.
[5] R. Gade, T.B. Moeslund, "Thermographic cameras and applications: a
survey," Machine Vision and Applications, vol. 25, 2014, pp. 245-262.
[6] D.G. Gilmore (Ed.), “Spacecraft Thermal Control Handbook”, 2nd ed. El
Segundo, CA: The Aerospace Press & AIAA, 2002.
[7] G. Palmerini, M. Sabatini, P. Gasbarri, “Analysis and tests of visual
based techniques for orbital rendezvous operations,” 2013 IEEE
Aerospace Conference Proceedings.
[8] M. Sabatini, R. Monti, P. Gasbarri, G.B. Palmerini, “Adaptive and
robust algorithms and tests for visual-based navigation of a space robotic
manipulator,” Acta Astronautica, vol. 83, pp. 65–84, February-March
2013.
[9] W.H. Clohessy, R.S. Wiltshire, “Terminal Guidance System for Satellite
Rendezvous,” Journal of the Aerospace Sciences, Vol. 27, No. 9, 1960,
pp. 653–658.
[10] S.G. Kong, J. Heo, F. Boughorbel, Y. Zheng, B.R. Abidi, A. Koschan,
M. Yi, M.A. Abidi, "Multiscale Fusion of Visible and Thermal IR
Images for Illumination-Invariant Face Recognition," Int. Journal of
Computer Vision, vol. 71, No. 2, 2007, pp. 215-233.
[11] V. Deodeshmukh, S. Chaudhuri, S. Dutta Roy, “Cooperative Infrared
and Visible Band Tracking,” Proceedings of the International
Conference on Applied Pattern Recognition, 2009.
[12] M.J. Johnson, P. Bajcsy, “Integration of Thermal and Visible Imagery
for Robust Foreground Detection in Tele-immersive Spaces,”
Proceedings of the 11th International Conference on Information Fusion,
2008, pp. 1265-1272.
[13] J. Saeedi, K. Faez, “Infrared and visible image fusion using fuzzy logic
and population-based optimization,” Applied Soft Computing, vol.12,
No.3, pp.1041-1054, 2012
[14] P. Kumar, A Mittal, P. Kumar, “Fusion of Thermal Infrared and Visible
Spectrum Video for Robust Surveillance,” Computer Vision, Graphics
and Image Processing, Springer Lecture Notes in Computer Science Vol.
4338, 2006, pp 528-539.
[15] J.A. Farrell, M. Barth, The Global Positioning System and Inertial
Navigation, New York: McGraw-Hill, 1998.
[16] M. Airouche, L. Bentabet, M. Zamer, G. Gao, "Pedestrian tracking using
thermal, color and location clue measurements: a DSmT-based
framework," Machine Vision and Applications, vol. 23, 2012, pp. 999-
1010.
[17] F. Reali, G.B. Palmerini, “Estimate Problems for Satellite Clusters,”
2008 IEEE Aerospace Conference Proceedings.