Tracking People Using Ankle-Level 2D LiDAR for Gait Analysis

Mahmudul Hasan¹,², Junichi Hanawa¹, Riku Goto¹, Hisato Fukuda¹, Yoshinori Kuno¹, and Yoshinori Kobayashi¹

¹ Saitama University, Saitama, Japan
{hasan,hanawa0801,r.goto,fukuda,kuno,yosinori}@hci.ics.saitama-u.ac.jp
² Comilla University, Comilla, Bangladesh
Abstract. People tracking is one of the fundamental goals of human behavior recognition. Advances in cameras, tracking algorithms, and efficient computation have made it practical. However, when privacy and secrecy are at stake, cameras are a serious liability. The fundamental goal of this research is to replace the video camera with a device (a 2D LiDAR) that significantly preserves the privacy of the user, solves the issue of a narrow field of view, and keeps the system functional at the same time. We consider the individual movements of every moving object on the plane and identify an object as a person based on ankle orientation and movement. Our approach counts the frames of every moving object and finally creates a video based on those frames.

Keywords: People tracking · 2D LiDAR · Kalman filter · Ankle-level tracking
1 Introduction
Person Tracking (PT) by machine is a salient field in Human-Computer Interaction (HCI). Research on person tracking has made great strides in recent years. It involves mapping the surface, locating persons' positions, following their subsequent movements, distinguishing them by other properties, and finally projecting the results onto a desired surface. A time series of individual position data enables us to analyze trajectories for many purposes (e.g. marketing). PT with 2D and 3D cameras plays a significant role in many practical applications. Real-time PT from live video makes it more robust and usable in different scenarios. Statistical models and their efficacy have made PT widely accepted. Here the video camera performs data acquisition, and some systems can also enhance the captured data. Recently, along with the development of deep learning-based image processing, the performance of people detection and tracking using cameras has improved dramatically. However, when we consider using cameras everywhere in daily life, the privacy issue cannot be ignored. In addition, some phenomena, such as smoke and fog, make it difficult to use cameras in special situations. Furthermore, although low-cost cameras (not only RGB but also RGB-D cameras) are widely used, the computational cost of image processing is far from negligible, and it can become extreme when deep learning techniques are applied, especially with many cameras for wide-area surveillance.
©The Editor(s) (if applicable) and The Author(s), under exclusive license
to Springer Nature Switzerland AG 2021
T. Ahram (Ed.): AHFE 2020, AISC 1213, pp. 40–46, 2021.
https://doi.org/10.1007/978-3-030-51328-3_7
Our focus in this research is a sensor that does not compromise privacy yet improves tracking efficiency. To cope with these problems, we propose a new people-tracking technique using 2D LiDAR; its low cost and real-time computational capability were the key influences behind this choice. To minimize occlusions between pedestrians, we place the 2D LiDAR at ankle level and scan the target area horizontally. The main issue in tracking people with this sensor setup is how to discriminate individuals from isolated observations of the ankles of multiple persons. We propose a novel method that uses the time series of range data to classify individuals. Individual ankle trajectories are considered for movement detection. Distances between ankles are calculated with the well-known Euclidean nearest-neighbor technique, which helps to determine each cluster of a pair of ankles as one person. We clearly identify walking and running paths even when the movement is very fast. The approach counts the frames of every moving object and finally creates a video based on those frames; these videos can be further used for surveillance or other purposes. Our method provides accurate and robust tracking when the target is walking or running. Experimental results show the effectiveness of the proposed method.
2 Related Works
LiDAR (Light Detection and Ranging) is a sensing device that measures variable distances in a plane. Many researchers have contributed to this tracking field [1] using cameras or LiDARs. Trajectory-based human behavior analysis using LiDAR [2] was our initial appreciation of the better use of this sensor, and museum guide robots [3, 4] show how LiDAR works in practical scenarios.
Tracking a specific person is a vital task in daily applications. Misu et al. [5] illustrate an approach for identifying and tracking a specific individual outdoors with a mobile robot; the robot uses 3D LiDARs for person recognition and identification, and a directivity-variable antenna (the ESPAR antenna) for finding a certain person when he is occluded and/or goes out of view. Yan et al. [6] present a framework that permits a robot to learn a sophisticated 3D LiDAR-based person classifier from other sensors over time, benefiting from a multi-sensor tracking system. Koide et al. [7] depict a human identification system for a mobile service robot using a smartphone and laser range finders (LRFs). All these approaches use 3D LiDAR, which is not cost effective; in our method, we prepared a system that can track people with a 2D LiDAR both outdoors and indoors.
Álvarez-Aparicio et al. [8] describe experimental results using a single LiDAR sensor to provide continuous recognition of an individual over time and space. The system is based on the People Tracker package, aka PeTra, which uses a convolutional neural network (CNN) to detect people's legs in cluttered scenarios. That system tracks only one individual at a time, whereas ours can track multiple people on the surface simultaneously. Dimitrievski et al. [9] introduce a 2D-3D pedestrian tracker created for applications in autonomous vehicles, using multiple sensors. Sualeh et al. [10] propose a robust Multiple Object Detection and Tracking (MODT) procedure using several 3D LiDARs for perception; the combined LiDAR data is processed by an effective MODT framework that takes into account the limitations of the vehicle-embedded processing environment. Gálai et al. [11, 12] present a performance analysis of numerous descriptors suitable for person gait analysis with Rotating Multi-Beam (RMB) LiDAR measurement systems. All of these are multi-sensor methods and need massive computation to be robust. Li et al. [13] present a novel deep convolutional network pipeline, LO-Net, for real-time LiDAR odometry estimation. Qiu et al. [14] propose a deep learning architecture that delivers accurate dense depth for an outdoor scene from a single color image and sparse depth. These systems are built specially for autonomous driving based on LiDAR and depth cameras.
Research on person tracking and positioning is broad and has been conducted over a long period. Many studies in this arena use cameras, 2D/3D LiDARs, ultrasonic sensors, etc. Person tracking and gait analysis based solely on a 2D LiDAR, however, is a challenging and new concept. We concentrate on this topic and show how people can be traced. Our goal is a low-cost 2D LiDAR-based tracking system that can be used in any tracking application without interfering with human privacy.
3 Proposed Method
We introduce a tracking method based on a LiDAR sensor that works in different environments. We use a 2D LiDAR for its lower price and computational efficiency; in our experiments we used a Hokuyo UTM-30LX sensor on a flat ground surface. We placed the LiDAR at ankle level and collected data over a 270-degree field of view. When people walk within range, the sensor records the actual position of each moving object and its distance from the sensor. We plot the ankle positions of persons on the plane and track their movements.

As shown in Fig. 1, the LiDAR is placed at the ankle level of a person and provides the measurements to the corresponding computer. In the first frame, white lines indicate the boundaries of the surface, i.e. walls. We then remove the boundaries from the frame by background subtraction, leaving only the ankle positions on the surface; the green lines mark the ankle positions of a person. If more than one person appears in front of the sensor, the persons are clearly separated. Distances between the pixels of one ankle and between two ankles are measured and used to decide how many people are moving in front of the sensor. An experimentally determined threshold is used to decide what constitutes a person in different circumstances. We use the Euclidean distance measure for tracking persons.
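The background subtraction step above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the scan format (evenly spaced range readings across the 270-degree sweep) and the noise margin are assumptions.

```python
import numpy as np

def foreground_points(scan, background, margin=0.05):
    """Convert one 270-degree range scan to 2D points, keeping only
    readings noticeably closer than the static background (the walls).

    scan, background: equal-length arrays of range readings in metres.
    margin: noise tolerance in metres (an assumed value).
    """
    scan = np.asarray(scan, dtype=float)
    background = np.asarray(background, dtype=float)
    # Beam angles across the 270-degree field of view
    angles = np.deg2rad(np.linspace(-135.0, 135.0, scan.size))
    moving = scan < background - margin          # foreground mask
    xs = scan[moving] * np.cos(angles[moving])   # polar -> Cartesian
    ys = scan[moving] * np.sin(angles[moving])
    return np.column_stack([xs, ys])
```

With a scan of the empty room as `background`, the returned points correspond to the moving ankles only.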
dist((x_a, y_a), (x_b, y_b)) = sqrt((x_a − x_b)^2 + (y_a − y_b)^2)   (1)

where dist calculates the distance between the pixels (x_a, y_a) and (x_b, y_b) of an ankle's position, and then between all ankles that appear on the plane.
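Equation (1) and the threshold-based pairing can be sketched as below. The threshold value and the greedy nearest-neighbor pairing are illustrative assumptions; the paper only states that the threshold is determined experimentally.

```python
import math

def dist(a, b):
    # Eq. (1): Euclidean distance between two ankle points a and b
    return math.hypot(a[0] - b[0], a[1] - b[1])

def group_ankles(ankles, max_sep=0.45):
    """Greedily pair each ankle with its nearest unused neighbour
    closer than max_sep (metres, an assumed value); an ankle with no
    near neighbour is treated as a person with one visible ankle."""
    persons, used = [], set()
    for i, a in enumerate(ankles):
        if i in used:
            continue
        best, best_d = None, max_sep
        for j in range(i + 1, len(ankles)):
            if j not in used:
                d = dist(a, ankles[j])
                if d < best_d:
                    best, best_d = j, d
        if best is None:
            persons.append([a])            # lone ankle -> still one person
        else:
            used.add(best)
            persons.append([a, ankles[best]])
    return persons
```

Each returned group of one or two points corresponds to one person on the plane.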
Depending on the positional relationship with the sensor, one ankle may occlude the other, but this situation does not influence the decision. Because a cluster-based technique is used, the system does not assume that both feet are visible; it only calculates the distances between the ankles that appear in the LiDAR field of view. If no other ankle is found, the system concludes that only one person is walking on the floor. If the distance to the nearest ankle is larger than the threshold, the system tracks that ankle as a separate person even though only one ankle position is found. Thus the system overcomes the obscuring and disappearance of feet on the floor. Figure 2 shows that our system clearly identifies the ankles through a disappearance. Two frames earlier, the calculated ankle distance exceeded the maximum threshold, so two persons were assumed; one frame earlier the distance became smaller, fell within the range, and only one person was counted. Finally, in the current frame one ankle is occluded by the other, and the system still reports one person in the frame, without misclassification.
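The frame-by-frame behavior just described, where an identity survives even when one ankle disappears, can be sketched as a nearest-neighbor association of person centroids between consecutive frames. The gate value and the greedy matching are assumptions for illustration, not the paper's exact procedure.

```python
import math

def centroid(group):
    # Mean position of the ankle points assigned to one person
    return (sum(p[0] for p in group) / len(group),
            sum(p[1] for p in group) / len(group))

def associate(tracks, persons, max_jump=0.6):
    """Match current-frame person groups to existing tracks.

    tracks:   {person_id: last known centroid}
    persons:  list of ankle groups (one or two points each)
    max_jump: assumed gate on per-frame movement in metres.
    Matching uses centroids, so a person is kept even when only
    one ankle is visible in the current frame.
    """
    next_id = max(tracks, default=-1) + 1
    free = dict(tracks)
    updated = {}
    for group in persons:
        c = centroid(group)
        best, best_d = None, max_jump
        for tid, prev in free.items():
            d = math.hypot(c[0] - prev[0], c[1] - prev[1])
            if d < best_d:
                best, best_d = tid, d
        if best is None:                   # unmatched -> new person enters
            updated[next_id] = c
            next_id += 1
        else:
            del free[best]
            updated[best] = c
    return updated
```

Calling `associate` once per frame keeps person IDs stable across occlusion frames like the one in Fig. 2.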
Fig. 1. (a) LiDAR sensor placed at ankle level and acquiring data; (b) persons' ankle movements and standing positions; (c) tracking people with a handler marker sign '|' showing their moving directions.

Fig. 2. (a) Ankles in different positions moving into an overlapping position; (b) frame-by-frame detection.
4 Experimental Results
There is no well-known data set for 2D LiDAR-based person tracking and gait analysis, so we prepared our own data set for the experiments and evaluated our method on this benchmark, which has 35 samples from 27 female and 8 male participants. The benchmark contains two scenarios: normal and highly crowded. We evaluated the performance of our proposal on the validation data set. We used a Kalman filter to predict positions when tracking individuals; the system can track a person even if only one ankle appears in the frame.
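The Kalman prediction step can be sketched with a constant-velocity model over a person's planar position. This is a generic textbook formulation; the state model, time step, and noise covariances below are assumptions, not the paper's tuned parameters.

```python
import numpy as np

class PersonKalman:
    """Constant-velocity Kalman filter over the state [x, y, vx, vy]."""

    def __init__(self, x, y, dt=0.1):
        self.x = np.array([x, y, 0.0, 0.0])   # initial state, zero velocity
        self.P = np.eye(4)                    # state covariance
        self.F = np.eye(4)                    # transition: pos += vel * dt
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                 # we observe position only
        self.Q = 0.01 * np.eye(4)             # process noise (assumed)
        self.R = 0.05 * np.eye(2)             # measurement noise (assumed)

    def predict(self):
        # Propagate the state one frame ahead; usable when ankles vanish
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # Fuse a measured ankle-cluster centroid z = (x, y)
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Per frame, `predict()` is called first and `update()` only when a centroid was observed; skipping the update during an occluded frame lets the track coast on the estimated velocity.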
In Fig. 3 we see that, under different conditions, our system predicts the movements of different persons with high accuracy. The first upper frame shows individual walking, and the corresponding lower frame shows the person moving away from the LiDAR. The 2nd frame shows one person running in different directions, and the 3rd, 4th, and 5th frames show other scenarios; the corresponding lower frames show the positions tracked by the LiDAR using the Kalman filter.
Fig. 3. Kalman filter-based movements in different circumstances (panels: Individual Walking, Individual Running, Only Ankle Movement, Combined Walking, Combined Running). Upper row: ankle positions in the frame; lower row: direction of the movements.
Table 1. Experimental data and its performance

Scenario               Persons / Frames   Correctly identified (frames)   Percentage
Individual walking         4 / 48                   48                     100%
Individual running         4 / 51                   50                     98.04%
Only ankle movement        4 / 43                   40                     93.02%
Combined walking           2 / 42                   41                     97.62%
Combined walking           3 / 47                   44                     93.62%
Combined walking           4 / 43                   39                     90.70%
Combined running           2 / 47                   36                     76.60%
Combined running           3 / 47                   33                     70.21%
Combined running           4 / 42                   29                     69.04%
Table 1 shows the different persons and their recorded videos used for our experimental evaluation with different gestures. We categorized the experiments into five scenarios. Individual walking, individual running, only-ankle movement, and combined walking can be tracked with high confidence; we have a few reservations about the combined running scenario. We considered four people and their captured LiDAR videos for performance evaluation. For validation we used roughly four seconds of video per person in each gesture, taking 42-51 frames from each video. The table shows that performance is noticeably lower in the combined running situations, but compared with other camera-based systems the performance is impressive. A gait analysis and person height estimation based on ankle movements were also performed on the data set, and we found interesting correlations with walking and running patterns.
5 Conclusion
In the cyber world, a person is tracked all the time, everywhere. But when privacy is in question, people want space from the eyes around them, whether in the real or the virtual world. On the other hand, the need for surveillance cannot be ignored. LiDAR sits at the core of these competing concerns: without disclosing a person's identity, our approach tracks the person and identifies his/her movements. Our system is now ready for commercial use. We will extend this study to estimate properties of human activities using LiDAR; we are currently working on gait analysis using ankle-level 2D LiDAR, and in the future we will integrate tracking and analysis into one system.
References
1. Chen, L., Ai, H., Zhuang, Z., Shang, C.: Multiple people tracking with deeply learned
candidate selection and person re-identification. In: Proceedings of Multimedia and Expo
(ICME) (2018)
2. Rashed, M.G., Suzuki, R., Yonezawa, T., Lam, A., Kobayashi, Y., Kuno, Y.: Robustly
tracking people with LIDARs in a crowded museum for behavioral analysis. IEICE Trans.
Fundam. Electron. Commun. Comput. Sci. E100 A, 2458 (2017)
3. Oyama, T., Yoshida, E., Kobayashi, Y., Kuno, Y.: Tracking visitors with sensor poles for
robot’s museum guide tour. In: Proceedings of Human System Interactions (HSI), Sopot,
pp. 645–650 (2013)
4. Oyama, T., Yoshida, E., Kobayashi, Y., Kuno, Y.: Tracking a robot and visitors in a museum using sensor poles. In: Proceedings of Frontiers of Computer Vision, pp. 36–41 (2013)
5. Misu, K., Miura, J.: Specific person detection and tracking by a mobile robot using 3D
LIDAR and ESPAR antenna. In: Proceedings of Intelligent Autonomous Systems
(IAS) (2014)
6. Yan, Z., Sun, L., Duckett, T., Bellotto, N.: Multisensor online transfer learning for 3D LiDAR-based human detection with a mobile robot. In: Proceedings of Intelligent Robots and Systems (IROS), pp. 7635–7640 (2018)
7. Koide, K., Miura, J.: Person identification based on the matching of foot strike timings
obtained by LRFs and a smartphone. In: Proceedings of Intelligent Robots and Systems
(IROS), pp: 4187–4192 (2016)
8. Álvarez-Aparicio, C., Guerrero-Higueras, Á.M., Rodríguez-Lera, F.J., Clavero, J.G., Rico,
F.M., Matellán, V.: People detection and tracking using LIDAR sensors. Robotics 8, 75
(2019)
9. Dimitrievski, M., Veelaert, P., Philips, W.: Behavioral pedestrian tracking using a camera
and LiDAR sensors on a moving vehicle. Sensors 19, 391 (2019)
10. Sualeh, M., Kim, G.-W.: Dynamic multi-LiDAR based multiple object detection and tracking. Sensors 19(6), 1474 (2019)
11. Gálai, B., Benedek, C.: Feature selection for Lidar-based gait recognition. In: Proceedings of
Computational Intelligence for Multimedia Understanding (IWCIM), Prague, pp. 1–5 (2015)
12. Benedek, C., Nagy, B., Gálai, B., Jankó, Z.: Lidar-based gait analysis in people tracking and
4D visualization. In: Proceedings of European Signal Processing Conference (EUSIPCO),
Nice, pp. 1138–1142 (2015)
13. Li, Q., Chen, S., Wang, C., Li, X., Wen, C., Cheng, M., Li, J.: LO-Net: deep real-time lidar
odometry. In: Proceedings of Computer Vision and Pattern Recognition (CVPR), pp. 8473–
8482 (2019)
14. Qiu, J., Cui, Z., Zhang, Y., Zhang, X., Liu, S., Zeng, B., Pollefeys, M.: DeepLiDAR: deep
surface normal guided depth prediction for outdoor scene from sparse liDAR data and
single-color image. In: Proceedings of Computer Vision and Pattern Recognition (CVPR),
pp. 3313–3322 (2019)