Remote Experimentation Using a Smartphone
Application with Haptic Feedback
Ernesto Granado
Universidad Simón Bolívar, Caracas, Venezuela
Prometeo Project Researcher-SENESCYT, Ecuador
Universidad Politécnica Salesiana, Cuenca, Ecuador
granado@usb.ve
Flavio Quizhpi
Julio Zambrano
Universidad Politécnica Salesiana
Cuenca, Ecuador
{fquizhpi, jzambranoa}@ups.edu.ec
William Colmenares
Universidad Simón Bolívar
Caracas, Venezuela
williamc@usb.ve
AbstractThis paper presents the development of a remote
experimentation for automatic control engineering education. By
using a smartphone application, students can interact with real
laboratory didactic equipment. Through a friendly graphical
user interface and information provided by built-in smartphone
sensors, students can modify the process parameters and verify in
real time whats happening on the experimental setup. Mobile
device vibration technology is used to add haptic tactile feedback
to the application. This additional sense complements the visual
and hearing information the user usually acquires when
performing a practical experience. Therefore, students pay more
attention to the results achieved online. Taking advantage of
todays smartphone technological features, the students feel
highly motivated to use this device to achieve improvements in
their learning process. To reach a great number of users, the
application runs across the most widely used mobile platforms.
All user lab activities are stored in a database for the teacher
later analysis. Likewise, students can store the resulted
experiments on their mobile device when finished in order to
make an off-line result analysis.
Keywordsadobe AIR; control engineering education; haptic
feedback; remote experimentation; technology enhanced learning.
I. INTRODUCTION
Cell phones have developed rapidly in recent years. They have evolved from simple devices for making and receiving calls into the powerful communication and computing tools they are today. These smartphones contain an advanced mobile OS that can run a variety of software applications, such as word processors, spreadsheets, and multimedia players. They offer wireless connectivity technologies such as Bluetooth and Wi-Fi, as well as Internet access that allows users to send and receive e-mail, browse the web, and chat. Users, in particular young people, like to use them for a wide variety of daily tasks: taking pictures, listening to music, communicating with others, reading news, checking bank statements, paying bills, buying goods, playing video games, and so forth. Thus, these devices, which students carry with them like a garment and like to interact with at all times, have become a convenient tool for increasing students' attention and improving teaching/learning methods.
This work goes a step further in the use of mobile devices to improve the teaching/learning process. The application described below contributes to the development of remote laboratory experimentation, exploiting smartphone technology features to create an attractive application. The built-in sensors are used to interact with the application, and the built-in vibration technology provides haptic (tactile) feedback. Although current vibration technology on mobile devices has practical limitations, the vibratory stimulus that students feel in their hands can be used to emphasize the behavior of the experiment's control signal. By providing information beyond vision and hearing, the teaching/learning process can be enhanced. This work thus contributes to the largely unexplored field of haptic feedback in remote laboratories.
The main aim of this application is to provide students with a tool that lets them take advantage of their spare time at the university: even while the laboratory doors are closed, students can carry out lab practices while waiting for the next class. They do so from within the university campus by connecting to the control lab wireless network.

It is well known that the growing demand for science and engineering careers calls for more university resources. The need for lab equipment usually cannot be met because of limited funds and physical space. This application allows efficient and intensive use of the existing lab equipment, since students can perform their experiments even when the lab and its facilities are closed.
The remote experimentation system can be easily adapted to any kind of physical experimental hardware and can also be extended to other science or engineering laboratory areas.
II. RELATED WORK
The opportunity offered by mobile technologies to learn anytime and anywhere introduced the mobile learning (m-learning) concept [1], [2]. M-learning has evolved considerably during the last decade, and many teaching processes have been developed using mobile phones; see, for example, [3]-[6] and the references therein. More recently, there has also been important research on remote laboratories for teaching science and engineering. For example, the transformation of the remote lab WebLab-Deusto (at the University of Deusto, Spain) into a Web 2.0-enabled application accessible from any mobile device is described in [7]. Reference [8] presents two strategies for adapting a desktop remote laboratory to mobile devices: the first reuses the Web-based code but adjusts it to the mobile device features, while the second uses native mobile technologies to exploit all the resources that the mobile device provides. In [9], the authors propose a smartphone multimedia learning environment. The mobile device is used to display a web-based multimedia lecture that reviews the theoretical material, and the smartphone also controls a medium-size industrial process through Bluetooth communication. In [10], a mobile application is presented that enables users to control real experiments and monitor results via video streaming, providing a sense of involvement similar to hands-on laboratories. That application maintains compatibility across the most widely used platforms.
Vibrotactile feedback technology is currently used in portable devices to provide silent alerts, to confirm keystrokes when typing on a virtual keyboard, and to enhance the experience of mobile video games. Numerous studies have explored haptic feedback technology, mainly focused on improving human-machine interaction. Interesting works in the teleoperation area include [11]-[15] and the references therein; experiments have shown that haptic feedback can improve teleoperation performance. There is also extensive haptic research on virtual reality applications that enrich the sensory perception of a virtual environment, such as [16]-[18] and [19] (in press).
In recent years, there has been interest in exploring the possibilities of haptic technology in handheld mobile devices. The sense of touch as a channel for handheld communication devices was investigated in [20]: the authors embedded a universal tactile display (TouchEngine) in a PDA, producing tactile sensations ranging from simple clicks to complex vibrotactile patterns. In [21], the authors present a miniature actuator embedded in a handheld device that applies lateral skin stretch to the user's thumb; the lateral deformation of the fingertip skin results in a dynamic tactile sensation. Reference [22] shows a smartphone-based system that uses the built-in vibrator and accelerometer to recognize the type of surface contacted by the mobile device. In [23], a mechanism for mobile devices that provides realistic and interactive haptic feedback is presented: a thin actuator installed inside the case simulates a rapid, realistic response stimulus.
III. ARCHITECTURE AND COMPONENTS
The remote experimentation system has a typical client/server architecture and consists of three components: the server, the client, and the supervisor application. Fig. 1 shows the system architecture and components.
Fig. 1. Remote experimentation architecture.
The server is a standard computer connected to an experimental setup via a standard data acquisition (DAQ) system. The experimental hardware can be any physical process used for didactic purposes with electrical input and output signals, such as equipment from Quanser [24], Feedback Instruments [25], or Festo [26], among others.
A Quanser coupled-tanks system is used for the experiments. The process consists of one pump and two coupled water tanks, forming an autonomous, closed, recirculating system, as shown in Fig. 1. The two uniform cross-section tanks are mounted on a front plate above a reservoir where the water is stored. Water flows from the first (upper) tank into the second (lower) tank, and the outflow from the second tank returns to the reservoir. Each tank has an outflow orifice at the bottom from which the liquid is withdrawn, and the pump drives the water from the bottom basin to the top of the system. The water level is measured by a pressure-sensitive sensor located at the bottom of each tank. The system input is the pump voltage, and the system outputs are the two measured tank levels.
The client is a smartphone from which students can
manipulate experimental hardware and verify if the control
strategy fulfils the design specifications.
On the other hand, the supervisor consists of an application that allows the course teacher to supervise the students' learning
process. From a personal computer, the teacher can access the database (DB) system that stores all the activity performed by students in the lab.
Client-server communication is performed using the
TCP/IP protocol, and both devices must be on the same
wireless sub-network. Only one client at a time can be
connected to the server. Each user can use the system
according to the laboratory schedule.
IV. SERVER APPLICATION
The server application was developed with the LabVIEW platform [27], version 2014. This software is a powerful and useful graphical development tool, since it provides practical solutions for data processing, data communication, and instrument interfacing. LabVIEW also enables inexperienced programmers to develop complex applications easily.
As shown in Fig. 2, the server application performs the following five tasks: TCP/IP communication, control strategies, data acquisition communication, database storage, and hardware safety. The server application uses the Producer/Consumer design pattern [28]. Each task runs in its own timed loop at its own rate, and the five loops execute in parallel. This software architecture simplifies application maintenance, since each task is programmed as a standalone module. For instance, if the TCP/IP communication or control strategy task needs to be changed, the modifications are made only to that module without affecting the others. As a result, updates can be performed without compromising the system's integrity.
Fig. 2. Server application tasks.
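Although the server is implemented graphically in LabVIEW, the Producer/Consumer structure can be sketched textually. The following Python analogue is only an illustration of the pattern, not the authors' code; the task names, loop periods, and queue contents are assumptions.

```python
# Illustrative Python analogue of the LabVIEW Producer/Consumer pattern
# used by the server application (not the authors' actual code).
import queue
import threading
import time

command_queue = queue.Queue()   # client commands -> control task

def tcp_task(period=0.2):
    """Producer: handles the client connection and queues incoming commands."""
    while True:
        # cmd = read_command_from_socket()   # placeholder, assumed helper
        # command_queue.put(cmd)
        time.sleep(period)                   # loop period of this task

def control_task(period=0.05):
    """Consumer with the highest priority and the shortest loop period."""
    while True:
        while not command_queue.empty():
            cmd = command_queue.get()        # e.g. new set-point or controller gains
        # u = controller.update(reference, measurement)   # assumed controller object
        # daq_write(u)                                     # assumed DAQ helper
        time.sleep(period)

# The real server runs five such loops in parallel (TCP/IP, control, DAQ,
# database, safety); two are enough here to illustrate the structure.
for task in (tcp_task, control_task):
    threading.Thread(target=task, daemon=True).start()
time.sleep(1)   # keep the main thread alive briefly for the demo
```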
The TCP/IP communication module is responsible for managing communication with the client; that is, it processes the commands sent by the user through the mobile device. It also periodically transmits the process variable values to the client during the experiments.
The control module is in charge of the control strategy. Students can select the controller types they have studied in the theory sessions, such as PID, lead/lag, and state feedback. Different control strategies can be implemented in this module, and new controller types can easily be added. Because the control task is critical to the correct operation of the experimental setup, this module has the highest execution priority and the shortest loop period.
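The paper does not show the controller implementations themselves; purely as an illustration of one selectable strategy, a discrete positional PID of the kind this module could execute each loop period might look as follows (the gain names, sampling time, and 0-10 V clamp are assumptions).

```python
class DiscretePID:
    """Illustrative discrete positional PID; not the authors' implementation."""
    def __init__(self, kp, ki, kd, ts):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        self.integral += error * self.ts
        derivative = (error - self.prev_error) / self.ts
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(max(u, 0.0), 10.0)   # clamp to the assumed 0-10 V pump range

# Example use: pid = DiscretePID(kp=2.0, ki=0.5, kd=0.1, ts=0.05)
#              u = pid.update(reference=12.0, measurement=10.4)
```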
The data acquisition module manages the input and output DAQ channels. It sends data to and receives data from the experimental setup according to the commands received from the client application. Depending on the experiment in use (tank, motor, etc.), the system administrator may adjust the sampling and control times.
The DB module is in charge of database management. All the activity performed by a student is stored for later analysis by the teacher. The system has four relational tables. The first table contains the student's identification information, such as name, surname, student ID, password, and e-mail. When a user tries to connect to the lab server, the login information is validated against the data stored in this table. The second table stores the times and dates at which the student logged in and out. The third table stores all the information generated by the student during the connection time; each test the user performs is stored here (control parameters, set point, output, and control signal evolution). The fourth and last table is administered by the course teacher and defines the access each student is allowed, such as the scheduled laboratory practices, the maximum lab usage time, and the practices the student is qualified to perform.
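The four relational tables can be summarized with a schema sketch. The SQLite script below is only an illustration of the information each table is described as holding; the actual DBMS, table names, and column names are not specified in the paper.

```python
import sqlite3

# Illustrative schema; table and column names are assumptions.
schema = """
CREATE TABLE students (student_id TEXT PRIMARY KEY, name TEXT, surname TEXT,
                       password TEXT, email TEXT);
CREATE TABLE sessions (session_id INTEGER PRIMARY KEY, student_id TEXT,
                       login_time TEXT, logout_time TEXT);
CREATE TABLE tests    (test_id INTEGER PRIMARY KEY, session_id INTEGER,
                       controller TEXT, parameters TEXT, set_point REAL,
                       output_trace TEXT, control_trace TEXT);
CREATE TABLE access   (student_id TEXT, scheduled_practice TEXT,
                       max_lab_time_min INTEGER, qualified_practices TEXT);
"""
conn = sqlite3.connect("remote_lab.db")
conn.executescript(schema)
conn.close()
```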
The safety module prevents the students inappropriate
hardware manipulation. Depending on the experimental setup
used in the laboratory, the maximum tank level, temperature
and speed motor are configured. This safety value is adjusted
by the systems administrator. If this value is reached, the
actuators voltage (pump, motor, etc.) is suppressed, and the
user cannot make a new test until the laboratory equipment
reaches the initial condition.
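A minimal sketch of this interlock logic, assuming a single configurable level limit, a helper that writes the pump voltage, and a latch flag (all hypothetical names):

```python
import threading

MAX_TANK_LEVEL_CM = 25.0          # example limit set by the system administrator
lockout = threading.Event()       # latched while the equipment recovers

def safety_check(level_cm, set_pump_voltage):
    """Cut the actuator voltage when the limit is reached and block new tests
    until the equipment returns to its initial condition (near-empty tank)."""
    if level_cm >= MAX_TANK_LEVEL_CM:
        set_pump_voltage(0.0)     # suppress the actuator voltage
        lockout.set()
    elif lockout.is_set() and level_cm <= 1.0:
        lockout.clear()           # initial condition reached; allow new tests
    return lockout.is_set()
```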
V. CLIENT APPLICATION
Adobe Flash technology [29] is used to develop the client application with a well-designed, friendly graphical user interface. This software is well suited to graphic design work. It also allows Adobe Integrated Runtime (AIR) applications to be deployed across different mobile platforms, such as Google Android, Apple iOS, and Microsoft Windows Mobile, so the app can reach a much larger audience. In other words, to execute the application on the mobile device, only the AIR runtime must be downloaded and installed. ActionScript 3.0 [30] is used as the programming language.
A. Tasks
The client application performs the following four tasks: TCP/IP communication, graphical display, data file saving, and built-in sensors/actuators. The client tasks are shown in Fig. 3.
The TCP/IP communication module is responsible for managing communication with the server; the client both initiates and terminates the connection. All actions performed by the user on the mobile interface are sent to the server application, including the experiment start and stop commands, process parameter modifications, and control strategy selection. In addition, the TCP/IP communication module receives from the server the values obtained from the experimental setup.
Fig. 3. Client application tasks.
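The client itself is an ActionScript 3.0/AIR application and the paper does not specify the wire format, so the exchange below is only a Python sketch under the assumption of a simple newline-delimited text protocol; the server address, port, and message keywords are hypothetical.

```python
import socket

SERVER = ("192.168.1.10", 5000)    # assumed lab server address on the same sub-network

with socket.create_connection(SERVER) as sock:
    sock.sendall(b"LOGIN id=1234 pwd=secret\n")                     # assumed commands
    sock.sendall(b"SELECT controller=PID kp=2.0 ki=0.5 kd=0.1\n")
    sock.sendall(b"SETPOINT 12.0\nSTART\n")
    buffer = b""
    for _ in range(50):            # read a few samples pushed every ~200 ms
        buffer += sock.recv(1024)
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            print(line.decode())   # e.g. "t=3.2 level1=10.4 level2=7.9 u=4.3"
    sock.sendall(b"STOP\n")        # a real client also saves the data on request
```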
The graphical display task updates in real time both the graphical animations and the other information shown in the interface, using the signal values obtained from the experimental setup.
The save-data-file task enables the user to store on the mobile device a text file with all the experimental data. Afterwards, the student can use this file with any computational tool, such as Matlab [31] or Scilab [32], to write the report and perform an off-line analysis. The student must deliver this report to the course teacher one week after the experiment is performed.
The built-in sensors/actuators task controls the smartphone sensors and actuators. Although the touch screen is the main way of interacting with the client application, the user can also use the built-in sensors for data input. This module obtains the information provided by the accelerometer and is also in charge of sending the experiment's control signal value to the vibration actuator to produce the vibrotactile feedback.
B. Haptic Feedback
Nowadays, many smartphones include vibration technology. The oscillating movement that the user can feel, known as vibrotactile feedback, is created by an actuator controlled by software. This sensation is experienced, for example, when typing text and a specific keystroke is registered. There are three main technologies for these actuators [33]: the Eccentric Rotating Mass (ERM), the Linear Resonant Actuator (LRA), and the piezoelectric actuator.
In this work, the haptic feedback is used to enhance the remote experimentation experience. The vibration indicates the control signal amplitude, so the student can feel the differences in the control signal produced by different controller types. As shown in Fig. 4, the built-in sensor/actuator task creates a vibration intensity proportional to the control signal magnitude.
Fig. 4. Haptic generation from control signal value.
This additional sense complements the acoustic and visual information and helps the student get a more precise feeling for the control signal behavior.
The most widespread vibration technology in today's smartphones is the ERM. Although the vibration amplitude cannot be changed with this technology, the vibration duration can. Therefore, the ERM on/off timing must be controlled correctly to generate different perceived vibration intensities; that is, the duty cycle of the motor spin must be modified. Moreover, since the motor is an inertial system, it requires both a start-up time and a stop time. This latency must be taken into account when the motor vibration is switched on and off in a periodic pattern, as shown in Fig. 5. In addition, the latency varies, since it depends on the motor's speed and acceleration.
Fig. 5. Control signal to pulse vibration duration conversion.
A period of 200 ms was selected for the haptic feedback. This time frame coincides with the communication (update) period of the values received from the experimental setup. Different vibration sensations need to be generated for different control signal values. Because of the ERM latency, it is not possible to associate each individual control signal value with its own vibration intensity.
For this reason, a different vibration intensity is generated for each range of control signal values, as shown in Fig. 5. In addition, a soft vibration is required for small control signal values and a very strong vibration for high control signal values. The main objective is therefore to create as many vibration patterns as the user can easily differentiate and recognize.
After several tests, four different duty cycle patterns were defined for spinning the vibrating motor. These four pulse-train patterns generate vibration sensations that the user can easily differentiate and recognize; with a larger number of patterns, the vibration intensities can no longer be distinguished. A vibration with a 15% duty cycle corresponds to a control signal value between 0.5 and 2.5 V and produces a very soft but perceptible vibration. As the control signal value increases, the motor activation duty cycle also rises, as shown in Fig. 5, and the vibration intensity increases accordingly. The maximum vibration intensity is generated with a duty cycle of 60% and corresponds to the 7.5 to 10 V control signal range; 10 V is the maximum voltage the DAQ can measure.
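The mapping from control signal value to vibration pattern can be summarized in a short sketch. The 200 ms period, the 15% duty cycle for the 0.5-2.5 V band, and the 60% duty cycle for the 7.5-10 V band come from the text; the two intermediate bands and their duty cycles are assumptions, since the paper does not list them explicitly.

```python
PERIOD_MS = 200    # haptic period, equal to the 200 ms update period from the setup

def vibration_on_time_ms(u_volts):
    """Return the ERM on-time (ms) within one 200 ms period for a control
    signal value; the off-time fills the rest of the period."""
    if u_volts < 0.5:
        duty = 0.0     # control signal too small: no vibration
    elif u_volts < 2.5:
        duty = 0.15    # very soft but perceptible vibration (from the text)
    elif u_volts < 5.0:
        duty = 0.30    # assumed intermediate band
    elif u_volts < 7.5:
        duty = 0.45    # assumed intermediate band
    else:
        duty = 0.60    # maximum intensity for the 7.5-10 V range (from the text)
    return duty * PERIOD_MS
```

In the actual client, the on/off switching derived from this duty cycle must also compensate for the ERM start-up and stop latency discussed above.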
Even though mobile device vibration technology still has limitations, it is used here to explore the possible benefits that haptic feedback can bring to the remote lab learning process.
C. Graphical User Interface
The graphical user interface is designed for the small screen size of mobile devices. Fig. 6 shows the smartphone application user interface, through which the student interacts with the experimental setup.
Fig. 6. Client application user interface.
The main control bar is on the right-hand side of the interface and has four buttons: start/stop, view, save, and exit. The exit button disconnects from the server. The start/stop buttons start and stop the experimental test. The view button quickly switches between the two application views: the graphical view (bottom of Fig. 6) and the animation view (top of Fig. 6). The save button stores on the smartphone a text file with the data obtained from the experiment; it is enabled only once the user has concluded the test.
The animation view displays a diagram of the experimental setup that shows in real time the data obtained from the experiment. From here, the user can also change the controller parameters and the reference level set-point. In addition, by pressing the tabs at the top of the screen, the student can select the control strategy to be studied (PID, lead/lag, state feedback, etc.).
The graphical view is composed of two graphs. The graph at the top plots the evolution of the output and reference signals; the graph at the bottom displays the control signal evolution. Both plots are updated in real time.
These graphs display 120 s of information; if the experiment exceeds this time window, the plots automatically scroll to the left to show the latest values obtained from the experimental setup.
To review past values, the user can move the plots to the right or to the left by tilting the phone sideways. The greater the tilt angle, the larger and faster the plot movement, as illustrated in Fig. 7. The smartphone accelerometer is used to detect the sideways tilt applied by the user; these calculations are performed inside the built-in sensor/actuator task.
Fig. 7. Horizontally inclining to review past values.
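This tilt-to-scroll behavior can be sketched as a mapping from the sideways acceleration reported by the accelerometer to a plot time offset. The dead band and gain below are assumed values, not taken from the paper.

```python
DEAD_BAND_G = 0.1      # ignore small sideways tilts (assumed)
GAIN_S_PER_G = 30.0    # seconds of history scrolled per second, per g of tilt (assumed)

def update_scroll(offset_s, accel_x_g, dt_s, history_s=120.0):
    """Update the plot time offset from the sideways accelerometer reading;
    a larger tilt produces a larger and faster plot movement."""
    if abs(accel_x_g) > DEAD_BAND_G:
        offset_s += GAIN_S_PER_G * accel_x_g * dt_s
    return max(0.0, min(offset_s, history_s))   # stay within the stored 120 s
```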
VI. STUDENT SUPERVISION APPLICATION
The students supervision application represent the
systems most important component of remote
experimentation. It allows checking the students overall
learning process. This application has been also developed in
LabVIEW.
All the activities performed by students are stored in the
server database. The students supervision activity application
enables the extraction of this information and its graphical
display. This allows the teacher to analyse the students lab
performance and evaluate their learning process. The Fig. 8
shows the teachers interface tool.
The application is organized into four rows. The first row, at the top of the screen, shows the student's information (name, surname, ID) and course. The second row, below the first, displays the server connection information, such as the date and login/logout times; it also shows the characteristics of the device used by the student to connect to the remote experimentation system. The third row shows the controller types and parameters used in the tests. Finally, at the bottom of the screen, the output/reference and control signal evolution graphs are shown.

Besides the above, other statistical information, such as the total number of laboratory entries and the tests performed in each work session, is also displayed.
Fig. 8. Student activity supervision application.
Due to the fact that all students activity is stored in the
laboratory data base server, it is possible to include any type of
statistical analysis that the course teacher necessarily considers.
VII. REMOTE EXPERIMENTATION ACTIVITY
For every lab session, students must carry out the following activities:
• Prepare a pre-laboratory report with the controller design before testing it on the experimental setup. This report must be shown to the teacher in order to gain access to the remote experimentation server.
• Connect to the remote experimentation server through a smartphone.
• Select the controller type to be used, set all the experiment parameters through the graphical interface, and run the experiment.
• Save the collected data on their device for later off-line analysis.
• Submit a written report describing what they have learned. This report is due one week after the experiment, so that the teacher can measure the student's learning outcome.
VIII. PILOT EXPERIENCE RESULTS
The pilot experience was implemented in the compulsory automatic control course of the fourth academic year of electronic engineering. The course runs over a twelve-week period and comprises four hours a week of theoretical sessions, two written tests, and five practical experiences carried out every two weeks.
The pilot experience has been widely accepted and preferred by the students: they found the remote experimentation application easy and fun to use, and felt highly motivated to use it in their free time at the university instead of attending a regular practical class. In both cases the learning experience is equivalent, since all students must present a final results-analysis report. The only difference between the two methods lies in how the laboratory equipment is manipulated: some students operate the equipment remotely according to their own schedule, while others operate the same equipment hands-on when attending the laboratory class.
Students put greater effort into achieving better controller designs when using this client application. The haptic feedback lets students notice when the control signal exhibits inappropriate behavior, so they can make design corrections and obtain better responses. With the previous version of the client application [34], students received feedback on the system behavior only through the graphical interface; consequently, when they observed inappropriate control signal behavior, they were not motivated to improve their designs.
Adobe Flash technology makes it possible to create well-designed, friendly graphical user interfaces that are attractive to students. It also allows the application to be deployed across the most important mobile platforms, such as Android, iOS, and Windows Mobile. In this way, the number of remote experimentation system users could be increased to 90%,
compared with the 10% of users reached by the previous version of the application mentioned above.
IX. CONCLUSIONS
Nowadays, the technological advances achieved by smartphones have transformed these electronic devices into powerful tools for effective teaching and learning.
This paper presents a mobile device application that allows
the manipulation and control of physical laboratory equipment.
Using a smartphone, students can perform experiments on real lab equipment at their convenience, during their spare time on the university campus.
The remote experimentation system encourages intensive use of the lab equipment because it enables students to perform experiments even when the lab is closed. It also adapts easily to a variety of experiments and can be applied to other engineering or science laboratory areas.
Even though the vibration technology currently available on mobile devices is still quite limited, haptic feedback makes it possible to add new remote experimentation possibilities that can improve the teaching and learning process.
This work describes how haptic feedback was used to complement the visual and audio information that students normally receive when doing their experimental practices. The tactile information sensed by the students helped them improve the outcome of their controller designs.
The students supervision application helps the course
teacher to control each student laboratory activity, and evaluate
their learning process.
X. FUTURE WORK
Future work will be oriented in the following directions:
• Incorporate video management into the system. This would allow the user to see the real experimental setup on the mobile device and could provide a greater sense of telepresence.
• Add an online supervision module that monitors the student's activity during a test. This module would help identify design mistakes and allow an instant message to be sent to the student, warning them to make the necessary design corrections.
• Include in the student supervision application an option to generate a detailed written report describing the activity performed by each student during the lab tests, including date, system access time, and number of tests completed in each practice. It should also compare this information with the course average. The resulting report could be sent to the teacher periodically and automatically.
ACKNOWLEDGMENT
The authors gratefully acknowledge the support offered by
The Prometeo Project: Secretaría de Educación Superior,
Ciencia, Tecnología e Innovación (Senescyt) and Universidad
Politécnica Salesiana, both in the Republic of Ecuador, and the
Decanato de Investigación y Desarrollo at the Universidad
Simón Bolívar in Venezuela.
REFERENCES
[1] A. Kukulska-Hulme and J. Traxler, Mobile learning: a handbook for
educators and trainers, Routledge Taylor & Francis Group, 2005.
[2] M. Sarrab, Mobile learning (m-learning) concepts, characteristics,
methods, components: platforms and frameworks, Nova Science Pub
Inc, 2015.
[3] M. Pasamontes, J.L. Guzman, F. Rodríguez, M. Berenguel, and S.
Dormido, Easy mobile device programming for educational purposes,
Proc. of the 44th IEEE Conference on Decision and Control and
European Control Conference (CDC-ECC05), pp. 3420-3425, Dec.
2005.
[4] A. Kukulska-Hulme, M. Sharples, M. Milrad, I. Arnedillo-Sánchez and G. Vavoula, "Innovation in Mobile Learning: A European Perspective," Int. Journal of Mobile and Blended Learning, vol. 1, pp. 13-35, 2009.
[5] Z. Yusoff and H.M. Dahlan, Mobile based learning: an integrated
framework to support learning engagement through augmented reality
environment, Proc. of the 2013 International Conference on Research a
in Information Systems (ICRIIS 2013), pp. 251-256, Nov. 2013.
[6] G.J. Hwang and P.H. Wu, Applications, impacts and trends of mobile
technology-enhanced learning: a review of 2008-2012 publications in
selected SSCI journals, Int. Journal of Mobile Learning and
Organisation, vol. 8, pp. 83-95, 2014.
[7] D. Lopez-de-Ipina, J. Garcia-Zubia, P. Orduna, Remote control of Web
2.0-enabled laboratories from mobile devices, in Proc. of the Second
IEEE International Conference on e-Science and Grid Computing (e-
Science'06), Dec. 2006.
[8] P. Orduña, J. García-Zubia, J. Irurzun, D. López-de-Ipiña and L. Rodriguez-Gil, Enabling mobile access to remote laboratories, in Proc. of the 2011 IEEE Global Engineering Education Conference (EDUCON 2011), pp. 312-318, Apr. 2011.
[9] H. Hassan, J.M. Martínez-Rubio, A. Perles, J.V. Capella, C. Domínguez
and J. Albaladejo, Smartphone-based industrial informatics projects
and laboratories, IEEE transactions on industrial informatics, vol. 9, pp.
557-566, Feb. 2013.
[10] J.P.C. de Lima, W. Rochadel, A.M. Silva, J.P.S. Simão, J.B. da Silva
and J.B.M. Alves, Application of remote experiments in basic
education through mobile devices, in Proc. of the 2014 IEEE Global
Engineering Education Conference (EDUCON 2014), pp. 1093-1096,
Apr. 2014.
[11] A.K. Bejczy and J.K. Salisbury, Kinesthetic coupling between operator
and remote manipulator, in Proc. of the Int. Computer Technology
Conference, vol. 1, pp. 197-211, Aug. 1980.
[12] J.T. Dennerlein, P.A. Millman and R.D. Howe, Vibrotactile feedback for industrial telemanipulators, in Proc. of the ASME IMECE Sixth Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Nov. 1997.
[13] W.B. Griffin, W.R. Provancher and M.R. Cutkosky, Feedback
strategies for shared control in dexterous telemanipulation,in Proc. of
the 2003 IEEE/RSJ International Conference on Intelligent Robots and
Systems (IROS 2003), vol. 3, pp. 2791-2796, Oct. 2003.
[14] D. Pamungkas and K. Ward, Immersive teleoperation of a robot arm
using electro-tactile feedback, in Proc. of the 6th International
Conference on Automation, Robotics and Applications, pp. 300-305,
Feb. 2015.
[15] J. Rosen, B. Hannaford, M.P. MacFarlane and M.N. Sinanan, Force
controlled and teleoperated endoscopic grasper for minimally invasive
surgery-experimental performance evaluation, IEEE Transactions on
Biomedical Engineering, vol. 46, pp. 1212-1221, Oct. 1999.
[16] M. Minsky, M. Ouh-young, O. Steele, F.P. Brooks and M. Behensky, Feeling and seeing: issues in force display, in Proc. of the 1990 Symposium on Interactive 3D Graphics, pp. 235-241, 1990.
[17] C. Basdogan, S. De, J. Kim, M. Manivannan, H. Kim and M.A.
Srinivasan, Haptics in minimally invasive surgical simulation and
training, IEEE Computer Graphics and Applications, vol. 24, pp. 56-64,
March-April 2004.
[18] M. Aiple and A. Schiele, Pushing the limits of the CyberGraspTM for
haptic rendering, in Proc. of the IEEE International Conference on
Robotics and Automation (ICRA 2013), pp. 3541-3546, May 2013.
[19] J. Martínez, A. García, M. Oliver, J.P. Molina and P. González,
Identifying 3D geometric shapes with a vibrotactile glove, IEEE
Computer Graphics and Applications, in press.
[20] I. Poupyrev, S. Maruyama and J. Rekimoto, A mbient touch: designing
tactile interfaces for handheld devices, in Proc. of the 15th annual ACM
symposium on User interface software and technology (UIST'02), pp.
51-60, 2002.
[21] J. Luk et al., A role for haptics in mobile interaction: initial design
using a handheld tactile display prototype, in Proc. of the 2006
Conference on human factors in computing systems (CHI 2006), pp.
171-180, Apr. 2006.
[22] J. Cho, I. Hwang, and S. Oh, Vibration-based surface recognition for
smartphones, in Proc. of the 2012 IEEE International Conference on
Embedded and Real-Time Computing Systems and Applications
(RTCSA 2012), pp. 459-464, Aug. 2012.
[23] J.U. Lee, J.M. Lim, H. Shin, and K.U. Kyung, Haptic interaction with
user manipulation for smartphone, in Proc. of the 2013 IEEE
International Conference on Consumer Electronics (ICCE 2013), pp. 47-
48, Jan. 2013.
[24] Quanser Innovate Educate, July 15, 2015 [Online]. Available:
http://www.quanser.com/.
[25] Feedback Instruments Limited, July 15, 2015 [Online]. Available:
http://www.feedback-instruments.com/.
[26] Festo, July 15, 2015 [Online]. Available: http://www.festo.com/.
[27] LabVIEW, July 15, 2015 [Online]. Available:
http://www.ni.com/labview/.
[28] I. Titov, Using labVIEW for building laboratory server: pros and cons,
design patterns, software architecturing and common pitfalls, in Proc.
of the 2014 I EEE Global Engineering Education Conference (EDUCON
2014), pp. 1101-1107, Apr. 2014.
[29] Adobe Flash, July 15, 2015 [Online]. Available:
http://www.adobe.com/products/flash.html.
[30] R. Braunstein, ActionScript 3.0 Bible, Willey, 2010.
[31] Matlab, July 15, 2015 [Online]. Available:
http://www.mathworks.com/products/matlab/.
[32] Scilab, July 15, 2015 [Online]. Available: http://www.scilab.org/.
[33] S. Choi, K.J. Kuchenbecker, Vibrotactile display: perception,
technology, and applications, In Proceedings of the IEEE, Vol. 101, pp.
2093-2104, Sept. 2013.
[34] E. Granado, W. Colmenares, O. Pérez and G. Catalodo, Remote
experimentation using mobile technology, IEEE Latin America
Transactions, vol. 11, pp. 1121-1126, June 2013.