Remote Experimentation Using a Smartphone
Application with Haptic Feedback
Ernesto Granado
Universidad Simón Bolívar, Caracas, Venezuela
Prometeo Project Researcher-SENESCYT, Ecuador
Universidad Politécnica Salesiana, Cuenca, Ecuador
granado@usb.ve
Flavio Quizhpi
Julio Zambrano
Universidad Politécnica Salesiana
Cuenca, Ecuador
{fquizhpi, jzambranoa}@ups.edu.ec
William Colmenares
Universidad Simón Bolívar
Caracas, Venezuela
williamc@usb.ve
Abstract—This paper presents the development of a remote experimentation system for automatic control engineering education. Using a smartphone application, students can interact with real didactic laboratory equipment. Through a friendly graphical user interface and the information provided by built-in smartphone sensors, students can modify process parameters and verify in real time what is happening on the experimental setup. Mobile device vibration technology is used to add haptic tactile feedback to the application. This additional sense complements the visual and auditory information the user usually acquires when performing a practical experience, so students pay closer attention to the results achieved online. By taking advantage of today's smartphone technology, students feel highly motivated to use the device to improve their learning process. To reach a large number of users, the application runs on the most widely used mobile platforms. All user lab activities are stored in a database for the teacher's later analysis. Likewise, students can store the experiment results on their mobile device for offline analysis once the experiment is finished.
Keywords—Adobe AIR; control engineering education; haptic feedback; remote experimentation; technology enhanced learning.
I. INTRODUCTION
Cell phones have developed rapidly in recent years. They have evolved from simple devices for receiving and making calls into the powerful communication and computing tools they are today. Smartphones run an advanced mobile OS that supports applications such as word processors, spreadsheets, and multimedia players. They include wireless connectivity technologies such as Bluetooth and Wi-Fi, as well as Internet access that allows users to send and receive e-mails, browse the web, and chat. Users, particularly young people, like to use them for a wide variety of daily tasks: taking pictures, listening to music, communicating with others, reading news, checking bank statements, paying bills, buying goods, playing video games, and so forth. Thus, these devices, which students carry like a garment and like to interact with at all times, have become a convenient tool for increasing students' attention and improving teaching/learning methods.
This work goes a step further in the use of mobile devices to improve the teaching/learning process. The application described here contributes to the development of remote laboratory experimentation, where smartphone technology features are exploited to create an attractive application. The built-in sensors are used to interact with the application, and the built-in vibration technology provides haptic tactile feedback. Although current vibration technology on mobile devices has limitations, the vibratory stimulus students feel in their hands can be used to emphasize the behavior of the experiment's control signal. With information beyond vision and hearing, the learning/teaching process can be enhanced. This work thus contributes to the largely unexplored field of haptic feedback in remote laboratories.
The main aim of this application is to provide students with a tool that lets them take advantage of their spare time at the University. Even while the laboratory doors are closed, students can perform lab practices, for example while waiting for the next class. They do these practices within the university campus by connecting to the control lab wireless network.
It is well known that the growing demand for science and engineering careers calls for more university resources. Nowadays, the available lab equipment is often insufficient due to lack of funding and physical space limitations. This application allows efficient and intensive use of the existing lab equipment, since students can perform their experiments even when the lab and its facilities are closed.
978-1-4673-8633-3/16/$31.00 ©2016 IEEE 10-13 April 2016, Abu Dhabi, UAE
2016 IEEE Global Engineering Education Conference (EDUCON)
Page 240
The remote experimentation system can be easily adapted to any kind of physical experimental hardware and extended to other science or engineering laboratory areas.
II. RELATED WORKS
The opportunities offered by mobile technologies for learning anytime and anywhere introduced the concept of mobile learning, or m-learning [1], [2]. M-learning has evolved considerably during the last decade, and many teaching processes have been developed using mobile phones; see, for example, [3]–[6] and the references therein. More recently, there has also been important research on remote laboratories for teaching science and engineering. For example, the transformation of the remote lab WebLab-Deusto (at the University of Deusto, Spain) into a Web 2.0-enabled application accessible from any mobile device is described in [7]. Reference [8] presents two strategies to adapt a desktop remote laboratory to mobile devices: the first reuses the Web-based code but adjusts it to the mobile device's features, while the second uses native mobile technologies to exploit all the resources the mobile device provides. In [9] the authors propose a smartphone multimedia learning environment: the mobile device displays a web-based multimedia lecture that reviews the theoretical material, and the smartphone is also used to control a medium-size industrial process through Bluetooth communication. In [10] a mobile application is presented that enables users to control real experiments and monitor results via video streaming, providing a sense of involvement similar to hands-on laboratories; the application maintains compatibility across the most widely used platforms.
Vibrotactile feedback technology is currently used in portable devices to provide silent alerts, to confirm keystrokes when typing on a virtual keyboard, and to enhance the experience of mobile videogames. Numerous studies have examined haptic feedback technology, mainly focused on improving human-machine interaction. Some interesting works in the teleoperation area are [11]–[15] and the references therein; experiments have shown that haptic feedback can improve teleoperation performance. There is also much haptic research on virtual reality applications that enrich the sensory perception of a virtual environment, for example [16]–[18] and, in press, [19].
In recent years there has been interest in exploring the possibilities of haptic technology in mobile handheld devices. The sense of touch as a channel for handheld communication devices was investigated in [20], where a PDA was fitted with a universal tactile display (TouchEngine) that can produce a wide variety of tactile feelings, from simple clicks to complex vibrotactile patterns. In [21] the authors present a miniature actuator embedded in a handheld device that laterally stretches the skin of the user's thumb; the lateral deformation of the fingertip skin produces a dynamic tactile sensation. Reference [22] shows a smartphone-based system that uses the built-in vibrator and accelerometer to recognize the type of surface the mobile device is in contact with. In [23] a mechanism for mobile devices that provides realistic and interactive haptic feedback is presented: a thin actuator installed inside the case simulates a rapid, realistic response stimulus.
III. ARCHITECTURE AND COMPONENTS
The remote experimentation system has a typical client/server architecture and consists of three components: the server, the client, and the supervisor application. Fig. 1 shows the system architecture and components.
Fig. 1. Remote experimentation architecture.
The server is a standard computer connected to an experimental setup via a standard data acquisition (DAQ) system. The experimental hardware can be any physical process used for didactic purposes with electrical input and output signals, such as equipment from Quanser Innovate Education [24], Feedback Instruments Limited [25], or Festo [26], among others.
A Quanser coupled-tanks system is used for the experiments. The process consists of a pump and two coupled water tanks, forming an autonomous, closed, recirculating system, as shown in Fig. 1. The two tanks have uniform cross sections and are mounted on the front plate above a reservoir where the water is stored. Water flows from the first (upper) tank into the second (lower) tank, and the outflow from the second tank returns to the reservoir. Each tank has an outflow orifice at the bottom from which the liquid is withdrawn, and the pump drives the water from the bottom basin to the top of the system. The water level is measured by a pressure-sensitive sensor located at the bottom of each tank. The system input is the pump voltage, and the system outputs are the two measured tank levels.
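The paper does not give a mathematical model of this process, but the coupled-tank setup described above is commonly represented by the following mass balances (the symbols here are ours, not the paper's):

```latex
A_1 \frac{dL_1}{dt} = K_p V_p - a_1 \sqrt{2 g L_1}, \qquad
A_2 \frac{dL_2}{dt} = a_1 \sqrt{2 g L_1} - a_2 \sqrt{2 g L_2}
```

where $L_1$ and $L_2$ are the tank levels, $A_i$ the tank cross-sectional areas, $a_i$ the outflow orifice areas, $g$ the gravitational acceleration, and $K_p V_p$ the pump flow produced by the input voltage $V_p$.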
The client is a smartphone from which students can
manipulate experimental hardware and verify if the control
strategy fulfils the design specifications.
On the other hand, the supervisor is an application that allows the course teacher to supervise the students' learning process. From a personal computer, the teacher can access the database (DB) system that stores all the activity performed by students in the lab.
Client-server communication is performed using the
TCP/IP protocol, and both devices must be on the same
wireless sub-network. Only one client at a time can be
connected to the server. Each user can use the system
according to the laboratory schedule.
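The single-client rule described above can be sketched with sockets. The server in the paper is a LabVIEW application, so the following Python fragment is only an illustrative sketch of the idea: a listening socket with a backlog of one serves a single command connection at a time, and `handler` (a hypothetical name) maps each received command line to a reply.

```python
import socket
import threading

def serve_one_client(handler, host="127.0.0.1", port=0):
    """Accept one client at a time, as the paper's server does.

    `handler` maps a received text command to a reply string.
    (Illustrative sketch; the real server is a LabVIEW application.)
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)  # backlog of 1: additional clients must wait
    port = srv.getsockname()[1]

    def run():
        conn, _ = srv.accept()  # only one client is served
        with conn:
            for line in conn.makefile():
                # reply to each command line received from the client
                conn.sendall((handler(line.strip()) + "\n").encode())
        srv.close()

    threading.Thread(target=run, daemon=True).start()
    return port
```

A client on the same wireless sub-network would connect to the returned port, send commands such as experiment start/stop, and read back the process values.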
IV. SERVER APPLICATION
The server application was developed with the LabVIEW platform [27], version 2014. This software is a powerful graphical development tool that provides practical solutions to problems such as data processing, data communication, and computer-instrument interfacing. LabVIEW also enables inexperienced programmers to develop complex applications easily.
As shown in Fig. 2, the server application performs the following five tasks: TCP/IP communication, control strategies, data acquisition communication, database storage, and hardware safety. The server application uses the Producer/Consumer design pattern [28]. Each task runs in its own timed loop at a different rate, and the five loops execute in parallel. This software architecture simplifies application maintenance, since each task is programmed as a standalone module. For instance, if the TCP/IP communication or control strategy task needs to change, the modifications are made only in that module, without affecting the others. As a result, updates can be performed without compromising the system's integrity.
Fig. 2. Server application tasks.
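The Producer/Consumer pattern behind these parallel loops can be illustrated as two independent loops coupled by a queue. Everything below (function names, the `None` sentinel convention) is an illustrative sketch rather than the paper's LabVIEW code:

```python
import queue
import threading

def producer(q, samples):
    # Acquisition-style loop: push measurements at its own rate.
    for s in samples:
        q.put(s)
    q.put(None)  # sentinel: signals end of the run

def consumer(q, out):
    # Logging-style loop: drain measurements independently of the producer.
    while (item := q.get()) is not None:
        out.append(item)

def run_pipeline(samples):
    """Run one producer and one consumer concurrently; return what
    the consumer collected."""
    q = queue.Queue()
    out = []
    t1 = threading.Thread(target=producer, args=(q, samples))
    t2 = threading.Thread(target=consumer, args=(q, out))
    t1.start(); t2.start()
    t1.join(); t2.join()
    return out
```

Because the loops only share the queue, either side can be modified or re-timed without touching the other, which is exactly the maintenance benefit the paper describes.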
The TCP/IP communication module is responsible for managing communications with the client; that is, it processes the orders sent by the user through the mobile device. It is also responsible for periodically transmitting the process variable values to the client during the experiments.
The control module is in charge of managing the control strategy. Students can select among the controller types they have studied in the theory sessions, such as PID, lead/lag, and state feedback. Different control strategies can be implemented in this module, and new controller types can be added easily. Because the control task is critical to the correct operation of the experimental setup, this module has the highest execution priority and the shortest loop period.
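As a concrete example of a controller this module might run, a minimal discrete PID with output saturation to a 0-10 V actuator range can be sketched as follows (class and parameter names are ours; the paper's implementation is in LabVIEW):

```python
class PID:
    """Minimal discrete PID controller (illustrative sketch)."""

    def __init__(self, kp, ki, kd, dt, u_min=0.0, u_max=10.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max  # actuator/DAQ voltage limits
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        # Classic PID law computed once per control period dt.
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Saturate to the actuator range before sending to the DAQ.
        return min(max(u, self.u_min), self.u_max)
```

Each new controller type (lead/lag, state feedback) would be another class with the same `step` interface, which mirrors how the paper says new controllers are added to the module.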
The data acquisition module manages the DAQ input and output channels. It sends data to and receives data from the experimental setup according to the commands received from the client application. Depending on the experiment in use (tank, motor, etc.), the system administrator may adjust the sampling and control times.
The DB module is in charge of database management. All the activity performed by the student is stored for the teacher's later analysis. The system has four relational tables. The first table contains the student's identification information, such as name, surname, student ID, password, and email. When a user tries to connect to the lab server, the login information is validated against the data stored in this table. The second table stores the times and dates the student logged in and out. The third table stores all the information generated by the student during the connection time; each test the user performs is stored here (control parameters, set point, output, and control signal evolution). The fourth and last table is administered by the course teacher and contains the access permissions granted to the student, such as the scheduled laboratory practices, the maximum lab usage time, and the practices the student is qualified to perform.
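For illustration, the four relational tables could be rendered as the following SQLite schema created from Python. All table and column names are guesses from the textual description above, not the system's actual schema:

```python
import sqlite3

# Hypothetical rendering of the four tables the paper describes.
SCHEMA = """
CREATE TABLE students (          -- table 1: identification / login data
    student_id TEXT PRIMARY KEY,
    name TEXT, surname TEXT, password TEXT, email TEXT
);
CREATE TABLE sessions (          -- table 2: login/logout times and dates
    session_id INTEGER PRIMARY KEY AUTOINCREMENT,
    student_id TEXT REFERENCES students(student_id),
    login_time TEXT, logout_time TEXT
);
CREATE TABLE tests (             -- table 3: every test run during a session
    test_id INTEGER PRIMARY KEY AUTOINCREMENT,
    session_id INTEGER REFERENCES sessions(session_id),
    controller_type TEXT, parameters TEXT,
    set_point REAL, output_log TEXT, control_log TEXT
);
CREATE TABLE permissions (       -- table 4: teacher-administered access rights
    student_id TEXT REFERENCES students(student_id),
    scheduled_practice TEXT, max_lab_minutes INTEGER
);
"""

def create_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```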
The safety module prevents inappropriate hardware manipulation by students. Depending on the experimental setup used in the laboratory, maximum values for tank level, temperature, and motor speed are configured. These safety values are set by the system administrator. If a safety value is reached, the actuator voltage (pump, motor, etc.) is suppressed, and the user cannot run a new test until the laboratory equipment returns to its initial condition.
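The safety rule described above amounts to a simple threshold check; a minimal sketch with hypothetical function and parameter names:

```python
def safety_check(level, max_level, pump_voltage):
    """Suppress the actuator voltage when the measured level reaches
    the administrator-configured limit (sketch of the safety module).

    Returns (voltage_to_apply, locked_out).
    """
    if level >= max_level:
        return 0.0, True   # cut pump voltage and lock out new tests
    return pump_voltage, False
```

The same pattern would apply to temperature or motor-speed limits on other setups.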
V. CLIENT APPLICATION
Adobe Flash technology [29] is used to develop the client application with a well-designed, friendly graphical user interface; this software is well suited to graphic design work. It also allows deploying Adobe Integrated Runtime (AIR) applications that run across different mobile platforms, such as Google Android, Apple iOS, and Microsoft Windows Mobile, so the app can reach much larger audiences. In other words, to execute the application on a mobile device, only the AIR runtime must be downloaded and installed. ActionScript 3.0 [30] is used as the programming language.
A. Tasks
The client application performs the following four tasks: TCP/IP communication, graphical display, data file saving, and built-in sensors/actuators. The client tasks are shown in Fig. 3.
The TCP/IP communication module is responsible for managing communications with the server. The client both initiates and terminates the communication. All actions performed by the user on the mobile interface are sent to the server application, including, among others, the experiment start and stop commands, process parameter modifications, and control strategy selection. In addition, this module receives from the server the values produced by the experimental setup.
Fig. 3. Client application tasks.
The graphical display task updates in real time both the graphical animations and the other information shown in the interface, using the signal values produced by the experimental setup.
The data file saving task enables the user to store on the mobile device a text file with all the experimental data. Afterwards, the student can use this file with a computational tool such as Matlab [31] or Scilab [32] to perform an offline analysis and write the report. The student must deliver this report to the course teacher the week after the experiment is performed.
The built-in sensors/actuators task manages the smartphone's sensors and actuators. Although the touch screen is the main way of interacting with the client application, the user can also use the built-in sensors for data input. This module obtains the information provided by the accelerometer and is also in charge of sending to the vibration actuator the control signal value produced by the experiment, to generate the vibrotactile feedback.
B. Haptic Feedback
Nowadays many smartphones include vibration technology. The oscillating movement the user feels, known as vibrotactile feedback, is created by a software-controlled actuator. This sensation is experienced, for example, when typing text and a specific keystroke is registered. There are three main technologies for these actuators [33]: the Eccentric Rotating Mass (ERM), the Linear Resonant Actuator (LRA), and the Piezoelectric Actuator.
In this work, haptic tactile feedback is used to enhance the remote experimentation experience. The vibration indicates the control signal amplitude, so the student can feel the differences in the control signal among the various controller types. As shown in Fig. 4, the built-in sensor/actuator task is in charge of creating a vibration intensity proportional to the control signal magnitude.
Fig. 4. Haptic generation from control signal value.
This additional sense complements the acoustic and visual information and helps the student get a more precise feeling of the control signal behavior.
Nowadays, the most widespread vibration technology in smartphones is the ERM. Although the vibration amplitude cannot be changed with this technology, the vibration duration can. Therefore, the ERM on and off timing must be controlled carefully to generate different vibration intensities; that is, the motor spin duty cycle must be modified. On the other hand, since the motor is an inertial system, it requires both a start-up and a stop time. This latency must be taken into account when the motor vibration is turned on and off in a periodic pattern, as shown in Fig. 5. In addition, the latency varies, since it depends on the motor's speed and acceleration.
Fig. 5. Control signal to pulse vibration duration conversion.
A period of 200 ms was selected for the haptic feedback. This time frame coincides with the update period of the values communicated from the experimental setup. Different vibration sensations must be generated for different control signal values. Due to the ERM latency, it is not possible to associate each individual control signal value with its own vibration intensity. For this reason, a distinct vibration intensity is generated for each control signal value range, as shown in Fig. 5. In addition, a soft vibration is created for small control signal values and a very strong vibration for high control signal values. The main objective is to create as many vibration patterns as the user can easily differentiate and recognize.
After several tests, four duty cycle patterns were defined for spinning the vibration motor. These four vibrating pulse-train patterns generate sensations the user can easily differentiate and recognize; with a larger number of patterns, the vibration intensities could no longer be told apart. A vibration with a 15% duty cycle corresponds to a control signal between 0.5 and 2.5 V, producing a very soft but perceptible vibration. As the control signal increases, the motor activation duty cycle also rises, as shown in Fig. 5, which intensifies the vibration. The maximum vibration intensity is generated with a duty cycle of 60% and corresponds to the control signal range from 7.5 to 10 V; 10 V is the maximum voltage the DAQ can measure.
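The banded mapping from control signal to vibration duty cycle can be sketched as follows. Only the 15% band (0.5-2.5 V) and the 60% band (7.5-10 V) are stated in the text; the two intermediate bands below are illustrative guesses:

```python
# (low_V, high_V, duty_cycle) bands, following the scheme of Fig. 5.
BANDS = [
    (0.5, 2.5, 0.15),   # soft but perceptible vibration (from the paper)
    (2.5, 5.0, 0.30),   # illustrative intermediate band
    (5.0, 7.5, 0.45),   # illustrative intermediate band
    (7.5, 10.0, 0.60),  # strongest vibration (from the paper)
]

def duty_cycle(u):
    """Map a control-signal value (V) to a motor duty cycle."""
    for lo, hi, dc in BANDS:
        if lo <= u < hi or (u == hi == 10.0):
            return dc
    return 0.0  # below 0.5 V: no vibration

def pulse_times(u, period_ms=200):
    """On/off durations (ms) within one 200 ms haptic period."""
    dc = duty_cycle(u)
    return dc * period_ms, (1 - dc) * period_ms
```

In the real application the ERM start-up/stop latency would also have to be subtracted from the on-time, as the paper notes.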
Even though the vibration technology in current mobile devices still has limitations, it is used here to explore the potential benefits that haptic feedback can bring to the remote-lab learning process.
C. Graphic User Interface
The graphical user interface is designed for the small screen of a mobile device. Fig. 6 shows the smartphone application's user interface, through which the student interacts with the experimental setup.
Fig. 6. Client application user interface.
On its right-hand side, the interface has the main control bar with four buttons: start/stop, view, save, and exit. The exit button disconnects from the server. The start/stop buttons start and stop the experimental test. The view button quickly switches between the two application views: the graphical view (bottom of Fig. 6) and the animation view (top of Fig. 6). The save button stores on the smartphone a text file with the data obtained from the experiment; it is enabled only once the user has concluded the test.
The animation view displays a diagram of the experimental setup that shows the experimental data in real time. From here, the user may also change the controller parameters and the reference level set-point. In addition, by pressing the tabs at the top of the screen, the student can select the control strategy to be studied, such as PID, lead/lag, or state feedback.
The graphical view is composed of two graphs. The top one plots the evolution of the output and reference signals; the bottom one displays the evolution of the control signal. Both plots also update in real time. The graphs display 120 s of information; if the experiment exceeds this limit, the plots automatically scroll to the left to show the latest values obtained from the experimental setup.
To review past values, the user can move the plots to the right or to the left by tilting the phone sideways. As the inclination angle toward the vertical axis increases, the plots move farther and faster, as illustrated in Fig. 7. The smartphone accelerometer is used to detect the sideways inclinations performed by the user; these calculations are done inside the built-in sensor/actuator task.
Fig. 7. Horizontally inclining to review past values.
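The tilt-to-scroll behaviour of Fig. 7 can be sketched as a mapping from inclination angle to scroll step. The dead zone and scaling below are our choices, not values from the paper:

```python
def scroll_step(tilt_deg, max_step=50, max_tilt=90.0, dead_zone=5.0):
    """Map sideways phone inclination (degrees) to a plot scroll step.

    Larger tilt toward the vertical axis -> larger, faster scrolling,
    as in Fig. 7. Sign gives the scroll direction.
    """
    if abs(tilt_deg) < dead_zone:
        return 0  # ignore small hand tremor
    frac = min(abs(tilt_deg), max_tilt) / max_tilt
    step = int(max_step * frac)
    return step if tilt_deg > 0 else -step
```

In the application, the tilt angle would be derived from the accelerometer readings inside the built-in sensor/actuator task and applied to the plot's horizontal offset each display update.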
VI. STUDENT SUPERVISION APPLICATION
The student supervision application is one of the most important components of the remote experimentation system: it allows the teacher to check the students' overall learning process. This application has also been developed in LabVIEW.
All the activities performed by students are stored in the server database. The supervision application extracts this information and displays it graphically, allowing the teacher to analyse the students' lab performance and evaluate their learning process. Fig. 8 shows the teacher's interface tool.
The application's display is organized in four rows. The first row, at the top of the screen, shows the student's information (name, surname, ID) and the course being studied. The second row, below the first, displays the server connection information for the user's sessions, such as the date and login/logout times; it also shows the features of the device the student used to connect to the remote experimentation system. The third row shows the controller types and parameters used in the tests. Finally, at the bottom of the screen, graphs of the output/reference and control signal evolution are shown.
In addition, other statistical information, such as the total number of laboratory entries and the tests performed in each work session, is also shown.
Fig. 8. Supervision student activity application.
Since all student activity is stored in the laboratory database server, any type of statistical analysis the course teacher considers necessary can be included.
VII. REMOTE EXPERIMENTATION ACTIVITY
For every lab session students must do the following
activities:
- Prepare a pre-laboratory report with the controller design before testing it on the experimental setup. This report must be shown to the teacher in order to gain access to the remote experimentation server.
- Connect to the remote experimentation server through a smartphone.
- Select the controller type to be used, set all the experiment parameters through the graphical interface, and run the experiment.
- Save the collected data on their devices for later offline analysis.
- Submit a written report describing what they have learned. This report must be delivered one week after the experiment was performed, so that the teacher can measure the students' learning outcomes.
VIII. PILOT EXPERIENCE RESULTS
The pilot experience was carried out in the compulsory automatic control course of the fourth academic year of the electronic engineering program. The course runs over a twelve-week period as follows: four hours per week of theoretical sessions, two written tests, and five practical experiences carried out every two weeks.
The pilot experience has been widely accepted and preferred by the students: they found the remote experimentation application easy and fun to use, and felt highly motivated to use it in their free time at the University instead of attending a regular experimental class. In both cases, equivalent learning experiences are obtained, since all students must present a final results-analysis report. The only difference between the two methods lies in how the laboratory equipment is manipulated: some students operate the equipment through the remote device according to their own schedules, while others work hands-on with the same equipment in the laboratory classroom.
Students have put greater effort into achieving better controller designs when using this client application. The haptic feedback lets students notice when the control signal exhibits inappropriate behavior, so they can make design corrections and obtain better responses. With the previous version of the client application [34], students received feedback on the system behavior only through the graphical interface; therefore, when they observed inappropriate control signal behavior, they were not motivated to improve their designs.
Adobe Flash technology allows the creation of well-designed, friendly graphical user interfaces that are attractive to students. It also permits deploying the application across the most important mobile platforms: Android, iOS, and Windows Mobile. As a result, the share of users of the remote experimentation system increased to 90%, compared with the 10% reached by the previous application version mentioned above.
IX. CONCLUSIONS
Nowadays, the technological advances achieved by smartphones have transformed these electronic devices into powerful tools for effective teaching and learning.
This paper presented a mobile device application that allows the manipulation and control of physical laboratory equipment. Using a smartphone, students can run experiments on real lab equipment at their convenience, during their spare time on the University campus.
The remote experimentation system encourages intensive use of the lab equipment, because it enables students to run experiments even when the lab is closed. It also adapts easily to a variety of experiments, and its use can be extended to other engineering or science lab areas.
Even though the vibration technology currently available on mobile devices is still limited, haptic feedback makes it possible to incorporate new remote experimentation possibilities that can genuinely improve the teaching and learning process.
This work described how haptic feedback was used to complement the visual and auditory information students normally receive during their experimental practices. The tactile information sensed by the students helped them improve the outcome of their controller designs.
The student supervision application helps the course teacher monitor each student's laboratory activity and evaluate their learning process.
X. FUTURE WORK
Future work will be oriented in the following directions:
- Incorporate video management into the system. This would allow the user to see the real experimental setup on the mobile device and could provide a greater sense of telepresence.
- Add an online supervision module that monitors the student's activity during a test. This module should help identify design mistakes and allow an instant message to be sent to the student, warning them to make the necessary design corrections.
- Include a detailed written report option in the student supervision application, describing the activity performed by each student during the lab tests: date, system access time, and number of tests completed in each practice. Additionally, it should compare this information with the course average score. The resulting report could be sent to the teacher periodically and automatically.
ACKNOWLEDGMENT
The authors gratefully acknowledge the support offered by
The Prometeo Project: Secretaría de Educación Superior,
Ciencia, Tecnología e Innovación (Senescyt) and Universidad
Politécnica Salesiana, both in the Republic of Ecuador, and the
Decanato de Investigación y Desarrollo at the Universidad
Simón Bolívar in Venezuela.
REFERENCES
[1] A. Kukulska-Hulme and J. Traxler, Mobile learning: a handbook for
educators and trainers, Routledge Taylor & Francis Group, 2005.
[2] M. Sarrab, Mobile learning (m-learning) concepts, characteristics,
methods, components: platforms and frameworks, Nova Science Pub
Inc, 2015.
[3] M. Pasamontes, J.L. Guzman, F. Rodríguez, M. Berenguel, and S. Dormido, "Easy mobile device programming for educational purposes," in Proc. of the 44th IEEE Conference on Decision and Control and European Control Conference (CDC-ECC'05), pp. 3420-3425, Dec. 2005.
[4] A. Kukulska-Hulme, M. Sharples, M. Milrad, I. Arnedillo-Sánchez and
G. Vavoula, "Innovation in Mobile Learning: A European Perspective,"
Int. Journal of Mobile and Blended Learning, Vol. 1, pp. 1335, 2009.
[5] Z. Yusoff and H.M. Dahlan, "Mobile based learning: an integrated framework to support learning engagement through augmented reality environment," in Proc. of the 2013 International Conference on Research and Innovation in Information Systems (ICRIIS 2013), pp. 251-256, Nov. 2013.
[6] G.J. Hwang and P.H. Wu, Applications, impacts and trends of mobile
technology-enhanced learning: a review of 2008-2012 publications in
selected SSCI journals, Int. Journal of Mobile Learning and
Organisation, vol. 8, pp. 83-95, 2014.
[7] D. Lopez-de-Ipina, J. Garcia-Zubia, and P. Orduna, "Remote control of Web 2.0-enabled laboratories from mobile devices," in Proc. of the Second IEEE International Conference on e-Science and Grid Computing (e-Science'06), Dec. 2006.
[8] P. Orduña, J. García-Zubia, J. Irurzun, D. López-de-Ipiña and L. Rodriguez-Gil, "Enabling mobile access to remote laboratories," in Proc. of the 2011 IEEE Global Engineering Education Conference (EDUCON 2011), pp. 312-318, Apr. 2011.
[9] H. Hassan, J.M. Martínez-Rubio, A. Perles, J.V. Capella, C. Domínguez and J. Albaladejo, "Smartphone-based industrial informatics projects and laboratories," IEEE Transactions on Industrial Informatics, vol. 9, pp. 557-566, Feb. 2013.
[10] J.P.C. de Lima, W. Rochadel, A.M. Silva, J.P.S. Simão, J.B. da Silva and J.B.M. Alves, "Application of remote experiments in basic education through mobile devices," in Proc. of the 2014 IEEE Global Engineering Education Conference (EDUCON 2014), pp. 1093-1096, Apr. 2014.
[11] A.K. Bejczy and J.K. Salisbury, "Kinesthetic coupling between operator and remote manipulator," in Proc. of the Int. Computer Technology Conference, vol. 1, pp. 197-211, Aug. 1980.
[12] J.T. Dennerlein, P.A. Millman and R.D. Howe, "Vibrotactile feedback for industrial telemanipulators," in Proc. of the ASME IMECE Sixth Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Nov. 1997.
[13] W.B. Griffin, W.R. Provancher and M.R. Cutkosky, "Feedback strategies for shared control in dexterous telemanipulation," in Proc. of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003), vol. 3, pp. 2791-2796, Oct. 2003.
[14] D. Pamungkas and K. Ward, "Immersive teleoperation of a robot arm using electro-tactile feedback," in Proc. of the 6th International Conference on Automation, Robotics and Applications, pp. 300-305, Feb. 2015.
[15] J. Rosen, B. Hannaford, M.P. MacFarlane and M.N. Sinanan, "Force controlled and teleoperated endoscopic grasper for minimally invasive surgery-experimental performance evaluation," IEEE Transactions on Biomedical Engineering, vol. 46, pp. 1212-1221, Oct. 1999.
[16] M. Minsky, M. Ouh-young, O. Steele, F.P. Brooks and M. Behensky, "Feeling and seeing: issues in force display," in Proc. of the 1990 Symposium on Interactive 3D Graphics, pp. 235-241, 1990.
978-1-4673-8633-3/16/$31.00 ©2016 IEEE 10-13 April 2016, Abu Dhabi, UAE
2016 IEEE Global Engineering Education Conference (EDUCON)
Page 246
[17] C. Basdogan, S. De, J. Kim, M. Manivannan, H. Kim and M.A. Srinivasan, "Haptics in minimally invasive surgical simulation and training," IEEE Computer Graphics and Applications, vol. 24, pp. 56-64, March-April 2004.
[18] M. Aiple and A. Schiele, "Pushing the limits of the CyberGrasp™ for haptic rendering," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA 2013), pp. 3541-3546, May 2013.
[19] J. Martínez, A. García, M. Oliver, J.P. Molina and P. González, "Identifying 3D geometric shapes with a vibrotactile glove," IEEE Computer Graphics and Applications, in press.
[20] I. Poupyrev, S. Maruyama and J. Rekimoto, "Ambient touch: designing tactile interfaces for handheld devices," in Proc. of the 15th Annual ACM Symposium on User Interface Software and Technology (UIST'02), pp. 51-60, 2002.
[21] J. Luk et al., "A role for haptics in mobile interaction: initial design using a handheld tactile display prototype," in Proc. of the 2006 Conference on Human Factors in Computing Systems (CHI 2006), pp. 171-180, Apr. 2006.
[22] J. Cho, I. Hwang, and S. Oh, "Vibration-based surface recognition for smartphones," in Proc. of the 2012 IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2012), pp. 459-464, Aug. 2012.
[23] J.U. Lee, J.M. Lim, H. Shin, and K.U. Kyung, "Haptic interaction with user manipulation for smartphone," in Proc. of the 2013 IEEE International Conference on Consumer Electronics (ICCE 2013), pp. 47-48, Jan. 2013.
[24] Quanser Innovate Educate, July 15, 2015 [Online]. Available:
http://www.quanser.com/.
[25] Feedback Instruments Limited, July 15, 2015 [Online]. Available:
http://www.feedback-instruments.com/.
[26] Festo, July 15, 2015 [Online]. Available: http://www.festo.com/.
[27] LabVIEW, July 15, 2015 [Online]. Available:
http://www.ni.com/labview/.
[28] I. Titov, "Using LabVIEW for building laboratory server: pros and cons, design patterns, software architecturing and common pitfalls," in Proc. of the 2014 IEEE Global Engineering Education Conference (EDUCON 2014), pp. 1101-1107, Apr. 2014.
[29] Adobe Flash, July 15, 2015 [Online]. Available:
http://www.adobe.com/products/flash.html.
[30] R. Braunstein, ActionScript 3.0 Bible, Wiley, 2010.
[31] Matlab, July 15, 2015 [Online]. Available:
http://www.mathworks.com/products/matlab/.
[32] Scilab, July 15, 2015 [Online]. Available: http:// www.scilab.org/.
[33] S. Choi and K.J. Kuchenbecker, "Vibrotactile display: perception, technology, and applications," Proceedings of the IEEE, vol. 101, pp. 2093-2104, Sept. 2013.
[34] E. Granado, W. Colmenares, O. Pérez and G. Catalodo, "Remote experimentation using mobile technology," IEEE Latin America Transactions, vol. 11, pp. 1121-1126, June 2013.