Figure 7 - uploaded by J. Norberto Pires
Module fcteste6 loaded to the robot controller ready for execution 

Contexts in source publication

Context 1
... robot can now be guided by hand (Pires, 2008), enabling the user to move it to the desired positions: this means the user can easily reach the positions and orientations best suited to the task being programmed. After reaching a desired position, the user can command a MOVE instruction using the velocity and precision values currently in use. For example, the following sequence of commands draws the first "S" of the desired word.

The user moves the robot to 10 mm above the beginning of the character "S":

User: robot position final (defines maximum precision)
Robot: Position final, master.
User: computer instruction move joint (acquires the position and sets a MOVEJ instruction (ABB Robotics, 2007a))
Robot: Instruction move joint, master.

The user then moves the robot to touch the paper (point 1) and draws the character "S" using straight lines (six points are necessary):

User: computer instruction move line (acquires the position and sets a MOVEL instruction (ABB Robotics, 2007a))
Robot: Instruction move line, master.

The user moves the robot to point 2:

User: computer instruction move line
Robot: Instruction move line, master.

After finalizing the program, the user saves the module, uploads it to the robot controller and executes it (Figure 7; ABB Robotics, 2007a):

User: computer save program
Robot: Save program, master. (the program is saved using the RAPID language syntax)
User: robot load module
Robot: Load module, master. (the module is uploaded to the robot controller using the services offered by the TCP/IP server)
User: robot execute
Robot: Are you sure, master?
User: robot yes/no
Robot: Executing, master. / Ok. Command aborted, master.

Figure 7 shows the modules loaded in the robot control system (the view is obtained from RobotStudio Online, a tool that enables users to browse the applications loaded in the controller). The fcteste6 module is the one built in the example presented in this section, after being uploaded to the system. Figure 7 also shows part of the PbD user interface (right) with the RobotStudio Online window below. It is important to stress that the robot program built in this section was obtained without writing a single line of code, but instead just by moving the robot to the desired positions and adding the required robot instructions using speech. Even the upload of the resulting module to the robot controller is commanded by speech, along with its execution/termination. Consequently, teaching a new feature to the robotic system is accessible to any type of user with only minor training.

Most industrial systems are semi-autonomous and require only minor operator intervention to operate. In a real production setup, it would be interesting to have more portable solutions. Consequently, portable devices such as PDAs (Pocket PCs) and mobile phones may be used to run part of the monitoring and control software, or even the speech interfaces. Part of the features presented in this paper were also developed to run on Pocket PCs and SmartPhones, i.e. on portable devices running the Windows Mobile operating system. Both types of device are programmable in C# and run equivalent ASR and TTS engines, which enables the same solutions to be developed for these portable devices. Figure 8 shows the shell of an application developed to run on SmartPhones (Windows Mobile 5 or higher) to control an industrial system similar to the one presented in this paper. 
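To make the command-to-code step concrete, the following C# sketch illustrates how recognized speech commands of the kind listed above could be mapped to RAPID MOVEJ/MOVEL instructions and assembled into a module such as fcteste6. This is a minimal illustration, not the authors' implementation: the class and method names (RobotTarget, PbdCodeGenerator, BuildModule, etc.) are hypothetical, and details such as speed data, tool frames and configuration data are simplified.

// Hypothetical sketch: turn recognized speech commands into RAPID code.
using System.Collections.Generic;
using System.Text;

public class RobotTarget
{
    // Simplified pose: position in mm and an orientation quaternion.
    public double X, Y, Z;
    public double Q1 = 1, Q2, Q3, Q4;

    public override string ToString() =>
        $"[[{X:F1},{Y:F1},{Z:F1}],[{Q1:F4},{Q2:F4},{Q3:F4},{Q4:F4}],[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]]";
}

public class PbdCodeGenerator
{
    private readonly List<string> _instructions = new List<string>();
    private string _speed = "v100";   // current velocity setting
    private string _zone = "fine";    // "robot position final" => maximum precision

    public void SetPrecisionFinal() => _zone = "fine";

    // "computer instruction move joint" => add a MOVEJ at the current pose.
    public void AddMoveJoint(RobotTarget t) =>
        _instructions.Add($"MoveJ {t}, {_speed}, {_zone}, tool0;");

    // "computer instruction move line" => add a MOVEL at the current pose.
    public void AddMoveLine(RobotTarget t) =>
        _instructions.Add($"MoveL {t}, {_speed}, {_zone}, tool0;");

    // "computer save program" => emit the RAPID module text.
    public string BuildModule(string moduleName)
    {
        var sb = new StringBuilder();
        sb.AppendLine($"MODULE {moduleName}");
        sb.AppendLine("PROC main()");
        foreach (var line in _instructions)
            sb.AppendLine("  " + line);
        sb.AppendLine("ENDPROC");
        sb.AppendLine("ENDMODULE");
        return sb.ToString();
    }
}

In this sketch, saying "computer instruction move line" at the current pose would correspond to a call to AddMoveLine with that pose, and "robot load module" would correspond to sending the text returned by BuildModule("fcteste6") to the controller over its TCP/IP server, as described above.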
Portable devices may be very interesting for monitoring and on-the-task (online) operations, improving in this way the coworker scenario (Pires, 2006a, b, 2007; SMErobot, 2005-2009; Dillmann et al., 1999).

This paper introduces a PbD system designed to work with industrial cells, namely those composed of industrial robots. The presented system combines force control, a multilingual speech interface (running Portuguese and English) and code-generation techniques to build a system that relies on currently available standards to reach its level of functionality. The paper focuses on application details and describes ways to have industrial systems interface better and more naturally with human operators. The PbD system presented in this paper proved very efficient at programming entirely new features into an industrial robotic system. The system uses a speech interface for user commands and a force-controlled guiding system for teaching the robot the details of the task being programmed. With only a small set of implemented robot instructions it was fairly easy to teach a new task to the robot system, generate the robot code and execute it immediately. Although a particular robot controller was used, the system is in many aspects general, since the options adopted are mainly based on standards. It can be implemented with other robot controllers without significant changes; in fact, most of the features were successfully ported to Motoman robots (Pires, 2006a). The authors are convinced that this type of PbD system will constitute a major advantage for SMEs, since most of those companies do not have the engineering resources needed to make changes or add new functionality to their robotic manufacturing systems. Even at the system-integrator level these systems are very useful as a way to avoid requiring specific knowledge of every controller they work with: complexity is hidden behind the speech interfaces and portable interface devices, with specific and user-friendly APIs connecting the programmer and the system. This is the basic idea of HLP (high-level programming), which is needed so that developers can focus on the problem they have to solve instead of worrying about the details of getting things set up and done.

Future work is directed at supporting these features with specially designed services under a standard SOA (Veiga et al., 2007; Veiga and Pires, 2008a; James and Smith, 2005; SIRENA Project, 2005). This approach will enable users at both the operational and integrator levels to reduce their activity to the use of services that expose the complete functionality of the system, handling all features and events and allowing them to focus on the task under development or the problem being solved. The authors selected a few SOA architectures, namely UPnP and DSSP (Veiga et al., 2007; Veiga and Pires, 2008a; James and Smith, 2005; SIRENA Project, 2005), to control manufacturing cells and compared the results obtained from their application to a test bed. The idea was to make a comprehensive comparison of both technologies and, in the process, discuss the use of SOA for system integration and high-level programming of robotic manufacturing cells (the topic of the current paper). Automatic generation of services directly from robot code is also being implemented as a way to easily generate services that can be used with high-level programming strategies. 
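As an illustration of the service-oriented direction outlined above, the following C# sketch shows the kind of high-level contract that could expose a robot cell's functionality to operators and integrators while hiding controller-specific details. The interface and its members are hypothetical and only meant to convey the idea; they are not the authors' API nor a UPnP/DSSP definition.

// Hypothetical high-level service contract for a robot cell (illustrative only).
using System;
using System.Threading.Tasks;

public interface IRobotCellService
{
    // Upload a RAPID (or equivalent) module produced by the PbD system.
    Task LoadModuleAsync(string moduleName, string moduleSource);

    // Start and stop execution of a previously loaded module.
    Task ExecuteAsync(string moduleName);
    Task StopAsync();

    // Cell events (program started/stopped, errors, I/O changes) so that
    // monitoring clients, e.g. on a SmartPhone, can react to them.
    event EventHandler<string> CellEvent;
}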
ABB Robotics (2007a), ABB IRB140 Users' Manual, ABB Robotics, Västerås.
ABB Robotics (2007b), ABB IRC5 Machining FC Users' Manual, ABB Robotics, Västerås.
ATI Industrial Automation (2008), "ATI NANO17 Force-Torque Sensor", available at: www.ati-ia.com/
Biggs, G. and MacDonald, B. (2003), "A survey of robot programming systems", Proceedings of the Australasian Conference on Robotics and Automation, December 1-3.
Dillmann, R., Rogalla, O., Ehrenmann, M., Zöllner, R. and Bordegoni, M. (1999), "Learning robot behaviour and skills based on human demonstration and advice: the machine learning paradigm", Proceedings of the 9th International Symposium of Robotics Research, Snowbird, Utah, USA, October 9-12, pp. 229-38.
Hägele, M., Nilsson, K. and Pires, J.N. (2008), "Industrial robotics", in Siciliano, B. and Khatib, O. (Eds), Handbook of Robotics, Chapter 42, Springer, New York, NY.
James, F. and Smith, H. (2005), "Service oriented paradigms for industrial automation", IEEE Transactions on Industrial Informatics, Vol. 1 No. 1, pp. 62-70.
Mittal, R.K. and Nagrath, I.J. (2003), Robotics and Control, McGraw-Hill, New York, NY.
Myers, D., Pritchard, M. and Brown, M. (2001), "Automated programming of an industrial robot through teach-by-showing", Proceedings of the IEEE International Conference on Robotics and Automation, May 21-26, pp. 4078-83.
Pires, J.N. (2005), "Robot-by-voice: experiments on commanding an industrial robot using the human voice", Industrial Robot: An International Journal, Vol. 32 No. 6, pp. 505-11.
Pires, J.N. (2006a), Industrial Robots Programming: Building Applications for the Factories of the Future, Springer, New York, NY.
Pires, J.N. (2006b), "Robotics for small and medium enterprises: control and programming challenges", Industrial Robot, Vol. 33 No. ...

Similar publications

Article
Full-text available
Most firms are increasingly realizing the benefits of involving outside suppliers by considering their manufacturing processes and technological capabilities, especially regarding quality, time to market, configuration, control and cost. Nevertheless, in the context of small to medium enterprises (SMEs), scant attention has been given to the em...
Article
Full-text available
This study focuses on the impact that organizational culture has on driving strategic innovation in Egyptian Micro, Small and Medium Enterprises (M/SMEs). Organizational culture constitutes an integral intangible resource that defines the organizational fabric of these enterprises, which calls for better understanding of the dimensions of the suppo...
Article
Full-text available
This research is conducted to understand the process of implementing an entrepreneurial culture in Islamic boarding schools, also known as pesantren, and to explain this implementation from an Islamic perspective. This research used an interdisciplinary qualitative approach and a phenomenology strategy. The result of this research indicated that...

Citations

... presented by a robot manufacturer have embedded high-speed force-control feedback into the robot controller [5], which provides good perspectives in terms of advanced sensor integration, crucial to improving the man/machine interaction. These systems have been around in research for a long time, but their industrialization opens possibilities for mass products with advanced teaching techniques, namely Programming-by-Demonstration [6][7]. The original contribution of the present work addresses two audiences: the research community in the biomedical area and robotics researchers dealing with human-robot interaction. ...
Conference Paper
Full-text available
The paper presented herein describes the development of an advanced robotized system applied to in vitro implant dentistry research. To the biomedical community, this paper shows how industrial robots can support this research. Robots are especially suitable for the biomedical field, particularly when integrated with advanced sensors and technologies, which may facilitate both the programming tasks and the data acquisition. Robotics researchers will find in this paper one of the first applications of programming by demonstration with real users, using a novel explicit robot programming technique that combines multi-camera vision with speech recognition. This programming method targets users with minimal robot experience and aims at "teaching" the robot to execute a specific task.
... To this purpose, interactive programming interfaces that allow non-skilled users to program robots have been developed over the years. These developments require higher levels of abstraction and in some way tend to lead to machines that understand human-like instructions [4]. ...
Article
In this paper a method for controlling and teaching industrial robots in real time by human demonstration is presented. The system uses high-intensity visual markers, making it sufficiently robust for industrial environments without requiring special lighting. An automated camera calibration method was implemented, which enables any non-skilled user to configure the system quickly and easily. The teaching of the robot is achieved by recording the detected points and replaying them to the manipulator; it is therefore a "teaching-by-showing" method. This system can work with any kind of industrial manipulator capable of receiving remote positioning commands.
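The record-and-replay idea summarized in this abstract can be sketched in a few lines of C#. The sketch below is a hypothetical illustration only: marker detection and the manipulator's remote-positioning command are abstracted behind delegates, and the names are not taken from the cited paper.

// Illustrative record-and-replay ("teaching-by-showing") loop.
using System;
using System.Collections.Generic;
using System.Threading;

public static class TeachByShowing
{
    public static List<double[]> Record(Func<double[]> detectMarker, int samples, int periodMs)
    {
        var path = new List<double[]>();
        for (int i = 0; i < samples; i++)
        {
            double[] p = detectMarker();   // marker position from the camera system
            if (p != null) path.Add(p);    // keep only valid detections
            Thread.Sleep(periodMs);        // fixed sampling period
        }
        return path;
    }

    public static void Replay(IEnumerable<double[]> path, Action<double[]> sendRemotePosition)
    {
        foreach (var p in path)
            sendRemotePosition(p);         // remote positioning command to the manipulator
    }
}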
Article
Full-text available
Today, most industrial robots are programmed using the typical teaching process. This paper presents a robotic system in which the user can instruct and program a robot just by showing it what it should do, with a high level of abstraction from the robot language. This is done using the two most natural human interfaces (gestures and speech), a force control system and several code-generation techniques. The performance of this system is compared with a similar system that, instead of gestures, uses a manual guidance system based on a force-control strategy. Two different demonstrations with two different robots (MOTOMAN and ABB) are presented, showing that the developed systems can be customised for different users and robots.