Figure 4 - uploaded by Tariq Pervez Sattar
Geometric model of 3D point triangulation

Source publication
Article
Full-text available
The purpose of this paper is to develop a wireless positioning system. Automation promises to improve on manual ultrasound testing, but the automation of non-destructive testing (NDT) of large and complex-geometry structures such as aircraft wings and fuselages is prohibitively expensive. One inexpensive way to achieve automation is by using a sm...

Contexts in source publication

Context 1
... this triangulation problem, the 3D coordinates P_L and P_R can be obtained from the image coordinates p_L and p_R as shown in Figure 4. Let A = [ -R_L^R p_R   p_L ], a 3 × 2 matrix. The least-squares solution of equation (3) ...
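A minimal numerical sketch of this two-ray least-squares triangulation (Python; the function and variable names, and the convention that R rotates right-camera coordinates into the left-camera frame with translation T, are assumptions for illustration, not taken from the paper):

    import numpy as np

    def triangulate(p_L, p_R, R, T):
        # p_L, p_R: back-projected ray directions of the matched image points,
        # each expressed in its own camera frame (3-vectors).
        # The point satisfies a*p_L ~= R @ (b*p_R) + T for some depths a, b,
        # i.e. A @ [b, a] ~= T with A = [-R p_R, p_L], a 3 x 2 matrix.
        p_L = np.asarray(p_L, dtype=float)
        p_R = np.asarray(p_R, dtype=float)
        A = np.column_stack((-R @ p_R, p_L))
        (b, a), *_ = np.linalg.lstsq(A, T, rcond=None)
        P_left_ray = a * p_L                # closest point on the left ray
        P_right_ray = R @ (b * p_R) + T     # closest point on the right ray (left frame)
        return 0.5 * (P_left_ray + P_right_ray)   # midpoint estimate of the 3D point

The midpoint of the two closest ray points is returned as the estimate; the residual of the least-squares solve indicates how far the two rays miss each other.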
Context 2
... non-destructive testing (UT) is vital in safety-critical industries such as aerospace, chemical processing and power generation, and is well known for achieving higher flaw sensitivity than all other volumetric non-destructive testing (NDT) systems. UT involves scanning a single transducer over the surface of a component, transmitting and receiving an ultrasonic pulse into the material. The automation of NDT could increase reliability by eliminating human factors that decrease the probability of defect detection. In general, studies have shown the superiority of automatic inspection over manual inspection. However, a human operator can cope with large complex shapes and curved surfaces very easily, whereas machines find these quite difficult, and the prospect of a cost-effective robotic arm that can manipulate the probe in such situations is still remote.

Portable scanners have been reported that use a range of novel positioning systems, from acoustic-pulse triangulation (Passi et al., 1999) to imaging of a probe-mounted light using a video camera (Dulay, 2007). The first attempt to realize a measurement universe using receivers that are sensitive to acoustic emitters was proposed by Lund and Jensen (1976) in the well-known P-Scan system. Stable working versions were created by Sirota (1981) in the USSR and Chang et al. (1982) in the USA; here, the receivers are placed at right angles to one another to create a coordinate reference frame. However, it is fair to say that neither system left the research laboratory, due in part to the impracticalities of realizing them for in-service inspection. Passi et al. (1995) published work on their I-Sonic system, which is able to track the position of the ultrasonic probe while monitoring the acoustic coupling, and can also determine the probe swivel to 1° resolution over the −90° to +90° range.

Optical tracking systems can be applied in the same manner. Sensors (e.g. cameras) are mounted at fixed locations and the ultrasonic probes to be tracked are marked with passive or active landmarks. Such techniques are known to be susceptible to occlusion problems, although this can be overcome by using additional landmarks, which incidentally improve the accuracy of pose estimation. These position tracking systems nowadays offer highly portable solutions that scale well to any surface geometry, although many are prohibitively expensive due to the high development costs of relatively complex technologies.

The aim is to automate ultrasound NDT of large surfaces by using small, inexpensive mobile robots with surface adhesion capability that move a single ultrasound probe over the surface to detect cracks at various orientations and measure the depth and through-wall extent of a flaw. A permanent inspection record of the plan, side and end views of a component with the flaws superimposed on the image (Deng et al., 2004; Cornwell and McNab, 1999) could then be used to create 3D images. The objectives to meet this aim are:

● Develop very low-cost systems, using electronic devices developed for game playing, that capture the precise 3D spatial position of the ultrasonic test probe relative to the test component throughout the inspection and determine its pitch, roll and yaw. The location of ultrasonic pulse echoes captured during the inspection can then be used to build 3D visualization maps.
● Use the real-time spatial position system to provide instructions to the robot when testing a component to carry out a specified scan pattern according to an approved procedure, and thereafter capture the actual scan pattern executed by the robot/probe during the inspection, creating a verification of practice against specification.

By using Wii remote controllers, a cheap but very accurate positioning system can be developed. Here we report the development of a wireless spatial positioning system that is capable of tracking the position of an ultrasonic probe relative to the component under inspection with an accuracy of ±2 mm at distances up to 5 m. To verify the concept, a small steel test object (Figure 1) was modeled and its computer-aided design (CAD) model imported into the MATLAB display environment (Figure 2). The object is 300 mm long and 100 mm wide. A UT probe was mounted as a tool tip on a FARO measurement arm and a random motion trajectory executed by manually moving the probe. The trajectory was recorded and visualized in real time (Figure 3). The system then knows at all times where the probe tip is located relative to the test object and how the probe is oriented relative to the object surface. This information can be used to verify whether a specified, qualified procedure and scanning trajectory is executed by a human operator or a mobile robot/UT probe. While the FARO arm is a very precise mechanical tool that determines the orientation of the tool tip and measures spatial position (x-y-z) to micron accuracy, its work envelope is limited and it is very expensive; motion of the probe is restricted by the constraints of the mechanical arm. To obtain a similar capability with an unconstrained wireless probe that could be deployed by mobile robots, an inexpensive spatial positioning system is required that measures probe x-y-z and orientation. A solution is proposed in the next section that is based on recent advances in sophisticated electronic devices that are volume-produced for game-playing consoles and hence very cheap.

Many infrared-based optical motion capture systems are commercially available, e.g. iotracker, PhaseSpace, Vicon, ART GmbH and NaturalPoint. These systems give high precision, flexibility and high data rates; however, they cost tens of thousands of dollars and, because of the high price, are not widely used. Because of the availability and low cost of the Wii remote, many researchers have conducted experiments using one or more Wii remotes to build a positioning system and have achieved better than ±2 mm accuracy (Caruso and Re, 2010; Hay et al., 2008; Wang and Huang, 2008; De Amici et al., 2010; Zhu et al., 2011), which is sufficient for our proposed NDT system.

The Wii remote is a sophisticated sensor-based controller that contains a MEMS 3-axis accelerometer, a 3-axis gyroscope, an IR camera sensor and Bluetooth connectivity for communication. The IR camera chip contains a multi-object tracking engine which provides on-board processing to detect up to four IR LEDs with high resolution and high tracking speed, at a refresh rate of 100 Hz. Experiments conducted in the laboratory show that the horizontal and vertical fields of view of the Wii IR camera are 45° and 32°, respectively. The native resolution is 128 × 96 pixels, and the on-board chip uses 8× sub-pixel analysis to report the four detected IR LED blobs at a resolution of 1,024 × 768.
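Under a pinhole-camera assumption and using the field-of-view and blob-resolution figures quoted above (ignoring lens distortion; names are illustrative, not from the paper), a reported blob coordinate can be back-projected into a viewing-ray direction roughly as follows:

    import numpy as np

    # Assumed Wii IR camera parameters quoted in the text (pinhole model, no distortion).
    RES_X, RES_Y = 1024, 768       # reported blob resolution after sub-pixel analysis
    FOV_X, FOV_Y = np.deg2rad(45.0), np.deg2rad(32.0)

    def pixel_to_ray(u, v):
        # Back-project a reported blob coordinate (u, v) into a unit ray direction
        # in the camera frame, with z along the optical axis.
        fx = (RES_X / 2.0) / np.tan(FOV_X / 2.0)   # horizontal focal length in pixels
        fy = (RES_Y / 2.0) / np.tan(FOV_Y / 2.0)   # vertical focal length in pixels
        x = (u - RES_X / 2.0) / fx
        y = (v - RES_Y / 2.0) / fy
        ray = np.array([x, y, 1.0])
        return ray / np.linalg.norm(ray)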
Web cameras available at the same price have lower resolution and refresh rates than the Wii infrared camera. The Wii remote returns the coordinates and intensity of the detected LEDs to the host PC over Bluetooth, and these are read into the MATLAB environment by using a MEX file. Two Wii controllers were mounted on a tripod using an aluminum extrusion bar with their fields of view overlapping. For stereo triangulation, the focal length of each camera must be known and, for higher accuracy, the distortion model of each camera is also required, together with the positions and orientations of the two cameras relative to each other. These parameters are obtained by running a calibration process. In a two-camera vision system (Figure 4), for a point P in 3D space, its coordinates P_L and P_R in the left and right camera coordinate systems have the following ...
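Continuing the illustrative sketches above, a single frame might be processed as follows, where R and T (the assumed right-to-left rotation and translation) come from the calibration step and the blob coordinates are made-up example values:

    # Hypothetical per-frame update using the pixel_to_ray and triangulate sketches above.
    R = np.eye(3)                      # right-to-left rotation from calibration (placeholder)
    T = np.array([0.30, 0.0, 0.0])     # camera baseline in metres (placeholder)

    u_L, v_L = 612.0, 401.0            # blob reported by the left Wii remote (example values)
    u_R, v_R = 455.0, 398.0            # blob reported by the right Wii remote (example values)

    P = triangulate(pixel_to_ray(u_L, v_L), pixel_to_ray(u_R, v_R), R, T)
    print("Probe LED position in the left-camera frame:", P)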

Citations

Chapter
In order to address the issue of low efficiency and poor accuracy in measuring the radius of large vertical oil tanks, a novel measurement method based on the combination of laser tracking and a wall-climbing robot was proposed in this manuscript. To solve the problem of light interruption during the laser tracking measurement process, an active spherically mounted retroreflector (SMR) device that can automatically align with the laser was designed. A wall-climbing robot equipped with the active target and a temperature sensor was developed to crawl on the tank wall and collect four-dimensional point cloud data. A self-written program based on the PCL point cloud processing library was then used to fit the point cloud and calculate the tank plate radius. Furthermore, multiple measurements were carried out on a vertical oil tank with a capacity of 1,000 m³, and the average radius error was only 1.39 mm, which strongly verified the repeatability and feasibility of the proposed method. This method provides a more effective and accurate way of measuring the radius of oil tanks, and it is foreseeable that it will be a good supplement to existing methods.
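As an illustration of the radius-fitting step only (the chapter itself uses a PCL-based program; the names and numbers below are assumptions), a horizontal slice of the tank-wall point cloud can be fitted with an algebraic least-squares circle:

    import numpy as np

    def fit_circle_radius(points_xy):
        # Algebraic (Kasa) least-squares circle fit.
        # points_xy: (N, 2) array of x, y coordinates from one horizontal slice
        # of the wall point cloud. Returns (centre_x, centre_y, radius).
        x, y = points_xy[:, 0], points_xy[:, 1]
        # Fit x^2 + y^2 + D*x + E*y + F = 0 in a linear least-squares sense.
        A = np.column_stack((x, y, np.ones_like(x)))
        b = -(x**2 + y**2)
        (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
        cx, cy = -D / 2.0, -E / 2.0
        radius = np.sqrt(cx**2 + cy**2 - F)
        return cx, cy, radius

    # Example: noisy points on a circle of radius 14.0 m (illustrative numbers).
    theta = np.linspace(0.0, 2.0 * np.pi, 500, endpoint=False)
    pts = np.column_stack((14.0 * np.cos(theta), 14.0 * np.sin(theta)))
    pts += np.random.default_rng(0).normal(scale=0.002, size=pts.shape)
    print(fit_circle_radius(pts))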