
Controlling cm³-Sized Autonomous Micro Robots Operating in the Micro and Nano World

J. Seyfried, R. Estaña, F. Schmoeckel, M. Thiel, A. Bürkle, H. Woern
Institute for Process Control and Robotics (IPR), Geb. 40.28, Universität Karlsruhe (TH), 76128 Karlsruhe, Germany
{seyfried, estana, schmoeckel, thiel, abuerkle, woern}@ira.uka.de
Fax: +49-721-608-7141

SYNOPSIS

In order to bridge the increasing gap between micro- and nano-technologies, a European consortium is currently developing and investigating a cluster of mobile, wireless cm³-sized micro robots. The control and sensor issues to be solved for such a robot system are demanding. This paper describes the work carried out by one of the project partners. An interferometric principle, a so-called mechanical interferometer based on the Moiré effect, is used for the position sensor system. This system will yield micrometre resolution over a 500 × 500 mm² workspace. Further sensor systems involve local microscope cameras, for which the extraction of depth information is crucial. The control system of the robots must then fuse all available sensor data, which arrives only asynchronously, in order to generate motion commands for the robots. For this, a two-stage control architecture with different control strategies is presented which discriminates between coarse and fine motion of each single robot.

1 INTRODUCTION

Between the rapidly evolving nanotechnology, the well-established micro-technologies and the development of micro-handling technologies, there is a large and increasing gap concerning the interfacing of the micro- and nano-worlds. First successes in the manipulation of microsystem components are just an initial step towards the automated and reliable robot-based handling of objects a few µm in dimension. To take this step, a European consortium is currently developing mobile, autonomous micro robots within the framework of the MiCRoN project¹. These wireless robots will be piezoelectrically driven and include integrated on-board electronics for the actuator control, infrared transceivers for communication, grippers and on-board sensors within 1 cm³ of size. These robots will be employed in a small cluster (about 5 robots) to achieve micro- to nano-handling tasks co-operatively. Several robots will be equipped with several types of sensors, including AFM sensors and CCD cameras (mounted on slightly larger robots). This paper describes a novel high-resolution global position sensor to supervise the robot cluster, a depth measuring strategy, and the sensor data fusion approach taken to evaluate the multitude of asynchronously supplied sensor data.

2 MOIRÉ-BASED POSITIONING SYSTEM

For all micro-robot handling tasks, a so-called global positioning system must provide the robot control with the pose of all robots. For this, a Micro Position Sensing system (MPS) is currently being developed. Its non-contact measurement principle works in two stages: the first, approximate step uses a photogrammetric principle; the next step is based on interferometric effects using a combination of a physical grid and a virtual grid generated within a computer. The interferometric effect used is the Moiré phenomenon, a well-known measurement principle (12). The interferometric calculation delivers a very high resolution of 1 µm. The robot (Fig. 2) is equipped with three Moiré gratings; they are used both as photogrammetric and as Moiré-based marks by having three or four concentric, cosinusoidally shaped full circles. A high-resolution CCD camera permanently supervises the whole scene.

Fig. 1: The virtual world grid and a micro robot. Fig. 2: CAD model of a micro robot, equipped with three Moiré marks

Fig. 3: The Moiré effect

Two grids are necessary to generate the Moiré fringes, which are analysed by the sensor algorithms. The first is the grid on the robot marks. The second, which is fixed to the working plate, is created virtually inside the computer (this grid is also circular and cosinusoidally shaped). Merging the CCD picture with the virtual grid by image processing causes a Moiré effect (Fig. 3). The evaluation of the Moiré pattern leads to an over-determined linear equation system. Solving this equation system yields the position of the centre of the mark with a very low error (Fig. 5).
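To make the principle concrete, the following minimal Python sketch generates a circular cosinusoidal grating as seen by the camera, merges it with a slightly different virtual grid, and thereby produces the low-frequency Moiré fringes that the sensor algorithms analyse. This is not the project code; image size, grating periods and centre offsets are illustrative assumptions.

```python
import numpy as np

def circular_grating(shape, centre, period):
    """Circular cosinusoidal grating: intensity varies as cos(2*pi*r/period)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    r = np.hypot(xx - centre[0], yy - centre[1])
    return 0.5 + 0.5 * np.cos(2.0 * np.pi * r / period)

# Physical grating as imaged by the CCD camera (the mark on the robot) and
# the virtual world grid generated inside the computer; the small centre
# offset and period difference stand in for an unknown robot pose.
camera_image = circular_grating((480, 640), centre=(321.3, 240.8), period=8.0)
virtual_grid = circular_grating((480, 640), centre=(320.0, 240.0), period=8.4)

# "Merging" the two gratings (here: pixelwise multiplication) produces the
# low-frequency Moiré fringes whose geometry encodes the sub-pixel offset
# between the mark centre and the virtual grid centre.
moire = camera_image * virtual_grid
```

In the real system, the fringe geometry extracted from such a pattern supplies the rows of the over-determined linear equation system, whose least-squares solution (e.g. via numpy.linalg.lstsq) gives the mark centre.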
¹ EU project IST-2001-33567

Fig. 4: The developed measurement software Moebius

Fig. 5: The measurement error of the MPS over a measurement plane of 500 × 500 mm² (20 × 20 plane parts)

A graphical user interface called Moebius has been developed which provides the user with the necessary measurement interface (Fig. 4) and performs the required calibration procedures automatically. It has been used to perform a complete simulation of the measurement system and is about to be integrated into the robot control system.

The measurement error shown in Fig. 5 is the Taylorised error function of the (theoretical) Moiré-based calculation of the mark centre. Near the centre of the work plane, and therefore near the centre of the virtual world grid, the measurement error is smaller than in the regions near the work plane border because of the character of the resulting Moiré points. Within the inner region, the world grid radii are small, so a so-called high-frequency Moiré pattern is produced which can easily be detected and analysed by the computer; the error of the over-determined equation system describing the geometric relationship is smaller because there are more Moiré points. Near the work plane borders, the world grid radii are larger and the resulting Moiré fringes are of low-frequency character, which makes them difficult to detect automatically. There are therefore fewer equations in the equation system, which results in a higher measurement error. Nevertheless, the theoretical overall error (about 0.1 µm) is much smaller than the current minimum resolution of the system (1 µm).

Moebius will be able to track 32 measurement marks permanently and in parallel in one image with one camera. This is done with the help of so-called regions of interest (ROIs), each representing one mark position. Because the measurement cycle for one mark takes about 15 ms (on a 1.6 GHz Pentium 4 processor), all 32 marks are measured within a turnaround cycle time of under one second (32 × 15 ms ≈ 0.5 s). The camera system will be connected to the computer by IEEE 1394 (FireWire). Alternatively, a consumer camera with a high-quality optical subsystem such as the Sony DSC-F717 can be used, in combination with an i.LINK adapter for command communication and a USB 2.0 high-speed adapter for the image transfer from the camera to the computer. With a notebook installation of Moebius, the system is portable.

For system calibration, an automated calibration algorithm based on the Tsai calibration method (1) has been developed. The calibration results are promising: with a normalised calibration error of about 0.65, the system is able to transform the measured images into the undistorted world view with high accuracy. The result can be seen in Fig. 6 (distorted reference pattern) and Fig. 7 (the same image, undistorted).
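The ROI-based tracking cycle can be sketched as follows. The class structure and the centroid-based measurement are illustrative placeholders (the actual Moebius implementation runs the photogrammetric and Moiré evaluation inside each ROI); only the per-mark ROI bookkeeping and the turnaround cycle follow the description above.

```python
import numpy as np

class MarkTracker:
    """ROI-based tracking of up to 32 marks in one camera image (sketch)."""

    def __init__(self, initial_positions, roi_size=64):
        # One region of interest per mark, centred on the last known position.
        self.rois = dict(initial_positions)   # mark id -> (x, y)
        self.roi_size = roi_size

    def measure_mark(self, image, centre):
        # Placeholder for the ~15 ms photogrammetric + Moiré evaluation:
        # here we simply return the intensity centroid of the ROI window.
        x, y = int(centre[0]), int(centre[1])
        h = self.roi_size // 2
        x0, y0 = max(x - h, 0), max(y - h, 0)
        win = image[y0:y + h, x0:x + h].astype(float)
        yy, xx = np.mgrid[0:win.shape[0], 0:win.shape[1]]
        s = win.sum() + 1e-12
        return ((win * xx).sum() / s + x0, (win * yy).sum() / s + y0)

    def track(self, image):
        # One turnaround cycle over all marks: 32 marks at about 15 ms
        # each gives roughly 0.5 s per full update of the cluster.
        for mark_id, centre in self.rois.items():
            self.rois[mark_id] = self.measure_mark(image, centre)
        return dict(self.rois)
```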

Fig. 6: The distorted reference pattern

Fig. 7: Undistorted reference pattern

3 FOCUS-BASED DEPTH ESTIMATION AND OBJECT RECONSTRUCTION

An essential requirement for micro- and nano-handling tasks is depth information about the scene. One possible solution is the use of two robots equipped with CCD cameras. However, a stereo vision system not only requires more space than a single-camera robot; one also has to solve the correspondence problem. For this reason, another method based on depth-from-focus has been developed.

The basic concept of the depth-from-focus approach is to maximise the local sharpness in a series of images taken with different focal settings. The smaller the depth of field of the optical system, the better the achievable accuracy. However, evaluating local sharpness requires highly structured surfaces. One solution to this problem is to project an artificial pattern onto the object. Another approach is introduced in this paper: a novel depth-from-focus method capable of reconstructing even weakly structured surfaces without special illumination. It is thus applicable to a wide range of different object types.
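The basic principle can be illustrated with a fixed-window Python sketch (not the adaptive multi-resolution method described below): for every pixel, select the focal setting that maximises a local variance sharpness measure.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def depth_from_focus(stack, focus_positions, window=9):
    """Minimal depth-from-focus sketch.

    stack: (n, h, w) array of images taken at n focal settings.
    focus_positions: metric focus position of each of the n images.
    Returns a per-pixel depth map by maximising local sharpness.
    """
    sharpness = np.empty(stack.shape, dtype=np.float64)
    for i, img in enumerate(stack):
        img = img.astype(np.float64)
        mean = uniform_filter(img, window)
        var = uniform_filter(img * img, window) - mean * mean
        # Normalised local variance as sharpness measure; the epsilon
        # guards against division by zero in dark image regions.
        sharpness[i] = var / (mean + 1e-6)
    best = np.argmax(sharpness, axis=0)        # index of sharpest image
    return np.asarray(focus_positions)[best]   # per-pixel depth estimate
```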

Fig. 8: Reconstruction of a micro gear wheel (Ø 500 µm) with the novel depth-from-focus approach

Previous methods like (2) use sharpness filters of a fixed size. The filter size directly corresponds to the spatial frequency of the surface structure it can detect, which means the filter size has to be chosen carefully; such approaches do not adapt well to variations of the object structure. An adaptive sharpness estimation is expected to provide much better results. The new method is therefore based on multi-resolution analysis. As sharpness measure, the normalised local variance is used: it is efficient to implement and has proved robust to noise. Figure 8 shows the reconstruction of a weakly textured micro gear wheel. Due to the optical system's small field of view, the model was concatenated from three individual measuring passes.

4 SENSOR DATA FUSION

The prerequisite for position control is the accurate estimation of the robots' configurations by fusing all sensor data. The configuration of a mobile robot is, e.g., its pose and, depending on the dynamics of the robot, also its velocity and acceleration. Therefore, a Kalman filter has been realised, this being the method of choice for estimating a system's state from noisy sensor data. The implemented sensor data fusion architecture provides accurate configuration (position) information about a mobile micro robot based on the available sensor data and the control vector of the robot. Other software modules like the controller or the user interface obtain the current robot configuration through an update function of the robot model corresponding to the robot of interest. Being part of the system's world model, the robot model is a software object containing a robot's configuration and kinematics in the form of a frame. The robot model's update function gets an estimate from the sensor data fusion, i.e. the configuration is always predicted for a given point of time (e.g. now). This prediction is based on the latest sensor data and uses a dynamic model of the robot, which is implemented within the robot model.
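The role of the robot model's update function can be sketched as follows; the interface names are assumptions made for illustration, since the paper does not list the actual signatures.

```python
import time

class RobotModel:
    """Sketch of the robot model object described above (assumed interface).

    The fusion object is assumed to expose predict(t), which returns the
    state estimate extrapolated to time t via the robot dynamics model.
    """

    def __init__(self, fusion, kinematics):
        self.fusion = fusion          # per-robot sensor data fusion object
        self.kinematics = kinematics  # robot kinematics, held as a frame
        self.configuration = None     # pose (and velocity/acceleration)

    def update(self, t=None):
        # Clients (controller, user interface) always see a configuration
        # predicted for the requested point of time, by default "now".
        t = time.time() if t is None else t
        self.configuration = self.fusion.predict(t)
        return self.configuration
```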
Fig. 9: Sensor data fusion (the controller sends move commands, i.e. the control vector, to the robot hardware and to the fusion module; the robot model with its robot dynamics model is updated with predicted configurations; the sensors and sensor devices register the begin and end of each measurement at the sensor data fusion module)

As the single robots will be independent of each other, each robot has its own sensor data fusion object. Sensors that can measure a part of a particular robot's configuration send their measurements to the respective fusion object. Sensor devices and logical sensors can be distinguished: a logical sensor can be, e.g., a tracking module that provides just the 2D image coordinates of one recognised feature on a robot, using the images taken by a camera, which is a sensor device. Figure 9 illustrates the collaboration of the modules. The robot object on the left is used for communication with a robot's hardware in order to send move commands to the respective robot. These commands may be given by a user interface or, as depicted, by the controller. As the information about the robot's moving direction is used for predicting the robot's position, the control vector is also sent to the fusion object. Whenever an update of the robot model is requested, the fusion object provides an estimate using all available sensor data. For this, an Unscented Kalman Filter (3) is employed; the filtering procedure is based on the SCAAT method described in (4).

The sensors being developed will provide their measurements asynchronously and with time delays caused by the required calculations. A special feature of the sensor data fusion, which internally maintains a sensor data history, solves this problem: the sensor data can be filtered in a chronologically correct way after each measurement is finished. Running on a standard PC, this approach is a straightforward solution for exact tracking using out-of-sequence measurements, and since all sensors provide their measurements in similar time intervals, it is well suited to this microrobot system. Recently developed solutions as presented in (5), (6) and (7) go beyond this approach; they focus on the parallelisation and decentralisation of the fusion and processing steps, and on efficiency optimisation, respectively.

The principle used for the microrobot system is realised as follows. When a sensor starts a measurement (e.g. acquires an image), it registers itself at the sensor data fusion module, so that the measurement time t_m is stored. After computing the measurement, the sensor registers the measurement at the fusion module as finished. Now the filter can correct the estimate at the (past) point of time t_m with the new measurement. If newer state estimates have been stored in the sensor history in the meantime, calculated when other sensors finished their measurements, these sub-optimal predictions are corrected by filtering those measurements anew. A new prediction of the state is then based on this corrected estimate.
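A minimal sketch of this bookkeeping could look like the following. The filter interface is an assumption, and the sketch re-filters the whole history from scratch on every prediction, whereas a practical implementation would store intermediate estimates.

```python
import bisect

class SensorDataFusion:
    """Out-of-sequence measurement handling via a sensor data history (sketch).

    Assumed filter interface: ukf.predict(state, t) extrapolates the state
    to time t with the robot dynamics model; ukf.correct(state, z) applies
    one SCAAT-style single-measurement correction.
    """

    def __init__(self, ukf, initial_state):
        self.ukf = ukf
        self.initial_state = initial_state
        self.history = []    # (t_m, measurement), sorted by measurement time
        self.pending = {}    # sensor id -> start time of running measurement

    def begin_measurement(self, sensor_id, t_m):
        # Called when the sensor starts measuring (e.g. grabs an image),
        # so the true measurement time t_m is stored before the delay.
        self.pending[sensor_id] = t_m

    def end_measurement(self, sensor_id, measurement):
        # The delayed result is inserted at its true acquisition time.
        t_m = self.pending.pop(sensor_id)
        idx = bisect.bisect([t for t, _ in self.history], t_m)
        self.history.insert(idx, (t_m, measurement))

    def predict(self, t):
        # Re-filter the chronologically ordered history; this corrects the
        # sub-optimal estimates made while other measurements were pending,
        # then extrapolates the corrected estimate to the requested time.
        state = self.initial_state
        for t_m, z in self.history:
            state = self.ukf.correct(self.ukf.predict(state, t_m), z)
        return self.ukf.predict(state, t)
```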
5 CONTROL SYSTEM

In order to accomplish a variety of different tasks, a reliable and robust control system is essential. As the robots differ from each other in various aspects (type of manipulator, camera, material fatigue), a two-stage approach has been chosen. On the lower level, learning algorithms have been implemented in order to minimise differences between the robots and eliminate nonlinearities as far as possible. The high-level controller can then rely on a consistent behaviour of the control path, independent of the configuration of the current robot.

Fig. 10: Cooperation of the two controllers (the high-level controller drives the low-level controller, which drives the robot; the low-level controller is trained during calibration, while the high-level controller closes the loop during closed-loop operation)

For the low-level controller, a neural feed-forward network with four layers has given satisfactory results concerning both accuracy and computing time. The network is trained in a calibration procedure before the real operation starts; the high-level controller is still switched off at this time, since otherwise the two closed-loop controllers would disturb each other. Results from calibration obviously concern only one robot configuration; therefore a database including the network parameters of all existing configurations has to be established.
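A four-layer feed-forward network of this kind can be sketched in a few lines of Python; the layer sizes, the input/output dimensions and the database key are illustrative assumptions, not the project's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four layers (input, two hidden, output); sizes are illustrative. The
# network maps a desired robot motion to an actuator command vector.
sizes = [3, 16, 16, 4]
weights = [rng.normal(0.0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(motion):
    x = np.asarray(motion, dtype=float)
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(x @ w + b)               # hidden layers
    return x @ weights[-1] + biases[-1]      # linear output: actuator commands

# Calibration (high-level controller switched off): fit the weights to
# recorded (observed motion, issued command) pairs, then store one
# parameter set per robot configuration in a database.
network_db = {"robot_1/gripper": (weights, biases)}  # hypothetical key
```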

The high-level controller closes the loop in normal operation mode. Adaptive and non-adaptive MIMO PID controllers are being investigated; up to now they yield sufficient accuracy only for coarse movements. In order to improve the precision of fine movements, further investigations using the NARMA-L2 (8) and NGPC (9) algorithms are being carried out.

6 CONCLUSION AND FUTURE WORK

The sensor data fusion has already been successfully tested using the robot prototypes and the global camera sensors of the microrobot system Miniman that was presented in (10) and (11). Figure 11 shows a workspace of the Miniman robots for micro assembly tasks. In the foreground, the global CCD camera is visible that is used for tracking robot-mounted infrared LED markers.


Fig. 11: Part of the microrobot system Miniman (local camera, Miniman robot with micro gripper, global camera)

Since all components of the system have been designed to be easily adaptable, it is expected that the first experiments with the cm³-sized MiCRoN robots will be satisfactory. However, these robots will differ greatly in their behaviour due to their complex actuation system, their low weight and their wireless operation mode.

7 ACKNOWLEDGEMENTS

This research work has been performed at the Institute for Process Control and Robotics (head: Prof. H. Wörn), Computer Science Department, Universität Karlsruhe (TH). The sensor data fusion architecture has been implemented by one of our students, Julius Ziegler. The research work is supported by the European Union (FET-Open project MiCRoN, IST-2001-33567).

REFERENCES

(1) Tsai, R. Y.: A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses; IEEE Journal of Robotics and Automation, No. 4, Aug. 1987
(2) Nayar, S. K.; Nakagawa, Y.: Shape from Focus; IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 16(8), Aug. 1994, pp. 824-831
(3) Julier, S. J.; Uhlmann, J. K.: A New Extension of the Kalman Filter to Nonlinear Systems; SPIE Proceedings Vol. 3068, 1997, pp. 182-193
(4) Welch, Gregory F.; Bishop, Gary: SCAAT: Incremental Tracking with Incomplete Information; Computer Graphics, SIGGRAPH 97 Conference Proceedings, T. Whitted (Ed.), Los Angeles, CA, USA, August 3-8, ACM Press / Addison-Wesley, 1997, pp. 333-344
(5) Davidson, Frédéric: Study of Fusion Architectures for Target Tracking with Kalman Filtering; Laboratoire de Radiocommunications et de Traitement du Signal, Rapport annuel d'activités 1997-1998
(6) Grocholsky, Ben: Information-Theoretic Control of Multiple Sensor Platforms; Dissertation, University of Sydney, 2002
(7) Bar-Shalom, Yaakov: Update with Out-of-Sequence Measurements in Tracking: Exact Solution; IEEE Transactions on Aerospace and Electronic Systems, Vol. 38, 2002
(8) Narendra, K. S.; Mukhopadhyay, S.: Adaptive Control Using Neural Networks and Approximate Models; IEEE Transactions on Neural Networks, Vol. 8, Issue 3, May 1997, pp. 475-485
(9) Haley, P.; Soloway, D.; Gold, B.: Real-Time Adaptive Control Using Neural Generalized Predictive Control; Proceedings of the American Control Conference, 1999, Vol. 6, pp. 4278-4282
(10) Wörn, Heinz; Schmoeckel, Ferdinand; Buerkle, Axel; Samitier, Josep; Puig-Vidal, Manel; Johansson, Stefan; Simu, Urban; Meyer, Jörg-Uwe; Biehl, Margit: From Decimeter- to Centimeter-Sized Mobile Microrobots - the Development of the MINIMAN System; SPIE's Int. Symp. on Intelligent Systems & Advanced Manufacturing, Conference on Microrobotics and Microassembly, Boston, MA, USA, October 28 - November 2, 2001, pp. 175-186
(11) Buerkle, Axel; Schmoeckel, Ferdinand; Kiefer, Matthias; Amavasai, Bala P.; Caparrelli, Fabio; Selvan, Arul N.; Travis, Jon R.: Vision-Based Closed-Loop Control of Mobile Microrobots for Micro Handling Tasks; SPIE's Int. Symp. on Intelligent Systems & Advanced Manufacturing, Conference on Microrobotics and Microassembly, Boston, MA, USA, October 28 - November 2, 2001, pp. 187-198
(12) Amidror, Isaac: The Physics of the Moiré Phenomenon; Kluwer Academic Publishers, Dordrecht, March 2000
