Abstract—Robotic endoscope holders constitute an alternative to complete telesurgery systems by offering a “third hand” to the surgeon during a laparoscopic procedure. The ViKY robotic scope holder (Endocontrol, Grenoble, France) is a lightweight, sterilizable, body-mounted robot with 3 DOF. In this paper, we present the specifics of this robot, the newly developed XL version dedicated to single-port surgery, and the initial clinical experience. We have proposed a method to control the robot, based on the detection and tracking of surgical instruments from image analysis and shape priors, to enrich the interaction between the surgeon and the robot. We present here our work in progress toward its integration into a probabilistic framework, with the aim of improving the method’s speed and robustness. We also present the surgeons’ viewpoints on the feasibility of its integration into the operating theater.

Index Terms—Automatic instrument detection and tracking, clinical experiments, laparoscopy, robotic endoscope holder, robotic surgery, vision-based control.

I. INTRODUCTION

LAPAROSCOPIC surgery, also called keyhole surgery, uses tiny incisions, usually less than 1 cm, to perform intraabdominal or intrathoracic procedures. This approach offers decreased blood loss and postoperative pain, in addition to a shorter hospital stay and convalescence, while offering better cosmesis [1]. This technique requires the use of a laparoscope and specific instruments designed for this approach. While the surgeon manipulates the instruments, the assistant handles the laparoscope, thus making the surgeon rely on his assistant for appropriate visualization of the surgical field and instruments. Robotic scope holders (such as the Aesop scope holder) have been developed to overcome these limitations. They give the surgeon full control over his visualization of the surgical field, while freeing the assistant’s hand. Complete telesurgery systems, such as the daVinci robotic system (Intuitive Surgical, Sunnyvale, CA), which offer 3-D vision and articulated instruments, are another type of robotic system for laparoscopic surgery; they have facilitated intracorporeal surgery and increased surgeons’ interest in robotics and patient acceptability of new technologies in surgery. However, the daVinci system is bulky and expensive, thereby limiting its use to selected patients and procedures [2]. Thus, a need for a less bulky and less expensive robotic system was voiced. In this paper, we present the specifics of the ViKY robotic scope holder system (Endocontrol, Grenoble, France) and our initial clinical experience with the system. In its current clinical use, the robot is controlled by a pedal or a voice command. Our initial clinical experience suggested a potential interest in replacing this type of “low-level” control with a more intuitive control. Thus, an alternative control mode for the system, based on visual servoing using instrument tracking, was developed. We present here its latest improvements and our preliminary results.
II. STATE OF THE ART

Robotic endoscope holders, such as the AESOP system, the EndoAssist system [3], or more recently the body-mounted ViKY system [4], maintain the endoscope with a robotic arm while the surgeon performs surgery with conventional laparoscopic instruments. Thus, robotic endoscope holders provide a “third hand” to the surgeon, allowing for solo surgery. They enhance the stability and quality of the images and reduce the staining (the appearance of stains on the endoscope lens due to blood projections) of the endoscope [5]. Robotic endoscope holders can be controlled by vocal commands or head movements [6], [7], but the interactions between the surgeon and the system remain limited (e.g., left, right, up, down, zoom in, and zoom out).

One elegant solution aimed at developing more sophisticated interactions between the surgeon and the system is based upon the visual servoing of the surgical instruments. Such an approach would allow for the development of “high-level” commands of the endoscope, such as the automatic displacement of

Manuscript received March 1, 2010; revised July 3, 2010; accepted August 30, 2010. Date of publication November 15, 2010; date of current version December 15, 2010. Recommended by Guest Editor R. L. Galloway. This work was supported by the French National Research Agency (ANR) through its TecSan Program under Project ROSACE ANR-06-TecSan-008.

S. Voros is with the Techniques de l’Ingénierie Médicale et de la Complexité-Informatique, Mathématiques et Applications de Grenoble Laboratory, Centre National de la Recherche Scientifique, Unités Mixtes de Recherche 5525, INSERM, IFR 130, 38000 Grenoble, France (e-mail: sandrine.voros@imag.fr).

G.-P. Haber is with the Cleveland Clinic, Cleveland, OH 44195 USA (e-mail: gphaber@hotmail.com).

J.-F. Menudet is with the Endocontrol Company, 38700 Grenoble, France (e-mail: jean-francois.menudet@endocontrol-medical.com).

J.-A. Long is with the Department of Urology, University Hospital, 38000 Grenoble, France (e-mail: JALong@chu-grenoble.fr).

P. Cinquin is with the Techniques de l’Ingénierie Médicale et de la Complexité-Informatique, Mathématiques et Applications de Grenoble Laboratory, Centre National de la Recherche Scientifique, Unités Mixtes de Recherche 5525, INSERM, IFR 130, 38000 Grenoble, France, and also with the Center for Technological Innovation (CIC-IT, INSERM), Grenoble University Hospital, 38000 Grenoble, France (e-mail: philippe.cinquin@imag.fr).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Digital Object Identifier 10.1109/TMECH.2010.2080683
880 IEEE/ASME TRANSACTIONS ON MECHATRONICS, VOL. 15, NO. 6, DECEMBER 2010
Fig. 7. Planar calibration grid (square size = 8 mm) used to calibrate the {ViKY + imaging system}.
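The paper does not reproduce its calibration code; as a rough, self-contained illustration of the rms reprojection error metric used with a planar grid like the one in Fig. 7, the sketch below synthesizes an 8-mm grid, projects it with a pinhole model, and measures the rms distance to (simulated) detected corners. The intrinsics, pose, and noise level are made-up values, not those of the ViKY setup.

```python
import numpy as np

def project(K, R, t, pts3d):
    """Pinhole projection of Nx3 world points to Nx2 pixel coordinates."""
    cam = pts3d @ R.T + t            # world frame -> camera frame
    uvw = cam @ K.T                  # apply the intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]  # perspective division

def rms_reprojection_error(K, R, t, pts3d, pts2d):
    """RMS distance (pixels) between reprojected and detected corners."""
    diff = project(K, R, t, pts3d) - pts2d
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# planar grid with 8-mm squares, lying in the z = 0 plane
xs, ys = np.meshgrid(np.arange(7), np.arange(5))
grid = np.stack([xs.ravel() * 8.0, ys.ravel() * 8.0,
                 np.zeros(xs.size)], axis=1)

# hypothetical intrinsics for a 720 x 480 image, grid 300 mm away
K = np.array([[700.0, 0.0, 360.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([-24.0, -16.0, 300.0])

detected = project(K, R, t, grid)    # stand-in for corner extraction
noisy = detected + np.random.default_rng(0).normal(0.0, 0.5, detected.shape)
err = rms_reprojection_error(K, R, t, grid, noisy)
```

With an exact model the error is zero; the 0.5-pixel Gaussian noise on the “detected” corners yields an rms error on the order of 0.7 pixel, the same order of magnitude as the 1.06 pixels reported for the real system.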
Fig. 11. Framework of the Condensation algorithm used to detect the instrument’s axis in the endoscopic images.
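Condensation [22] maintains a weighted particle set over the state (here, the instrument’s axis parameters) and iterates factored resampling, stochastic prediction, and reweighting by an observation likelihood. The toy sketch below tracks a single made-up scalar with a random-walk dynamic model; it illustrates the loop of Fig. 11, not the actual axis detector, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def condensation_step(particles, weights, likelihood, motion_std=0.05):
    """One Condensation iteration: resample, predict, reweight."""
    # factored sampling: draw particles in proportion to their weights
    particles = particles[rng.choice(len(particles), len(particles), p=weights)]
    # stochastic prediction (a simple random-walk dynamic model)
    particles = particles + rng.normal(0.0, motion_std, len(particles))
    # measurement step: weight each particle by the observation likelihood
    weights = likelihood(particles)
    return particles, weights / weights.sum()

# toy state: an angle (rad) drifting slowly toward 0.8
true_state, n = 0.2, 500
particles = rng.uniform(0.0, 1.0, n)
weights = np.full(n, 1.0 / n)
for _ in range(50):
    true_state += 0.01 * (0.8 - true_state)   # hidden dynamics
    z = true_state + rng.normal(0.0, 0.02)    # noisy measurement
    particles, weights = condensation_step(
        particles, weights,
        lambda p: np.exp(-0.5 * ((p - z) / 0.05) ** 2) + 1e-12)

estimate = float(np.sum(particles * weights))  # posterior mean
```

In the actual system, the state would parameterize the instrument axis and the likelihood would be derived from edge and color cues in the endoscopic image; the multi-hypothesis posterior is what lets the tracker survive clutter that defeats a frame-by-frame detector.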
the robot was slightly superior to the one without the robot. This can be explained by the fact that the surgeons were discovering the system and learning to use it.

After this success, a multicenter prospective study was launched to prove the efficiency of the robot system for laparoscopic procedures such as radical prostatectomy, radical nephrectomy, pyeloplasty, sacral colpopexy, and cholecystectomy. The study is still in progress in three French centers. Preliminary results on 53 patients (30 prolapse surgeries, 3 nephrectomies, 6 cholecystectomies, 2 pyeloplasties, 12 prostatectomies) show that most of the procedures can be performed with the robot. The installation time is approximately 5 min. No perioperative complications related to the robot were recorded. The mean operative time for each procedure is described in Table I. The prospective study being still in progress, we cannot yet compare the operative time to classical laparoscopic surgery, but the surgeons who performed the surgeries with the robot did not note a drastic difference in operating time.

The major limitation observed was the lack of amplitude of motions that appeared during colonic dissection for kidney surgery, which may in some cases require momentary help from the assistant to hold the camera, or a switch to classical laparoscopy (see Table I). The major advantage of the robot lies in the freeing of the assistant’s hand.

2) Single-Port Laparoscopic Surgery: The ViKY robot was also tested in a single-port configuration on five male farm pigs. Bilateral partial nephrectomy and bilateral pyeloplasty were performed through a single port before the completion of bilateral radical nephrectomy. There were no intraoperative complications, and there was no need for additional ports to be placed. The mean (range) operative duration for partial nephrectomy, pyeloplasty, and nephrectomy was 120 (100–150), 110 (95–130), and 20 (15–30) min, respectively. The mean (range) estimated blood loss for all procedures was 240 (200–280) mL. The preparation time decreased from 25 to 15 min after 10 cases (p = 0.002) [24].

Following the animal pilot study, the base of the robot was enlarged and adapted to the specifics of single-port surgery, and this version of the ViKY was called the XL. The ViKY XL was then used for single-port procedures in five patients undergoing radical nephrectomy. A 5-mm, 30° scope was used, and robot positioning did not differ from the standard ViKY; however, installation was found to be 10 min longer, in order to optimize the range of movement of the instruments and avoid collisions with the robot.

B. Toward an Image-Analysis-Based Command of the ViKY System

The quantitative evaluation of the tool-tracking method is still a work in progress. We present here two aspects of the system’s accuracy that were quantitatively evaluated: the calibration accuracy and the validation of the “fixed” insertion point hypothesis. As of today, we have not quantitatively evaluated the precision and robustness of the method, although we can qualitatively say that they were dramatically improved compared to our original implementation of the method.

1) Quantitative Evaluation of the Calibration Accuracy: We evaluated the accuracy of the intrinsic calibration of the system using the rms reprojection error metric classically used with Zhang’s grid-calibration method [19]. We used ten images of the calibration grid to compute the intrinsic calibration parameters. These parameters were then used to reproject the 3-D coordinates of the grid corners into the images of the grid, and the reprojections were compared to the 2-D grid corners automatically extracted using image analysis. The global reprojection error for the image set was 1.06 pixels (on images of 720 × 480 resolution).

We then evaluated the accuracy of the overall calibration of the system (intrinsic + extrinsic). To do so, we took an image of the calibration grid for a position p1 of the ViKY robot. Then, we moved the robot to another position p2 with the calibration grid still visible. Using the motor encoders, the geometrical model of the robot, and the precomputed transformation between the robot and camera frames, we computed the geometric transformation T between the camera frames c1 and c2 corresponding to positions p1 and p2 (i.e., the calibration grid images were not used in this computation). Using T, we expressed the 3-D coordinates of the grid corners in c2 and projected them onto the image plane. These 2-D coordinates were then compared to the 2-D grid corners automatically extracted using image analysis. The reprojection error for one grid image of size 720 × 480 pixels was 49.4 pixels. This result shows that the extrinsic calibration is approximate (probably due to a poor geometrical model of the robot), but we use the calibration only to roughly determine the position of the insertion points, which then gives the instrument orientation. The insertion points being generally outside and far away from the image, the instrument orientation is not strongly affected by such an error. We will see in the next section that this approximate calibration is sufficient for the determination of the tool’s position within our precision requirements.
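The camera-to-camera motion in this evaluation is composed from the robot’s forward kinematics and a hand-eye calibration [25]: with X the camera-to-end-effector transform and Bi the base-to-end-effector transform at position pi, one form of the chain is T = X⁻¹B₂⁻¹B₁X. The sketch below uses invented z-axis poses, an invented hand-eye transform, and made-up intrinsics to show how a small error in the robot’s geometric model inflates the reprojection error.

```python
import numpy as np

def rot_z(deg, t):
    """4x4 homogeneous transform: rotation about z, then translation t (mm)."""
    a = np.radians(deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = t
    return T

def project(K, pts_h):
    """Project homogeneous camera-frame points with intrinsics K."""
    uvw = pts_h[:, :3] @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

inv = np.linalg.inv
K = np.array([[700.0, 0.0, 360.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])

X = rot_z(10.0, [5.0, 0.0, 20.0])         # hand-eye: camera -> end effector
B1 = rot_z(0.0, [0.0, 0.0, 0.0])          # robot pose p1 (base -> effector)
B2 = rot_z(15.0, [10.0, 0.0, 0.0])        # true robot pose p2
B2_model = rot_z(16.0, [10.5, 0.0, 0.0])  # imperfect geometric model of p2

# camera-frame motion c2 <- c1 predicted from encoders + calibration
T_true = inv(X) @ inv(B2) @ B1 @ X
T_model = inv(X) @ inv(B2_model) @ B1 @ X

# grid corners (8-mm squares) expressed in camera frame c1, 300 mm away
pts_c1 = np.array([[x * 8.0, y * 8.0, 300.0, 1.0]
                   for y in range(5) for x in range(7)])

detected = project(K, pts_c1 @ T_true.T)    # what the images would show
predicted = project(K, pts_c1 @ T_model.T)  # what the robot model predicts
err = float(np.sqrt(np.mean(np.sum((predicted - detected) ** 2, axis=1))))
```

Even a 1° / half-millimeter model error yields a several-pixel reprojection error in this toy setup, which makes a tens-of-pixels error plausible for a roughly known kinematic chain; as noted above, the far-away insertion point makes the derived instrument orientation tolerant to it.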
VOROS et al.: ViKY ROBOTIC SCOPE HOLDER: INITIAL CLINICAL EXPERIENCE AND PRELIMINARY RESULTS 885
intraoperatively using the results of the instrument’s detection. This would be important if the positions of the insertion points change when the pneumoperitoneum pressure is modified. With the patient’s abdomen being insufflated with gas, the effects of respiration should be minimal. Further, we did not observe significant displacements of the insertion points during our pig experiments.

The integration into a probabilistic framework allowed us to take into account the position of the tip of the instruments in previous frames instead of working frame by frame, thus increasing the speed of the detection, as shown by the results presented in Section V-B. The feasibility of this new approach was validated during animal experiments, in which we observed two main reasons for tracking failure.

1) Specular reflections on the organs. We are currently investigating two solutions for specular reflections: a software-based removal, which is very costly in computation time, and solutions based on polarizing filters, which decrease the illumination of the scene [26].

2) Presence of organs with long and straight edges. We are currently investigating a solution that would consist in measuring patient-specific color information at the beginning of the procedure, in order to remove edges corresponding to organs in the search for the instrument.

VII. CONCLUSION

Our clinical experience with a lightweight robotic endoscope holder is very encouraging. Indeed, such a system can provide a “third hand” to the surgeon during laparoscopy and a viable option for single-port surgery. Preliminary results of instrument tracking based on image analysis demonstrated the feasibility and the safety of this approach in preclinical conditions. Limitations were identified and solutions are being investigated. A prospective randomized study comparing the use of the ViKY to conventional practice is ongoing. This technology, once fine-tuned, has the potential to take robotic surgery to a new level by letting the robot intelligently assist the surgeon, while allowing the surgeon to focus on the surgical act rather than on the interaction with the robot.

REFERENCES

[1] B. Makhoul, A. De La Taille, D. Vordos, L. Salomon, P. Sebe, J. Audet, L. Ruiz, A. Hoznek, P. Antiphon, A. Cicco, R. Yiou, D. Chopin, and C. Abbou, “Laparoscopic radical nephrectomy for T1 renal cancer: The gold standard? A comparison of laparoscopic vs open nephrectomy,” BJU Int., vol. 93, no. 1, pp. 67–70, 2004.
[2] J. C. Hu, X. Gu, S. R. Lipsitz, M. J. Barry, A. V. D’Amico, A. C. Weinberg, and N. L. Keating, “Comparative effectiveness of minimally invasive vs open radical prostatectomy,” J. Amer. Med. Assoc. (JAMA), vol. 302, no. 14, pp. 1557–1564, 2009.
[3] P. Ballester, Y. Jain, K. R. Haylett, and R. F. McCloy, “Comparison of task performance of robotic camera holders EndoAssist and Aesop,” Int. Congr. Ser., vol. 1230, pp. 1100–1103, Jun. 2001.
[4] J.-A. Long, P. Cinquin, J. Troccaz, S. Voros, P. Berkelman, J.-L. Descotes, C. Letoublon, and J.-J. Rambeaud, “Development of miniaturized light endoscope holder robot for laparoscopic surgery,” J. Endourol., vol. 21, no. 8, pp. 911–914, Aug. 2007.
[5] L. R. Kavoussi, R. G. Moore, J. B. Adams, and A. W. Partin, “Comparison of robotic versus human laparoscopic camera control,” J. Urol., vol. 154, no. 6, pp. 2134–2136, Dec. 1995.
[6] P. A. Finlay, “Clinical experience with a goniometric head-controlled laparoscope manipulator,” in Proc. IARP Workshop Med. Robot., Vienna, Austria, Oct. 1996.
[7] A. Nishikawa, T. Hosoi, K. Koara, D. Negoro, A. Hikita, S. Asano, F. Miyazaki, M. Sekimoto, Y. Miyake, M. Yasui, and M. Monden, “Real-time visual tracking of the surgeon’s face for laparoscopic surgery,” in Proc. Med. Image Comput. Comput.-Assisted Intervention Conf. (MICCAI), Lecture Notes Comput. Sci., vol. 2208, pp. 9–16, 2001.
[8] O. Tonet, T. U. Ramesh, G. Megali, and P. Dario, “Image analysis-based approach for localization of endoscopic tools,” in Proc. Surgetica, 2005, pp. 221–228.
[9] P. Berkelman, E. Boidard, and P. Cinquin, “Automatic instrument tracking with a compact laparoscopic endoscope robot using an external optical localizer,” in Proc. Surgetica, 2002, pp. 17–24.
[10] A. Krupa, J. Gangloff, C. Doignon, M. de Mathelin, G. Morel, J. Leroy, L. Soler, and J. Marescaux, “Autonomous 3-D positioning of surgical instruments in robotic laparoscopic surgery using visual servoing,” IEEE Trans. Robot. Autom., vol. 19, no. 5, pp. 842–853, 2003.
[11] D. Burschka, J. J. Corso, M. Dewan, W. Lau, M. Li, H. Lin, P. Marayong, N. Ramey, G. D. Hager, B. Hoffman, D. Larkin, and C. Hasser, “Navigating inner space: 3-D assistance for minimally invasive surgery,” Robot. Auton. Syst., vol. 52, no. 1, pp. 5–26, 2005.
[12] G. Wei, K. Arbter, and G. Hirzinger, “Real-time visual servoing for laparoscopic surgery: Controlling robot motion with color image segmentation,” IEEE Eng. Med. Biol. Mag., vol. 16, no. 1, pp. 40–45, Jan./Feb. 1997.
[13] A. Casals, J. Amat, and E. Laporte, “Automatic guidance of an assistant robot in laparoscopic surgery,” in Proc. IEEE Int. Conf. Robot. Autom., Apr. 1996, vol. 1, pp. 895–900.
[14] Y. Wang, D. R. Uecker, and Y. Wang, “A new framework for vision enabled and robotically assisted minimally invasive surgery,” Comput. Med. Imaging Graph., vol. 22, no. 6, pp. 429–437, 1998.
[15] S. Speidel, G. Sudra, S. Schalck, B. P. Müller-Stich, C. Gutt, and R. Dillmann, “Automatic image-based analysis for context-aware assistance in minimally invasive surgery,” in Proc. Surgetica, 2007, pp. 53–62.
[16] C. Doignon, P. Graebling, and M. de Mathelin, “Real-time segmentation of surgical instruments inside the abdominal cavity using a joint hue saturation color feature,” Real-Time Imaging, vol. 11, no. 5–6, pp. 429–442, 2005.
[17] J. Climent and P. Marés, “Automatic instrument localization in laparoscopic surgery,” Electron. Lett. Comput. Vis. Image Anal., vol. 4, no. 1, pp. 21–23, 2004.
[18] S. Voros, E. Orvain, J.-A. Long, and P. Cinquin, “Automatic detection of instruments in laparoscopic images: A first step towards high level command of robotized endoscopic holders,” Int. J. Robot. Res., vol. 26, no. 11–12, pp. 1173–1190, 2007.
[19] Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. (PAMI), vol. 22, no. 11, pp. 1330–1334, 2000.
[20] C. Doignon, F. Nageotte, and M. de Mathelin, “Segmentation and guidance of multiple rigid objects for intra-operative endoscopic vision,” in Proc. Workshop Dynamical Vision, ECCV 2005/2006, Lecture Notes Comput. Sci., vol. 4358, pp. 314–327, 2007.
[21] OpenCV library. [Online]. Available: http://sourceforge.net/projects/opencvlibrary/
[22] M. Isard and A. Blake, “Condensation—Conditional density propagation for visual tracking,” Int. J. Comput. Vis., vol. 29, no. 1, pp. 5–28, 1998.
[23] N. Otsu, “A threshold selection method from gray level histograms,” IEEE Trans. Syst., Man Cybern., vol. 9, no. 1, pp. 62–66, Jan. 1979.
[24] S. Crouzet, G. P. Haber, W. M. White, K. Kamoi, R. K. Goel, and J. H. Kaouk, “Single-port, single-operator-light endoscopic robot-assisted laparoscopic urology: Pilot study in a pig model,” BJU Int., vol. 105, no. 5, pp. 682–685, Mar. 2010.
[25] F. Dornaika and R. Horaud, “Simultaneous robot-world and hand-eye calibration,” IEEE Trans. Robot. Autom., vol. 14, no. 4, pp. 617–622, Aug. 1998.
[26] J. Duchateau, S. Voros, and P. Cinquin, “Development and precision evaluation of an image-analysis based endoscopic tool tracking system,” presented at the Surgetica Conf., Chambéry, France, Sep. 19, 2007.

Authors’ photographs and biographies not available at the time of publication.