Oblique-viewing endoscope calibration in the diagnostics and treatment in the pelvis minor area

*Corresponding author: Krzysztof Holak, Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Al. Mickiewicza 30, 30-059 Krakow, Poland; e-mail: holak@agh.edu.pl
Piotr Kohut and Maciej Petko: Faculty of Mechanical Engineering and Robotics, AGH University of Science and Technology, Krakow, Poland; http://orcid.org/0000-0001-7307-6389 (M. Petko)
Klaudia Stangel-Wojcikiewicz and Monika Kabzińska-Turek: Department of Gynecology and Oncology, Jagiellonian University Collegium Medicum, Krakow, Poland; http://orcid.org/0000-0001-9773-6767 (K. Stangel-Wojcikiewicz)
Joanna Florczak: Department of Measurement and Electronics, AGH University of Science and Technology, Krakow, Poland

Abstract

Urinary incontinence (UI) is a common condition, especially in women, and affects quality of life physically, socially, and economically. Despite improvements in surgical techniques and the introduction of minimally invasive procedures, stress UI still compromises the well-being of both male and female patients. Treatment limitations have encouraged researchers to investigate new approaches, including tissue engineering. The injection of autologous mesenchymal-derived stem cells (AMDC) may rebuild urethral sphincter function and minimize leakage symptoms. The treatment is carried out with a rigid endoscope. The aim of this study is to present a practical calibration procedure for an oblique-viewing endoscope imaging system. This article presents the results of an examination of the variability of the camera's internal parameters with the angle of rotation of the endoscope's cylinder. The research shows that the most variable parameters are the coordinates of the image plane's principal point. The developed model of variability can be implemented as a simple look-up table in a real-time operating device. In addition, a tool is proposed for the computation of the relative angle of the cylinder's rotation based only on images. All developed methods can be implemented in a robot-assisted system for an AMDC urethral sphincter injection procedure.

Introduction

Endoscopic procedures are commonly used for diagnostic and therapeutic indications in urology. An optical system allows the examiner to obtain a view of parts of the body, including the urinary bladder. Treatments are performed with flexible endoscopes under local anesthesia. One application is the identification of urinary bladder cancer. Rigid endoscopes are used for the investigation of cystolithiasis; detected bladder stones are eliminated with laser light or ultrasonic waves using a technique called lithotripsy. Endoscopic methods are also used in the treatment of stress urinary incontinence (UI): the device enters the urethra and delivers a substance, such as autologous mesenchymal-derived stem cells (AMDC), into the sphincter, which helps in its reconstruction.

In endoscopy of the lower urinary system, flexible devices are used for the observation of the urethra and urinary bladder. Nowadays, optical fibers are used for image and video transmission. This type of cystoscopy can also be used for biopsy and the excision of small tumors. In diagnostics, rigid endoscopes have been almost entirely replaced by flexible ones, but they still find application in urinary bladder biopsy and other procedures. Ureteroscopy, the examination of the upper urinary system, is carried out with semiflexible or flexible devices: semiflexible devices are used in the diagnosis of the ureter and renal pelvis, whereas fully flexible ones are applied in the observation of the major and minor renal calyces. Ureteroscopy is also used in the treatment of urolithiasis in the upper urinary system; the same laser- or ultrasound-based methods are applied as in the case of bladder stones [1].

UI may affect up to 200 million people in the world, and a substantial number of patients are women [2]. The implantation of AMDC into the urethral sphincter is one of the most promising treatments: stem cells make it possible to reconstruct the sphincter function and eliminate leakage.
The treatment is carried out with a rigid oblique-viewing endoscope, which is introduced into the patient's body through the urethra. Based on the camera image, the physician finds the approximate point for stem cell injection; the cells are injected through a needle mounted in the additional channel of the device. The main limitation of the method is the absence of a noticeable border between the sphincter and the surrounding tissues in the image, which makes the injection more difficult to carry out. The physician chooses the best injection site based on general guidelines and personal experience. In addition, the entire diagnosis or treatment is performed manually by a physician, with only a video presented on a monitor screen as a guide.

In current medical practice, diagnostic and treatment techniques can be improved by the introduction of modern technology and computational methods. Advanced image processing and analysis as well as computer graphics are becoming integral parts of various biomedical systems [3, 4]. For example, augmented reality (AR) systems make it possible to superimpose 3D models of organs created from ultrasound or tomography onto the image captured by a camera during diagnosis [5]. Additionally, the area of endoscopy and minimally invasive surgery has been revolutionized by the introduction of surgical robots. These two advances in technology allow scientists to design a robotic system for the diagnosis and treatment of the pelvis minor area. The main purpose of the developed medical robot [6] is to deliver stem cells to the area of the urinary bladder sphincter through the natural orifice by injecting them with a needle mounted at the end of the robot's tool. The device makes the treatment much faster and more repeatable. Automatic processing of the video captured by the endoscopic camera helps position the robot's hand inside the lumen of the urethra. An interactive graphical user interface (GUI) is being developed to help the user localize regions of interest inside the body, both in images transmitted by the endoscopic camera and in cross-sections obtained by magnetic resonance imaging (MRI), computed tomography (CT), and ultrasonography (USG) systems.

The correct measurement of any geometric quantities in images requires precise knowledge of the camera parameters obtained in the process of camera calibration. In this article, a calibration procedure for an oblique-viewing endoscope imaging system is presented. The procedure applies a standard calibration method for a set of angles of the cylinder's rotation with respect to the camera's body frame.
This article presents the results of a laboratory examination of the variability of the camera's internal parameters with the angle of rotation of the endoscope's cylinder. The research shows that the most variable parameters are the coordinates of the principal point; the focal length and lens distortion parameters can be assumed constant. A fifth-degree polynomial is used to approximate the values of the principal point coordinates for all intermediate angles of rotation. The model can be implemented as a simple look-up table in a real-time operating device. The required input is the relative angle of the cylinder's rotation, which can be obtained from an encoder or an external tracking device. In this article, a tool for the computation of the relative angle based only on the images is proposed. All developed methods can be implemented in a robot system applicable to the diagnosis and treatment of the pelvis minor area of the human body, mainly for the robot-assisted delivery of stem cells to the female bladder sphincter muscle.

Materials and methods

Calibration of an oblique-viewing endoscope

Rigid endoscopes can be divided into two groups: forward-viewing endoscopes, whose optical axis coincides with the axis of the cylinder's rotation, and oblique-viewing endoscopes, whose optical axis is skewed with respect to the cylinder's axis of rotation. The angle between the two axes is usually 30° or 70°. In the first group, the field of view is restricted and depends on the point at which the device was inserted into the patient's body. For oblique-viewing endoscopes, the field of view can be increased by rotating the cylinder. The calibration of a forward-viewing endoscope is straightforward, because it has to be performed only once using a standard method, whereas for oblique-viewing devices the variability of the camera parameters with the changing angle of the cylinder's rotation must be taken into account.

Camera calibration is used for the computation of the camera model, which describes the projection of 3D points onto the image plane of a sensor. For the forward-viewing endoscope, image formation can be approximated by the well-known pinhole camera model [7] and modeled by a central projection, as described by Eq. (1):

    \lambda x = K P_0 G X    (1)

where X is a 3D point of an object, x is the image point corresponding to X (both in homogeneous coordinates), P_0 is the central projection matrix, K is the matrix of intrinsic parameters, G is the matrix of extrinsic parameters consisting of a rotation matrix and a translation vector, and \lambda is a scaling factor.

The intrinsic parameters describe the process of image formation and include the focal length f and the pixel sizes sx and sy (pixels/mm), more commonly expressed as the focal length in pixels (fx = f·sx, fy = f·sy); the principal point (px, py, in pixels), which lies approximately at the center of the image plane; and the radial (k1, k2) and tangential (p1, p2) lens distortion coefficients. The focal length and the principal point are collected in the 3×3 matrix K [Eq. (2)]:

    K = \begin{bmatrix} f s_x & 0 & p_x \\ 0 & f s_y & p_y \\ 0 & 0 & 1 \end{bmatrix}    (2)

The extrinsic parameters describe the position and orientation of the camera {R, T} with respect to a global coordinate frame of reference associated with an object in the scene (i.e. the calibration board).

The impact of lens distortions on image formation is modeled, for example, by the following relation:

    x_d = x (1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2 (r^2 + 2 x^2)
    y_d = y (1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_2 x y + p_1 (r^2 + 2 y^2)    (3)

where x_d and y_d are the coordinates of image points with lens distortions present in the image and r^2 = x^2 + y^2.
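To make Eqs. (1)-(3) concrete, the following minimal NumPy sketch projects a single 3D point through the pinhole model and then applies the distortion relation. The intrinsic, extrinsic, and distortion values are illustrative assumptions, not calibration results from this study.

```python
import numpy as np

def project_point(X_world, K, R, t, dist):
    """Project a 3D point using the pinhole model of Eqs. (1)-(3).

    X_world : (3,) point in the calibration-board frame
    K       : 3x3 intrinsic matrix, Eq. (2)
    R, t    : extrinsic rotation (3x3) and translation (3,)
    dist    : (k1, k2, k3, p1, p2) lens distortion coefficients, Eq. (3)
    """
    k1, k2, k3, p1, p2 = dist
    # Extrinsic transform followed by central projection (lambda x = K P0 G X).
    Xc = R @ X_world + t
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]          # normalized image coordinates
    r2 = x**2 + y**2
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x**2)
    yd = y * radial + 2 * p2 * x * y + p1 * (r2 + 2 * y**2)
    # Map distorted normalized coordinates to pixels with K.
    u = K[0, 0] * xd + K[0, 2]
    v = K[1, 1] * yd + K[1, 2]
    return u, v

# Illustrative values only, roughly on the scale of the endoscope camera.
K = np.array([[720.0, 0.0, 670.0],
              [0.0, 720.0, 560.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 100.0])    # board 100 mm in front of the camera
print(project_point(np.array([10.0, -5.0, 0.0]), K, R, t,
                    dist=(-0.38, 0.16, 0.0, 0.003, 0.0)))
```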
The image formation model is presented graphically in Figure 1.

Figure 1: Image formation process. Pinhole camera model and graphical representation of the intrinsic camera parameters.

For an oblique-viewing endoscope, the camera's internal parameters change during endoscope operation, because there is a relative rotation between the camera's image plane and the optical elements inside the endoscope's cylinder. Therefore, a single camera calibration is not sufficient. In the literature, various modifications of the standard camera model have been proposed together with dedicated calibration procedures; the modeling and correction of lens distortions is also currently attracting a lot of attention. The proposed mathematical models can be divided into three groups:
- Models in which the cylinder's rotation is described by a change in the camera's external parameters but not in the internal parameters: the model proposed by Yamaguchi et al. [8, 9], in which the cylinder's physical rotation is modeled by two mathematical rotations, and a modification of the Yamaguchi model introduced by Wu et al. [10], in which the cylinder's rotation is captured in an additional external parameter.
- Camera models in which the cylinder's rotation is described by a change of the internal parameters: De Buck's model [11], in which the influence of the cylinder's rotation on the internal parameters is modeled by a homographic transformation and a correction coefficient for the position of the principal point, and Melo's model [12], in which the cylinder's rotation is modeled by a 2D rotation of the image plane, modifying the K matrix of the optical system.
- The model developed by Barreto et al. [13], based on the tracking of projection lines, which is a generalization of the image formation process. This model can describe devices that cannot be modeled by the classical pinhole camera model and central projection geometry.

The calibration procedures and endoscope tracking methods described in the literature [8-11] require encoders or special markers and additional tracking devices (e.g. a leap motion controller that uses two optical sensors and three infrared LEDs and gives the location of the tip in Cartesian coordinates for every frame) [14]. In contrast, the method developed by Fukuda [15] computes the cylinder's angle of rotation based only on captured images: a special marker placed at the end of the endoscope carries information about the relative angle between the body of the endoscope and its cylinder. The calibration method described by Barreto et al. [13] requires a special table of coded markers.
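As a rough illustration of the internal-parameter models above (and of the principal-point behaviour reported later in the Results), the sketch below rotates the principal point about an assumed fixed centre by the relative cylinder angle, while keeping the focal length and distortion at their reference values. The centre and the update rule are simplifying assumptions for illustration, not the exact formulations of [11] or [12].

```python
import numpy as np

def update_intrinsics(K_ref, center, theta_deg):
    """Illustrative update of the intrinsic matrix for a cylinder rotation.

    Simplified sketch of the idea behind internal-parameter models [11, 12]:
    the principal point is rotated by the relative cylinder angle about an
    assumed fixed centre in the image, while focal length and distortion
    stay at their reference values. Not the exact model of [11] or [12].
    """
    theta = np.deg2rad(theta_deg)
    c, s = np.cos(theta), np.sin(theta)
    R2 = np.array([[c, -s], [s, c]])
    p_ref = K_ref[:2, 2]                     # reference principal point (px, py)
    p_new = R2 @ (p_ref - center) + center   # rotate about the assumed centre
    K_new = K_ref.copy()
    K_new[:2, 2] = p_new
    return K_new

K_ref = np.array([[720.0, 0.0, 670.6],
                  [0.0, 720.0, 562.7],
                  [0.0, 0.0, 1.0]])
# Assumed rotation centre, e.g. the centre of the circular field-of-view boundary.
print(update_intrinsics(K_ref, center=np.array([669.0, 461.0]), theta_deg=90.0))
```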
In the research presented in this article, a two-step method of camera calibration was developed. In the first step, a set of reference parameters was defined for a chosen, fixed orientation of the endoscope's cylinder, the reference orientation: the values of the focal length, principal point, and distortion coefficients were computed using a standard calibration procedure [16]. In the second step, the variability of the parameters with the changing angle of the cylinder's rotation was investigated. The obtained data were used to express all camera parameters as functions of the angle of rotation: camera calibration was carried out for a chosen set of angular orientations of the cylinder, and the values of the parameters for the intermediate angles were computed by polynomial interpolation. During the robot-aided treatment, the values of the camera parameters are stored in the form of a look-up table. The relative angle of rotation is calculated with the help of a special marker placed at the end of the endoscope, detected by means of image processing techniques. The correct values of the camera parameters can then be found in the look-up table and used for the computation of the geometrical quantities that are of interest to the physician operating the device.

Detection of the angle of cylinder's rotation

A marker in the shape of a needle was placed at the end point of the endoscope's cylinder, as shown in Figure 1. The task of the vision system was the detection of the marker and the computation of the angle of rotation with respect to the reference orientation fixed in the first step of the calibration; the detected position is used to calculate the true relative angle of the cylinder's rotation. The developed vision-based detection algorithm for the needle-like marker consists of four steps:
- image segmentation and edge detection,
- Hough transform for the detection of the active part of the image bounded by a circle, and computation of its geometrical center,
- log-polar transform of the image and thresholding, and
- detection of the needle's tip in the transformed image using corner detection and bottom-up raster scanning, and computation of the angle of rotation.

In the first step of the method, thresholding and edge detection are carried out. Edge detection can be performed using high-pass filters or the Canny algorithm. This is a preprocessing step that makes the subsequent steps computationally less expensive.

In the second step, the shape encircling the active area of the image is detected. The Hough transform is applied to detect the circular contour; its output parameters are the radius and the x and y coordinates of the found circle. The detection of the center of the circle is necessary for the next step, the log-polar transform of the image.

The third step is the computation of the log-polar transform of the input image. It changes the coordinate frame of the image from the xy basis to the \rho\theta basis according to the following formula:

    \rho = \log \sqrt{x^2 + y^2}, \qquad \theta = \mathrm{atan}(y / x)    (4)

The center of the transform must coincide with the detected center of the active image region. The detection of the marker's tip is much simpler once the image has been transformed by the log-polar operation. First, the image is processed by a corner detection algorithm (e.g. the Harris corner detector). A large set of corners is extracted because of the nature of the image (checkerboard pattern). The next task is to find the corner corresponding to the marker: the image is scanned row by row from the bottom, and the first corner encountered by the raster scan is the corner corresponding to the marker. After the detection is performed, the angle of rotation can be computed by inverse log-polar mapping.
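A possible realisation of this four-step pipeline with OpenCV is sketched below; the threshold, Hough, and Harris parameters are assumptions chosen for illustration, not the values used by the authors.

```python
import cv2
import numpy as np

def estimate_marker_angle(frame_bgr):
    """Sketch of the four-step marker detection pipeline described above."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Step 1: preprocessing - thresholding and edge detection.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(binary, 50, 150)

    # Step 2: Hough transform for the circle bounding the active image area.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, 1, gray.shape[0],
                               param1=150, param2=40,
                               minRadius=gray.shape[0] // 4,
                               maxRadius=gray.shape[0] // 2)
    if circles is None:
        return None
    cx, cy, radius = circles[0, 0]

    # Step 3: log-polar transform centred on the detected circle centre.
    polar = cv2.warpPolar(edges, (0, 0), (float(cx), float(cy)), float(radius),
                          cv2.WARP_POLAR_LOG + cv2.INTER_LINEAR)

    # Step 4: Harris corners in the transformed image, then a bottom-up scan
    # (largest row index) to pick the corner belonging to the needle marker.
    response = cv2.cornerHarris(np.float32(polar), blockSize=2, ksize=3, k=0.04)
    corners = np.argwhere(response > 0.01 * response.max())
    if corners.size == 0:
        return None
    row = corners[:, 0].max()
    # In the warpPolar output the angle is spread linearly along the rows.
    return 360.0 * row / polar.shape[0]
```

In a deployed system, the value returned by such a routine would be taken relative to the reference orientation fixed during the first calibration step.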
The entire marker detection algorithm is presented in the form of a flow chart in Figure 2.

A calibration procedure based on the developed algorithms has been proposed. It consists of several steps. First, a set of calibration images is captured for each chosen angular position of the endoscope's cylinder. Next, the intrinsic calibration is performed for each angle of rotation based on the method presented by Zhang [16]. After the camera parameters are computed for each angle, the values of the parameters for angular positions in between the processed ones are calculated by polynomial interpolation. The resulting data are stored in a look-up table. The entire procedure is summarized in the flow chart presented in Figure 3.

Figure 2: Flow chart of the developed marker detection method.

Figure 3: Flow chart of the developed calibration procedure.

Experimental investigation of the method

The experimental rig is depicted in Figure 4. The experimental stand was composed of the following items: a Mitsubishi RV-2AJ robot with a calibration table mounted on the flange of the robot's last link, an endoscope camera (Point Grey Flea3 FL3-GE-14S3C-C) equipped with Karl Storz 28208BA rigid oblique-viewing endoscope optics, and a halogen-based lighting system. The basic technical parameters of the camera are: CCD image sensor, Sony progressive scan ICX267, size 1/2 inch with a pixel size of 4.65 µm; resolution, 1384×1032 pixels; frame rate, 18 fps at 1384×1032; and A/D converter, 12-bit. The robot's motion sequences were programmed in the Melfa Basic language, whereas image acquisition was carried out with the FlyCapture2 Camera Selection software. The camera calibration process was performed with the developed Wiz2D calibration module [17].

Figure 4: Experimental setup consisting of the Mitsubishi RV-2AJ robot, the endoscope equipped with a Point Grey Flea3 camera, the calibration board (in the form of a chessboard planar pattern), and two halogen lamps.

In the first stage of calibration, the reference parameters of the model were determined: the internal parameters of the camera were estimated for a fixed position of the cylinder. As the calibration board, a chessboard planar pattern with black-and-white squares, an even number of rows and an odd number of columns, was used.
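The per-angle intrinsic calibration can be reproduced with OpenCV's implementation of Zhang's method [16], as in the sketch below; the board geometry (9×6 inner corners, 2 mm squares) and the per-angle image folders are assumptions made for illustration, not the authors' exact setup.

```python
import glob
import cv2
import numpy as np

# Assumed board geometry and file layout (e.g. "calib/045/*.png" per angle).
PATTERN = (9, 6)          # inner corners per row and column (assumption)
SQUARE_MM = 2.0           # square size in millimetres (assumption)

def calibrate_for_angle(image_dir, pattern=PATTERN, square=SQUARE_MM):
    """Intrinsic calibration for one cylinder angle using Zhang's method [16]."""
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

    obj_points, img_points, size = [], [], None
    for path in sorted(glob.glob(image_dir + "/*.png")):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if not found:
            continue
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, size, None, None)
    return rms, K, dist

# One calibration per investigated cylinder angle (0°, 45°, ..., 360°).
for angle in range(0, 361, 45):
    rms, K, dist = calibrate_for_angle(f"calib/{angle:03d}")
    print(angle, rms, K[0, 2], K[1, 2])   # reprojection error and principal point
```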
The angle of rotation was changed in the range of 0° to 360° in 45° steps. For each rotation angle of the endoscope's cylinder, 15 calibrations were performed. To achieve high repeatability (0.02 mm) of the calibration board locations, the Mitsubishi RV-2AJ robot was used to move the calibration pattern to the same set of locations for each calibration series; the robot's motion program was written in the Melfa Basic language.

Results

In the experiment, the camera's internal parameters were computed for a set of angles of rotation of the endoscope's cylinder in the range of 0°-360° (steps of 45°). For each angle of rotation, the mean and standard deviation of each camera parameter were computed from a set of 15 calibration results. The total number of images of the calibration board was 3240. The results are presented in Table 1. The graphs of the variability of all camera parameters with respect to the angle of rotation are presented in Figures 5-8. Figure 9 shows the map of lens distortions of the examined device for one chosen angle of the cylinder's rotation.

Table 1: Means and standard deviations of the camera's internal parameters for each investigated angle of rotation of the endoscope's cylinder obtained in the experiment (values given as mean ± standard deviation).

Angle (°) | fx (pixels)       | fy (pixels)       | px (pixels)       | py (pixels)       | k1                | k2                | p1                 | p2
0         | 719.117 ± 2.6296  | 719.564 ± 2.56305 | 670.581 ± 1.31471 | 562.728 ± 1.60064 | -0.3818 ± 0.00431 | 0.15818 ± 0.00652 | 0.00286 ± 0.00069  | -0.0001 ± 0.00042
45        | 725.3 ± 2.09435   | 725.537 ± 2.0033  | 750.388 ± 1.24077 | 524.603 ± 1.64191 | -0.3853 ± 0.00303 | 0.1546 ± 0.00368  | 0.00235 ± 0.00063  | 0.00052 ± 0.00022
90        | 726.114 ± 1.45325 | 725.512 ± 1.45278 | 772.661 ± 1.00261 | 464.023 ± 0.84191 | -0.3908 ± 0.00297 | 0.16417 ± 0.00498 | 0.00078 ± 0.00029  | 0.00086 ± 0.00038
135       | 724.513 ± 1.98251 | 724.169 ± 2.38192 | 750.92 ± 2.24873  | 398.506 ± 0.95794 | -0.3851 ± 0.00335 | 0.1443 ± 0.00497  | 0.00092 ± 0.00024  | 0.00041 ± 0.00121
180       | 725.056 ± 1.12787 | 724.577 ± 1.15488 | 659.92 ± 1.2315   | 358.45 ± 1.30467  | -0.3942 ± 0.00824 | 0.16046 ± 0.01872 | 0.00038 ± 0.00031  | -0.0007 ± 0.00033
225       | 720.862 ± 1.61145 | 720.604 ± 1.57159 | 607.425 ± 1.28217 | 379.434 ± 1.42264 | -0.3847 ± 0.00505 | 0.15743 ± 0.01432 | 0.00801 ± 0.03734  | -0.0007 ± 0.00027
270       | 721.932 ± 1.01998 | 720.424 ± 1.09162 | 566.559 ± 1.25828 | 456.539 ± 1.46235 | -0.3846 ± 0.00383 | 0.15963 ± 0.00996 | -0.0008 ± 0.00026  | -0.0022 ± 0.0003
315       | 719.352 ± 2.55638 | 717.433 ± 2.65979 | 586.401 ± 1.52514 | 518.741 ± 0.73297 | -0.3822 ± 0.00347 | 0.16296 ± 0.00817 | -0.0008 ± 0.00022  | -0.0038 ± 0.00082
360       | 717.701 ± 1.19453 | 717.967 ± 1.20968 | 676.725 ± 1.11996 | 564.883 ± 0.95919 | -0.3809 ± 0.0031  | 0.1623 ± 0.00917  | 0.00172 ± 0.0003   | 0.0005 ± 0.00028

Camera internal parameters: fx and fy (pixels) are the focal lengths in pixels computed for the horizontal and vertical directions, respectively; px and py (pixels) are the coordinates of the principal point; k1 and k2 (dimensionless) are radial lens distortion coefficients; and p1 and p2 (dimensionless) are tangential lens distortion coefficients.

Figure 5: Focal length variability as a function of the angle of rotation.

As the examination showed, the only substantial change of the camera's internal parameters was associated with the principal point x and y coordinates, which exhibit a sinusoidal characteristic. The other parameters showed no periodic behavior, and their variability was very small compared with that of the principal point. A fifth-degree polynomial [Eq. (5)] was chosen to approximate the px and py parameters as functions of the angle:

    f(x) = a x^5 + b x^4 + c x^3 + d x^2 + e x + f    (5)

The graphs of the approximating polynomials are shown in Figure 10. The obtained functions were used to generate the look-up table for the robot system.
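The sketch below fits Eq. (5) to the mean principal-point coordinates from Table 1 and builds a per-degree look-up table. Since the article does not report the fitted coefficients, the resulting polynomials are only illustrative of how such a table can be generated.

```python
import numpy as np

# Mean principal-point coordinates from Table 1 (pixels) vs. cylinder angle (degrees).
angles = np.arange(0, 361, 45)
px = np.array([670.581, 750.388, 772.661, 750.92, 659.92,
               607.425, 566.559, 586.401, 676.725])
py = np.array([562.728, 524.603, 464.023, 398.506, 358.45,
               379.434, 456.539, 518.741, 564.883])

# Fifth-degree polynomial approximation, Eq. (5); the coefficients obtained here
# are illustrative, as the article does not report the authors' fitted values.
coeff_px = np.polyfit(angles, px, 5)
coeff_py = np.polyfit(angles, py, 5)

# Look-up table of interpolated principal-point positions, e.g. every 1 degree.
lut_angles = np.arange(0, 361, 1)
lut = np.column_stack((lut_angles,
                       np.polyval(coeff_px, lut_angles),
                       np.polyval(coeff_py, lut_angles)))

def principal_point(relative_angle_deg):
    """Return (px, py) for the relative cylinder angle using the look-up table."""
    idx = int(round(relative_angle_deg)) % 361
    return lut[idx, 1], lut[idx, 2]

print(principal_point(100.0))
```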
Figure 6: Principal point coordinate variability as a function of the angle of rotation.

Figure 7: Radial lens distortion coefficient variability as a function of the angle of rotation.

Figure 8: Tangential lens distortion coefficient variability as a function of the angle of rotation.

Figure 9: Map of lens distortions for one fixed orientation of the endoscope's cylinder.

Figure 10: Polynomials approximating the measured values of the principal points (marked as stars).

Conclusion

Endoscopic procedures are an important diagnostic and therapeutic tool in urology. The oblique-viewing endoscope has a larger field of view than the forward-viewing one, owing to the possibility of rotating the cylinder and to the optical axis being skewed with respect to the axis of the cylinder. Therefore, oblique-viewing endoscopes have become increasingly popular in clinical practice.

A calibration procedure for an oblique-viewing endoscope camera has been developed and examined in the research presented in this article. The research shows that the most variable parameters are the coordinates of the principal point; the variation of the other intrinsic parameters is relatively small and can be neglected in practical applications. The calibration of the internal parameters is performed for a small set of angles of rotation, and a fifth-degree polynomial is used to approximate the values of the principal point coordinates for all intermediate angles of rotation. This simple model can be implemented as a look-up table in a real-time operating device. The necessary input is the relative angle of the cylinder's rotation; a tool for computing the relative angle based only on the images is proposed in this article. All developed methods can be implemented in a robot system used for the delivery of AMDC into the female bladder urethral sphincter muscle through the urethra, as well as for other types of diagnosis and treatment in the pelvis minor area of the human body.

Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.
Research funding: The reported research was realized within the confines of project PBS 1/A9/3/2012, supported by the Polish National Center for Research and Development.
Employment or leadership: None declared.
Honorarium: None declared.
Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

Journal: Bio-Algorithms and Med-Systems (de Gruyter). Published: Sep 1, 2016. DOI: 10.1515/bams-2016-0013. ISSN 1895-9091; eISSN 1896-530X.