Image Generator to Support the Application of a Haptic
Device for the Simulation of Arthroscopic Surgery
By
Renata Zabawa
Project Advisor:
Dr. Thomas L. Stewart
Bradley University
Department of Electrical and Computer Engineering
Submission Date:
December 2005
EE451 Senior Capstone Project
Summary:
Magnetic Resonance Imaging (MRI) produces hundreds of images, each showing a cross section of the knee. The goal of this project is to take the cross sections of a knee MRI and extract a three-dimensional model of the cartilage. This model is used to simulate a surgeon’s view during arthroscopic surgery. A second goal is to apply these results to simulate the arthroscopic surgery with a haptic feedback system. The simulation could aid medical students by allowing them to practice arthroscopic surgery in a simulator before performing the procedure on a patient.
System Block Diagrams:
Figure 1 shows the overall system block diagram. The input is the image data from the MRI scans, each of which is 500 × 500 pixels. This data is used to create the model of the cartilage and to generate the simulation of the arthroscopic surgery.
Figure 2 shows a high-level block diagram for the entire system. Each set of MRI scan data points shows a cross section of the knee. DSP software written in Matlab generates a 3-D model of the cartilage, and the model is then used to create a simulation of an arthroscopic surgery.
Software Block Diagrams:
Figure 3 illustrates the DSP software used. Matlab loads the data from an MRI scan. Generating the three-dimensional model of the cartilage takes several steps. First, the data is converted into the required format if necessary. Then isosurfaces and isocaps are computed for each set of MRI points. Isosurfaces use data that has been smoothed to display the overall structure of the knee, while isocaps use unsmoothed data to reveal details of the interior of the isosurfaces. Together, the two surfaces form a three-dimensional model of the cartilage that shows both the overall structure and the interior details of the knee. The model shows the two menisci in the knee; Matlab is then used to separate them so that one meniscus can be viewed at a time.
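The smoothing-then-thresholding idea behind the isosurface step can be illustrated outside of Matlab. Below is a hypothetical Python/NumPy sketch (the project itself uses Matlab’s smooth3, isosurface, and isocaps functions): a synthetic volume stands in for the MRI data, a 3 × 3 × 3 box filter smooths it, and the isosurface is taken as the set of voxels above an isovalue.

```python
import numpy as np

def box_smooth3(vol):
    """3x3x3 box filter, analogous to Matlab's smooth3(vol, 'box', 3)."""
    p = np.pad(vol, 1, mode='edge')
    out = np.zeros(vol.shape, dtype=float)
    for dx in range(3):
        for dy in range(3):
            for dz in range(3):
                out += p[dx:dx + vol.shape[0],
                         dy:dy + vol.shape[1],
                         dz:dz + vol.shape[2]]
    return out / 27.0

# Synthetic stand-in for MRI data: a bright ball of tissue in a dark volume.
x, y, z = np.mgrid[-1:1:40j, -1:1:40j, -1:1:40j]
vol = (x**2 + y**2 + z**2 < 0.5).astype(float)

smoothed = box_smooth3(vol)       # smoothed data -> overall structure (isosurface)
surface_mask = smoothed > 0.5     # voxels enclosed by the isosurface at isovalue 0.5
```

In the actual pipeline the isocaps would be computed from `vol` itself (the unsmoothed data) to keep interior detail, while the isosurface uses `smoothed`.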
Figure 4 shows the graphics work that will be accomplished with Matlab. Once a 3-D image of the cartilage is created and the menisci are split, Matlab functions are used to light the cartilage and view it from different angles. These functions create a simulation of the surgeon’s view during arthroscopic surgery, which is displayed on a computer monitor. Figures 5 and 6 show the actual view a surgeon sees: Figure 5 illustrates torn menisci and Figure 6 shows healthy menisci.
Figure 5: Arthroscopic Surgery View of Torn Cartilage
Figure 6: Arthroscopic Surgery View of Healthy Cartilage
Preliminary Results:
A model of the cartilage has been created from MRI data of the knee; it is shown in Figure 7: Cartilage Model.
Figure 7: Cartilage Model
There are a few basic issues that need to be examined: lighting, view control, and making the MRI data model look like an arthroscopic surgeon’s view. Lighting properties were examined using a cone created in Matlab, shown in Figure 8: Cone with No Light. Light properties were then used to illuminate one area of the cone and black out everything else; the resulting cone is shown in Figure 9: Cone with Light. The same effect will be applied to the cartilage model to simulate an arthroscopic surgeon’s view.
Figure 8: Cone with No Light
Figure 9: Cone with Light
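The spotlight effect in Figure 9, where one illuminated patch is visible and the rest is black, comes down to two tests per surface point: Lambertian diffuse shading against the light direction, and a cone cutoff around the light’s axis. A minimal NumPy sketch of that computation follows (the helper name and parameters are hypothetical, not the project’s Matlab code):

```python
import numpy as np

def spot_shade(points, normals, light_pos, spot_dir, cutoff_deg):
    """Diffuse (Lambertian) shading restricted to a spotlight cone;
    points outside the cone stay black, as in Figure 9."""
    # Unit vectors from each surface point toward the light.
    L = light_pos - points
    L = L / np.linalg.norm(L, axis=1, keepdims=True)
    # Lambertian term: cosine of angle between normal and light direction.
    diffuse = np.clip(np.einsum('ij,ij->i', normals, L), 0.0, None)
    # Angle between the light-to-point direction and the spotlight axis.
    axis = spot_dir / np.linalg.norm(spot_dir)
    cosang = np.einsum('ij,j->i', -L, axis)
    inside = cosang > np.cos(np.radians(cutoff_deg))
    return np.where(inside, diffuse, 0.0)

# A point directly under the light is lit; a distant point falls
# outside the 20-degree cone and stays black.
points = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
shade = spot_shade(points, normals,
                   light_pos=np.array([0.0, 0.0, 2.0]),
                   spot_dir=np.array([0.0, 0.0, -1.0]),
                   cutoff_deg=20.0)
```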
View control was also investigated. Matlab has functions that allow both a camera and the camera’s target to be positioned, so views of the model can be taken from different angles and locations. The amount of preliminary work that could be done with view control was limited by the fact that the cartilage is not yet split. The surface of the cartilage examined in an arthroscopic surgery is the area between the two menisci seen in Figure 7; the part of the cartilage viewed by the surgeon is shown in Figure 10: Cartilage Viewed by Surgeon. Once the cartilage is split so each meniscus can be viewed individually, more work with view control will be completed.
Figure 10: Cartilage Viewed by Surgeon
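Positioning a camera and its target, as Matlab’s campos and camtarget do, amounts to building an orthonormal viewing frame from the two points. A hedged NumPy sketch of that computation (camera_basis is a hypothetical helper, not a Matlab function):

```python
import numpy as np

def camera_basis(cam_pos, cam_target, world_up=np.array([0.0, 0.0, 1.0])):
    """Orthonormal camera frame (forward, right, up) from a camera
    position and a target point, mirroring the campos/camtarget pair."""
    forward = cam_target - cam_pos
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, world_up)
    right = right / np.linalg.norm(right)
    up = np.cross(right, forward)     # re-orthogonalized up vector
    return forward, right, up

# A camera 5 units in front of the origin, looking at the origin.
forward, right, up = camera_basis(np.array([0.0, -5.0, 0.0]),
                                  np.array([0.0, 0.0, 0.0]))
```

Moving `cam_pos` along the gap between the two menisci while keeping `cam_target` on the meniscus surface would give the different viewing angles described above.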
Light was added to the cartilage model made from the MRI data to make it appear like an arthroscopic surgeon’s view of the meniscus. Several problems were encountered. The cartilage is only about 0.1 inches thick, and the model greatly magnifies it on screen. The data sampled across this thin region is stretched, so there are not enough points in the cartilage model, and this lack of data causes visible distortion when light is added, as shown in Figure 11: Distortion of Light on Cartilage Model. To fix this problem, the data between existing data points must be rendered, adding enough points that the light no longer shows the distortion. The rendering routine will be a two-dimensional filter over a curve; it has not yet been determined how this will be implemented.
Figure 11: Distortion of Light on Cartilage Model
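One candidate for such a rendering routine is plain bilinear interpolation between existing samples, which supplies the missing points blamed for the lighting distortion. A hypothetical NumPy sketch, under the assumption that the cartilage surface is stored as a 2-D height grid (the function name and factor are placeholders):

```python
import numpy as np

def upsample_bilinear(grid, factor):
    """Bilinearly interpolate between existing samples so a thin,
    stretched surface has enough points for smooth lighting."""
    rows, cols = grid.shape
    r_old = np.arange(rows)
    c_old = np.arange(cols)
    r_new = np.linspace(0, rows - 1, factor * (rows - 1) + 1)
    c_new = np.linspace(0, cols - 1, factor * (cols - 1) + 1)
    # Interpolate along each row first, then along each column.
    tmp = np.array([np.interp(c_new, c_old, grid[r]) for r in range(rows)])
    out = np.array([np.interp(r_new, r_old, tmp[:, c])
                    for c in range(tmp.shape[1])]).T
    return out

# Upsampling a 2x2 patch by a factor of 2 fills in the in-between values.
patch = np.array([[0.0, 1.0],
                  [1.0, 2.0]])
dense = upsample_bilinear(patch, 2)
```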
Once the simulation of a surgeon’s view during arthroscopic surgery is created, a haptic feedback system will be implemented with the model.
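A common way to drive a haptic device from such a model is penalty-based rendering: when the virtual tool tip penetrates the surface, a spring force proportional to the penetration depth is sent back to the device. A hedged sketch under that assumption; the stiffness value and the flat-surface contact test are illustrative placeholders, not the project’s design:

```python
import numpy as np

def contact_force(tool_pos, surface_z, stiffness=400.0):
    """Penalty-based haptic force: a spring pushes the tool back out
    of the surface in proportion to how deeply it has penetrated.
    The stiffness constant here is an arbitrary placeholder."""
    depth = surface_z - tool_pos[2]   # positive when the tool is below the surface
    if depth <= 0.0:
        return np.zeros(3)            # no contact, no force
    return np.array([0.0, 0.0, stiffness * depth])

# Tool tip 0.01 units below the surface -> upward force of 4.0.
f = contact_force(np.array([0.0, 0.0, -0.01]), surface_z=0.0)
```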
Patents:
No patents have been issued for the simulation of an arthroscopic knee surgery, but similar patents have been obtained. One patent was issued to Kurt Amplatz for a medical simulator that enables demonstration, trial, and testing of the insertion of torqueable elongated members, such as guide wires or catheters, into small body passages. Another was issued to Nobuhiko Mukai, Masayuki Harada, and Katsunobu Muroi for a simulated medical treatment virtually executed by an operator with a simulated medical instrument using virtual model information. Wendy Plesniak, Ravikanth Pappu, and Stephen Benton received a patent for a coincident visuo-haptic workspace, which allows a user to see, feel, and interact with synthetic objects; a spatial display enables the synthetic objects to become part of the user’s manipulatory space.
Schedule:
Project Tasks
· Separate the menisci
· Create surgeon’s view of meniscus using light and view control
· Apply the results to simulate the arthroscopic surgery with a haptic feedback system
Schedule
Date     | Task
12/06/05 | Present Project
12/07/05 | Study Day
12/16/05 | Research rendering routines for cartilage model
01/19/06 | Address rendering issues of cartilage model
01/26/06 |
02/02/06 |
02/09/06 | Address light issues of cartilage model
02/16/06 | Research light and view control on cartilage
02/23/06 |
03/02/06 | Address issues for the light and view control
03/09/06 |
03/16/06 |
03/23/06 | Research haptic feedback system
03/30/06 | Implement haptic device with cartilage model
04/06/06 |
04/13/06 | Documentation, Presentation
04/20/06 |
04/27/06 |
Equipment List:
· PC with
· SensAble Phantom Omni Haptic Device
Bibliography:
Hill, F.S. Jr. (1990). Computer Graphics Using OpenGL, Second Edition. Upper Saddle River, NJ: Prentice Hall.
Holland, Thomas & Marchand, Patrick (2003). Graphics and GUIs with Matlab, Third Edition. Washington, DC: Chapman & Hall/CRC.
Links, Jonathan & Prince, Jerry (2006). Medical Imaging: Signals and Systems. Upper Saddle River, NJ: Pearson Education.
Matlab, Version 7.0.1 (2004). The MathWorks Inc., Natick, MA 01760.