
Surgical Laparoscopy, Endoscopy & Percutaneous Techniques

Vol. 12, No. 1, pp. 6–16


© 2002 Lippincott Williams & Wilkins, Inc., Philadelphia

Surgical Robotics: The Early Chronicles


A Personal Historical Perspective

Richard M. Satava, MD, FACS

Summary: The use of robotics has been emerging for approximately 75 years, but
only during the past 5 years has the potential of robotics been recognized by the
surgical community as a whole. This personal perspective chronicles the development
of robotics for the general surgical community, the role of the military medical re-
search effort, and many of the major programs that contributed to the current success
of robotics. Key Words: Military medicine—Minimally invasive surgery—Robotics.

Received September 18, 2001; accepted October 16, 2001.

From the Department of Surgery, Yale University School of Medicine, New Haven, Connecticut; and the Telemedicine and Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Ft. Detrick, Maryland, U.S.A.

The opinions or assertions contained herein are the private views of the author and are not to be construed as official or as reflecting the views of the Department of the Army, the Department of the Navy, the Advanced Research Projects Agency, or the Department of Defense.

Address correspondence and reprint requests to Dr. Richard M. Satava, Department of Surgery, Yale University School of Medicine, 40 Temple Street, Suite 3-A, New Haven, CT 06510. Address electronic mail to: richard.satava@yale.edu

For approximately 75 years, robots were the stuff of science fiction. Their descriptions ranged from dumb machines that replaced monotonous work, as first described by the Czech playwright Karel Capek in the classic 1921 play Rossum's Universal Robots (RUR), to the ultraintelligent anthropomorphic robots of Isaac Asimov's classic science fiction books of the 1950s, to the familiar R2D2 and C3PO of the Star Wars films in the 1970s, to the incredible cyborgs of the Terminator film series. However, rarely were they depicted as medical robots (except in a few scenes in Star Wars).

Robots gradually made their way into factories for the performance of dangerous, repetitive tasks requiring accuracy (automobile assembly), for handling hazardous wastes in the nuclear industries (Fig. 1), for assembling parts with great dexterity and precision (computer chips), and as delivery robots, such as those of Joseph Engelberger, MD (Fig. 2). None of these were anthropomorphic; they all were designed to provide functionality. Although many, such as the Massachusetts Institute of Technology (MIT)–Utah Hand (Fig. 3), could exceed human performance in a specific dexterous task or exceed human sensory perception, none achieved even the minimal intelligence of a 2-year-old baby. Many attained expertise in a specific domain, with various recognition capabilities, but these robots were never able to demonstrate cognitive abilities. This is the background from which medical robotics originated.

FIG. 1. The Argonne National Laboratory dexterous manipulator (from Johnson and Corliss, Archives of Argonne National Laboratory, 1967).

FIG. 2. HelpMate, a medical delivery robot of Joseph Engelberger (courtesy of HRI website: http://www.pyxis.com).

FIG. 3. Massachusetts Institute of Technology–Utah dexterous hand (courtesy of Stephen Jacobsen, University of Utah, Salt Lake City, UT).

DEVELOPMENT OF CURRENT SYSTEMS

The earliest conceptions of surgical robotics were developed by Scott Fisher, PhD (1), at the National Aeronautics and Space Administration (NASA) Ames Research Center (Palo Alto, CA, U.S.A.), and Joseph Rosen, MD (Department of Plastic Surgery, Stanford University, Palo Alto), in the mid to late 1980s. At that time the NASA–Ames group, led by Michael McGreevy, PhD, and Steve Ellis, PhD, was working in virtual reality. This group was joined by Scott Fisher and Joe Rosen during development of the first head-mounted display (HMD; Fig. 4) for displaying the massive amounts of data being returned from NASA's planetary exploration missions of Voyager and others. At this time, Jaron Lanier, who coined the term virtual reality, contributed the DataGlove and an object-oriented program from his company, VPL, Inc. (VPL is an abbreviation for visual programming language), which made it possible to interact with the three-dimensional virtual scenes. Scott Fisher and Joe Rosen integrated these ideas of interactivity of virtual reality and applied them to surgical robotics.
With their earliest concepts (Fig. 5), they envisioned telepresence surgery (a term coined by Scott Fisher, who later started his own company, Telepresence Research, Inc.) with use of the DataGlove to control the remote robotic hands.

FIG. 4. Scott Fisher, wearing one of the first head-mounted displays at the NASA–Ames Research Center virtual reality laboratory, circa 1985 (courtesy of Scott Fisher, Telepresence Research, Inc., Palo Alto, CA).

FIG. 5. Earliest concept of telepresence surgery, from drawings by Joseph Rosen and Scott Fisher, circa 1986 (courtesy of Joseph Rosen, Dartmouth University Medical Center, Hanover, VT).

The NASA–Ames team had expertise in virtual reality but not robotics. Joe Rosen and Scott Fisher took their vision to Phil Green, PhD, at Stanford Research Institute (SRI, later changed to SRI International after its acquisition of the Sarnoff Research Institute in Princeton, NJ). Phil Green was head of the biomechanics section at SRI and was working with other roboticists such as John Hill, PhD, Joel Jensen, PhD, and Ajit Shah, PhD. Tom Piantanida, PhD, provided the human interface technology expertise and was the company's expert in the emerging virtual reality field. With Joe Rosen's clinical input, the first project for development was an extremely dexterous telemanipulator to greatly enhance vascular and nerve anastomoses for hand surgery. In keeping with the virtual reality and telepresence concept, the design focused upon an intuitive interface (Fig. 6) that was able to give surgeons the sense that they were operating on a hand directly in front of their eyes, when it was in fact located on the other side of the room. Scott Fisher was fond of saying that, although he could not teleport (as in "Beam me up, Scotty"), he could send his presence to a remote site.

FIG. 6. Initial telepresence surgery workstation, showing the intuitive interface, circa 1987 (courtesy of Philip Green, SRI International, Menlo Park, CA).

In 1988 to 1989, the parallel development of laparoscopic cholecystectomy emerged on the surgical front. Jacques Perissat, MD, of Bordeaux, France, presented a videotape of a laparoscopic cholecystectomy at the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) annual meeting in Atlanta, Georgia. The profound effect of the introduction of laparoscopic surgery to the mainstream surgical community (in addition to the pioneering procedures performed by Eddy Joe Reddick, MD, Douglas Owens, MD, Barry McKernan, MD, and George Berci, MD) caused an explosion in the use of laparoscopic cholecystectomy. It soon became apparent that although laparoscopic surgery was of great benefit to the patient, it created enormous difficulties for the surgeon: degradation of the sense of touch, loss of natural three-dimensional visualization (2), and impairment of dexterity, principally due to the fulcrum effect of the instruments.
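The fulcrum effect is simple to quantify: because a laparoscopic instrument pivots at the port in the abdominal wall, the tip moves in the direction opposite to the hand and is scaled by the ratio of the instrument lengths inside and outside the body. The sketch below is only an illustrative calculation of that geometry; the function name and the 25 cm/15 cm split are hypothetical values, not figures from this article.

```python
# Illustrative sketch of the laparoscopic "fulcrum effect": the trocar acts as a
# pivot, so lateral tip motion is mirrored and scaled by the inside/outside lever arms.
# All numbers are hypothetical examples.

def tip_displacement(hand_dx_cm: float, length_inside_cm: float,
                     length_outside_cm: float) -> float:
    """Lateral tip motion produced by a small lateral motion of the handle."""
    # Negative sign: moving the hand to the right swings the tip to the left.
    return -hand_dx_cm * (length_inside_cm / length_outside_cm)

# A 40-cm instrument with 25 cm inside the abdomen and 15 cm outside:
dx = tip_displacement(hand_dx_cm=1.0, length_inside_cm=25.0, length_outside_cm=15.0)
print(f"1 cm of hand motion -> {dx:+.2f} cm of tip motion (inverted and amplified)")
```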


While Joe Rosen was beginning animal trials and then early clinical trials with the Green Telepresence Surgery System (as it was being called), Richard Satava, MD, began working first with the NASA–Ames group and then with the SRI telepresence team. As a general surgeon and surgical endoscopist, he was immediately aware that the telepresence system provided a number of solutions to the problems inherent in laparoscopic surgery (3). In response to Rick Satava's suggestions, Phil Green began directing the telepresence effort toward macroscopic surgery (4), and specifically toward improving laparoscopic surgery, in addition to the microscopic surgery for Joe Rosen (hand surgery). A videotape of the telepresence surgery system was demonstrated to Col. Russ Zajtchuck, MD, and Donald Jenkins, PhD, of the Borden Institute at Walter Reed Army Medical Center (Bethesda, MD). They brought this to the attention of the Surgeon General of the Army, Alcide LaNoue, which resulted in the transfer of Satava from Ft. Ord, California, to the Pentagon's Advanced Research Projects Agency (ARPA, which was developing the ARPAnet that later evolved into the Internet). With the Surgeon General's support, ARPA (later to become DARPA in 1993) was requested to begin a program in advanced biomedical technologies, to include telepresence surgery (as it was now being called). Donald Jenkins was requested to be co–program manager of this effort, which over the next 7 years funded a majority of projects in telepresence and robotic surgery.

A third effort was also beginning independently in the early 1990s.
Hap Paul, a doctor of veterinary medicine, and William Bargar, MD, an orthopedic surgeon, began collaborating with Russell Taylor of the IBM T. J. Watson Research Center (5) to develop a robotic system (based on the IBM Puma arm) that could be used in hip replacement surgery (many breeds of dogs, including German shepherds and golden retrievers, were brought to Hap Paul for replacement of fractured and dislocated hips). The research by this team resulted in the first robotic surgical device, named RoboDoc (now manufactured by Integrated Surgical Systems, Sacramento, CA, U.S.A.; Fig. 7). This was a modification of the basic principles of the Puma arm, which enabled preoperative planning of the procedure (including matching the prosthesis exactly with the femur that would accept it). RoboDoc was able to core out the femoral shaft with 96% precision, whereas a standard hand broach provided only 75% accuracy. Bargar then took the system to clinical trials in humans, after Hap Paul proved its efficacy (clinically) in his veterinary practice, and RoboDoc is now a commercial product. Subsequently, orthopedic surgeons such as Anthony DiGioia, MD (6), have been developing other systems, such as HipNav, for replacement of the knee and hip joints.

FIG. 7. RoboDoc, the first robotic surgical system, used to core the femoral shaft in total hip replacement, circa 1986 (courtesy of Hap Paul, DVM, University of California–Davis, Sacramento, CA).

On the other side of the Atlantic, at this same early stage of development, two teams were producing early prototype surgical robotic systems, one in each of the categories described above. One system, by Sir John Wickham, MD, and Brian Davies, PhD, of Guy's Hospital in London (7), was similar to RoboDoc in that it was used for precise coring; however, as a urologist, Wickham developed the system to assist in transurethral resection of the prostate. This was a mechanically constrained system with a robotic arm similar to the Puma and RoboDoc, but for patient safety it had a large, circular metal ring (Fig. 8) through which the resection instrument was passed and which prevented the robotic arm from moving out of the precise field of the prostate. After successfully proving the accuracy of the system on potatoes and then on a few patients in a clinical trial, Wickham was given permission to conduct studies on animals to prove its efficacy and safety.

FIG. 8. The transurethral resection of the prostate (TURP) robot, with a mechanically constraining ring to ensure safety, circa 1986 (courtesy of Sir John Wickham, Guy's Hospital, London).

The second system being developed in Europe was a collaboration of Hermann Rinnsland, PhD, of the Forschungszentrum Karlsruhe (Karlsruhe Nuclear Research Center, Karlsruhe, Germany), and Gerhard Buess, of the University of Tuebingen, in Tuebingen, Germany (8). Hermann Rinnsland was head of the group that developed Germany's telemanipulation robotics for the handling of nuclear waste. This was a highly dexterous system, similar to the SRI system, but with significant differences, especially in the surgeon's workstation. This system, called the Advanced Robot and Telemanipulator System for Minimally Invasive Surgery (ARTEMIS; Fig. 9), had remote telemanipulators like the SRI system, but the
surgeon's console had the hand input devices "over the shoulder" to provide extra manipulation capabilities. The system was very efficient, but after the first prototype was developed and demonstrated to be effective, funding for the Forschungszentrum project was not renewed, and this promising system has yet to progress into the commercial phase.

FIG. 9. The ARTEMIS robotic surgery system, showing the remote manipulators and surgical workstation, circa 1989 (courtesy of Gerhard Buess, Tuebingen University Medical Center, Tuebingen, Germany).

Thus, the state of the art in robotic surgery in 1993 was that of the systems described above. However, the military (through the DARPA program) began to dramatically increase attention to the Green Telepresence Surgery System in the following years, until the closeout of the DARPA program in 1999 (see below for the military system).

All during this period (late 1980s to 1993), the neurosurgery and radiology communities were investigating robotics to enable neurosurgeons to precisely position probes, resection instruments, ablation devices, and other surgical tools, principally for minimally invasive brain surgery. Frank Jolesz, MD, of Brigham and Women's Medical Center (Boston), and William Lorensen, PhD, of the General Electric Research Center (Milwaukee, WI), were pioneers in the development of open magnetic resonance imaging systems (9) for real-time updates of brain images in the initial real-time, image-guided neurosurgical systems. Neurosurgeon Richard Bucholz, MD, of St. Louis University Medical Center (10), was also independently developing a tracking system, called the Stealth Station, that could be used during neurosurgery to provide accurate stereotactic navigation. This effort resulted in production of an image-guided system for real-time tracking of instruments in surgery. In 1996, Daniel Karron, PhD, of New York University (11), developed an audio system that provided audio feedback depending on proximity to the intended target—in essence, audio navigation assistance for the Stealth Station. This is one of the only systems designed to provide synesthesia, the substitution of one sense (audio) for another sense (vision) to improve accuracy.
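As a rough illustration of the idea of audio navigation, the sketch below maps the distance between an instrument tip and a planned target to a beep rate and a pitch, so that proximity is "heard" rather than seen. The mapping and numbers are hypothetical; this is not the algorithm of Karron's tactical audio system.

```python
# Illustrative sketch of sensory substitution ("synesthesia") for navigation:
# convert distance-to-target into an audio cue. Hypothetical mapping only.
import math

def audio_cue(tip_xyz, target_xyz, max_distance_mm=50.0):
    """Return (beeps_per_second, pitch_hz) that rise as the tip nears the target."""
    distance = math.dist(tip_xyz, target_xyz)
    closeness = 1.0 - min(distance, max_distance_mm) / max_distance_mm
    beeps_per_second = 1.0 + 9.0 * closeness    # 1 beep/s far away, 10 beeps/s on target
    pitch_hz = 220.0 + 660.0 * closeness        # 220 Hz far away, 880 Hz on target
    return beeps_per_second, pitch_hz

print(audio_cue(tip_xyz=(10.0, 5.0, 3.0), target_xyz=(12.0, 5.0, 3.0)))
```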
Beginning in July 1992, Rick Satava and Don Jenkins developed the DARPA Advanced Biomedical Technologies program. The military imperative was to use advanced technologies to save soldiers who had been wounded on the battlefield. Review of the Wound Data and Munitions Effectiveness Team database of the casualties of the Vietnam War (12) revealed that although great improvement had occurred in overall mortality, examination of those soldiers with life-threatening wounds in the far-forward battlefield showed little change from as early as the Civil War. As a rough generalization, one-third died of head or massive injuries, one-third died of wounds (principally exsanguinating hemorrhage) that were estimated to be survivable on the basis of today's technology, and one-third survived. A comprehensive program was initiated, utilizing advanced sensor, robotics, telemedicine, and virtual reality systems. One of the prime concepts was to implement Scott Fisher and Joe Rosen's idea to "bring the surgeon to the wounded soldier—through telepresence." The Green Telepresence Surgery System was seen as a method of providing surgical care right on the battlefield to save those soldiers who would otherwise exsanguinate (13). It was envisioned that the robotic manipulator arms would be mounted in a vehicle for Medical Forward Advanced Surgical Treatment (MEDFAST). The vehicle chosen was a Bradley Fighting Vehicle (577A; Fig. 10). The surgical workstation was to be placed in the rear-echelon Mobile Army Surgical Hospital (MASH); when a soldier was wounded, it was envisioned that the medic would place him in the MEDFAST vehicle, and together the surgeon (at the telesurgery unit in the MASH) and the medic in the MEDFAST vehicle would perform just enough surgery to stop the hemorrhage (the current concept of "damage-control surgery"), in order for the soldier to be transported as soon as possible back to the MASH, but now the soldier would be alive instead of exsanguinating before arrival at the MASH. In 1996, a military field test was conducted by SRI International, which demonstrated that surgery could be performed over a 5-km distance with a microwave telecommunication link between a MASH hospital and the MEDFAST vehicle. However, the battlefield of the 1990s was changing from conventional, open areas to the close quarters of urban terrain, which was ill suited for the MEDFAST vehicle. Although successfully demonstrated on the animal model, the system has not yet been implemented for battlefield casualty care.

FIG. 10. Medical Forward Advanced Surgical Treatment (MEDFAST) vehicle (courtesy of Anthony Aponick, Foster-Miller, Inc., Waltham, MA, 1995).


A significant number of other robotic surgery applications were being developed by DARPA to provide solutions for many difficulties. Thomas Sheridan, of MIT (14), was tackling the latency problem—the time of travel of the electronic signal from the moment the handle of the instrument on the workstation moved until the signal arrived at the tip of the manipulator. It is known that humans can compensate for latency (delay) of up to 200 milliseconds (msec), after which the delay is too great for accuracy. Tom Sheridan was attempting to solve the problem with predictive algorithms and was successful in demonstrating tolerance of delays up to 300 msec. Other investigators, such as Alberto Rovetta, PhD, of Milan, Italy, have tried to work around the problem by having identical software programs at the two remote sites, so that the only thing transmitted is the hand signals. In 1993, Alberto Rovetta (15) was able to successfully perform a biopsy on a pig liver with the surgeon's station at the NASA Jet Propulsion Laboratory in Pasadena, California, and the manipulators and pig liver in his laboratory in Milan. The time lag with use of a satellite was >1,200 msec (1.2 seconds). (Note: it takes approximately 1.2 seconds for a signal to be transmitted to a geosynchronous satellite 22,000 miles above the earth and then return.)
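One simple family of predictive algorithms extrapolates the surgeon's recent hand motion forward by the expected delay, so that the command sent now targets where the hand should be when it arrives at the remote manipulator. The sketch below is a minimal illustration of that idea (constant-velocity extrapolation); it is not Sheridan's actual method, and the sampling interval and 300-msec delay are used only as example values.

```python
# Minimal sketch of latency masking by prediction (dead reckoning):
# extrapolate the master's hand position forward by the link delay.
# Illustrative only; not the actual DARPA-era algorithm.

def predict_position(prev_pos, curr_pos, sample_dt_s, latency_s):
    """Constant-velocity extrapolation of a 3-D hand position across the latency."""
    velocity = [(c - p) / sample_dt_s for p, c in zip(prev_pos, curr_pos)]
    return [c + v * latency_s for c, v in zip(curr_pos, velocity)]

# Hand sampled every 10 ms, link delay of 300 ms (the tolerance reported in the text).
prev, curr = (0.0, 0.0, 0.0), (0.001, 0.0, 0.0)   # metres; moving 0.1 m/s along x
print(predict_position(prev, curr, sample_dt_s=0.010, latency_s=0.300))
# -> [0.031, 0.0, 0.0]: the command targets where the hand should be when it
#    reaches the remote manipulator, rather than where it was when sampled.
```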
Other contributions were made by Kenneth Salisbury, PhD, Marc Raibert, PhD, and Robert Playter, PhD, of the MIT Artificial Intelligence and Robotics Laboratory, under the direction of Rodney Brooks, PhD. This group was working on haptics (the sense of touch) and developed an accurate force feedback system for the robotic devices (16). This became a commercial product called "The Phantom" (Boston Dynamics, Inc., Cambridge, MA, U.S.A.; Fig. 11), which has become the industry standard for providing haptics in a virtual environment and the basis for robotic systems. Other researchers from MIT working to improve telepresence surgery included Blake Hannaford, PhD, and David Brock, PhD.

FIG. 11. Phantom haptics input device (courtesy of Marc Raibert, Boston Dynamics, Inc., Cambridge, MA, 1995).

THE COMMERCIALIZATION YEARS

The commercialization of robotic surgery started with the RoboDoc system in 1992 to 1993, as indicated above. In spite of the exceptional performance of RoboDoc, the system went through a prolonged approval process with the Food and Drug Administration. For direct surgical manipulation in laparoscopic surgery, however, the first application was to control the camera. With initial seed funding from DARPA, Yulun Wang, PhD, began developing the Automated Endoscopic System for Optimal Positioning (AESOP) (17) in his newly formed company (Computer Motion, Inc., Santa Barbara, CA, U.S.A.). This provided acceptance by the medical and surgical community of robotics as an effective assistive device. This system was the first robotic device to receive approval from the Food and Drug Administration and launched the movement toward robotics in general surgery.


During this time frame, image-guided surgery systems became commercialized with the introduction of both the NeuroMate in Switzerland (now a product of Integrated Surgical Systems; Fig. 12) and Richard Bucholz's Stealth Station. These systems were specifically developed for neurosurgery, as was the General Electric open magnetic resonance imaging system that was popularized by Ferenc Jolesz and Ron Kikinis of Brigham and Women's Hospital (8).

FIG. 12. NeuroMate neurosurgical system, originally a Swiss development but now a component of Integrated Surgical Systems (ISS; Sacramento, CA, U.S.A.), which developed the RoboDoc.

While AESOP was being marketed to the surgical community, Frederick Moll, MD, licensed the SRI Green Telepresence Surgery rights and started Intuitive Surgical, Inc. After extensive redesigning from the ground up, the da Vinci surgical system (Intuitive Surgical, Mountain View, CA, U.S.A.; Fig. 13) was produced and introduced. In April 1997, the first robotic surgical (tele-operation) procedure on a patient was performed in Brussels, Belgium, by Jacques Himpens, MD, and Guy Cardiere, MD (18). Within a year, Computer Motion had put their system, Zeus (Fig. 14), into production. The two systems are similar in that they have remote manipulators that are controlled from a surgical workstation. One major difference is in the surgical workstations. The da Vinci system displays a stereoscopic image just above the surgeon's hands so that it appears as if the surgical instrument tips are an extension of the handles. This gives the impression that the patient is right in front of the surgeon (or, conversely, that the surgeon has been transported to right next to the patient—hence the term telepresence). The Zeus system is ergonomically designed, with the monitor comfortably in front of the surgeon's chair and the instrument handles in the correct eye–hand axis for maximum dexterity. There is no illusion of being at the patient's side; rather, there is the sense of an operation at a distant site but with enhanced capabilities. Initially, the da Vinci system was the only one with an additional degree of freedom, a "wrist"; however, the Zeus system has recently introduced instruments with a wrist.

FIG. 13. The da Vinci robotic telepresence surgery system (courtesy of Frederick Moll, Intuitive Surgical, Inc., Menlo Park, CA, 1999).

FIG. 14. Zeus robotic system (courtesy of Yulun Wang, Computer Motion, Inc., Goleta, CA).

The concept of dexterity enhancement was suitable for the emerging laparoscopic surgery field, and especially for minimally invasive cardiac surgery applications. Although the original Green Telepresence Surgery System was designed for remote trauma surgery on the battlefield, the commercial telepresence systems were envisioned for delicate cardiac surgery, specifically coronary artery bypass grafting. It was believed that the robotic systems would allow minimal access surgery on the beating heart.
This is to be achieved by first blocking and then overpacing the heart and gating the motion of the robotic system to the heart rate. Although the minimal access approach has been achieved, the "virtual stillness" of the gating method is still in development.
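A minimal sketch of the gating idea, assuming the heart rate is fixed by pacing: the controller withholds instrument motion except during a predictable quiescent window of each cardiac cycle. The window fractions and pacing rate below are hypothetical, and this is not the control scheme of any actual system.

```python
# Illustrative sketch of gating robot motion to a paced heart rate.
# Hypothetical numbers; not an actual surgical control loop.

def motion_allowed(t_s: float, heart_rate_bpm: float,
                   window_start: float = 0.40, window_end: float = 0.80) -> bool:
    """Allow commanded motion only during a quiescent fraction of each cardiac cycle."""
    period = 60.0 / heart_rate_bpm            # length of one cardiac cycle in seconds
    phase = (t_s % period) / period           # 0.0 at each paced beat, 1.0 just before the next
    return window_start <= phase <= window_end

# With the heart overpaced at 120 beats/min the cycle is 0.5 s long, so motion is
# permitted only between 0.2 s and 0.4 s after each beat.
for t in (0.05, 0.15, 0.25, 0.35, 0.45):
    print(f"t={t:.2f}s  move={motion_allowed(t, heart_rate_bpm=120)}")
```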


FIG. 15. Robot Assisted MicroSurgery (RAMS) for 10-µm accuracy in laser retinal surgery (courtesy of Steve Charles and the NASA–Jet Propulsion Laboratory team, Pasadena, CA).

The challenge of extremely accurate and dexterous robotics was taken up for ophthalmologic surgery, specifically for laser retinal surgery. The blood vessels on the retina are 25 µm apart, whereas the limit of human performance is an accuracy of approximately 200 µm. Stephen Charles, MD, of Baptist Hospital and MicroDexterity Systems, Inc. (MDS), in Memphis, Tennessee, collaborated with a brilliant team at the NASA Jet Propulsion Laboratory, which included Paul Schenker, Hari Das, Edward Barlow, and others, to develop the Robot Assisted MicroSurgery (RAMS) system (Fig. 15). This system included three basic innovations: (1) eye tracking of the saccades of the eye (200 Hz), so that the video image was perfectly still on the video monitor; (2) motion scaling of 100 to 1, giving the system 10-µm accuracy; and (3) tremor reduction (between 8 and 14 Hz), removing any tremor or inaccuracy. Today, any surgeon could sit down at the microdexterity system and perform laser surgery with 10-µm accuracy, that is, 20 times the accuracy of the unaided human hand.
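The second and third innovations can be illustrated in a few lines: the master hand motion is divided down by the scale factor, and the physiologic tremor band (roughly 8 to 14 Hz) is attenuated with a low-pass filter before the command reaches the micromanipulator. The sketch below is a schematic illustration under those assumptions; it uses a simple first-order filter and invented sample values and is not the actual RAMS controller.

```python
# Illustrative sketch of dexterity enhancement: motion scaling plus tremor filtering.
# A first-order low-pass filter stands in here for clarity; the real RAMS controller
# is not reproduced, and all numbers are example values.
import math

def lowpass_step(prev_out: float, new_in: float, cutoff_hz: float, dt_s: float) -> float:
    """One step of a first-order low-pass filter (attenuates the 8-14 Hz tremor band)."""
    alpha = dt_s / (dt_s + 1.0 / (2.0 * math.pi * cutoff_hz))
    return prev_out + alpha * (new_in - prev_out)

def slave_command(hand_mm: float, filtered_prev_mm: float,
                  scale: float = 100.0, cutoff_hz: float = 5.0, dt_s: float = 0.001) -> float:
    """Filter the hand position, then scale it down 100:1 for the micromanipulator."""
    filtered = lowpass_step(filtered_prev_mm, hand_mm, cutoff_hz, dt_s)
    return filtered / scale

# A steady 1-mm hand displacement maps to a 0.01-mm (10-micrometre) tip displacement.
print(slave_command(hand_mm=1.0, filtered_prev_mm=1.0))
```

In a real system the filter would run on every axis of a continuously sampled position stream; a single scalar step is shown only for brevity.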
Remote surgery with use of robotics was limited to short distances because of the latency issue. Only recently (2001) was the Zeus system used for a trans-Atlantic robotic surgery operation, between New York City and Strasbourg, France, by Jacques Marescaux and Michel Gagner. The limitation to long-distance surgery is the latency, or delay, which cannot exceed 200 msec. At longer delays, the time from the hand motion of the surgeon until the action of the robot's end effector (instrument) is so long that the tissue could move and the surgeon would cut the wrong structure. In addition, with delays >200 msec, there is conflict within the robotic system, such that the system becomes unstable. However, Marescaux and Gagner used a dedicated high-bandwidth asynchronous transfer mode terrestrial fiberoptic cable and were able to conduct the surgery with a delay of only 155 msec. Thus, with a very broadband terrestrial fiberoptic connection, it is possible to perform remote surgery over thousands of miles. When the next-generation Internet, with 45-megabit-per-second fiberoptic cabling, becomes universally available, such remote surgery can become a reality in many places around the world.
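The 155-msec figure is easy to sanity-check with a rough latency budget. Light in optical fiber travels at roughly two-thirds the speed of light in vacuum (about 200,000 km/s), and the remaining delay comes from video coding and switching; the route length and coding overhead below are assumed values for illustration, not measurements from the operation itself.

```python
# Back-of-the-envelope latency budget for trans-Atlantic telesurgery.
# Assumed values for illustration only (route length, codec/switching overhead).

SPEED_IN_FIBER_KM_S = 200_000        # ~2/3 of the speed of light in vacuum
route_km = 7_000                     # assumed one-way fiber route, New York to Strasbourg
codec_and_switching_ms = 70          # assumed total overhead for video coding and equipment

one_way_propagation_ms = route_km / SPEED_IN_FIBER_KM_S * 1_000          # ~35 ms
round_trip_ms = 2 * one_way_propagation_ms + codec_and_switching_ms      # ~140 ms

print(f"propagation (one way): {one_way_propagation_ms:.0f} ms")
print(f"estimated command-to-image round trip: {round_trip_ms:.0f} ms")
# Comfortably under the ~200-ms tolerance cited in the text and of the same order
# as the reported 155-ms delay; a geosynchronous satellite path adds far more.
```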

CHALLENGES AND FUTURE SYSTEMS

The current systems are just the beginning of the robotics revolution. All of the systems have in common a central workstation from which the surgeon conducts the surgery. This workstation (Figs. 13 and 14) is the central point that integrates the entire spectrum of surgery (Fig. 16). Patient-specific preoperative images can be imported into the surgical workstation for preoperative planning and rehearsal of a complicated surgical procedure, as is being developed by Jacques Marescaux, MD (19). Figure 17 illustrates a patient's liver with a malignant lesion and the methods of visualization, preoperative planning, and procedure rehearsal. At the time of surgery, this image can be imported into the workstation for intraoperative navigation. It can also be used as a stand-alone workstation for surgical simulation, for teaching surgical skills and operative techniques. Thus, on an invisible level, the challenge is to integrate the software of all the different elements, such as importing images, preoperative planning tools, automatic segmentation and registration for data fusion and image guidance, and sophisticated decision and analysis tools to provide automatic outcomes analysis of the surgical procedure.

FIG. 16. The concept of telepresence surgery and a central workstation that integrates the entire spectrum of surgical care (courtesy of Joel Jensen, SRI International, Menlo Park, CA).

FIG. 17. Patient-specific imaging from a computed tomographic scan of a liver metastasis used for preoperative planning, operative rehearsal, intraoperative navigation, and surgical simulation (courtesy of Jacques Marescaux, IRCAD, Strasbourg, France).
On the technical side, few if any of the systems include the full range of sensory input (e.g., sound, haptics, or touch), and there are but a few simple instruments (end effectors). The next generation of systems will add the sense of touch and improved instruments. The instruments will need to include both standard mechanical instruments and energy-directed instruments, such as those for electrocoagulation, high-intensity focused ultrasound, radiotherapy, desiccation, and ablation. In addition, advanced diagnostic systems, such as ultrasonography, near-infrared, and confocal microscopy equipment, can be mounted on the robotic systems and used for minimally invasive diagnosis. The systems will become smaller, more robust (not requiring a full-time technician), and less expensive. They will be adapted to the requirements of other surgical subspecialties.

In the evolution of robotics, the systems will become more intelligent, eventually performing most, if not all, of an operative procedure. In current systems such as RoboDoc and NeuroMate, the surgeon preplans the operation on patient-specific computed tomographic scans. This plan is then programmed into the surgical robot, and the robot performs precisely what the surgeon would have done, but with precision and dexterity above human limitations. This is a trend that will continue, with the surgeon planning more and more of the operation that the robot can effectively and efficiently carry out. The robot must remain under the complete control of the surgeon so that, should something unexpected occur, the surgeon can take over. It is conceivable that in the distant future, under special circumstances such as remote expeditions or the NASA mission to Mars, robots would perform the entire surgical procedure. In the near future, however, there will be development of hybrid hardware–software systems that will perform complete portions of an operation, such as an anastomosis or nerve grafting.


These systems will require a complicated infrastructure, and the operating room of the future will have to accommodate them. The unique requirements for these systems include a very robust information infrastructure, access to information from the outside (such as radiographs, images, and consultations), voice control of the system by the surgeon, and microminiaturization of the systems. Perhaps the operating room will evolve to resemble more of a control room because of the large number of electronic systems that need to be controlled. An interesting product involved with the monitoring and control of patients is the Life Support for Trauma and Transport (LSTAT; Integrated Medical Systems, Signal Hill, CA, U.S.A.; Fig. 18), which is in essence an entire intensive care unit. Although the LSTAT was developed by the military as an evacuation system for the battlefield (the "trauma pod" from Robert Heinlein's Starship Troopers), it contains complete monitoring and administration systems, has telemedicine capability, can be docked and undocked without removing the patient, and is fully compatible with current telerobotic systems. A system similar to this may be incorporated into the operating room of the future (Fig. 19) to facilitate patient anesthesia, surgery, and transportation while maintaining continuous monitoring.

FIG. 18. The Life Support for Trauma and Transport (LSTAT) in its battlefield configuration (courtesy of Matt Hanson, Integrated Medical Systems, Signal Hill, CA).

FIG. 19. Concept of integration of the Life Support for Trauma and Transport (LSTAT) into a robotic surgical system (courtesy of Matt Hanson, Integrated Medical Systems, Signal Hill, CA).

There has been speculation about the use of nanotechnology to inject minuscule robots into the bloodstream to migrate or be navigated to the target. Numerous concept diagrams show mechanical types of systems that either are controlled by a surgeon or are autonomous. Although these are interesting conceptually, there is little practical understanding of how to actually construct such total, complex systems on a molecular level and, more important, how to control them. The first generations of these systems will not be visible to the eye, will probably be manufactured chemically by the billions, and will not be controlled but, like drugs, will be programmed to recognize certain cell or tissue types to deliver medication or cause ablation.

Frequently, microelectromechanical systems (MEMS) are discussed in conjunction with nanotechnology, but these systems are 1,000 times larger (1.0 × 10⁻⁶ m) than nanotechnology systems (1.0 × 10⁻⁹ m). Such systems would be visible as very tiny robots that could be directly controlled by a surgeon. However, as the technology is scaled down in size, it also is scaled down in power, or the force that can be generated, making it extremely difficult to actually conduct work at this scale. Although there are a number of MEMS robots (Fig. 20), none are actually performing any significant work, let alone any activity resembling a surgical procedure. Nevertheless, MEMS and nanotechnology are areas of future potential for surgical robotics, which will take decades to develop and perfect. It is essential for surgeons to be aware of these and other technologies, such as quantum mechanics, biomimetic materials and systems, tissue engineering, and genetic programming, to anticipate the great revolution that is developing.

FIG. 20. A small, autonomous biomimetic robot constructed from microelectromechanical systems (MEMS) technology (courtesy of Sandia National Laboratory, Albuquerque, NM).


CONCLUSION

Robotics has established a foothold in surgical practice. Commercial systems have been available for a few years, and their value is undergoing stringent scientific evaluation in randomized clinical trials. Although the initial reports are promising, it will be necessary for more long-term, evidence-based outcome studies to prove their efficacy. More important, it will be necessary to prove their cost-effectiveness and to address the other significant nontechnical issues, such as accommodating the operating rooms, training operating room personnel and surgeons, and gaining acceptance of the technology. However, the future is promising because of the great potential of these systems to extend the capabilities of surgical performance beyond human limitations. In addition to the typical robotic systems that are available today, the next generation, using the emerging MEMS and nanotechnology fields, will extend those capabilities even further. This nascent field will provide fruitful and rewarding research for decades to come, with the promise of greatly improving the quality of surgical care for patients.

REFERENCES

1. Fisher SS, McGreevy MM, Humphries J, et al. Virtual environment display system. In: Crow F, Pizer S, eds. Proceedings of the Workshop on Interactive 3-Dimensional Graphics. New York: ACM, 1986:1–12.
2. Cuschieri A. Visual displays and visual perception in minimal access surgery. Semin Laparosc Surg 1995;2:209–14.
3. Green PS, Hill JH, Satava RM. Telepresence: dextrous procedures in a virtual operating field [abstract]. Surg Endosc 1991;57:192.
4. Satava RM. Robotics, telepresence and virtual reality: a critical analysis of the future of surgery. Minim Invasive Ther 1992;1:357–63.
5. Paul HA, Bargar WL, Mittlestadt B, et al. Development of a surgical robot for cementless total hip arthroplasty. Clin Orthop 1992;December(285):57–66.
6. DiGioia AM, Jaramaz B, Colgan BD. Computer assisted orthopaedic surgery: image guided and robotic assistive technologies. Clin Orthop 1998;September(354):8–16.
7. Wickham JEA. Future developments of minimally invasive therapy. BMJ 1995;308:193–6.
8. Schurr MO, Brietwieser H, Melzer A, et al. Experimental telemanipulation in endoscopic surgery. Surg Laparosc Endosc 1996;6:17–75.
9. Jolesz FA. Image guided procedures and the operating room of the future. Radiology 1997;204:601–12.
10. Bucholz RD, Greco DJ. Image-guided surgical techniques for infections and trauma of the central nervous system. Neurosurg Clin North Am 1996;7:187–200.
11. Karron DB, Bucholz RD. Evaluation of tactical audio technology for intraoperative neurosurgical instrument navigation. http://www.casi.net/D.TADs/D.CASI-SLU_protocol/casi-slu-protocol.htm.
12. Zajtchuck R, Jenkins DP, Bellamy RF. Textbook of military medicine: part 1. Warfare, weaponry and the casualty. Vol 5. Washington, DC: Office of the Surgeon General of the Army, 1998:64–72.
13. Satava RM. Virtual reality and telepresence for military medicine. Comput Biol Med 1995;25:229–36.
14. Sheridan TB. Telerobotics, automation and human supervisory control. Cambridge, MA: MIT Press, 1992.
15. Rovetta A, Sala R, Cosmi F, et al. Telerobotics surgery in a transatlantic experiment: application in laparoscopy. In: Kim WS, ed. Telemanipulator Technology and Space Telerobotics. Proceedings of SPIE 1993;2057:337–44.
16. Raibert M, Playter R, Krummel TM. The use of a virtual reality haptic device in surgical training. Acad Med 1998;73:596–7.
17. Wang Y, Sackier J. Robotically enhanced surgery: from concept to development. Surg Endosc 1994;8:63–6.
18. Himpens J, Leman G, Cardiere GB. Telesurgical laparoscopic cholecystectomy. Surg Endosc 1998;12:1091.
19. Marescaux J, Clement JM, Tassetti V, et al. Virtual reality applied to hepatic surgery simulation: the next revolution. Ann Surg 1998;228:627–34.
