
WO2007137115A2 - Multi-display medical/surgical image and data viewer system that presents user-defined, custom panoramas - Google Patents


Info

Publication number
WO2007137115A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
panorama
data
display
presented
Prior art date
Application number
PCT/US2007/069141
Other languages
French (fr)
Other versions
WO2007137115A3 (en)
Inventor
Don Malackowski
Original Assignee
Stryker Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stryker Corporation
Publication of WO2007137115A2
Publication of WO2007137115A3

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/7435 Displaying user selection data, e.g. icons in a graphical user interface
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/254 User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A61B2034/256 User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • A61B5/0013 Medical image data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7232 Signal processing specially adapted for physiological signals or for diagnostic purposes involving compression of the physiological signal, e.g. to extend the signal recording period
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation

Definitions

  • This invention is generally related to a system for displaying medical information generated from different sources. More particularly, this invention is related to a system for simultaneously presenting medical information on a number of different displays wherein the information presented on each display is information specifically relevant to the individual(s) viewing that display.
  • This information includes images of a surgical site taken prior to a surgical procedure as well as real time images captured during the procedure. Other information is in the form of data such as pulse rate and blood oxygen content. Still other information is available and best interpreted in graphical format. This latter type of information includes heart rate data and electrical response data, such as EEG data. Medical/surgical personnel also have available data about the medical equipment used to perform the procedure on the patient. For example, if the surgical procedure performed requires a motorized handpiece, data about the operating speed of the handpiece is typically available. If suction is applied to the surgical site, data are available regarding the suction level as well as the quantity of material drawn from the surgical site. Equipment that delivers anesthesia similarly provides information about the characteristics of the anesthesia introduced into the patient.
  • each type of information is presented on a unique display. If the information is produced in real time, the display is typically integral with the device that produced the information. For example, a unit that monitors the blood oxygen content has its own display that graphically or numerically displays these data. If the unit is a surgical navigation unit, the unit includes a video display on which is presented the position and orientation of the tracked object or instrument. If the information is a retrieved stored image, that image is also presented on a specific display. For example, preoperative X-ray or CAT images of a surgical site are often presented on a light panel or monitor in the operating room.
  • one quadrant of the display may present an image captured by an endoscopic camera.
  • the quadrant below that quadrant may contain an X-ray image of the surgical site.
  • the quadrant to the side of the one in which the camera image is displayed may present graphical heart beat and brain wave data.
  • the remaining quadrant may contain information that is best presented numerically such as blood oxygen level, motor speed for the handpiece used to perform the procedure, suction level and capsule pressure.
  • An advantage of the above presentation is that it substantially eliminates the need for an individual to look from place to place in the operating room in order to find the information of interest. However, when all the information is presented on a single screen, the individual still must spend time focusing his/her eyes on the portion of the screen displaying the information of most immediate interest.
  • This invention is related to a new and useful way of displaying information in a medical/surgical environment such as in an operating room.
  • the system and method of this invention presents each viewer with only the information relevant to that individual.
  • One version of the system of this invention includes a data bus.
  • the sources of image-generating data are connected to the data bus. These sources include both medical devices used to monitor the state of the patient and devices used to perform therapeutic procedures on the patient. Other sources connected to the bus include file servers in which information collected earlier for viewing during the procedure is stored.
  • the system and method of this invention includes a processing unit known as a view creator.
  • the view creator contains configuration data regarding a multi-image panorama that is to be presented on each display. These data indicate for each display: the types of images to be presented; the location on the display at which the image presenting the data is to be presented; and data used for editing the image to ensure that the image is presented as desired by the particular viewer. These configuration data are set by the primary individual(s) who view the display.
  • each display device has its own panorama generator.
  • the view creator sends panorama definition data for a specific display to the panorama generator integral with the display.
  • the data-presenting image from each image generator is output on the bus.
  • the panorama generator integral with each display captures data containing the image to be presented on the display.
  • the panorama generator then loads a frame buffer with image defining data that collectively defines the panorama that is to be presented on the display.
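The frame-buffer loading step described above can be sketched as follows. The function name, the flat pixel-list representation, and the argument layout are illustrative assumptions, not details from the specification; a real panorama generator would operate on hardware video buffers.

```python
def compose_panorama(height, width, captured_images):
    """Compose captured images into a single frame buffer.

    The frame buffer is a row-major grid of pixel values (0 = blank).
    captured_images: list of (pixels, img_h, img_w, row, col) tuples,
    where pixels is a row-major list of the image's pixel values and
    (row, col) is the top-left placement position taken from the
    image's panorama definition data (fields 122 and 124 in the text).
    """
    frame = [[0] * width for _ in range(height)]
    for pixels, img_h, img_w, row, col in captured_images:
        for r in range(img_h):
            if row + r >= height:
                break  # clip images that overrun the bottom edge
            for c in range(img_w):
                if col + c >= width:
                    break  # clip images that overrun the right edge
                frame[row + r][col + c] = pixels[r * img_w + c]
    return frame
```

Each display's panorama generator would run this composition once per refresh over the packets it captured from the bus.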
  • the view creator includes one or more internal panorama generators.
  • the view creator receives the image forming data from each of the image-generating units. Also connected directly to the view creator are the individual display devices.
  • the panorama generators generate bit maps for each of the display devices.
  • Each bit map contains data that defines the panorama that is to be presented on a specific display device. Once a bit map is created, the view creator forwards the bit map to the correct display device.
  • Figure 1 is a block diagram of a medical/surgical network, such as in an operating room, in which the system of this invention may be present;
  • Figure 2 is a block diagram of a first view creator of this invention;
  • Figure 3 depicts in block form the different types of data that comprise a panorama definition file
  • Figures 4A, 4B and 4C collectively form a flow chart of the process steps executed by the view creator to generate a panorama definition file
  • Figures 5A through 5E sequentially illustrate how a number of images are selected, resized and positioned to form a multi-image panorama that is viewable from a display;
  • Figure 6 is a block diagram of an image output module integral with one of the image generating devices of this invention.
  • Figure 7 is a block diagram of a panorama generator associated with one of the display devices of this invention.
  • Figures 8A through 8F collectively form a flow chart of the process steps executed by the system of this invention in order to create a custom defined panorama on a display device;
  • Figure 9 depicts in block form the types of data contained in an amended panorama definition file
  • Figure 10 is a block diagram of the configuration of an alternative system of this invention for presenting custom panoramas on different displays
  • Figures 11A and 11B collectively illustrate in block form the major components of the view creator of the system of Figure 10;
  • Figures 12A and 12B collectively form a flow chart of the process steps executed by the second described system of this invention
  • Figure 13 is a block diagram of a number of the modules internal to an alternative panorama generator of this invention.
  • Figure 14 is a flow chart of the process steps executed by the view creator when it emulates the isochronous resource manager
  • Figure 15 is a block form depiction of the data in a panorama definition file that defines a panorama that is reactively presented as the result of the occurrence of a trigger event;
  • Figure 16 is a detailed block illustration of the sub-fields in a panorama state trigger field of the file of Figure 15;
  • Figure 17 is a flow chart of the process by which the panorama generator, in response to trigger events, reactively causes different panoramas comprising different views to be presented on the display with which the panorama is associated;
  • Figure 18 is an example of an alternative panorama that the panorama generator may cause to be presented on the associated display in response to the occurrence of a trigger event.
  • FIG. 1 is a block diagram view of the components of a network forming system 30 of this invention.
  • System 30 includes a bus over which the below-described network components communicate.
  • bus 31 is an optical cable or Firewire bus.
  • Bus 31 of this invention has a bandwidth of at least 1.5 Gbit/sec. This bandwidth is based on a version of the system of this invention streaming full motion video from four distinct sources and graphic data from another four distinct sources, with the data compressed prior to transmission over the bus. It should be understood that the above transmission rate is exemplary, not limiting. In versions of this invention with fewer sources of video signals and/or enhanced data compression, smaller bandwidths may be all that is needed. In versions of the invention with more sources of video, particularly more sources of full motion video, and/or less data compression, larger bandwidths may be required.
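As a rough arithmetic sketch of the sizing above; the per-stream rates and the compression ratio below are assumed figures chosen for illustration, not values given in the specification.

```python
def required_bus_bandwidth(video_streams, video_mbps,
                           graphic_streams, graphic_mbps,
                           compression_ratio):
    """Estimate the aggregate bus bandwidth in Mbit/s.

    Sums the raw rates of all streams, then divides by the
    compression ratio applied before transmission over the bus.
    """
    raw = video_streams * video_mbps + graphic_streams * graphic_mbps
    return raw / compression_ratio

# Four full-motion video sources at an assumed ~1,000 Mbit/s raw each,
# plus four graphic sources at an assumed ~50 Mbit/s each, compressed
# roughly 3:1, would need 1,400 Mbit/s, which fits within the
# 1.5 Gbit/sec (1,500 Mbit/s) figure stated above.
needed = required_bus_bandwidth(4, 1000, 4, 50, 3)
```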
  • Connected to bus 31 are a number of devices that output image-forming data. These include devices that output images captured in real time in the operating room.
  • An example of such a device is a camera 32 connected to an endoscope 34. Camera 32 captures the image at the surgical site that is viewed through the endoscope 34.
  • Still another type of device that captures an image in real time is a personal perspective camera 36. Personal perspective camera 36 is worn as a head-mounted unit by the surgeon. Other individuals in the operating room find the view captured by the personal perspective camera 36 useful because it gives them a perspective view of the surgical site identical to that of the surgeon.
  • a file server 38 is another device that outputs image-forming data.
  • File server 38 outputs images of the data stored prior to the start of the surgical procedure. Examples of such images are X-rays, CAT scans or data regarding other diagnostic tests or therapeutic procedures performed before the present procedure. It should be appreciated that file server 38 may not be located in the operating room but in another location in the hospital or even at a more remote location. Bus 31 thus may not be connected directly to file server 38. Instead, bus 31 is connected to a head, a hub, a gateway, or a relay unit to another network such as a hospital Intranet, the Internet or the commercial telephone network. File server 38 is connected to bus 31 over this head and the additional network.
  • In Figure 1, a single box labeled INSTRUMENT CONSOLE 40 represents the different types of devices that output image-forming data regarding their operating states and use history.
  • a powered tool console generates data regarding the operating speed of the motor in the handpiece the console is used to energize.
  • a console used to output an RF signal will output data regarding the frequency, the power level of the RF signal applied to the patient and the temperature of the tissue at the site where the RF signal is applied.
  • Equipment used to deliver anesthesia outputs data regarding the quantity of anesthesia and the proportion of anesthesia to carrier media.
  • a suction unit outputs data regarding the level of the suction that is drawn as well as the quantity of material withdrawn from the patient.
  • An irrigation unit supplies data regarding the flow rate of irrigation fluid to the surgical site, the overall amount of fluid supplied and, in some instances, the fluid pressure at the surgical site.
  • BIO-MONITOR 42 represents the different types of devices that output image-forming data regarding the patient's condition. These devices include units that monitor blood pressure, heart rate, brain activity and blood oxygen content. Both instrument console 40 and bio-monitor 42 output the data as image forming data over bus 31. As before, it should be understood that this invention is not limited to the described monitors.
  • Surgical navigation unit 44 tracks the position and orientation of surgical devices, including instruments and implants, relative to the surgical site. Based on these data and a stored image of the surgical site, the surgical navigation unit 44 presents an image of the positions and orientations of the surgical devices relative to the surgical site.
  • As shown in Figure 1, also connected to bus 31 are plural different display devices.
  • two display monitors 46 and 48 are shown. Monitors 46 and 48 each include a screen that can be viewed by two or more individuals. The size and placement of monitors 46 and 48 is not relevant to the scope of this invention. For example, both monitors may be small screen (approximately 19 inches diagonal). Alternatively, one or both of the monitors may be large screen (36 inches or more diagonal measurement).
  • Other display devices are designed to be worn by and present images that are viewed by a single individual. These units are often mounted to a helmet, eyeglass frames or other structural support member worn by the user. Some of these units are small video monitors. Still other units project images onto a user-worn screen, such as a personal protection face shield. Another class of individual unit is able to project an image directly onto the iris of the user. Frequently, these display devices are referred to as heads-up displays. In Figure 1, a single heads-up display 50 is shown diagrammatically.
  • View creator 56 stores data used to define the collection of images that form the single multi-image panorama that is to be presented on each of the display monitors 46 and 48 and the individual heads-up displays 50.
  • data signals are transmitted over bus 31 using FireWire (IEEE 1394) standards. Integral with each device that outputs image-defining data is an image output module 58. (For simplicity, only the image output module 58 internal to the surgical navigation unit 44 is shown.) Image output module 58 receives the image-defining data from the image-generating device to which it is attached. Image output module 58 places these data into packets for transmission over bus 31. Image output module 58 then outputs these data as isochronous data packets on bus 31.
  • Integral with each display unit is a panorama generator 60. (For simplicity, only the panorama generator 60 integral with monitor 48 is shown.) As described in detail below, panorama generator 60 selectively captures selected ones of the image forming data packets broadcast over bus 31. Panorama generator 60 packages the plural captured images into data forming a single panorama. The panorama-defining data are supplied to the display so that the display presents the user-designed panorama of images.
  • View creator 56 can be constructed out of a general purpose processing unit.
  • a personal computer can function as the view creator 56. While not explicitly illustrated, it should be appreciated that the view creator 56, in addition to a processor and memory, includes input and output devices.
  • a conventional monitor can function as the output device.
  • a keyboard and mouse can function as input devices.
  • a touch screen display functions as a combined input-output device.
  • Figure 2 illustrates in block form the basic software modules executed on the view creator 56 and the basic data files in the view creator memory. Also shown in Figure 2 are the modules that control the reading of data into and writing of data out of the view creator 56.
  • a first module is input/output module 66.
  • the input/output module 66 includes the drivers that, in response to the data output by the other modules, cause the appropriate image to be presented on the monitor integral with the view creator 56.
  • Input/output module 66 also includes the interpreters that, in response to the commands entered through the input devices, generate instructions to the other modules.
  • a bus interface 68 is also internal to the view creator 56.
  • Bus interface 68 is the module internal to the view creator that writes data out over bus 31 and also receives and forwards the data from the bus written to the view creator 56.
  • Panorama constructor 70 is a specific processing module internal to the view creator 56. The view constructor 70, in response to user-entered commands, generates data that define the panorama that is to be presented on the particular display device that will be viewed by the user. View constructor 70 constructs the panorama based on reference data stored in an image library 72 and display library 74, both internal to the memory integral with the view creator 56.
  • Image library 72 identifies the sources of images, the devices that output images, from which the panorama can be created.
  • image library 72 contains data indicating that the user can select as part of the view one or more of the following types of images as part of the panorama: endoscopic camera image; surgeon perspective image; navigation image; stored X-ray image; blood pressure data; capsule pressure; blood oxygen content; suction level; handpiece speed; liters of fluid collected; and time since start of procedure.
  • Display library 74 identifies the types of display devices that are attached to the bus 31.
  • the view constructor 70 builds a panorama definition file 76, seen in Figure 3.
  • the process by which the view constructor 70 builds a panorama definition file 76 is now described with reference to the flow chart of Figures 4A-4C.
  • the view constructor 70 polls the other devices connected to bus 31. Specifically, the view constructor 70 requests each device to indicate whether or not the device outputs image-forming data and, if so, the type of data.
  • Step 82 represents the report from each device regarding whether or not it is an image-forming data generator and the type of the image-forming data it outputs. Based on these data, in a step 84, view creator determines if a specific device generates image-forming data.
  • In a step 86, the view creator builds the image library 72. Specifically, the view creator establishes in library 72 a record for each device that can generate image-forming data, indicating the type of data available and the one or more types of images available from the device. Not shown is the loop back from step 86 for each of the additional devices attached to bus 31. The loop backs from steps 84 and 86 represent that at least step 84 is executed for every device that identifies itself to view constructor 70. Not shown is the decision step indicating that once all devices have been identified and the contents of the image library 72 constructed, other processes are executed by the view constructor 70.
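The poll-and-record loop described above might be sketched as follows. The `poll()` method and the report dictionary layout are hypothetical names invented for illustration; the specification does not define a concrete protocol for the device reports.

```python
def build_image_library(bus_devices):
    """Poll each device on the bus and record the image generators.

    bus_devices: iterable of device objects exposing a hypothetical
    poll() method that returns None when the device does not output
    image-forming data, or a report dict such as
    {"device_id": ..., "image_types": [...]} when it does.
    Returns the image library: a mapping from device id to the list
    of image types available from that device.
    """
    library = {}
    for device in bus_devices:
        report = device.poll()   # request and receive the device report
        if report is None:       # device generates no image-forming data
            continue
        # establish a record for this image-generating device
        library[report["device_id"]] = report["image_types"]
    return library
```

The same loop, with a different predicate on the report, could build the display library of display devices described below.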
  • In step 88, the view constructor polls the other devices attached to bus 31 to determine if any of the devices are display devices.
  • Step 89 represents the response from a device indicating that it is a display device.
  • the responding display device provides data indicating the type of device it is. These data indicate the type of view that is created by the device. For example, it may be possible to present on a screen display a number of individual images that collectively define the panorama. A heads up display device may only be able to produce a panorama that consists of a single image.
  • In step 90, view constructor 70 decides if the device is a display device. If this evaluation tests positive, in step 92, the view constructor generates the display library 74.
  • Each record in the display library describes the characteristics of a particular display device. Display device characteristics, it is appreciated, include the size of the panorama produced by the device or, in the case of some heads-up displays, that the device can only present a panorama consisting of a single image. Not shown in the Figures is the loop back after step 92 is executed for a particular display device or the decision step indicating that all potential display devices connected to bus 31 are identified.
  • view constructor 70 is used to build the individual panorama definition files 76.
  • This process starts in step 96 with view constructor 70 requesting the individual building the file to identify the file.
  • the individual is asked to identify: the specific individual for whom the panorama is to be presented; the type of display unit (main display, anesthesia display, surgeon heads-up display) on which the constructed panorama is to be displayed; and the specific procedure for which the panorama is to be displayed.
  • the list of potential displays is based on the list of displays in the display library 74.
  • a step 82 is executed in which a panorama identification field 120 is created for the panorama definition file.
  • Panorama identification field 120, it should be appreciated, contains a number of sub-fields. Thus, there are sub-fields indicating the specific individual for whom the panorama will be created, the display device on which the panorama will be presented, and the procedure for which the panorama is to be presented.
  • View constructor 70 then prompts the individual to indicate what image-forming data are to be used to form the under construction display view.
  • In a step 102, the individual is asked to indicate a first image (IMAGE N in Figure 4B) that is to form part of the displayed panorama.
  • the view constructor 70 indicates the choices of images available for presentation based on the data in image library 72. The user then selects a first image for viewing.
  • In a step 104, data indicating this image are written to an IMAGE 1 TYPE field 121 of the panorama definition file.
  • the user edits how the selected image is to appear as part of the overall panorama presented on the display.
  • a representation of the image is presented. If this is the first image component of the panorama, the image may occupy the whole of the virtual representation of the display screen.
  • In Figure 5A, the first image selected for incorporation into a particular panorama is the image available from the surgical navigation unit 44.
  • Identification number 105 represents the frame of the display device on which the panorama under construction is to be presented. This frame may be the actual physical frame or a symbolic representation.
  • In step 106, using controls such as keyboard buttons or a mouse, the user sizes the image so that it occupies the desired area of the display. For example, the user may want the particular image to occupy the whole of the displayed panorama, half the panorama or a smaller fraction of the overall area of the panorama. This is represented by the reduction in size of the image from Figure 5A to Figure 5B.
  • the user positions the resized image so that it occupies the position of the display preferred by the user.
  • this is represented by the repositioning of the resized image from the upper left corner to the lower right corner of the display.
  • the user adjusts how the image is presented to his/her personal preferences. With regard to a video image, this may involve setting the contrast, chroma or brightness of the image to that preferred by the user. If the image is graphic or alphanumeric in nature, the user selects both the color of the background in which the image is presented as well as the color of the image itself. Thus a particular anesthesiologist may want to simultaneously display blood pressure levels as black numbers on a medium blue background and blood oxygen content levels as red numbers on a light blue background.
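The anesthesiologist example above could be captured as per-image configuration data of the sort held in image configuration field 126. The key names and the dictionary form are illustrative assumptions; the specification does not define a concrete encoding.

```python
# Hypothetical contents of two image configuration fields (field 126)
# for the anesthesiologist's panorama: blood pressure as black numbers
# on a medium blue background, blood oxygen content as red numbers on
# a light blue background, per the example in the text.
image_configurations = {
    "blood_pressure": {"foreground": "black", "background": "medium blue"},
    "blood_oxygen":   {"foreground": "red",   "background": "light blue"},
}

def lookup_colors(configurations, image_name):
    """Return the (foreground, background) pair for a named image."""
    entry = configurations[image_name]
    return entry["foreground"], entry["background"]
```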
  • In a write image characteristics step 108, the view constructor writes additional data to the panorama definition file 76 that describe how the particular image is to appear as part of the overall panorama presented on the display.
  • an image size field 122 and an image screen location field 124 are established in the panorama definition file 76.
  • Image size field 122 contains data indicating the size of the particular image within the complete panorama.
  • the image size field contains an indication by pixel length and pixel width of the overall size of the image within the panorama.
  • Image screen location field 124 contains data indicating where, on the display, the image should be presented.
  • The image screen location field contains data indicating, from a reference pixel location, the location of the starting pixel of the image.
  • an image configuration field 126 is created for the image. Data in the image configuration field 126 indicate how the image is to be custom presented based on the user-input preferences.
  • an image sequence field 128 is shown as being part of the image characteristics fields for each image. This is because some displays may be configured so that the plural images that comprise the panorama are presented sequentially. When these plural images are presented on a large screen type display, they may be presented consecutively on a single portion of the display. On a heads up display on which it is possible to present only a single image, the images are presented sequentially. [00063] Therefore, as part of the image editing step 106, the user may indicate the sequence in which the panorama-forming images are to be displayed as well as the amount of time each image is presented as part of the panorama. Then, in write image characteristics step 108, the view creator creates or edits the image sequence field 128 for the particular panorama-forming image. Image sequence field 128 may include a number of sub fields. Into these sub fields are written data indicating when, in the sequence of images, this particular image is to be presented and the overall length of time the image is to be presented as part of the viewable panorama.
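The panorama definition file fields described above (frame identification 105; IMAGE N TYPE field 121; IMAGE N SIZE field 122; IMAGE N SCR LCN field 124; IMAGE N CONFGRTN field 126; IMAGE N SEQUENCE field 128) can be sketched as a simple data structure. The following Python sketch is illustrative only; the field types, names and example values are assumptions, not part of the disclosed file format.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class ImageEntry:
    image_type: str                        # IMAGE N TYPE field 121 (hypothetical label)
    size_px: Tuple[int, int]               # IMAGE N SIZE field 122: (width, height) in pixels
    screen_location: Tuple[int, int]       # IMAGE N SCR LCN field 124: starting pixel (x, y)
    configuration: Dict[str, object]       # IMAGE N CONFGRTN field 126: contrast/chroma/colors
    sequence: Optional[Dict[str, int]] = None  # IMAGE N SEQUENCE field 128: order, dwell time

@dataclass
class PanoramaDefinition:
    frame_id: int                          # display frame identification number 105
    images: List[ImageEntry] = field(default_factory=list)

# Example: a navigation image sized to occupy the lower right quadrant
# of an assumed 1280x960 display.
pano = PanoramaDefinition(frame_id=105)
pano.images.append(ImageEntry(
    image_type="SURGICAL_NAV",
    size_px=(640, 480),
    screen_location=(640, 480),
    configuration={"background": "medium blue", "text_color": "black"},
))
```

The sequence field is left unset here, reflecting that it is only needed for displays that present the panorama-forming images one at a time.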
  • step 102 is reexecuted.
  • steps 106 and 108 are then reexecuted.
  • the reexecution of these steps is shown as a loop back from decision step 110.
  • the initial reexecution of step 106 is seen in the differences between Figures 5C and 5D.
  • Figure 5D is seen as including data from the unit that is used to withdraw fluid, by suction, away from the surgical site.
  • the view constructor 74 in a reexecution of step 108, creates the new image characteristics fields for this subsequent panorama-forming image.
  • In Figure 5D the custom characterization of the image from the suction unit is seen in that the numeric data presented as part of the image are presented in two different colors. Here the suction rate data are seen in blue, while the fluid withdrawn data are presented in green.
  • Figure 5E represents what the particular user wants as his/her custom panorama.
  • this particular user wants to see in the lower left quadrant of the panorama an X-ray retrieved from the file server 38.
  • In the upper left quadrant of the panorama, patient identifying data are presented.
  • data regarding the time since anesthesia was first introduced are presented. These data come from one of the instrument consoles 40 attached to bus 31. It is further observed that the user has customized the panorama so the duration of anesthesia information is presented as red graphics on a brown background.
  • In a step 112, the newly created panorama definition file 76 is loaded into a library 114 of panorama definition files maintained in the memory internal to the view creator 56.
  • Another module internal to view creator 56 is the panorama retrieval module 130. As discussed below, the panorama retrieval module 130, based on user-entered commands, selectively forwards panorama definition files 76 to the display devices.
  • an output module 58 is associated with each image producing device connected to the bus 31.
  • the output module includes a video signal processor 140 and a bus interface 142.
  • Video signal processor 140 receives the video signals output by the image generating device.
  • the video signal processor 140 converts the image data so that it fits in a common sized area.
  • video signal processor converts the data into a form so that, regardless of the size of the screen the image from the generating unit would occupy, the data would completely fill a screen with a diagonal size of 18 inches and that has a 4:3 aspect ratio.
  • the video data generated by processor 140 is forwarded to the bus interface 142.
  • the bus interface outputs the data on bus 31 in the form of isochronous data packets.
  • Bus interface 142, and in some versions of the invention processor 140, also compresses the digitized image data prior to its transmission over bus 31.
  • the function performed by the video signal processor 140 is performed by a processor internal to the image generating device.
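The normalization performed by video signal processor 140 can be sketched as rescaling every source frame to one common raster. In this minimal sketch the 800x600 target resolution and the nearest-neighbor method are assumptions standing in for the 18-inch, 4:3 reference area described above.

```python
# Assumed common 4:3 raster standing in for the 18-inch reference screen.
COMMON_W, COMMON_H = 800, 600

def normalize(frame, src_w, src_h):
    """Rescale a row-major list of pixel rows to the common raster size
    using nearest-neighbor sampling (an illustrative choice)."""
    out = []
    for y in range(COMMON_H):
        sy = y * src_h // COMMON_H          # nearest source row
        row = []
        for x in range(COMMON_W):
            sx = x * src_w // COMMON_W      # nearest source column
            row.append(frame[sy][sx])
        out.append(row)
    return out

# A tiny 4x3 stand-in frame is stretched to fill the common raster.
small = [[(r + c) % 256 for c in range(4)] for r in range(3)]
big = normalize(small, 4, 3)
```

Because every source fills the same raster, the downstream scaler only ever has to resize from one known geometry to the area the image occupies in the panorama.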
  • a panorama generator 60 is associated with each display device. Specifically, the panorama generator is located between bus 31 and the display driver 162 of the display device to which the panorama generator is attached.
  • the display driver 162 is the unit integral with the display device that receives or retrieves signals that define the panorama to be presented on the display and converts them into video-image forming signals that cause the display creating components of the display to present the panorama on the screen or other viewing element integral with the display.
  • panorama generator 60 includes a bus interface 164.
  • Bus interface 164 is the component internal to the panorama generator that reads data from and outputs data over bus 31. If communications over bus 31 are in accordance with the Firewire protocol or a similar protocol, bus interface 164, it is appreciated, receives data packets that are output on bus 31 as a consequence of both asynchronous and isochronous transfers.
  • An asynchronous transfer is a data transfer over bus 31 to a designated receiver. Typically, a successful asynchronous data transfer is followed by an acknowledgement from the recipient.
  • An isochronous transfer is a transfer in which the object of regular data delivery outweighs the need to receive a confirmation of data delivery. Also, in an isochronous transfer, data may be transmitted, broadcast, to plural receivers. The image-defining data produced by the image generating devices is broadcast in isochronous transfers over bus 31.
  • Image cache 167 is a memory module internal to panorama generator 60.
  • the image cache 167 serves as the memory wherein some of the image-defining data are stored prior to being forwarded to the associated display.
  • the image cache is the memory wherein image- defining data that is not streamed in real time are stored for presentation on the associated display 46, 48 or 50. Examples of such data are medical image data (X-ray, CAT or MRI scans) that are retrieved from the hospital file server 38. These image-defining data are loaded into the image cache 167 through the bus interface 164.
  • the panorama generator 60 is configured to present on the associated display a panorama that consists of plural images other than real-time generated images.
  • the image cache 167 is of such a size that it can simultaneously store plural different sets of image-defining data.
  • panorama generator 60 includes plural image caches 167. Each image cache 167 functions as the memory in which image-defining data for a separate image are stored for an extended period of time.
  • a scaler 166 is also part of panorama generator 60.
  • the scaler 166 is the initial recipient of the image-defining data captured by the bus interface 164 and forwarded down line in the panorama generator.
  • Scaler 166 resizes the image-defining data so it is the right size, the appropriate number of pixels by length and width, for the panorama that will be presented. Typically, but not always, the scaler 166 compresses the image-defining data.
  • Scaler 166 outputs the scaled image-defining data to a contrast/chroma converter 168.
  • the contrast/chroma converter 168 adjusts the individual pixel defining data that comprise the image-defining data so that the image has the appropriate appearance.
  • the scaled and contrast/chroma adjusted image- defining data are output from converter 168 to an image mapper 172.
  • the image mapper 172 loads the individual pixel defining data into a frame buffer 176.
  • Frame buffer 176 is a memory also internal to the panorama generator 60.
  • the data locations in frame buffer 176 correspond to individual locations on the screen (monitor) of the display device at which the image defined by the data is to appear.
  • Image mapper 172 loads each set of data that defines a particular image that contributes to the overall panorama into the locations internal to the frame buffer that correspond to where, on the display screen, the image is to appear.
  • the data stored in the frame buffer 176 can be considered the panorama-defining data.
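The operation of image mapper 172, loading a scaled image into the frame buffer region corresponding to the image's on-screen position, can be sketched as follows. The row-major buffer layout and single-value pixels in this Python sketch are assumptions.

```python
def map_image(frame_buffer, image, start_x, start_y):
    """Write `image` (a list of pixel rows) into frame_buffer so that its
    top-left pixel lands at buffer position (start_x, start_y)."""
    for dy, row in enumerate(image):
        for dx, pixel in enumerate(row):
            frame_buffer[start_y + dy][start_x + dx] = pixel

# An 8x6 stand-in frame buffer; a 2x2 scaled image is mapped into the
# lower right corner, mirroring the repositioning shown in Figure 5C.
screen = [[0] * 8 for _ in range(6)]
thumb = [[1, 1], [1, 1]]
map_image(screen, thumb, start_x=6, start_y=4)
```

Repeating this call once per panorama-forming image, each with its own start position, leaves the buffer holding the complete panorama-defining bit map.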
  • Display driver 162 is connected to the frame buffer 176.
  • the display driver 162 retrieves the panorama-defining data. These data are the data used to regulate the operation of the display forming elements of the display that cause the selected panorama to be presented on the display screen.
  • scaler 166, contrast/chroma converter 168 and image mapper 172 are distinct components or sub-circuits.
  • a DSP is part of the panorama generator 60. The DSP performs the scaling and contrast/chroma adjustment of the image defining data. The DSP also loads the image-defining data into the frame buffer.
  • the scaler 166, the contrast/chroma converter 168 and the image mapper 172 are each firmware modules that are continually executed by the DSP.
  • Also integral with the panorama generator 60 is a panorama processor 177.
  • the panorama processor 177, in response to instruction data received over bus 31, regulates operation of the scaler 166, the contrast/chroma converter 168 and the image mapper 172.
  • Panorama processor 177 is also connected to image cache 167. Specifically, the panorama processor 177 asserts a read out signal to image cache 167 that causes the cache 167 to read out to the scaler 166 the stored image-defining data.
  • software executed by the DSP may further perform the function of the panorama processor 177.
  • In step 202, all the image sources are connected to bus 31.
  • In step 204, all the display devices, the monitors 46 and 48 and the heads up displays 50, are connected to the bus 31.
  • In step 206, the view creator 54 is connected to the bus.
  • the bus interface 68 internal to the view creator emulates the function of an isochronous resource manager, at least for the image-defining data.
  • bus interface 68 maintains the list of channels (time slots) in which each image generating device can output isochronous data packets over the bus 31. This list is then stored in a sub-directory in the image library 72. [00082] Also as part of the reset process of step 208, the image library and display library are updated; steps 80-92 are reexecuted. Thus, at the start of the procedure, the view creator 70 stores data that identify the image generating devices and the display devices currently connected to the bus 31.
  • System 30 is now ready for configuration for the current procedure.
  • In a step 210, based on prompts generated by the panorama retrieval module 130, an indication of the user, device and/or procedure is entered.
  • the panorama retrieval module 130 reviews the panorama definition files 76 to determine if a panorama has been predefined for the particular user, device and/or procedure. If a panorama has been so defined, in step 214, it is retrieved from the library. Alternatively, if a panorama has not been defined, in a step 216, the panorama retrieval module calls the view constructor 70 for execution. Thus, the individual is invited to create a new panorama definition file 76 based on the connected image generating devices and the available display devices.
  • the panorama retrieval module 130 compares the list of requested images in the retrieved file to the list of images as maintained in the image library 72 that are presently available. If the data in image library 72 indicates that all necessary image generating devices are connected to the bus 31, panorama retrieval module 130 proceeds to execute step 220. In step 220, module 130, by reference to the display library, verifies that the appropriate display device as specified in the retrieved panorama definition file 76 is also connected to bus 31.
  • panorama retrieval module 130 executes step 222.
  • the panorama retrieval module sends a message to the user. The message reports the unavailability of the requested images and/or the specified display device.
  • the panorama retrieval module 130 by calling the view constructor 70 for execution, allows the user to edit the previously defined panorama to adjust for the unavailability of one or more of the requested images and/or the requested device.
  • panorama retrieval module 130 executes step 228.
  • module 130 generates an amended panorama definition file 230, seen in Figure 9, for each display device.
  • Each amended panorama definition file 230 is based on the panorama definition file 76, with data describing the panorama to be presented on the display device.
  • the amended panorama definition file 230 does not contain the frame identification data since these data are not needed by the individual panorama generators.
  • the amended panorama definition files 230 do not contain the image type fields 121. Instead the view constructor substitutes channel identification data, in a channel identification field 232, for the image type data.
  • The channel identification data, it should be understood, identify the isochronous channel from which the selected image defining data can be obtained.
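The generation of an amended panorama definition file 230, dropping the frame identification and substituting a channel identifier (field 232) for each image type, can be sketched as below. The channel map, the dictionary layout and the device labels are hypothetical.

```python
# Hypothetical mapping of image generating devices to isochronous channels;
# in practice this list is maintained by the bus interface of the view creator.
CHANNEL_MAP = {"ENDO_CAMERA": 1, "SURGICAL_NAV": 2, "BIO_MONITOR": 3}

def amend(definition):
    """Produce an amended panorama definition: no frame id, and the
    image type of each entry replaced by its channel id (field 232)."""
    amended = {"images": []}
    for img in definition["images"]:
        entry = dict(img)                                  # copy; leave original intact
        entry["channel_id"] = CHANNEL_MAP[entry.pop("image_type")]
        amended["images"].append(entry)
    return amended                                         # frame id intentionally omitted

original = {"frame_id": 105,
            "images": [{"image_type": "SURGICAL_NAV", "size_px": (640, 480)}]}
result = amend(original)
```

The amended file tells a panorama generator everything it needs, which channels to listen to and how to place each image, without the display-selection data that only the view creator uses.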
  • In step 238, the view constructor forwards the file over bus 31 to the panorama generator 60 integral with the display device that will produce the image. Typically this process is an asynchronous data transmission.
  • Step 240 represents the storage of the amended view definition file 230 by the panorama generator 60 and more particularly, the panorama processor 177.
  • In a step 242, the panorama processor 177 configures the panorama generator 60 for operation.
  • the panorama processor 177, based on the data in the channel identification fields 232, informs the bus interface which channels of isochronous data contain image-defining data that contribute to the panorama that is to be constructed.
  • the panorama generator 60 obtains image-defining data that is not streamed in real time, the data that define "static" image(s) that is (are) to be presented as part of the panorama.
  • This process can be considered part of panorama generator configuration step 242. However, for purposes of illustration, this sub-process is presented as a series of separate steps.
  • An initial step in this sub-process is the generation of the request for the data that defines the static image, step 243.
  • the panorama processor 177 requests these data from the device in which they are stored.
  • panorama processor 177 outputs a request through bus interface 164 and bus 31 to the file server 38 that specific image defining data such as an X-RAY or CAT scan image be written out to panorama processor.
  • File server 38, if it does not serve as the storage device in which the requested image-defining data are stored, contains a directory indicating where the file server 38 can obtain the data. File server 38 may then retrieve the data itself.
  • file server returns to panorama processor 177 data identifying the device on which the requested data are stored.
  • Step 244 represents the receipt of the static image-defining data by the panorama generator 60. As part of step 244 these data are stored in the image cache 167. [00092] Once step 244 is executed the static image- defining data are then available to be continually presented on the display 46, 48 or 50 throughout the course of the procedure. As represented by step 245, during the procedure, panorama processor 177 periodically determines if these data need to be supplied to the display 46, 48 or 50 as part of the process of refreshing the presented panorama. When it is time to so forward the data, in a step 246, the panorama processor 177 generates a write instruction to the image cache 167.
  • the image cache writes out the requested data to the display.
  • the data are actually written out to the scaler 166.
  • Scaler 166, chroma/contrast converter 168 and image mapper 172 then process these data so they can be written into the frame buffer 176 in the manner in which the other image-defining data, the streamed image-defining data, are written into the frame buffer 176.
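The static-image refresh sub-process (steps 245 through 247), in which the panorama processor periodically replays cached data through the same path as streamed data, can be sketched as follows. The tick-based timing and the cache interface are assumptions for illustration.

```python
class ImageCache:
    """Stand-in for image cache 167: holds one set of static
    image-defining data (e.g. a retrieved X-ray)."""
    def __init__(self, data):
        self.data = data

    def write_out(self, scaler_queue):
        # Steps 246-247: on a write instruction, the cached data are
        # written out toward the scaler like any streamed image data.
        scaler_queue.append(self.data)

def refresh(tick, period, cache, scaler_queue):
    """Step 245: periodically decide whether the static data must be
    resupplied as part of refreshing the presented panorama."""
    if tick % period == 0:
        cache.write_out(scaler_queue)

cache = ImageCache(b"xray-image-bytes")
queue = []
for t in range(4):                     # two of four ticks trigger a refresh
    refresh(t, period=2, cache=cache, scaler_queue=queue)
```

Because the data are replayed locally from the cache, the bus never carries the static image more than once, which is the bandwidth saving the text later attributes to cache 167.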
  • Upon execution of step 242, system 30 of this invention is ready for use.
  • Step 248 represents the generation of image-defining data by a device that generates images. These data are forwarded to the image output module 58 to which the image generating device is connected.
  • the video signal processor 140 internal to module 58 converts the image defining data into data defining the standard size image.
  • the bus interface 142 outputs the resized image defining data during the assigned isochronous time channel over the bus 31.
  • Step 250 represents the repackaging and broadcast of the image-defining data.
  • one of the video signal processor 140 or the bus interface 142 also compresses the data forming the video signal.
  • In step 254, each bus interface 164 decides if the packet received during a particular isochronous channel contains data used by the device to which the bus interface is attached.
  • each bus interface 164 internal to a panorama generator 60 makes this determination based on the data from the panorama processor 177 indicating which channels contain image-defining data used by the generator 60. Again, these are the data in the IMAGE N CHNL ID fields 232. If, in step 254 it is determined that the image-defining data contained in a particular channel does not contribute to the panorama under construction, in a step 256 it is discarded.
  • If bus interface 164 determines that the image-defining data received in the channel does contribute to the panorama under construction, two events occur. In step 258, the bus interface 164 informs the panorama processor of the specific type of image data it has received. Step 258 may simply involve the bus interface informing the panorama processor that data from channel "N" has been received. Also, in a step 260, the image defining data are forwarded to the scaler 166.
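The channel filtering of steps 254 through 260 can be sketched as below; the packet representation and the queue standing in for the scaler input are assumptions.

```python
# Channels this panorama generator uses, taken from the IMAGE N CHNL ID
# fields 232 of its amended definition file (values are hypothetical).
WANTED_CHANNELS = {2, 5}

def handle_packet(packet, scaler_queue):
    """Step 254: keep a packet only if its isochronous channel carries
    image-defining data used by this generator."""
    channel, payload = packet
    if channel not in WANTED_CHANNELS:
        return False                          # step 256: discard
    scaler_queue.append((channel, payload))   # step 260: forward to scaler
    return True                               # step 258: processor is notified

queue = []
kept = handle_packet((2, b"image-data"), queue)     # wanted channel
dropped = handle_packet((7, b"other-data"), queue)  # unwanted channel
```

Filtering on channel number alone means the generator never inspects image content it does not need, which is what lets many displays share one broadcast bus.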
  • panorama processor 177 configures the other components of the panorama generator so that the image data can be appropriately processed, step 262.
  • This configuration is based on the data received in the amended panorama generation file that is specific for the image-defining data received in the identified channel.
  • the panorama processor 177 provides scaling instructions, based on the data in the IMAGE N SIZE field 122, to the scaler 166.
  • Panorama processor 177 provides instructions to the contrast/chroma converter indicating how the scaled image defining data are to be adjusted. These data are based on the data from the appropriate IMAGE N CONFGRTN field 126.
  • the panorama processor 177 provides data to the image mapper that indicates the position of the image in the panorama. These data are from the associated IMAGE N SCR LCN field 124.
  • Scaler 166, in a step 264, resizes the image forming data so it defines an image of the defined size. It should be appreciated that, as part of step 264, scaler 166 may be the component internal to the panorama generator that decompresses the video data. Alternatively, the decompression is initially performed by bus interface 164. In a step 266, the contrast/chroma converter 168 adjusts the resized image defining data so that it defines an image having the appropriate characteristics.
  • chroma/contrast converter 168 adjusts the chroma, contrast and brightness of the image-defining data so that when the resultant image is presented on the display 46 or 48 as part of a multi-image panorama, the image, in comparison to the other images, does not appear either excessively bright or dark.
  • Image mapper 172 receives the resized and adjusted image defining data. The image mapper loads these data into the appropriate space in the frame buffer 176, step 268. [00099] Once the above steps 264, 266 and 268 are executed for all images forming the panorama, frame buffer 176 thus contains a bit map of data defining the complete panorama that is to be presented on the display device.
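Steps 264, 266 and 268 chained together for one panorama-forming image can be sketched as below; the nearest-neighbor resize and the simple gain standing in for the contrast/chroma adjustment are assumptions.

```python
def process_image(frame_buffer, image, size, location, gain):
    """Resize (step 264), adjust (step 266) and map (step 268) one image."""
    w, h = size
    # Step 264: nearest-neighbor resize to the size from IMAGE N SIZE field 122.
    scaled = [[image[y * len(image) // h][x * len(image[0]) // w]
               for x in range(w)] for y in range(h)]
    # Step 266: stand-in contrast adjustment (simple gain, clamped to 8 bits).
    adjusted = [[min(255, int(p * gain)) for p in row] for row in scaled]
    # Step 268: map into the frame buffer at the IMAGE N SCR LCN position 124.
    sx, sy = location
    for dy, row in enumerate(adjusted):
        frame_buffer[sy + dy][sx:sx + w] = row

# An 8x8 stand-in frame buffer; one 2x2 source image is placed, scaled up,
# in the lower right quadrant.
fb = [[0] * 8 for _ in range(8)]
process_image(fb, [[100, 100], [100, 100]], size=(4, 4), location=(4, 4), gain=1.5)
```

Running this once per image, each with its own size, location and adjustment data from the amended definition file, leaves the frame buffer holding the complete panorama bit map described above.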
  • Display driver 162 using a protocol appropriate to the display device, cyclically reads the data in the frame buffer 176. Based on these data the display driver 162 causes the user- defined panorama, such as that illustrated in Figure 5E, to be presented on the viewable component of the display device .
  • Each time step 268 is executed, in a step 270, the image mapper 172, either directly or through the panorama constructor (connection not shown), informs the display driver 162 that the image-defining data in the frame buffer has been updated.
  • This status report may serve as a trigger that causes the display driver 162 to refresh the panorama presented on the viewable component so the new image is presented.
  • steps 262, 264, 266 and 268 are not just performed on the real time image-defining data the panorama generator 60 selectively captures off of bus 31.
  • the static image-defining data stored in the image cache 167 are forwarded to the scaler 166.
  • These data are subject to the same scaling, chroma/contrast adjustment and selective frame map loading to which the real time image-defining data are subjected. In this way, the presented panorama can consist of images that are a combination of real time images/state information and static images/state information.
  • System 30 of this invention is thus constructed to simultaneously present custom panoramas on plural different display devices.
  • Each panorama consists of the plural images that are of specific interest to the medical personnel viewing that display device. Images not of interest to a particular individual are not presented on the display he/she views. The individual therefore does not spend time or mental effort discarding unneeded images. Instead, the individual immediately views and obtains a mental impression only of the images of interest.
  • System 30 of this invention is designed so that each individual's panorama definition file 76 is, once created, stored. The need to redefine the panorama a particular individual wants to view each time the system is used by that individual is eliminated.
  • the system 30 of this invention also allows image data to have contrast/chroma characteristics that are preferred by an individual.
  • an individual becomes familiar with seeing particular data presented in the same visual format. This familiarity allows the individual, out of habit, to more rapidly view and mentally process the data contained within the image.
  • bus interfaces connected to bus 31 can be configured to allow the medical and surgical devices to transmit and receive data other than image-defining data.
  • the surgical navigation unit 44 may send signals to the instrument console 40 to cause the deactivation of an attached handpiece. Such commands may be sent when the surgical navigation unit 44 determines that the handpiece is approaching a location on the body of the patient to which the handpiece should not be applied.
  • the system of this invention may be integrated into an operating room without the addition of its own dedicated communications bus and the overhead such a link requires.
  • Panorama generator 60 of this invention is further configured to have a memory, image cache 167, in which data defining the static images that are presented as part of a panorama are stored.
  • the provision of this cache 167 means that data defining these images do not need to be repetitively transmitted from their sources to a particular display 46, 48 or 50 on which the images are to be presented. This feature of the invention minimizes the bandwidth required to ensure that, on each display, the panorama consists of the images desired by the display's viewers.
  • Figure 10 is an overview of an alternative system 302 of this invention.
  • System 302 includes the same bus 31 of the first described version of the invention.
  • the same devices 32, 34, 38, 40, 42 and 44 are present that generate image-defining data.
  • Also present in this version of the invention are the same display devices, monitors 46 and 48 and heads up displays 50 (one shown).
  • a view creator 304 is also part of system 302.
  • the view creator 304 serves as a central unit for creating the bit maps of the panoramas presented on the individual display devices 46, 48 and 50.
  • one or more of the units that generate image-defining data are connected directly to the view creator 304.
  • camera 32, instrument console 40, bio monitor 42 and surgical navigation unit 44 are shown connected directly to view creator 304.
  • Camera 36 and file server 38 transmit their image-defining data to the view creator 304 over bus 31.
  • the direct connections to view creator 304 are understood to be exemplary, not limiting. In alternative configurations of system 302 each device that outputs image- defining data is connected directly to view creator 304.
  • one or more of the display devices are connected directly to the view creator 304.
  • both monitors 46 and 48 have a direct video connection to view creator 304.
  • Only the heads up display 50 receives panorama defining images from the view creator 304 over bus 31. Again, this should not be interpreted as limiting.
  • each of the individual display devices is connected directly to view creator 304.
  • Figures 11A and 11B illustrate the basic components of view creator 304.
  • View creator 304 includes an input/output module 306, a bus interface 308, a panorama constructor 310, an image library 312 and a display library 313. These components are similar to the respective versions of these components of view creator 54.
  • panorama constructor 310 like panorama constructor 70 creates a library 114 of panorama definition files 76.
  • a panorama retriever 314 is also internal to view creator 304. Attached to panorama retriever 314 is a memory known as a panorama cache 316. During system 302 operation, panorama cache 316 stores the panorama definition files 76 of the panoramas that are being built by the view creator 304.
  • Also internal to view creator 304 are one or more image buffers 320, 322 and 324. Each image buffer 320, 322 and 324 stores the image most recently output by one of the image generating devices.
  • image buffer 320 is shown connected directly to bus interface 308. This is to represent that image buffer 320 serves as the temporary storage buffer for images received over bus 31.
  • Image buffers 322 and 324 are diagrammatically shown as being connected to separate sockets integral with view creator 304. This is to represent that each of image buffers 322 and 324 is directly connected to a separate one of the image generating devices.
  • Two panorama generators 328 and 330 are shown as also being integral with view creator 304.
  • Each panorama generator 328 and 330 is capable of generating a bit map representative of one of the panoramas that is to be presented on one of the display devices. While not illustrated, it should be understood that, at a minimum, each panorama generator 328 and 330 contains a scaler, a contrast/chroma converter and an image mapper. These components are functionally identical to the scaler 166, the contrast/chroma converter 168 and the image mapper 172 of panorama generator 60 (Figure 7).
  • Image defining data from any one of the image buffers 320, 322 and 324 are read out to any one of the panorama generators 328 and 330.
  • a multiplexer 332 is shown as being the interface between the image buffers 320, 322 and 324 and the panorama generators 328 and 330.
  • the image defining data from each of the image buffers 320, 322 and 324 is simultaneously read out to both panorama generators 328 and 330.
  • panorama views constructed by the panorama generators 328 and 330 are stored in frame buffers 334, 336 and 338, also part of view creator 304.
  • each of the panorama generators 328 and 330 writes panorama-defining image data into any one of the three frame buffers 334, 336 and 338.
  • a multiplexer 340 routes the panorama-defining data to the appropriate frame buffer 334, 336 or 338.
  • panorama generators 328 and 330 and frame buffers 334, 336 and 338 are constructed so that data from a panorama generator are simultaneously written to plural ones of the frame buffers. This feature is used when system 302 is configured to simultaneously present the same panorama on plural display devices.
  • frame buffers 334 and 336 are shown as being connected to the bus interface 308. This represents that the panorama defining image data stored in these frame buffers 334 and 336 are output to the associated display device over bus 31.
  • Frame buffer 338 is shown as being diagrammatically connected to a socket that is part of view creator 304. This represents that the panorama defining data in buffer 338 are output from view creator 304 directly to a display device.
  • Initial configuration process step 350 involves the image generating units identifying themselves to the panorama constructor 310 so view creator 304 can build the image library 312.
  • this portion of initial configuration step 350 is similar to steps 80-86 executed during implementation of the first system 30 of this invention.
  • this portion of step 350 is the portion in which the image library 312 is built.
  • the display devices identify themselves to panorama constructor 310. Based on these data, panorama constructor 310 builds the display device library 313. This portion of step 350 is similar to steps 88-92 executed by first system 30.
  • panorama constructor 310 builds the panorama definition files 76. The building of each file constitutes an execution of the previously described identify viewer steps 96-110. Once panorama definition files 76 are created, view creator 304 can create on any of the displays the multi-image panorama that has been defined for a particular user.
  • Step 354 represents the initialization of system 302 to present plural individually created panoramas for a particular surgical procedure.
  • Step 354 comprises the identification of all image generating units and display devices connected to the system; either over bus 31 or view creator 304.
  • step 354 is analogous to bus reset step 208 (Figure 8A) of the first version of this invention.
  • a user enters data into view creator 304 through the input/output device to indicate which panorama view definition file 76 contains data indicating the panorama that is to be presented on a particular display device.
  • the panorama retriever 314 retrieves the panorama view definition file 76 from the main memory internal to the view creator 304.
  • each image generator outputs images, step not shown.
  • Each image is written to view creator 304, step 360. More particularly each set of image-defining data are written to a specific image buffer 320, 322 or 324.
  • the panorama generators 328 and 330 first create and then update the panoramas presented on each of the displays.
  • the panorama presented on a particular display is updated each time view creator 304 receives new image data for the panorama.
  • each panorama is periodically updated regardless of whether or not new image data are present.
  • the panorama retriever 314 retrieves from the cache 316 data from the relevant panorama definition file 76. More particularly, the panorama retriever 314 retrieves from the file 76 the data indicating how the particular image is to be integrated into the complete panorama. These data are forwarded to the panorama generator 328 or 330 responsible for adding the image data to the panorama.
  • the plural panorama generators 328 and 330 allow view creator 304 to simultaneously build plural panoramas. In systems 302 wherein data are simultaneously written to the plural panorama generators 328 and 330, these data are simultaneously used to form separate panoramas.
  • In an individual panorama updating step 362, the scaler internal to the panorama generator 328 or 330 resizes the image defining data to fit the area the image will occupy in the assembled panorama.
  • the contrast/chroma converter contrast and color adjusts the image.
  • Panorama updating step 362 concludes with the image mapper loading the resized and contrast/chroma adjusted image data into the appropriate frame buffer 334, 336 or 338. In versions of the invention wherein multiplexer 340 is present, prior to the actual writing of data to the frame buffer, the appropriate panorama generator to frame buffer connection is established.
  • the data are forwarded to the display device, step 364. Again, depending on how the display device is connected to the view creator 304 the data are forwarded to the device either over a dedicated connection or bus 31.
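The sequence just described, resize, contrast/chroma adjust, then map into the frame buffer and forward to the display, can be sketched in miniature. This is a hedged illustration: nearest-neighbor scaling and a simple gain-based contrast adjustment stand in for whatever the scaler and converter hardware actually implement.

```python
def scale_nearest(image, new_w, new_h):
    """Resize a 2-D list of pixel values with nearest-neighbor sampling."""
    old_h, old_w = len(image), len(image[0])
    return [[image[r * old_h // new_h][c * old_w // new_w]
             for c in range(new_w)] for r in range(new_h)]

def adjust_contrast(image, gain):
    """Crude contrast adjustment: scale each pixel and clamp to 0..255."""
    return [[max(0, min(255, int(p * gain))) for p in row] for row in image]

def map_into_frame(frame, image, x, y):
    """Copy the scaled, adjusted image into its area of the panorama frame buffer."""
    for r, row in enumerate(image):
        frame[y + r][x:x + len(row)] = row
    return frame

# A tiny 4x4 frame buffer receiving a 2x2 scaled, contrast-adjusted image at (1, 1).
frame = [[0] * 4 for _ in range(4)]
src = [[100, 200], [50, 150]]
img = adjust_contrast(scale_nearest(src, 2, 2), 1.2)
map_into_frame(frame, img, 1, 1)
```

The same three-stage pipeline repeats for every image contributing to a panorama; only the target rectangle and the adjustment parameters differ.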
  • System 302 of this invention is thus arranged so a single device receives the image defining data, packages the data into the user-defined custom panoramas, and exports the panorama-defining data to the display devices.
  • System 302 can further be configured so that all the image generating devices and display devices are connected directly to the view creator 304.
  • extra hardware does not have to be added to either the image generating devices or the display devices to provide the custom panoramas.
  • the image and panorama defining data need not be exchanged over the bus 31. This eliminates the need to provide a bus with sufficient bandwidth that allows these data to be transferred.
  • some units that generate data that are used to form individual panoramas may generate plural different types of data.
  • a single tool control console may generate data about plural different tools the console is employed to energize.
  • a biomonitor may generate data that indicates both pulse rate and blood oxygen content.
  • a particular individual may only want to view the data regarding the operation of a specific handpiece (tool) or only the blood oxygen content data.
  • In step 102 the user designates more than merely the image generating device from which an image is to be used.
  • the user further designates the particular type of image output by the device that is to be incorporated into the panorama.
  • both frame buffers 176 of the panorama generators 60 or the frame buffers 332, 334 and 336 of view creator 304 may be of the type to which data can be simultaneously written to and read from.
  • panorama defining data are written to the frame buffer while simultaneously other data in the frame buffer are exported to the display driver.
  • view creator 304 is constructed so that each panorama generator is tied directly to a frame buffer 332, 334 or 336. The need to provide a switching unit such as a multiplexer between the individual panorama generators and the frame buffers is eliminated.
  • view creator 304 may have image buffers 320, 322 and 324 for storing image data to which the associated image generating unit can write data while other data are simultaneously read out by one or more panorama generators.
  • the process of selecting the image generator data for inclusion in the panorama includes an additional step.
  • the panorama generator 60, 328 or 330 determines if it is the appropriate time slot for writing the specific image data to the frame buffer.
  • the view creator 56 functions as the isochronous resource manager.
  • The view creator 56, more particularly panorama constructor 70, functions as an isochronous resource client.
  • The view creator, as part of bus reset process 208, determines how many image generating devices connected to bus 31 want to broadcast isochronous image-defining data over the bus, step 382 of the process of Figure 14.
  • If there are P devices wanting to broadcast image-defining data in isochronous data packets, the view creator 56 requests P isochronous channels from the isochronous resource manager.
  • Step 386 represents the response from the isochronous resource manager in which the P channels assigned for image data broadcast are identified.
  • the view creator 56 assigns each one of the image generating devices a particular channel in which it can make its data broadcast. View creator 56 maintains its list of which channels it has assigned each image generator in image library 72.
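The channel bookkeeping in steps 382-386 and the assignment that follows amount to a small allocation table. A sketch, with the isochronous resource manager stubbed out (the reply format and device names are assumptions):

```python
def assign_channels(device_ids, channels):
    """Record which isochronous channel each image generator broadcasts on.

    The resulting map is the sort of record the view creator keeps in
    image library 72."""
    if len(channels) < len(device_ids):
        raise RuntimeError("resource manager granted too few channels")
    return dict(zip(device_ids, channels))

# Stubbed resource manager that grants consecutive channel numbers.
grant_channels = lambda count: list(range(1, count + 1))

devices = ["endoscope-camera", "personal-camera", "navigation-unit"]
table = assign_channels(devices, grant_channels(len(devices)))
```

Each device then broadcasts only on its granted channel, so a panorama generator can identify an image source purely by channel number.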
  • The panorama generator 60 is further configured to monitor the data generated by other devices in the hospital. In response to one or more specific events occurring, "trigger events," the panorama generator 60 reactively causes a panorama with images that are especially useful to view in response to the event to be presented on the associated display 46, 48 or 50.
  • the panorama ID field 120 of the panorama definition file 76a contains one or more panorama state trigger fields 120a.
  • Each state trigger field 120a includes data indicating in response to what trigger event the panorama defined by that specific file 76a should be presented on the associated display.
  • the panorama state trigger field 120a of one of the files 76a contains data indicating that the file 76a contains data for the default panorama, the panorama that is to be presented in the absence of the occurrence of a trigger event.
  • Figure 16 depicts one version of a panorama state trigger field 120a.
  • Field 120a includes event type and event trigger sub-fields 392 and 394, respectively.
  • Event type sub-field 392 contains data generally indicating the type of event that would cause the trigger.
  • Event trigger sub-field 394 indicates the specific parameter of the event type that would be considered a trigger event.
  • one panorama state field 120a can contain data indicating that the event type being monitored is blood pressure.
  • the data in the event trigger sub-field 394 identifies a specific blood pressure level .
  • the data in the panorama state trigger field 120a indicates the trigger event is the occurrence of a specific milestone event during the procedure.
  • One such milestone event is the placement of a surgical instrument at a specific location relative to the surgical site.
  • the indication that this event occurred may be something that happens automatically.
  • the surgical navigation system 44 may detect that the endoscope 34 is within a given distance of the surgical site. Data reporting this event may be broadcast over bus 31.
  • the panorama generator 60 may switch from presenting a panorama that includes small images of the navigation data and the view captured by the endoscope camera 32 to a panorama that includes a larger image of the view captured by the endoscope camera (no navigation image).
  • Another trigger event may be the notice that a particular event has occurred in preparation for another part of the procedure.
  • the entry of data indicating that a particular sized femoral stem implant has been selected for implantation, such as by a scanning wand, can serve as a trigger.
  • the trigger event may be some manually entered notice.
  • Such notice could be as simple as the depression of a control button to indicate that the surgeon wants to switch from viewing a panorama with a first set of defined images to a panorama with a second set of defined images.
  • the panorama state trigger field 120a is loaded with the appropriate data.
  • These data can include in the event type sub-field 392 an indication that the event type is something generally concerned with data output by the surgical navigation unit 44.
  • the event trigger field 394 contains data indicating that the actual triggering event is the message that the endoscope is within a given distance of the surgical site.
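A panorama state trigger field 120a thus pairs an event type (sub-field 392) with a specific triggering condition (sub-field 394). A minimal sketch of checking reported data against such a pair; representing the sub-fields as a tuple containing a predicate is an illustrative assumption:

```python
def is_trigger(event_type, event_value, trigger_field):
    """Compare one reported event against a panorama state trigger field 120a.

    trigger_field pairs the monitored event type (sub-field 392) with a
    test on the event's value (sub-field 394)."""
    monitored_type, predicate = trigger_field
    return event_type == monitored_type and predicate(event_value)

# Example: trigger when the reported blood pressure drops below a set level.
bp_trigger = ("blood-pressure", lambda systolic: systolic < 90)
low_bp = is_trigger("blood-pressure", 85, bp_trigger)
normal_bp = is_trigger("blood-pressure", 120, bp_trigger)
```

A manually entered notice, such as a button press, fits the same pattern: the event type names the control and the predicate is simply "was pressed".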
  • Figure 17 is a flow chart of the process by which the panorama generator 60 reactively causes panoramas composed of different images to be presented on the associated display 46 or 48. Not shown is the initial loading of the plural panorama definition files 76a into the panorama processor 177. As depicted by step 402, initially, the panorama processor causes the default panorama to be presented on the appropriate display. This panorama is generated based on the data contained in the panorama definition file 76a specific to the default panorama. Throughout the procedure, the panorama generator 60 monitors the data other components output that report on the procedure, step 404. These data include data that report on the state of the equipment used during the procedure. Specifically, these data are broadcast over bus 31. Bus interface 164, as part of step 404, forwards these data to the panorama processor 177.
  • In step 406, the panorama processor 177 monitors these data to determine if any of them indicate a trigger event has occurred. This monitoring is performed by comparing the data received over bus 31 with the list of trigger events indicated by the panorama state trigger fields 120a data stored by the processor 177.
  • step 406 in Figure 17 is meant to represent the plural evaluations the panorama processor 177 performs to determine if any one of a number of trigger events have occurred. As long as no such event has occurred, panorama processor 177 causes the other components internal to the panorama generator 60 to present the default panorama.
  • If the comparisons of step 406 indicate that a trigger event has occurred, the panorama generator 60 presents the panorama defined for presentation with that event, step 408.
  • the panorama processor 177 causes the other components of the panorama generator 60 to construct the panorama in accordance with the data in the panorama definition file 76a for the trigger event.
  • the images selected for presentation in the panorama and their sizes, chroma/contrast and positions in the panorama are all based on the data contained in this second file 76a.
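Steps 402 through 408 reduce to a monitor-and-switch loop: present the default panorama, compare incoming reports against the trigger list, and switch panorama definition files when a trigger fires. A compressed sketch, with the event stream and trigger list as illustrative stand-ins:

```python
def select_panorama(events, triggers, default_id):
    """Steps 404-408 in miniature: scan reported events against the trigger
    list; return the ID of the panorama definition file to present."""
    for event_type, value in events:
        for (monitored, predicate), panorama_id in triggers:
            if event_type == monitored and predicate(value):
                return panorama_id   # trigger event: present its panorama
    return default_id                # no trigger: keep the default panorama

# One trigger: an elevated heart rate brings up a vitals-focused panorama
# like that of Figure 18 (threshold and IDs are hypothetical).
triggers = [(("heart-rate", lambda bpm: bpm > 140), "vitals-panorama")]
chosen = select_panorama([("blood-oxygen", 97), ("heart-rate", 150)],
                         triggers, "default-panorama")
```

In the actual system this scan repeats continuously during the procedure rather than running once over a fixed event list.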
  • Figure 18 is an example of one such trigger event panorama.
  • This panorama is similar to the panorama of Figure 5E .
  • this trigger event panorama may be presented when data indicate that the patient's heart rate has become irregular.
  • Instead of containing views of the area on which the procedure is performed, the panorama contains images that report the patient's blood pressure, blood oxygen content, and a graphical indication of the heart rate.
  • Step 414 represents the continued monitoring of events that occur during the procedure to determine whether or not a second trigger event occurs. Thus, step 414 represents the continued reexecution of steps 404 and 406 during the procedure.
  • the second event is an indication that the patient's condition has returned to one in which the default panorama can again be presented.
  • the panorama processor 177 returns to causing the default panorama to be presented on the display, step 402 is reexecuted. This is for example only.
  • the occurrence of the second trigger event causes panorama generator 60 to present a second trigger event-specific panorama on the display 46 or 48.
  • panorama generator 60 is capable of doing more than presenting panoramas that are appropriate to the specific individuals viewing the display 46 or 50 on which the panorama is located.
  • Panorama generator 60 reactively changes the panorama, the collection of images, presented on the display in response to the occurrence of a trigger event. The viewer thus is able to monitor images containing data that, after the event, are most relevant to the event.
  • the instrument console 40 may receive data from an attached handpiece that the handpiece temperature is rising to an unacceptable level.
  • "Unacceptable" means to a temperature level where there is the potential of injury to the patient or the individual holding the instrument. Data regarding this temperature rise are broadcast over the bus 31.
  • the receipt of these data is considered to be a trigger event by the panorama processor 177.
  • the panorama processor 177 at least momentarily causes a panorama to be presented that includes a very large image notifying the medical personnel of the handpiece state. The medical personnel may then be required to enter a command to the panorama generator 60 to cause it to return to presenting the previously requested image.
  • The Firewire link layer functions as the module that selectively filters, passing through data from particular isochronous channels.
  • the link layer can only perform this function if data packets from 2 or less or 4 or less isochronous channels are to be passed through to the software layer. If a particular display is configured to present on its panorama data from more than this select number of generators, all isochronous packets are forwarded by the physical layer and the link layer to the bus interface 164 software layer. The software layer then selectively passes on the image-defining data received over the appropriate channels for further processing and display.
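The software-layer filtering described above can be pictured as a pass-list check on each isochronous packet's channel number. A sketch (the packet representation as a channel/payload pair is an assumption):

```python
def filter_packets(packets, wanted_channels):
    """Software-layer analogue of the link-layer filter: keep only those
    isochronous packets whose channel belongs to this display's panorama."""
    return [payload for channel, payload in packets if channel in wanted_channels]

# A display whose panorama uses the images broadcast on channels 1 and 3.
stream = [(1, "endoscope frame"), (2, "navigation frame"), (3, "biomonitor frame")]
kept = filter_packets(stream, {1, 3})
```

Hardware filtering in the link layer avoids this per-packet software cost, which is why it is preferred when the channel count permits.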
  • this invention does not depend on a specific type of communications link to either the image generating devices or to the display devices.
  • Wireless connections that have sufficient bandwidth, that do not interfere with each other, and that will not be affected by or affect other equipment in the operating room may be employed.
  • the units that generate data that eventually appear as part of a panorama actually generate image-defining data.
  • the devices that generate data displayed as part of the panorama generate the data in the form of raw data.
  • the control console that regulates the motorized tool may generate data packets that simply indicate tool speed.
  • the bio-monitor that monitors blood pressure may simply present these data in a raw data form.
  • These data generators are thus not required to have their output units provided with video image processors.
  • the bandwidth required to transmit raw data is appreciably less than the bandwidth required to transmit a video image containing these data.
  • A panorama generator 60a, partially illustrated in Figure 13, is provided.
  • Panorama generator 60a includes the basic modules of panorama generator 60 described with respect to Figure 7.
  • Panorama generator 60a also includes a video image generator 372.
  • Video image generator 372 is, like scaler 166, connected to receive data signals bus interface 164 selectively forwards from bus 31. More particularly, bus interface 164 forwards to video image generator 372 the raw data broadcast by the data generating devices.
  • Video image generator 372, upon receipt of these data, generates a basic image based on the received raw data.
  • the image data output by video image generator is basically a black on white background alphanumeric or graphic representation of the raw data.
  • Header information may be added to the image based on data contained within the data or instructions from the panorama processor 177. These image signals are then forwarded to the scaler 166 for appropriate resizing so the image will fit in the designated area in the user-defined panorama. Contrast/chroma converter 168 then converts the image into the background and data color pattern designated by the user. The scaled and color converted image of the data are then forwarded to image mapper 172. The image mapper 172 loads the data into the appropriate area within frame buffer 176.
  • video image generator 372 based on instructions received from the panorama processor 177, produces a video image that is appropriately scaled and has the user-designated contrast and chroma characteristics.
  • the image output by this particular video image generator is output directly to the image mapper 172.
  • Just as view creator 304 includes one or more panorama generators 328 and 330 (Figure 11B), there may be one or more video image generators 372.
  • view creator 304 receives the plural raw data streams from the individual data generating units. Since the bandwidth required to transmit these raw data is, in comparison to the bandwidth required to transmit image data, very small, these data may all be transmitted over bus 31.
  • the raw data are forwarded by the bus interface 308 internal to view creator 304 to the one or more video image generators internal to the view creator.
  • the image generator that receives the raw data generates a basic image based on the data. Again, in some, but not all versions of this invention, the image may be simply a black-on-white image.
  • the image is then loaded in the appropriate image buffer 320, 322 and 324. Once the image data are so stored, it is available for use as panorama make up data by the panorama generators 328 and 330.
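The work of video image generator 372, turning raw data into a basic black-on-white alphanumeric image with an optional header, can be approximated with a character-grid rendering. This is a toy sketch: a real generator would rasterize the text into pixels, whereas here each row of characters stands in for one row of the image:

```python
def render_raw_data(header, values):
    """Toy analogue of video image generator 372: turn raw numeric data into
    a black-on-white text 'image' (a list of fixed-width character rows)."""
    lines = [header] + [f"{name}: {value}" for name, value in values]
    width = max(len(line) for line in lines)
    return [line.ljust(width) for line in lines]

# Raw biomonitor data arriving over bus 31 become a simple framed readout
# (the header and parameter names are hypothetical).
image = render_raw_data("BIO-MONITOR", [("pulse", 72), ("SpO2", 98)])
```

The point of the design stands out even in this toy: transmitting the few bytes of raw data and rendering them at the receiver takes far less bus bandwidth than transmitting a video image of the same readout.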
  • step 254 determines if a particular packet of image data should be forwarded for processing or discarded.
  • step 86 is executed so as to build the image library 72.
  • step 92 is executed so that data regarding the device are loaded into the display library 74.
  • The process by which image data are resized, color/chroma adjusted and mapped into the frame buffer may be different from what has been described.
  • Not all versions of the invention include a frame buffer that holds a full frame of panorama-defining data.
  • the buffer may only hold portions of one or more lines of panorama defining data.
  • a process specific to the associated display driver is then employed to ensure the data are properly used to create the desired image.
  • a protocol other than the Firewire protocol may be used to regulate data transfer.


Abstract

A panorama generator (60) for generating a user-defined panorama consisting of a plurality of images generated by different medical and surgical equipment. The panorama generator receives image-defining data from plural devices that generate images. The panorama generator determines if the image-defining data contain data the user wants displayed. If the determination is positive, the panorama generator sizes and positions the data so that the display presents the image in the user-designated location and size on the presented panorama.

Description

MULTI-DISPLAY MEDICAL/SURGICAL
IMAGE AND DATA VIEWER SYSTEM THAT PRESENTS USER-DEFINED, CUSTOM PANORAMAS
FIELD OF THE INVENTION
[0001] This invention is generally related to a system for displaying medical information generated from different sources. More particularly, this invention is related to a system for simultaneously presenting medical information on a number of different displays wherein the information presented on each display is information specifically relevant to the individual(s) viewing that display.
BACKGROUND OF THE INVENTION
[0002] Advances in medical technology have resulted in like increases in the amount of information available to medical and surgical personnel. This information includes images of a surgical site taken prior to a surgical procedure as well as real time images captured during the procedure. Other information is in the form of data such as pulse rate, and blood oxygen content. Still other information is available and best interpreted in graphical format. This latter type of information includes heart rate data and electrical response, such as EEG data. Medical/surgical personnel also have available data about the medical equipment used to perform the procedure on the patient. For example, if the surgical procedure performed requires a motorized handpiece, data about the operating speed of the handpiece is typically available. If suction is applied to the surgical site, data are available regarding the suction level as well as the quantity of material drawn from the surgical site. Equipment that delivers anesthesia similarly provides information about the characteristics of the anesthesia introduced into the patient .
[0003] Traditionally, each type of information is presented on a unique display. If the information is produced in real time, the display is typically integral with the device that produced the information. For example, a unit that monitors the blood oxygen content has its own display that graphically or numerically displays these data. If the unit is a surgical navigation unit, the unit includes a video display on which is presented the position and orientation of the tracked object or instrument. If the information is a retrieved stored image, that image is also presented on a specific display. For example, preoperative X-ray or CAT images of a surgical site are often presented on a light panel or monitor in the operating room.
[0004] Consequently, with the above arrangement, for an individual to review any particular type of information, the individual must move his/her eyes, and often his/her head, to focus on the display on which the information is presented. Having to perform this activity takes time away from other activities. The individual is also required to exert at least some conscious or unconscious effort to focus on the display.
[0005] To minimize the need to physically move one's eyes to a specific display, there are efforts to provide medical/surgical facilities, such as operating rooms, with large display units. Plural different types of information are simultaneously presented on these displays. For example, one quadrant of the display may present an image captured by an endoscopic camera. The quadrant below that quadrant may contain an X-ray image of the surgical site. The quadrant to the side of the one in which the camera image is displayed may present graphical heart beat and brain wave data. The remaining quadrant may contain information that is best presented numerically such as blood oxygen level, motor speed for the handpiece used to perform the procedure, suction level and capsule pressure.
[0006] An advantage of the above presentation is that it substantially eliminates the need for an individual to look from place to place in the operating room in order to find the information of interest. However, when all the information is presented on a single screen, the individual still must spend time focusing his/her eyes on the portion of the screen displaying the information of most immediate interest.
SUMMARY OF THE INVENTION
[0007] This invention is related to a new and useful way of displaying information in a medical/surgical environment such as in an operating room. The system and method of this invention presents each viewer with only the information relevant to that individual.
[0008] One version of the system of this invention includes a data bus. The sources of image-generating data are connected to the data bus. These sources include both medical devices used to monitor the state of the patient and devices used to perform therapeutic procedures on the patient. Other sources connected to the bus include file servers in which information collected earlier for viewing during the procedure are stored.
[0009] Also connected to the data bus are different display devices. These displays include conventional screen-type displays that can be viewed by one or more individuals. Other display devices are individual displays. Often, but not always, these display devices are head or shoulder mounted.
[00010] The system and method of this invention includes a processing unit known as a view creator. The view creator contains configuration data regarding a multi-image panorama that is to be presented on each display. These data indicate for each display: the types of images to be presented; the location on the display at which the image presenting the data is to be presented; and data used for editing the image to ensure that the image is presented as desired by the particular viewer. These configuration data are set by the primary individual(s) who view the display.
[00011] In one version of the invention, each display device has its own panorama generator. When the system is initialized, the view creator sends panorama definition data for a specific display to the panorama generator integral with the display. During the procedure, the data-presenting image from each image generator is output on the bus. The panorama generator integral with each display captures data containing the image to be presented on the display. The panorama generator then loads a frame buffer with image defining data that collectively defines the panorama that is to be presented on the display.
[00012] In an alternative version of the invention, the view creator includes one or more internal panorama generators. The view creator receives the image forming data from each of the image-generating units. Also connected directly to the view creator are the individual display devices.
[00013] In this version of the invention, the panorama generators generate bit maps for each of the display devices. Each bit map contains data that defines the panorama that is to be presented on a specific display device. Once a bit map is created, the view creator forwards the bit map to the correct display device.
BRIEF DESCRIPTION OF THE DRAWINGS
[00014] The invention is pointed out with particularity in the claims. The above and other features and advantages of the invention are described in the following Detailed Description, taken in conjunction with the accompanying drawings, in which:
[00015] Figure 1 is a block diagram of a medical/surgical network such as in an operating room in which the system of this invention may be present;
[00016] Figure 2 is a block diagram of a first view creator of this invention;
[00017] Figure 3 depicts in block form the different types of data that comprise a panorama definition file;
[00018] Figures 4A, 4B and 4C collectively form a flow chart of the process steps executed by the view creator to generate a panorama definition file;
[00019] Figures 5A through 5E sequentially illustrate how a number of images are selected, resized and positioned to form a multi-image panorama that is viewable from a display;
[00020] Figure 6 is a block diagram of an image output module integral with one of the image generating devices of this invention;
[00021] Figure 7 is a block diagram of a panorama generator associated with one of the display devices of this invention;
[00022] Figures 8A through 8F collectively form a flow chart of the process steps executed by the system of this invention in order to create a custom defined panorama on a display device;
[00023] Figure 9 depicts in block form the types of data contained in an amended panorama definition file;
[00024] Figure 10 is a block diagram of the configuration of an alternative system of this invention for presenting custom panoramas on different displays;
[00025] Figures 11A and 11B collectively illustrate in block form the major components of the view creator of the system of Figure 10;
[00026] Figures 12A and 12B collectively form a flow chart of the process steps executed by the second described system of this invention;
[00027] Figure 13 is a block diagram of a number of the modules internal to an alternative panorama generator of this invention; and
[00028] Figure 14 is a flow chart of the process steps executed by the view creator when it emulates the isochronous resource manager;
[00029] Figure 15 is a block form depiction of the data in a panorama definition file that defines a panorama that is reactively presented as the result of the occurrence of a trigger event;
[00030] Figure 16 is a detailed block illustration of the sub-fields in a panorama state trigger field of the file of Figure 15;
[00031] Figure 17 is a flow chart of the process by which the panorama generator, in response to trigger events, reactively causes different panoramas comprising different views to be presented on the display with which the panorama is associated; and
[00032] Figure 18 is an example of an alternative panorama that the panorama generator may cause to be presented on the associated display in response to the occurrence of a trigger event.
Detailed Description
[00033] Figure 1 is a block diagram view of the components of a network forming system 30 of this invention. System 30 includes a bus 31 over which the below-described network components communicate. In some versions of the invention, bus 31 is an optical cable or Firewire bus. Bus 31 of this invention has a bandwidth of at least 1.5 Gbit/sec. This bandwidth is based on a version of the system of this invention streaming full motion video from four distinct sources and graphic data from another four distinct sources, wherein the data output over the bus are compressed prior to transmission. It should be understood the above transmission rate is exemplary, not limiting. In versions of this invention with fewer sources of video signals and/or enhanced data compression, smaller bandwidths may be all that is needed. In versions of the invention with more sources of video, particularly more sources of full motion video, and/or less data compression, larger bandwidths may be required.
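The bandwidth figure above can be sanity-checked with rough arithmetic. All the numbers below (resolution, frame rate, bit depth, compression ratio, graphic-stream rate) are illustrative assumptions rather than values from this specification; they simply show that four compressed full-motion streams plus four graphic streams fit within a 1.5 Gbit/sec bus with headroom:

```python
# Rough estimate of aggregate bus load (all figures are assumptions).
width, height, bits_per_pixel, fps = 1280, 720, 24, 30
raw_stream_bps = width * height * bits_per_pixel * fps  # one uncompressed video stream
compression_ratio = 5                                   # assumed modest compression
video_bps = 4 * raw_stream_bps / compression_ratio      # four full-motion sources
graphics_bps = 4 * 10e6                                 # four graphic sources at ~10 Mbit/s
total_gbit_per_sec = (video_bps + graphics_bps) / 1e9   # well under 1.5 Gbit/sec
```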
[00034] Connected to bus 31 are a number of devices that output image-forming data. These include devices that output images captured in real time in the operating room. An example of such a device is a camera 32 connected to an endoscope 34. Camera 32 captures the image at the surgical site that is viewed through the endoscope 34. Still another type of device that captures an image in real time is a personal perspective camera 36. Personal perspective camera 36 is worn as a head mounted unit by the surgeon. Other individuals in the operating room find the view captured by the personal perspective camera 36 useful so they can have a perspective, a view, of the surgical site identical to that of the surgeon.
[00035] A file server 38 is another device that outputs image-forming data. File server 38 outputs images of the data stored prior to the start of the surgical procedure. Examples of such images are X-rays, CAT scans or data regarding other diagnostic tests or therapeutic procedures performed before the present procedure. It should be appreciated that file server 38 may not be located in the operating room but in another location in the hospital or even at a more remote location. Bus 31 thus may not be connected directly to file server 38. Instead, bus 31 is connected to a head, a hub, a gateway, or a relay unit to another network such as a hospital Intranet network, the Internet or the commercial telephone network. File server 38 is connected to bus 31 over this head and the additional network.
[00036] Additional data in image form come from surgical instruments used to perform the procedure on the patient and to monitor the patient's vital signs. In Figure 1, a single box labeled INSTRUMENT CONSOLE 40 represents the different types of devices that output image-forming data regarding their operating states and use history. For example, a powered tool console generates data regarding the operating speed of the motor in the handpiece the console is used to energize. A console used to output an RF signal will output data regarding the frequency and the power level of the RF signal applied to the patient and the temperature of the tissue at the site where the RF signal is applied. Equipment used to deliver anesthesia outputs data regarding the quantity of anesthesia and the proportion of anesthesia to carrier media. A suction unit outputs data regarding the level of the suction that is drawn as well as the quantity of material withdrawn from the patient. An irrigation unit supplies data regarding the flow rate of irrigation fluid to the surgical site, the overall amount of fluid supplied and, in some instances, the fluid pressure at the surgical site. Again, it should be understood that this list is illustrative and not intended to limit the types of devices that may generate data.
[00037] In Figure 1, a single box labeled BIO-MONITOR 42 represents the different types of devices that output image-forming data regarding the patient's condition. These devices include units that monitor blood pressure, heart rate, brain activity and blood oxygen content. Both instrument console 40 and bio-monitor 42 output these data as image-forming data over bus 31. As before, it should be understood that this invention is not limited to the described monitors.
[00038] An additional unit attached to bus 31 is a surgical navigation unit 44. Surgical navigation unit 44 tracks the position and orientation of surgical devices, including instruments and implants, relative to the surgical site. Based on these data and a stored image of the surgical site, the surgical navigation unit 44 presents an image of the positions and orientations of the surgical devices relative to the surgical site.
[00039] It should be understood that the above-described image generating devices are only a partial list of the devices that may output data surgical personnel may want to view during a procedure. Therefore, this list should be recognized as exemplary, not limiting.
[00040] Also connected to bus 31 are plural different display devices. In Figure 1, two display monitors 46 and 48 are shown. Monitors 46 and 48 each include a screen that can be viewed by two or more individuals. The size and placement of monitors 46 and 48 are not relevant to the scope of this invention. For example, both monitors may be small screen (approximately 19 inch diagonal). Alternatively, one or both of the monitors may be large screen (36 inches or more diagonal measurement). Other display devices are designed to be worn by, and present images that are viewed by, a single individual. These units are often mounted to a helmet, eyeglass frames or other structural support member worn by the user. Some of these units are small video monitors. Still other units project images onto a user-worn screen, such as a personal protection face shield. Another class of individual unit is able to project an image directly onto the iris of the user. Frequently, these display devices are referred to as heads-up displays. In Figure 1, a single heads-up display 50 is shown diagrammatically.
[00041] Also attached to bus 31 is a view creator 56. View creator 56 stores data used to define the collection of images that form the single multi-image panorama that is to be presented on each of the display monitors 46 and 48 and the individual heads-up displays 50.
[00042] In one version of the invention it is contemplated that data signals are transmitted over bus 31 using FireWire (IEEE 1394) standards. Integral with each device that outputs image-defining data is an image output module 58. (For simplicity, only the image output module 58 internal to the surgical navigation unit 44 is shown.) Image output module 58 receives the image-defining data from the image generating device to which it is attached. Image output module 58 places these data into packets for transmission over bus 31. Image output module 58 then outputs these data as isochronous data packets on bus 31.
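The packetization performed by an image output module might be sketched as below. The payload size and channel numbering are illustrative assumptions; actual IEEE 1394 isochronous packet formats carry additional header fields not modeled here.

```python
# Hypothetical sketch of an image output module (module 58): one frame of
# image-defining data is split into fixed-size payloads, each tagged with the
# isochronous channel assigned to this image source. The 1024-byte payload
# size and the channel number are illustrative assumptions.

PAYLOAD_BYTES = 1024

def packetize(frame_bytes, channel):
    """Split one frame into (channel, sequence, payload) packets for the bus."""
    packets = []
    for seq, start in enumerate(range(0, len(frame_bytes), PAYLOAD_BYTES)):
        packets.append((channel, seq, frame_bytes[start:start + PAYLOAD_BYTES]))
    return packets

frame = bytes(2500)          # a stand-in for one compressed video frame
pkts = packetize(frame, channel=3)
print(len(pkts))             # 2500 bytes split into three packets on channel 3
```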
[00043] Integral with each display unit is a panorama generator 60. (For simplicity, only the panorama generator 60 integral with monitor 48 is shown.) As described in detail below, panorama generator 60 captures selected ones of the image-forming data packets broadcast over bus 31. Panorama generator 60 packages the plural captured images into data forming a single panorama. The panorama-defining data are supplied to the display so that the display presents the user-designed panorama of images.
[00044] View creator 56 can be constructed out of a general purpose processing unit. A personal computer can function as the view creator 56. While not explicitly illustrated, it should be appreciated that the view creator 56, in addition to a processor and memory, includes input and output devices. A conventional monitor can function as the output device. A keyboard and mouse can function as input devices. In some versions of the invention, a touch screen display functions as a combined input-output device.
[00045] Figure 2 illustrates in block form the basic software modules executed on the view creator 56 and the basic data files in the view creator memory. Also shown in Figure 2 are the modules that control the reading of data into and writing of data out of the view creator 56. A first module is input/output module 66. The input/output module 66 includes the drivers that, in response to the data output by the other modules, cause the appropriate image to be presented on the monitor integral with the view creator 56. Input/output module 66 also includes the interpreters that, in response to the commands entered through the input devices, generate instructions to the other modules. A bus interface 68 is also internal to the view creator 56. Bus interface 68 is the module internal to the view creator that writes data out over bus 31 and also receives and forwards the data from the bus written to the view creator 56. [00046] Panorama constructor 70 is a specific processing module internal to the view creator 56. The view constructor 70, in response to user-entered commands, generates data that define the panorama that is to be presented on the particular display device that will be viewed by the user. View constructor 70 constructs the panorama based on reference data stored in an image library 72 and a display library 74, both internal to the memory integral with the view creator 56. Image library 72 identifies the sources of images, the devices that output images, from which the panorama can be created. Thus, image library 72 contains data indicating that the user can select one or more of the following types of images as part of the panorama: endoscopic camera image; surgeon perspective image; navigation image; stored X-ray image; blood pressure data; capsule pressure; blood oxygen content; suction level; handpiece speed; liters of fluid collected; and time since start of procedure.
[00047] Display library 74 identifies the types of display devices that are attached to the bus 31.
[00048] In response to the user-entered commands, the view constructor 70 builds a panorama definition file 76, seen in Figure 3. The process by which the view constructor 70 builds a panorama definition file 76 is now described with reference to the flow chart of Figures 4A-4C. In an initial step 80, the view constructor 70 polls the other devices connected to bus 31. Specifically, the view constructor 70 requests each device to indicate whether or not the device outputs image-forming data and, if so, the type of data. [00049] Step 82 represents the report from each device regarding whether or not it is an image-forming data generator and the type of the image-forming data it outputs. Based on these data, in a step 84, the view creator determines if a specific device generates image-forming data. If the evaluation tests positive, in a step 86, the view creator builds the image library 72. Specifically, the view creator establishes in library 72 a record for each device that can generate image-forming data indicating the type of data available and the one or more types of images available from the device. Not shown is the loop back from step 86 for each of the additional devices attached to bus 31. [00050] The loop backs from steps 84 and 86 represent that at least step 84 is executed for every device that identifies itself to view constructor 70. Not shown is the decision step indicating that once all devices have been identified and the contents of the image library 72 constructed, other processes are executed by the view constructor 70.
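Steps 80 through 86 can be sketched as a polling loop that records only the image-generating devices. The shape of the per-device response is an assumption made for illustration.

```python
# Sketch of steps 80-86: poll each bus device and, for devices reporting
# image-forming data, record the device and its image types in the image
# library (library 72). The response dictionary format is an assumption.

def build_image_library(poll_responses):
    """poll_responses: list of dicts like
    {"device": ..., "generates_images": bool, "image_types": [...]}."""
    library = {}
    for resp in poll_responses:
        if resp.get("generates_images"):                   # step 84: evaluation
            library[resp["device"]] = resp["image_types"]  # step 86: record
    return library

responses = [
    {"device": "endoscope camera 32", "generates_images": True,
     "image_types": ["endoscopic camera image"]},
    {"device": "monitor 46", "generates_images": False, "image_types": []},
]
print(build_image_library(responses))
```

Display devices such as monitor 46 respond to the poll but contribute no record; they are catalogued separately in the display library built in steps 88 through 92.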
[00051] In a step 88, the view constructor polls the other devices attached to bus 31 to determine if any of the devices are display devices. Step 89 represents the response from a device indicating that it is a display device. As part of step 89, the responding display device provides data indicating the type of device it is. These data indicate the type of view that is created by the device. For example, it may be possible to present on a screen display a number of individual images that collectively define the panorama. A heads-up display device may only be able to produce a panorama that consists of a single image.
[00052] Based on the data received from the responding device in step 89, in a step 90 view constructor 70 decides if the device is a display device. If this evaluation tests positive, in a step 92, the view constructor generates the display library 74. Each record in the display library describes the characteristics of a particular display device. Display device characteristics, it is appreciated, include the size of the panorama produced by the device or, in the case of some heads-up displays, the fact that the device can only present a panorama consisting of a single image. Not shown in the Figures is the loop back after step 92 is executed for a particular display device or the decision step indicating that all potential display devices connected to bus 31 have been identified.
[00053] Once image library 72 and display library 74 are constructed, view constructor 70 is used to build the individual panorama definition files 76. This process starts in step 96 with view constructor 70 requesting the individual building the file to identify the file. Thus, the individual is asked to identify: the specific individual for whom the panorama is to be presented; the type of display unit (main display, anesthesia display, surgeon heads-up display) on which the constructed panorama is to be displayed; and the specific procedure for which the panorama is to be displayed. The list of potential displays is based on the list of displays in the display library 74. [00054] In response to the entry of these data, a step 98 is executed in which a panorama identification field 120 is created for the panorama definition file. Panorama identification field 120, it should be appreciated, contains a number of sub-fields. Thus, there are sub-fields for indicating the specific individual for whom the panorama will be created, the display device on which the panorama will be presented and the procedure for which the panorama is to be presented.
[00055] View constructor 70 then prompts the individual to indicate what image-forming data are to be used to form the panorama under construction. In a step 102, the individual is asked to indicate a first image (IMAGE N in Figure 4B) that is to form part of the displayed panorama. Specifically, in step 102 the view constructor 70 indicates the choices of images available for presentation based on the data in image library 72. The user then selects a first image for viewing. In a step 104, data indicating this image are written to an IMAGE 1 TYPE field 121 of the panorama definition file.
[00056] In a series of steps represented as a single image editing step 106, the user then edits how the selected image is to appear as part of the overall panorama presented on the display. As part of this process, either on the display itself or on the input/output display integral with the view creator 56, a representation of the image is presented. If this is the first image component of the panorama, the image may occupy the whole of the virtual representation of the display screen. This is represented by Figure 5A. Here, the first image selected for incorporation into a particular panorama is the image available from the surgical navigation unit 44. Identification number 105 represents the frame of the display device on which the panorama under construction is to be presented. This frame may be the actual physical frame or a symbolic representation.
[00057] In step 106, using controls such as keyboard buttons or a mouse, the user sizes the image so it has the size the user wants to occupy in the display. For example, the user may want the particular image to occupy the whole of the displayed panorama, half the panorama or a smaller fraction of the overall area of the panorama. This is represented by the reduction of size of the image from Figure 5A to Figure 5B.
[00058] Also as part of image editing step 106, the user positions the resized image so that it occupies the position on the display preferred by the user. In Figures 5B and 5C this is represented by the repositioning of the resized image from the upper left corner to the lower right corner of the display.
[00059] As part of the image editing step 106, the user adjusts how the image is presented to his/her personal preferences. With regard to a video image, this may involve setting the contrast, chroma or brightness of the image to that preferred by the user. If the image is graphic or alphanumeric in nature, the user selects both the color of the background in which the image is presented as well as the color of the image itself. Thus a particular anesthesiologist may want to simultaneously display blood pressure levels as black numbers on a medium blue background and blood oxygen content levels as red numbers on a light blue background.
[00060] As a consequence of the execution of image editing step 106, in a write image characteristics step 108, the view constructor writes additional data to the panorama definition file 76 that describe how the particular image is to appear as part of the overall panorama presented on the display. Thus, in step 108, both an image size field 122 and an image screen location field 124 are established in the panorama definition file 76. Image size field 122 contains data indicating the size of the particular image within the complete panorama. In some versions of the invention, the image size field contains an indication, by pixel length and pixel width, of the overall size of the image within the panorama. Image screen location field 124 contains data indicating where, on the display, the image should be presented. In some versions of the invention, the image screen location field contains data indicating, from a reference pixel location, the location of the starting pixel of the image. [00061] Also in the write image characteristics step 108, an image configuration field 126 is created for the image. Data in the image configuration field 126 indicate how the image is to be custom presented based on the user-input preferences.
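One record of the panorama definition file might be modeled as below. The field names mirror the description above; the concrete pixel values and color strings are illustrative assumptions, not values from the specification.

```python
# Illustrative record for one panorama-forming image in a panorama definition
# file (file 76): image type (field 121), pixel size (field 122), starting
# pixel location (field 124) and user-preference configuration (field 126).
# All concrete values are assumptions for demonstration.

def make_image_record(image_type, width_px, height_px, start_x, start_y,
                      configuration):
    return {
        "image_type": image_type,           # field 121
        "size": (width_px, height_px),      # field 122: pixel length and width
        "location": (start_x, start_y),     # field 124: starting pixel offset
        "configuration": configuration,     # field 126: presentation preferences
    }

# An anesthesiologist's blood-pressure readout: black numbers on medium blue.
record = make_image_record(
    "blood pressure data", 320, 240, 640, 480,
    {"foreground": "black", "background": "medium blue"})
print(record["size"], record["location"])
```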
[00062] In Figure 3, an image sequence field 128 is shown as part of the image characteristics fields for each image. This is because some displays may be configured so that the plural images that comprise the panorama are presented sequentially. When these plural images are presented on a large screen type display, they may be presented consecutively on a single portion of the display. On a heads-up display on which it is possible to present only a single image, the images are presented sequentially. [00063] Therefore, as part of the image editing step 106, the user may indicate the sequence in which the panorama-forming images are to be displayed as well as the amount of time each image is presented as part of the panorama. Then, in write image characteristics step 108, the view creator creates or edits the image sequence field 128 for the particular panorama-forming image. Image sequence field 128 may include a number of sub-fields. Into these sub-fields are written data indicating when in the sequence of images this particular image is to be presented and the overall length of time the image is to be presented as part of the viewable panorama.
[00064] Once the user selects the first panorama-forming image, the user selects a second image that is to form part of the panorama. Thus, step 102 is reexecuted. Steps 106 and 108 are then reexecuted. In Figure 4C, the reexecution of these steps is shown as a loop back from decision step 110. The initial reexecution of step 106 is seen in the differences between Figures 5C and 5D. Here, Figure 5D is seen as including data from the unit that is used to suction fluid away from the surgical site. Once this newly-selected image is so presented, the user completes the image editing step 106 as before. The view constructor 70, in a reexecution of step 108, creates the new image characteristics fields for this subsequent panorama-forming image. In Figure 5D the custom characterization of the image from the suction unit is seen in that the numeric data presented as part of the image are presented in two different colors. Here, the suction rate data are presented in blue; the fluid withdrawn data are presented in green.
[00065] Eventually, the panorama is completely defined. Thus, there are no more images to be added to the display. In Figure 4C, this is represented by an affirmative answer to decision step 110. Figure 5E represents what the particular user wants as his/her custom panorama. In addition to the navigation image and suction data image, this particular user wants to see, in the lower left quadrant of the panorama, an X-ray retrieved from the file server 38. In the upper left quadrant of the panorama, patient identifying data are presented. On the left side of the display, between the patient identifying data and the X-ray image, data regarding the time since anesthesia was first introduced are presented. These data come from one of the instrument consoles 40 attached to bus 31. It is further observed that the user has customized the panorama so that the duration-of-anesthesia information is presented as red graphics on a brown background.
[00066] Once this point in the process is reached, in a step 112, the newly created panorama definition file 76 is loaded into a library 114 of panorama definition files maintained in the memory internal to the view creator 56. Thus, once a particular individual's panorama definition file 76 is created, it is available for later use. The need to reconfigure a display so that it presents the panorama required by a particular individual each time the system is used is eliminated.
[00067] Another module internal to view creator 56 is the panorama retrieval module 130. As discussed below, the panorama retrieval module 130, based on user-entered commands, selectively forwards panorama definition files 76 to the display devices.
[00068] As mentioned above, an output module 58 is associated with each image producing device connected to the bus 31. As seen in Figure 6, the output module includes a video signal processor 140 and a bus interface 142. Video signal processor 140 receives the video signals output by the image generating device. The video signal processor 140 converts the image data so that they fit in a common sized area. Thus, for example, in some versions of the invention, the video signal processor converts the data into a form such that, regardless of the size of the screen the image from the generating unit would occupy, the data would completely fill a screen with a diagonal size of 18 inches and a 4:3 aspect ratio.
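The conversion to a common-sized area can be sketched as a resampling step. Nearest-neighbor resampling and the 640x480 target resolution are illustrative choices; the specification only requires that every source fill the same 4:3 frame.

```python
# Sketch of the video signal processor's resizing step: every source image is
# resampled to one common 4:3 frame before broadcast, so downstream panorama
# generators receive uniformly sized data. Nearest-neighbor resampling and
# the 640x480 target are assumptions for illustration.

TARGET_W, TARGET_H = 640, 480    # common 4:3 frame (assumed resolution)

def resize_nearest(pixels, src_w, src_h):
    """pixels: row-major list of length src_w*src_h; returns TARGET_W*TARGET_H."""
    out = []
    for y in range(TARGET_H):
        sy = y * src_h // TARGET_H          # nearest source row
        for x in range(TARGET_W):
            sx = x * src_w // TARGET_W      # nearest source column
            out.append(pixels[sy * src_w + sx])
    return out

small = list(range(16 * 12))                 # a tiny 16x12 source image
print(len(resize_nearest(small, 16, 12)))    # every source yields 307200 values
```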
[00069] The video data generated by processor 140 are forwarded to the bus interface 142. The bus interface outputs the data on bus 31 in the form of isochronous data packets. Bus interface 142, and in some versions of the invention processor 140, also compresses the digitized image data prior to their transmission over bus 31.
[00070] It should be appreciated that, in some versions of the invention, the function performed by the video signal processor 140 is performed by a processor internal to the image generating device. [00071] As discussed above, a panorama generator 60, now described by reference to Figure 7, is associated with each display device. Specifically, the panorama generator is located between bus 31 and the display driver 162 of the display device to which the panorama generator is attached. The display driver 162 is the unit integral with the display device that receives or retrieves signals that define the panorama to be presented on the display and converts them into video-image forming signals that cause the display creating components of the display to present the panorama on the screen or other viewing element integral with the display.
[00072] Panorama generator 60 includes a bus interface 164. Bus interface 164 is the component internal to the panorama generator that reads data from and outputs data over bus 31. If communication over bus 31 is in accordance with the FireWire protocol or a similar protocol, bus interface 164, it is appreciated, receives data packets that are output on bus 31 as a consequence of both asynchronous and isochronous transfers. An asynchronous transfer is a data transfer over bus 31 to a designated receiver. Typically, a successful asynchronous data transfer is followed by an acknowledgement from the recipient. An isochronous transfer is a transfer in which the object of regular data delivery outweighs the need to receive a confirmation of data delivery. Also, in an isochronous transfer, data may be transmitted, broadcast, to plural receivers. The image-defining data produced by the image generating devices are broadcast in isochronous transfers over bus 31.
[00073] Image cache 167 is a memory module internal to panorama generator 60. The image cache 167 serves as the memory wherein some of the image-defining data are stored prior to being forwarded to the associated display. Specifically, the image cache is the memory wherein image-defining data that are not streamed in real time are stored for presentation on the associated display 46, 48 or 50. Examples of such data are medical image data (X-ray, CAT or MRI scans) that are retrieved from the hospital file server 38. These image-defining data are loaded into the image cache 167 through the bus interface 164. [00074] It should be appreciated that the panorama generator 60 may be configured to present on the associated display a panorama that consists of plural images that are not generated in real time. In some versions of the invention, the image cache 167 is of such a size that it can simultaneously store plural different sets of image-defining data. In other versions of the invention, panorama generator 60 includes plural image caches 167. Each image cache 167 functions as the memory in which image-defining data for a separate image are stored for an extended period of time.
[00075] A scaler 166 is also part of panorama generator 60. The scaler 166 is the initial recipient of the image-defining data captured by the bus interface 164 and forwarded down line in the panorama generator. Scaler 166 resizes the image-defining data so that the image is the right size, the appropriate number of pixels by length and width, for the panorama that will be presented. Typically, but not always, the scaler 166 compresses the image-defining data. [00076] Scaler 166 outputs the scaled image-defining data to a contrast/chroma converter 168. The contrast/chroma converter 168 adjusts the individual pixel-defining data that comprise the image-defining data so that the image has the appropriate appearance. [00077] The scaled and contrast/chroma adjusted image-defining data are output from converter 168 to an image mapper 172. The image mapper 172 loads the individual pixel-defining data into a frame buffer 176. Frame buffer 176 is a memory also internal to the panorama generator 60. The data locations in frame buffer 176 correspond to individual locations on the screen (monitor) of the display device at which the image defined by the data is to appear. Image mapper 172 loads each set of data that defines a particular image that contributes to the overall panorama into the locations internal to the frame buffer that correspond to where, on the display screen, the image is to appear. Thus, the data stored in the frame buffer 176 can be considered the panorama-defining data.
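The image mapper's placement of each scaled image into its frame-buffer region can be sketched as a blit operation. The toy frame-buffer dimensions and placement coordinates below are assumptions for illustration.

```python
# Sketch of image mapper 172: each scaled image is copied into the region of
# the frame buffer (buffer 176) whose locations correspond to where that
# image appears on screen, so the buffer as a whole holds the
# panorama-defining data. Dimensions and coordinates are illustrative.

FB_W, FB_H = 8, 6                       # toy frame buffer (real ones are larger)

def blit(frame_buffer, image, img_w, img_h, start_x, start_y):
    """Copy a row-major image into the frame buffer at (start_x, start_y)."""
    for y in range(img_h):
        for x in range(img_w):
            frame_buffer[(start_y + y) * FB_W + (start_x + x)] = \
                image[y * img_w + x]

fb = [0] * (FB_W * FB_H)
# Place a 2x2 image in the lower-right corner, as in Figure 5C.
blit(fb, [1, 1, 1, 1], 2, 2, start_x=6, start_y=4)
print(fb[4 * FB_W + 6], fb[5 * FB_W + 7])
```

Repeating the blit for each panorama-forming image fills the buffer with the complete panorama for the display driver to retrieve.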
[00078] Display driver 162 is connected to the frame buffer 176. The display driver 162 retrieves the panorama-defining data. These data are the data used to regulate the operation of the display forming elements of the display that cause the selected panorama to be presented on the display screen.
[00079] In some versions of the invention, scaler 166, contrast/chroma converter 168 and image mapper 172 are distinct components or sub-circuits. In alternative versions of the invention, a DSP is part of the panorama generator 60. The DSP performs the scaling and contrast/chroma adjustment of the image-defining data. The DSP also loads the image-defining data into the frame buffer. In these versions of the invention, the scaler 166, the contrast/chroma converter 168 and the image mapper 172 are each firmware modules that are continually executed by the DSP.
[00080] Also integral with the panorama generator 60 is a panorama processor 177. The panorama processor 177, in response to instruction data received over bus 31, regulates operation of the scaler 166, the contrast/chroma converter 168 and the image mapper 172. Panorama processor 177 is also connected to image cache 167. Specifically, the panorama processor 177 asserts a read out signal to image cache 167 that causes the cache 167 to read out to the scaler 166 the stored image-defining data. In versions of the invention wherein a DSP internal to the panorama generator processes the image-defining data, software executed by the DSP may further perform the function of the panorama processor 177.
[00081] Operation of system 30 of this invention in order to present user-defined custom panoramas on displays 46 and 48 and the heads-up displays 50 is now explained by reference to the flow chart of Figures 8A through 8F. Initially, in a step 202, all the image sources are connected to bus 31. In a step 204, all the display devices, the monitors 46 and 48 and the heads-up displays 50, are connected to the bus 31. In a step 206, the view creator 56 is connected to the bus. Upon the connection of the view creator 56 to the bus, the bus undergoes a reset process, step 208. As a part of this process, the bus interface 68 internal to the view creator at least emulates the function of an isochronous resource manager, at least for the image-defining data. This means that bus interface 68 maintains the list of channels (time slots) in which each image generating device can output isochronous data packets over the bus 31. This list is then stored in a sub-directory in the image library 72. [00082] Also as part of the reset process of step 208, the image library and display library are updated; steps 80-92 are reexecuted. Thus, at the start of the procedure, the view creator 56 stores data that identify the image generating devices and the display devices currently connected to the bus 31.
[00083] System 30 is now ready for configuration for the current procedure. In a step 210, based on prompts generated by the panorama retrieval module 130, an indication of the user, device and/or procedure is entered. In a step 212, the panorama retrieval module 130 reviews the panorama definition files 76 to determine if a panorama has been predefined for the particular user, device and/or procedure. If a panorama has been so defined, in a step 214, it is retrieved from the library. Alternatively, if a panorama has not been defined, in a step 216, the panorama retrieval module calls the view constructor 70 for execution. Thus, the individual is invited to create a new panorama definition file 76 based on the connected image generating devices and the available display devices. [00084] In a step 218, the panorama retrieval module 130 compares the list of requested images in the retrieved file to the list of presently available images as maintained in the image library 72. If the data in image library 72 indicate that all necessary image generating devices are connected to the bus 31, panorama retrieval module 130 proceeds to execute step 220. In step 220, module 130, by reference to the display library, verifies that the appropriate display device as specified in the retrieved panorama definition file 76 is also connected to bus 31.
[00085] If in step 218 it is determined that one or more of the desired image generating devices are not available or in step 220 the specified display device is not available, panorama retrieval module 130 executes step 222. In step 222 the panorama retrieval module sends a message to the user. The message reports the unavailability of the requested images and/or the specified display device. In a step 224, the panorama retrieval module 130, by calling the view constructor 70 for execution, allows the user to edit the previously defined panorama to adjust for the unavailability of one or more of the requested images and/or the requested device.
[00086] After receipt of a positive response to the evaluation of step 220, the completion of step 216 or the completion of step 224, panorama retrieval module 130 executes step 228. In step 228, module 130 generates an amended panorama definition file 230, seen in Figure 9, for each display device. Each amended panorama definition file 230 is based on the panorama definition file 76, with data describing the panorama to be presented on the display device. However, the amended panorama definition file 230 does not contain the panorama identification data since these data are not needed by the individual panorama generators. Also, the amended panorama definition files 230 do not contain the image type fields 121. Instead, the view constructor substitutes channel identification data, in a channel identification field 232, for the image type data. The channel identification data, it should be understood, identify the isochronous channel from which the selected image-defining data can be obtained.
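The amendment of step 228 can be sketched as a field substitution. The dictionary structure and the channel assignment table are assumptions made for illustration; only the substitution itself, channel ID for image type, with the identification data dropped, comes from the description above.

```python
# Sketch of step 228: the amended panorama definition file (file 230) keeps
# each image's layout data but replaces the image-type field with the
# isochronous channel (field 232) that carries that image, and omits the
# identification data the panorama generators do not need. The channel
# assignment table is an illustrative assumption.

def amend_definition(definition, channel_by_type):
    amended = []
    for img in definition["images"]:
        entry = dict(img)
        entry["channel"] = channel_by_type[entry.pop("image_type")]  # field 232
        amended.append(entry)
    return {"images": amended}          # identification data omitted

original = {"identification": {"user": "surgeon"},
            "images": [{"image_type": "endoscopic camera image",
                        "size": (320, 240)}]}
print(amend_definition(original, {"endoscopic camera image": 3}))
```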
[00087] Once the amended panorama definition file 230 is created, in step 238 the view constructor forwards the file over bus 31 to the panorama generator 60 integral with the display device that will produce the image. Typically, this transfer is an asynchronous data transmission. [00088] Step 240 represents the storage of the amended panorama definition file 230 by the panorama generator 60 and, more particularly, the panorama processor 177. In a step 242, the panorama processor 177 configures the panorama generator 60 for operation. As part of step 242, the panorama processor 177, based on the data in the channel identification fields 232, informs the bus interface which channels of isochronous data contain image-defining data that contribute to the panorama that is to be constructed. [00089] Also as part of the panorama generator configuration, the panorama generator 60 obtains image-defining data that are not streamed in real time, the data that define the "static" image(s) that is (are) to be presented as part of the panorama. This process can be considered part of panorama generator configuration step 242. However, for purposes of illustration, this sub-process is presented as a series of separate steps.
[00090] An initial step in this sub-process is the generation of the request for the data that define the static image, step 243. In step 243, the panorama processor 177 requests these data from the device in which they are stored. Thus, panorama processor 177 outputs a request through bus interface 164 and bus 31 to the file server 38 that specific image-defining data such as an X-RAY or CAT scan image be written out to the panorama processor. File server 38, if it does not serve as the storage device in which the requested image-defining data are stored, contains a directory indicating where the file server 38 can obtain the data. File server 38 may then retrieve the data itself. Alternatively, as part of step 243, the file server returns to panorama processor 177 data identifying the device on which the requested data are stored. Panorama processor 177 then, as another sub-step of step 243, generates a write request to the actual device on which the data are stored. [00091] Step 244 represents the receipt of the static image-defining data by the panorama generator 60. As part of step 244 these data are stored in the image cache 167. [00092] Once step 244 is executed the static image-defining data are then available to be continually presented on the display 46, 48 or 50 throughout the course of the procedure. As represented by step 245, during the procedure, panorama processor 177 periodically determines if these data need to be supplied to the display 46, 48 or 50 as part of the process of refreshing the presented panorama. When it is time to so forward the data, in a step 246, the panorama processor 177 generates a write instruction to the image cache 167. In response to this instruction, the image cache writes out the requested data to the display. In some versions of the invention, the data are actually written out to the scaler 166.
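The request-and-redirect flow of step 243 can be sketched as follows. The class and method names are purely illustrative stand-ins for the file server 38 and the other storage devices; the patent defines no such interfaces:

```python
# Hypothetical sketch of steps 243-244: the panorama processor asks
# the file server for a static image; the server either returns the
# data or identifies the device that actually stores it, and a second
# request is made to that device. All names are illustrative.

class StorageDevice:
    def __init__(self, local_files, directory):
        self.local_files = local_files  # images stored on this device
        self.directory = directory      # image name -> storing device

    def request(self, name):
        if name in self.local_files:
            return ("data", self.local_files[name])
        # Device does not store the data itself; redirect the request.
        return ("redirect", self.directory[name])

def fetch_static_image(name, file_server, devices):
    kind, payload = file_server.request(name)
    if kind == "data":
        return payload
    # Sub-step of 243: generate a request to the actual storage device.
    return devices[payload].request(name)[1]

xray_device = StorageDevice({"xray_01": b"<xray bytes>"}, {})
server = StorageDevice({"cat_scan_02": b"<cat bytes>"},
                       {"xray_01": "xray_device"})
devices = {"xray_device": xray_device}

image_cache = {}  # step 244: received data are stored in the cache
image_cache["xray_01"] = fetch_static_image("xray_01", server, devices)
```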
Scaler 166, chroma/contrast converter 168 and image mapper 172 then process these data so they can be written into the frame buffer 176 in the manner in which the other image-defining data, the streamed image-defining data, are written into the frame buffer 176. [00093] Upon execution of step 242, system 30 of this invention is ready for use. Step 248 represents the generation of image-defining data by a device that generates images. These data are forwarded to the image output module 58 to which the image generating device is connected. The video signal processor 140 internal to module 58 converts the image-defining data into data defining the standard size image. The bus interface 142 outputs the resized image-defining data during the assigned isochronous time channel over the bus 31. Collectively, in Figure 8D, these sub-steps are represented as step 250, the repacking and broadcast of the image-defining data. In step 250, either the video signal processor 140 or the bus interface 142 also compresses the data forming the video signal. [00094] As a result of the broadcast of the image-defining data packet, the packet is received by each bus interface, step 252. In a step 254 each bus interface 164 decides if the packet received during a particular isochronous channel contains data used by the device to which the bus interface is attached. On a Firewire bus, this decision is made by the link layer. Each bus interface 164 internal to a panorama generator 60 makes this determination based on the data from the panorama processor 177 indicating which channels contain image-defining data used by the generator 60. Again, these are the data in the IMAGE N CHNL ID fields 232. If, in step 254, it is determined that the image-defining data contained in a particular channel do not contribute to the panorama under construction, in a step 256 they are discarded.
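The channel filtering of steps 252-256 amounts to a membership test against the IMAGE N CHNL ID data. A minimal sketch, with packets modeled as (channel, payload) pairs as an assumption:

```python
# Hypothetical sketch of steps 252-256: a bus interface keeps only the
# packets received on the isochronous channels that contribute to the
# panorama under construction and discards the rest.

def filter_packets(packets, wanted_channels):
    """packets: (channel, payload) pairs received off the bus.
    wanted_channels: the IMAGE N CHNL ID data for this generator."""
    kept, discarded = [], 0
    for channel, payload in packets:
        if channel in wanted_channels:
            kept.append((channel, payload))   # forwarded onward
        else:
            discarded += 1                    # step 256: discarded
    return kept, discarded

packets = [(3, b"endoscope frame"), (9, b"room camera frame"),
           (5, b"biomonitor frame")]
kept, discarded = filter_packets(packets, wanted_channels={3, 5})
```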
[00095] If in step 254, bus interface 164 determines that the image-defining data received in the channel do contribute to the panorama under construction, two events occur. In step 258, the bus interface 164 informs the panorama processor 177 of the specific type of image data it has received. Step 258 may simply involve the bus interface informing the panorama processor that data from channel "N" have been received. Also, in a step 260, the image-defining data are forwarded to the scaler 166.
[00096] As a consequence of the execution of step 258, panorama processor 177 configures the other components of the panorama generator so that the image data can be appropriately processed, step 262. This configuration is based on the data received in the amended panorama definition file that are specific for the image-defining data received in the identified channel. Thus, based on these data, in step 262 the panorama processor 177 provides scaling instructions based on the data in the IMAGE N SIZE field 122 to the scaler 166. Panorama processor 177 provides instructions to the contrast/chroma converter indicating how the scaled image-defining data are to be adjusted. These instructions are based on the data from the appropriate IMAGE N CONFGRTN field 126. Also in step 262, the panorama processor 177 provides data to the image mapper that indicate the position of the image in the panorama. These data are from the associated IMAGE N SCR LCN field 124.
[00097] Once step 262 is executed, scaler 166, in a step 264, resizes the image-forming data so they define an image of the defined size. It should be appreciated that, as part of step 264, scaler 166 may be the component internal to the panorama generator that decompresses the video data. Alternatively, the decompression is initially performed by bus interface 164. In a step 266, the contrast/chroma converter 168 adjusts the resized image-defining data so that they define an image having the appropriate characteristics. As part of step 266, chroma/contrast converter 168 adjusts the chroma, contrast and brightness of the image-defining data so that when the resultant image is presented on the display 46 or 48 as part of a multi-image panorama, the image, in comparison to the other images, does not appear either excessively bright or dark.
[00098] Image mapper 172 receives the resized and adjusted image-defining data. The image mapper loads these data into the appropriate space in the frame buffer 176, step 268. [00099] Once the above steps 264, 266 and 268 are executed for all images forming the panorama, frame buffer 176 contains a bit map of data defining the complete panorama that is to be presented on the display device. Display driver 162, using a protocol appropriate to the display device, cyclically reads the data in the frame buffer 176. Based on these data the display driver 162 causes the user-defined panorama, such as that illustrated in Figure 5E, to be presented on the viewable component of the display device.
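The scale, adjust and map sequence of steps 264-268 can be sketched for one grayscale image. Nested lists stand in for the frame buffer bit map, and nearest-neighbour sampling and additive brightness are simplifying assumptions, not the patent's methods:

```python
# Hypothetical sketch of steps 264-268 for a single grayscale image.

def scale_nearest(pixels, out_w, out_h):
    """Step 264: resize by nearest-neighbour sampling (an assumption)."""
    in_h, in_w = len(pixels), len(pixels[0])
    return [[pixels[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)] for r in range(out_h)]

def adjust(pixels, brightness):
    """Step 266: crude brightness adjustment, clamped to 0-255."""
    return [[max(0, min(255, p + brightness)) for p in row]
            for row in pixels]

def map_into_buffer(frame, pixels, x, y):
    """Step 268: load the image at its IMAGE N SCR LCN position."""
    for r, row in enumerate(pixels):
        for c, p in enumerate(row):
            frame[y + r][x + c] = p

frame = [[0] * 8 for _ in range(8)]   # stand-in 8x8 frame buffer 176
source = [[100, 200], [50, 150]]      # 2x2 source image
img = adjust(scale_nearest(source, 4, 4), brightness=10)
map_into_buffer(frame, img, x=2, y=2)
```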
[000100] In some versions of the invention, each time step 268 is executed, in a step 270 the image mapper 172, either directly or through the panorama processor 177 (connection not shown), informs the display driver 162 that the image-defining data in the frame buffer have been updated. This status report may serve as a trigger that causes the display driver 162 to refresh the panorama presented on the viewable component so the new image is presented.
[000101] It should further be appreciated that steps 262, 264, 266 and 268 are not just performed on the real time image-defining data the panorama generator 60 selectively captures off of bus 31. As discussed above with respect to steps 245 and 246, periodically the static image-defining data stored in the image cache 167 are forwarded to the scaler 166. These data are subject to the same scaling, chroma/contrast adjustment and selective frame buffer loading to which the real time image-defining data are subjected. This way, the presented panorama can consist of images that are a combination of real time images/state information and static images/state information.
[000102] System 30 of this invention is thus constructed to simultaneously present custom panoramas on plural different display devices. Each panorama consists of the plural images that are of specific interest to the medical personnel viewing that display device. Images not of interest to a particular individual are not presented on the display he/she views. The individual therefore does not spend time or mental effort discarding unneeded images. Instead, the individual immediately views and obtains a mental impression only of the images of interest. [000103] System 30 of this invention is designed so that each individual's panorama definition file 76 is, once created, stored. The need to redefine the panorama a particular individual wants to view each time the system is used by that individual is eliminated.
[000104] The system 30 of this invention also allows image data to have the contrast/chroma characteristics that are preferred by an individual. Thus, over time, an individual becomes familiar with seeing particular data presented in the same visual format. This familiarity allows the individual, out of habit, to more rapidly view and mentally process the data contained within the image.
[000105] Another feature of system 30 of this invention is that the bus interfaces connected to bus 31 can be configured to allow the medical and surgical devices to transmit and receive data other than image-defining data. For example, the surgical navigation unit 44 may send signals to the instrument console 40 to cause the deactivation of an attached handpiece. Such commands may be sent when the surgical navigation unit 44 determines that the handpiece is approaching a location on the body of the patient to which the handpiece should not be applied. Thus, the system of this invention may be integrated into an operating room without the addition of its own dedicated communications bus and the overhead such a link requires. [000106] Panorama generator 60 of this invention is further configured to have a memory, image cache 167, in which data defining the static images that are presented as part of a panorama are stored. Thus, the presence of this cache 167 means that data defining these images do not need to be repetitively transmitted from their sources to the particular display 46, 48 or 50 on which the images are to be presented. This feature of the invention minimizes the bandwidth required to ensure that on each display the panorama consists of the images desired by the display's viewers.
[000107] Figure 10 is an overview of an alternative system 302 of this invention. System 302 includes the same bus 31 of the first described version of the invention. The same devices 32, 34, 38, 40, 42 and 44 are present that generate image-defining data. Also present in this version of the invention are the same display devices, monitors 46 and 48 and heads up displays 50 (one shown). A view creator 304 is also part of system 302.
[000108] In system 302 the view creator 304 serves as a central unit for creating the bit maps of the panoramas presented on the individual display devices 46, 48 and 50. In system 302, one or more of the units that generate image-defining data are connected directly to the view creator 304. By way of illustration in Figure 10, camera 32, instrument console 40, bio monitor 42 and surgical navigation unit 44 are shown connected directly to view creator 304. Camera 36 and file server 38 transmit their image-defining data to the view creator 304 over bus 31. The direct connections to view creator 304 are understood to be exemplary, not limiting. In alternative configurations of system 302 each device that outputs image-defining data is connected directly to view creator 304. [000109] Also in system 302 of this invention, one or more of the display devices are connected directly to the view creator 304. In the arrangement of Figure 10 both monitors 46 and 48 have a direct video connection to view creator 304. Only the heads up display 50 receives panorama-defining images from the view creator 304 over bus 31. Again, this should not be interpreted as limiting. In some configurations of system 302, each of the individual display devices is connected directly to view creator 304. [000110] Figures 11A and 11B illustrate the basic components of view creator 304. View creator 304 includes an input/output module 306, a bus interface 308, a panorama constructor 310, an image library 312 and a display library 313. These components are similar to the respective versions of these components of view creator 54. More specifically, panorama constructor 310, like panorama constructor 70, creates a library 114 of panorama definition files 76. A panorama retriever 314 is also internal to view creator 304. Attached to panorama retriever 314 is a memory known as a panorama cache 316.
During system 302 operation, panorama cache 316 stores the panorama definition files 76 of the panoramas that are being built by the view creator 304.
[000111] Also internal to view creator 304 are one or more image buffers 320, 322 and 324. Each image buffer 320, 322 and 324 stores the image most recently output by one of the image generating devices. In Figures 11A and 11B, image buffer 320 is shown connected directly to bus interface 308. This is to represent that image buffer 320 serves as the temporary storage buffer for images received over bus 31. Image buffers 322 and 324 are diagrammatically shown as being connected to separate sockets integral with view creator 304. This is to represent that each of image buffers 322 and 324 is directly connected to a separate one of the image generating devices. [000112] Two panorama generators 328 and 330 are shown as also being integral with view creator 304. Each panorama generator 328 and 330 is capable of generating a bit map representative of one of the panoramas that is to be presented on one of the display devices. While not illustrated, it should be understood that, at a minimum, each panorama generator 328 and 330 contains a scaler, a contrast/chroma converter and an image mapper. These components are functionally identical to the scaler 166, the contrast/chroma converter 168 and the image mapper 172 of panorama generator 60 (Figure 7).
[000113] Image-defining data from any one of the image buffers 320, 322 and 324 are read out to any one of the panorama generators 328 and 330. In the illustrated versions of the invention, a multiplexer 332 is shown as being the interface between the image buffers 320, 322 and 324 and the panorama generators 328 and 330. In many preferred versions of the invention, the image-defining data from each of the image buffers 320, 322 and 324 are simultaneously read out to both panorama generators 328 and 330.
[000114] The panorama views constructed by the panorama generators 328 and 330 are stored in frame buffers 334, 336 and 338, also part of view creator 304. In some preferred versions of the invention, each of the panorama generators 328 and 330 writes panorama-defining image data into any one of the three frame buffers 334, 336 and 338. In the illustrated version of the invention, a multiplexer 340 routes the panorama-defining data to the appropriate frame buffer 334, 336 or 338. In some preferred versions of the invention, panorama generators 328 and 330 and frame buffers 334, 336 and 338 are constructed so that data from a panorama generator are simultaneously written to plural ones of the frame buffers. This feature is used when system 302 is configured to simultaneously present the same panorama on plural display devices.
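The routing performed by multiplexer 340, including the simultaneous write to plural frame buffers when the same panorama is to appear on several displays, can be sketched as follows (buffer names and the list representation are illustrative only):

```python
# Hypothetical sketch of multiplexer 340: a panorama generator's
# output is copied into one targeted frame buffer, or into several
# when the same panorama is presented on plural display devices.

def route_panorama(frame_buffers, generator_output, targets):
    """Copy a generator's output into every targeted frame buffer."""
    for name in targets:
        # Independent copies so later writes to one buffer do not
        # disturb the others.
        frame_buffers[name] = list(generator_output)

frame_buffers = {"fb_334": [], "fb_336": [], "fb_338": []}
panorama_bits = [1, 0, 1, 1]
# Present the same panorama on the displays fed by buffers 334 and 336.
route_panorama(frame_buffers, panorama_bits, ["fb_334", "fb_336"])
```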
[000115] In Figures 11A and 11B, frame buffers 334 and 336 are shown as being connected to the bus interface 308. This represents that the panorama-defining image data stored in these frame buffers 334 and 336 are output to the associated display device over bus 31. Frame buffer 338 is shown as being diagrammatically connected to a socket that is part of view creator 304. This represents that the panorama-defining data in buffer 338 are output from view creator 304 directly to a display device.
[000116] Operation of system 302 of this invention is now explained by reference to the flow chart of Figures 12A and 12B. Prior to the operation of the system 302, step not shown, it should be understood that the image generating devices are connected to the system. This process involves connecting the devices either to bus 31 or directly to view creator 304. Similarly, the display devices are connected to the system 302. Again, a particular display device is connected directly to the view creator 304 or indirectly over bus 31.
[000117] Once all components are connected, system 302 and more particularly, the view creator 304 engages in an initial configuration process, step 350. Initial configuration process step 350 involves the image generating units identifying themselves to the panorama constructor 310 so view creator 304 can build the image library 312. Thus, this portion of initial configuration step 350 is similar to steps 80-86 executed during implementation of the first system 30 of this invention. This portion of step 350 is the portion in which the image library 312 is built. [000118] Also as part of initial configuration step 350 the display devices identify themselves to panorama constructor 310. Based on these data, panorama constructor 310 builds the display device library 313. This portion of step 350 is similar to steps 88-92 executed by first system 30.
[000119] In a step 352, panorama constructor 310 builds the panorama definition files 76. Each file build constitutes an execution of the previously described identify viewer steps 96-110. Once panorama definition files 76 are created, view creator 304 can create on any of the displays the multi-image panorama that has been defined for a particular user.
[000120] Step 354 represents the initialization of system 302 to present plural individually created panoramas for a particular surgical procedure. Step 354 comprises the identification of all image generating units and display devices connected to the system, either over bus 31 or directly to view creator 304. Thus step 354 is analogous to bus reset step 208 (Figure 8A) of the first version of this invention. [000121] Once step 354 is completed, a user, in a step 356, enters data into view creator 304 through the input/output device to indicate which panorama view definition file 76 contains data indicating the panorama that is to be presented on a particular display device. In response to the entry of these instructions, in a step 358, the panorama retriever 314 retrieves the panorama view definition file 76 from the main memory internal to the view creator 304. The file 76 is then loaded into the panorama cache 316. Once the panoramas that are to be presented on each of the display devices are selected, system 302 can generate panoramas. It should be appreciated that the above process steps are only performed when, as a result of initialization step 354, it is determined the appropriate image generating devices are available to supply images and the appropriate display devices are also integral with the system 302. [000122] During operation of system 302, each image generator outputs images, step not shown. Each image is written to view creator 304, step 360. More particularly, each set of image-defining data is written to a specific image buffer 320, 322 or 324.
[000123] The panorama generators 328 and 330 first create and then update the panoramas presented on each of the displays. In some versions of the invention, the panorama presented on a particular display is updated each time view creator 304 receives new image data for the panorama. In alternative versions of the invention, each panorama is periodically updated regardless of whether or not new image data are present.
[000124] The actual updating of the panorama-defining data is performed in a series of panorama updating steps 362, one shown. In the panorama update step 362 for a particular image that goes into a specific panorama, the panorama retriever 314 retrieves from the cache 316 data from the relevant panorama definition file 76. More particularly, the panorama retriever 314 retrieves from the file 76 the data indicating how the particular image is to be integrated into the complete panorama. These data are forwarded to the panorama generator 328 or 330 responsible for adding the image data to the panorama.
[000125] The plural panorama generators 328 and 330 allow view creator 304 to simultaneously build plural panoramas. In systems 302 wherein data are simultaneously written to the plural panorama generators 328 and 330, these data are simultaneously used to form separate panoramas. [000126] In an individual panorama updating step 362, the scaler internal to the panorama generator 328 or 330 resizes the image-defining data to fit the area the image will occupy in the assembled panorama. The contrast/chroma converter adjusts the contrast and color of the image. Panorama updating step 362 concludes with the image mapper loading the resized and contrast/chroma adjusted image data into the appropriate frame buffer 334, 336 or 338. In versions of the invention wherein multiplexer 340 is present, prior to the actual writing of data to the frame buffer, the appropriate panorama generator to frame buffer connection is established.
[000127] Once the panorama-defining data are stored in the frame buffer 334, 336 or 338, the data are forwarded to the display device, step 364. Again, depending on how the display device is connected to the view creator 304 the data are forwarded to the device either over a dedicated connection or bus 31.
[000128] System 302 of this invention is thus arranged so a single device receives the image defining data, packages the data into the user-defined custom panoramas, and exports the panorama-defining data to the display devices. System 302 can further be configured so that all the image generating devices and display devices are connected directly to the view creator 304. In these versions of system 302 of this invention, extra hardware does not have to be added to either the image generating devices or the display devices to provide the custom panoramas. Also, in these versions of the invention, the image and panorama defining data need not be exchanged over the bus 31. This eliminates the need to provide a bus with sufficient bandwidth that allows these data to be transferred. [000129] It should be recognized that the foregoing is directed to specific versions of the invention. Other versions of the invention may have features different from what has been described.
[000130] For example, in some versions of the invention, some units that generate data that are used to form individual panoramas may generate plural different types of data. For example, a single tool control console may generate data about plural different tools the console is employed to energize. A biomonitor may generate data that indicate both pulse rate and blood oxygen content. A particular individual may only want to view the data regarding the operation of a specific handpiece (tool) or only the blood oxygen content data.
[000131] Accordingly, in some executions of step 102 (Figure 4B), the user designates more than the image generating device from which an image is to be used. The user further designates the particular type of image output by the device that is to be incorporated into the panorama. [000132] For example, it should be understood that both the frame buffers 176 of the panorama generators 60 and the frame buffers 334, 336 and 338 of view creator 304 may be of the type to which data can be written while other data are simultaneously read. In these versions of the invention, panorama-defining data are written to the frame buffer while simultaneously other data in the frame buffer are exported to the display driver. In these and other versions of view creator 304, it may be desirable to provide an individual frame buffer for storing the image-defining data supplied to each display device.
[000133] Also, in some versions of the invention, view creator 304 is constructed so that each panorama generator is tied directly to a frame buffer 334, 336 or 338. The need to provide a switching unit such as a multiplexer between the individual panorama generators and the frame buffers is eliminated.
[000134] Similarly, view creator 304 may have image buffers 320, 322 and 324 for storing image data to which the associated image generating unit can write data while other data are simultaneously read out by one or more panorama generators .
[000135] Depending on the type of panorama presented by a specific display device, the method by which it is created may vary from what has been described. Thus, as discussed above, some display devices, based on either limitations or user preference, interleave the presentation of some images. When this version of the system is in operation, the process of selecting the image generator data for inclusion in the panorama includes an additional step. By reference to the data in the image sequence field 128, the panorama generator 60, 328 or 330 determines if it is the appropriate time slot for writing the specific image data to the frame buffer. [000136] The processes disclosed to broadcast the image data and to selectively extract these data are similarly understood to be exemplary, not limiting. For example, in versions of the invention in which the image data are broadcast using a Firewire protocol, it is not necessary that the view creator 56 function as the isochronous resource manager. In an alternative construction of this invention, the view creator 56, more particularly panorama constructor 70, functions as an isochronous resource client. In this construction of the invention, as part of bus reset process 208, the view creator, as part of its emulation of the isochronous resource manager function, determines how many image generating devices connected to bus 31 want to broadcast isochronous image-defining data over the bus, step 382 of the process of Figure 14.
[000137] Then, in a step 384, the view creator 56, if there are P devices wanting to broadcast image-defining data in isochronous data packets, requests P isochronous channels from the isochronous resource manager. Step 386 represents the response from the isochronous resource manager in which the P channels assigned for image data broadcast are identified. In a step 388, the view creator 56 assigns each one of the image generating devices a particular channel in which it can make its data broadcast. View creator 56 maintains, in image library 72, its list of which channel it has assigned to each image generator.
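The channel allocation of steps 382-388 can be sketched as follows. The resource manager is modeled as a simple callable that hands out consecutive channel numbers; the real IEEE 1394 allocation protocol is considerably more involved, so this is an illustrative assumption only:

```python
# Hypothetical sketch of steps 382-388: count the P devices that want
# to broadcast, request P isochronous channels from the resource
# manager, and record one assigned channel per device.

def assign_channels(devices, request_channels):
    """request_channels(p) stands in for the isochronous resource
    manager (steps 384-386); it returns p channel numbers."""
    channels = request_channels(len(devices))
    # Step 388: pair each image generating device with a channel; the
    # assignment list is kept in the image library.
    return dict(zip(devices, channels))

def resource_manager(p):
    # Illustrative manager that hands out consecutive channels.
    return list(range(1, p + 1))

image_library = assign_channels(
    ["endoscope_camera", "room_camera", "biomonitor"], resource_manager)
```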
[000138] Further, in some versions of the invention, the panorama generator 60 is further configured to monitor the data generated by other devices in the hospital. In response to the occurrence of one or more specific events, "trigger events," the panorama generator 60 reactively causes a panorama, with images that are especially useful to view in response to the event, to be presented on the associated display 46, 48 or 50.
[000139] In these versions of the invention, the panorama ID field 120 of the panorama definition file 76a, Figure 15, contains one or more panorama state trigger fields 120a. Each state trigger field 120a includes data indicating in response to what trigger event the panorama defined by that specific file 76a should be presented on the associated display. In this version of the invention, the panorama state trigger field 120a of one of the files 76a contains data indicating that the file 76a contains data for the default panorama, the panorama that is to be presented in the absence of the occurrence of a trigger event. [000140] Figure 16 depicts one version of a panorama state trigger field 120a. In this version of the invention, field 120a includes event type and event trigger sub-fields 392 and 394, respectively. Event type sub-field 392 contains data generally indicating the type of event that would cause the trigger. Event trigger sub-field 394 indicates the specific parameter of the event type that would be considered a trigger event. Thus, one panorama state field 120a can contain data indicating that the event type being monitored is blood pressure. The data in the event trigger sub-field 394 identify a specific blood pressure level.
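One possible encoding of a state trigger field 120a, and a check of whether reported data constitute a trigger event, can be sketched as follows. The dict keys, the threshold comparison, and the example values are assumptions; the patent specifies neither a data format nor how the event trigger parameter is compared:

```python
# Hypothetical encoding of a panorama state trigger field 120a, with
# its event type sub-field 392 and event trigger sub-field 394.

def is_trigger_event(trigger_field, event_type, value):
    """Compare reported data against one state trigger field.
    A crossed-threshold comparison is assumed here."""
    return (trigger_field["event_type"] == event_type
            and value >= trigger_field["event_trigger"])

# A field indicating the monitored event type is blood pressure and
# the triggering parameter is a systolic reading of 180 (illustrative).
bp_trigger = {"event_type": "blood_pressure", "event_trigger": 180}

high_bp = is_trigger_event(bp_trigger, "blood_pressure", 185)
normal_bp = is_trigger_event(bp_trigger, "blood_pressure", 120)
pulse = is_trigger_event(bp_trigger, "pulse_rate", 190)
```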
[000141] Alternatively, the data in the panorama state trigger field 120a indicate the trigger event is the occurrence of a specific milestone event during the procedure. One such milestone event is the placement of a surgical instrument at a specific location relative to the surgical site. The indication that this event occurred may be something that happens automatically. For example, during the procedure the surgical navigation system 44 may detect that the endoscope 34 is within a given distance of the surgical site. Data reporting this event may be broadcast over bus 31. As a consequence of the panorama generator 60 receiving these data, the panorama generator may switch from presenting a panorama that includes small images of the navigation data and the view captured by the endoscope camera 32 to a panorama that includes a larger image of the view captured by the endoscope camera (no navigation image).
[000142] Another trigger event may be the notice that a particular event has occurred in preparation for another part of the procedure. For example, the entry of data indicating that a particular sized femoral stem implant has been selected for implantation, such as by a scanning wand, can serve as a trigger.
[000143] Alternatively, the trigger event may be some manually entered notice. Such notice could be as simple as the depression of a control button to indicate that the surgeon wants to switch from viewing a panorama with a first set of defined images to a panorama with a second set of defined images.
[000144] To ensure that the panorama generator 60 switches to causing the appropriate panorama to be reactively presented in response to the given triggers, the panorama state trigger field 120a is loaded with the appropriate data. These data can include in the event type sub-field 392 an indication that the event type is something generally concerned with data output by the surgical navigation unit 44. The event trigger field 394 contains data indicating that the actual triggering event is the message that the endoscope is within a given distance of the surgical site.
[000145] Figure 17 is a flow chart of the process by which the panorama generator 60 reactively causes panoramas composed of different images to be presented on the associated display 46 or 48. Not shown is the initial loading of the plural panorama definition files 76a in the panorama processor 177. As depicted by step 402, initially, the panorama processor causes the default panorama to be presented on the appropriate display. This panorama is generated based on the data contained in the panorama definition file 76a specific to the default panorama. [000146] Throughout the procedure, panorama generator 60 monitors the data other components output that report on the procedure, step 404. These data include data that report on the state of the equipment used during the procedure. Specifically, these data are broadcast over bus 31. Bus interface 164, as part of step 404, forwards these data to the panorama processor 177.
[000147] In a step 406, the panorama processor 177 monitors these data to determine if any of them indicate a trigger event has occurred. This monitoring is performed by comparing the data received over the bus 31 with the list of trigger events as indicated by the panorama state trigger fields 120a data stored by the processor 177. Thus, step 406 in Figure 17 is meant to represent the plural evaluations the panorama processor 177 performs to determine if any one of a number of trigger events has occurred. As long as no such event has occurred, panorama processor 177 causes the other components internal to the panorama generator 60 to present the default panorama. [000148] If the comparisons of step 406 indicate that a trigger event has occurred, the panorama generator 60 presents the panorama defined for presentation with that event, step 408. Specifically, the panorama processor 177 causes the other components of the panorama generator 60 to construct the panorama in accordance with the data in the panorama definition file 76a for the trigger event. Thus, the images selected for presentation in the panorama and their sizes, chroma/contrast and positions in the panorama are all based on the data contained in this second file 76a. Figure 18 is an example of one such trigger event panorama. This panorama is similar to the panorama of Figure 5E. However, this trigger event panorama may be presented when data indicate that the patient's heart rate has become irregular. Thus, instead of the panorama containing views of the area on which the procedure is performed, the panorama contains images that report on the patient's blood pressure, blood oxygen content, and a graphical indication of the heart rate.
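The flow of steps 402-408 can be sketched as a loop that presents the default panorama until reported data match a stored trigger. The string-valued reports and file names are illustrative stand-ins for the bus data and the panorama definition files 76a:

```python
# Hypothetical sketch of the Figure 17 flow: present the default
# panorama (step 402), monitor reported data (steps 404-406), and
# switch to the trigger event's panorama when a match occurs (step 408).

def select_panorama_file(files, reports):
    """files: trigger name -> panorama definition file; a 'default'
    entry is required. reports: data received over the bus."""
    current = files["default"]        # step 402: default panorama
    for report in reports:            # steps 404-406: monitoring
        if report in files:           # a trigger event has occurred
            current = files[report]   # step 408: switch panoramas
    return current

files = {
    "default": "surgical_site_panorama",
    "irregular_heart_rate": "vital_signs_panorama",
}
chosen = select_panorama_file(files, ["nav_update", "irregular_heart_rate"])
```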
[000149] Step 414 represents the continued monitoring of events that occur during the procedure to determine whether or not a second trigger event occurs. Thus, step 414 represents the continued reexecution of steps 404 and 406 during the procedure.
[000150] For the purposes of example, it is assumed that the second event is an indication that the patient's condition has returned to one in which the default panorama can again be presented. When this second trigger event occurs, the panorama processor 177 returns to causing the default panorama to be presented on the display; that is, step 402 is reexecuted. This is by way of example only. Sometimes the occurrence of the second trigger event causes panorama generator 60 to present a second trigger event-specific panorama on the display 46 or 48.
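The Figure 17 flow described above can be summarized as a small state machine: present the default panorama, watch bus data for trigger events, switch panoramas on a match, and switch back when a return-to-normal event arrives. The following is a minimal illustrative sketch only; the dictionary layout, panorama names, and the "condition_normal" message are assumptions, not taken from the specification.

```python
DEFAULT = "default"

class PanoramaProcessor:
    def __init__(self, definition_files, trigger_fields):
        # definition_files maps a panorama name to its definition file 76a;
        # trigger_fields maps a trigger-event key to the panorama to present.
        self.definitions = definition_files
        self.triggers = trigger_fields
        self.current = DEFAULT          # step 402: default panorama first

    def on_bus_data(self, datum):
        """Steps 404/406: compare each datum received over bus 31 with the
        stored trigger-event list; switch panoramas on a match."""
        if datum in self.triggers:
            self.current = self.triggers[datum]     # step 408
        elif datum == "condition_normal":
            self.current = DEFAULT                  # step 402 reexecuted
        return self.definitions[self.current]

processor = PanoramaProcessor(
    definition_files={"default": "file_default", "cardiac": "file_cardiac"},
    trigger_fields={"heart_rate_irregular": "cardiac"},
)
processor.on_bus_data("tool_speed=4000")        # no trigger: default remains
assert processor.current == "default"
processor.on_bus_data("heart_rate_irregular")   # trigger event occurs
assert processor.current == "cardiac"
processor.on_bus_data("condition_normal")       # second trigger: back to default
assert processor.current == "default"
```

The sketch mirrors the loop of steps 404, 406, 408 and 414: monitoring is continuous, and only a recognized trigger datum changes which definition file 76a governs the presented panorama.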
[000151] Alternatively, as represented by step 416, the viewer may enter an instruction to the panorama generator 60 requesting the presentation of a new panorama. Such an instruction may come from an input button associated with the panorama generator 60 itself, or a control head tied to the bus 31. One such control head is marketed by the Applicant's Assignee under the trademark SIDNEE.

[000152] Thus, panorama generator 60 is capable of doing more than presenting panoramas that are appropriate to the specific individuals viewing the display 46 or 50 on which the panorama is located. Panorama generator 60 reactively changes the panorama, the collection of images, presented on the display in response to the occurrence of a trigger event. The viewer thus is able to monitor images containing the data that, after the event, are most relevant to the event.

[000153] The above feature of the invention can also be used to cause panoramas to be presented regarding equipment being used in the procedure. For example, the instrument console 40 may receive data from an attached handpiece indicating that the handpiece temperature is rising to an unacceptable level. Here "unacceptable" means a temperature level at which there is the potential of injury to the patient or the individual holding the instrument. Data regarding this temperature rise are broadcast over the bus 31. The receipt of these data is considered to be a trigger event by the panorama processor 177. In response to the receipt of these data, the panorama processor 177 at least momentarily causes a panorama to be presented that includes a very large image notifying the medical personnel of the handpiece state. The medical personnel may then be required to enter a command to the panorama generator 60 to cause it to return to presenting the previously requested image.
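The handpiece over-temperature scenario above has one behavior that differs from the earlier trigger events: the alert panorama is latched until the medical personnel enter an acknowledge command. A hedged sketch follows; the temperature threshold, panorama names, and method names are illustrative assumptions only.

```python
TEMP_LIMIT_C = 60.0   # hypothetical "unacceptable" threshold, not from the source

class AlertingGenerator:
    def __init__(self):
        self.panorama = "requested"   # the previously requested panorama
        self.latched = False

    def on_handpiece_temp(self, temp_c):
        # Receipt of over-limit temperature data broadcast over bus 31 is
        # treated as a trigger event by the panorama processor 177.
        if temp_c >= TEMP_LIMIT_C:
            self.panorama = "handpiece_alert"   # very large notification image
            self.latched = True                 # held until acknowledged

    def acknowledge(self):
        # Command entered at the panorama generator 60 returns the display
        # to presenting the previously requested image.
        if self.latched:
            self.panorama = "requested"
            self.latched = False

gen = AlertingGenerator()
gen.on_handpiece_temp(45.0)
assert gen.panorama == "requested"      # below limit: no change
gen.on_handpiece_temp(72.5)
assert gen.panorama == "handpiece_alert"
gen.acknowledge()
assert gen.panorama == "requested"
```

The latch models the requirement that a command, not merely a return to normal temperature, restores the prior panorama.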
[000154] Similarly, not in all versions of the invention can the Firewire link layer function as the module that selectively filters, i.e., passes through, data from particular isochronous channels. In some hardware modules, the link layer can only perform this function if data packets from two or fewer, or four or fewer, isochronous channels are to be passed through to the software layer. If a particular display is configured to present on its panorama data from more than this number of generators, all isochronous packets are forwarded by the physical layer and the link layer to the bus interface 164 software layer. The software layer then selectively passes on the image-defining data received over the appropriate channels for further processing and display.
[000155] Likewise, this invention does not depend on a specific type of communications link to either the image generating devices or the display devices. Wireless connections that have sufficient bandwidth, that do not interfere with each other, and that will not be affected by or affect other equipment in the operating room may be employed.
[000156] Further, there is no requirement that in all versions of the invention the units that generate data that eventually appear as part of a panorama actually generate image-defining data. Thus, in some versions of the invention, the devices that generate data displayed as part of the panorama generate the data in raw form. For example, the control console that regulates the motorized tool may generate data packets that simply indicate tool speed. Similarly, the bio-monitor that monitors blood pressure may simply present these data in raw form. One advantage of this arrangement is that the output units of these data generators are not required to be provided with video image processors. A further benefit of this version of the invention is that the bandwidth required to transmit raw data is appreciably less than the bandwidth required to transmit a video image containing these data.
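The bandwidth point above can be made concrete with back-of-envelope arithmetic: a raw tool-speed report is a few bytes per update, while a video image carrying the same number is a full frame per update. The frame size, pixel depth, rates, and payload size below are assumed figures for illustration, not values from the specification.

```python
def video_bandwidth_bps(width, height, bytes_per_pixel, fps):
    # uncompressed video: pixels/frame * bytes/pixel * frames/s * bits/byte
    return width * height * bytes_per_pixel * fps * 8

def raw_bandwidth_bps(payload_bytes, reports_per_second):
    # raw data packets, e.g. a tool-speed value broadcast over the bus
    return payload_bytes * reports_per_second * 8

video = video_bandwidth_bps(640, 480, 2, 30)   # modest uncompressed video
raw = raw_bandwidth_bps(8, 30)                 # 8-byte reading, 30 times/s

assert video == 147_456_000    # ~147 Mbit/s
assert raw == 1_920            # ~2 kbit/s
assert video // raw == 76_800  # raw form is orders of magnitude smaller
```

Under these assumed figures, the raw-data form needs roughly five orders of magnitude less bandwidth, which is why all such streams can comfortably share a common bus.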
[000157] In these versions of the invention with individual panorama generators, a panorama generator 60a as is partially illustrated in Figure 13 is provided. Panorama generator 60a includes the basic modules of panorama generator 60 described with respect to Figure 7. Panorama generator 60a also includes a video image generator 372. Video image generator 372 is, like scaler 166, connected to receive the data signals bus interface 164 selectively forwards from bus 31. More particularly, bus interface 164 forwards to video image generator 372 the raw data broadcast by the data generating devices.

[000158] Video image generator 372, upon receipt of these data, generates a basic image based on the received raw data. In one version of the invention, the image data output by the video image generator are basically a black-on-white-background alphanumeric or graphic representation of the raw data. Header information may be added to the image based on data contained within the data or instructions from the panorama processor 177. These image signals are then forwarded to the scaler 166 for appropriate resizing so the image will fit in the designated area in the user-defined panorama. Contrast/chroma converter 168 then converts the image into the background and data color pattern designated by the user. The scaled and color-converted image of the data is then forwarded to image mapper 172. The image mapper 172 loads the data into the appropriate area within frame buffer 176.
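The raw-data-to-panorama pipeline just described (video image generator 372, scaler 166, contrast/chroma converter 168, image mapper 172, frame buffer 176) can be sketched as a chain of small transforms. The dictionary-based image representation, default sizes, and colors below are illustrative assumptions; a real implementation would operate on pixel data.

```python
def render_raw(value, header):
    # video image generator 372: basic black-on-white alphanumeric
    # representation of the raw datum, with optional header text
    return {"text": f"{header}: {value}", "fg": "black", "bg": "white",
            "size": (320, 240)}

def scale(image, target_size):             # scaler 166
    return {**image, "size": target_size}

def recolor(image, fg, bg):                # contrast/chroma converter 168
    return {**image, "fg": fg, "bg": bg}

def map_into_frame(frame, image, origin):  # image mapper 172 -> buffer 176
    frame[origin] = image
    return frame

frame_buffer = {}
img = render_raw(12000, header="Tool speed (RPM)")
img = scale(img, (160, 120))               # fit the designated panorama area
img = recolor(img, fg="yellow", bg="blue") # user-designated color pattern
frame_buffer = map_into_frame(frame_buffer, img, origin=(0, 360))

assert frame_buffer[(0, 360)]["text"] == "Tool speed (RPM): 12000"
assert frame_buffer[(0, 360)]["size"] == (160, 120)
assert frame_buffer[(0, 360)]["bg"] == "blue"
```

The ordering here follows paragraph [000158]; as paragraph [000164] notes, other versions may merge or reorder these processing steps.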
[000159] In alternative versions of the invention, video image generator 372, based on instructions received from the panorama processor 177, produces a video image that is appropriately scaled and has the user-designated contrast and chroma characteristics. The image from this particular video image generator is output directly to the image mapper 172.
[000160] In versions of the invention in which view creator 304 includes one or more panorama generators 328 and 330 (Figure 11B), there may be one or more video image generators 372. In this version of the invention, view creator 304 receives the plural raw data streams from the individual data generating units. Since the bandwidth required to transmit these raw data is, in comparison to the bandwidth required to transmit image data, very small, these data may all be transmitted over bus 31.

[000161] The raw data are forwarded by the bus interface 308 internal to view creator 304 to the one or more video image generators internal to the view creator. The image generator that receives the raw data generates a basic image based on the data. Again, in some, but not all, versions of this invention, the image may be simply a black-on-white image. The image is then loaded into the appropriate image buffer 320, 322 or 324. Once the image data are so stored, they are available for use as panorama make-up data by the panorama generators 328 and 330.
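The centralized arrangement above differs from the per-display version: each raw stream is rendered once into a shared image buffer, and the plural panorama generators then each select from the same buffered images. A minimal sketch follows; the buffer keys, image strings, and view compositions are assumptions for illustration.

```python
image_buffers = {}

def ingest_raw(buffers, source, value):
    # video image generator inside the view creator: raw datum -> basic image,
    # stored once in the shared image buffer for that source
    buffers[source] = f"image[{source}={value}]"
    return buffers

def build_panorama(buffers, wanted_sources):
    # each panorama generator (e.g. 328 or 330) selects only the buffered
    # images its display's panorama calls for
    return [buffers[s] for s in wanted_sources if s in buffers]

ingest_raw(image_buffers, "blood_pressure", "120/80")
ingest_raw(image_buffers, "tool_speed", 9000)

surgeon_view = build_panorama(image_buffers, ["tool_speed"])
nurse_view = build_panorama(image_buffers, ["blood_pressure", "tool_speed"])

assert surgeon_view == ["image[tool_speed=9000]"]
assert len(nurse_view) == 2    # the same buffers feed both generators
```

Rendering once and sharing buffers is what lets one view creator drive different panoramas on different displays without duplicating the raw-to-image conversion.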
[000162] It should likewise be appreciated that the process steps by which the system of this invention operates may be different from what has been described. For example, in some versions of the invention wherein the different types of image data are broadcast to the individual displays over the bus 31, view creator 54 may not generate the amended panorama definition files. Instead, the basic panorama definition files are forwarded to the individual panorama generators 60. At the start of the procedure, each image generating device broadcasts a message indicating what type of image data it exports and identifying the channel on which these data are transmitted. Based on these data and the data in the received panorama definition file, the panorama processor 177, as part of step 242 (Figure 8C), determines from which channels the data should be retrieved in order to create the requested panorama. These determinations are then used by the link layer of the bus interface during step 254 to determine if a particular image data packet should be forwarded for processing or discarded.

[000163] Similarly, the polling of steps 80 and 88 may be performed in a single polling step. If the response indicates the polled device is an image generator, step 86 is executed so as to build the image library 72. Alternatively, if the response indicates that the attached device is a display device, step 92 is executed so that data regarding the device are loaded into the display library 74.

[000164] Likewise, the order in which image data are resized, color/chroma adjusted and mapped into the frame buffer may be different from what has been described. Some image-data processing algorithms may be employed that perform two or all of these processes simultaneously. Likewise, there is no requirement that all versions of the invention include a frame buffer that holds a full frame of panorama-defining data.
In some versions of the invention, the buffer may only hold portions of one or more lines of panorama-defining data. A process specific to the associated display driver is then employed to ensure the data are properly used to create the desired image.

[000165] Further, it should be understood that in versions of the invention in which all image data are distributed to the individual display devices over a common bus, a protocol other than the Firewire protocol may be used to regulate data transfer.
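The channel-selection alternative described in paragraph [000162] amounts to a simple matching step: each device's start-of-procedure broadcast maps an image type to a channel, and the panorama processor 177 intersects those broadcasts with the panorama definition file to learn which channels to retrieve (step 242), which the link layer then uses to forward or discard packets (step 254). The message fields and type names below are illustrative assumptions.

```python
def channels_to_retrieve(device_broadcasts, panorama_definition):
    """Step 242: device_broadcasts is a list of start-of-procedure messages,
    each {"type": <image data type>, "channel": <isochronous channel>};
    panorama_definition lists the image types the panorama should contain."""
    wanted_types = set(panorama_definition)
    return {m["channel"] for m in device_broadcasts if m["type"] in wanted_types}

def link_layer_pass(packet_channel, wanted_channels):
    # Step 254: forward the packet for processing, or discard it
    return packet_channel in wanted_channels

broadcasts = [
    {"type": "endoscopic_video", "channel": 2},
    {"type": "fluoroscopic_image", "channel": 5},
    {"type": "bio_monitor", "channel": 7},
]
wanted = channels_to_retrieve(broadcasts, ["endoscopic_video", "bio_monitor"])
assert wanted == {2, 7}
assert link_layer_pass(2, wanted)        # endoscope packets forwarded
assert not link_layer_pass(5, wanted)    # fluoroscope packets discarded
```

In this variant the binding between image type and channel is established at run time by the broadcasts, so the definition files need not be amended centrally.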
[000166] Thus, it is an object of the appended claims to cover all such variations and modifications that come within the true spirit and scope of this invention.

Claims

What is claimed is:
1. An integrated assembly of medical/surgical devices, said assembly including: a plurality of medical/surgical devices (32, 36, 40, 42), each device generating a separate stream of image-defining data; and a plurality of display devices (46, 48, 50) for receiving from the medical/surgical devices the image-defining data, each said display device configured to present thereon images based on the received image-defining data; characterized in that: at least one panorama generator (60, 328, 330) is connected to at least one of the display devices and said panorama generator is configured to: receive the image-defining data generated by the plurality of medical/surgical devices; determine if the image-defining data received from each one of the medical/surgical devices is to be presented on the display to which said panorama generator is connected (254); and if the image-defining data received from one of the medical/surgical devices is to be presented on the display, forward the image-defining data to the display (260, 268) so that the image defined by the data is presented at a particular location on the display or in a particular sequence.
2. The integrated assembly of medical/surgical devices of Claim 1, wherein there are a plurality of panorama generators (60), each said panorama generator being connected to a separate one of the display devices and capable of independently determining if the received image-defining data are to be presented on the display to which said panorama generator is connected so that the panoramas simultaneously presented on the plural displays consist of different combinations or arrangements of images.
3. The integrated assembly of medical/surgical devices of Claim 1, wherein said panorama generator (328, 330) is connected to plural display devices and is capable of, for each display device to which said panorama generator is connected: determining if the received image-defining data are to be presented on the display; and forwarding image-defining data to the display; and is further configured to cause panoramas consisting of different combinations or arrangements of images to be simultaneously displayed on the plural displays.
4. The integrated assembly of medical/surgical devices of Claims 1, 2 or 3, wherein: the display (46, 48) to which said panorama generator is connected is capable of presenting a panorama that comprises a plurality of images, the panorama of images presented on each display being different and specific to that display; and said panorama generator is further configured to, upon determining that image-defining data received from one of the medical/surgical devices is to be presented on the display: process the image-defining data so that the portion of the panorama occupied by the displayed image occupies a user-selected size of the panorama (264); and cause the image-defining data to be forwarded to the display so that the displayed image occupies a user-selected section of the display (268).
5. The integrated assembly of medical/surgical devices of Claims 1, 2, 3 or 4, wherein said panorama generator is further configured to, upon determining that the image-defining data received from one of the medical/surgical devices is to be presented on the display, adjust the image-defining data prior to forwarding the data to the display so that the resultant image has a user-selected chroma or contrast (266).
6. The integrated assembly of medical/surgical devices of Claims 1, 2, 3, 4 or 5, further including: a view creator (56), said view creator configured to receive from a user data regarding the images the user wants presented on the display viewed by the user and, in response to receipt of such data, generate for that user and device a panorama definition file (76) that identifies what the user wants to view on the device; and said panorama generator is configured to receive from said view creator the panorama definition file for the user viewing the display to which said panorama generator is connected and, in response to the contents of the panorama definition file, determine if the image-defining data received from the medical/surgical devices are to be presented on the display to which said panorama generator is connected.
7. The integrated assembly of medical/surgical devices of Claims 1, 2, 3, 4, 5 or 6, wherein: the plurality of medical/surgical devices that generate image-defining data output the image-defining data over a common bus (31) in an isochronous pattern wherein each device outputs its image-defining data in an assigned channel; said panorama generator is connected to said bus for receiving the image-defining data and determines if received image-defining data are to be presented on the display to which said panorama generator is connected based on the channel on which the data are output.
8. The integrated assembly of medical/surgical devices of Claim 7, wherein the medical/surgical devices receive or transmit over the common bus data other than image-defining data.
9. The integrated assembly of medical/surgical devices of Claims 7 or 8, wherein: a view creator (56) is provided, said view creator configured to receive from a user data identifying the devices from which the user wants images presented on the display viewed by the user and, in response to receipt of such data, generate for that display device data that identify the channels on which the image-defining data the user wants presented are output; and said panorama generator is configured to receive from said view creator the data identifying the channels containing image-defining data to be presented on the display to which said panorama generator is connected and uses the data to determine which channels contain image-defining data that are to be presented on the display.
10. The integrated assembly of medical/surgical devices of Claims 1, 4, 5, 6, 7, 8, or 9, wherein: at least one of the displays (50) consists of a display on which a single image is displayed; and said panorama generator connected to the single-image display is configured to cause a plurality of different images to be presented on the display in a user-selected sequence.
11. The integrated assembly of medical/surgical devices of Claims 1, 2, 3, 4, 5, 6, 7, 8 or 9, wherein said panorama generator is further configured to: based on the determination of whether or not received image-defining data are to be presented on the display to which said panorama generator is connected, forward a first set of image-defining data to the display so that a panorama comprising a first set of images is presented on the display (402); monitor at least one of: the state of the medical/surgical procedure; the state of the patient; and the operating state of the medical/surgical devices to determine if a trigger event occurs (404, 406); and if the trigger event occurs, based on the determination of whether or not received image-defining data are to be presented on the display to which said panorama generator is connected, forward a second set of image-defining data to the display so that a panorama comprising a second set of images is presented on the display (402).
12. The integrated assembly of medical/surgical devices of Claims 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 or 11, wherein: said panorama generator further includes an image cache (167) for storing image-defining data; and said panorama generator is configured to selectively forward image-defining data from both at least one of the medical/surgical devices and said image cache to the display to which said panorama generator is connected so that the display presents thereon a panorama consisting of at least one image from the data from one of the medical/surgical devices and at least one image from the data from the image cache.
PCT/US2007/069141 2006-05-18 2007-05-17 Multi-display medical/surgical image and data viewer system that presentes user-defined, custom panoramas WO2007137115A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US80128106P 2006-05-18 2006-05-18
US60/801,281 2006-05-18

Publications (2)

Publication Number Publication Date
WO2007137115A2 true WO2007137115A2 (en) 2007-11-29
WO2007137115A3 WO2007137115A3 (en) 2008-01-17

Family

ID=38654478

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/069141 WO2007137115A2 (en) 2006-05-18 2007-05-17 Multi-display medical/surgical image and data viewer system that presentes user-defined, custom panoramas

Country Status (1)

Country Link
WO (1) WO2007137115A2 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011113982A1 (en) * 2010-03-15 2011-09-22 Universidad De Sevilla System for the analysis and management of surgical images
WO2012061727A3 (en) * 2010-11-05 2012-11-01 Ethicon Enco-Surgery, Inc. Surgical instrument safety glasses or surgical monitor with visual feed back
US8633975B2 (en) 2008-01-16 2014-01-21 Karl Storz Imaging, Inc. Network based endoscopic surgical system
US9039720B2 (en) 2010-11-05 2015-05-26 Ethicon Endo-Surgery, Inc. Surgical instrument with ratcheting rotatable shaft
US9089338B2 (en) 2010-11-05 2015-07-28 Ethicon Endo-Surgery, Inc. Medical device packaging with window for insertion of reusable component
US9649150B2 (en) 2010-11-05 2017-05-16 Ethicon Endo-Surgery, Llc Selective activation of electronic components in medical device
US9782215B2 (en) 2010-11-05 2017-10-10 Ethicon Endo-Surgery, Llc Surgical instrument with ultrasonic transducer having integral switches
US9782214B2 (en) 2010-11-05 2017-10-10 Ethicon Llc Surgical instrument with sensor and powered control
US10085792B2 (en) 2010-11-05 2018-10-02 Ethicon Llc Surgical instrument with motorized attachment feature
US10136938B2 (en) 2014-10-29 2018-11-27 Ethicon Llc Electrosurgical instrument with sensor
US10376304B2 (en) 2010-11-05 2019-08-13 Ethicon Llc Surgical instrument with modular shaft and end effector
US10537380B2 (en) 2010-11-05 2020-01-21 Ethicon Llc Surgical instrument with charging station and wireless communication
US10660695B2 (en) 2010-11-05 2020-05-26 Ethicon Llc Sterile medical instrument charging device
US10881448B2 (en) 2010-11-05 2021-01-05 Ethicon Llc Cam driven coupling between ultrasonic transducer and waveguide in surgical instrument
US10959769B2 (en) 2010-11-05 2021-03-30 Ethicon Llc Surgical instrument with slip ring assembly to power ultrasonic transducer
US10973563B2 (en) 2010-11-05 2021-04-13 Ethicon Llc Surgical instrument with charging devices
WO2023052566A1 (en) * 2021-09-30 2023-04-06 Leica Instruments (Singapore) Pte. Ltd. Devices and systems for use in imaging during surgery

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9000720B2 (en) 2010-11-05 2015-04-07 Ethicon Endo-Surgery, Inc. Medical device packaging with charging interface
US9597143B2 (en) 2010-11-05 2017-03-21 Ethicon Endo-Surgery, Llc Sterile medical instrument charging device
US9247986B2 (en) 2010-11-05 2016-02-02 Ethicon Endo-Surgery, Llc Surgical instrument with ultrasonic transducer having integral switches
US9421062B2 (en) 2010-11-05 2016-08-23 Ethicon Endo-Surgery, Llc Surgical instrument shaft with resiliently biased coupling to handpiece
US9017849B2 (en) 2010-11-05 2015-04-28 Ethicon Endo-Surgery, Inc. Power source management for medical device
US9017851B2 (en) 2010-11-05 2015-04-28 Ethicon Endo-Surgery, Inc. Sterile housing for non-sterile medical device component
US9375255B2 (en) 2010-11-05 2016-06-28 Ethicon Endo-Surgery, Llc Surgical instrument handpiece with resiliently biased coupling to modular shaft and end effector
US9381058B2 (en) 2010-11-05 2016-07-05 Ethicon Endo-Surgery, Llc Recharge system for medical devices
US9011471B2 (en) 2010-11-05 2015-04-21 Ethicon Endo-Surgery, Inc. Surgical instrument with pivoting coupling to modular shaft and end effector
US9526921B2 (en) 2010-11-05 2016-12-27 Ethicon Endo-Surgery, Llc User feedback through end effector of surgical instrument
US9161803B2 (en) 2010-11-05 2015-10-20 Ethicon Endo-Surgery, Inc. Motor driven electrosurgical device with mechanical and electrical feedback

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2386957A (en) * 2002-03-28 2003-10-01 Ge Med Sys Information Tech Medical monitoring network sensor interface device
US20040204627A1 (en) * 2002-09-19 2004-10-14 Olympus Optical Co., Ltd. Medical system
US20050033117A1 (en) * 2003-06-02 2005-02-10 Olympus Corporation Object observation system and method of controlling object observation system
WO2005087125A2 (en) * 2004-03-10 2005-09-22 Depuy International Ltd Orthopaedic operating systems, methods, implants and instruments


Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8633975B2 (en) 2008-01-16 2014-01-21 Karl Storz Imaging, Inc. Network based endoscopic surgical system
US8982199B2 (en) 2008-01-16 2015-03-17 Karl Storz Imaging, Inc. Network based endoscopic surgical system
ES2374234A1 (en) * 2010-03-15 2012-02-15 Universidad De Sevilla System for the analysis and management of surgical images
WO2011113982A1 (en) * 2010-03-15 2011-09-22 Universidad De Sevilla System for the analysis and management of surgical images
US10085792B2 (en) 2010-11-05 2018-10-02 Ethicon Llc Surgical instrument with motorized attachment feature
US10376304B2 (en) 2010-11-05 2019-08-13 Ethicon Llc Surgical instrument with modular shaft and end effector
US9039720B2 (en) 2010-11-05 2015-05-26 Ethicon Endo-Surgery, Inc. Surgical instrument with ratcheting rotatable shaft
US9072523B2 (en) 2010-11-05 2015-07-07 Ethicon Endo-Surgery, Inc. Medical device with feature for sterile acceptance of non-sterile reusable component
US9089338B2 (en) 2010-11-05 2015-07-28 Ethicon Endo-Surgery, Inc. Medical device packaging with window for insertion of reusable component
US9649150B2 (en) 2010-11-05 2017-05-16 Ethicon Endo-Surgery, Llc Selective activation of electronic components in medical device
US9782215B2 (en) 2010-11-05 2017-10-10 Ethicon Endo-Surgery, Llc Surgical instrument with ultrasonic transducer having integral switches
US9782214B2 (en) 2010-11-05 2017-10-10 Ethicon Llc Surgical instrument with sensor and powered control
WO2012061727A3 (en) * 2010-11-05 2012-11-01 Ethicon Enco-Surgery, Inc. Surgical instrument safety glasses or surgical monitor with visual feed back
US11925335B2 (en) 2010-11-05 2024-03-12 Cilag Gmbh International Surgical instrument with slip ring assembly to power ultrasonic transducer
US10143513B2 (en) 2010-11-05 2018-12-04 Ethicon Llc Gear driven coupling between ultrasonic transducer and waveguide in surgical instrument
CN103281981A (en) * 2010-11-05 2013-09-04 伊西康内外科公司 Surgical instrument safety glasses or surgical monitor with visual feed back
US10537380B2 (en) 2010-11-05 2020-01-21 Ethicon Llc Surgical instrument with charging station and wireless communication
US10660695B2 (en) 2010-11-05 2020-05-26 Ethicon Llc Sterile medical instrument charging device
US10881448B2 (en) 2010-11-05 2021-01-05 Ethicon Llc Cam driven coupling between ultrasonic transducer and waveguide in surgical instrument
US10945783B2 (en) 2010-11-05 2021-03-16 Ethicon Llc Surgical instrument with modular shaft and end effector
US10959769B2 (en) 2010-11-05 2021-03-30 Ethicon Llc Surgical instrument with slip ring assembly to power ultrasonic transducer
US10973563B2 (en) 2010-11-05 2021-04-13 Ethicon Llc Surgical instrument with charging devices
US11389228B2 (en) 2010-11-05 2022-07-19 Cilag Gmbh International Surgical instrument with sensor and powered control
US11744635B2 (en) 2010-11-05 2023-09-05 Cilag Gmbh International Sterile medical instrument charging device
US11690605B2 (en) 2010-11-05 2023-07-04 Cilag Gmbh International Surgical instrument with charging station and wireless communication
US10136938B2 (en) 2014-10-29 2018-11-27 Ethicon Llc Electrosurgical instrument with sensor
WO2023052566A1 (en) * 2021-09-30 2023-04-06 Leica Instruments (Singapore) Pte. Ltd. Devices and systems for use in imaging during surgery

Also Published As

Publication number Publication date
WO2007137115A3 (en) 2008-01-17

Similar Documents

Publication Publication Date Title
WO2007137115A2 (en) Multi-display medical/surgical image and data viewer system that presentes user-defined, custom panoramas
US7492388B2 (en) System and method for automatic processing of endoscopic images
CA2766595C (en) Surgeon's aid for medical display
US6108634A (en) Computerized optometer and medical office management system
CN100530048C (en) Method and apparatus for surgical operating room information display gaze detection and user prioritization for control
US6872179B2 (en) Medical diagnosis system having a medical diagnosis apparatus and a display to be observed by a patient
US6699187B2 (en) System and method for providing remote expert communications and video capabilities for use during a medical procedure
US12093036B2 (en) Telerobotic system with a dual application screen presentation
US20060122482A1 (en) Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same
EP0676709A2 (en) Method and system for customizing the display of patient physiological parameters on a medical monitor
EP1821483A1 (en) Computer network system and method for operating the network system screenshot and sourceshot control
WO2008157813A1 (en) Surgical data monitoring and display system
US20070081703A1 (en) Methods, devices and systems for multi-modality integrated imaging
US11910997B2 (en) Apparatus, systems, and methods for intraoperative visualization
JP2008271547A (en) Systems and methods for multi-source video distribution and composite display
CN101389266B (en) Medical image system
Krupinski et al. Differences in time to interpretation for evaluation of bone radiographs with monitor and film viewing
WO2003042968A1 (en) Method and system for presenting real time physiological information through a head mounted display
US8156210B2 (en) Workflow for computer aided detection
JP6948043B1 (en) Medical video system and medical video processing equipment
JP2020039432A (en) Medical image switcher
AU2005237325A1 (en) An apparatus for combining data from a stereoscopic digital source with additional digital imaging devices
US20060152510A1 (en) Cross-platform and data-specific visualisation of 3d data records
CN113693738A (en) Operation system based on intelligent display
CN118250500A (en) Medical video previewing method and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07762231

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07762231

Country of ref document: EP

Kind code of ref document: A2