WO2023140120A1 - Surgical Robot System - Google Patents
Surgical Robot System
- Publication number
- WO2023140120A1 (PCT/JP2023/000118; JP2023000118W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- image
- user
- surgical
- information
- Prior art date
Links
- 238000009434 installation Methods 0.000 claims abstract description 13
- 238000003384 imaging method Methods 0.000 claims abstract description 8
- 230000000694 effects Effects 0.000 claims description 17
- 239000002131 composite material Substances 0.000 claims description 13
- 230000002452 interceptive effect Effects 0.000 claims description 7
- 230000002194 synthesizing effect Effects 0.000 claims description 5
- 238000001356 surgical procedure Methods 0.000 abstract description 10
- 238000010586 diagram Methods 0.000 description 38
- 230000010365 information processing Effects 0.000 description 29
- 238000001514 detection method Methods 0.000 description 17
- 239000012636 effector Substances 0.000 description 6
- 238000000034 method Methods 0.000 description 6
- 238000002059 diagnostic imaging Methods 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 230000005057 finger movement Effects 0.000 description 3
- 210000002858 crystal cell Anatomy 0.000 description 2
- 239000011521 glass Substances 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 1
- 229910052782 aluminium Inorganic materials 0.000 description 1
- 210000004204 blood vessel Anatomy 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 238000012905 input function Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 201000003152 motion sickness Diseases 0.000 description 1
- 230000001151 other effect Effects 0.000 description 1
- 238000002834 transmittance Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/37—Leader-follower robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
Definitions
- This technology relates to a surgical robot system, for example, a surgical robot system that provides images that allow medical staff to perform surgery efficiently.
- In surgeries using surgical robots, surgical microscopes, and the like, displaying surgical images in 3D allows medical staff such as surgeons to easily perceive the structure of the living body and perform surgery efficiently (Patent Document 1).
- This technology was created in view of this situation, and is intended to provide images that allow medical staff to perform surgeries more efficiently.
- the surgical robot system includes: a spatial reproduction display capable of displaying a surgical image that is visually recognized as stereoscopic in a three-dimensional space formed by a plane passing through the lower side of the display surface of a monitor and the normal plane of the installation surface and a plane passing through the upper side of the display surface and a plane parallel to the installation surface; an imaging device that captures the surgical image; an operation device connected to the spatial reproduction display; and a surgical robot that operates in response to an operator's operation via the operation device.
- A diagram showing the configuration of one embodiment of the surgical robot system.
- Diagrams for explaining stereoscopic display to which the present technology is applied.
- A diagram showing the configuration of one embodiment of the information processing apparatus, and a diagram showing an installation example of the display unit.
- A diagram for explaining the stereoscopic effect perceived by the user.
- A flowchart showing an operation example of the present embodiment, and diagrams for explaining display control processing by the display control unit.
- A diagram showing an example of the configuration of the surgical robot system, and a diagram for explaining a surgical image displayed on the spatial reproduction display.
- Diagrams showing examples of synthesized images, a diagram showing an example of spatial CG, and a diagram showing differences in the synthesized image depending on the viewpoint position.
- Diagrams showing other configuration examples of the surgical robot system.
- FIG. 1 is a diagram showing the configuration of a surgical robot system according to an embodiment of the present technology.
- the surgical robot system consists of a master device 100 and a slave device 500.
- the user drives the slave device 500 by controlling the master device 100 to perform surgery.
- the master device 100 is an information processing device having an input function for operating the slave device 500, a control function for the slave device 500, and a presentation function for presenting signals output from the cameras and sensors of the slave device 500 to the user.
- the master device 100 includes an information processing device 101 (first information processing device), an input device 200 held and operated by a user (input device 200R for the right hand and input device 200L for the left hand), a support base 300 on which the user's arms or elbows are placed, and a display device 400 that displays an image based on a signal from the slave device 500.
- the master device 100 controls the slave device 500 by the information processing device 101 outputting a control signal to the slave device 500 based on the input of the input device 200 .
- the information processing device 101 vibrates the input device 200 (tactile presentation) or displays on the display device 400 based on the signal from the slave device 500, thereby presenting the image of the operative field and the feedback from the slave device 500 to the user.
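- As a rough, non-authoritative illustration of the control and feedback flow just described, the following Python sketch models one control cycle of the master device; every function, method, and field name here (master_control_cycle, read_motion, contact_pressure, and so on) is a hypothetical placeholder rather than an identifier from the actual system.

```python
import time

def master_control_cycle(input_device, display_device, slave_link):
    """One cycle of the master-side loop: read input, command the slave, present feedback."""
    # Read the operator's hand/grip motion from the input device (input device 200)
    # and send a control signal to the slave device (slave device 500).
    motion = input_device.read_motion()
    slave_link.send_control(motion)

    # Receive the operative-field image and force feedback returned by the slave.
    feedback = slave_link.receive_feedback()

    # Tactile presentation: vibrate the input device according to the pressure applied
    # to the end effector, and update the spatial reproduction display (display device
    # 400) with the surgical image.
    if feedback.contact_pressure > 0:
        input_device.vibrate(strength=feedback.contact_pressure)
    display_device.show(feedback.surgical_image)

def run(input_device, display_device, slave_link, period_s=0.01):
    # Runs the cycle at a fixed period (100 Hz is assumed here purely for illustration).
    while True:
        master_control_cycle(input_device, display_device, slave_link)
        time.sleep(period_s)
```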
- the information processing device 101 is a device that processes information in the master device 100, and is, for example, a computer having an information processing circuit including a CPU, memory, and the like.
- the input device 200 is a device that receives a user's input, and is a device that can input, for example, the user's hand motion or gripping motion.
- the support base 300 is a stand on which the user's arms or elbows can be placed, and is, for example, an aluminum frame.
- a device such as a touch display capable of inputting to the master device 100 or the slave device 500 may be further provided on the support base 300 .
- the display device 400 is a display device capable of displaying a virtual three-dimensional image in a predetermined space so that the user can visually recognize it, and is, for example, a spatial reproduction display.
- the display surface of the display device 400 is arranged obliquely so that it is tilted away from the user.
- the degree of inclination of the display surface may be changeable based on the arrangement angle of the input device 200 .
- the display device 400 is arranged behind the input device 200 in the example of FIG. 1. At this time, it is preferable that the image displayed by the display device 400 includes a CG image that changes according to the user's input. As a result, the user can get the illusion of directly touching an object in the space that is virtually reproduced by the display device 400.
- the slave device 500 is composed of an information processing device 501 (second information processing device), a medical device 600, and an arm 700 that supports the medical device.
- the slave device 500 controls the medical device 600 or the arm 700 by the information processing device 501 based on the signal transmitted from the master device 100, thereby performing control based on the user's input.
- the information processing device 501 is a device that processes information in the slave device 500, and is, for example, a computer having an information processing circuit including a CPU and memory.
- the medical device 600 is a device that operates based on input from the master device 100, and is, for example, an imaging device such as a 3D imaging device or an endoscope, or an end effector equipped with an electric scalpel or forceps.
- the slave device 500 includes multiple sets of the medical device 600 and the arm 700.
- the configuration may include a first arm having an imaging device and a second arm having an end effector.
- the end effector preferably has a detection device that detects contact with the affected area. Accordingly, by vibrating the input device or restricting the movement based on the pressure applied to the end effector, it is possible to provide appropriate feedback to the user.
- autostereoscopic image display devices capable of stereoscopically displaying content without using special glasses (hereinafter also referred to as autostereoscopic display or simply stereoscopic display) have been proposed.
- Such an autostereoscopic display can display images that are horizontally shifted for each viewing angle, and the user can view content stereoscopically by viewing different images with the left eye and the right eye.
- FIGS. 2 to 4 are explanatory diagrams showing examples of stereoscopic display by a stereoscopic display.
- in the example shown in FIG. 2, a stereoscopic object O11 (an example of content) is displayed near the display surface 92 of the stereoscopic display.
- the user can observe from relatively free viewpoint positions. For example, when the user observes from viewpoint V11, object O11 is included in field of view F11, and when the user observes from viewpoint V12, object O11 is included in field of view F12.
- however, since the user perceives the depth of the object O11 with reference to the display surface 92, it is difficult to obtain a sufficient three-dimensional effect in the example of FIG. 2, in which the object O11 is displayed near the display surface 92.
- in the example shown in FIG. 2, since the entire object O11 exists in the vicinity of the display surface 92, the impression that the object is merely displayed on a stereoscopic display is strong, and it is difficult to obtain a sufficient sense of coexistence.
- in another example, the object O21 is arranged on the front side (viewer side) of the display surface 92 of the stereoscopic display.
- the field of view F21 includes all of the object O21 when the user observes from the viewpoint V21, but part of the object O21 is not included in the field of view F22 when the user observes from the viewpoint V22. Therefore, when the user observes from the viewpoint V22, a so-called sticking effect (frame effect) may occur, making it difficult to stereoscopically observe the object O21.
- in addition, the amount of parallax (binocular parallax) perceived by the user is greater than in the example shown in FIG. 2.
- when content is viewed with a large amount of parallax, it becomes difficult to fuse the images, and fatigue and motion sickness are likely to occur, which may increase the burden on the user.
- in yet another example, the object O31 is displayed on the far side of the display surface 92. In this case, the field of view of the user is limited (narrowed), and the viewpoint positions from which the user can stereoscopically observe the object O31 may be more limited than in the preceding examples.
- the field of view F31 includes all of the object O31 when the user observes from the viewpoint V31, but part of the object O31 is not included in the field of view F32 when the user observes from the viewpoint V32.
- although the user's field of view is the range obtained by combining the field of view F32 and the field of view F33, the field of view for observing the content is limited to the field of view F32 because the object O31 is displayed on the far side of the display surface 92.
- FIG. 5 is an explanatory diagram for explaining the outline of this embodiment.
- in the examples shown in FIGS. 2 to 4, the stereoscopic display is installed so that the display surface 92 of the stereoscopic display is perpendicular to the horizontal plane in real space.
- in the present embodiment, by contrast, the stereoscopic display is installed so that the display surface 12 of the stereoscopic display is oblique (non-perpendicular) to the horizontal plane in real space.
- the object O1 (a three-dimensional object) is arranged so as to stand upright with respect to the horizontal plane and intersect the display surface 12.
- when the user observes from the viewpoint V1, the object O1 is included in the field of view F1, and when the user observes from the viewpoint V2, the object O1 is included in the field of view F2, so the user can observe from a wider range of viewpoint positions.
- the medical staff can perform surgery more efficiently by providing the user with a surgical field image that is virtually reproduced as if it exists in space.
- FIG. 6 is a block diagram showing a configuration example of the information processing apparatus 1 according to this embodiment.
- the information processing apparatus 1 includes a display unit 10, a display surface detection unit 20, a user detection unit 30, a storage unit 40, and a control unit 50.
- the display unit 10 is a display that displays a stereoscopic image under the control of the control unit 50, which will be described later.
- the display unit 10 may be an autostereoscopic display (autostereoscopic image display device).
- the display surface 12 of the display unit 10 displays a stereoscopic image.
- FIG. 7 is an explanatory diagram showing an installation example of the display unit 10 according to this embodiment.
- as shown in FIG. 7, the display unit 10 may have a mechanism (housing) that supports the display surface 12.
- the distance from the lower end 122 of the display surface 12 to the upper end 124 of the display surface 12 is L1.
- a space having the lower end 122 of the display surface 12 and the upper end 124 of the display surface 12 as its diagonally opposite edges is set as the drawing space B10 by the control unit 50, which will be described later.
- the bottom surface B12 of the drawing space B10 is a plane horizontal to the real space (horizontal plane in the real space).
- the display unit 10 is arranged such that the horizontal plane (bottom surface B12) of the real space and the display surface 12 form an angle θ1 greater than 0° and less than 90°.
- the arrangement method is not particularly limited, and such arrangement may be realized depending on the shape of the display unit 10, or such arrangement may be realized by supporting the display unit 10 by a device (such as a stand) different from the display unit 10.
- the display surface detection unit 20 shown in FIG. 6 detects the orientation of the display surface 12 and provides it to the control unit 50 .
- the orientation of the display surface 12 detected by the display surface detection unit 20 may be, for example, the angle formed by the horizontal plane in real space and the display surface 12 (angle θ1 shown in FIG. 7).
- the display surface detection unit 20 may be implemented by, for example, an acceleration sensor, a gyro sensor, or a magnetic sensor having a predetermined relationship with the display surface 12, or by a combination of the above.
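- As one way such a sensor reading could yield the posture of the display surface, the minimal sketch below estimates the angle θ1 from a single gravity vector measured by an accelerometer attached to the display; the sensor-axis convention and the function name are illustrative assumptions, not part of the described configuration.

```python
import math

def estimate_tilt_deg(accel_xyz):
    """Estimate the angle θ1 between the display surface and the horizontal plane.

    Illustrative assumption: the accelerometer is rigidly attached to the display
    with its z-axis along the display-surface normal, and at rest it measures
    gravity only.
    """
    ax, ay, az = accel_xyz
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no gravity reading")
    # The angle between the surface normal and the vertical equals the angle
    # between the surface and the horizontal plane (θ1).
    return math.degrees(math.acos(min(1.0, abs(az) / g)))

# Example: a display tilted 45° from the horizontal.
print(round(estimate_tilt_deg((0.0, 6.94, 6.94)), 1))  # ≈ 45.0
```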
- the user detection unit 30 detects information about the user viewing the stereoscopic image and provides the information to the control unit 50 .
- the user detection unit 30 may detect the position and orientation of the user, and the position and orientation of the user may be the position and orientation relative to the display surface 12, for example.
- the position of the user may be the position of the user's eyes, or may be the position of the user's left eye and the position of the user's right eye.
- the posture of the user may be the orientation of the user's face, or may be the line-of-sight directions of the user's left and right eyes.
- the user detection unit 30 may be realized by, for example, one of a camera, a depth camera, a motion sensor, or a combination of the above.
- the storage unit 40 stores programs and parameters for each component of the information processing device 1 to function.
- the storage unit 40 may store content data.
- the content data stored in the storage unit 40 may include, for example, three-dimensional objects (3D images), two-dimensional objects (2D images), audio data, and the like.
- the storage unit 40 may include information about the display surface, and the information about the display surface may include, for example, information on the distance from the lower end to the upper end of the display surface (distance L1 shown in FIG. 7).
- the control unit 50 controls each configuration of the information processing device 1 .
- the control unit 50 also functions as a display surface information acquisition unit 52, a user information acquisition unit 54, a content acquisition unit 56, and a display control unit 58, as shown in FIG.
- the display surface information acquisition unit 52 acquires display surface information regarding the display surface 12 .
- the display surface information acquiring unit 52 may acquire from the display surface detecting unit 20 information on the angle formed between the horizontal plane in the real space and the display surface 12 (posture of the display surface 12) as the display surface information.
- the display surface information acquisition unit 52 may acquire information on the distance from the lower end to the upper end of the display surface from the storage unit 40 as the display surface information.
- the display surface information acquisition unit 52 provides the acquired display surface information to the display control unit 58 .
- the user information acquisition unit 54 acquires user information about the user. For example, the user information acquisition unit 54 acquires the position of the user (the position of the user's left eye and the position of the user's right eye) and posture information from the user detection unit 30 as the user information.
- the user information acquisition unit 54 may acquire information on the user's position and posture from the user detection unit 30 directly or indirectly. For example, if the user detection unit 30 is a camera directed toward the viewing direction of the display surface 12, the user information acquisition unit 54 may identify and indirectly acquire the user information based on the image provided from the user detection unit 30. The user information acquisition unit 54 provides the acquired user information to the display control unit 58 .
- the content acquisition unit 56 acquires content data related to display. For example, the content acquisition unit 56 reads and acquires content data stored in the storage unit 40 . The content acquisition unit 56 provides the acquired content data to the display control unit 58 .
- the display control unit 58 controls display on the display unit 10 based on the display surface information provided by the display surface information acquisition unit 52, the user information provided by the user information acquisition unit 54, and the content data provided by the content acquisition unit 56.
- the display control unit 58 may render the drawing space described with reference to FIG. 7 according to the user's position (left-eye and right-eye positions), and perform orthogonal projection according to the user's position to generate stereoscopic images (left-eye image and right-eye image) and display them on the display surface 12 of the display unit 10.
- the display control unit 58 does not need to generate images for all viewpoints (viewing angles) that can be displayed by the display unit 10, and only needs to generate images for two viewpoints, so the amount of processing can be suppressed. Details of the generation of the stereoscopic image according to the position of the user will be described later with reference to FIG. 10 .
- the display control unit 58 may display a stereoscopic image on the display surface 12 of the display unit 10 so that a floor surface (first plane) parallel to the horizontal plane can be observed in an area corresponding to the distance from the lower end to the upper end of the display surface 12 and the angle between the horizontal plane and the display surface 12 in real space.
- the display control unit 58 may display a stereoscopic image on the display surface 12 of the display unit 10 so that the rear wall surface (second plane) that is in contact with the upper end of the display surface and is perpendicular to the floor surface (i.e., perpendicular to the horizontal plane) can be observed.
- the display control unit 58 may display a stereoscopic image on the display surface 12 of the display unit 10 so that a stereoscopic object placed on the floor can be observed.
- FIG. 8 is an explanatory diagram for explaining the stereoscopic effect perceived by the user from the stereoscopic image displayed by the display control unit 58.
- in the example shown in FIG. 8, a floor surface P1 parallel to the horizontal plane is observed.
- the floor P1 is desirably arranged so as to be in contact with the lower end of the display surface 12.
- the rear wall surface P2 perpendicular to the floor surface P1 is preferably arranged so as to be in contact with the upper end of the display surface 12, as shown in FIG. 8. With such a configuration, it is possible to draw the image such that the binocular parallax and the motion parallax of the image displayed at the lower end and the upper end of the display surface 12 are reduced (for example, to zero).
- the display control unit 58 since the display control unit 58 generates a stereoscopic image according to the user's position, there is a risk of user position detection error and latency (delay).
- by drawing so that the binocular parallax and motion parallax are small at the lower end and the upper end of the display surface 12, even if detection errors or latency occur, fluctuations are less likely to occur near the lower end and the upper end of the display surface 12, and the user is less likely to feel that the position of the entire content moves (shifts).
- the length L11 of the floor surface P1 in the depth direction and the height L21 of the rear wall surface P2 can be expressed by the following equations (1) and (2) using the distance L1 and the angle θ1 shown in FIG. 7:
- L11 = L1 × cos θ1 … (1)
- L21 = L1 × sin θ1 … (2)
- the area where the floor surface P1 and the rear wall surface P2 are observed is an area corresponding to the actual physical length and arrangement of the display surface 12, so the user can easily recognize (fuse) the floor surface and the rear wall surface.
- the user can easily recognize the space to be drawn (drawing space), and the burden associated with stereoscopic observation is reduced.
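- A short numerical illustration of equations (1) and (2): the sketch below computes the observable floor depth L11 and rear-wall height L21 from the physical display length L1 and tilt angle θ1 (function and variable names are illustrative, not from the disclosure).

```python
import math

def drawing_space_extents(display_length_m, tilt_deg):
    """Return (floor_depth, wall_height) per equations (1) and (2).

    display_length_m: distance L1 from the lower end to the upper end of the display surface.
    tilt_deg: angle θ1 between the display surface and the horizontal plane (0° < θ1 < 90°).
    """
    theta = math.radians(tilt_deg)
    floor_depth = display_length_m * math.cos(theta)   # L11 = L1 × cos θ1  ... (1)
    wall_height = display_length_m * math.sin(theta)   # L21 = L1 × sin θ1  ... (2)
    return floor_depth, wall_height

# Example: a 0.6 m display tilted 45° yields a floor about 0.42 m deep
# and a rear wall about 0.42 m high.
print(drawing_space_extents(0.6, 45.0))
```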
- the three-dimensional object O2 placed on the floor P1 is preferably placed so as to intersect the display surface 12.
- the user perceives parallax according to the distance D1 in the depth direction (retraction direction) and the distance D2 in the projection direction (front direction) as shown in FIG. 8, and can obtain a stereoscopic effect in both the depth direction and the projection direction.
- a micro-optical lens is provided on the display panel that constitutes the display surface 12. It is preferable to divide the 4K video signal converted by XGL with the micro-optical lens to generate an image for the right eye and an image for the left eye, and to provide these images to the user.
- the display is preferably an active retarder type display panel.
- An active retarder type display panel is, for example, a display panel in which a liquid crystal cell, a polarizer, and an active retarder that is alternately switched to λ/4 retardation are stacked, the active retarder being switched to λ/4 alternately in accordance with the display of the liquid crystal cell.
- the user can perceive the position and shape of the three-dimensional object O2 based on the floor surface P1 and the rear wall surface P2 recognized as described above, and can obtain a higher sense of coexistence.
- the content displayed by the display control unit 58 is not limited to the above example.
- the display control unit 58 may display a stereoscopic image on the display surface 12 of the display unit 10 so that a side wall surface (third plane) that is in contact with the left end or right end of the display surface and is perpendicular to the floor surface (i.e., perpendicular to the horizontal surface) can be observed.
- with such a configuration, the binocular parallax and motion parallax of the images displayed at the left end and the right end of the display surface 12 can also be made small (for example, zero).
- FIG. 9 is a flow chart showing an operation example of this embodiment.
- 10 and 11 are explanatory diagrams for explaining display control processing by the display control unit 58 of the present embodiment.
- the display surface information acquisition unit 52 first acquires display surface information about the display surface 12 (the angle formed by the horizontal plane in real space and the display surface 12, and the distance from the lower end to the upper end of the display surface) (S102).
- the display control unit 58 sets the drawing space based on the display surface information (S104). For example, as shown in FIG. 10, the display control unit 58 may set a drawing space B10 having the upper and lower ends of the display surface 12 as diagonal sides.
- the display control unit 58 arranges the floor surface, the wall surface (back wall surface, side wall surface), and the three-dimensional object in the drawing space (S106). For example, as shown in FIG. 10, the display control unit 58 arranges the floor surface P1, the rear wall surface P2, the side wall surface P3 (P3a, P3b), and the three-dimensional object O3 on the far side of the display surface 12 in the drawing space.
- the user information acquisition unit 54 acquires the positions of the user's left and right eyes and the user's posture as user information (S108).
- the display control unit 58 virtually installs a virtual camera Vc using the position of the user's eyes as a virtual viewpoint, renders the drawing space described with reference to FIG. 10, and acquires a viewpoint image from the virtual viewpoint (S110).
- the display control unit 58 may set the direction of the virtual camera based on the user's posture (face orientation or line-of-sight direction), or may set the direction of the virtual camera so that the virtual camera faces the display surface 12.
- the display control unit 58 virtually arranges the virtual projector Vpj in the same position and direction as the virtual camera Vc, and projects the viewpoint image acquired in step S110 from the virtual projector Vpj onto the virtual display surface Vd that is virtually installed on the display surface 12 (S112).
- the display image R1 is a distorted image when viewed from a position other than the current viewpoint position (user's eye position), but is perceived as a normal image when viewed from the current viewpoint position.
- FIG. 11 only shows the processing for one viewpoint
- the display control unit 58 performs the above processing for the two viewpoints of the left eye and the right eye, and acquires the display image for the left eye and the display image for the right eye. Then, the display control unit 58 causes the display surface 12 to display the combination of the display image for the left eye and the display image for the right eye so that the user perceives it as a stereoscopic image (S116).
- steps S108 to S116 may be performed repeatedly as shown in FIG. 9.
- the flowchart shown in FIG. 9 is an example, and the operation of the present embodiment is not limited to this example.
- the entire processing of steps S102 to S116 may be performed repeatedly, and in such a case, the display control unit 58 can perform display control according to changes in the angle between the horizontal plane and the display surface 12.
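- The flow of steps S102 to S116 can be summarized as the per-frame loop sketched below; this is a schematic outline only, and every callable passed in (read_display_info, build_space, detect_eyes, render_viewpoint, project_onto_display, show_stereo) is an assumed placeholder supplied by the caller, not an interface of the information processing apparatus 1.

```python
def display_control_loop(read_display_info, build_space, detect_eyes,
                         render_viewpoint, project_onto_display, show_stereo,
                         frames=None):
    """Schematic per-frame display control flow (cf. steps S102 to S116)."""
    tilt_deg, length_m = read_display_info()         # S102: display surface information
    drawing_space = build_space(tilt_deg, length_m)  # S104/S106: set the drawing space and
                                                     # place the floor, walls, and 3D objects

    frame = 0
    while frames is None or frame < frames:
        left_eye, right_eye, posture = detect_eyes()  # S108: eye positions and posture

        images = []
        for eye in (left_eye, right_eye):
            # S110: render the drawing space from a virtual camera placed at the eye position.
            view = render_viewpoint(drawing_space, eye, posture)
            # S112: project the viewpoint image from a virtual projector (same pose as the
            # virtual camera) onto the virtual display surface set on the display surface 12.
            images.append(project_onto_display(view, eye, drawing_space))

        # S116: present the left-eye/right-eye display images as a stereoscopic image.
        show_stereo(images[0], images[1])
        frame += 1
```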
- FIG. 12 is a diagram showing an example of the configuration of a surgical robot system.
- FIG. 12 is a diagram showing an example of a configuration of a surgical robot system according to this embodiment.
- the surgical robot system of this embodiment includes an information processing device, a display device and an input device included in the master device, and a medical imaging device and an end effector included in the slave device.
- the display device is a spatial reproduction display that stereoscopically displays surgical images captured by a medical imaging device in a virtual space.
- the information processing device preferably includes an FPGA for driving a spatial reproduction display, which is a display device.
- a separate information processing device for controlling the display device, different from the information processing device (first information processing device) that controls the master device, may further be provided, or the information processing device that controls the display device may be integrated with the display device.
- FIG. 13 is a diagram showing an example of a synthesized image.
- the information processing device hybrid-synthesizes a camera 3D image acquired from a medical imaging device and a spatially reproduced CG image from a viewpoint position stored in advance to generate a hybrid synthesized image in 3D space (hereinafter referred to as a synthesized image).
- FIG. 14 is a diagram showing an example of a synthesized image.
- the information processing apparatus causes the display device to display a synthesized image obtained by synthesizing a camera 3D image acquired from a medical imaging apparatus and a spatial CG for giving a greater sense of spatial reproduction.
- Spatial CG is an image different from the camera 3D image, and consists of a base unit that gives the user a sense of depth, reference information parts that indicate reference information such as intraoperative information (for example, vital signs and a timer) and preoperative information stored in advance (for example, X-ray images and MRI images), and interactive parts, such as blood vessel CG, that can be arranged and rotated interactively.
- the base unit is preferably a spatial CG configured so that the user feels as if there is a vanishing point in the depth direction of the screen.
- FIG. 15 is a diagram showing an example of space CG.
- a spatial CG consists of a base unit, reference information parts, and interactive parts.
- the base unit is the part that does not change when the camera 3D image changes.
- Reference information parts are parts that display reference information such as vital signs.
- Interactive parts are parts that can be interactively displayed based on user control or the like.
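- The layered make-up of the composite image described above (base unit, camera 3D image, reference information parts, interactive parts) could be modeled as in the sketch below; the depth values, opacity values, and back-to-front draw order are illustrative assumptions rather than parameters from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    name: str
    image: object          # e.g., a rendered RGBA buffer
    depth: float           # virtual distance from the user (larger = farther)
    opacity: float = 1.0

@dataclass
class CompositeImage:
    layers: List[Layer] = field(default_factory=list)

    def add(self, layer: Layer):
        self.layers.append(layer)

    def draw_order(self):
        # Far layers first so that nearer layers are drawn on top of them.
        return sorted(self.layers, key=lambda l: l.depth, reverse=True)

composite = CompositeImage()
composite.add(Layer("base unit (spatial CG with vanishing point)", image=None, depth=3.0))
composite.add(Layer("camera 3D image (surgical video)",            image=None, depth=2.0))
composite.add(Layer("interactive parts (e.g. blood vessel CG)",    image=None, depth=1.5))
composite.add(Layer("reference info parts (vitals, X-ray, MRI)",   image=None, depth=1.0, opacity=0.6))

for layer in composite.draw_order():
    print(layer.name)
```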
- FIG. 16 is a diagram showing the difference in synthesized images depending on viewpoint positions.
- the figure assumes a case in which the synthesized image is virtually captured from diagonally upper left, diagonally above, and the front while the user faces the display device.
- FIG. 17 is a diagram showing an example of a synthesized image.
- a white polygonal line is displayed so that the user feels as if a right-angled corner exists in the front direction of the screen.
- the polygonal line may be a color other than white.
- the polygonal line is displayed so that it follows the movement of the user's eyes and face. As a result, the user can perceive objects on the spatial reproduction display as if they existed closer to the front of the screen.
- in the example shown in FIG. 17, two polygonal lines are displayed.
- FIG. 18 is a diagram showing an example of a synthesized image.
- the left diagram in FIG. 18 is an example in which the camera image is displayed facing the user's viewpoint position
- the right diagram in FIG. 18 is an example in which the 3D image is displayed fixedly regardless of the user's viewpoint position.
- the camera 3D image and space CG may always follow the user's viewpoint position and face position and may be displayed facing the user, or one of the images may be fixed and displayed.
- FIG. 19 is a diagram showing an example of a synthesized image.
- the space CG may be displayed superimposed on the camera 3D image so that the user feels that it is arranged in the front direction of the screen.
- the planar image is displayed on the near side and the stereoscopic image is displayed on the far side, so that the user can feel a more three-dimensional effect.
- preoperative information and intraoperative information are preferably displayed on the near side. As a result, the user can perform surgery while keeping preoperative information and intraoperative information in sight at all times.
- reference information parts, including intraoperative information and timers displayed at the bottom of the screen, may be displayed solidly (opaquely) only when the user's gaze overlaps the reference information parts, and may be displayed translucently otherwise.
- FIG. 20 is a diagram showing an example of a synthesized image.
- the composite image is preferably displayed by arranging surgical images (camera 3D images), preoperative information and intraoperative information (reference information parts), and CG models (interactive parts) in a virtual space on a spatial image (base unit).
- a CG model is, for example, a 3D model generated from preoperative information.
- FIG. 21 is a diagram showing an example of a synthesized image.
- the synthesized image may have a layer structure.
- when a plurality of X-ray images are displayed as preoperative information (reference information parts), it is preferable to display the second X-ray image in the virtual space in front of the first X-ray image. Older images may be displayed behind the surgical video. As a result, it is possible to prevent the surgical image from being obscured by reference information parts that the user has already confirmed.
- FIG. 22 is a diagram showing an example of a synthesized image.
- the display of the composite image may be changed based on the priority of each layer based on the angle at which the user views the screen or the line of sight.
- the surgical image is a base layer with a high priority that can be seen at any angle or line-of-sight position.
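- One way the angle- and priority-dependent visibility just described could be expressed is sketched below; the priority numbers and angle thresholds are illustrative assumptions, with the surgical image treated as a base layer that stays visible at any angle.

```python
def visible_layers(layers, viewing_angle_deg):
    """layers: list of (name, priority); priority 0 is the base layer (always visible).

    Higher-priority layers (smaller numbers) tolerate larger viewing angles;
    the 60°/15° figures are arbitrary illustrative thresholds.
    """
    return [name for name, priority in layers
            if priority == 0 or viewing_angle_deg <= 60.0 - 15.0 * priority]

layers = [("surgical image (base layer)", 0),
          ("reference information parts", 1),
          ("interactive parts", 2)]
print(visible_layers(layers, 10.0))  # all three layers visible
print(visible_layers(layers, 40.0))  # interactive parts hidden at this angle
```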
- FIG. 23 is a diagram showing an example of a synthesized image.
- the composite image may be switched from 3D display to 2D display when the position of the face deviates from a predetermined range.
- the user can always view the synthesized image while avoiding 3D failure due to the position of the face deviating from the predetermined range.
- the preoperative information display method may be changed. For example, it is preferable to increase the transmittance of the preoperative information displayed semitransparently as the position of the face deviates from the predetermined range. As a result, it is possible to avoid a situation in which the preoperative information is superimposed on the surgical image, making it difficult for the user to visually recognize the surgical image.
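- The face-position-dependent behaviour described for FIG. 23 (falling back from 3D to 2D display, and raising the transmittance of the semi-transparent preoperative information as the face leaves the predetermined range) is sketched below; the 0.15 m range, the linear ramp, and the base opacity are assumptions made purely for illustration.

```python
def select_display_mode(face_offset_m, max_offset_m=0.15):
    """Return '3d' while the face stays within the predetermined range, else '2d'."""
    return "3d" if face_offset_m <= max_offset_m else "2d"

def preop_info_opacity(face_offset_m, max_offset_m=0.15, base_opacity=0.6):
    """Reduce the opacity (i.e., raise the transmittance) as the face moves farther away."""
    overshoot = max(0.0, face_offset_m - max_offset_m)
    return max(0.0, base_opacity * (1.0 - overshoot / max_offset_m))

for offset in (0.05, 0.15, 0.20, 0.30):
    print(offset, select_display_mode(offset), round(preop_info_opacity(offset), 2))
```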
- FIG. 24 is a diagram showing an example of a synthesized image. As shown in FIG. 24, the composite image may display operating room information including the state of the operating room as a reference information part.
- FIG. 25 is a diagram showing an example of a synthesized image.
- the left diagram in FIG. 25 shows a case where the angle (parallax) of the synthesized image is reduced, and the right diagram in FIG. 25 shows a case where the angle of the synthesized image is increased.
- by increasing the angle (parallax), the user can perceive the depth direction more strongly.
- FIG. 26 is a diagram showing an example of a synthesized image. As shown in FIG. 26, it is preferable that the virtual display plane of the surgical image (base layer) of the synthesized image (stereoscopic image) is displayed parallel to the angle at which the user looks into the display plane.
- FIG. 27 is a diagram showing an example of a synthesized image. As shown in FIG. 27, when the CG image is superimposed on the surgical video, it is preferable to place the CG image at a predetermined position on the surgical video while moving the CG image. This makes it easier for the user to grasp at which position of the surgical video the CG image is matched and superimposed in the depth direction.
- FIG. 28 is a diagram showing another configuration example of a surgical robot system that displays a CG image on a composite image in conjunction with an input device.
- the surgical robot system of the present embodiment displays the user's hand and finger movements on the display device in conjunction with the output of an input device capable of detecting the user's hand and finger movements, thereby giving the user a stronger sense of directly interacting with an object in the virtual space.
- CG showing the movement of the user's hand or fingers may be superimposed on the area where the end effector is shown.
- Voice recognition and line-of-sight recognition may be added as input devices.
- FIG. 30 is a diagram showing an example of a synthesized image.
- a CG image may be synthesized with the synthesized image.
- a CG image showing hand movement or finger movement may be superimposed on the screen, and the CG image may be fixed or changed in conjunction with user input.
- herein, the term "system" refers to an entire apparatus composed of a plurality of devices.
- (1) A surgical robot system comprising: a spatial reproduction display capable of displaying a surgical image visually recognized as stereoscopic in a three-dimensional space formed by a plane passing through the lower side of the display surface of a monitor and the normal plane of the installation surface and a plane passing through the upper side of the display surface and a plane parallel to the installation surface; an imaging device that captures the surgical image; an operation device connected to the spatial reproduction display; and a surgical robot that is operated by an operator's operation via the operation device.
- the surgical robot system according to (1) above wherein the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, and a base unit that enhances the stereoscopic effect of the surgical image.
- the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, a base unit that enhances the stereoscopic effect of the surgical image, and interactive parts that can be interactively arranged, moved, or rotated based on a user's operation.
- 1 information processing device, 10 display unit, 12 display surface, 20 display surface detection unit, 30 user detection unit, 40 storage unit, 50 control unit, 52 display surface information acquisition unit, 54 user information acquisition unit, 56 content acquisition unit, 58 display control unit, 92 display surface, 100 master device, 101 information processing device, 122 lower end, 124 upper end, 200 input device, 300 support base, 400 display device, 500 slave device, 501 information processing device, 600 medical device, 700 arm
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Robotics (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Processing Or Creating Images (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
1. Surgical robot system
2. Spatial reproduction display
3. Surgical images realized by the spatial reproduction display
FIG. 1 is a diagram showing the configuration of a surgical robot system according to an embodiment of the present technology.
Next, the spatial reproduction display, which is a stereoscopic display applied as the display device of the surgical robot system, will be described with reference to FIGS. 2 to 9.
L11=L1×cosθ1 ・・・(1)
L21=L1×sinθ1 ・・・(2)
Next, surgical images realized by the spatial reproduction display will be described with reference to FIGS. 12 to 30.
(1)
A surgical robot system comprising:
a spatial reproduction display capable of displaying a surgical image that is visually recognized as stereoscopic in a three-dimensional space formed by a plane passing through the lower side of the display surface of a monitor and the normal plane of the installation surface and a plane passing through the upper side of the display surface and a plane parallel to the installation surface;
an imaging device that captures the surgical image;
an operation device connected to the spatial reproduction display; and
a surgical robot that operates in response to an operator's operation via the operation device.
(2)
The surgical robot system according to (1) above, wherein the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, and a base unit that enhances the stereoscopic effect of the surgical image.
(3)
The surgical robot system according to (1) or (2) above, wherein the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, a base unit that enhances the stereoscopic effect of the surgical image, and interactive parts that can be interactively arranged, moved, or rotated based on a user's operation.
(4)
The surgical robot system according to any one of (1) to (3) above, wherein the spatial reproduction display displays a composite image in which reference information indicating preoperative information or intraoperative information is arranged in the virtual space so as to appear in front of the surgical image.
(5)
The surgical robot system according to any one of (1) to (4) above, wherein the spatial reproduction display displays a composite image in which reference information indicating preoperative information or intraoperative information is arranged in the virtual space so as to appear behind the surgical image.
Claims (5)
- 1. A surgical robot system comprising: a spatial reproduction display capable of displaying a surgical image that is visually recognized as stereoscopic in a three-dimensional space formed by a plane passing through the lower side of the display surface of a monitor and the normal plane of the installation surface and a plane passing through the upper side of the display surface and a plane parallel to the installation surface; an imaging device that captures the surgical image; an operation device connected to the spatial reproduction display; and a surgical robot that operates in response to an operator's operation via the operation device.
- 2. The surgical robot system according to claim 1, wherein the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, and a base unit that enhances the stereoscopic effect of the surgical image.
- 3. The surgical robot system according to claim 1, wherein the spatial reproduction display displays a composite image obtained by synthesizing the surgical image, reference information indicating preoperative information or intraoperative information, a base unit that enhances the stereoscopic effect of the surgical image, and interactive parts that can be interactively arranged, moved, or rotated based on a user's operation.
- 4. The surgical robot system according to claim 1, wherein the spatial reproduction display displays a composite image in which reference information indicating preoperative information or intraoperative information is arranged in the virtual space so as to appear in front of the surgical image.
- 5. The surgical robot system according to claim 1, wherein the spatial reproduction display displays a composite image in which reference information indicating preoperative information or intraoperative information is arranged in the virtual space so as to appear behind the surgical image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202380017168.0A CN118541106A (zh) | 2022-01-21 | 2023-01-06 | 外科手术机器人系统 |
JP2023575192A JPWO2023140120A1 (ja) | 2022-01-21 | 2023-01-06 | |
EP23743116.8A EP4467095A1 (en) | 2022-01-21 | 2023-01-06 | Surgical robot system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-007631 | 2022-01-21 | ||
JP2022007631 | 2022-01-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023140120A1 true WO2023140120A1 (ja) | 2023-07-27 |
Family
ID=87348700
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/000118 WO2023140120A1 (ja) | 2022-01-21 | 2023-01-06 | 手術ロボットシステム |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4467095A1 (ja) |
JP (1) | JPWO2023140120A1 (ja) |
CN (1) | CN118541106A (ja) |
WO (1) | WO2023140120A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010525838A (ja) * | 2007-04-16 | 2010-07-29 | ニューロアーム サージカル リミテッド | マニピュレータのツールの一軸に沿った移動を非機械的に制限および/またはプログラミングする方法、装置、およびシステム |
JP2015019679A (ja) * | 2013-07-16 | 2015-02-02 | セイコーエプソン株式会社 | 情報処理装置、情報処理方法、および、情報処理システム |
JP2019097890A (ja) | 2017-12-01 | 2019-06-24 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用撮像装置 |
JP2019531117A (ja) * | 2016-10-03 | 2019-10-31 | バーブ サージカル インコーポレイテッドVerb Surgical Inc. | ロボット手術のための没入型三次元表示 |
JP2020162916A (ja) * | 2019-03-29 | 2020-10-08 | ソニー株式会社 | 制御装置及びマスタスレーブシステム |
JP2021177580A (ja) * | 2018-07-18 | 2021-11-11 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
-
2023
- 2023-01-06 CN CN202380017168.0A patent/CN118541106A/zh active Pending
- 2023-01-06 WO PCT/JP2023/000118 patent/WO2023140120A1/ja active Application Filing
- 2023-01-06 JP JP2023575192A patent/JPWO2023140120A1/ja active Pending
- 2023-01-06 EP EP23743116.8A patent/EP4467095A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010525838A (ja) * | 2007-04-16 | 2010-07-29 | ニューロアーム サージカル リミテッド | マニピュレータのツールの一軸に沿った移動を非機械的に制限および/またはプログラミングする方法、装置、およびシステム |
JP2015019679A (ja) * | 2013-07-16 | 2015-02-02 | セイコーエプソン株式会社 | 情報処理装置、情報処理方法、および、情報処理システム |
JP2019531117A (ja) * | 2016-10-03 | 2019-10-31 | バーブ サージカル インコーポレイテッドVerb Surgical Inc. | ロボット手術のための没入型三次元表示 |
JP2019097890A (ja) | 2017-12-01 | 2019-06-24 | ソニー・オリンパスメディカルソリューションズ株式会社 | 医療用撮像装置 |
JP2021177580A (ja) * | 2018-07-18 | 2021-11-11 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2020162916A (ja) * | 2019-03-29 | 2020-10-08 | ソニー株式会社 | 制御装置及びマスタスレーブシステム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023140120A1 (ja) | 2023-07-27 |
CN118541106A (zh) | 2024-08-23 |
EP4467095A1 (en) | 2024-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9723300B2 (en) | Stereoscopic display | |
EP3293972B1 (en) | Three-dimensional observation device for medical use, three-dimensional observation method for medical use, and program | |
JP3717653B2 (ja) | 頭部搭載型画像表示装置 | |
JP6305187B2 (ja) | 手術顕微鏡システム | |
US20140063198A1 (en) | Changing perspectives of a microscopic-image device based on a viewer' s perspective | |
US9210407B2 (en) | Image processing apparatus and method, and program | |
Breedveld et al. | Observation in laparoscopic surgery: overview of impeding effects and supporting aids | |
JPH09238369A (ja) | 3次元像表示装置 | |
JP2011164781A (ja) | 立体画像生成プログラム、情報記憶媒体、立体画像生成装置、及び立体画像生成方法 | |
JP2012230717A (ja) | 画像生成方法、画像生成プログラム及び画像投影装置 | |
JP2011085830A (ja) | 映像表示システム | |
JP2011205195A (ja) | 画像処理装置、プログラム、画像処理方法、椅子および観賞システム | |
JP4537916B2 (ja) | 医療用立体観察システム | |
JP2005312605A (ja) | 注視点位置表示装置 | |
JP5467683B2 (ja) | 立体映像表示装置の立体ノギス像形成装置およびプログラム | |
US20100259820A1 (en) | Stereoscopic image display | |
JP6618260B2 (ja) | 情報処理装置、情報処理方法、プログラム | |
WO2023140120A1 (ja) | 手術ロボットシステム | |
JP3425402B2 (ja) | 立体画像を表示する装置および方法 | |
US20250057617A1 (en) | Surgical robot system | |
JP4455419B2 (ja) | 手術用立体画像観察装置 | |
JP5582958B2 (ja) | 3dポインター装置 | |
US10330945B2 (en) | Medical image display apparatus, medical information processing system, and medical image display control method | |
JP4634863B2 (ja) | 立体視画像生成装置及び立体視画像生成プログラム | |
JP5331785B2 (ja) | 立体画像分析装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23743116 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023575192 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18724202 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202380017168.0 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023743116 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2023743116 Country of ref document: EP Effective date: 20240821 |