TECHNICAL FIELD
The following generally relates to an ultrasound imaging system.
BACKGROUND
An ultrasound (US) imaging system has included an ultrasound probe with a transducer, a console with an internal or external display, and a keyboard. The transducer transmits an ultrasound signal into a field of view and receives echoes produced in response to the signal interacting with structure therein. The echoes are processed by the console, which generates images indicative of the structure that are visually presented in a display region of the display.
An example of a suitable display region includes a screen (e.g., LCD, CRT, etc.) with a cover lens (e.g., made of glass). The cover lens provides a surface that is relatively easy to clean, and shattering of the cover lens can be mitigated through a bonding material applied between the cover lens and the screen. However, the cover lens adds at least two surface transitions, an air to cover lens transition and a cover lens to air transition. Unfortunately, both the air to cover lens transition and the cover lens to air transition produce reflections, which may deteriorate the optical perception of the visually presented image. Furthermore, where the screen is a touch screen display, the distance between the screen and the cover lens may result in a parallax that decreases the accuracy of activation of individual buttons on the touch screen display.
An example of a suitable keyboard includes a keyboard with a coherent, flat surface, without any holes, for example, a "touch screen" display with a cover lens. Likewise, the cover lens provides a surface that is relatively easy to clean. Unfortunately, the distance between the cover lens and the screen may deteriorate the optical perception of the visually presented image and may result in a parallax that decreases the accuracy of activation of the individual buttons on the "touch screen" display. Moreover, with such a keyboard, it may not be easy for a clinician to navigate an image while observing it, as the clinician must look away from the image in the display region and look at the keyboard to locate and activate the touch screen controls.
SUMMARY
Aspects of the application address the above matters, and others.
In one aspect, an ultrasound imaging system includes a probe with a transducer array with at least one transducer element that transmits ultrasound signals and receives echo signals produced in response thereto. The system further includes a console with a controller that controls the at least one transducer element to transmit the ultrasound signals and receive the echo signals, and an echo processor that processes the received echoes and generates images indicative thereof. The system further includes a user interface with at least one control for interacting with the console. The user interface includes at least one recessed physical feature that facilitates identifying, through sense of touch, an operation activated by the at least one control.
In another aspect, an ultrasound imaging system includes a probe with a transducer array with at least one transducer element that transmits ultrasound signals and receives echo signals produced in response thereto. The system further includes a console with a controller that controls the at least one transducer element to transmit the ultrasound signals and receive the echo signals, and an echo processor that processes the received echoes and generates images indicative thereof. The system further includes a display region with a screen, a cover lens and an optical coupling therebetween. A refractive index of the optical coupling is approximately the same as a refractive index of the cover lens and a refractive index of the screen.
In another aspect, a method includes providing an ultrasound console, interfacing a user interface with the ultrasound console, interfacing a display region with the ultrasound console, interfacing a transducer probe with the ultrasound console, and using the transducer probe to scan an object or subject under control of the console, wherein an operator of the system controls at least one operation via the user interface and generated images are displayed via the display region. The user interface includes at least one control for interacting with the console, wherein the user interface includes at least one recessed physical feature that facilitates identifying, through sense of touch, an operation activated by the at least one control, and the display region includes a screen, a cover lens and an optical coupling therebetween, wherein a refractive index of the optical coupling is approximately the same as a refractive index of the cover lens and a refractive index of the screen.
Those skilled in the art will recognize still other aspects of the present application upon reading and understanding the attached description.
BRIEF DESCRIPTION OF THE DRAWINGS
The application is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
FIG. 1 schematically illustrates an example imaging system, including a user interface and a display region;
FIG. 2 illustrates an example of the user interface, which includes at least one control of FIG. 1;
FIG. 3 illustrates an example of a first recessed control of the user interface of FIG. 2;
FIG. 4 illustrates an example of a second recessed control of the user interface of FIG. 2;
FIG. 5 illustrates an example of a third recessed control of the user interface of FIG. 2;
FIG. 6 illustrates the user interface of FIG. 2, with the controls of FIGS. 3, 4 and 5;
FIG. 7 illustrates an example of the display region of FIG. 1 with a cover lens optically bonded to a screen and external light traversing the display region;
FIG. 8 illustrates an example of the display region of FIG. 1 with a cover lens optically bonded to a screen with light emitted by the screen and traversing the display region;
FIG. 9 illustrates a variation of the display region of FIG. 7 with the optical bond omitted therefrom;
FIG. 10 illustrates a variation of the display region of FIG. 8 with the optical bond omitted therefrom; and
FIG. 11 illustrates an example method in accordance with the embodiments disclosed herein.
DETAILED DESCRIPTION
FIG. 1 schematically illustrates an ultrasound (US) imaging system 102.
The system 102 includes a probe 104 with a one-dimensional (1D) or two-dimensional (2D) transducer array 106 with at least one transducer element 108. Suitable configurations include, but are not limited to, linear, curved (e.g., concave, convex, etc.), circular, etc.
The system 102 includes an ultrasound scanner console 110 that controls excitation of the probe 104, receives and processes ultrasound data from the probe 104, and generates images to display.
The system 102 includes a user interface 112 with at least one control for interacting with the console 110. As described in greater detail below, in one non-limiting instance, the user interface 112 includes at least one recessed physical feature that facilitates identifying, through sense of touch (or haptics), an operation activated by the at least one control.
The system 102 includes at least one display region 114 that displays images generated by the console 110. As described in greater detail below, in one non-limiting instance, the display region 114 includes a screen with a cover lens optically bonded thereto, which may improve image quality relative to a configuration in which the optical bonding is omitted.
FIG. 2 illustrates a top down view of an example of the user interface 112.
The illustrated user interface 112 includes a flat touch panel 202, in which predetermined regions thereof evoke actions in response to being actuated by simple or multi-touch gestures on the panel with one or more fingers, a stylus, a gloved hand, etc. Suitable touchscreen panels include, but are not limited to, resistive, projected capacitive, surface acoustic wave, infrared, optical, and piezoelectric panels.
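By way of a non-limiting illustration only, the mapping from a touch coordinate reported by the touch panel 202 to one of the predetermined regions can be sketched as a simple hit test. The control names, coordinates, and dimensions below are hypothetical and are not taken from the figures; an actual console would derive the regions from its own panel layout.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ControlRegion:
    """A rectangular region of the touch panel associated with one control."""
    name: str
    x: float       # left edge on the panel (illustrative units, e.g., mm)
    y: float       # top edge on the panel
    width: float
    height: float

    def contains(self, tx: float, ty: float) -> bool:
        return (self.x <= tx <= self.x + self.width
                and self.y <= ty <= self.y + self.height)

# Hypothetical layout loosely following the sets of controls 206 of FIG. 2.
REGIONS: List[ControlRegion] = [
    ControlRegion("touch_pad_208_1", 10.0, 10.0, 60.0, 40.0),
    ControlRegion("rotary_210_1", 80.0, 10.0, 30.0, 30.0),
    ControlRegion("button_212_1", 120.0, 10.0, 15.0, 15.0),
]

def hit_test(tx: float, ty: float) -> Optional[str]:
    """Return the name of the control under the touch point, if any."""
    for region in REGIONS:
        if region.contains(tx, ty):
            return region.name
    return None

# A touch at (15, 20) resolves to the touch pad region.
assert hit_test(15.0, 20.0) == "touch_pad_208_1"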
The touchscreen can be, but is not limited to, a screen (e.g., liquid crystal display (LCD), thin film transistor liquid crystal display (TFT LCD), organic light-emitting diode (OLED), etc.) with a cover lens (e.g., made of glass) optically bonded thereto, which may mitigate parallax relative to a configuration in which the optical bonding is omitted.
The illustrated touch panel 202 further includes a plurality of touch sensitive controls 204. In the illustrated example, the touch sensitive controls 204 include N sets of controls 206 1, 206 2, . . . , 206 N, collectively referred to herein as sets of controls 206, where N is an integer equal to or greater than one.
The first set of controls 206 1 includes M controls 208 1, . . . , 208 M (collectively referred to as controls 208), where M is an integer equal to or greater than one. The second set of controls 206 2 includes L controls 210 1, . . . , 210 L (collectively referred to as controls 210), where L is an integer equal to or greater than one. The third set of controls 206 N includes K controls 212 1, . . . , 212 K (collectively referred to as controls 212), where K is an integer equal to or greater than one.
As described herein, the user interface 112 includes at least one recessed physical feature that facilitates identifying, through sense of touch, an operation activated by the N sets of controls 206 1, 206 2, . . . , 206 N. With respect to FIG. 2, the user interface 112 includes a recessed physical feature in connection with each of the touch sensitive controls 204. This is shown in greater detail in FIGS. 3, 4 and 5 respectively in connection with the N sets of controls 206 1, 206 2, . . . , 206 N.
Initially referring to FIG. 3, a cross sectional view of the touch panel 202 and the control 208 1 along line A-A of FIG. 2 is illustrated.
The touch panel 202 has a thickness 302 and a major surface 304. The control 208 1 has a thickness 306 (which is less than the thickness 302 of the touch panel 202), a flat recess surface 308 and a diameter 310, and is located in a recess 312 in the major surface 304. A transition surface 314 extends from the major surface 304 into the touch panel 202 to the flat recess surface 308, forming the recess 312.
With the configuration of FIG. 3, the at least one recessed physical feature includes the transition surface 314 and the flat recess surface 308. The transition surface 314 facilitates identifying a location of the control 208 1, and the flat recess surface 308 identifies the control 208 1. For example, in the illustrated embodiment, the transition surface 314 and the flat recess surface 308 facilitate identifying the control 208 1 as a touch pad area. As discussed herein, the touch pad area can be used to control a cursor displayed in the display region 114, for example, cursor movement.
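As one non-limiting sketch of how such a touch pad area could drive cursor movement, the relative finger motion on the recessed surface can be scaled into cursor displacement on the display region 114. The class name, gain value, and units below are assumptions for illustration and are not part of the disclosure.

from typing import Optional, Tuple

class TouchPadCursor:
    """Translates relative finger motion on the recessed touch pad into cursor motion."""

    def __init__(self, screen_w: int, screen_h: int, gain: float = 2.0):
        self.screen_w = screen_w                      # display width in pixels
        self.screen_h = screen_h                      # display height in pixels
        self.gain = gain                              # panel-to-cursor scaling factor
        self.cursor = (screen_w // 2, screen_h // 2)  # start at the screen center
        self._last: Optional[Tuple[float, float]] = None

    def on_touch(self, x: float, y: float) -> Tuple[int, int]:
        """Update the cursor from one touch sample (panel coordinates)."""
        if self._last is not None:
            dx, dy = x - self._last[0], y - self._last[1]
            cx = min(max(self.cursor[0] + self.gain * dx, 0), self.screen_w - 1)
            cy = min(max(self.cursor[1] + self.gain * dy, 0), self.screen_h - 1)
            self.cursor = (int(cx), int(cy))
        self._last = (x, y)
        return self.cursor

    def on_release(self) -> None:
        """Forget the last sample so the next touch does not jump the cursor."""
        self._last = None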
Turning to FIG. 4, a cross sectional view of the touch panel 202 and the control 210 1 along line B-B of FIG. 2 is illustrated. Again, the touch panel 202 has the thickness 302 and the major surface 304.
The control 210 1 includes a first recess 402, which is located in the major surface 304, and a second recess 404, which is located in the first recess 402. The first recess 402 has a first thickness 406 (which is less than the thickness 302 of the touch panel 202), a first recess surface 408, and a first protrusion 410. The second recess 404 has a second thickness 412 (which is less than the first thickness 406 of the first recess 402) and a second recess surface 414.
A first transition surface 416 extends from the major surface 304 into the touch panel 202 to the first recess surface 408, forming the first recess 402. The protrusion 410 is located in the first recess 402, spaced apart from the first transition surface 416 by a non-zero distance. A second transition surface 418 extends from the first recess 402 further into the touch panel 202 to the second recess surface 414, forming the second recess 404.
With the configuration of FIG. 4, the at least one recessed physical feature includes the first transition 416 and the protrusion 410, which facilitate identifying a location of a first sub-control of the control 210 1 and, in particular, a rotary or other control, which is located in the first recess 402, between the first transition 416 and the protrusion 410. The rotary control is actuated by sliding an object on the first recess surface 408.
The at least one recessed physical feature further includes the second transition 418, which facilitates identifying a location of a second sub-control of the control 210 1 and, in particular, a push button or other control, which is located in the second recess 404, within the second transition 418. The push button control is actuated by pushing down on the second recess surface 414.
The illustrated control 210 1 is a multi-function control that includes two sub-controls; however, in another embodiment, the control 210 1 includes more than two sub-controls.
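As a non-limiting sketch of how sliding an object on the first recess surface 408 could be interpreted as a rotary adjustment, the angular position of the touch about the center of the recess can be accumulated into discrete detents. The center coordinates, detent angle, and class name below are assumptions for illustration only.

import math
from typing import Optional

class RotaryControl:
    """Interprets a sliding gesture in an annular recess as incremental rotation."""

    def __init__(self, cx: float, cy: float, degrees_per_step: float = 15.0):
        self.cx, self.cy = cx, cy                  # center of the recess (panel units)
        self.degrees_per_step = degrees_per_step   # rotation required per detent
        self._last_angle: Optional[float] = None
        self._accum = 0.0

    def on_touch(self, x: float, y: float) -> int:
        """Return the number of detents (positive or negative) from this sample."""
        angle = math.degrees(math.atan2(y - self.cy, x - self.cx))
        steps = 0
        if self._last_angle is not None:
            # Wrap the angular change into the range [-180, 180) degrees.
            delta = (angle - self._last_angle + 180.0) % 360.0 - 180.0
            self._accum += delta
            steps = int(self._accum / self.degrees_per_step)
            self._accum -= steps * self.degrees_per_step
        self._last_angle = angle
        return steps

    def on_release(self) -> None:
        self._last_angle = None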
Next at FIG. 5, a cross sectional view of the touch panel 202 and the control 212 1 along line C-C of FIG. 2 is illustrated. Again, the touch panel 202 has the thickness 302 and the major surface 304.
The control 212 1 includes a first recess 502, which is located in the major surface 304, and a second recess 504, which is located in the first recess 502. The first recess 502 has a first thickness 506 (which is less than the thickness 302 of the touch panel 202) and a first recess surface 508, and the second recess 504 has a second thickness 510 (which is less than the first thickness 506 of the first recess 502) and a second recess surface 512.
A first transition surface 514 extends from the major surface 304 into the touch panel 202 to the first recess surface 508, forming the first recess 502. A second transition surface 516 extends from the first recess 502 further into the touch panel 202 to the second recess surface 512, forming the second recess 504.
With the configuration of FIG. 5, the at least one recessed physical feature includes the first transition 514, the first recess surface 508, the second transition 516, and the second recess surface 512 of the second recess 504, which facilitate identifying a location of the control 212 1 and, in particular, a push button or other control, which is located in the second recess 504, within the second transition 516. The push button control is actuated by pushing down on the second recess surface 512.
FIG. 6 illustrates the top down view of FIG. 2, in which the N sets of controls 206 1, 206 2, . . . , 206 N are configured as discussed in connection with FIGS. 3, 4 and 5. However, it is to be appreciated that the sets of controls 206 1, 206 2, . . . , 206 N can include more or fewer controls and/or similar or different controls. Furthermore, the illustrated geometry of the controls is non-limiting. For example, other shapes (e.g., elliptical, rectangular, etc.) and/or relative sizes are contemplated herein.
Moreover, the N sets of controls 206 1, 206 2, . . . , 206 N may include textured surfaces. For example, the rotary or other control of the set of controls 206 2 may include roughness, which facilitates sliding a finger along the surface and mitigates having the finger stick to the surface. Likewise, one or more of the other sets of controls 206 1, 206 2, . . . , 206 N may or may not have roughness.
FIGS. 7 and 8 illustrate an example of the display region 114 of the console 110 and/or an example of the user interface 112.
The display region 114 includes a screen 704 with a cover lens 702. The screen 704 and the cover lens 702 are coupled via an optical coupling 706. The optical coupling 706 has a refractive index that substantially matches a refractive index of the screen 704 and a refractive index of the cover lens 702. For example, where the refractive index of the screen 704 is approximately 1.5 and the refractive index of the cover lens 702 is approximately 1.5, a suitable refractive index of the optical coupling 706 is approximately 1.5.
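For context, the benefit of matching the refractive indices can be estimated with the standard Fresnel reflectance at normal incidence, a general optics relation not specific to this disclosure, where n_1 and n_2 are the refractive indices on either side of an interface:

R = \left( \frac{n_1 - n_2}{n_1 + n_2} \right)^2

R_{\text{bonded}} \approx \left( \frac{1.5 - 1.5}{1.5 + 1.5} \right)^2 = 0

With the optical coupling 706 matched to the cover lens 702 and the screen 704 (all approximately 1.5 in the example above), essentially no light is reflected at the interfaces 710 and 712.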
The screen 704 can be an LCD, a TFT-LCD, an OLED, and/or other screen. The optical coupling 706 can be a liquid, a gas, a gel, a glue (e.g., silicone or other), a laminate, a plastic, a foil, and/or other material with suitable optical properties, that is, a refractive index that matches the refractive index of the screen 704 and/or the cover lens 702.
As shown in FIG. 7, external light 708 traverses the cover lens 702/optical coupling 706 interface 710 and the optical coupling 706/screen 704 interface 712 with little or no reflection. As shown in FIG. 8, light 802 emitted by the screen 704 traverses the screen 704/optical coupling 706 interface 712 and the optical coupling 706/cover lens 702 interface 710 with little or no reflection.
As a result, features visually displayed on the screen 704 beneath the cover lens 702 are seen by the user through the cover lens 702 directly over their actual location on the screen 704. This is in contrast to a configuration in which the optical coupling 706 is omitted and air resides between the screen 704 and the cover lens 702, where the features seen by the user through the cover lens 702 are shifted (parallax) from their actual location on the screen 704.
FIGS. 9 and 10 show variations in which the optical coupling 706 is omitted. In these variations, air 900 is located between the cover lens 702 and the screen 704. As shown, external light 902 refracts and reflects at the cover lens 702/air 900 interface 904 and at the air 900/screen 704 interface 906. Likewise, emitted light 1002 refracts and reflects at the screen 704/air 900 interface 906 and at the air 900/cover lens 702 interface 904.
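For comparison, applying the same normal-incidence Fresnel estimate to the unbonded configuration of FIGS. 9 and 10, with an illustrative glass index of approximately 1.5 and air at approximately 1.0:

R_{\text{air gap}} = \left( \frac{1.5 - 1.0}{1.5 + 1.0} \right)^2 = \left( \frac{0.5}{2.5} \right)^2 = 0.04

That is, roughly 4% of the light is reflected at each of the interfaces 904 and 906, in addition to the refraction shown in FIGS. 9 and 10.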
FIG. 11 illustrates a method in accordance with the embodiments disclosed herein.
It is to be appreciated that the order of the following acts is provided for explanatory purposes and is not limiting. As such, one or more of the following acts may occur in a different order. Furthermore, one or more of the following acts may be omitted and/or one or more additional acts may be added.
At 1102, an ultrasound console is provided.
At 1104, a user interface is interfaced with the ultrasound console. As discussed herein, the user interface includes at least one control for interacting with the console, wherein the user interface includes at least one recessed physical feature that facilitates identifying, through sense of touch, an operation activated by the at least one control.
At 1106, a display region is interfaced with the ultrasound console. As discussed herein, the display region includes a screen, a cover lens and an optical coupling therebetween, wherein a refractive index of the optical coupling is approximately the same as a refractive index of the cover lens and a refractive index of the screen.
At 1108, a transducer probe is interfaced with the ultrasound console.
At 1110, the transducer probe is used to scan an object or subject under control of the console, wherein an operator of the system controls at least one operation via the user interface and generated images are displayed via the display region.
The application has been described with reference to various embodiments. Modifications and alterations will occur to others upon reading the application. It is intended that the invention be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims and the equivalents thereof.