CN116135150A - Ultrasonic imaging equipment and ultrasonic inspection method thereof - Google Patents
- Publication number
- CN116135150A (application CN202111355905.7A)
- Authority
- CN
- China
- Prior art keywords
- section
- ultrasonic
- user
- instruction
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/54—Control of the diagnostic device
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/465—Displaying means of special interest adapted to display user selection data, e.g. icons or menus
- A61B8/467—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
- A61B8/469—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means for selection of a region of interest
Abstract
The invention provides an ultrasonic imaging device and an ultrasonic inspection method thereof. A scanning interface comprising a section selection area, a comment selection area and a measurement selection area is displayed; an instruction for selecting a section is received, the target object is scanned to obtain an ultrasonic image, and the ultrasonic image is displayed; the ultrasound signs associated with the selected section and their comment items are displayed in the comment selection area for selection by the user, the comment items including at least options for the presence and the absence of each ultrasound sign; comment content is obtained according to the comment items selected by the user; the measurement items associated with the selected section are displayed in the measurement selection area for selection by the user; and a measurement result is obtained according to the measurement item selected by the user and the measurement operation. Thus, after the user selects a section and obtains an ultrasonic image by ultrasonic scanning, the scanning interface prompts the doctor with the ultrasound signs the current ultrasonic image may show and provides the corresponding options, the comment content of the ultrasonic image is obtained from the doctor's selections, further measurement can then be performed, and working efficiency is high.
Description
Technical Field
The invention relates to the field of medical equipment, in particular to ultrasonic imaging equipment and an ultrasonic inspection method thereof.
Background
Ultrasound is widely used in hospital ultrasound departments as an intuitive, convenient and noninvasive examination modality. Current ultrasonic imaging equipment is designed for sonographers, who perform ultrasonic examinations full time; they are not only highly experienced and thoroughly familiar with the equipment, but also demand more and better functions from it to meet growing examination needs.
To obtain more diagnostic evidence, clinical departments such as the ICU are also using ultrasonic imaging equipment more and more widely. However, ICU doctors are less skilled in operating such equipment than sonographers, and when faced with equipment that has numerous functions and complicated operation they cannot master it fully, so equipment utilization and working efficiency are low. In addition, ICU patients change rapidly and are critically ill, requiring ICU doctors to make decisions quickly; how to improve the efficiency with which ICU doctors use ultrasonic imaging equipment therefore remains a problem to be solved.
Disclosure of Invention
The invention mainly provides ultrasonic imaging equipment and an ultrasonic inspection method thereof, so as to improve the efficiency of ultrasonic inspection.
An embodiment provides an ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol, and in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of examination sites to be scanned on a target object and, for each examination site, a set of sections to be scanned; and the scanning interface comprises a site selection area for displaying examination sites for selection by the user, a section selection area for displaying sections for selection by the user, a comment selection area for displaying comment content for selection by the user, and a measurement selection area for displaying measurement items for selection by the user;
displaying the plurality of examination sites of the currently activated scanning protocol in the site selection area for selection by the user;
receiving an instruction for selecting an examination site, and in response to the instruction, displaying the plurality of sections of the selected examination site in the section selection area for selection by the user;
receiving an instruction for selecting a section, and determining the currently scanned section; transmitting a first ultrasonic wave to the target object before or after receiving the instruction for selecting a section, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
selecting a standard section image from the first ultrasonic image;
displaying ultrasound signs associated with the selected section in the currently activated scanning protocol in the comment selection area, and displaying comment items of the ultrasound signs for selection by the user; the comment items include at least an option that the ultrasound sign is present and an option that the ultrasound sign is absent;
obtaining comment content according to comment items selected by a user;
displaying measurement items associated with a selected section in a currently activated scanning protocol in the measurement selection area for selection by a user;
obtaining a measurement result according to the measurement item selected by the user and the measurement operation;
associating the selected section with the standard section image, the comment content, and the measurement result.
In the method provided in an embodiment, further comprising:
after an examination site is selected, displaying first teaching instruction information associated with the selected examination site to instruct the user to scan the selected examination site; or,
after a section is selected, displaying second teaching instruction information associated with the selected section to instruct the user to scan the selected section.
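Purely for illustration, and not as part of the claimed method, the associations described above between a scanning protocol, its examination sites, the sections of each site, and the ultrasound signs and measurement items of each section could be modeled with data structures of roughly the following form (all names are hypothetical):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Section:
    name: str                      # e.g. "Apical 4 Chamber"
    ultrasound_signs: List[str]    # signs this section may show, e.g. "Pericardial Effusion"
    measurement_items: List[str]   # measurement items associated with the section

@dataclass
class ExaminationSite:
    name: str                      # e.g. "Heart"
    sections: List[Section]

@dataclass
class ScanningProtocol:
    name: str                      # e.g. "Shock"
    sites: List[ExaminationSite]

# Hypothetical shock protocol with a single site and section
shock_protocol = ScanningProtocol(
    name="Shock",
    sites=[ExaminationSite(
        name="Heart",
        sections=[Section(
            name="Subxiphoid 4 Chamber",
            ultrasound_signs=["Pericardial Effusion"],
            measurement_items=["Pericardial effusion depth"],
        )],
    )],
)
```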
An embodiment provides an ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol, and in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of sections to be scanned on a target object; and the scanning interface comprises a section selection area for displaying sections for selection by the user, a comment selection area for displaying comment content for selection by the user, and a measurement selection area for displaying measurement items for selection by the user;
receiving an instruction for selecting a section, and determining the section currently scanned; transmitting a first ultrasonic wave to a target object before or after receiving an instruction for selecting a section, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
displaying ultrasound signs associated with the selected section in the currently activated scanning protocol in the comment selection area, and displaying comment items of the ultrasound signs for selection by the user; the comment items include at least an option that the ultrasound sign is present and an option that the ultrasound sign is absent;
obtaining comment content according to the comment items selected by the user;
displaying measurement items associated with a selected section in a currently activated scanning protocol in the measurement selection area for selection by a user;
and obtaining a measurement result according to the measurement item selected by the user and the measurement operation.
An embodiment provides an ultrasound inspection method comprising one or more workflows, wherein the workflows comprise:
selecting one section from a plurality of sections included in the workflow; before or after a section is selected, transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
selecting comment content based on the first ultrasonic image;
measuring the first ultrasonic image to obtain a measurement result;
associating the selected section with the first ultrasound image, the comment content, and the measurement result;
displaying the first ultrasound image, the comment content, and the measurement result.
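A minimal sketch of one pass through such a workflow, assuming hypothetical `scanner` and `ui` helpers that stand in for the acquisition and interaction steps described above, might look like:

```python
def run_workflow(sections, scanner, ui):
    """One hypothetical pass over the sections of a workflow (scanner/ui are stand-ins)."""
    results = {}
    for section in sections:                                            # select a section
        image = scanner.acquire_image()                                 # transmit, receive echoes, form the first ultrasound image
        comments = ui.select_comments(section.ultrasound_signs, image)  # comment content chosen by the user
        measurements = ui.measure(section.measurement_items, image)     # measurement results
        results[section.name] = {                                       # associate section with image, comments, measurements
            "image": image,
            "comments": comments,
            "measurements": measurements,
        }
        ui.display(image, comments, measurements)                       # display them
    return results
```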
An embodiment provides a bedside ultrasound examination method comprising one or more workflows, wherein the workflows are adapted for bedside ultrasound examination, the workflows comprising:
Selecting one section from a plurality of sections included in the workflow;
displaying teaching instruction information associated with the selected section to instruct a user to scan the selected section;
transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
selecting comment content based on the first ultrasonic image;
measuring the first ultrasonic image to obtain a measurement result;
associating the selected section with the first ultrasound image, the comment content, and the measurement result.
In the method provided in one embodiment, the selecting a section from a plurality of sections included in a workflow includes:
selecting one examination site from a plurality of examination sites included in the workflow;
selecting a section from a plurality of sections included in the selected examination site.
In the method provided in an embodiment, further comprising:
after the selected examination site, activating an ultrasound probe and/or imaging mode associated with the selected examination site; or,
after a section is selected, activating the ultrasound probe and/or imaging mode associated with the selected section.
In the method provided in one embodiment,
the selecting and obtaining comment content based on the first ultrasonic image comprises the following steps:
displaying ultrasound signs associated with the selected section in the workflow in a comment selection area of a display interface, and displaying comment items of the ultrasound signs for selection by the user; the comment items include at least an option that the ultrasound sign is present and an option that the ultrasound sign is absent;
obtaining comment content according to comment items selected by a user;
the measuring the first ultrasonic image to obtain a measurement result includes:
displaying measurement items associated with the selected section in the workflow in a measurement selection area of a display interface for selection by a user;
and obtaining a measurement result according to the measurement item selected by the user and the measurement operation.
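As an illustration only (the wording of the generated remark is an assumption), the comment content could be assembled from the presence/absence options the user selects for each ultrasound sign:

```python
def build_comment_content(selections):
    """selections maps each ultrasound sign to True (present) or False (absent)."""
    return "; ".join(
        f"{sign}: {'present' if present else 'absent'}"
        for sign, present in selections.items()
    )

# Example for an apical four-chamber section of the heart
print(build_comment_content({
    "Pericardial Effusion": True,
    "Segmental Ventricular Motion Abnormality": False,
}))
# -> Pericardial Effusion: present; Segmental Ventricular Motion Abnormality: absent
```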
In the method provided in an embodiment, further comprising:
receiving an instruction for starting an editing interface, and responding to the instruction, displaying the editing interface for a user to edit the workflow;
editing the workflow according to the operation of the user on the editing interface; the editing includes at least one of the following three types: adding, deleting or modifying the sections included in the workflow; adding, deleting or modifying the ultrasound signs associated with a section; and adding, deleting or modifying the measurement items associated with a section.
In the method provided in an embodiment, further comprising:
receiving an instruction for starting an editing interface, and responding to the instruction, displaying the editing interface for a user to add a new workflow; the editing interface provides at least one of an existing workflow, a section included in the existing workflow, an ultrasonic sign associated with the section of the existing workflow, and a measurement item associated with the section of the existing workflow for selection by a user;
and correlating at least one of the existing workflow selected by the user on the editing interface, the section included in the existing workflow, the ultrasonic sign correlated with the section of the existing workflow and the measurement item correlated with the section of the existing workflow to form a new workflow.
In the method provided in an embodiment, further comprising: obtaining a prompt for the next operation and/or a clinical judgment prompt according to the section for which comment content has currently been obtained and that comment content.
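One way to picture this is a lookup keyed by the section and the signs judged to be present; the entries below are invented placeholders for illustration, not clinical rules taken from the disclosure:

```python
# Invented example rules: (section, sign judged present) -> (next-operation prompt, clinical-judgment prompt)
NEXT_STEP_RULES = {
    ("Subxiphoid 4 Chamber", "Pericardial Effusion"):
        ("Measure effusion depth", "Consider cardiac tamponade"),
    ("Anterior lung zone", "Lung sliding absent"):
        ("Look for a lung point", "Consider pneumothorax"),
}

def get_prompts(section_name, present_signs):
    """Return the prompts triggered by the signs judged present on the current section."""
    return [NEXT_STEP_RULES[(section_name, sign)]
            for sign in present_signs
            if (section_name, sign) in NEXT_STEP_RULES]
```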
An embodiment provides an ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol, and responding to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of a plurality of examination sites required to be scanned by a target object and a set of a plurality of sections required to be scanned by the examination sites; the scanning interface comprises: a section selection area, a comment selection area and a measurement selection area;
Transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
displaying the first ultrasound image in an image display area;
the section selection area is used for displaying a plurality of sections of the currently activated scanning protocol for selection by the user; the comment selection area is used for displaying ultrasound signs associated with the selected section in the currently activated scanning protocol and for displaying comment items of the ultrasound signs for selection by the user; and the measurement selection area is used for displaying measurement items associated with the selected section in the currently activated scanning protocol for selection by the user.
An embodiment provides an ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol, and in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of sections to be scanned on a target object; and the scanning interface comprises a section selection area for displaying sections and a flow control area for the user to control the workflow corresponding to the scanning protocol;
Receiving an instruction for selecting a section, and determining the section currently scanned; or determining the currently scanned section according to the section scanning sequence preset in the activated scanning protocol;
before or after determining the currently scanned section, transmitting a first ultrasonic wave to the target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
and controlling the workflow corresponding to the scanning protocol based on the operation of the user in the flow control area.
In the method provided in an embodiment, after the currently scanned section has been scanned, the next section to be scanned is determined according to the section scanning sequence preset in the activated scanning protocol.
In the method provided in an embodiment, further comprising:
obtaining a measurement result based on the measurement of the first ultrasound image;
obtaining comment content based on a determination of whether an ultrasound sign associated with the section is present in the first ultrasound image.
In the method provided in an embodiment, further comprising: associating the first ultrasound image with the currently scanned section, and marking, in the section selection area, the section with which the first ultrasound image is associated.
In the method provided in an embodiment, among the sections displayed in the section selection area, sections associated with a first ultrasound image and sections not associated with one are displayed differently.
In the method provided in an embodiment, the controlling of the workflow corresponding to the scanning protocol based on the operation of the user in the flow control area includes at least one of the following (see the illustrative sketch after this list):
receiving an instruction for repeating a section, and copying the section in the section selection area in response to the instruction; or,
receiving an instruction for inserting a single section, and inserting the single section in the section selection area in response to the instruction; or,
receiving an instruction for inserting a set of sections, and inserting the set of sections in the section selection area in response to the instruction; or,
receiving an instruction for adding a section, and adding the section in the section selection area in response to the instruction; or,
receiving an instruction for replacing a section, and replacing the ultrasonic image associated with the section in response to the instruction; or,
receiving an instruction for deleting a section, and deleting the section in the section selection area in response to the instruction; or,
receiving an instruction for suspending the currently activated scanning protocol, and suspending the workflow corresponding to the currently activated scanning protocol in response to the instruction; or,
receiving an instruction for stopping the currently activated scanning protocol, and stopping the workflow corresponding to the currently activated scanning protocol in response to the instruction.
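A hypothetical dispatcher for the flow-control instructions listed above (the instruction names and the `workflow` object are assumptions for illustration) could be sketched as:

```python
def handle_flow_instruction(instruction, workflow, section=None):
    """Dispatch one user flow-control instruction against a workflow.

    `workflow` is assumed to expose a mutable `sections` list, a `current_index`,
    and `paused`/`stopped` flags; the instruction names are illustrative only.
    """
    if instruction == "repeat_section":
        workflow.sections.append(section)                               # copy the section in the selection area
    elif instruction == "insert_section":
        workflow.sections.insert(workflow.current_index + 1, section)   # insert a single section
    elif instruction == "delete_section":
        workflow.sections.remove(section)                               # delete the section from the selection area
    elif instruction == "suspend_protocol":
        workflow.paused = True                                          # suspend the currently activated protocol
    elif instruction == "stop_protocol":
        workflow.stopped = True                                         # stop the currently activated protocol
    else:
        raise ValueError(f"unknown flow-control instruction: {instruction}")
```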
An embodiment provides an ultrasound imaging apparatus comprising:
an ultrasonic probe for transmitting ultrasonic waves to a region of interest in biological tissue and receiving echoes of the ultrasonic waves;
a transmission/reception control circuit for controlling the ultrasonic probe to transmit ultrasonic waves to the region of interest and to receive echoes of the ultrasonic waves;
input means for receiving input from a user;
a display;
a memory for storing a program;
and a processor for executing the program stored in the memory to implement the method as described above.
A computer readable storage medium having stored thereon a program executable by a processor to implement a method as described above.
According to the ultrasonic imaging apparatus and the ultrasonic inspection method thereof of the above embodiments, a scanning interface including a section selection area, a comment selection area and a measurement selection area is displayed; an instruction for selecting a section is received, a first ultrasonic wave is transmitted to the target object, echoes of the first ultrasonic wave are received, a first ultrasound image is generated based on the echoes, and the first ultrasound image is displayed in an image display area; the ultrasound signs associated with the selected section in the currently activated scanning protocol are displayed in the comment selection area, together with comment items of those signs, for selection by the user, the comment items including at least an option that the ultrasound sign is present and an option that it is absent; comment content is obtained according to the comment items selected by the user; the measurement items associated with the selected section in the currently activated scanning protocol are displayed in the measurement selection area for selection by the user; and a measurement result is obtained according to the measurement item selected by the user and the measurement operation. Thus, after the user selects a section and obtains an ultrasound image by ultrasonic scanning, the scanning interface prompts the doctor with the ultrasound signs the current ultrasound image may show and provides the corresponding options, the comment content of the ultrasound image is obtained from the doctor's selections, further measurement can then be performed, and working efficiency is high.
Drawings
FIG. 1 is a block diagram of an embodiment of an ultrasound imaging apparatus provided by the present invention;
FIG. 2 is a flow chart of one embodiment of an ultrasound inspection method provided by the present invention;
FIG. 3 is a flow chart of another embodiment of an ultrasound inspection method provided by the present invention;
FIG. 4 is a schematic diagram of an embodiment of a display interface in an ultrasound imaging apparatus according to the present invention;
FIG. 5 is a flow chart of an embodiment of associating a saved ultrasound image with a section in the ultrasound imaging apparatus provided by the present invention;
FIG. 6 is a flow chart of a workflow in the ultrasound inspection method provided by the present invention;
FIG. 7 is a schematic diagram of one embodiment of a first overview interface in an ultrasound imaging apparatus provided by the present invention;
FIG. 8 is a schematic diagram of one embodiment of a second overview interface in an ultrasound imaging apparatus provided by the present invention;
FIG. 9 is a flow chart of yet another embodiment of an ultrasound inspection method provided by the present invention;
FIG. 10 is a schematic diagram of an embodiment of a display interface corresponding to a custom scan in an ultrasound imaging apparatus according to the present invention;
FIG. 11 is a schematic diagram of another embodiment of a display interface corresponding to a custom scan in an ultrasound imaging apparatus provided by the present invention;
FIG. 12 is a schematic diagram of an embodiment of a display interface for sequential scanning in an ultrasound imaging apparatus according to the present invention;
FIG. 13 is a schematic diagram of an embodiment of a section selection area in an ultrasound imaging apparatus according to the present invention.
Detailed Description
The invention will be described in further detail below with reference to the drawings by means of specific embodiments, in which like elements in different embodiments are given like reference numerals. In the following embodiments, numerous specific details are set forth in order to provide a better understanding of the present application. However, one skilled in the art will readily recognize that some of these features may be omitted, or replaced by other elements, materials or methods, in different situations. In some instances, certain operations related to the present application are not shown or described in the specification in order to avoid obscuring its core; a detailed description of these operations is not necessary, since a person skilled in the art can fully understand them based on the description herein and general knowledge in the art.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
Ordinal terms such as "first" and "second" are used herein merely to distinguish the described objects and do not carry any sequential or technical meaning. The terms "connected" and "coupled", as used herein, encompass both direct and indirect connection (coupling), unless otherwise indicated.
As shown in fig. 1, the ultrasonic imaging apparatus provided by the present invention includes an ultrasonic probe 10, a transmission/reception control circuit, a processor 20, a man-machine interaction device 70, and a memory 80.
The ultrasound probe 10 includes a transducer (not shown in the figure) composed of a plurality of array elements arranged in an array. The array elements transmit ultrasonic waves according to excitation electric signals, or convert received ultrasonic waves into electric signals. Each array element can thus perform the mutual conversion between electric pulse signals and ultrasonic waves, so as to transmit ultrasonic waves into the biological tissue of the target object and to receive ultrasonic echoes reflected back by the tissue.
The transmission/reception control circuit is used to control the ultrasonic probe 10 to transmit ultrasonic waves to a region of interest in biological tissue and to receive echoes of the ultrasonic waves. Specifically, the transmission/reception control circuit includes a transmission circuit 30 and a reception circuit 40.
The transmitting circuit 30 is used for exciting the ultrasonic probe 10 to transmit ultrasonic waves to a target object according to the control of the processor 20.
The receiving circuit 40 is configured to receive an ultrasonic echo returned from a target object through the ultrasonic probe 10 to obtain an ultrasonic echo signal, and may also process the ultrasonic echo signal. The receive circuitry 40 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like.
The human-machine interaction device 70 is used for human-machine interaction, such as outputting visual information and receiving user input. The human-machine interaction device 70 comprises an input device for receiving user input, which may be a keyboard, operation buttons, a mouse, a trackball, a touch pad and the like, or a touch screen integrated with the display. The human-machine interaction device 70 further comprises a display for outputting visual information.
The memory 80 is used to store various types of data.
The ultrasound imaging device may also include a beam synthesis module 50 and an IQ demodulation module 60.
The beam forming module 50 is in signal connection with the receiving circuit 40 and performs beam forming processing, such as delaying and weighted summation, on the echo signals. Because the distances from a given receiving point in the tissue under examination to the different receiving array elements differ, the channel data of the same receiving point output by different receiving array elements have delay differences; delay processing is therefore applied to align the phases, and the different channel data of the same receiving point are then weighted and summed to obtain beamformed ultrasonic image data. The ultrasonic image data output by the beam forming module 50 is also referred to as radio-frequency data (RF data). The beam forming module 50 outputs the radio-frequency data to the IQ demodulation module 60. In some embodiments, the beam forming module 50 may also output the RF data to the memory 80 for buffering or saving, or output it directly to the processor 20 for image processing.
The beam forming module 50 may perform the above functions in hardware, firmware, or software. The beam forming module 50 may be integrated in the processor 20 or may be separately provided, which is not limited by the present invention.
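As a simplified, single-receive-point sketch of the delay-and-weighted-sum operation described above (not the device's actual implementation; non-negative integer sample delays are assumed):

```python
import numpy as np

def delay_and_sum(channel_data, delays_samples, weights):
    """Align per-channel echoes of one receive point and sum them with apodization weights.

    channel_data:   (n_channels, n_samples) array of echo samples
    delays_samples: per-channel delay in samples (non-negative integers) that aligns the phases
    weights:        per-channel weighting coefficients
    """
    n_channels, n_samples = channel_data.shape
    out = np.zeros(n_samples)
    for ch in range(n_channels):
        d = int(delays_samples[ch])
        aligned = np.zeros(n_samples)
        aligned[:n_samples - d] = channel_data[ch, d:]   # advance this channel by d samples
        out += weights[ch] * aligned                      # weighted summation across channels
    return out                                            # beamformed line (RF data)
```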
The IQ demodulation module 60 removes the signal carrier by IQ demodulation, extracts the tissue structure information contained in the signal, and performs filtering to remove noise, and the signal obtained at this time is referred to as a baseband signal (IQ data pair). The IQ demodulation module 60 outputs IQ data pairs to the processor 20 for image processing. In some embodiments, the IQ demodulation module 60 also outputs IQ data pairs to the memory 80 for buffering or saving so that the processor 20 reads the data from the memory 80 for subsequent image processing.
The IQ demodulation module 60 may also perform the above functions in hardware, firmware, or software. Similarly, the IQ demodulation module 60 may be integrated in the processor 20 or may be separately provided, which is not limited by the present invention.
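A rough sketch of the carrier removal and low-pass filtering performed by such a module, under the usual narrowband assumptions (the parameters are illustrative only):

```python
import numpy as np
from scipy.signal import firwin, lfilter

def iq_demodulate(rf, fs, f0, num_taps=64):
    """Mix RF data down to baseband and low-pass filter it.

    rf: 1-D array of radio-frequency samples, fs: sampling rate, f0: carrier (center) frequency.
    Returns the complex baseband (IQ) signal.
    """
    t = np.arange(rf.size) / fs
    mixed = rf * np.exp(-2j * np.pi * f0 * t)    # remove the signal carrier
    lp = firwin(num_taps, cutoff=f0, fs=fs)      # low-pass filter to suppress the 2*f0 component
    return lfilter(lp, 1.0, mixed)               # baseband signal (IQ data)
```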
The processor 20 may be a central processing unit (CPU), one or more microprocessors, a graphics processing unit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. According to input or predetermined instructions, it may control peripheral electronic components or read and/or save data in the memory 80, and it may also process the input data by executing programs stored in the memory 80, for example performing one or more processing operations on the acquired ultrasound data according to one or more operation modes, including but not limited to adjusting or defining the form of the ultrasonic waves emitted by the ultrasonic probe 10, generating various image frames for display on the display of the human-machine interaction device 70, adjusting or defining the content and form displayed on the display, or adjusting one or more image display settings (e.g. ultrasound images, interface components, regions of interest) shown on the display.
The acquired ultrasound data may be processed by the processor 20 in real time during scanning as the echo signals are received, or may be temporarily stored on the memory 80 and processed in near real time in an on-line or off-line operation.
In this embodiment, the processor 20 controls the operation of the transmitting circuit 30 and the receiving circuit 40, for example, controls the transmitting circuit 30 and the receiving circuit 40 to operate alternately or simultaneously. The processor 20 may also determine an appropriate operation mode according to a user's selection or a program setting, form a transmission sequence corresponding to the current operation mode, and send the transmission sequence to the transmission circuit 30, so that the transmission circuit 30 controls the ultrasonic probe 10 to transmit ultrasonic waves using the appropriate transmission sequence.
The processor 20 is also operative to process the ultrasound data to generate a gray scale image of the signal intensity variations over the scan range reflecting the anatomy inside the tissue, referred to as the B image. The processor 20 may output the B-image to a display of the human interaction device 70 for display.
The ultrasonic imaging device provided by the invention performs the ultrasonic inspection method shown in FIG. 2, which comprises the following steps:
step 1, the user issues an instruction for activating a scanning protocol, and the processor 20 receives the instruction for activating a scanning protocol through the input device. For example, the input device includes a control panel, on which one or more shortcuts are disposed, where one shortcut corresponds to one scanning protocol, and when a shortcut is pressed, an instruction for activating the scanning protocol corresponding to the shortcut is sent, and the shortcut may also be a virtual key displayed on the display. For another example, processor 20 displays one or more scanning protocol identifications on a display interface of a display for selection by a user; typically, a plurality of scanning protocol identifications are displayed, and the scanning protocol is selected when the scanning protocol identification is selected. The scanning protocol is a workflow for performing an examination based on a disease or a tissue organ, and when a user operates the input device to select one scanning protocol, an instruction for activating the selected scanning protocol is issued.
After the processor 20 receives the instruction to activate the selected scanning protocol, the processor 20 displays, in response to the instruction, a plurality of sections associated with the selected scanning protocol on a display interface of the display. In some embodiments, the scanning protocol has no examination site for the user to select (for example, it is associated with only one examination site, or with none, as in a clinical examination of a single site such as the heart, lung, abdominal aorta or cranium), so the section identifiers associated with it are displayed directly. In other embodiments, the scanning protocol is associated with a plurality of examination sites, each of which is in turn associated with a plurality of sections; that is, the scanning protocol comprises a set of examination sites to be scanned on the target object and, for each examination site, a set of sections to be scanned. In that case the examination site must be selected first, and this latter case is taken as the example in this embodiment. In this embodiment, the scanning protocol is a workflow for a bedside ultrasound examination scenario (which typically involves a plurality of examination sites); for example, the displayed scanning protocols include at least two of a shock scanning protocol (for a shock scenario), a trauma scanning protocol (for a trauma scenario), a cardiopulmonary resuscitation scanning protocol (for a cardiopulmonary resuscitation scenario), a respiratory failure scanning protocol (for a respiratory failure scenario), and so on. This embodiment is described taking the shock, trauma and respiratory failure scanning protocols as the displayed scanning protocols. The shock scanning protocol includes the examination sites "heart", "lung" and "abdomen", and the trauma scanning protocol includes the examination sites "left lung", "right lung", "subxiphoid", "liver and kidney", "spleen and kidney" and "pelvis".
As shown in fig. 3, in this embodiment, step 1 includes:
After the examination site is selected, the processor 20 may also activate an ultrasound probe and/or imaging mode associated with the selected examination site. For example, if the selected examination site is the heart, the phased array probe and B mode are activated. As another example, when the user switches from a heart position to an abdominal position, the processor 20 automatically switches the activated probe from a phased array probe to a linear array probe, and/or automatically adjusts the image depth, frequency, so that the image performance is more appropriate for the blood vessel, and/or automatically switches the imaging mode from B mode to color mode, as the abdomen involves a scan of the blood vessel. Therefore, a user does not need to manually switch the probe and the imaging mode, the operation is simplified, and the working efficiency is improved.
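Conceptually, this automatic switching is a lookup from the selected examination site to a probe and imaging-mode preset; the mapping entries below are illustrative assumptions only:

```python
# Hypothetical presets keyed by examination site
SITE_PRESETS = {
    "Heart":   {"probe": "phased array", "mode": "B"},
    "Abdomen": {"probe": "linear array", "mode": "Color"},  # vessel scanning benefits from color flow
    "Lung":    {"probe": "convex array", "mode": "B"},
}

def activate_site_preset(site, system):
    """Switch probe and imaging mode when an examination site is selected (system is a stand-in)."""
    preset = SITE_PRESETS.get(site)
    if preset is not None:
        system.switch_probe(preset["probe"])
        system.set_imaging_mode(preset["mode"])
```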
After the examination site is selected, the processor 20 also displays first teaching instruction information associated with the selected examination site on a display interface of the display, to instruct the user how to scan the selected examination site. The first teaching instruction information may be video, pictures, text and so on, such as an anatomical diagram, a standard-section ultrasound diagram, or an instructional video. For example, the human-machine interaction device may comprise a main display and a sub-display, where the sub-display may be a touch display screen; the first teaching instruction information can then be displayed on the sub-display without occupying the display space for the ultrasound image on the main display, which makes it convenient for the user to view.
In the first mode, the user operates the input device to select one of the section identifiers, i.e. issues an instruction for selecting a section. The processor 20 receives the instruction for selecting a section, determines the currently scanned section and displays the name of the current section on the scanning interface, such as "Apical 4 Chamber" in FIG. 4. In some embodiments, the processor 20 may also determine the currently scanned section according to a section scanning sequence preset in the activated scanning protocol, without user selection. In response to the instruction for selecting a section identifier (i.e. after determining the currently scanned section), the processor 20 transmits a first ultrasonic wave to the target object through the probe, receives an echo of the first ultrasonic wave, and obtains a first ultrasonic echo signal based on the echo; a first ultrasound image is generated based on the first ultrasonic echo signal and displayed in an image display area C used for displaying ultrasound images. The image display area C and the scanning interface may be displayed on the same screen, for example both on the main display. Because the scanning is performed in real time, a real-time ultrasound image of the target object can be obtained. The image display area C, the site selection area A and the section selection area B are located on the same display interface, in different regions of that interface.
In the second mode, the processor 20 transmits a first ultrasonic wave to the target object through the probe, receives an echo of the first ultrasonic wave, and obtains a first ultrasonic echo signal based on the echo; a first ultrasound image is generated based on the first ultrasonic echo signal and displayed in the image display area C. The processor 20 then receives an instruction for selecting a section, determines the currently scanned section and displays the name of the current section on the scanning interface. In some embodiments, the processor 20 may also determine the currently scanned section according to the section scanning sequence preset in the activated scanning protocol, without user selection.
After obtaining the first ultrasound image, a standard section image is selected from the first ultrasound image; the processor 20 saves the standard section image, and the saved standard section image is associated with the selected section (i.e. the currently scanned section) to serve as the ultrasound image of the current section. Saving the ultrasound image of the section may be done automatically or manually, as described below.
The ultrasound image of the section is saved "automatically" as follows: the processor 20 performs quality assessment on the first ultrasound image of the target object in real time. Specifically, the processor 20 may sequentially input single frames of the real-time first ultrasound image of the target object into a pre-trained deep learning model, which outputs a quality evaluation result, for example a result indicating whether a preset quality standard is reached. The quality evaluation result output by the deep learning model may also be an evaluation score for a single frame of ultrasound image, the score reflecting the quality of that frame; the processor 20 then determines whether the evaluation score exceeds a preset score, and if so, determines that the frame reaches the preset quality standard, otherwise determines that it does not. The processor 20 takes at least one frame of ultrasound image reaching the preset quality standard as the standard section image, saves it, and associates the saved ultrasound image with the section corresponding to the selected section identifier, to serve as the ultrasound image of that section.
The ultrasound image of the section is saved "manually" as follows: the processor 20 performs quality assessment on the first ultrasound image of the target object in real time, and when the ultrasound image reaches the preset quality standard, corresponding prompt information is displayed (for example via the display) to prompt the user to save the image. The quality evaluation method is the same as in the "automatic" saving scheme above and is not repeated here. After seeing the prompt, the user confirms or selects at least one frame of ultrasound image reaching the preset quality standard as the standard section image to be saved. That is, the processor 20 receives an instruction for saving an ultrasound image through the input device, and saves the ultrasound image reaching the preset quality standard in response to that instruction. Of course, in some embodiments, the user may directly select one or more frames from the first ultrasound image as the standard section image of the current section and save them without any quality evaluation.
Whether stored "automatically" or "manually," the processor 20 may display the quality assessment of the ultrasound image via a display for reference by the physician.
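A condensed sketch of the "automatic" saving path described above, assuming only that a scoring model returns a per-frame quality score (the threshold value is hypothetical):

```python
PRESET_SCORE = 0.8   # hypothetical preset quality threshold

def auto_save_standard_frames(frames, quality_model, save):
    """Score each real-time frame and save those that reach the preset quality standard."""
    saved = []
    for frame in frames:
        score = quality_model(frame)       # e.g. a pre-trained deep learning model returning a score
        if score >= PRESET_SCORE:          # frame reaches the preset quality standard
            save(frame)                    # keep it as a candidate standard section image
            saved.append((frame, score))
    return saved
```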
After an ultrasound image (standard section image) has been saved for the current section, the processor 20 displays a thumbnail of that ultrasound image at the position corresponding to the section identifier. As shown in FIG. 4, when the user selects the section identifier of the "A4C" section, after the ultrasound image of the "A4C" section is saved, a thumbnail of the ultrasound image is displayed at the position of the "A4C" section identifier. This clearly indicates to the user where within the examination site the saved ultrasound image belongs, which sections already have saved ultrasound images, and which sections remain to be scanned.
Existing ultrasonic imaging equipment can provide automatic workflows for tissues and organs, but these workflows usually serve experienced sonographers in the ultrasound department and provide very comprehensive examination guidance: one organ may have more than ten, or even tens, of sections that need to be examined. For example, when a doctor chooses to examine the heart, existing equipment presents all sections of the heart for the doctor to select, and the doctor selects the sections in turn to scan, save, measure, annotate and so on, thereby completing a comprehensive examination of the patient. In the bedside ultrasound scenario, however, the doctor is not familiar with operating the equipment, the time requirements of emergency treatment are very strict, and not all sections need to be scanned. The ultrasonic imaging equipment provided by the invention therefore provides, for the bedside ultrasound scenario, a series of processes that assist the doctor in performing the ultrasound examination: the doctor only needs to select a suitable scanning protocol according to the patient's condition, and which examination sites to scan and which sections of each site to scan are then automatically presented to the doctor in graphical form, through a friendly and efficient graphical user interface. Considering the doctor's limited experience, the equipment also automatically evaluates image quality and saves the images. Thus, after the doctor selects a scanning protocol, every subsequent operation has a corresponding prompt or guide, or is completed automatically by the equipment; the doctor only needs to master how to use the probe. The requirement on the doctor's experience is therefore very low, which makes the equipment well suited to bedside use by clinicians and greatly improves the efficiency of bedside ultrasound examination.
After the section is selected, the processor 20 may also activate the ultrasound probe and/or imaging mode associated with the selected section. For example, if the selected section is the subxiphoid four-chamber section, the phased array probe and B mode are activated. As another example, when the user switches from the subxiphoid four-chamber section to an abdominal aorta section, the processor 20 automatically switches the activated probe from a phased array probe to a linear array probe, and/or automatically adjusts the image depth and frequency so that the image performance is more appropriate for the vessel, and/or automatically switches the imaging mode from B mode to color mode.
After the section is selected, the processor 20 may also display second teaching instruction information associated with the selected section on a display interface of the display, to instruct the user how to scan the selected section. Specifically, an instruction identifier F may be displayed on the display interface, and after the user clicks the instruction identifier F, the processor 20 displays the second teaching instruction information on the display interface, which is very convenient. The second teaching instruction information may be video, pictures, text and so on, such as an anatomical diagram, a standard-section ultrasound diagram, or an instructional video. The second teaching instruction information may also be displayed on the sub-display.
It is known which ultrasound signs may appear at a given examination site of the patient, or at a specific section of that site, so these possible ultrasound signs are associated with the section in advance. The image display area C, the site selection area A, the section selection area B and the comment selection area D are located on the same display interface, in different regions of that interface. For example, as shown in fig. 4, the current examination site is the heart, and the ultrasound signs displayed in comment selection area D are pericardial effusion (Pericardial Effusion), segmental ventricular motion abnormality (Segmental Ventricular Motion Abnormality), and chamber size and proportion abnormality (Chamber Size & Proportion Abnormality).
The user can then start to perform the corresponding measurements according to the measurement items. The processor 20 obtains a measurement result according to the measurement item selected by the user and the measurement operation performed. The selected section is associated with the standard section image, the comment content and the measurement result; that is, through the above steps, the standard section image, comment content and measurement result of the current section are obtained, and the selected section, its standard section image, its comment content and its measurement result are displayed on the display interface. Therefore, when a clinician performs a bedside ultrasound examination on a patient, an ultrasound image can be obtained by following the various prompts essentially just by selecting a scanning protocol; the clinician is also prompted to judge whether the preset ultrasound signs are present in the current ultrasound image and only needs to select the applicable option; after an option indicating that a sign is present is selected, measurement can be performed on the pre-associated measurement items to obtain measurement results. The degree of automation of the examination flow for the whole target scenario is thus very high, the doctor does not need to memorize the operating procedure, and the efficiency of bedside ultrasound examination is improved.
In some embodiments, the decision maker may be further upgraded so that the measurement result is also input to it; that is, the processor 20 may further obtain a prompt for the next operation and/or a clinical judgment prompt according to the section for which comment content has been obtained, the comment content, and the measurement result. Adding the measurement result enriches the decision-making capability of the decision maker.
The order of saving the standard section image in step 2 relative to the commenting and measuring in steps 3 and 4 is not limited; in other words, the standard section image may be saved first and the saved image then commented on and/or measured, or the first ultrasound image may be commented on and/or measured first and the standard section image then selected and saved.
In the flow of the embodiment of fig. 4, an examination site is selected, a section is selected, the image is scanned, and the ultrasound image is saved, so that the saved ultrasound image is the ultrasound image of the selected section. The present invention also provides another workflow in which the ultrasound image is saved first and the user then associates the saved ultrasound image with the desired section, as shown in fig. 5; after the first ultrasound image of the target object is obtained in step 2, this workflow further includes the following steps:
A saved ultrasound image is not yet associated with a section, so a subsequent association operation is required; the same operation can also be used if the user needs to modify the section associated with a saved ultrasound image.
First association operation: the user operates the input device to drag the selected first ultrasound image onto the selected section identifier. Accordingly, the processor 20 receives an instruction for dragging the selected first ultrasound image onto the selected section identifier and, in response to the instruction, associates the selected first ultrasound image with the section corresponding to the selected section identifier. Specifically, the display interface of the display also displays a saved-image area G, which is used to display thumbnails of the saved first ultrasound images. There may be more than one saved first ultrasound image, more than one unassociated first ultrasound image, and of course more than one ultrasound image already associated with a section (standard section image); all of these are displayed in the saved-image area G. The user operates the input device to drag one thumbnail in the saved-image area G onto one section identifier of the section selection area B, thereby completing the association between the ultrasound image and the section corresponding to that section identifier. That is, the processor 20 receives an instruction for dragging the selected thumbnail in the saved-image area G onto one of the section identifiers of the section selection area B and, in response to the instruction, associates the selected thumbnail with the section identifier onto which it was moved, i.e., associates the first ultrasound image corresponding to the thumbnail, as the standard section image, with the section to which that section identifier belongs.
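The drag-to-associate behavior can be sketched as a small controller that records which saved image becomes the standard image of which section. The class and method names below are hypothetical and not the device's actual API:

# Sketch of the drag-to-associate behavior: dropping a saved thumbnail onto a
# section identifier marks that image as the section's standard image.
class AssociationController:
    def __init__(self):
        self.standard_image_by_section = {}   # section identifier -> image identifier

    def on_thumbnail_dropped(self, image_id, section_id):
        """Called when a thumbnail from saved-image area G is dropped on a section identifier."""
        self.standard_image_by_section[section_id] = image_id
        return section_id, image_id

ctrl = AssociationController()
ctrl.on_thumbnail_dropped("img_0007", "A4C")
print(ctrl.standard_image_by_section)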
Second association operation: the user operates the input device to select the first ultrasound image and the section identifier together. Accordingly, the processor 20 receives an instruction for selecting the first ultrasound image and the section identifier together and, in response to the instruction, associates the selected first ultrasound image with the section corresponding to the selected section identifier. For example, the user may be allowed to select several items with the input device while holding down a preset key, for example selecting the first ultrasound image and the section identifier in succession, so that the first ultrasound image and the section corresponding to the section identifier selected while the preset key is pressed are associated.
Third association operation: the user operates the input device to select a first ultrasound image; the processor 20 receives an instruction for selecting the first ultrasound image and, in response to the instruction, displays, at a position on or adjacent to the selected first ultrasound image, a first identifier for each section; the first identifier identifies a section name or a section position. The processor 20 then receives an instruction for selecting a first identifier and, in response to the instruction, associates the selected first ultrasound image with the section corresponding to the selected first identifier. As shown in fig. 4, the user may select the ultrasound image by selecting its thumbnail in the saved-image area G; when the user selects ultrasound image a, the first identifiers of all sections under the current examination site are displayed for the user to select, and when the user selects "A4C", ultrasound image a is associated with the section corresponding to "A4C".
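This third operation can also be sketched in a few lines: selecting an image yields the list of first identifiers for the current site, and picking one identifier records the association. The protocol structure and names below are assumptions made for illustration only:

# Sketch of the third association operation, under a hypothetical protocol structure.
def identifiers_for_site(site, protocol):
    """First identifiers (section names) to display next to the selected image."""
    return [sec["name"] for sec in protocol["sections"] if sec["site"] == site]

def on_identifier_selected(image_id, identifier, associations):
    associations[identifier] = image_id   # e.g. "A4C" -> selected ultrasound image
    return associations

protocol = {"sections": [{"name": "A4C", "site": "heart"}, {"name": "PLAX", "site": "heart"}]}
print(identifiers_for_site("heart", protocol))
print(on_identifier_selected("img_0007", "A4C", {}))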
Regardless of the manner in which the ultrasound image is associated with the section, the section has an ultrasound image after the association, and the processor 20 displays a thumbnail of the ultrasound image of the section at the position corresponding to the section identifier. This approach is not only convenient for freely associating ultrasound images with sections, but can also be repeated to modify the associated section.
In this embodiment, the scanning protocol may be editable. The processor 20 is further configured to receive an instruction for launching an editing interface and, in response to the instruction, display the editing interface on the display for the user to edit the scanning protocol. The processor 20 edits the scanning protocol according to the user's operations on the editing interface. Editing includes at least one of the following three types: adding, deleting or modifying a section included in the scanning protocol; adding, deleting or modifying an ultrasound sign associated with the scanning protocol; and adding, deleting or modifying a measurement item associated with the scanning protocol.
The editing interface may also provide functionality for creating a new scanning protocol. Specifically, the processor 20 receives an instruction to launch the editing interface and, in response to the instruction, displays the editing interface for the user to add a new scanning protocol. The editing interface provides at least one of an existing scanning protocol, the sections included in the existing scanning protocol, the ultrasound signs associated with the sections of the existing scanning protocol, and the measurement items associated with the sections of the existing scanning protocol for selection by the user. The processor 20 correlates at least one of the existing scanning protocol selected by the user on the editing interface, the sections included in the existing scanning protocol, the ultrasound signs associated with its sections, and the measurement items associated with its sections to form a new scanning protocol. In this way the user does not need to build a new scanning protocol from scratch: a customized new scanning protocol can be obtained by combining existing scanning protocols and then using the add, delete or modify functions, which is very convenient.
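A rough sketch of composing a new protocol from existing ones, then adding or removing sections, is given below. The protocol names and section labels are hypothetical, and the sketch is not the device's actual editing implementation:

# Illustrative composition of a new scanning protocol from pieces of existing ones.
def new_protocol(name, base_protocols, extra_sections=(), removed_sections=()):
    sections = []
    for proto in base_protocols:
        for sec in proto["sections"]:
            if sec not in sections:
                sections.append(sec)              # keep order, avoid duplicates
    sections.extend(s for s in extra_sections if s not in sections)
    sections = [s for s in sections if s not in removed_sections]
    return {"name": name, "sections": sections}

cardiac = {"name": "cardiac", "sections": ["A4C", "PLAX"]}
lung    = {"name": "lung", "sections": ["R_anterior", "L_anterior"]}
custom  = new_protocol("shock_screen", [cardiac, lung], extra_sections=["IVC"])
print(custom)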
In view of the foregoing, the scanning protocol corresponds to a workflow, that is, the ultrasound inspection method provided by the invention includes one or more workflows. The processor 20 receives, via the input device, an instruction for launching the selected workflow, in response to which the selected workflow is launched. Wherein, as shown in fig. 6, the workflow includes:
Step 2': selecting a section from the plurality of sections included in the workflow. For example, the workflow includes a plurality of examination sites, each examination site including one or more sections; an examination site is selected from the plurality of examination sites included in the workflow, and a section is then selected from the sections included in the selected examination site. Specifically, the processor 20 displays the plurality of examination sites included in the workflow on the display interface of the display for selection by the user, and after the user selects an examination site, displays the sections included in the selected examination site on the display interface for selection by the user. The specific process is as described in steps 11, 12 and 13 above and is not repeated here.
After an examination site is selected, the processor 20 may activate the ultrasound probe and/or imaging mode associated with the selected examination site; after a section is selected, the ultrasound probe and/or imaging mode associated with the selected section may also be activated. The processor 20 may also display teaching instruction information associated with the selected section to direct the user to scan the selected section. The specific process is described in the above embodiments and is not repeated here.
Before or after the section is selected, the processor 20 transmits a first ultrasonic wave to the target object through the probe, receives an echo of the first ultrasonic wave, and obtains a first ultrasonic echo signal based on the echo of the first ultrasonic wave; a first ultrasound image is generated based on the first ultrasound echo signal. The specific process is shown in step 2 of the above embodiment, and will not be described herein.
Step 3': obtaining comment content by selection based on the first ultrasound image. For example, the processor 20 displays, in comment selection area D, the ultrasound signs associated with the selected section in the workflow, and also displays the comment items of those ultrasound signs for selection by the user. The processor 20 obtains the comment content based on the comment item selected by the user.
Step 3' further comprises: and obtaining a prompt of the next operation and/or a clinical judgment prompt according to the section and the comment content of which the comment content is obtained currently.
The specific process of step 3' is the same as that of step 3 in the above embodiment, and will not be described herein.
Step 4': measuring the first ultrasound image to obtain a measurement result. For example, the processor 20 displays, in the measurement selection area, the measurement items associated with the selected section in the workflow for selection by the user, and obtains a measurement result according to the measurement item selected by the user and the measurement operation. Then, the selected section is associated with the first ultrasound image (e.g., a standard section image selected from the first ultrasound image), the comment content, and the measurement result; and the first ultrasound image, the comment content, and the measurement result are displayed.
Likewise, the workflow is editable. The processor 20 receives instructions for initiating an editing interface, and in response to the instructions, displays the editing interface via a display for editing the workflow by the user; and editing the workflow according to the operation of the user on the editing interface. Wherein editing includes at least one of the following three types: adding, deleting or modifying the section included in the workflow, adding, deleting or modifying ultrasound symptoms associated with the section, adding, deleting or modifying measurement items associated with the section.
A new workflow may also be added. The processor 20 receives an instruction to launch the editing interface and, in response to the instruction, displays the editing interface for the user to add a new workflow. The editing interface provides at least one of an existing workflow, the sections included in the existing workflow, the ultrasound signs associated with the sections of the existing workflow, and the measurement items associated with the sections of the existing workflow for selection by the user. The processor 20 correlates at least one of the existing workflow selected by the user on the editing interface, the sections included in the existing workflow, the ultrasound signs associated with its sections, and the measurement items associated with its sections to form a new workflow.
The specific process of step 4' is the same as that of step 4 in the above embodiment, and will not be described herein.
After an examination is performed using the above ultrasound inspection method, the invention also provides a summary view to present the overall examination results, which is convenient for the user to browse.
Specifically, the processor 20 receives an instruction for presenting a summary view interface and, in response to the instruction, displays a first summary view interface on the display, as shown in fig. 7. The first summary view interface includes a first display area H and a second display area I displayed on the same screen as the first display area H. The first display area H displays a human body position map h1 corresponding to the selected first examination scenario.
The processor 20 displays an examination site identifier h2 corresponding to an examination site at the position in the body position map h1 corresponding to that site, to indicate the sites for which ultrasound images were obtained. That is, the examination site identifier h2 is placed at the position on the body position map h1 corresponding to the position of the site on the human body. In this embodiment, the examination site identifier h2 includes a thumbnail of an ultrasound image of a section of the site. In this way, when there is a thumbnail of an ultrasound image, at least one section of the site has been examined; when the thumbnail is blank (the frame of the examination site identifier h2 is displayed but contains no thumbnail), the site has not yet been examined. As shown in fig. 7, the two blank frames on the body position map h1 indicate that those two sites have not yet been examined.
The processor 20 obtains the examination result of an examination site based on the comment content of each section of that site. The examination result reflects whether the examination site is normal, abnormal, or undetermined. For example, if, among the comments of all sections of the examination site, at least one section has an ultrasound sign marked as present, the processor 20 determines that the examination result of the site is abnormal; that is, an abnormal section indicates an abnormal site. If no ultrasound sign is present on any section, the processor 20 determines that the examination result of the site is normal, i.e., all sections are normal and the site is normal. If no section has an ultrasound sign marked as present, but at least one section has an ultrasound sign whose presence is undetermined, the processor 20 determines that the examination result of the site is undetermined.
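The per-site rule just described translates directly into a small function. The sketch below is illustrative (data shapes and option strings are assumed) but follows the stated rule: any sign marked present makes the site abnormal, all signs absent makes it normal, and anything else is undetermined:

# Sketch of the per-site examination result rule described above.
def site_result(section_comments):
    """section_comments: list of dicts, one per section, mapping sign -> selected option."""
    options = [opt for comments in section_comments for opt in comments.values()]
    if any(opt == "Present" for opt in options):
        return "abnormal"
    if options and all(opt == "Absent" for opt in options):
        return "normal"
    return "undetermined"

print(site_result([{"Pericardial Effusion": "Absent"}, {"B-lines": "Indeterminate"}]))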
The processor 20 also displays, on the examination site identifier h2, a first result identifier h3 corresponding to the examination result of that site, to indicate the examination result of the site. The first result identifier h3 reflects the examination result of the site, which is one of normal (Negative in fig. 7), abnormal (Positive in fig. 7), and undetermined (Indet in fig. 7). Therefore, the user only needs to look at the content of the first display area H to know which sites have not been examined in the current examination, which sites have been examined, the examination results of the examined sites, and so on, which is very convenient and quick. The specific form of the first result identifier h3 is not limited; for example, a preset graphic may be used. In this embodiment, corner marks are used and different examination results are represented by different colors: for example, a red corner mark indicates an abnormal result, a white corner mark indicates an undetermined result, and a blue or green corner mark indicates a normal result. The user can thus quickly distinguish abnormal sites, improving work efficiency.
The second display area I is used for displaying the comment content of the sections of each examination site; that is, the processor 20 is further configured to display, in the second display area I, the comment content of the sections of each examination site. In this embodiment, the comment content is displayed in the second display area I grouped by the associated examination site: as shown in fig. 7, the comment content of each section of the heart is displayed in the left column, that of each section of the lung in the middle column, and that of each section of the abdomen in the right column.
The comment content displayed in the second display area I specifically consists of the ultrasound signs associated in advance with the sections of the examination site and the three comment options for each ultrasound sign mentioned above, in which the comment item selected by the user is highlighted. The highlighting may be done in various ways, such as with color, brightness, graphics, etc. (highlighting is shown as dark color blocks in the figure), and the invention is not limited in this respect. In this way, the user can see at a glance whether ultrasound signs are present at the site. The second display area I displays not only the comment content but also the measurement results.
The comment content and the measurement results displayed in the second display area I can be edited. Specifically, the processor 20 receives, through the input device, an operation of the user modifying the comment content displayed in the second display area I, and modifies the comment content based on the operation. For example, if the currently selected comment item of the ultrasound sign pericardial effusion (Pericardial Effusion) is "Present" and the user clicks the "Absent" comment item, the selected comment item of pericardial effusion is changed to "Absent". The user can thus review and modify all comment content in the second display area I in one place without returning to the examination flow, which is very convenient and quick. Likewise, the processor 20 may receive, through the input device, an operation of the user modifying a measurement result displayed in the second display area I, and modify the measurement result based on the operation.
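Editing a comment item from the summary view can be pictured as overwriting the previously selected option for that sign. The following sketch uses hypothetical data structures and option strings purely as an illustration:

# Sketch of editing a comment item from the summary view.
def modify_comment(records, section, sign, new_option):
    """records: dict mapping section -> {sign: selected option}."""
    allowed = {"Present", "Absent", "Indeterminate"}
    if new_option not in allowed:
        raise ValueError(f"unsupported option: {new_option}")
    records.setdefault(section, {})[sign] = new_option
    return records

state = {"subxiphoid_four_chamber": {"Pericardial Effusion": "Present"}}
print(modify_comment(state, "subxiphoid_four_chamber", "Pericardial Effusion", "Absent"))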
The processor 20 receives an instruction for selecting an examination site on the body position map h1 (i.e., the user clicks the examination site identifier h2 on the body position map h1 to issue the instruction) and, in response to the instruction, displays a second summary view interface on the display, as shown in fig. 8. The second summary view interface includes a third display area J and a fourth display area K displayed on the same screen as the third display area J. The third display area J displays a position map b1 of the selected examination site.
The processor 20 displays the section identifier b2 corresponding to a section for which an ultrasound image was obtained at the position in the position map b1 corresponding to that section, to indicate the sections for which ultrasound images were obtained. The section identifier b2 marks a section for which an ultrasound image was obtained; that is, the section identifier b2 is placed at the position on the map b1 corresponding to the position of the section within the site. The specific form of the section identifier b2 may vary, for example a graphic, a picture, etc.; it only needs to distinguish sections for which an ultrasound image was obtained from sections for which none was obtained. In this embodiment, the section identifier b2 corresponding to a section includes a thumbnail of the ultrasound image of that section. In this way, if there is a thumbnail of an ultrasound image at a position of the position map b1, the section at that position has been examined; if the thumbnail is blank (the frame of the section identifier b2 is displayed but contains no thumbnail), the section at that position has not yet been examined. As shown in fig. 8, the middle four sections on the front of the lung and the four sections on the back of the lung on the position map b1 have not yet been examined.
The processor 20 further displays, on the section identifier b2 corresponding to a section, a second result identifier j1 corresponding to the comment content of that section, to indicate the comment content of the section. Indicating the comment content of a section mainly means indicating the selected comment item of the ultrasound sign of that section (whether the ultrasound sign is present, absent, or undetermined), thereby reflecting the examination result of the section. Therefore, the user only needs to look at the content of the third display area J to know which sections of the current site have not been examined, which sections have been examined, the examination results of the examined sections, and so on, which is very convenient and quick. The specific form of the second result identifier j1 is not limited; for example, a preset graphic may be used. In this embodiment, the second result identifier j1, like the first result identifier, uses corner marks, and different examination results (selected comment items) are represented by different colors: for example, a red corner mark indicates an abnormal result (an ultrasound sign is present), a white corner mark indicates an undetermined result (it is undetermined whether an ultrasound sign is present), and a blue or green corner mark indicates a normal result (no ultrasound sign is present). The user can thus quickly distinguish abnormal sections, improving work efficiency.
The fourth display area K is used for displaying the comment content of each section of the selected examination site; that is, the processor 20 is further configured to display, in the fourth display area K, the comment content of each section of the selected examination site. The comment content displayed in the fourth display area K specifically consists of the ultrasound signs associated in advance with the sections of the selected examination site and the three comment items of each ultrasound sign, in which the selected comment item of each ultrasound sign is highlighted.
Likewise, the fourth display area K displays not only the comment content of the cut surface but also the measurement result thereof.
The comment content and the measurement result displayed in the fourth display area K can be edited. Specifically, the processor 20 receives an operation of modifying the comment content displayed in the fourth display area K by the user through the input device, and modifies the comment content based on the operation. Also, the processor 20 may receive an operation of modifying the measurement result displayed in the fourth display area K by the user through the input device, and modify the measurement result based on the operation.
It can be seen that the fourth display area K is displayed in the same form as the second display area I, but the second display area I displays the comment content and measurement results of all sections of all examination sites, whereas the fourth display area K displays only the comment content and measurement results of the sections of the selected examination site; in other respects it may be the same as the second display area I and is not described further here.
The user may also return to the ultrasound examination flow of fig. 2, 3 or 6, etc. to modify comment content, modify measurement results, perform subsequent examinations, and so on. As shown in figs. 7 and 8, a return-to-examination key (Back to exam) and an end-examination key (End exam) are also provided on the first and second summary view interfaces. The return-to-examination key triggers a corresponding instruction to return to the ultrasound examination flow of fig. 2, 3 or 6, etc.; the end-examination key triggers a corresponding instruction to end the ultrasound examination of fig. 2, 3 or 6, etc.
When the user selects a section identifier b2 on the second summary view interface, i.e., selects the section corresponding to that section identifier b2, the fourth display area K displays only the comment content and measurement results of the selected section. The user can thus view the detailed examination information of a single section, which is very convenient.
The invention also provides an ultrasonic inspection method, as shown in fig. 9, comprising the following steps:
Step 1": the processor 20 receives, through the input device, an instruction for activating a scanning protocol and, in response to the instruction, activates the scanning protocol associated with the instruction and displays the scanning interface. In fig. 10, "D8-2U" represents a scanning protocol for mid-to-late pregnancy and "DE10-3WU" represents a scanning protocol for gynecology. The scanning protocol includes a set of multiple sections of the target object that need to be scanned. The scanning interface includes a section selection area B' for displaying sections (as shown in figs. 10-12) and a flow control area L for the user to control the workflow corresponding to the scanning protocol (as shown in figs. 10 and 11). After the scanning protocol is activated, the sections included in the scanning protocol can be scanned. The scanning may be user-driven, for example the processor 20 scans according to the section selected by the user; it may also proceed in sequence, in which case a section scanning order is preset for the scanning protocol, i.e., which section is scanned first and which is scanned next, and the processor 20 scans each section in turn according to the section scanning order preset in the activated scanning protocol. The following steps are described in detail.
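The two ways of working through a protocol (preset order or free selection) can be sketched with a small session object. The protocol name and section labels below are hypothetical and serve only as an illustration:

# Minimal sketch of protocol-driven scanning order vs. user-selected sections.
class ScanSession:
    def __init__(self, protocol):
        self.sections = list(protocol["sections"])  # preset scanning order
        self.index = 0

    def current_section(self):
        return self.sections[self.index] if self.index < len(self.sections) else None

    def next_section(self):
        self.index += 1                              # follow the preset order
        return self.current_section()

    def select_section(self, name):
        self.index = self.sections.index(name)       # user jumps to a chosen section
        return self.current_section()

session = ScanSession({"name": "OB mid-late", "sections": ["BPD", "AC", "FL"]})
print(session.current_section(), session.next_section(), session.select_section("FL"))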
Before or after the currently scanned section is determined, the processor 20 transmits a first ultrasonic wave to the target object through the ultrasound probe, receives echoes of the first ultrasonic wave, and obtains a first ultrasonic echo signal based on the echoes; a first ultrasound image is generated based on the first ultrasonic echo signal and displayed in the image display area. The first ultrasound image obtained before or after the currently scanned section is determined serves as the ultrasound image of the current section. After the first ultrasound image is obtained, a standard section image may be selected from it to serve as the ultrasound image of the current section. The specific process may be the same as in the above embodiments and is not repeated here.
Specifically, as shown in figs. 10 and 11, the flow control area L may be provided with a "repeat" key (a virtual key); a physical key with the same function may of course also be provided. When the "repeat" key or its corresponding physical key is triggered, an instruction for repeating a section is issued. The processor 20 receives the instruction for repeating a section and, in response, copies the section (typically the section selected by the user) in the section selection area B'; if the section already has an associated ultrasound image, measurement results, comment content, etc., these data are copied together.
The flow control area L may also be provided with an "insert" key; a physical key with the same function may likewise be provided. When the "insert" key or its corresponding physical key is triggered, an instruction for inserting a single section or a group of sections is issued. The processor 20 receives the instruction and, in response, inserts a single section or a group of sections in the section selection area B'. The inserted section may be a section already associated with an ultrasound image, measurement results, comment content, etc., or a preset section; inserting a section means inserting the scanning flow corresponding to that section, and the inserted scanning flow may come from the current scanning protocol or from another protocol.
The flow control area L may also be provided with a "new" key; a physical key with the same function may likewise be provided. When the "new" key or its corresponding physical key is triggered, an instruction for adding a section is issued. The processor 20 receives the instruction and, in response, adds a section in the section selection area B'; the background color of the newly added section differs from that of the other sections so that it is displayed distinctly. The added section may be a blank section, and the user may customize its name and other properties.
The flow control area L may also be provided with a "redo" key; a physical key with the same function may likewise be provided. When the "redo" key or its corresponding physical key is triggered, an instruction for replacing a section is issued. The processor 20 receives the instruction and, in response, replaces the ultrasound image associated with the section, for example by rescanning the section and replacing the original ultrasound image with the rescanned one.
The flow control area L may also be provided with a "delete" key; a physical key with the same function may likewise be provided. When the "delete" key or its corresponding physical key is triggered, an instruction for deleting a section is issued. The processor 20 receives the instruction and, in response, deletes the section in the section selection area; the ultrasound image, measurement results, comment content, etc. associated with the section are also deleted.
The flow control area L may also be provided with a "pause" key; a physical key with the same function may likewise be provided. When the "pause" key or its corresponding physical key is triggered, an instruction for pausing the currently activated scanning protocol is issued. The processor 20 receives the instruction and, in response, pauses the workflow corresponding to the currently activated scanning protocol. After pausing, the "pause" key becomes a continue key (not shown), and after the continue key or its corresponding physical key is triggered, the processor 20 continues to execute the currently activated scanning protocol.
The flow control area L may also be provided with a "stop" key; a physical key with the same function may likewise be provided. When the "stop" key or its corresponding physical key is triggered, an instruction for stopping the currently activated scanning protocol is issued. The processor 20 receives the instruction and, in response, stops the workflow corresponding to the currently activated scanning protocol.
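The section-editing commands of the flow control area (repeat, insert, new, redo, delete) can be summarized as a small dispatcher over a list of section entries. The sketch below is illustrative only; pause and stop act on workflow state rather than on the section list and are omitted:

# Illustrative dispatcher for flow control commands; real device behavior may differ.
def handle_flow_command(sections, command, index=None, payload=None):
    if command == "repeat" and index is not None:
        sections.insert(index + 1, dict(sections[index]))       # copy image/measurement/comment data
    elif command == "insert" and index is not None and payload is not None:
        sections[index:index] = payload                          # payload: list of one or more sections
    elif command == "new":
        sections.append({"name": payload or "custom section"})   # blank, user-named section
    elif command == "redo" and index is not None:
        sections[index]["image"] = payload                       # replace the associated ultrasound image
    elif command == "delete" and index is not None:
        del sections[index]                                      # image/measurements/comments go with it
    return sections

sections = [{"name": "A4C", "image": "img_0001"}]
handle_flow_command(sections, "repeat", index=0)
handle_flow_command(sections, "new", payload="my section")
print(sections)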
After associating the first ultrasound image (typically the standard section image selected from the first ultrasound image) with the currently scanned section, the processor 20 also marks, in the section selection area B', the sections that already have an associated first ultrasound image. Specifically, among the sections displayed in the section selection area B', sections associated with a first ultrasound image and sections not yet associated with one are displayed differently, prompting the user as to which sections have been scanned and which remain to be scanned. For example, the processor 20 displays a thumbnail of the obtained (associated) ultrasound image on the section identifier in the section selection area B' to indicate that the section has been scanned and an ultrasound image has been obtained, while for sections for which no ultrasound image has been obtained, a section outline schematic is displayed on the section identifier (as shown in fig. 10). Of course, specific marks may also be used: as shown in fig. 13, sections for which an ultrasound image has been obtained are marked with a "v" mark and sections for which none has been obtained are marked with an "x" mark. Other marking modes are also possible, as long as the two kinds of sections are displayed differently.
The processor 20 obtains a measurement result based on measurement of the first ultrasound image; the measurement may be performed automatically (e.g., by image recognition or machine learning) or manually by the user. The processor 20 obtains comment content based on a judgment of whether the ultrasound sign associated with the section is present in the first ultrasound image; this judgment may also be made automatically (e.g., by image recognition or machine learning) to obtain the comment content, or the comment content may be entered manually after the user makes the judgment. This embodiment is described using the manual mode as an example.
The processor 20 may obtain a measurement result based on the user's measurement operation on the first ultrasound image (standard section image); for example, a designated key is triggered to display the measurement selection area, and the ultrasound image of the current section is measured according to a measurement item in the measurement selection area to obtain a measurement result. Further, the processor 20 displays the comment selection area on the scanning interface according to the measurement result, displays in the comment selection area the ultrasound signs associated with the currently scanned section in the currently activated scanning protocol, and displays the comment items of those ultrasound signs for selection by the user. The comment items include at least an option that the ultrasound sign is present and an option that it is absent, and may also include an option that its presence is undetermined. The processor 20 then obtains the comment content based on the comment item selected by the user. The specific process is the same as in the above embodiments and is not repeated here.
Of course, the scanning interface may also directly include the comment selection area, which displays the ultrasound signs associated with the currently scanned section in the currently activated scanning protocol together with the comment items of those ultrasound signs for the user to select; the comment items include at least an option that the ultrasound sign is present and an option that it is absent, and may also include an option that its presence is undetermined. The processor 20 obtains the comment content based on the comment item selected by the user. The scanning interface may also include the measurement selection area, which displays the measurement items associated with the currently scanned section in the currently activated scanning protocol for selection by the user; a measurement result is obtained according to the measurement item selected by the user and the measurement operation. The specific process is the same as in the above embodiments and is not repeated here.
The comment selection area and the measurement selection area may be the same area or different areas. Steps 4' and 3' are driven by the user's operations and their order is not limited; that is, the user may trigger step 3' first and then step 4', or step 4' first and then step 3', and of course may trigger only one of them.
Reference is made to various exemplary embodiments herein. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope herein. For example, the various operational steps and components used to perform the operational steps may be implemented in different ways (e.g., one or more steps may be deleted, modified, or combined into other steps) depending on the particular application or taking into account any number of cost functions associated with the operation of the system.
Additionally, as will be appreciated by one of skill in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium preloaded with computer readable program code. Any tangible, non-transitory computer readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, blu-Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means which implement the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been shown in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components, which are particularly adapted to specific environments and operative requirements, may be used without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, those skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the present disclosure is to be considered as illustrative and not restrictive in character, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "couple" and any other variants thereof are used herein to refer to physical connections, electrical connections, magnetic connections, optical connections, communication connections, functional connections, and/or any other connection.
Those skilled in the art will recognize that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. Accordingly, the scope of the invention should be determined from the following claims.
Claims (20)
1. An ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol and, in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of a plurality of examination sites of a target object that need to be scanned and a set of a plurality of sections of the examination sites that need to be scanned; the scanning interface comprises: a site selection area for displaying examination sites for selection by a user, a section selection area for displaying sections for selection by the user, a comment selection area for displaying comment content for selection by the user, and a measurement selection area for displaying measurement items for selection by the user;
displaying a plurality of examination sites of the currently activated scanning protocol in the site selection area for selection by a user;
receiving an instruction for selecting an examination site and, in response to the instruction, displaying a plurality of sections of the selected examination site in the section selection area for selection by the user;
receiving an instruction for selecting a section, and determining the currently scanned section; transmitting a first ultrasonic wave to a target object before or after receiving the instruction for selecting a section, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
selecting a standard section image from the first ultrasonic image;
displaying ultrasonic signs associated with a selected section in a currently activated scanning protocol in the comment selection area, and displaying comment items of the ultrasonic signs for selection by a user; the comment item includes at least an option of the presence of the ultrasound symptom and an option of the absence of the ultrasound symptom;
obtaining comment content according to comment items selected by a user;
displaying measurement items associated with a selected section in a currently activated scanning protocol in the measurement selection area for selection by a user;
obtaining a measurement result according to the measurement item selected by the user and the measurement operation;
correlating the selected section with the standard section image, the comment content, and the measurement result.
2. The method as recited in claim 1, further comprising:
after an examination site is selected, displaying first teaching instruction information associated with the selected examination site to instruct a user to scan the selected examination site; or,
after a section is selected, displaying second teaching instruction information associated with the selected section to instruct the user to scan the selected section.
3. An ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol and, in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of a plurality of sections of a target object that need to be scanned; the scanning interface comprises: a section selection area for displaying sections for selection by a user, a comment selection area for displaying comment content for selection by the user, and a measurement selection area for displaying measurement items for selection by the user;
receiving an instruction for selecting a section, and determining the section currently scanned; transmitting a first ultrasonic wave to a target object before or after receiving an instruction for selecting a section, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
displaying ultrasonic signs associated with a selected section in a currently activated scanning protocol in the comment selection area, and displaying comment items of the ultrasonic signs for selection by a user; the comment item includes at least an option of the presence of the ultrasound symptom and an option of the absence of the ultrasound symptom;
obtaining comment content according to comment items selected by a user;
displaying measurement items associated with a selected section in a currently activated scanning protocol in the measurement selection area for selection by a user;
and obtaining a measurement result according to the measurement item selected by the user and the measurement operation.
4. An ultrasound inspection method, the method comprising one or more workflows, wherein the workflows comprise:
selecting one section from a plurality of sections included in the workflow; before or after a section is selected, transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
selecting comment content based on the first ultrasonic image;
measuring the first ultrasonic image to obtain a measurement result;
associating the selected section with the first ultrasound image, the comment content, and the measurement result;
displaying the first ultrasound image, the comment content, and the measurement result.
5. A bedside ultrasound examination method, the method comprising one or more workflows, wherein the workflows are adapted for bedside ultrasound examination, the workflows comprising:
selecting one section from a plurality of sections included in the workflow;
displaying teaching instruction information associated with the selected section to instruct a user to scan the selected section;
transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
selecting comment content based on the first ultrasonic image;
measuring the first ultrasonic image to obtain a measurement result;
associating the selected section with the first ultrasound image, the comment content, and the measurement result.
6. The method of claim 4 or 5, wherein selecting a facet from a plurality of facets included in the workflow comprises:
selecting one examination site from a plurality of examination sites included in the workflow;
selecting a section from a plurality of sections included in the selected examination site.
7. The method of claim 1 or 6, further comprising:
after an examination site is selected, activating an ultrasound probe and/or imaging mode associated with the selected examination site; or,
after a section is selected, activating the ultrasound probe and/or imaging mode associated with the selected section.
8. The method of claim 4 or 5, wherein,
the selecting and obtaining comment content based on the first ultrasonic image comprises the following steps:
displaying ultrasonic signs associated with the selected section in the workflow in a comment selection area of a display interface, and displaying comment items of the ultrasonic signs for selection by a user; the comment item includes at least an option of the presence of the ultrasound symptom and an option of the absence of the ultrasound symptom;
obtaining comment content according to comment items selected by a user;
the measuring the first ultrasonic image to obtain a measurement result includes:
displaying measurement items associated with the selected section in the workflow in a measurement selection area of a display interface for selection by a user;
and obtaining a measurement result according to the measurement item selected by the user and the measurement operation.
9. The method of claim 1 or 8, further comprising:
receiving an instruction for starting an editing interface, and responding to the instruction, displaying the editing interface for a user to edit the workflow;
editing the workflow according to the operation of the user on the editing interface; the editing includes at least one of the following three types: adding, deleting or modifying the section included in the workflow, adding, deleting or modifying the ultrasound symptom associated with the section, and adding, deleting or modifying the measurement item associated with the section.
10. The method of claim 1 or 8, further comprising:
receiving an instruction for starting an editing interface, and responding to the instruction, displaying the editing interface for a user to add a new workflow; the editing interface provides at least one of an existing workflow, a section included in the existing workflow, an ultrasonic sign associated with the section of the existing workflow, and a measurement item associated with the section of the existing workflow for selection by a user;
and correlating at least one of the existing workflow selected by the user on the editing interface, the section included in the existing workflow, the ultrasonic sign correlated with the section of the existing workflow and the measurement item correlated with the section of the existing workflow to form a new workflow.
11. The method of claim 1, 4 or 5, further comprising: and obtaining a prompt of the next operation and/or a clinical judgment prompt according to the section and the comment content of which the comment content is obtained currently.
12. An ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol, and responding to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of a plurality of examination sites required to be scanned by a target object and a set of a plurality of sections required to be scanned by the examination sites; the scanning interface comprises: a section selection area, a comment selection area and a measurement selection area;
transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasound image based on the first ultrasound echo signal;
displaying the first ultrasound image in an image display area;
the section selection area is used for displaying a plurality of sections of the currently activated scanning protocol for a user to select; the comment selection area is used for displaying ultrasonic symptoms associated with a selected section in a currently activated scanning protocol and displaying comment items of the ultrasonic symptoms for selection by a user; the measurement selection area is used for displaying measurement items associated with a selected section in a currently activated scanning protocol for selection by a user.
13. An ultrasonic inspection method comprising:
receiving an instruction for activating a scanning protocol and, in response to the instruction, activating the scanning protocol associated with the instruction and displaying a scanning interface; wherein the scanning protocol comprises a set of a plurality of sections of a target object that need to be scanned; the scanning interface comprises: a section selection area for displaying sections and a flow control area for a user to control a workflow corresponding to the scanning protocol;
receiving an instruction for selecting a section, and determining the section currently scanned; or determining the currently scanned section according to the section scanning sequence preset in the activated scanning protocol;
before or after determining the currently scanned section, transmitting a first ultrasonic wave to a target object, receiving an echo of the first ultrasonic wave, and obtaining a first ultrasonic echo signal based on the echo of the first ultrasonic wave; generating a first ultrasonic image based on the first ultrasonic echo signal and displaying the first ultrasonic image in an image display area;
and controlling the workflow corresponding to the scanning protocol based on the operation of the user in the flow control area.
14. The method of claim 13, wherein after scanning of the currently scanned section is finished, the next section to be scanned is determined according to the section scanning sequence preset in the activated scanning protocol.
15. The method as recited in claim 13, further comprising:
obtaining a measurement result based on the measurement of the first ultrasound image;
obtaining comment content based on a determination of whether an ultrasound symptom associated with the section is present in the first ultrasound image.
16. The method as recited in claim 13, further comprising: associating the first ultrasonic image with the currently scanned section, and identifying, in the section selection area, the section with which the first ultrasonic image has been associated.
17. The method of claim 13, wherein, among the sections displayed in the section selection area, sections associated with the first ultrasound image and sections not associated with the first ultrasound image are displayed differently.
18. The method of claim 13, wherein controlling the workflow corresponding to the scanning protocol based on the operation of the user in the flow control area comprises:
receiving an instruction for repeating a section, and copying the section in the section selection area in response to the instruction; or,
receiving an instruction for inserting a single cut plane, and inserting the single cut plane in the cut plane selection area in response to the instruction; or,
receiving an instruction for inserting a set of cuts, and inserting a set of cuts in the cut selection area in response to the instruction; or,
receiving an instruction for adding a section, and adding the section in the section selection area in response to the instruction; or,
receiving an instruction for replacing a section, and responding to the instruction, replacing an ultrasonic image associated with the section; or,
receiving an instruction for deleting a section, and deleting the section in the section selection area in response to the instruction; or,
receiving an instruction for suspending a currently activated scanning protocol, and suspending a workflow corresponding to the currently activated scanning protocol in response to the instruction; or,
and receiving an instruction for stopping the currently activated scanning protocol, and stopping the workflow corresponding to the currently activated scanning protocol in response to the instruction.
19. An ultrasonic imaging apparatus, comprising:
an ultrasonic probe for transmitting ultrasonic waves to a region of interest in biological tissue and receiving echoes of the ultrasonic waves;
a transmission/reception control circuit for controlling the ultrasonic probe to transmit ultrasonic waves to the region of interest and to receive echoes of the ultrasonic waves;
input means for receiving input from a user;
a display;
a memory for storing a program;
a processor for executing a program stored in the memory to implement the method of any one of claims 1-18.
20. A computer readable storage medium having stored thereon a program executable by a processor to implement the method of any of claims 1-18.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111355905.7A CN116135150A (en) | 2021-11-16 | 2021-11-16 | Ultrasonic imaging equipment and ultrasonic inspection method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116135150A true CN116135150A (en) | 2023-05-19 |
Family
ID=86332908
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111355905.7A Pending CN116135150A (en) | 2021-11-16 | 2021-11-16 | Ultrasonic imaging equipment and ultrasonic inspection method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116135150A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117558417A (en) * | 2024-01-04 | 2024-02-13 | 卡本(深圳)医疗器械有限公司 | Medical image display method, device, equipment and storage medium |
Similar Documents
Publication | Title
---|---
US20240252150A1 | 3d ultrasound imaging system
US11653897B2 | Ultrasonic diagnostic apparatus, scan support method, and medical image processing apparatus
US10121272B2 | Ultrasonic diagnostic apparatus and medical image processing apparatus
JP4476400B2 | Ultrasonic diagnostic equipment
CN103118598B | Ultrasound diagnosis device, ultrasound image display device, and ultrasound image display method
CN109069131A | Ultrasonic system and method for breast tissue imaging
US20060034513A1 | View assistance in three-dimensional ultrasound imaging
WO2010046819A1 | 3-d ultrasound imaging
US9149250B2 | Ultrasound diagnosis apparatus and image-information management apparatus
WO2021128310A1 | Ultrasonic imaging device, and method for quickly setting ultrasonic automatic workflow
CN113786214B | Ultrasonic imaging equipment and rapid ultrasonic image labeling method thereof
JP2014036863A | Method for management of ultrasonic image, method for display and device therefor
CN111493932B | Ultrasonic imaging method and system
CN111214254A | Ultrasonic diagnostic equipment and section ultrasonic image acquisition method and readable storage medium thereof
US8636662B2 | Method and system for displaying system parameter information
CN111973220A | Method and system for ultrasound imaging of multiple anatomical regions
WO2021087687A1 | Ultrasonic image analyzing method, ultrasonic imaging system and computer storage medium
JP7175613B2 | Analysis device and control program
CN112702954B | Ultrasonic diagnostic equipment and method for rapidly distinguishing tangent plane and storage medium thereof
CN116135150A | Ultrasonic imaging equipment and ultrasonic inspection method thereof
WO2021003711A1 | Ultrasonic imaging apparatus and method and device for detecting b-lines, and storage medium
CN115299986A | Ultrasonic imaging equipment and ultrasonic inspection method thereof
CN113786213B | Ultrasonic imaging device and readable storage medium
CN115153643A | Method for displaying ultrasonic image, electronic equipment and ultrasonic imaging system
CN111096765B | Ultrasonic diagnostic apparatus, method of rapidly searching for unfinished section using the same, and storage medium storing the same
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination