
CN114762611A - Processing method of multiple dynamic parameters of body and application of processing method in ejection fraction - Google Patents


Info

Publication number
CN114762611A
Authority
CN
China
Prior art keywords
time
signals
heart
wave
left ventricle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110638976.1A
Other languages
Chinese (zh)
Inventor
尹立雪
常传礼
甘建红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Healson Technology Co ltd
Original Assignee
Chengdu Healson Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Healson Technology Co ltd
Publication of CN114762611A


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/06 Measuring blood flow
    • A61B 8/065 Measuring blood flow to determine blood output from the heart
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316 Modalities, i.e. specific diagnostic methods
    • A61B 5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/346 Analysis of electrocardiograms
    • A61B 5/349 Detecting specific parameters of the electrocardiograph cycle
    • A61B 5/366 Detecting abnormal QRS complex, e.g. widening
    • A61B 7/00 Instruments for auscultation
    • A61B 7/02 Stethoscopes
    • A61B 7/04 Electric stethoscopes
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/48 Diagnostic techniques
    • A61B 8/486 Diagnostic techniques involving arbitrary m-mode
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5292 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves using additional data, e.g. patient information, image labeling, acquisition parameters

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Cardiology (AREA)
  • Acoustics & Sound (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Hematology (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The method processes multiple dynamic parameters of the human body and applies them to ejection fraction. A device that simultaneously acquires electromyographic signals, sound signals and M-mode ultrasound images collects the myoelectric, sound and M-mode ultrasound signals and analyses them together to obtain an accurate picture of the body's motion state. In particular, by simultaneously monitoring the electrocardiographic (ECG) signal, the cardiac ultrasound image and the heartbeat sound, and processing the three so that they are aligned in time, the minimum and maximum distances between the anterior and posterior walls of the left ventricle can be located more accurately in the cardiac ultrasound image; from these, the minimum left-ventricular volume at end-systole and the maximum left-ventricular volume at end-diastole are calculated, and from the two volumes the cardiac ejection fraction is obtained.

Description

Processing method of multiple dynamic parameters of body and application of processing method in ejection fraction
Technical Field
The invention relates to medical equipment, a signal processing method and application thereof.
Background
During movement of the human body (and of other animal bodies), the muscles generate electrical activity, hereinafter called myoelectricity, under the command of the brain. Myoelectricity reflects the movement of the body. The electrocardiographic (ECG) signal is also a kind of myoelectricity; it drives the beating of the heart.
The prior art can effectively monitor the generation of myoelectricity and record its magnitude through the strength of the electromyographic signal. An electrocardiogram, for example, makes the waveform of the cardiac electrical activity known.
At the same time, ultrasound images reveal the shape and position of the body's muscles. Body movement also produces sounds, such as bone-contact sounds and heartbeat sounds. When the body moves, the myoelectricity, the change in muscle form and the sound are produced in a coordinated, mutually corresponding way. For example, the heartbeat begins with the cardiac electrical activity, which drives the change of the heart during the beat, which in turn produces sound (the heart sound) as the heart structures strike each other.
Existing medical equipment can monitor myoelectricity, morphological images and sound separately, but not simultaneously. For example, existing equipment can assist the analysis of ejection fraction only through the morphological image.
The ejection fraction is the percentage of the stroke volume relative to the ventricular end-diastolic volume (i.e. the cardiac preload); its normal value is 50-70%. It can be examined by cardiac color-Doppler ultrasound and is one of the important indicators for judging the type of heart failure. The ventricular ejection fraction is thus the ratio of the ventricular stroke volume to the ventricular end-diastolic volume.
Ejection fraction is a volume ratio index that reflects the ejection function of the ventricles from a volume perspective.
However, because only image information is provided to the medical staff, analysis of the ejection fraction requires them to mark measurements by hand on the image and then calculate the ejection fraction, which is not convenient for providing a medical solution quickly.
Therefore, in order to obtain the relevant state information of the body quickly, to better assist the medical staff, and to ensure the accuracy of the monitoring information that medical equipment provides about the body, it is necessary to make full use of the myoelectric, morphological-image and sound information that is generated, and can be monitored, during body movement.
Disclosure of Invention
To solve the above problems, the invention provides a method for collecting and processing multiple dynamic parameters of a body, and its application to cardiac ejection fraction analysis.
The method for collecting and processing multiple dynamic parameters of the body comprises the following steps:
An acquisition device having at least two of the following functions is used: ultrasound signal acquisition, electromyographic signal acquisition and audio signal acquisition. At least two of the ultrasound signal, the electromyographic signal and the audio signal are acquired from the same body, and the acquired signals are displayed synchronously on a display device. Synchronization means that all signals are projected onto the display device in real time and that the time difference between the signals as displayed equals the time difference at which they were acquired.
The synchronous display method specifically comprises the following steps:
S1. Obtain the time required to process each of the ultrasound signal, the electromyographic signal and the audio signal; take the maximum of these durations, and then compare the processing duration of each remaining signal with the maximum duration to obtain its contrast duration. The processing duration is the time from acquisition of a signal until it is displayed on the display device.
S2. The signal whose processing duration is the maximum duration is displayed immediately after its processing is finished;
each remaining signal, once its processing is finished, is displayed only after waiting its contrast duration.
The above method for collecting and processing multiple dynamic parameters of a body is further described as follows: the ultrasound signal is a cardiac M-mode ultrasound image signal; the electromyographic signal is the ECG QRS-wave signal; and the audio signal is the heart sound.
It is further described as follows: the ECG QRS-wave signal has the maximum processing duration, and the contrast duration between the processing duration of the ECG QRS-wave signal and that of the remaining signals is 0.125 s.
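To make steps S1 and S2 concrete, the following is a minimal sketch of the delay-compensation rule, assuming that the processing duration of each channel has been measured in advance; the function name compute_display_delays, the channel names and the example durations (0.5 s for the M-mode and sound channels, 0.625 s for the ECG, giving the 0.125 s contrast duration mentioned above) are illustrative assumptions, not part of the patent.

    # Sketch of the S1/S2 synchronization rule: every channel is held back just long
    # enough that all channels reach the display with the same total latency, so the
    # time differences seen on the display equal the acquisition time differences.
    def compute_display_delays(processing_durations):
        """processing_durations: dict mapping channel name to the seconds needed
        from acquisition until the signal is ready for display."""
        max_duration = max(processing_durations.values())        # S1: slowest channel
        return {channel: max_duration - duration                 # contrast duration
                for channel, duration in processing_durations.items()}

    # Example with assumed durations:
    delays = compute_display_delays({"m_mode": 0.5, "heart_sound": 0.5, "ecg": 0.625})
    # -> {"m_mode": 0.125, "heart_sound": 0.125, "ecg": 0.0}
    # S2: the ECG is shown as soon as it is processed; the other two wait 0.125 s.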
The method for obtaining the cardiac ejection fraction by utilizing multi-dynamic parameter processing comprises the following steps:
distributing the ECG QRS waves continuously collected from the same body during the same period, together with the M-mode ultrasound images of the heart-shape changes occurring synchronously with the ECG, onto the same time axis and aligning them to the same starting time point;
in the M-mode ultrasound image, at the time point a first specific duration after one of the Q, R or S waves of the ECG QRS complex occurs, obtaining a first longitudinal pixel column of the M-mode image and finding in it the minimum distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle;
in the M-mode ultrasound image, at the time point a second specific duration after one of the Q, R or S waves of the ECG QRS complex occurs, obtaining a second longitudinal pixel column of the M-mode image and finding in it the maximum distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle;
and calculating the cardiac ejection fraction from the minimum and the maximum values.
The above method for obtaining the cardiac ejection fraction by multi-dynamic-parameter processing is further described as follows: aligning to the same starting time point specifically means displaying the acquired ultrasound signals and electromyographic signals synchronously on the display device using the method of any one of claims 1 to 4; the display time difference between the ultrasound signals and the electromyographic signals projected onto the display device in real time equals the time difference at which they were acquired.
The above method for obtaining the cardiac ejection fraction by multi-dynamic-parameter processing is further described as follows: the ECG QRS waves and the M-mode ultrasound images are acquired by a device that has both the ECG QRS-wave and the M-mode ultrasound image acquisition functions;
the display device is provided with a continuous time axis and a real-time state axis.
The above method for obtaining the cardiac ejection fraction by multi-dynamic-parameter processing is further described as follows: the first and second specific durations are calculated as follows: a number of binarized M-mode ultrasound images are analysed, the moment at which the distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle reaches its minimum is marked, and the time difference between the occurrence of one of the Q, R or S waves and the moment of that minimum is calculated to obtain the first specific duration;
the time difference between the occurrence of one of the Q, R or S waves and the moment of the maximum is calculated to obtain the second specific duration.
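A minimal sketch of this calibration step follows, assuming that the anterior-posterior endocardial separation has already been measured for every pixel column of the binarized M-mode image (for example with the routine sketched in example six) and that the time of each column and of the R wave are known; the function and variable names and the synthetic test values are illustrative assumptions.

    import numpy as np

    def calibrate_specific_durations(separations, col_times, r_wave_time):
        """Return the first and second specific durations: the time from the R wave
        to the column with the minimum separation (end-systole) and to the column
        with the maximum separation (end-diastole)."""
        separations = np.asarray(separations, dtype=float)
        t_min = col_times[int(np.argmin(separations))]
        t_max = col_times[int(np.argmax(separations))]
        return t_min - r_wave_time, t_max - r_wave_time

    # Synthetic check: columns every 2 ms over one beat, R wave at 0.10 s,
    # separation smallest around 0.50 s and largest around 0.90 s.
    col_times = np.arange(0.10, 1.10, 0.002)
    separations = np.interp(col_times, [0.10, 0.50, 0.90, 1.10], [45, 30, 50, 40])
    d1, d2 = calibrate_specific_durations(separations, col_times, r_wave_time=0.10)
    # d1 is approximately 0.40 s and d2 approximately 0.80 s, matching the text.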
The above method for obtaining the cardiac ejection fraction by multi-dynamic-parameter processing is further described as follows: after the minimum and maximum distances between the anterior and posterior endocardium of the left ventricle are obtained, the cardiac ejection fraction is calculated as EF = (EDV - ESV) / EDV × 100%, where EF is the cardiac ejection fraction, EDV is the maximum (end-diastolic) volume of the left ventricle at the maximum anterior-posterior endocardial distance, and ESV is the minimum (end-systolic) volume of the left ventricle at the minimum anterior-posterior endocardial distance;
the minimum volume of the left ventricle is calculated from the minimum distance between the anterior and posterior endocardium of the left ventricle, and the maximum volume of the left ventricle is calculated from the maximum distance between the anterior and posterior endocardium of the left ventricle.
The above method for obtaining the cardiac ejection fraction by multi-dynamic-parameter processing is further described as follows: the M-mode ultrasound image consists of horizontal pixel rows parallel to the time axis and vertical pixel columns perpendicular to the time axis.
The invention has the beneficial effects that:
the invention can accurately obtain the maximum distance and the minimum distance between the front wall and the rear wall of the left ventricle in the beating process of the heart, thereby automatically obtaining the ejection fraction of the heart.
The average value and the current value of the ejection fraction of the heart can be dynamically known by continuously obtaining the maximum interval of the maximum interval between the left ventricle and the anterior-posterior wall; and the maximum and minimum values of the ejection fraction of the heart.
Drawings
Fig. 1 is a structural diagram of the multifunctional acquisition device of example one, with which a single device acquires ultrasound image information, heartbeat sound and ECG.
Fig. 2 is a schematic diagram of the control principle of the multifunctional acquisition devices introduced in examples one and two.
Fig. 3 is a structural diagram of the multifunctional acquisition device of example two, with which a single device acquires ultrasound image information, heartbeat sound and ECG.
Fig. 4 is a schematic diagram of the control principle of the acquisition device that acquires and processes ultrasound image information in example three.
Fig. 5 is a structural diagram of the acquisition device that acquires and processes ultrasound image information in example three.
FIG. 6 is a schematic diagram of a synchronous display method according to an embodiment of the present invention.
Detailed Description
The apparatus and method of the present invention are intended primarily for use with humans, and the body referred to in the present invention is primarily the human body, although they can also be adapted to animals other than humans. The term "human body" is used below as a conventional shorthand.
The invention is preferably implemented in the form of software: the steps of the method are coded as software and installed on a computer with computing capability, for example on a smartphone or tablet computer, which then has the function of recognising and reading M-mode echocardiographic images.
Those of ordinary skill in the art will appreciate that the various illustrative algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The invention is implemented by installing a corresponding computer program on a computer with the related hardware. The computer program can be stored in a computer-readable storage medium and, when executed by a processor, carries out the required steps. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that what a computer-readable medium may contain can be increased or decreased as required by legislation and patent practice in a given jurisdiction; in some jurisdictions, for example, computer-readable media may not include electrical carrier signals and telecommunication signals, in accordance with legislation and patent practice.
In the computer of the present invention, the main hardware components are still the following: a central processing unit, memory, a chipset, an I/O bus, I/O devices, a power supply, a case and the related software. The related software is software that can put an M-mode echocardiographic (M-echo) image onto a display screen; alternatively, the M-mode echocardiogram can be processed and calculated by the computer in the background instead of being displayed as an image, with only the final calculation result shown on the display.
The first example is as follows:
the multifunctional collecting device is provided with a probe for transmitting and collecting ultrasonic signals and a sound collecting head and a myoelectricity collecting electrode, and is used for collecting the body information of a human body. Ultrasonic image information, sound information and myoelectric information are respectively collected. Specifically, the heart beating information of a human body is acquired through a probe for transmitting and acquiring ultrasonic signals and a multifunctional acquisition device provided with a sound acquisition head and an electrocardio acquisition electrode, and the heart ultrasonic image information, the heart beating sound and the electrocardio are respectively acquired.
By referring to the following devices, the same device can respectively acquire ultrasonic image information, heartbeat sound and electrocardio.
Referring to fig. 1, the multifunctional stethoscope provided in this embodiment includes an equipment main body 5; a single-array-element ultrasonic probe 2 and an ECG sensor are arranged on the equipment main body 5, and a charging interface is arranged at its lower end. Inside the equipment main body 5 there are a control circuit board, which controls the operation of the equipment main body 5 and exchanges data with the mobile client terminal, and a power supply module, which supplies power to the whole stethoscope main body; the charging interface is connected to the power supply module, and the power supply module includes a rechargeable battery.
Because the single-array-element ultrasonic probe 2 and the ECG sensor are integrated on the equipment main body 5, the probe provides both the ultrasound and the ECG functions.
The equipment main body 5 is provided with a sound pick-up 4. The ECG sensor is a three-lead ECG electrode composed of three ECG electrodes 3 evenly arranged on the front end face of the equipment main body 5; the single-array-element ultrasonic probe 2 is located in the middle of the front end face of the equipment main body 5, and the sound pick-up 4 is located on one side of the front end face of the equipment main body 5.
The sound pick-up 4 provides the detection functions of heart sounds, fetal heart monitoring, breath sounds and bowel sounds. In actual operation, the sound output of the stethoscope can be played through the speaker of the mobile client terminal or through a Bluetooth headset connected to the stethoscope, and shown on the display of the mobile client terminal.
The equipment main body 5 comprises an upper shell 6, a main shell 7 and a lower shell connected in sequence from top to bottom; the control circuit board is located inside the main shell 7. The upper end of the equipment main body 5 is provided with a protective cover 1 that closes over it, the outer surface of the upper shell 6 carries a latch 601 that can engage the inner side of the protective cover 1, a hollow cover 401 is screwed onto the upper shell 6, and a thin film piece 402 located above the sound pick-up 4 is provided on the inner side of the hollow cover 401.
The structural composition of the equipment control circuit board, the ECG sensor, the single-array-element ultrasonic probe 2 and the sound pick-up 4 is shown in fig. 2.
Dividing the shell of the equipment main body 5 into the upper shell 6, the main shell 7 and the lower shell reduces the difficulty of the production process, and with the protective cover 1 at the upper end of the equipment main body 5 closed over the upper shell 6, the sound pick-up 4, the ECG electrodes 3 and the thin film piece 402 are protected from being soiled by foreign matter when not in use.
The control circuit board carries a main control chip, which processes the detection data of the single-array-element ultrasonic probe 2, the ECG sensor and the sound pick-up 4 and converts them into readable data in the form of values and curves; a data transmission module, which transmits the data generated by the main control chip to the mobile client terminal; and a FLASH memory chip, connected to the main control chip, which stores all the data generated by the main control chip.
The single-array-element ultrasonic probe 2 is connected to the main control chip through an ultrasound channel, the sound pick-up 4 and the ECG sensor are connected to the main control chip through a composite channel, and the mobile client terminal is connected to the main control chip through the data transmission module.
In this embodiment the multifunctional stethoscope can simultaneously acquire ultrasound image information with the single-array-element ultrasound probe, heartbeat sound with the sound pick-up 4 and ECG information with the ECG sensor. Because the three acquisition heads are at essentially the same position at the same time (for example over the heart on the left chest), the collected information is real-time and refers to the same body.
Example two:
referring to the device of the following chinese patent CN209695222U, the same device can be used to acquire ultrasound image information, heartbeat sound, and electrocardiogram.
The structure includes, with reference to fig. 2 and 3:
1. A single-array-element ultrasonic probe, used to detect the cardiac ejection fraction LVEF, and a cardio-pulmonary function probe. The cardio-pulmonary function probe comprises an ECG sensor for detecting the heart rate ECG, which is a three-lead ECG electrode, and a respiratory-rate sensor for detecting the respiratory rate RR, which is a capacitive respiration sensor. The single-array-element ultrasonic probe is connected to the main control chip through the ultrasound channel, the ECG sensor and the respiratory-rate sensor on the cardio-pulmonary function probe are connected to the main control chip through the composite channel, and the mobile client terminal is connected to the main control chip through the data transmission module. The single-array-element ultrasonic probe and the cardio-pulmonary function probe are both located on the front end face of the stethoscope main body, and the ECG sensor and the respiratory-rate sensor are fixed side by side into one unit. The single-array-element ultrasonic probe includes:
(1) an ultrasonic transducer, used to transmit ultrasonic waves into the human body and to receive the echo signals; (2) a transmitting circuit, used to drive the ultrasonic transducer to transmit ultrasound into the human body; (3) a receiving circuit, used to receive the echo signal of the ultrasonic transducer; (4) an interface circuit, used to connect to the main control chip in the stethoscope main body through the ultrasound channel; (5) a controller, used to control the operation of the transmitting circuit, detect whether the receiving circuit has an echo signal and store the echo signal. The controller comprises a transmission control module, a data storage module, a reception control module and an interface control module; the transmission control module is connected to the transmitting circuit through a transmission control bus, the data storage module is connected to the receiving circuit, the receiving circuit is connected to the reception control module through a reception control bus, and the interface control module is connected to the interface circuit. (6) A transmission control bus, a reception switching circuit and a reception control bus.
2. A main control chip (see fig. 2), used to process the detection data of the single-array-element ultrasonic probe, the ECG sensor and the respiratory-rate sensor and convert them into readable data in the form of values and curves.
3. A data transmission module, used to transmit the data generated by the main control chip to the mobile client terminal.
4. A power supply module, used to power the whole stethoscope main body.
5. A power switch, used to switch the power supply of the power module on or off.
6. A FLASH memory chip, connected to the main control chip and used to store all the data generated by it.
The data transmission module is a USB interface and/or a Bluetooth module. The mobile client terminal is a mobile phone terminal. The number of the cardiopulmonary function probes is not less than two. The stethoscope main body is cylindrical.
In use, the front end face of the stethoscope main body is pressed against the lung and heart regions of the patient to be auscultated, so that the single-array-element ultrasonic probe, the ECG sensor and the respiratory-rate sensor on the front end face are all in contact with the surface of the auscultated region. The main control chip sends the electrical signals used for detection to the single-array-element probe, the ECG sensor and the respiratory-rate sensor respectively; these detect the body region and transmit the detection data back to the main control chip, which processes the data returned by each of them so as to acquire the values of the ejection fraction LVEF, the heart rate and the respiratory rate RR in real time and convert the continuously acquired values into curves. The main control chip transmits the acquired values and corresponding curves of the ejection fraction LVEF, the heart rate ECG and the respiratory rate RR to the mobile client through the data transmission module, and a doctor can view the specific values of the respiratory rate RR, the heart rate ECG and the cardiac ejection fraction LVEF in real time on the mobile client. The FLASH memory chip stores all the values and curves generated by the main control chip; the main control chip can transmit them to the mobile client terminal through the data transmission module for detailed examination and analysis, which also prevents loss of the detection data if the power supply module is accidentally switched off.
When the device of CN209695222U is used, the respiratory-rate sensor is replaced with the sound pick-up 4 of example one, so as to provide the heartbeat-sound acquisition function required by the present invention.
In this embodiment the multifunctional stethoscope can simultaneously acquire ultrasound image information with the single-array-element ultrasound probe, heartbeat sound with the sound pick-up and ECG information with the ECG sensor. Because the three acquisition heads are at essentially the same position at the same time (for example over the heart on the left chest), the collected information is real-time and refers to the same body.
Example three:
In both example one and example two the ultrasound image information needs to be processed; it can be acquired by referring to the device of Chinese patent CN209751086U. Refer to fig. 4 and 5.
The embodiment comprises an ultrasonic transducer for emitting scanning beams, a digital control processing chip for controlling the ultrasonic transducer to emit the scanning beams and collecting echo signals, a portable control terminal for emitting control instructions to the digital control processing chip and checking scanning images, a transmitting and receiving multiplexing circuit, a transmitting and receiving switching circuit, a transmitting circuit, a receiving circuit, a USB interface circuit, a low power module and an ultrasonic instrument shell.
The ultrasonic transmitting and receiving switching circuit, the transmitting and receiving multiplexing circuit and the ultrasonic transducer are sequentially connected in series, the transmitting circuit and the receiving circuit are respectively connected with the transmitting and receiving switching circuit, the ultrasonic transducer, the digital control processing chip, the transmitting and receiving multiplexing circuit, the transmitting and receiving switching circuit, the transmitting circuit, the receiving circuit and the USB interface circuit are all packaged in an ultrasonic instrument shell, the ultrasonic transducer is located at the front end of the ultrasonic instrument shell, the front end face of the ultrasonic instrument shell is a coupling plane, and the USB interface circuit is located at the rear end of the ultrasonic instrument shell.
The low power module is provided with:
1. the digital power supply is used for providing adaptive electric energy for the digital control processing chip, the transmitting and receiving switching circuit, the transmitting circuit and the transmitting and receiving multiplexing circuit, a linear voltage regulator for protecting the voltage stability of the digital power supply is arranged in the digital power supply, and the digital control processing chip, the transmitting and receiving switching circuit, the transmitting circuit and the transmitting and receiving multiplexing circuit are respectively connected with the digital power supply;
2. the analog power supply is used for providing adaptive electric energy for the transmitting and receiving multiplexing circuit and the receiving circuit, a linear voltage regulator for protecting the voltage stability of the analog power supply is arranged in the analog power supply, and the transmitting and receiving multiplexing circuit and the receiving circuit are respectively connected with the analog power supply;
3. the adjustable high-voltage DC converter is used for providing adaptive high voltage for the transmitting circuit, and the transmitting circuit is connected with the adjustable high-voltage DC converter;
4. An overcurrent protector, used to limit the current supplied to the digital power supply, the analog power supply and the adjustable high-voltage DC converter; the digital power supply, the analog power supply and the adjustable high-voltage DC converter are each connected to the USB interface circuit through the overcurrent protector.
The front end face of the ultrasound instrument housing is the coupling plane. The digital control processing chip is provided with:
1. A transmitting beam control module, used to control the transmitting and receiving switching circuit to open the transmitting-circuit channel and close the receiving-circuit channel and to control the ultrasonic transducer to emit scanning beams; the transmitting beam control module is connected with the transmitting circuit through a transmitting control bus.
2. A receiving control module, used to detect whether the transmitting and receiving multiplexing circuit has an echo signal and to control the transmitting and receiving switching circuit to open the receiving-circuit channel and close the transmitting-circuit channel; the transmitting and receiving switching circuit and the transmitting and receiving multiplexing circuit are respectively connected with the receiving control module through a receiving control bus.
3. A data processing module, used to receive and collect the echo signals; the data processing module is connected with the receiving circuit.
4. An interface control module, used to control the transmitting beam control module and the receiving control module to operate alternately and to send the echo signals collected by the data processing module to the portable control terminal; the transmitting beam control module and the receiving control module are respectively connected with the interface control module, and the interface control module is connected with the portable control terminal through the USB interface circuit.
The digital control processing chip is an FPGA programmable digital gate array processing chip. The portable control terminal is a mobile phone terminal.
In the implementation process, the portable control terminal supplies power to the ultrasonic instrument through the USB interface circuit, the digital power supply can provide adaptive electric energy for the digital control processing chip, the transmitting and receiving switching circuit, the transmitting circuit and the transmitting and receiving multiplexing circuit, the analog power supply can provide adaptive electric energy for the transmitting and receiving multiplexing circuit and the receiving circuit, the adjustable high-voltage DC converter provides adaptive high voltage for the transmitting circuit, and linear voltage regulators for protecting the voltage stability of the digital power supply and the analog power supply are arranged in the digital power supply and the analog power supply.
The portable control terminal sends a scanning instruction to the ultrasonic instrument, the digital control processing chip controls the transmitting and receiving switching circuit to open a transmitting circuit channel and close a receiving circuit channel through the transmitting circuit, so that the ultrasonic transducer is controlled to send a scanning beam, and the ultrasonic transducer sends an obtained echo signal to the transmitting and receiving multiplexing circuit. When the digital control processing chip detects that the transmitting and receiving multiplexing circuit has an echo signal, the digital control processing chip controls the transmitting and receiving switching circuit to open a receiving circuit channel and close the transmitting circuit channel, the echo signal is sent to the digital control processing chip through the transmitting and receiving switching circuit and the receiving circuit in sequence, the digital control processing chip collects the echo signal, the collected echo signal is sent to the portable control terminal in sequence through the USB interface circuit, and the portable control terminal processes the collected echo signal to obtain an ultrasonic image and displays the ultrasonic image.
The portable control terminal supplies power to the ultrasound instrument at a voltage not exceeding 5 V and a current not exceeding 1500 mA; the overcurrent protector limits the current fed from the portable control terminal into the low power module to no more than 500 mA; the digital power supply can output two adapted supplies of 3.3 V / 300 mA and 1.2 V / 460 mA, and the analog power supply can output an adapted supply of 3.3 V / 400 mA.
Because the ultrasound instrument is powered by the portable control terminal and the low power module inside the instrument converts the electric energy supplied by the terminal into the matched supplies it needs, the miniaturization requirement of the ultrasound instrument can be met.
A scanning instruction is sent to the ultrasound instrument through the portable control terminal according to the scanning requirement, for example convex-array scanning for the abdomen of the human body, or scanning of the superficial parts of the human body. The transmitting beam control module controls the transmitting and receiving switching circuit through the transmitting circuit to open the transmitting-circuit channel and close the receiving-circuit channel, the transmitting beam control module controls the ultrasonic transducer to send out scanning beams, and the ultrasonic transducer sends the obtained echo signals to the transmitting and receiving multiplexing circuit. When the receiving control module detects that the transmitting and receiving multiplexing circuit has echo signals, the receiving control module controls the transmitting and receiving switching circuit to open the receiving-circuit channel and close the transmitting-circuit channel; the echo signals are sent through the transmitting and receiving switching circuit and the receiving circuit in turn to the data processing module, which collects them; the collected echo signals are sent through the interface control module and the USB interface circuit in turn to the portable control terminal, which processes them to obtain an ultrasound image and displays it.
After the portable control terminal sends a scanning instruction to the ultrasonic instrument, the transmitting beam control module controls the ultrasonic transducer to send scanning beams, and the interface control module controls the transmitting beam control module and the receiving control module to alternately operate, so that the ultrasonic instrument can realize an ultrasonic scanning function.
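The transmit/receive alternation described above can be summarized in the following control-flow sketch; the class, the stub objects and all method names are illustrative assumptions standing in for the FPGA logic and circuits, not a real device API.

    # Illustrative control flow only; each "circuit" is modelled as a plain object.
    class UltrasoundScanner:
        def __init__(self, switch, transmitter, receiver, usb):
            self.switch, self.tx, self.rx, self.usb = switch, transmitter, receiver, usb

        def scan_once(self):
            # Transmitting beam control: open the transmit channel, close the receive channel.
            self.switch.select("transmit")
            self.tx.emit_scanning_beam()
            # Receiving control: once an echo is present, switch channels and collect it.
            while not self.rx.echo_ready():
                pass
            self.switch.select("receive")
            echo = self.rx.collect_echo()
            # Interface control: forward the collected echo to the portable terminal.
            self.usb.send(echo)
            return echo

    class _Stub:
        """Minimal stand-ins so the sketch runs; real hardware replaces these."""
        def select(self, channel): print("channel:", channel)
        def emit_scanning_beam(self): print("scanning beam emitted")
        def echo_ready(self): return True
        def collect_echo(self): return b"echo-frame"
        def send(self, data): print("sent", len(data), "bytes to the terminal")

    UltrasoundScanner(_Stub(), _Stub(), _Stub(), _Stub()).scan_once()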
Example four:
the method for obtaining the cardiac ejection fraction by utilizing the multi-dynamic parameter processing comprises the following steps:
1. Using one of the acquisition devices provided in examples one, two and three, which has both the ultrasound-signal acquisition function and the electromyographic-signal acquisition function, the ultrasound signals and the electromyographic signals are acquired from the same body and the acquired signals are displayed synchronously on the display device.
Here "the same body" means the human body, or the human heart. The ultrasound signal information is the M-mode echocardiogram, and the myoelectric information is the ECG signal, shown as the ECG QRS waveform. Because the heartbeat is the contraction of the heart excited by the cardiac electrical activity, one piece of ECG information corresponds to one beat cycle of the heart, and the ECG signal corresponds synchronously to the cyclic beating of the heart; here "corresponds synchronously" does not mean simultaneous, but means that one follows the other correspondingly.
2. The synchronous display in the first step is performed as follows: the acquired ECG QRS waves and the M-mode ultrasound images of the heart-shape changes occurring synchronously with the ECG are distributed onto the same time axis and aligned to the same starting time point. The acquired ultrasound signals and electromyographic signals are displayed synchronously on the display device; the display time difference between the ultrasound signals and the myoelectric signals projected onto the display device equals the time difference at which they were acquired.
Specifically: the acquisition is completed on the same body with an acquisition device that has both the ultrasound-signal and the electromyographic-signal acquisition functions, and the result is shown on the same display device. The acquisition part of the device (for example, in fig. 1, the single-array-element ultrasonic probe 2, the sound pick-up 4 and the ECG sensor/ECG electrodes 3) picks up the signals. Because the heartbeat of the human body is excited by the cardiac electrical activity, within one heartbeat cycle (from the start of systole to the end of diastole) the ECG signal appears first and the change in heart shape follows. There is therefore a time difference between the ECG signal and the heart-shape change picked up by the acquisition part (the ECG occurs first, the change of the heart shape second). This time difference must be reflected accurately on the display device: it is calculated by the method described above and reproduced faithfully, so that the display time difference between the ultrasound signals and the electromyographic signals projected onto the display device in real time equals the time difference at which they were acquired.
3. In the M-mode ultrasound image, at the time point a first specific duration after one of the Q, R or S waves of the ECG QRS complex occurs, a first longitudinal pixel column of the M-mode image is obtained, and the minimum distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle is found in this first longitudinal pixel column. In the M-mode ultrasound image, at the time point a second specific duration after one of the Q, R or S waves of the ECG QRS complex occurs, a second longitudinal pixel column of the M-mode image is obtained, and the maximum distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle is found in this second longitudinal pixel column.
The first and second specific durations are calculated as follows: a number of binarized M-mode ultrasound images are analysed, the moment at which the distance between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle reaches its minimum is marked, and the time difference between the occurrence of one of the Q, R or S waves and that moment gives the first specific duration; the time difference between the occurrence of one of the Q, R or S waves and the moment of the maximum gives the second specific duration. It is taken into account here that the cardiac cycle may differ from body to body and that, within one cycle, the intervals differ slightly. The electrocardiogram has a QRS complex composed of the Q, R and S waves. For example, for the heart of a certain body: 0.40 s (the first specific duration) after the ECG R wave occurs, the heart enters the end of systole and reaches its minimum volume (the rapid-ejection phase); 0.80 s (the second specific duration) after the R wave occurs, the heart enters the end of diastole and reaches its maximum volume. The occurrence time of the R wave is taken as the starting point because the R wave is the most obvious feature: its peak is the largest and it is the easiest to identify. Of course the Q, S, T or P wave may also be used as the time starting point; all of these peaks lie within one heartbeat cycle and every cycle contains them, so it is not important which peak is used as the starting point. Taking the R wave as the starting point: after the first specific duration of 0.40 s the heart enters the end of systole and reaches the minimum volume, and after the second specific duration of 0.80 s it enters the end of diastole and reaches the maximum volume. Taking the Q wave as the starting point: after the Q wave occurs, the heart enters the end of systole and reaches the minimum volume after a first specific duration of 0.36 s, and enters the end of diastole and reaches the maximum volume after a second specific duration of 0.76 s. The images at the minimum and maximum volumes are captured by the ultrasound, producing the M-mode echocardiogram.
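A minimal sketch of how the calibrated offsets are applied follows, assuming the M-mode image is a 2-D array whose columns are evenly spaced in time (col_period seconds apart) and whose depth calibration (mm_per_pixel) is known; measure is any function that returns the endocardial pixel gap of one column (for instance the one sketched in example six). All names and numeric values are illustrative assumptions.

    import numpy as np

    def column_at(t, col_period):
        """Index of the M-mode pixel column displayed at time t (seconds)."""
        return int(round(t / col_period))

    def systole_diastole_distances(mmode, col_period, r_time, d1, d2,
                                   measure, mm_per_pixel):
        """Endocardial distances (mm) at r_time + d1 (end-systole, minimum) and
        r_time + d2 (end-diastole, maximum)."""
        col_min = mmode[:, column_at(r_time + d1, col_period)]  # first longitudinal column
        col_max = mmode[:, column_at(r_time + d2, col_period)]  # second longitudinal column
        return measure(col_min) * mm_per_pixel, measure(col_max) * mm_per_pixel

    # Illustrative use with the durations from the text (R wave at 0 s, d1 = 0.40 s,
    # d2 = 0.80 s) and a dummy image and measurement:
    mmode = np.zeros((400, 1000))
    d_min_mm, d_max_mm = systole_diastole_distances(
        mmode, col_period=0.002, r_time=0.0, d1=0.40, d2=0.80,
        measure=lambda col: int(np.count_nonzero(col)), mm_per_pixel=0.25)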
From the minimum and the maximum values, the following formula is applied:
the cardiac ejection fraction is calculated as EF = (EDV - ESV) / EDV × 100%, where EF is the cardiac ejection fraction, EDV is the maximum (end-diastolic) volume of the left ventricle at the maximum distance between the anterior and posterior endocardium of the left ventricle, and ESV is the minimum (end-systolic) volume of the left ventricle at the minimum distance between the anterior and posterior endocardium of the left ventricle;
the minimum volume of the left ventricle is calculated from the minimum distance between the anterior and posterior endocardium of the left ventricle, and the maximum volume of the left ventricle is calculated from the maximum distance between the anterior and posterior endocardium of the left ventricle.
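The step from the two distances to the ejection fraction can be sketched as follows. The patent states that the volumes are calculated from the minimum and maximum endocardial distances but does not name the volume model; the Teichholz correction used here is a standard M-mode convention and is therefore an assumption, as are the function names and the example distances.

    def teichholz_volume(d_cm):
        """Left-ventricular volume (ml) estimated from an internal diameter (cm)
        with the Teichholz correction V = 7 * D**3 / (2.4 + D) -- an assumed model."""
        return 7.0 * d_cm ** 3 / (2.4 + d_cm)

    def ejection_fraction(d_min_cm, d_max_cm):
        """EF = (EDV - ESV) / EDV * 100 %, with EDV from the maximum distance and
        ESV from the minimum distance between the anterior and posterior endocardium."""
        esv = teichholz_volume(d_min_cm)
        edv = teichholz_volume(d_max_cm)
        return (edv - esv) / edv * 100.0

    # Illustrative values: minimum distance 3.0 cm, maximum distance 4.8 cm
    print(round(ejection_fraction(3.0, 4.8)))   # -> about 67 (%), within the normal range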
The M-mode ultrasound image of the invention is an image formed by many dense pixels arranged in a matrix, divided into horizontal pixel rows parallel to the time axis and vertical pixel columns perpendicular to the time axis; the first longitudinal pixel column and the second longitudinal pixel column are each one column among these vertical pixel columns.
On this basis, in order to find the minimum and maximum distances between the endocardium of the anterior wall and the endocardium of the posterior wall of the left ventricle in one frame of the M-mode ultrasound image, the relationship between end-systole and end-diastole of the heart can be found with the help of the timing of the ECG QRS complex.
When this relationship is used, the following must be satisfied: all signals are projected onto the display device in real time, and the time difference between the signals projected onto the display device equals the time difference at which they were acquired. Specifically, the ECG wave group corresponds to one frame of the M-mode ultrasound image, and the time difference between the ECG wave group and the ultrasound image projected onto the display device equals the time difference at which the ECG signal and the ultrasound signal reached the acquisition heads. For example: an ECG R-wave signal is generated by the body and picked up by the myoelectric acquisition electrode of example one at 0 s. The R wave having occurred, the heart enters end-systole after the first specific duration of 0.40 s, i.e. the single-array-element ultrasonic probe acquires the end-systolic image at that moment, so the acquisition times of the ultrasound image and of the R-wave signal differ by 0.40 s; that is, the acquisition time difference is 0.40 s. The time from the acquisition of the R-wave signal by the myoelectric acquisition electrode until it is projected onto the display is 0.5 s, whereas the time from the acquisition of the heart image by the single-array-element ultrasonic probe until the image is projected onto the display is only 0.37 s. If the ECG R-wave signal and the ultrasound image signal were projected directly, the display would suggest that the heart enters end-systole only 0.37 s after the R wave is seen. This time difference would be wrong; the actual time difference between the ECG and the image on the body must be reproduced on the display to avoid calculation errors.
After the acquisition time difference and the display time difference have been handled in this way, the ECG QRS wave corresponds strictly to the ultrasound image, so that end-diastole (the maximum volume, the maximum value) can be found. For example, 0.40 s (the first specific duration) after the R wave occurs, the corresponding column of pixels in the M-mode image (the first longitudinal pixel column) is end-systole, where the heart reaches its minimum volume and the minimum value; 0.80 s (the second specific duration) after the R wave occurs, the corresponding column of pixels (the second longitudinal pixel column) is end-diastole, i.e. the maximum volume and the maximum value of the heart.
Example five:
Example four mentioned that the actual time difference between the ECG and the image on the body must be reproduced on the display to avoid calculation errors. The method is as follows:
using one of the acquisition devices provided in examples one, two and three, which has at least two of the ultrasound-signal, electromyographic-signal and audio-signal acquisition functions, at least two of the ultrasound signal, the electromyographic signal and the audio signal are acquired from the same body and the acquired signals are displayed synchronously on the display device.
The display device is provided with a continuous time axis and a real-time state axis. The time axis referred to in the present invention is the horizontal axis projected on the display device; the real-time state axis, which may also be called the change-state axis, is the vertical axis projected on the display device. With the time axis and the change-state axis displayed on the display device, the relationship between the changing state of a signal and time can be seen.
The synchronization here means that the signals are projected onto the display device in real time and that the time difference between the signals projected onto the display device equals the time difference at which they were acquired.
At least one of the ultrasound, electromyographic and audio signals is delayed by its processing circuit, so its display lags behind that of the remaining signals. The moment at which a signal affected by its processing circuit reaches the display device therefore lags behind the others. The causes are the circuits of the various electronic components and the software that processes the collected information, which introduce a time lag between acquisition and presentation that differs from source to source. For example, the heartbeat sound and the ultrasound image can be processed by circuit and software quickly, but the ECG signal is relatively slow from acquisition to presentation, about 0.125 s. To keep the presented information consistent, the presentation therefore has to be adjusted; this is the "synchronization" described above.
The delay introduced by a given circuit and a given piece of software is consistent, so the processing delay of each electrical signal can be calculated from the characteristics of the circuit and the software. It is better still to test it with the actual software and circuits and record the time they need to process the signals. The ultrasound signal, the heartbeat-sound signal and the ECG signal go through the same processing steps but each needs a different processing time; the processing time required for each of the three signals is recorded. The ECG signal needs the longest processing time, so when the three signals are presented, the heartbeat sound and the ultrasound signal are not presented immediately after their processing is finished; they are delayed until the ECG processing is complete. The waiting time is the recorded delay difference between the ECG signal and the other two signals (the ultrasound signal and the sound signal). After the sound signal and the ultrasound image have been processed, they wait this delay difference and are then presented, while the ECG signal is presented immediately after its processing. In this way the ultrasound signal, the ECG signal and the sound signal are presented with their times aligned, i.e. as acquired.
The specific method of synchronous display is as follows, with reference to fig. 6:
S1: obtain the time required to process each of the ultrasonic signal, the electromyographic signal and the audio signal, for example: 0.5 s for the ultrasonic signal, 0.625 s for the electromyographic signal and 0.5 s for the audio signal. Take the maximum, 0.625 s for the electromyographic signal, then compare the processing time of each other signal with this maximum: subtracting the 0.5 s of the audio signal (or of the ultrasound) from 0.625 s gives a comparison duration of 0.125 s. The processing time is the time from the acquisition of a signal until it is displayed on the display device; for example, an electrocardiographic R-wave generated by the body and collected by the electromyographic electrode of the first example at 0 s needs 0.625 s before it is projected onto the display.
S2: the signal whose processing time is the maximum is displayed immediately after its processing is finished; each of the remaining signals, once processed, is displayed after waiting its comparison duration. That is, after the electrocardiographic signal is processed it is immediately projected onto the display, with a delay of 0.625 s; the remaining signals, once processed, are displayed after waiting the 0.125 s comparison duration, so that their delay on the display is also 0.625 s. The time differences (delays) of all signals on the display are therefore equal, and the time relationship between the electrocardiogram and the image seen on the display is the true time relationship.
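As a concrete illustration of steps S1 and S2, the following Python sketch schedules the presentation of the three signals so that they all reach the display with the same total delay. The dictionary keys and function names are illustrative assumptions; the 0.5 s / 0.625 s figures are the example values above, not measurements of any particular device.

```python
# Minimal sketch of the S1-S2 synchronization scheme described above.
# Processing times are the example values from the text; in practice they
# would be measured for the actual circuit and software chain.

PROCESSING_TIME = {            # seconds from acquisition until processing is finished
    "m_mode_ultrasound": 0.500,
    "ecg_qrs":           0.625,   # longest -> displayed as soon as it is processed
    "heart_sound":       0.500,
}

def comparison_durations(processing_time):
    """S1: compare every signal's processing time with the maximum."""
    t_max = max(processing_time.values())
    return {name: t_max - t for name, t in processing_time.items()}

def display_delay(name, processing_time):
    """S2: total delay between acquisition and display for one signal.

    The slowest signal is shown as soon as it is processed; every other
    signal waits out its comparison duration, so all signals end up with
    the same acquisition-to-display delay and the acquisition-time
    differences between them are preserved on the display.
    """
    return processing_time[name] + comparison_durations(processing_time)[name]

if __name__ == "__main__":
    for name in PROCESSING_TIME:
        print(name, display_delay(name, PROCESSING_TIME))   # each prints 0.625
```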
The ultrasonic signal is a cardiac M-mode ultrasound image signal; the electromyographic signal is an electrocardiographic QRS wave signal; and the audio signal is a heart sound.
Example six:
In example four, the calculation of the minimum volume and the maximum volume of the heart from the image pixels may refer to the following example.
In cardiac ejection fraction analysis, it is necessary to know the minimum volume of the heart when the heart contracts and the maximum volume of the heart when the heart expands.
When the device software performs automatic identification, it cannot accurately identify the maximum and minimum volumes of the heart from the ultrasound image alone, so the heartbeat sound and the electrocardiographic waveform are fed into the device software. After the software obtains the electrocardiographic waveform (and the heartbeat sound), it uses the continuously changing electrocardiographic waveform to locate, in the corresponding continuously changing ultrasound image, the frame in which the distance between the anterior and posterior walls of the heart is most suitable. For example:
The frame corresponding to the minimum distance between the anterior and posterior walls of the left ventricle can be determined from the electrocardiogram; because this confines the search to a very small range, the anterior-posterior wall distance can be found very easily within that range by image recognition. The image recognition can be implemented in many ways, for example with the color-block identification method and device of Chinese patent CN201310125145.X, which performs edge detection on the image to be identified, extracts the outlines of the color blocks to determine their shapes, calculates the centroid coordinates of each color block from its outline to obtain its position, and samples the color at a point within the block to determine its color. That method can therefore identify the shape, position and color of a color block at the same time and extract its information more comprehensively. Other image recognition approaches, such as grayscale recognition, may also be used.
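The color-block identification algorithm of CN201310125145.X is not reproduced here. As a hedged illustration of the same kind of processing, the sketch below uses OpenCV to detect edges, extract contours, compute each block's centroid from image moments and sample the color at the centroid; the function name, file path argument and Canny thresholds are assumptions made for this example only.

```python
# Illustrative sketch only: generic shape/position/color extraction in the
# spirit of the color-block method cited above, not that patent's algorithm.
import cv2

def describe_color_blocks(image_path, canny_lo=50, canny_hi=150):
    img = cv2.imread(image_path)                    # BGR image
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, canny_lo, canny_hi)     # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blocks = []
    for contour in contours:                        # one contour per candidate block
        m = cv2.moments(contour)
        if m["m00"] == 0:                           # skip degenerate contours
            continue
        cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
        b, g, r = img[cy, cx]                       # color sampled at the centroid
        blocks.append({"contour": contour,
                       "centroid": (cx, cy),
                       "color_rgb": (int(r), int(g), int(b))})
    return blocks
```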
The image recognition method can also be implemented using, for example, part of the lung ultrasound image recognition method and system of Chinese patent CN202010409334.X.
The effective area image is grayed and normalized to obtain a normalized grayscale image.
The effective area image is grayed using the following formula:
Gray = 0.299 × R + 0.587 × G + 0.114 × B, where R, G and B are the red, green and blue components, respectively.
The normalized grayscale images are then scaled to a uniform size to form the images to be identified.
For example, the normalized grayscale image is scaled to a uniform size such as 256 × 256; preferably, bilinear interpolation, cubic spline interpolation or Lanczos interpolation is used for the scaling.
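A minimal sketch of the graying, normalization and rescaling steps, assuming the effective-area image is available as an RGB array; the 256 × 256 target size follows the example above, and bilinear interpolation is chosen here as one of the interpolation methods mentioned.

```python
import cv2
import numpy as np

def to_gray(rgb):
    """Gray = 0.299*R + 0.587*G + 0.114*B, as in the formula above."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def normalize_and_resize(rgb, size=(256, 256)):
    gray = to_gray(rgb.astype(np.float32))
    gray = (gray - gray.min()) / (gray.max() - gray.min() + 1e-8)   # normalize to [0, 1]
    # bilinear interpolation; cubic spline or Lanczos interpolation works equally well
    return cv2.resize(gray, size, interpolation=cv2.INTER_LINEAR)
```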
The minimum distance between the anterior wall and the posterior wall of the left ventricle is found by pattern recognition, which gives the volume of the heart at its smallest, when it is fully contracted; the maximum anterior-posterior wall distance is found by the same method, which gives the volume of the heart at its largest, when it is fully expanded. Substituting these values into the ejection fraction analysis formula yields the ejection fraction.
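The pattern-recognition step itself is left abstract here. Assuming that a per-frame position of the anterior wall and of the posterior wall has already been extracted by one of the recognition methods above (the array names are hypothetical), picking the extreme separations reduces to the following sketch.

```python
import numpy as np

def wall_distance_extrema(anterior_rows, posterior_rows):
    """Per-frame pixel rows of the anterior and posterior walls (hypothetical
    outputs of the image-recognition step); returns the minimum (end-systolic)
    and maximum (end-diastolic) anterior-posterior separation."""
    d = np.asarray(posterior_rows, float) - np.asarray(anterior_rows, float)
    return d.min(), d.max()
```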
Example seven:
In the fourth and sixth examples, the calculation of the minimum volume and the maximum volume of the heart from the image pixels may refer to the following example.
First, to calculate the minimum and maximum volumes of the heart, a method is needed for determining the left ventricular septal intima boundary and the left ventricular posterior wall intima boundary on an echocardiogram, as follows:
S1: a virtual center line is set on the echocardiogram, positioned at approximately equal distance from the upper boundary and the lower boundary;
The virtual center line divides the echocardiogram into two parts: an upper echocardiogram containing the mitral valve image and the left ventricular septal intima image, and a lower echocardiogram containing the left ventricular posterior wall intima boundary;
S2: the upper echocardiogram is processed to obtain the left ventricular septal intima boundary, and the lower echocardiogram is processed to obtain the left ventricular posterior wall intima boundary.
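A minimal sketch of S1 and S2 of this part, assuming the echocardiogram is available as a 2-D array with the upper boundary at row 0; the function and variable names are illustrative.

```python
import numpy as np

def split_along_centerline(echo_img):
    """Place a virtual center line approximately equidistant from the upper
    and lower boundaries and split the echocardiogram into an upper image
    (mitral valve / septal intima) and a lower image (posterior wall intima)."""
    centerline = echo_img.shape[0] // 2        # approximately equal distance from both boundaries
    upper_img = echo_img[:centerline, :]       # processed for the septal intima boundary
    lower_img = echo_img[centerline:, :]       # processed for the posterior wall intima boundary
    return centerline, upper_img, lower_img
```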
Secondly, the method of processing the upper echocardiogram to obtain the left ventricular septal intima boundary comprises the following steps:
S01: on the upper echocardiogram containing the mitral valve image and the left ventricular septal intima image, the approximate range of the region where the left ventricular septal intima and the mitral valve adjoin is determined as follows:
(1): on the echocardiogram, the continuous edge on the side nearest the mitral valve is taken as the upper boundary of the echocardiogram, and the pixel row 30 pixels below the upper boundary is set as the calculation starting line;
(2): starting from the calculation starting line and moving downwards, the percentage of black pixels is counted in each pixel row parallel to the starting line;
(3): the first pixel row found to contain at least 35% black pixels is the upper range line of the approximate range;
(4): the lower range line of the approximate range is obtained as follows: (I) continuing downwards from the upper range line, the percentage of black pixels in each pixel row is counted; the row in which black pixels account for the largest proportion is the lower range line of the approximate range; (II) when several pixel rows satisfy condition (I), the first such row closest to the upper range line is taken as the lower range line (a sketch of this row scan is given below).
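The row scan of steps (1)–(4) can be sketched as follows, assuming a binarized upper echocardiogram with 0 for black, 255 for white and the upper boundary at row 0; the 30-pixel starting offset and the 35% threshold are the figures given above, while the function name is illustrative.

```python
import numpy as np

def approximate_range(binary_img, start_offset=30, black_fraction=0.35):
    """Steps (1)-(4): upper range line = first row at or below the calculation
    starting line with at least 35 % black pixels; lower range line = first
    subsequent row in which the black-pixel proportion is largest."""
    height, width = binary_img.shape
    black = (binary_img == 0).sum(axis=1) / width       # black-pixel fraction per row

    upper_line = None
    for row in range(start_offset, height):             # (2)-(3)
        if black[row] >= black_fraction:
            upper_line = row
            break
    if upper_line is None:
        raise ValueError("no row reaches the 35 % black-pixel threshold")

    # (4): argmax returns the first row of maximal proportion, i.e. the one
    # closest to the upper range line, matching condition (II)
    lower_line = upper_line + int(np.argmax(black[upper_line:]))
    return upper_line, lower_line
```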
S02: within the approximate range, the pixel block of the region where the left ventricular septal intima and the mitral valve adjoin is found by searching row by row from the upper range line towards the lower range line, subject at least to the following conditions: (I) within the same pixel column, the gray value increases progressively from the upper range line towards the lower range line over at least ten pixels; (II) the first range found to have characteristic (I) is the pixel block of the adjoining region of the left ventricular septal intima and the mitral valve. While the search condition is evaluated, at most one of the following cases is permitted within the candidate pixel block, and it is treated as noise: (I) the gray values of no more than 2 pixels stop increasing relative to the preceding pixel; (II) the gray value of 1 pixel decreases relative to the preceding pixel.
S03: the gray values of the pixel block of the adjoining region are set to 0;
S04: the echocardiogram, including the adjoining region whose pixel-block gray values have been set to 0, is binarized to obtain a binarized echocardiogram in which both the adjoining region and the mitral valve region are excluded from the left ventricular septal intima.
S05: a region of interest is defined on the binarized echocardiogram, containing the entire left ventricular septal intima boundary image. The region of interest is calculated as follows: on the echocardiogram, the continuous edge on the side nearest the mitral valve is taken as the upper boundary, and the pixel row 30 pixels below the upper boundary is set as the first boundary of the region of interest. The second boundary of the region of interest is obtained as follows: (I) continuing downwards from the first boundary, the percentage of black pixels in each pixel row is counted; the row in which black pixels account for the largest proportion is the second boundary of the region of interest; (II) when several pixel rows satisfy condition (I), the first such row closest to the first boundary is taken as the second boundary.
The region of interest is the area between its first boundary and its second boundary.
S06: a morphological opening operation is performed on the region of interest to make the binarized echocardiogram smoother; edge detection is then performed to obtain the left ventricular septal intima boundary;
S07: a closing operation is performed on the obtained left ventricular septal intima boundary so that it forms a connected domain; this is the final left ventricular septal intima boundary.
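S06–S07 correspond to standard morphological and edge-detection operations; the sketch below assumes the region of interest is an 8-bit binarized image, and the 3 × 3 kernel and Canny thresholds are arbitrary illustrative choices rather than values fixed by the text.

```python
import cv2
import numpy as np

def septal_intima_boundary(roi_binary, kernel_size=3):
    """Sketch of S06-S07: opening to smooth the binarized image, Canny edge
    detection to extract the boundary, closing to join it into one connected domain."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    opened = cv2.morphologyEx(roi_binary, cv2.MORPH_OPEN, kernel)    # S06: smoothing
    edges = cv2.Canny(opened, 50, 150)                               # S06: edge detection
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)        # S07: connected boundary
    return closed
```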
Thirdly, the method of processing the lower echocardiogram to obtain the left ventricular posterior wall intima boundary comprises the following steps:
1: the ultrasound image is binarized; the continuous edge on the side nearest the mitral valve is marked as the upper boundary of the echocardiogram, and the continuous edge on the opposite side as the lower boundary;
A virtual center line is taken on the echocardiogram at equal distance from the upper and lower boundaries. Starting at the virtual center line and moving towards the lower boundary, the row in which a white pixel first appears is recorded for each pixel column;
2: in the recorded data, a column in which the row of the first white pixel lies more than 45 pixels from the upper boundary is a data-error column;
3: the data-error columns are corrected as follows:
(1) adjacent columns with erroneous data are merged into an error interval; (2) the row of the first white pixel in the correct data column immediately to the left of the error interval is taken as the first reference point, and the row of the first white pixel in the correct data column immediately to the right of the error interval is taken as the second reference point; a straight line connecting the first reference point and the second reference point is the left ventricular posterior wall intima boundary within the error interval (a sketch of this column-wise scan and correction is given after this list).
4: a morphological opening operation is performed to make the binarized echocardiogram smoother; edge detection is then performed to obtain the left ventricular posterior wall intima boundary.
5: a closing operation is performed on the obtained left ventricular posterior wall intima boundary so that it forms a connected domain; this is the final left ventricular posterior wall intima boundary.
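The column-wise scan and error correction of steps 1–3 can be sketched as follows; the interpretation of the 45-pixel criterion relative to the virtual center line, and the function and parameter names, are assumptions made for the illustration.

```python
import numpy as np

def posterior_wall_boundary(binary_img, centerline_row, max_offset=45):
    """Steps 1-3: for every pixel column, record the first white row at or
    below the virtual center line; columns whose first white row is missing
    or lies more than max_offset pixels below the reference row are treated
    as error columns and bridged by straight-line interpolation between the
    nearest correct columns on either side."""
    height, width = binary_img.shape
    boundary = np.full(width, np.nan)
    for col in range(width):                                    # step 1: column-wise scan
        rows = np.flatnonzero(binary_img[centerline_row:, col] == 255)
        if rows.size:
            boundary[col] = centerline_row + rows[0]

    bad = np.isnan(boundary) | (boundary - centerline_row > max_offset)   # step 2
    good_cols = np.flatnonzero(~bad)
    if good_cols.size < 2:
        raise ValueError("not enough correct columns to interpolate")
    # step 3: straight line between the reference points of each error interval
    boundary[bad] = np.interp(np.flatnonzero(bad), good_cols, boundary[good_cols])
    return boundary
```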
After the minimum value and the maximum value of the distance between the left ventricular anterior wall intima and the left ventricular posterior wall intima have been obtained, the cardiac ejection fraction is calculated by the formula EF = (EDV − ES) / EDV × 100%, where EF is the cardiac ejection fraction, EDV is the maximum volume of the left ventricle when the distance between the anterior and posterior wall intima of the left ventricle is maximal, and ES is the minimum volume of the left ventricle when the distance between the anterior and posterior wall intima of the left ventricle is minimal;
wherein the minimum volume of the left ventricle is calculated from the minimum distance between the anterior and posterior wall intima of the left ventricle, and the maximum volume of the left ventricle is calculated from the maximum distance between the anterior and posterior wall intima of the left ventricle.
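The text specifies the ejection fraction formula but not how a volume is obtained from an anterior-posterior intima distance. The sketch below therefore uses the Teichholz formula, a common M-mode convention, purely as an illustrative assumption; distances are in centimetres and volumes in millilitres.

```python
def teichholz_volume(d_cm):
    """Teichholz estimate of left-ventricular volume from an internal
    diameter d (cm) -- an assumed convention, not specified by the text."""
    return (7.0 / (2.4 + d_cm)) * d_cm ** 3

def ejection_fraction(d_min_cm, d_max_cm):
    """EF = (EDV - ES) / EDV * 100 %, with EDV from the maximum and
    ES from the minimum anterior-posterior intima distance."""
    edv = teichholz_volume(d_max_cm)   # end-diastolic (maximum) volume
    es = teichholz_volume(d_min_cm)    # end-systolic (minimum) volume
    return (edv - es) / edv * 100.0

# Example: 3.0 cm end-systolic and 4.8 cm end-diastolic diameters give an EF of about 67 %.
print(round(ejection_fraction(3.0, 4.8), 1))
```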
The above are examples of the practice of the invention. In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.
While the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A method for acquiring and processing multiple dynamic parameters of an organism, characterized in that an acquisition device having at least two of an ultrasonic signal acquisition function, an electromyographic signal acquisition function and an audio signal acquisition function is used to acquire at least two of the ultrasonic signal, the electromyographic signal and the audio signal from the same organism and to display the acquired signals synchronously on a display device; the synchronization means that all the signals are projected onto the display device in real time, and the time difference between the signals as projected onto the display device is equal to the time difference at which the signals were acquired.
2. The method for acquiring and processing multiple dynamic parameters of an organism according to claim 1, wherein the synchronous display specifically comprises:
S1: obtaining the time required to process each of the ultrasonic signal, the electromyographic signal and the audio signal; obtaining the maximum of these times, and comparing the processing time of each remaining signal with the maximum to obtain a comparison duration; the processing time being the time from the acquisition of a signal until the signal is displayed on the display device;
S2: displaying the signal whose processing time is the maximum immediately after its processing is finished;
and displaying each remaining signal, once its processing is finished, after its comparison duration has elapsed.
3. The method for acquiring and processing multiple dynamic parameters of an organism according to claim 1, wherein the ultrasonic signal is a cardiac M-mode ultrasound image signal; the electromyographic signal is an electrocardiographic QRS wave signal; and the audio signal is a heart sound.
4. The method according to claim 3, wherein the processing time of the electrocardiographic QRS wave signal is the maximum duration; and the comparison duration between the time required to process the electrocardiographic QRS wave signal and the processing time of the remaining signals is specifically 0.125 s.
5. A method for obtaining the cardiac ejection fraction using multi-dynamic-parameter processing, characterized in that:
the electrocardiographic QRS waves continuously collected from the same organism over the same time period, and the M-mode ultrasound images of the cardiac shape changes occurring synchronously with the electrocardiogram, are distributed on the same time axis and aligned to the same starting time point;
in the M-mode ultrasound image, a first longitudinal pixel column is obtained at the time point a first specific duration after one of the Q wave, R wave and S wave of the electrocardiographic QRS wave occurs, and the minimum distance between the left ventricular anterior wall intima and the left ventricular posterior wall intima is found in the first longitudinal pixel column;
in the M-mode ultrasound image, a second longitudinal pixel column is obtained at the time point a second specific duration after one of the Q wave, R wave and S wave of the electrocardiographic QRS wave occurs, and the maximum distance between the left ventricular anterior wall intima and the left ventricular posterior wall intima is found in the second longitudinal pixel column;
and the cardiac ejection fraction is calculated from the minimum value and the maximum value.
6. The method for obtaining the cardiac ejection fraction using multi-dynamic-parameter processing according to claim 5, wherein the alignment to the same starting time point is specifically performed by displaying the acquired ultrasonic signals and electromyographic signals synchronously on a display device using the method of any one of claims 1 to 4; the display time difference of the ultrasonic signals and the electromyographic signals projected onto the display device is equal to the time difference at which the signals were acquired.
7. The method according to claim 5, wherein the electrocardiographic QRS wave and the M-mode ultrasound image are acquired by a device having both the electrocardiographic QRS wave acquisition function and the M-mode ultrasound image acquisition function;
and the display device is provided with a continuous time axis and a real-time state axis.
8. The method according to claim 5, wherein the first specific duration and the second specific duration are calculated as follows: a plurality of binarized M-mode ultrasound images are analysed, the time at which the distance between the left ventricular anterior wall intima and the left ventricular posterior wall intima reaches its minimum is marked, and the time difference between the occurrence of one of the Q wave, R wave and S wave and the occurrence of the minimum is calculated to obtain the first specific duration;
and the time difference between the occurrence of one of the Q wave, R wave and S wave and the occurrence of the maximum is calculated to obtain the second specific duration.
9. The method according to claim 5, wherein, after the minimum value and the maximum value of the distance between the left ventricular anterior wall intima and the left ventricular posterior wall intima are obtained, the cardiac ejection fraction is calculated by the formula EF = (EDV − ES) / EDV × 100%, where EF is the cardiac ejection fraction, EDV is the maximum volume of the left ventricle when the distance between the anterior and posterior wall intima of the left ventricle is maximal, and ES is the minimum volume of the left ventricle when the distance between the anterior and posterior wall intima of the left ventricle is minimal;
wherein the minimum volume of the left ventricle is calculated from the minimum distance between the anterior and posterior wall intima of the left ventricle, and the maximum volume of the left ventricle is calculated from the maximum distance between the anterior and posterior wall intima of the left ventricle.
10. The method for obtaining the cardiac ejection fraction using multi-dynamic-parameter processing according to claim 5, wherein the M-mode ultrasound image is composed of horizontal pixel rows parallel to the time axis and longitudinal pixel columns perpendicular to the time axis.
CN202110638976.1A 2021-01-13 2021-06-08 Processing method of multiple dynamic parameters of body and application of processing method in ejection fraction Pending CN114762611A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110043918 2021-01-13
CN2021100439184 2021-01-13

Publications (1)

Publication Number Publication Date
CN114762611A true CN114762611A (en) 2022-07-19

Family

ID=82364682

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110638976.1A Pending CN114762611A (en) 2021-01-13 2021-06-08 Processing method of multiple dynamic parameters of body and application of processing method in ejection fraction

Country Status (1)

Country Link
CN (1) CN114762611A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116671928A (en) * 2023-07-27 2023-09-01 中国科学技术大学 Bimodal cardiac electromechanical physiological source imaging system

Similar Documents

Publication Publication Date Title
EP2563212B1 (en) Visualization of myocardial infarct size in diagnostic ecg
US11647992B2 (en) System and method for fusing ultrasound with additional signals
CN108309353B (en) Heart rate assist for phase determination in echocardiography
CN110477952B (en) Ultrasonic diagnostic apparatus, medical image diagnostic apparatus, and storage medium
US20230218178A1 (en) Construction method and application of digital human cardiovascular system based on hemodynamics
US20160206287A1 (en) Wearable Doppler Ultrasound Based Cardiac Monitoring
CN114762611A (en) Processing method of multiple dynamic parameters of body and application of processing method in ejection fraction
US11020095B2 (en) Data compression to facilitate remote medical analysis and diagnosis
CN112043259A (en) Monitoring system, method and device
EP4125609B1 (en) Medical sensing system and positioning method
Lu et al. Accurate heart beat detection with Doppler radar using bidirectional GRU network
CN116744850A (en) Electrocardiogram generation system and method based on deep learning algorithm
US20230210393A1 (en) Method and device for multidimensional analysis of the dynamics of cardiac activity
CN114027868A (en) Ultrasonic diagnostic apparatus with physiological signal detection function
EP4161377B1 (en) Method and device for multidimensional analysis of the dynamics of cardiac activity
CN105380632A (en) Biological identification robot system with high safety performance
CN214804851U (en) Ultrasonic diagnostic apparatus with physiological signal detection function
WO2024086941A1 (en) Systems, devices, and methods for visualizing patient physiological data
JP2021016627A (en) Ultrasonic diagnostic device, ultrasonic diagnostic system, time information giving program, and delay time measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination