CN109670454B - Biological information introduction method, biological information introduction device and terminal equipment - Google Patents
- Publication number: CN109670454B (application number CN201811568200.1A)
- Authority
- CN
- China
- Prior art keywords
- detected
- target
- animation
- image
- image frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application provides a biological information introduction method, a biological information introduction device and a terminal device, wherein the method comprises the following steps: acquiring an image to be detected, and carrying out target detection on the image to be detected; judging whether a target organism is detected in the image to be detected or not; if the target creature is detected in the image to be detected, acquiring an animation corresponding to the type of the target creature according to the type of the target creature contained in the image to be detected, and playing the animation, wherein the animation is used for displaying the growth process of the creature of the type. The method and the device can improve the efficiency of acquiring the relevant information about the biological growth process by the user to a certain extent.
Description
Technical Field
The present application belongs to the field of terminal technologies, and in particular, relates to a biological information introduction method, a biological information introduction apparatus, a terminal device, and a computer-readable storage medium.
Background
Currently, when a user wants to learn about the growth process of a certain living being (e.g., a rose), the user typically consults books or searches for related information on the Internet.
Consequently, users at present cannot quickly and efficiently obtain relevant information about a living being's growth process.
Disclosure of Invention
In view of the above, the present application provides a biological information introduction method, a biological information introduction apparatus, a terminal device, and a computer-readable storage medium, which can enable a user to more efficiently acquire relevant information about a biological growth process to some extent.
A first aspect of the present application provides a biological information introduction method including:
acquiring an image to be detected, and carrying out target detection on the image to be detected;
judging whether a target organism is detected in the image to be detected or not;
if the target organism is detected in the image to be detected, the following steps are carried out:
and acquiring an animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected, and playing the animation, wherein the animation is used for showing the growth process of the type of the organism.
A second aspect of the present application provides a biological information introducing apparatus including:
the target detection module is used for acquiring an image to be detected and carrying out target detection on the image to be detected;
the judging module is used for judging whether a target organism is detected in the image to be detected or not;
the animation acquisition module is used for acquiring an animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected if the target organism is detected in the image to be detected, wherein the animation is used for showing the growth process of the organism of the type;
and the animation playing module is used for playing the animation.
A third aspect of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect as described above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method of the first aspect as described above.
In view of the above, the present application provides a method for introducing biological information. Firstly, acquiring an image to be detected, such as an image shot by a user through a mobile phone camera; secondly, performing target detection on the image to be detected, and judging whether a target organism is detected in the image to be detected, for example, if the target organism is a rose, judging whether the rose is detected in the image to be detected; and then, if the target creature is detected in the image to be detected, acquiring an animation corresponding to the type of the target creature according to the type of the target creature in the image to be detected, and playing the animation, wherein the animation is used for showing the growth process of the creature of the type. For example, if the target creature is a rose, when it is detected that the image to be detected contains the rose, an animation corresponding to the rose may be obtained, and the animation is played, where the animation is used to show a growth process of the rose. Therefore, according to the technical scheme provided by the application, when a user wants to know the growth process of a certain living being, an image containing the living being can be provided for the terminal device, and if the living being is a target living being which can be detected by the terminal device, the terminal device can push an animation for displaying the growth process of the living being to the user after acquiring the image, so that the user is prevented from manually inquiring related information about the growth process of the living being, and therefore, the efficiency of acquiring the related information about the growth process of the living being by the user can be improved to a certain extent.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 is a schematic flow chart illustrating an implementation of a method for introducing biological information according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an animation playback according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another biological information introduction method provided in the second embodiment of the present application;
fig. 4 is a schematic structural diagram of a biological information introduction apparatus according to a third embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The biological information introduction method provided by the embodiment of the application is applicable to terminal equipment, and the terminal equipment includes but is not limited to: smart phones, palm computers, notebooks, desktop computers, intelligent wearable devices, and the like.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Example one
Referring to fig. 1, a method for introducing biological information according to a first embodiment of the present application is described below, where the method for introducing biological information according to the first embodiment of the present application includes:
in step S101, an image to be detected is obtained, and target detection is performed on the image to be detected;
each step in the biological information introduction method described in the first embodiment of the present application may be applied to a terminal device. The image to be detected in step S101 may be an image shot by a user through a camera APP of the terminal device; or, the image to be detected may be a preview image of a frame in a preview picture acquired by a camera APP or a video camera APP in the terminal device; or, the image to be detected may also be an image stored in a local gallery of the terminal device; alternatively, the image to be detected may also be a frame image in an online-viewed video or a locally-stored video. The source of the above-mentioned image to be detected is not limited in this application.
In the embodiment of the application, the trained neural network model can be used for carrying out target detection on the acquired image to be detected; alternatively, other target detection methods commonly used in the art may be used to perform target detection on the image to be detected, and the target detection algorithm is not limited herein.
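The detection flow of steps S101 and S102 can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the trained neural network is replaced by a stub that reads a label from a toy image dictionary, and the names `DetectionResult`, `stub_model`, and `detect_target_organism` are assumptions introduced here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectionResult:
    detected: bool
    species: Optional[str] = None  # e.g. "rose"
    stage: Optional[str] = None    # e.g. "seedling", "flowering", "fruiting"

# Stand-in for the trained neural network model: a toy "image" here is a
# dict carrying a ground-truth label, whereas a real system would run
# inference on the pixel data.
DETECTABLE = {"rose", "plum blossom", "mimosa", "cat", "dog"}

def stub_model(image: dict) -> DetectionResult:
    label = image.get("label")
    if label in DETECTABLE:
        return DetectionResult(True, label, image.get("stage"))
    return DetectionResult(False)

def detect_target_organism(image: dict, model=stub_model) -> DetectionResult:
    """Steps S101/S102: run target detection and report what was found."""
    return model(image)
```

In this sketch, swapping `stub_model` for a real classifier is the only change a genuine terminal-device implementation would need.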
In step S102, it is determined whether a target organism is detected in the image to be detected;
in this step, it is necessary to determine whether a target creature is detected in the image to be detected acquired in step S101, and if the target creature is detected, the following step of acquiring and playing an animation in step S103 is performed.
For example, suppose that "performing target detection on the image to be detected" in step S101 specifically means "performing target detection on the image to be detected by using the trained neural network model", and that the target organisms detectable by the trained neural network model are roses, plum blossoms, mimosa, cats, and dogs. If the image to be detected obtained in step S101 contains a plum blossom, then when the trained neural network model recognizes the plum blossom in the image to be detected, the determination result of step S102 is positive, and the subsequent step S103 may be performed: the animation corresponding to the plum blossom is obtained and played.
In step S103, if a target creature is detected in the image to be detected, an animation corresponding to the type of the target creature is obtained according to the type of the target creature included in the image to be detected, and the animation is played, wherein the animation is used for showing the growth process of the creature of the type.
If the determination result in step S102 is positive, according to the type of the target living being in the image to be detected obtained in step S101 (where, the type of the target living being may be input by the user, or may also be determined according to the result of the target detection after the target detection is performed on the image to be detected), an animation corresponding to the type of the target living being is obtained.
In the embodiment of the application, if a plurality of target organisms are detected in the image to be detected, the animation corresponding to the type of one of the target organisms may be obtained and played; or animations corresponding to the types of two or more of the target organisms may be obtained and each played. For example, if the types of the target organisms contained in the image to be detected obtained in step S101 are plum blossom, rose, and mimosa, the animation corresponding to the plum blossom may be obtained and played; or the animations corresponding to the rose and to the mimosa may be obtained and both played; or the animations corresponding to the plum blossom, the rose, and the mimosa may all be obtained and the three animations played.
Specifically, the animation corresponding to each type may be stored in a preset second server (for example, the animations corresponding to roses, chrysanthemums, mimosa, plum blossoms, dogs, etc.). Accordingly, "obtaining the animation corresponding to the type of the target creature" in step S103 includes: sending search request information to the second server to instruct the second server to search for the animation corresponding to the type of the target organism in the image to be detected; and receiving the animation, returned by the second server, corresponding to the type of the target organism in the image to be detected.
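The second-server lookup can be sketched as a simple keyed search. The patent does not specify a protocol, so the server is modeled here as an in-memory table; `ANIMATION_STORE`, `fetch_growth_animation`, and the file names are illustrative assumptions, and a real deployment would issue a network request instead.

```python
# In-memory stand-in for the preset second server's animation store.
ANIMATION_STORE = {
    "rose": "rose_growth.webm",
    "chrysanthemum": "chrysanthemum_growth.webm",
    "mimosa": "mimosa_growth.webm",
}

def fetch_growth_animation(species: str):
    """Send a search request for the species' growth animation and return
    the animation resource, or None if the server stores none."""
    return ANIMATION_STORE.get(species)
```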
In step S103, after the animation is obtained, it may be played directly to show the user the growth process of the target organism in the image to be detected. In this embodiment of the application, the user may adjust the time interval between consecutive image frames while the animation is playing; for example, if the interval between frames is 40 ms during normal playback, the user may increase it to 80 ms so that the animation plays more slowly and the growth process can be viewed more clearly. In addition, the user may pause playback to examine the organism's growth at a particular stage.
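The frame-interval adjustment described above can be sketched as a display schedule. This is a minimal illustration under the assumption that frames are shown at fixed intervals; `playback_schedule` is a name invented here.

```python
def playback_schedule(n_frames: int, interval_ms: int = 40):
    """Display timestamp (ms) of each frame for a given inter-frame
    interval; doubling the interval halves the playback speed."""
    return [i * interval_ms for i in range(n_frames)]
```

With 7 frames, a 40 ms interval finishes at 240 ms, while an 80 ms interval stretches the same animation to 480 ms, which is the slower playback the user requests.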
In addition, in this embodiment of the present application, the growth stage of the target organism in the image to be detected obtained in step S101 may also be acquired (for example, it may be determined whether the target organism is in the seedling stage, the flowering stage, or the fruiting stage). The animation is then played, and when the animation is detected to have played to a target image frame, the target image frame is kept displayed for a preset duration (for example, 5 s) until the playing of the animation ends, where the target image frame is an image frame in which the growth stage of the organism contained in the animation is consistent with the growth stage of the target organism in the image to be detected.
In this embodiment of the present application, the method for determining the target image frame may be: the acquired animation can correspond to growth stage information, the growth stage information is used for indicating the growth stage of the creature contained in each image frame in the animation, and the target image frame in the animation is determined through the growth stage information. The target image frame may be one or more frames of images in animation. The target image frame can be determined after the growth stage of the target organism in the image to be detected is obtained and before the animation is played; or may be determined during the playing of the animation.
For example, if the target image frame is determined during the playing of the animation, the growth stage of the organism in the currently displayed image frame may be detected while the animation plays (the detection may be performed once every 80 ms, once every 100 ms, or for every frame the animation displays). Once the growth stage of the organism in the currently displayed image frame is detected to be the same as the growth stage of the target organism in the acquired image to be detected, the currently displayed frame may be determined as the target image frame (so the target image frame may be continuously updated), and each target image frame is kept displayed for the preset duration until the playing of the animation ends. As shown in fig. 2, the animation 201 includes 7 image frames, and during normal playing the interval between consecutive frames is 40 ms; the animation 201 shows the growth process of a rose. Assuming that the image to be detected obtained in step S101 contains the target organism, a rose, and that the rose is in the flowering stage, then while the animation 201 plays, the growth stage of the rose in each displayed image frame may be detected; once the rose in the currently displayed frame is detected to be in the flowering stage, that frame is kept displayed for a preset duration, such as 5 s, until the playing of the animation 201 ends.
Those skilled in the art will readily see that, in this case, two target image frames are detected, namely image frame 1 and image frame 2: when the animation 201 plays to image frame 1, image frame 1 is determined as a target image frame and kept displayed for the preset duration; when the animation 201 plays to image frame 2, image frame 2 is likewise determined as a target image frame and kept displayed for the preset duration. However, a user often wants the animation to pause only once during playback. Therefore, alternatively, only the first time the animation 201 is detected to have played to an image frame whose contained growth stage is consistent with the growth stage of the target organism in the image to be detected is that frame determined as the target image frame and kept displayed for the preset duration; in this case only image frame 1 is held, and image frame 2 is played through normally.
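The hold-on-target-frame behaviour can be sketched as a dwell-time plan per frame. This is one possible implementation under stated assumptions: `play_plan` and its parameters are names invented here, and the 40 ms / 5000 ms defaults follow the figures used in the rose example.

```python
def play_plan(frame_stages, detected_stage, interval_ms=40, hold_ms=5000,
              first_match_only=True):
    """Dwell time (ms) per frame. Frames whose growth stage matches the
    detected organism's stage are held for hold_ms; with
    first_match_only=True only the first match is held (the single-pause
    behaviour), otherwise every matching frame is held."""
    plan, matched = [], False
    for stage in frame_stages:
        if stage == detected_stage and (not matched or not first_match_only):
            plan.append(hold_ms)
            matched = True
        else:
            plan.append(interval_ms)
    return plan
```

For a rose detected in the flowering stage, only the first flowering frame is held when `first_match_only=True`; both flowering frames are held when it is `False`.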
When the animation is played to the target image frame and the target image frame is displayed, the growth stage of the target organism in the image to be detected can be pushed to the user (for example, the user is informed in a voice mode that the growth stage of the target organism in the image to be detected is a seedling stage), and/or preset introduction information for introducing the target organism can be pushed to the user (wherein the introduction information can be in a voice mode, a text mode and/or a picture mode, the application does not limit the introduction information, and if the target organism in the image to be detected is a rose, text introduction information and related pictures related to the rose can be pushed to the user, and the like).
As can be seen from the above, in the technical solution provided in the first embodiment of the present application, when a user wants to know a growth process of a certain living being, an image including the living being may be provided to a terminal device, and if the living being is a target living being that can be detected by the terminal device, the terminal device may push an animation for showing the growth process of the living being to the user after obtaining the image, so as to avoid that the user manually queries related information about the growth process of the living being, and therefore, the efficiency of obtaining the related information about the growth process of the living being by the user may be improved to a certain extent.
Example two
Referring to fig. 3, another method for introducing biological information provided in the second embodiment of the present application is described below, where the method for introducing biological information in the second embodiment of the present application includes:
in step S301, obtaining an image to be detected, performing target detection on the image to be detected by using the trained neural network model, and obtaining a detection result output by the neural network model, where the detection result is used to indicate whether a target organism is detected in the image to be detected, and when a target organism is detected in the image to be detected, the detection result is used to indicate the type and growth stage of the target organism contained in the image to be detected;
in the embodiment of the application, a neural network model can be trained in advance, and the acquired image to be detected is subjected to target detection through the neural network model. Specifically, how to obtain the trained neural network model is the prior art, and details are not described herein.
In step S302, it is determined whether a target organism is detected in the image to be detected;
in step S303, if a target creature is detected in the image to be detected, acquiring an animation corresponding to the type of the target creature according to the type of the target creature included in the image to be detected indicated by the detection result, wherein the animation is used for showing the growth process of the creature of the type;
in step S304, a growth stage of the target living being included in the image to be detected is acquired based on the detection result;
in step S305, playing the animation, and when it is detected that the animation is played to a target image frame, keeping displaying the target image frame within a preset time period until the animation is played, where the target image frame is an image frame in which a growth stage of a living being included in the animation is consistent with a growth stage of a target living being in the image to be detected;
the relevant contents of the above steps S302-S305 are already described in the steps S102-S103 in the first embodiment, and are not described herein again.
Furthermore, in the second embodiment of the present application, after the target image frame of the animation is acquired, the following operations may be performed:
A. calculating the similarity between the organism in the target image frame of the animation and the same organ of the target organism in the image to be detected;
assuming that the trained neural network model judges that the target organism in the image to be detected is a rose and is in the flowering phase, after the corresponding animation is obtained, obtaining a target image frame in the animation, and calculating the similarity of the same organ of the rose in the image to be detected and the rose in the target image frame, such as calculating the similarity of roots and/or the similarity of flowers.
B. Judging whether the calculated similarities of the same organ are all larger than a preset threshold value;
C. if not all of the calculated similarities are greater than the preset threshold, sending feedback information to a first server, where the feedback information is used to indicate that the detection result output by the trained neural network model is wrong.
If not every calculated similarity of the same organ is greater than the preset threshold (for example, 60%), it is determined that the detection result of the trained neural network model for the target organism in the image to be detected is biased: the model may have misidentified the type of the target organism, or may have misjudged the growth stage of the target organism in the image to be detected. At this time, the terminal device may send the feedback information to the preset first server, so that background staff can better modify and update the neural network model according to the feedback information.
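Steps A through C can be sketched as a threshold check over per-organ similarities. This is an illustrative assumption: `needs_feedback` and the dict representation are invented here, and the 0.60 default mirrors the 60% example above.

```python
def needs_feedback(organ_similarities: dict, threshold: float = 0.60) -> bool:
    """Return True when feedback should be sent to the first server,
    i.e. when not every organ similarity exceeds the threshold and the
    model's detection result is therefore suspect."""
    return not all(sim > threshold for sim in organ_similarities.values())
```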
In this embodiment, the feedback information may include: version information of the software implementing the steps of the second embodiment of the present application, the MAC address of the terminal device, and/or information such as the similarity results calculated in step A.
The second embodiment of the present application provides a technical solution, in which a trained neural network model is used for target detection, and in the technical solution provided by the second embodiment of the present application, a terminal device may evaluate the neural network model and provide feedback information to a server, so that a background worker can better maintain a corresponding product. In addition, the second embodiment of the present application is the same as the first embodiment, and the efficiency of the user for obtaining the information related to the biological growth process can also be improved to some extent.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
EXAMPLE III
In the third embodiment of the present application, there is provided a biological information introduction apparatus, as shown in fig. 4, the biological information introduction apparatus 400 including:
a target detection module 401, configured to obtain an image to be detected, and perform target detection on the image to be detected;
a judging module 402, configured to judge whether a target organism is detected in the image to be detected;
an animation obtaining module 403, configured to, if a target living being is detected in the image to be detected, obtain an animation corresponding to a type of the target living being according to the type of the target living being included in the image to be detected, where the animation is used to show a growth process of the living being of the type;
and an animation playing module 404, configured to play the animation.
Optionally, the animation playing module 404 includes:
a growth stage acquisition unit configured to acquire a growth stage of the target organism included in the image to be detected;
and the playing unit is used for playing the animation, and when the animation is detected to be played to a target image frame, the target image frame is kept displayed within a preset time period until the animation is played, wherein the target image frame is an image frame in which the growth stage of the creature in the animation is consistent with the growth stage of the creature in the image to be detected.
Optionally, the playing unit is specifically configured to:
playing the animation, keeping displaying the target image frame within a preset time when the animation is detected to be played to the target image frame until the animation is played, and pushing the growth stage of the target organism in the image to be detected and/or preset introduction information for introducing the target organism to a user when the target image frame is displayed, wherein the target image frame is an image frame in which the growth stage of the organism contained in the animation is consistent with the growth stage of the target organism in the image to be detected.
Optionally, the target detection module 401 is specifically configured to:
acquiring an image to be detected, performing target detection on the image to be detected by using the trained neural network model, and acquiring a detection result output by the neural network model, wherein the detection result is used for indicating whether a target organism is detected in the image to be detected or not, and indicating the type and the growth stage of the target organism contained in the image to be detected when the target organism is detected in the image to be detected;
accordingly, the animation obtaining module 403 is specifically configured to:
and if the target creature is detected in the image to be detected, acquiring animation corresponding to the type of the target creature according to the type of the target creature contained in the image to be detected indicated by the detection result.
Correspondingly, the growth stage acquiring unit is specifically configured to:
and acquiring the growth stage of the target organism contained in the image to be detected according to the detection result.
Optionally, the animation playing module 404 further includes:
a target frame determining unit for determining a target image frame in the animation according to a growth stage of the target living being included in the image to be detected;
correspondingly, the animation playing module 404 further includes:
a similarity calculation unit for calculating the similarity between the living body in the target image frame of the animation and the same organ of the target living body in the image to be detected;
the similarity judging unit is used for judging whether the calculated similarities of the same organ are all larger than a preset threshold value;
and a feedback information sending unit, configured to send feedback information to the first server if not all of the calculated similarities are greater than the preset threshold, where the feedback information is used to indicate that the detection result output by the trained neural network model is incorrect.
Optionally, animations corresponding to various types are pre-stored in the second server;
accordingly, the animation obtaining module 403 includes:
a search request unit, configured to send search request information to the second server to instruct the second server to perform an operation of searching for the animation corresponding to the type of the target organism;
and an animation receiving unit, configured to receive the animation corresponding to the type of the target organism returned by the second server.
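The request/receive exchange with the second server might look like the following sketch. The message format, field names, and the `"search_animation"` action string are all assumptions for illustration; the patent does not specify a wire protocol.

```python
import json

def build_search_request(organism_type: str) -> str:
    """Serialize a search request asking the second server for the
    animation matching the detected organism type (assumed JSON format)."""
    return json.dumps({"action": "search_animation", "type": organism_type})

def parse_animation_response(payload: str) -> dict:
    """Extract the animation descriptor from the server's JSON reply
    (assumed response shape)."""
    return json.loads(payload)["animation"]

request = build_search_request("butterfly")
# a reply the second server might return for the request above
reply = json.dumps({"animation": {"type": "butterfly", "frames": 120}})
animation = parse_animation_response(reply)
```

Pre-storing the animations on a server, as described above, keeps the terminal device thin: it only needs to send the type and play whatever descriptor comes back.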
It should be noted that the information interaction between the above devices/units, their execution processes, and other details are based on the same concept as the method embodiments of the present application; for their specific functions and technical effects, reference may be made to the method embodiments, and they are not described again here.
Example four
Fig. 5 is a schematic diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 5, the terminal device 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52 stored in the memory 51 and executable on the processor 50. The processor 50 implements the steps of the various method embodiments described above, such as steps S101 to S103 shown in fig. 1, when executing the computer program 52. Alternatively, the processor 50 executes the computer program 52 to implement the functions of the modules/units in the device embodiments, such as the modules 401 to 404 shown in fig. 4.
Illustratively, the computer program 52 may be divided into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution process of the computer program 52 in the terminal device 5. For example, the computer program 52 may be divided into an object detection module, a judgment module, an animation acquisition module, and an animation playing module, each module having the following specific functions:
acquiring an image to be detected, and carrying out target detection on the image to be detected;
judging whether a target organism is detected in the image to be detected or not;
if the target organism is detected in the image to be detected, the following steps are carried out:
and acquiring an animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected, and playing the animation, wherein the animation is used for showing the growth process of the type of the organism.
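The four modules listed above form a simple pipeline. A hedged sketch of that flow, with all function names and data shapes assumed for illustration (the modules are passed in as callables so each can be substituted independently):

```python
def introduce_organism(image, detect, get_animation, play):
    """Chain the four modules: object detection -> judgment ->
    animation acquisition -> animation playing."""
    result = detect(image)                      # object detection module
    if not result.get("detected"):              # judgment module
        return None                             # no target organism: nothing to play
    animation = get_animation(result["type"])   # animation acquisition module
    play(animation)                             # animation playing module
    return animation

# wiring the pipeline with stub modules
played = []
animation = introduce_organism(
    image=object(),
    detect=lambda img: {"detected": True, "type": "butterfly"},
    get_animation=lambda t: f"{t}_growth_animation",
    play=played.append,
)
```

Dividing the program this way mirrors the module split in the embodiment: each stage has a single responsibility, and the judgment step short-circuits the pipeline when no target organism is present.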
The terminal device may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of the terminal device 5 and does not constitute a limitation thereof: the terminal device may include more or fewer components than those shown, may combine some components, or may use different components; for example, it may also include input-output devices, network access devices, buses, etc.
The processor 50 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the terminal device 5, such as a hard disk or memory of the terminal device 5. The memory 51 may also be an external storage device of the terminal device 5, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the terminal device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the terminal device 5. The memory 51 is used for storing the computer program and other programs and data required by the terminal device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described again here.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above may be implemented by a computer program; the computer program may be stored in a computer readable storage medium, and when executed by a processor, implements the steps of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content included in the computer readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (7)
1. A biological information presentation method, comprising:
acquiring an image to be detected, and carrying out target detection on the image to be detected;
judging whether a target organism is detected in the image to be detected or not;
if a target organism is detected in the image to be detected, then:
acquiring an animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected, and playing the animation, wherein the animation is used for showing the growth process of the type of organism;
wherein the playing the animation comprises:
obtaining the growth stage of the target organism contained in the image to be detected;
playing the animation, and when it is detected that the animation has been played to a target image frame, keeping the target image frame displayed for a preset time period until the animation finishes playing, wherein the target image frame is an image frame in which the growth stage of the organism contained in the animation is consistent with the growth stage of the target organism in the image to be detected; the method for determining the target image frame comprises: the acquired animation corresponds to growth stage information, the growth stage information being used for indicating the growth stage of the organism contained in each image frame of the animation, and the target image frame in the animation is determined according to the growth stage information; the target image frame is one or more frames of images in the animation;
wherein the keeping of the display of the target image frame within the preset time period comprises:
and keeping displaying the target image frame within a preset time length, and pushing a growth stage of the target organism in the image to be detected and/or preset introduction information for introducing the target organism to a user when the target image frame is displayed.
2. The biological information presentation method according to claim 1, wherein said performing the object detection on the image to be detected comprises:
performing target detection on the image to be detected by using the trained neural network model, and acquiring a detection result output by the neural network model, wherein the detection result is used for indicating whether a target organism is detected in the image to be detected or not, and indicating the type and the growth stage of the target organism contained in the image to be detected when the target organism is detected in the image to be detected;
correspondingly, the obtaining of the animation corresponding to the type of the target living being according to the type of the target living being included in the image to be detected includes:
acquiring animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected indicated by the detection result;
correspondingly, the acquiring the growth stage of the target organism contained in the image to be detected comprises:
and acquiring the growth stage of the target organism contained in the image to be detected according to the detection result.
3. The biological information presentation method according to claim 2, further comprising, after said step of acquiring a growth stage of said target organism contained in said image to be detected:
determining the target image frame in the animation according to the growth stage of the target organism contained in the image to be detected;
accordingly, after the step of determining the target image frame in the animation, the method further comprises:
calculating the similarity between each organ of the organism in the target image frame of the animation and the same organ of the target organism in the image to be detected;
judging whether all of the calculated similarities of the same organs are greater than a preset threshold value;
and if not all of the calculated similarities are greater than the preset threshold, sending feedback information to a first server, wherein the feedback information is used for indicating that the detection result output by the trained neural network model is wrong.
4. A biological information presentation method according to any one of claims 1 to 3, wherein animation corresponding to each category is previously stored in the second server;
correspondingly, the acquiring of the animation corresponding to the type of the target living being includes:
sending search request information to the second server to instruct the second server to execute an operation of searching for the animation corresponding to the type of the target organism;
receiving the animation corresponding to the type of the target organism returned by the second server.
5. A biological information introduction apparatus, comprising:
the target detection module is used for acquiring an image to be detected and carrying out target detection on the image to be detected;
the judging module is used for judging whether a target organism is detected in the image to be detected or not;
the animation acquisition module is used for acquiring an animation corresponding to the type of the target organism according to the type of the target organism contained in the image to be detected if the target organism is detected in the image to be detected, wherein the animation is used for displaying the growth process of the type of organism;
the animation playing module is used for playing the animation;
wherein, the animation playing module comprises:
a growth stage acquisition unit configured to acquire a growth stage of the target organism included in the image to be detected;
the playing unit is used for playing the animation, and when it is detected that the animation has been played to a target image frame, keeping the target image frame displayed for a preset time period until the animation finishes playing, wherein the target image frame is an image frame in which the growth stage of the organism in the animation is consistent with the growth stage of the target organism in the image to be detected; the method for determining the target image frame comprises: the acquired animation corresponds to growth stage information, the growth stage information being used for indicating the growth stage of the organism contained in each image frame of the animation, and the target image frame in the animation is determined according to the growth stage information; the target image frame is one or more frames of images in the animation;
wherein the keeping of the display of the target image frame within the preset time period comprises:
and keeping displaying the target image frame within a preset time length, and pushing a growth stage of the target organism in the image to be detected and/or preset introduction information for introducing the target organism to a user when the target image frame is displayed.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811568200.1A CN109670454B (en) | 2018-12-21 | 2018-12-21 | Biological information introduction method, biological information introduction device and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109670454A CN109670454A (en) | 2019-04-23 |
CN109670454B true CN109670454B (en) | 2020-09-25 |
Family
ID=66145707
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811568200.1A Active CN109670454B (en) | 2018-12-21 | 2018-12-21 | Biological information introduction method, biological information introduction device and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109670454B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113457136B (en) * | 2021-06-29 | 2022-05-31 | 完美世界(北京)软件科技发展有限公司 | Game animation generation method and device, storage medium and terminal |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105141986A (en) * | 2015-08-06 | 2015-12-09 | 小米科技有限责任公司 | Method and device for video processing and method and device for image recognition |
CN106372106A (en) * | 2016-08-19 | 2017-02-01 | 无锡天脉聚源传媒科技有限公司 | Method and apparatus for providing video content assistance information |
CN106485609A (en) * | 2016-11-07 | 2017-03-08 | 宇龙计算机通信科技(深圳)有限公司 | Point folk prescription method based on augmented reality, system and equipment |
CN107748750A (en) * | 2017-08-30 | 2018-03-02 | 百度在线网络技术(北京)有限公司 | Similar video lookup method, device, equipment and storage medium |
CN107967110A (en) * | 2017-11-30 | 2018-04-27 | 广东小天才科技有限公司 | Playing method, playing device, electronic equipment and computer readable storage medium |
CN108170263A (en) * | 2017-11-26 | 2018-06-15 | 安徽省司尔特肥业股份有限公司 | A kind of crop growth experiencing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||