
CN111914685B - Sow oestrus detection method and device, electronic equipment and storage medium - Google Patents

Sow oestrus detection method and device, electronic equipment and storage medium

Info

Publication number
CN111914685B
CN111914685B (Application CN202010677045.8A)
Authority
CN
China
Prior art keywords
sow
oestrus
model
video
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010677045.8A
Other languages
Chinese (zh)
Other versions
CN111914685A (en)
Inventor
鞠铁柱
王宇华
耿科
张震
申光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaolongqianxing Technology Co ltd
Original Assignee
Beijing Xiaolongqianxing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaolongqianxing Technology Co ltd filed Critical Beijing Xiaolongqianxing Technology Co ltd
Priority to CN202010677045.8A priority Critical patent/CN111914685B/en
Publication of CN111914685A publication Critical patent/CN111914685A/en
Application granted granted Critical
Publication of CN111914685B publication Critical patent/CN111914685B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/46Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P60/00Technologies relating to agriculture, livestock or agroalimentary industries
    • Y02P60/80Food processing, e.g. use of renewable energies or variable speed drives in handling, conveying or stacking
    • Y02P60/87Re-use of by-products of food processing for fodder production

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a sow oestrus detection method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring a sow monitoring video; identifying, from the monitoring video, whether the sow exhibits local oestrus behaviors; acquiring video images from the monitoring video and extracting contours from them to obtain the sow contour; judging the sow posture from the contour and excluding the sleeping posture; and, if local oestrus behaviors are identified, starting a timer from the moment the sow enters a standing-still state, based on the video images from which the sleeping posture has been excluded, judging whether the duration of the standing-still state is greater than a preset oestrus threshold, and, if so, determining that the sow is in oestrus. The sow oestrus detection method of the embodiment of the invention enables automatic and accurate detection of sow oestrus.

Description

Sow oestrus detection method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of computers, in particular to a sow oestrus detection method and device, electronic equipment and a storage medium.
Background
Whether a sow is in oestrus is a key factor in whether it can be successfully bred, conceive and farrow. Common sow oestrus identification methods include the mental-state identification method, the vulva-change identification method, the back-pressure identification method, the external observation method, the vaginal-mucus examination method, the standing-reflex examination method and the boar oestrus-test method.
In pig farms at home and abroad, the back-pressure identification method is mainly used for sow oestrus identification: an experienced professional breeder judges the oestrus behavior of the sow. The breeder presses the sow's back or hip by hand, or allows a test boar to mount the sow; if the sow stands still, it is in the mating period.
However, with the concentrated, large-scale development of the breeding industry, the shortage of professional breeders has become a major problem facing modern pig production, so a method for automatically detecting sow oestrus is urgently needed.
Disclosure of Invention
Aiming at the problems in the prior art, the embodiment of the invention provides a sow oestrus detection method, a sow oestrus detection device, electronic equipment and a storage medium.
Specifically, the embodiment of the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a sow estrus detection method, including:
acquiring a sow monitoring video; the sow monitoring video is obtained under boar heat-checking conditions;
identifying whether the sow has oestrus local behaviors according to the sow monitoring video;
acquiring a video image according to the sow monitoring video, and extracting the outline of the video image to acquire the sow outline;
judging the sow posture according to the sow outline, and eliminating the sow sleep posture;
if the sow is identified as exhibiting local oestrus behaviors, starting a timer, based on the video images from which the sleeping posture has been excluded, from the moment the sow enters a standing-still state; judging whether the duration of the standing-still state is greater than a preset oestrus threshold; if so, determining that the sow is in oestrus; otherwise, re-determining the starting moment of the standing-still state and repeating the judgment of whether its duration exceeds the preset oestrus threshold.
Further, identifying whether a sow has oestrus local behaviors according to the sow monitoring video comprises:
the method comprises the steps of collecting a first video in a preset time period before oestrus of oestrus sows and a second video in a preset time period before oestrus of non-oestrus sows in an oestrus detection stage;
Dividing the first video and the second video into video frames, obtaining training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether oestrus local behaviors exist or not as sample tag data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judging model; the initial machine learning model comprises a CNN model and an LSTM model, wherein the CNN model is used for extracting features, and the LSTM model is used for performing classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model by using a test data set, and adjusting parameters of the preliminary sow oestrus local behavior judgment model according to a test result until a predicted result meets a preset accuracy condition to obtain an optimal sow oestrus local behavior judgment model;
inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judging model, and judging whether the sow has oestrus local behaviors or not according to the output result of the optimal sow oestrus local behavior judging model.
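The CNN-plus-LSTM classifier described in the steps above can be sketched as follows. This is a minimal illustration, not the patented implementation: the backbone layers, feature dimension, frame count and input size are all assumptions made for the sketch, with the CNN extracting per-frame features and the LSTM classifying the sequence, as the text describes.

```python
import torch
import torch.nn as nn

class OestrusBehaviourNet(nn.Module):
    """Per-frame CNN features fed to an LSTM for video-clip classification."""
    def __init__(self, feat_dim=64, hidden=128, n_classes=2):
        super().__init__()
        # Small CNN feature extractor (a stand-in for the unspecified backbone).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # LSTM performs classification learning on the per-frame features.
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, clips):            # clips: (batch, frames, 3, H, W)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.lstm(feats)     # final hidden state summarizes the clip
        return self.head(h[-1])          # (batch, n_classes) logits

model = OestrusBehaviourNet()
logits = model(torch.randn(2, 8, 3, 64, 64))  # 2 clips of 8 frames each
```

In such a setup the two output classes would correspond to the sample labels named in the text: local oestrus behavior present or absent.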
Further, acquiring a video image according to the sow monitoring video, and extracting the outline of the video image to acquire the sow outline, including:
collecting a plurality of images of sows under different conditions of feeding, sleeping and oestrus in advance to serve as training samples;
marking the outline of the sow on each image in the training sample by using an open source marking tool Labelme to form a training set and a testing set;
taking the images in the training set as sample input data, taking the corresponding contour marking results as sample output data, and training based on a Mask-Rcnn network model to obtain a preliminary sow contour extraction model;
testing the preliminary sow profile extraction model by utilizing the image in the test set and the corresponding profile marking result, and adjusting the preliminary sow profile extraction model according to the test result until the predicted result meets the preset accuracy condition to obtain an optimal sow profile extraction model;
and taking the sow monitoring video as input, preprocessing each video image in the sow monitoring video, inputting the preprocessed video image into the optimal sow profile extraction model, and acquiring the sow profile according to the output of the optimal sow profile extraction model.
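The text does not specify how a contour is read out of the Mask-Rcnn output; a common post-processing step is to take the boundary of the predicted instance mask. The sketch below is an illustrative, dependency-light version of that step (a real pipeline might instead run `cv2.findContours` on the mask):

```python
import numpy as np

def mask_to_contour(mask):
    """Return boundary pixels of a binary mask (a stand-in for extracting the
    sow contour from a Mask-RCNN instance mask)."""
    mask = mask.astype(bool)
    # A pixel lies on the contour if it is foreground but has a background
    # 4-neighbour (the image border counts as background).
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    contour = mask & ~interior
    return np.argwhere(contour)  # (N, 2) array of (row, col) points

# Toy "sow mask": a filled 6x7 rectangle, whose boundary has 22 pixels.
mask = np.zeros((10, 12), dtype=np.uint8)
mask[2:8, 3:10] = 1
pts = mask_to_contour(mask)
```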
Further, the sow posture is judged according to the sow outline, and the sow sleep posture is eliminated, including:
collecting contour images of sows in different standing and sleeping postures as training samples;
taking the gesture marking result of the training sample as tag data to form a training set and a testing set; the gesture marking result comprises a standing gesture and a sleeping gesture;
taking the outline image in the training set as sample input data, taking the corresponding tag data as sample output data, and training based on the LeNet network model to obtain a preliminary sow gesture detection model; the preliminary sow posture detection model can exclude sows in sleep postures in the outline image;
testing the preliminary sow gesture detection model by utilizing the outline image in the test set and the corresponding label data, and adjusting the preliminary sow gesture detection model according to the test result until the prediction result meets the preset accuracy condition to obtain an optimal sow gesture detection model;
inputting the sow outline to the optimal sow posture detection model, and eliminating the sow sleep posture according to the output result of the optimal sow posture detection model.
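A LeNet-style network for the two posture classes (standing vs. sleeping) might look roughly like the sketch below; the input resolution, channel counts and layer sizes are assumptions, since the text only names the LeNet architecture.

```python
import torch
import torch.nn as nn

# LeNet-style posture classifier: a grayscale contour image in,
# standing/sleeping logits out. Sizes are illustrative assumptions.
posture_net = nn.Sequential(
    nn.Conv2d(1, 6, 5), nn.ReLU(), nn.MaxPool2d(2),   # 1x32x32 -> 6x14x14
    nn.Conv2d(6, 16, 5), nn.ReLU(), nn.MaxPool2d(2),  # -> 16x5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
    nn.Linear(120, 84), nn.ReLU(),
    nn.Linear(84, 2),                                  # standing / sleeping
)

logits = posture_net(torch.randn(4, 1, 32, 32))  # batch of 4 contour images
```

Contour images classified as the sleeping posture would then simply be dropped before the standing-still timing step.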
Further, the step of, when the sow is identified as exhibiting local oestrus behaviors, starting a timer from the moment the sow enters a standing-still state based on the video images from which the sleeping posture has been excluded, judging whether the duration of the standing-still state is greater than the preset oestrus threshold, determining that the sow is in oestrus if so, and otherwise re-determining the starting moment of the standing-still state and repeating the judgment, comprises:
judging whether local oestrus behaviors exist; if so, based on the video images from which the sleeping posture has been excluded, once the sow is judged to have entered a standing-still state, performing sample modeling on the first acquired sow contour image and starting a timer;
setting each newly acquired sow contour image as the foreground image and the last modeled sow contour image as the background image; comparing the contours of the foreground and background images to judge whether the contour has changed; if the contour change value is greater than a preset standing judgment threshold, re-modeling and restarting the timer; if the contour change value is less than or equal to the preset standing judgment threshold, continuing to accumulate time and judging whether the accumulated time has reached the preset oestrus threshold; if it has, sending the result to a preset terminal to report that sow oestrus has been detected; if not, continuing to read sow contour images and repeating the process until it ends.
Further, the preset estrus threshold is 120-150 seconds.
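The timing logic described above, re-model and restart the timer when the contour change exceeds the standing judgment threshold, otherwise accumulate time until the oestrus threshold is reached, can be sketched as a small state machine. The contour-change metric and the concrete threshold values here are illustrative assumptions:

```python
class StandingTimer:
    """Accumulates standing-still time; re-models when the contour changes."""
    def __init__(self, stand_threshold=0.1, oestrus_threshold=150.0):
        self.stand_threshold = stand_threshold      # max change while still
        self.oestrus_threshold = oestrus_threshold  # seconds of stillness needed
        self.background = None                      # last modeled contour
        self.elapsed = 0.0

    def update(self, contour, dt):
        """Feed one new contour observed dt seconds after the previous one.
        Returns True once accumulated still time exceeds the oestrus threshold."""
        if self.background is None:
            self.background = contour               # initial sample modeling
            return False
        if contour_change(self.background, contour) > self.stand_threshold:
            self.background = contour               # re-model, restart timing
            self.elapsed = 0.0
            return False
        self.elapsed += dt                          # still standing: accumulate
        return self.elapsed > self.oestrus_threshold

def contour_change(a, b):
    """Illustrative change metric: fraction of non-overlapping contour points."""
    a, b = set(map(tuple, a)), set(map(tuple, b))
    return 1.0 - len(a & b) / max(len(a | b), 1)

timer = StandingTimer(oestrus_threshold=150.0)
still = [(0, 0), (0, 1), (1, 0)]
timer.update(still, 1.0)                 # first frame: model the background
for _ in range(151):
    in_oestrus = timer.update(still, 1.0)  # unchanged contour for 151 s
```

After 151 seconds of an unchanged contour the timer reports oestrus; a single frame with a large contour change resets the accumulated time to zero, matching the re-modeling step in the text.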
In a second aspect, an embodiment of the present invention further provides a sow estrus detection apparatus, including:
the acquisition module is used for acquiring the sow monitoring video; the sow monitoring video is obtained under boar heat-checking conditions;
the identification module is used for identifying whether the sow has oestrus local behaviors according to the sow monitoring video;
the extraction module is used for acquiring a video image according to the sow monitoring video, and extracting the outline of the video image to acquire the sow outline;
the elimination module is used for judging the sow posture according to the sow outline and eliminating the sow sleep posture;
the detection module is used for, when local oestrus behavior of the sow is identified, starting a timer from the moment the sow enters a standing-still state based on the video images from which the sleeping posture has been excluded, judging whether the duration of the standing-still state is greater than the preset oestrus threshold, determining that the sow is in oestrus if so, and otherwise re-determining the starting moment of the standing-still state and repeating the judgment.
Further, the identification module is specifically configured to:
the method comprises the steps of collecting a first video in a preset time period before oestrus of oestrus sows and a second video in a preset time period before oestrus of non-oestrus sows in an oestrus detection stage;
dividing the first video and the second video into video frames, obtaining training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether oestrus local behaviors exist or not as sample tag data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judging model; the initial machine learning model comprises a CNN model and an LSTM model, wherein the CNN model is used for extracting features, and the LSTM model is used for performing classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behavior judgment model by using a test data set, and adjusting parameters of the preliminary sow oestrus local behavior judgment model according to a test result until a predicted result meets a preset accuracy condition to obtain an optimal sow oestrus local behavior judgment model;
inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judging model, and judging whether the sow has oestrus local behaviors or not according to the output result of the optimal sow oestrus local behavior judging model.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the steps of the sow oestrus detection method according to the first aspect are implemented when the processor executes the program.
In a fourth aspect, embodiments of the present invention also provide a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the sow oestrus detection method according to the first aspect.
According to the above technical schemes, the sow oestrus detection method and device, electronic equipment and storage medium provided by the embodiment of the invention first identify local oestrus behavior from the sow monitoring video images and, once local oestrus behavior is confirmed, judge the standing-still duration. Specifically: a sow contour is obtained from the monitoring video images; the sleeping posture is then excluded on the basis of that contour; once the sow is determined to be in a non-sleeping, standing-still state, it is judged whether the duration of that state exceeds the preset oestrus threshold; if so, the sow is judged to be in oestrus; otherwise the timing process is repeated. Thus, by collecting the sow monitoring video and processing it automatically in the background, the method can judge whether a sow is in oestrus, realizing automatic detection at low cost; and because the special local actions of oestrus assist the judgment, identification accuracy is high.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a sow oestrus detection method according to an embodiment of the invention;
fig. 2 is a schematic diagram of a processing procedure of a sow oestrus detection method according to an embodiment of the invention;
fig. 3 is a schematic diagram of a sow monitoring video acquisition process according to an embodiment of the invention;
fig. 4 is a schematic diagram of a sow monitoring video acquisition device according to an embodiment of the invention;
fig. 5 is a schematic diagram for identifying local oestrus behaviors of a sow according to an embodiment of the invention;
fig. 6 is a schematic diagram of a sow sleep posture elimination process according to an embodiment of the invention;
fig. 7 is a schematic diagram of a sow profile extraction process according to an embodiment of the invention;
fig. 8 is a schematic diagram of a sow image provided by an embodiment of the invention;
FIG. 9 is a diagram illustrating a sow profile segmentation according to an embodiment of the present invention;
Fig. 10 is a schematic diagram of a sow standing recognition and estrus monitoring process according to an embodiment of the invention;
FIG. 11 is a schematic diagram of a sow estrus detection device according to an embodiment of the invention;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Whether a sow is in oestrus is a key factor in whether it can be successfully bred, conceive and farrow. Common sow oestrus identification methods include the mental-state identification method, the vulva-change identification method, the back-pressure identification method, the external observation method, the vaginal-mucus examination method, the standing-reflex examination method and the boar oestrus-test method.
In pig farms at home and abroad, the back-pressure identification method is mainly used for sow oestrus identification: an experienced professional breeder judges the oestrus behavior of the sow. The breeder presses the sow's back or hip by hand, or allows a test boar to mount the sow; if the sow stands still, it is in the mating period. However, with the concentrated, large-scale development of the breeding industry, the shortage of professional breeders has become a major problem facing modern pig production, so a method for automatically detecting sow oestrus is urgently needed.
At present, infrared sensors are mostly used at home and abroad to judge sow oestrus. For example, S.C. Scolari et al. used infrared thermal imaging to detect changes in body-surface and vulvar skin temperature over the sow's oestrus cycle, and found that the vulvar temperature rises significantly at the start of oestrus and falls significantly before ovulation, in principle making oestrus judgment from vulvar temperature changes possible; however, infrared thermal imaging is sensitive to many factors, such as wind speed, pig-house humidity, ambient temperature, and the sow's posture and position at the time of measurement, so it is not suitable as a judgment basis. In addition, Zhang Ziyun et al. measured the subcutaneous body temperature of sows of different breeds with a novel electronic chip and explored the body-temperature patterns of replacement sows in the non-oestrus and oestrus periods. However, a pig's temperature also varies by about 1 °C over a day, and individual differences cause body-temperature differences; moreover, contact temperature-acquisition devices cause a stress reaction in the sow, while non-contact devices are expensive, constrain the acquisition distance and show weak temperature contrast, so body temperature is also unsuitable as a judgment basis.
In order to solve the problems, the embodiment of the invention provides a sow oestrus detection method and device based on video behavior data identification, and the embodiment of the invention not only can realize automatic detection of sow oestrus under the assistance of local behaviors, but also has low cost, high efficiency and high detection accuracy. The scheme provided by the invention will be explained in detail by specific examples.
Fig. 1 shows a flowchart of a sow oestrus detection method according to an embodiment of the present invention. As shown in fig. 1, the sow oestrus detection method provided by the embodiment of the invention comprises the following steps:
step 101: acquiring a sow monitoring video; the sow monitoring video is obtained under the condition of boar condition searching;
step 102: identifying whether the sow has oestrus local behaviors according to the sow monitoring video;
In this step, local oestrus behaviors of the sow are identified from the sow monitoring video images. Local oestrus behaviors include: during oestrus, the sow eats little or nothing, its back is slightly arched, its head is often held straight forward, its ears are erect, and it behaves in a dull, listless manner.
Step 103: acquiring a video image according to the sow monitoring video, and extracting the outline of the video image to acquire the sow outline;
Step 104: judging the sow posture according to the sow outline, and eliminating the sow sleep posture;
Step 105: if the sow is identified as exhibiting local oestrus behaviors, starting a timer, based on the video images from which the sleeping posture has been excluded, from the moment the sow enters a standing-still state; judging whether the duration of the standing-still state is greater than a preset oestrus threshold; if so, determining that the sow is in oestrus; otherwise, re-determining the starting moment of the standing-still state and repeating the judgment.
In this embodiment, to acquire the sow monitoring video, an image monitoring device is arranged above the sow pen, for example an RGB camera mounted above the pen to collect video in real time for oestrus analysis. It should also be noted that, for sow oestrus monitoring, a dedicated heat-checking area is defined, a boar-cart track is laid, and a boar cart is used for heat checking: the cart is driven forward at a uniform speed so that the contact time between the boar cart and each sow is about two minutes. When the boar is used for heat checking, the sow's standing response is the main judgment basis (the sow remains motionless for more than two minutes); standing recognition is the core of the scheme, and its accuracy directly affects the accuracy of the oestrus judgment, while the special local behaviors of an oestrus sow serve as an auxiliary means of detection. An oestrus warning lamp is arranged on each pen, and when oestrus is confirmed the red lamp is lit. The specific pen of the oestrus sow is transmitted to the server side and pushed to the manager's mobile terminal.
In this embodiment, the sow monitoring video is obtained from the image monitoring device; local oestrus behavior is then identified from the read monitoring video images; the sow contour is obtained by contour extraction; the sow posture is judged from the contour; and after the sow is determined to be in a non-sleeping state, a timer is started from the moment the sow enters a standing-still state. Whether the duration of the standing-still state exceeds the preset oestrus threshold is then judged: if it does, the sow is judged to be in oestrus; otherwise, the starting moment of the standing-still state is re-determined and the judgment is repeated.
In this embodiment, it should be noted that step 102 may be omitted from the sow oestrus detection method; however, because the auxiliary judgment based on local oestrus behavior is then missing, the preset oestrus threshold needs to be set longer, for example 180 s, to maintain the accuracy of the oestrus judgment. If step 102 is retained, the accuracy of the oestrus judgment is higher, and the standing-time criterion can be reduced, that is, the preset oestrus threshold can be set shorter, for example 150 s, to speed up the oestrus judgment.
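The two-tier threshold described in this paragraph (a longer threshold without the behavioral cue, a shorter one with it) reduces to a one-line selection; the 180 s and 150 s values are the examples given in the text:

```python
def oestrus_threshold(local_behavior_detected,
                      base_threshold=180.0, assisted_threshold=150.0):
    """Return the standing-still threshold in seconds.

    When local oestrus behaviors have already been identified, the standing
    criterion can be relaxed (shorter threshold) without losing accuracy;
    the concrete values are the examples from the description.
    """
    return assisted_threshold if local_behavior_detected else base_threshold
```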
In this embodiment, it should also be noted that, with local oestrus behaviors as an auxiliary cue, detecting oestrus by judging from the monitoring video images whether the sow stands still for a long time has, compared with sensor-based temperature detection, lower cost, a higher degree of automation, no stress reaction and higher detection accuracy, so the method can be well popularized in practice.
According to the technical scheme, the sow oestrus detection method provided by the embodiment of the invention firstly identifies local oestrus behaviors of the sow according to the sow monitoring video image, for example: during oestrus, the sow can suffer from the phenomenon of little or no feeding, the back is slightly arched, the head is often straight to the front, the ears are erected, and the pig only acts in a dull state and the like. Under the condition that the local oestrus behavior of the sow is determined, the static standing time length is judged, and the method specifically comprises the following steps: and acquiring a sow outline according to the sow monitoring video image, then removing the sow sleeping posture according to the sow outline, judging whether the duration of the sow standing state is greater than a preset oestrus threshold value when the sow posture non-sleeping posture is determined and the sow standing state is in the standing state, if so, judging oestrus of the sow, otherwise, repeating the process in a timing mode. Therefore, according to the sow oestrus detection method provided by the embodiment of the invention, the monitoring video of the sow is collected, and then whether the sow oestrus occurs or not can be judged through background automatic processing, so that the automatic detection of the sow oestrus is realized, the required cost is lower, and meanwhile, the oestrus identification accuracy is higher because the oestrus is assisted and judged according to the special local action of the oestrus. In addition, by collecting the monitoring video of the sow and processing the monitoring video by a computer so as to detect whether the sow is in oestrus, compared with other modes of detecting by adopting a sensor, the method has the advantages of low cost, high efficiency and no damage, and can also avoid the stress reaction of the sow caused by adopting the sensor monitoring method. 
In addition, computer-based detection determines the oestrus time more accurately, providing a precise basis for artificial insemination, improving the first-service conception rate, and reducing the disease-transmission risk brought by human involvement.
As shown in fig. 2, the present embodiment provides a sow oestrus detection method based on video behaviour recognition. A real-time sow monitoring video is obtained through pig-farm cameras. On the one hand, the monitoring video is compared with a sow oestrus behaviour model to determine whether the sow shows the characteristic behaviours of an oestrus sow; if they are present, the standing-still criterion for judging oestrus is lowered, otherwise oestrus is judged by the normal standard. The sow oestrus behaviour model is obtained by deep learning on video data from the half hour before oestrus of a large number of oestrus sows. On the other hand, a boar (or an oestrus-inducing agent) is introduced during the heat-checking stage; the sow contour is recognized from the monitoring video by a MASK-RCNN-based contour recognition model, the sleeping posture of the sow is excluded, and the threshold is then set according to the earlier behaviour: if the earlier local oestrus behaviour is obvious, the stillness threshold (also called the oestrus threshold) is adjusted downward (for example from 180 seconds to 150 seconds); otherwise the normal standard stillness threshold is used. The standing time is counted from the moment the sow in the video is judged to be standing still. With the sow oestrus detection method provided by the embodiment of the invention, the monitoring video of the sow is collected and whether the sow is in oestrus can be judged by automatic background processing, so that automatic detection of sow oestrus is realized and labour cost is reduced. Moreover, since the judgement combines the characteristic behaviours of sow oestrus with the standing-still expression, the recognition accuracy is high.
The following embodiments require video data both for judging whether a sow shows the characteristic local behaviours of oestrus and for detecting the oestrus phase. Accordingly, the sow video data acquisition process is described below.
Software required: video acquisition platform software. Hardware required: RGB camera; Raspberry Pi 4B+; high-performance computing and application service platform (1-2 GPU cards).
The video acquisition process is as follows: the RGB camera is connected to the Raspberry Pi 4B+, which runs the client acquisition program and sends the data to the server; after the server receives the data, the sow video data are stored locally.
The video acquisition module mainly adopts a C/S (client/server) structure. The client is responsible for capturing video images and then transmitting the encoded data to the server through socket communication; the server decodes and transcodes the received data for storage.
As shown in fig. 4, the acquisition end of the video acquisition software is divided into the following modules:
Communication module: responsible for the connection between the server and the client;
Data acquisition module: responsible for video capture;
Clock module: responsible for timing the acquisition device.
In this embodiment, the communication module is specifically configured to:
the communication module for video acquisition adopts socket communication. Since the server may be connected to several clients at the same time, each client uses the TCP protocol to probe for the server continuously and sends a connection request once the server is detected; after the connection succeeds, frames are taken from the queue and transmitted.
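As a rough illustration of this client, the sketch below (Python assumed; the length-prefix framing and retry interval are illustrative choices, not specified in the embodiment) probes for the server over TCP, then pulls frames from the acquisition queue and transmits them with a length prefix so the server can split the stream back into frames:

```python
import socket
import struct
import time
import queue

def pack_frame(jpeg_bytes: bytes) -> bytes:
    """Length-prefix a JPEG frame so the server can split the TCP stream."""
    return struct.pack("!I", len(jpeg_bytes)) + jpeg_bytes

def unpack_frame(buf: bytes):
    """Server side: return (frame, remaining buffer), or (None, buf) if incomplete."""
    if len(buf) < 4:
        return None, buf
    (length,) = struct.unpack("!I", buf[:4])
    if len(buf) < 4 + length:
        return None, buf
    return buf[4:4 + length], buf[4 + length:]

def connect_with_retry(host: str, port: int, interval: float = 2.0) -> socket.socket:
    """Probe for the server until it is reachable, then request a connection."""
    while True:
        try:
            return socket.create_connection((host, port))
        except OSError:
            time.sleep(interval)

def sender_loop(sock: socket.socket, frame_queue: "queue.Queue[bytes]") -> None:
    """Pull frames from the acquisition queue and transmit them."""
    while True:
        frame = frame_queue.get()
        if frame is None:          # sentinel: stop transmitting
            break
        sock.sendall(pack_frame(frame))
```

On the server side, `unpack_frame` would be applied repeatedly to the accumulated receive buffer to recover the frames before decoding and storage.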
In this embodiment, the data acquisition module is specifically configured to:
the data acquisition module uses OpenCV as the image processing tool. OpenCV provides a camera reading interface, so after the image parameters are set, images can be read frame by frame and stored in the acquisition queue.
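A sketch of such a capture loop is given below (Python with OpenCV assumed; the resolution, JPEG encoding and drop-oldest queue policy are illustrative assumptions). OpenCV is imported inside the capture function so the queue helper remains usable without it:

```python
import queue

def put_drop_oldest(q: "queue.Queue", item) -> None:
    """Keep the acquisition queue bounded: drop the oldest frame when full."""
    try:
        q.put_nowait(item)
    except queue.Full:
        q.get_nowait()
        q.put_nowait(item)

def capture_frames(frame_queue: "queue.Queue", width: int = 1280,
                   height: int = 720, device: int = 0) -> None:
    """Read frames from the camera with OpenCV and push them into the queue."""
    import cv2  # OpenCV's VideoCapture is the camera reading interface mentioned above
    cap = cv2.VideoCapture(device)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)     # set image parameters first,
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)   # then read frame by frame
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # JPEG-encode so the communication module can send compact bytes
            ok, jpeg = cv2.imencode(".jpg", frame)
            if ok:
                put_drop_oldest(frame_queue, jpeg.tobytes())
    finally:
        cap.release()
```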
In this embodiment, the clock module is specifically configured to:
the clock module mainly times the acquisition device. A flag variable accessible to both the acquisition and transmission modules is set in the program as the running mark, and the clock module controls the program by modifying this variable.
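A minimal sketch of such a shared run flag (Python assumed; the class name and the timed-window helper are illustrative additions, not named in the embodiment):

```python
import threading
import time

class AcquisitionClock:
    """The shared flag the clock module exposes: the acquisition and
    transmission loops poll `running`; the clock controls them by flipping it."""

    def __init__(self) -> None:
        self.running = threading.Event()

    def start(self) -> None:
        self.running.set()

    def stop(self) -> None:
        self.running.clear()

    def run_for(self, seconds: float) -> None:
        """Timed acquisition: keep the flag set for a fixed window."""
        self.start()
        threading.Timer(seconds, self.stop).start()
```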
Since image samples of the sow in different states are needed for the contour extraction and posture determination introduced in the subsequent embodiments, the sow image sample collection method and process are given below.
Software required: data acquisition platform software. Hardware required: RGB cameras (9); high-performance computing and application service platform (1-2 GPU cards).
Test data acquisition. Purpose: timed picture acquisition with RGB cameras as needed. Professional monitoring cameras (or professional RGB cameras) are used, and dust and moisture protection should be considered given the pigsty environment. Each pen is equipped with its own camera, acquiring at a fixed interval of 1 second, and the equipment is started before heat-checking (or is trigger-started). In addition, camera-to-pen matching is done before acquisition. With the cameras mounted above the pens (clear height 1.8-2 metres), an RGB camera placed above several sow pens can shoot a top view, as shown in fig. 3.
The specific data acquisition scheme is as follows:
1. Establish a data acquisition site
2. Select representative pigs (pens)
9 cameras above the pens (at heights of 1.8 m, 2.0 m and 2.2 m), each covering several pens of pigs (the exact number of pens depends on the coverage area);
For the posture determination of sows, data in the following states need to be acquired:
1) Non-sleep;
2) Sleeping;
Data are collected at feeding time, at sleeping time, and during the oestrus detection period (daytime only), with one picture taken every 2 seconds. If conditions do not allow, acquisition may run all day and the data can be screened manually afterwards.
Sampling period:
Daytime data are collected for 1 month according to the above requirements.
During data collection:
1) The daily sleep start time and end time are recorded (the data in these intervals are stored separately);
2) Picture naming rule: pen number plus month, day, minute and second.
Data set labelling:
An administrator enters the acquisition start command on the server. The camera is then called to take a photo every 2 seconds, and each photo is named with the pen number of its camera and the acquisition time. The acquired data are stored directly on the server side.
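A possible naming helper following this rule (the exact field order and zero-padding below are assumptions; the embodiment only lists the fields):

```python
from datetime import datetime

def photo_name(pen: int, when: datetime) -> str:
    """Name a photo with its pen number and acquisition time.
    The 'penNN-MMDDHHMMSS' layout here is an illustrative assumption."""
    return f"pen{pen:02d}-{when:%m%d%H%M%S}.jpg"
```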
Finally, the daily sleep-posture data set is cut out according to the recorded marks, i.e. the daily sleep start and end times, providing a data set for the posture-determination learning model.
Further, based on the content of the foregoing embodiment, in this embodiment, identifying whether there is a local estrus behavior of the sow according to the sow monitoring video includes:
collecting, in the oestrus detection stage, a first video covering a preset time period before oestrus for oestrus sows and a second video covering a preset time period before heat-checking for non-oestrus sows;
dividing the first video and the second video into video frames, obtaining training samples, and dividing the training samples into a training data set and a test data set;
Taking the training data set as sample input data, taking whether oestrus local behaviors exist or not as sample tag data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judging model; the initial machine learning model comprises a CNN model and an LSTM model, wherein the CNN model is used for extracting features, and the LSTM model is used for performing classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behaviour judgement model with the test data set, and adjusting its parameters according to the test results until the prediction results meet a preset accuracy condition, to obtain the optimal sow oestrus local behaviour judgement model;
inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judging model, and judging whether the sow has oestrus local behaviors or not according to the output result of the optimal sow oestrus local behavior judging model.
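The CNN-feature-extractor-plus-LSTM-classifier structure described in these steps can be sketched as follows (PyTorch assumed; the layer sizes, frame resolution and sequence length are illustrative, not taken from the embodiment):

```python
import torch
import torch.nn as nn

class OestrusBehaviourNet(nn.Module):
    """CNN features per frame, LSTM over the frame sequence, binary output
    (local oestrus behaviour present / absent). Sizes are illustrative."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (N, 32)
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, channels, height, width)
        b, t, c, h, w = clips.shape
        feats = self.cnn(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])                    # classify from last step
```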
In this embodiment, as shown in fig. 5, videos are first collected in the heat-checking stage: a video of the 20 minutes before oestrus for oestrus sows and a video of the 20 minutes before heat-checking for non-oestrus sows. The videos are then sampled into time sequences at one frame per 10 seconds to obtain training samples, from which a training data set and a test data set are generated. The training data set is fed into the CNN model for feature extraction and then into the LSTM model for classification learning. The sow oestrus local behaviour judgement model is tested with the test set and its parameters adjusted according to the test results until the prediction results meet a preset accuracy condition, giving the optimal model. The collected sow monitoring video is then input to this model, and whether the sow shows local oestrus behaviours is judged from its output. It should be noted that the preset accuracy condition here may be that the prediction accuracy exceeds a preset threshold, for example 90%.
Further, based on the content of the foregoing embodiment, in this embodiment, obtaining a video image according to the sow monitoring video, and performing contour extraction on the video image to obtain a sow contour includes:
collecting a plurality of images of sows under different conditions of feeding, sleeping and oestrus in advance to serve as training samples;
marking the outline of the sow on each image in the training sample by using an open source marking tool Labelme to form a training set and a testing set;
taking the images in the training set as sample input data, taking the corresponding contour marking results as sample output data, and training based on a Mask-Rcnn network model to obtain a preliminary sow contour extraction model;
testing the preliminary sow profile extraction model by utilizing the image in the test set and the corresponding profile marking result, and adjusting the preliminary sow profile extraction model according to the test result until the predicted result meets the preset accuracy condition to obtain an optimal sow profile extraction model;
and taking the sow monitoring video as input, preprocessing each video image in the sow monitoring video, inputting the preprocessed video image into the optimal sow profile extraction model, and acquiring the sow profile according to the output of the optimal sow profile extraction model.
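Once the segmentation model outputs a binary instance mask, the sow contour can be taken as the mask's boundary pixels. A numpy-only sketch of that step (an illustrative stand-in for, e.g., running cv2.findContours on the model output):

```python
import numpy as np

def mask_to_contour(mask: np.ndarray) -> np.ndarray:
    """Return the (row, col) boundary pixels of a binary instance mask:
    mask pixels with at least one 4-neighbour outside the mask."""
    m = mask.astype(bool)
    interior = np.zeros_like(m)
    interior[1:-1, 1:-1] = (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
                            & m[1:-1, :-2] & m[1:-1, 2:])
    return np.argwhere(m & ~interior)
```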
In this embodiment, a number of images of sows feeding, sleeping and in oestrus are collected in advance by the Raspberry Pi device as training samples. As shown in figs. 8 and 9, the sow outline in each training image is marked with the open-source annotation tool Labelme to form a training set and a test set. The images in the training set are used as sample input data and the corresponding outline annotations as sample output data to train a Mask-Rcnn network model, yielding a preliminary sow contour extraction model. This model is tested with the images and outline annotations in the test set and adjusted according to the test results until the prediction results meet the preset accuracy condition (for example, accuracy above 90%), giving the optimal sow contour extraction model. Each sow monitoring image is preprocessed (denoising, histogram equalization and sharpening) and input to the optimal sow contour extraction model, which extracts the sow contour.
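The preprocessing chain just mentioned (denoising, histogram equalization, sharpening) can be sketched with numpy alone; the 3x3 mean filter for denoising and the Laplacian-style sharpening kernel are illustrative choices, as the embodiment does not name specific filters:

```python
import numpy as np

def equalize_hist(gray: np.ndarray) -> np.ndarray:
    """Histogram equalization on a uint8 grayscale image
    (a numpy stand-in for cv2.equalizeHist)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    nonzero = cdf > 0
    cdf_min = cdf[nonzero][0]
    lut = np.zeros(256, dtype=np.uint8)
    lut[nonzero] = np.round(
        (cdf[nonzero] - cdf_min) / (gray.size - cdf_min + 1e-9) * 255
    ).astype(np.uint8)
    return lut[gray]

def sharpen(gray: np.ndarray) -> np.ndarray:
    """3x3 Laplacian-style sharpening: 5*center minus the 4-neighbours."""
    img = gray.astype(np.float64)
    pad = np.pad(img, 1, mode="edge")
    out = 5 * img
    out -= pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]
    return np.clip(out, 0, 255).astype(np.uint8)

def preprocess(gray: np.ndarray) -> np.ndarray:
    """Denoise (3x3 mean), equalize, then sharpen, as in the embodiment."""
    h, w = gray.shape
    pad = np.pad(gray.astype(np.float64), 1, mode="edge")
    mean = sum(pad[r:r + h, c:c + w] for r in range(3) for c in range(3)) / 9.0
    return sharpen(equalize_hist(mean.astype(np.uint8)))
```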
In this embodiment, because of the shooting angle, the full outline of a sow in the middle of the view is captured, while only part of the outline of the sows at the sides is captured (for example, when a sow sleeps against the leftmost rail, the angle means some of its limbs are not captured). The outline of the head also differs between feeding and resting: the head is lowered when the sow eats or drinks, and the sleeping outline includes the limbs and differs from the resting outline. The sow contour therefore needs to be extracted first, and the standing-still state and its duration are then judged from the extracted contour.
In this embodiment, as shown in fig. 7, image recognition and contour extraction are performed by a Mask-Rcnn-based deep learning method. First, a number of images of sows feeding, sleeping and in oestrus are collected in advance through the Raspberry Pi device as training samples, and the sow outline in each image is marked with the open-source annotation tool Labelme to form a training set and a test set. Histogram equalization is applied to the experimental data to reduce the influence of uneven brightness. The Mask-Rcnn structural parameters are then set and the segmentation model is trained with the training samples to obtain the optimal sow image segmentation model.
In this embodiment, it should be noted that Mask-Rcnn has three outputs: the classification result, the coordinates of the predicted bounding box, and the object mask. The three output branches are parallel to each other; compared with other instance segmentation algorithms that segment first and classify afterwards, this parallel design is simple and efficient.
Further, based on the foregoing embodiment, in this embodiment, determining the sow posture according to the sow contour and excluding the sleeping posture comprises:
collecting contour images of sows in standing and sleeping postures as training samples;
using the posture annotations of the training samples as label data to form a training set and a test set, the posture annotations comprising a standing posture and a sleeping posture;
using the contour images in the training set as sample input data and the corresponding label data as sample output data, and training a LeNet network model to obtain a preliminary sow posture detection model, which can exclude sows in the sleeping posture from the contour images;
testing the preliminary sow posture detection model with the contour images and corresponding label data in the test set, and adjusting it according to the test results until the prediction results meet a preset accuracy condition, to obtain the optimal sow posture detection model;
inputting the sow contour into the optimal sow posture detection model and excluding the sleeping posture according to its output.
In this embodiment, contour images of sows in standing and sleeping postures are first collected as training samples, and label data for the standing and sleeping postures are produced to form a training set and a test set. The contour images in the training set are then used as sample input data and the corresponding posture annotations as sample output data to train a LeNet network model, giving a preliminary sow posture detection model. Finally, this model is tested with the contour images and posture annotations in the test set and adjusted according to the test results until the prediction results meet the preset accuracy condition (for example, accuracy above 90%), giving the optimal sow posture detection model. The sow contour is input to this model and the sleeping posture is excluded according to its output.
In this embodiment, since the sow's posture is mainly either standing or sleeping, and a long sleeping period would interfere with the judgement of oestrus behaviour, the sleeping posture must be excluded. As shown in fig. 6, a LeNet network is used for posture recognition to exclude the sleeping posture: sow pictures are first acquired and training samples prepared; the sleeping and non-sleeping postures of sows in different pens are then annotated to form a training set and a test set; finally, the LeNet network structure parameters are set and the recognition model is trained with the training samples to obtain the optimal sow posture detection model. It should be noted that the accuracy and speed of the LeNet network model are clearly superior to those of other small neural network models.
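A LeNet-style posture classifier of the kind described can be sketched as follows (PyTorch assumed; the 32x32 grayscale input size and exact layer widths are illustrative assumptions, since the embodiment gives no sizes):

```python
import torch
import torch.nn as nn

class LeNetPosture(nn.Module):
    """Classic LeNet layout over contour images: standing vs sleeping."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, 5), nn.Tanh(), nn.AvgPool2d(2),
            nn.Conv2d(6, 16, 5), nn.Tanh(), nn.AvgPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
            nn.Linear(120, 84), nn.Tanh(),
            nn.Linear(84, 2),          # class 0 = sleeping, class 1 = standing
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) grayscale contour images
        return self.classifier(self.features(x))
```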
Further, based on the foregoing embodiment, in this embodiment, if the sow shows a local oestrus behaviour, timing starts from the moment the sow is judged to be standing still in the video images from which the sleeping posture has been excluded; it is then judged whether the duration of the standing-still state exceeds the preset oestrus threshold, and if so the sow is judged to be in oestrus; otherwise the start of the standing-still state is determined anew and the judgement is repeated. This comprises:
judging whether a local oestrus behaviour is present; if so, from the moment the sow is judged to be standing still in the video images from which the sleeping posture has been excluded, building a sample model from the first acquired sow contour image and starting the timer;
setting every newly acquired sow contour image as the foreground image and the previously acquired one as the background image, comparing their contours and judging whether the contour has changed: if the contour change value exceeds a preset stillness judgement threshold, re-modelling and restarting the timer; if it is less than or equal to the threshold, continuing to accumulate time and judging whether the accumulated time has reached the preset oestrus threshold; if so, sending the result to a preset terminal to report that sow oestrus has been detected, and if not, continuing to read sow contour images and repeating the process until it ends.
In this embodiment, it is first judged whether the characteristic local behaviours of an oestrus sow are present. If so, a sample model is built from the first sow contour image acquired after the sow is judged to be standing still, and the timer is started. Every newly acquired sow contour image is set as the foreground image and the previous one as the background image; their contours are compared to judge whether a large change has occurred. If the contour change value exceeds the preset stillness judgement threshold, the model is rebuilt and the timer restarted; if it is less than or equal to the threshold, time continues to accumulate and it is judged whether the accumulated time has reached the preset oestrus threshold. If so, the result is sent to a preset terminal to report that sow oestrus has been detected; if not, sow contour images continue to be read and the process repeats until it ends.
In the present embodiment, as shown in fig. 10, a judgement method based on a contour background modelling algorithm is used. It is first judged whether the characteristic local behaviours of an oestrus sow are present. If so, a model is built from the first sow contour image acquired from the video, and the timer is started. Every newly acquired picture is set as the foreground and the previous one as the background; sow contour images are read repeatedly from the video stream and the foreground is compared with the background to judge whether a mutation has occurred (a mutation meaning that the difference between the two contours exceeds a preset standing-contour similarity value). If a mutation occurs, the foreground image is promoted to the background image, the model is rebuilt and the timer restarted; if the contour similarity of foreground and background stays within the preset stillness judgement threshold, time continues to accumulate and it is judged whether the accumulated time has reached the preset oestrus threshold. If so, the result is sent to a preset terminal to report that sow oestrus has been detected; if not, sow contour images continue to be read and the process repeats until it ends. The contour background modelling algorithm adapts well to on-site devices: by extracting contours and comparing their similarity, the influence of camera shake is handled well and recognition is faster.
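The timing logic of this contour-background-modelling judgement can be sketched as follows (Python; the `similarity` function is an assumption, e.g. a cv2.matchShapes-style difference where smaller means more alike, and the thresholds are illustrative):

```python
import time

def detect_standing_heat(contour_stream, similarity, still_threshold: float,
                         oestrus_seconds: float, clock=time.monotonic) -> bool:
    """Contour background modelling: the first contour becomes the background
    model and timing starts; each new contour is the foreground. A change above
    `still_threshold` (a mutation) promotes the foreground to background and
    restarts the timer; otherwise stillness time accumulates, and oestrus is
    reported once it reaches `oestrus_seconds`."""
    background = None
    start = None
    for contour in contour_stream:
        now = clock()
        if background is None or similarity(contour, background) > still_threshold:
            background = contour   # mutation: update background, re-model, restart timer
            start = now
        elif now - start >= oestrus_seconds:
            return True            # would notify the preset terminal here
    return False
```

When obvious earlier local oestrus behaviour has been found, `oestrus_seconds` would be passed as the lowered threshold (e.g. 150 instead of 180).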
In addition, it should be noted that in this embodiment, because oestrus is judged with the assistance of the characteristic local oestrus behaviours, the preset oestrus threshold may be set lower than in the general oestrus judgement method. For example, if local oestrus behaviours are present, a smaller oestrus threshold may be used, say reduced from 180 seconds to 120-150 seconds, to speed up recognition.
Fig. 11 shows a schematic structural diagram of a sow estrus detection device according to an embodiment of the invention. As shown in fig. 11, the sow estrus detection device provided in this embodiment includes: an acquisition module 21, an identification module 22, an extraction module 23, an exclusion module 24 and a detection module 25, wherein:
an acquisition module 21, configured to acquire a sow monitoring video, the sow monitoring video being obtained during heat-checking with a boar;
the identifying module 22 is used for identifying whether the sow has oestrus local behaviors according to the sow monitoring video;
the extracting module 23 is configured to obtain a video image according to the sow monitoring video, and perform contour extraction on the video image to obtain a sow contour;
the elimination module 24 is used for judging the sow posture according to the sow outline and eliminating the sow sleep posture;
the detection module 25 is configured to, when a local oestrus behaviour is identified, start timing from the moment the sow is judged to be standing still in the video images from which the sleeping posture has been excluded, judge whether the duration of the standing-still state exceeds the preset oestrus threshold, judge the sow to be in oestrus if so, and otherwise determine the start of the standing-still state anew and repeat the judgement.
Further, based on the content of the foregoing embodiment, in this embodiment, the identification module 22 is specifically configured to:
collecting, in the oestrus detection stage, a first video covering a preset time period before oestrus for oestrus sows and a second video covering a preset time period before heat-checking for non-oestrus sows;
dividing the first video and the second video into video frames, obtaining training samples, and dividing the training samples into a training data set and a test data set;
taking the training data set as sample input data, taking whether oestrus local behaviors exist or not as sample tag data, and training an initial machine learning model to obtain a preliminary sow oestrus local behavior judging model; the initial machine learning model comprises a CNN model and an LSTM model, wherein the CNN model is used for extracting features, and the LSTM model is used for performing classification learning based on feature extraction results of the CNN model;
testing the preliminary sow oestrus local behaviour judgement model with the test data set, and adjusting its parameters according to the test results until the prediction results meet a preset accuracy condition, to obtain the optimal sow oestrus local behaviour judgement model;
Inputting the acquired sow monitoring video into the optimal sow oestrus local behavior judging model, and judging whether the sow has oestrus local behaviors or not according to the output result of the optimal sow oestrus local behavior judging model.
The sow oestrus detection device provided by the embodiment of the invention can be used to execute the sow oestrus detection method provided by the above embodiments; its working principle and beneficial effects are similar, so they are not repeated here, and the details can be found in the description of the above embodiments.
In this embodiment, it should be noted that the modules of the apparatus of the embodiment of the present invention may be integrated together or deployed separately; they may be combined into one module or further split into several sub-modules.
Based on the same inventive concept, a further embodiment of the present invention provides an electronic device, see fig. 12, comprising in particular: a processor 301, a memory 302, a communication interface 303, and a communication bus 304;
wherein, the processor 301, the memory 302, and the communication interface 303 complete communication with each other through the communication bus 304;
the processor 301 is configured to invoke a computer program in the memory 302, and when executing the computer program the processor implements all the steps of the sow oestrus detection method described above. For example, the processor implements the following process when executing the computer program: acquiring a sow monitoring video, the video being obtained during heat-checking with a boar; identifying from the monitoring video whether the sow shows local oestrus behaviours; obtaining video images from the monitoring video and extracting the sow contour from them; determining the sow posture from the contour and excluding the sleeping posture; and, if a local oestrus behaviour is identified, starting timing from the moment the sow is judged to be standing still in the video images from which the sleeping posture has been excluded, judging whether the duration of the standing-still state exceeds the preset oestrus threshold, judging the sow to be in oestrus if so, and otherwise determining the start of the standing-still state anew and repeating the judgement.
It will be appreciated that the refinement and expansion functions that the computer program may perform are as described with reference to the above embodiments.
Based on the same inventive concept, a further embodiment of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements all the steps of the sow oestrus detection method described above. For example, the following process is implemented when the program is executed: acquiring a sow monitoring video, the video being obtained during heat-checking with a boar; identifying from the monitoring video whether the sow shows local oestrus behaviours; obtaining video images from the monitoring video and extracting the sow contour from them; determining the sow posture from the contour and excluding the sleeping posture; and, if a local oestrus behaviour is identified, starting timing from the moment the sow is judged to be standing still in the video images from which the sleeping posture has been excluded, judging whether the duration of the standing-still state exceeds the preset oestrus threshold, judging the sow to be in oestrus if so, and otherwise determining the start of the standing-still state anew and repeating the judgement.
It will be appreciated that the refined and extended functions that the computer program can implement are as described in the above embodiments.
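As an illustrative aside, the standing-state timing described in the preceding embodiments can be sketched in a few lines of Python. The `Frame` structure, its fields, and the example threshold value are assumptions made for the sketch, not part of the disclosed implementation:

```python
from collections import namedtuple

# Hypothetical per-frame record: timestamp in seconds and the posture
# model's standing/not-standing decision for that frame.
Frame = namedtuple("Frame", ["t", "standing"])

STANDING_THRESHOLD_S = 120  # preset oestrus threshold; the claims suggest 120-150 s


def detect_oestrus(frames, threshold_s=STANDING_THRESHOLD_S):
    """Return True once the sow remains in the standing state longer than threshold_s.

    Timing starts when the standing state is first observed; any break in the
    standing state resets the timer, mirroring the re-determination step above.
    """
    start = None
    for frame in frames:
        if frame.standing:
            if start is None:
                start = frame.t                 # standing state begins: start timing
            elif frame.t - start > threshold_s:
                return True                     # continuous standing exceeded threshold
        else:
            start = None                        # posture changed: re-determine start time
    return False
```

A sow standing continuously for longer than the threshold is flagged; a sow that lies down and stands again restarts the count from the new standing moment.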
Further, the logic instructions in the memory described above may be implemented in the form of software functional units and, when sold or used as a stand-alone product, stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present invention that in essence contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The apparatus embodiments described above are merely illustrative; units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiments of the present invention. Those of ordinary skill in the art can understand and implement the embodiments without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by means of hardware. Based on this understanding, the above technical solution, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, and comprising several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform the sow oestrus detection method according to the embodiments or some parts of the embodiments.
Moreover, in the present invention, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. The terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
Furthermore, in the present disclosure, descriptions of the terms "one embodiment," "some embodiments," "examples," "particular examples," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A sow oestrus detection method, comprising:
acquiring a sow monitoring video; the sow monitoring video is captured under boar heat-checking conditions;
identifying, from the sow monitoring video, whether the sow exhibits oestrus local behaviours; the oestrus local behaviours comprise the sow eating little or refusing feed during oestrus, a slightly arched back, the head raised and facing forward, erect ears, and the sow being in a dazed, standing-still state;
if the sow is identified as exhibiting oestrus local behaviours, acquiring video images from the sow monitoring video and performing contour extraction on the video images to obtain the sow contour;
determining the sow posture from the sow contour, excluding the sleep posture, and confirming that the sow is in a non-sleep posture and in the standing state;
starting timing, based on the video images with the sleep posture excluded, from the moment the sow is determined to be in the standing state, and judging whether the duration of the standing state exceeds a preset oestrus threshold: if it does, judging that the sow is in oestrus; otherwise, re-determining the start time of the standing state and repeating the judgement of whether the standing state exceeds the preset oestrus threshold; the video images with the sleep posture excluded are video images in which the sow is in a non-sleep posture and in the standing state.
2. The sow oestrus detection method according to claim 1, wherein identifying, from the sow monitoring video, whether the sow exhibits oestrus local behaviours comprises:
collecting, during an oestrus detection stage, a first video of oestrus sows within a preset time period before oestrus and a second video of non-oestrus sows;
splitting the first video and the second video into video frames to obtain training samples, and dividing the training samples into a training data set and a test data set;
training an initial machine learning model, using the training data set as sample input data and the presence or absence of oestrus local behaviours as sample label data, to obtain a preliminary sow oestrus local behaviour judgement model; the initial machine learning model comprises a CNN model for feature extraction and an LSTM model for classification learning based on the feature extraction results of the CNN model;
testing the preliminary sow oestrus local behaviour judgement model with the test data set, and adjusting its parameters according to the test results until the prediction results meet a preset accuracy condition, to obtain an optimal sow oestrus local behaviour judgement model;
inputting the acquired sow monitoring video into the optimal sow oestrus local behaviour judgement model, and judging whether the sow exhibits oestrus local behaviours according to the model's output.
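As an illustrative aside, the data-preparation step of claim 2 (building labelled samples from the two videos' frame sequences and dividing them into training and test sets) might look roughly like the sketch below. The `test_ratio`, the shuffling seed, and the data shapes are illustrative assumptions, not taken from the patent:

```python
import random


def split_frames(oestrus_clips, non_oestrus_clips, test_ratio=0.2, seed=0):
    """Build (sample, label) pairs from frame sequences and split them.

    oestrus_clips / non_oestrus_clips are lists of frame sequences; labels
    mark whether oestrus local behaviour is present (1) or absent (0), as in
    claim 2. test_ratio and seed are illustrative choices only.
    """
    samples = ([(clip, 1) for clip in oestrus_clips]
               + [(clip, 0) for clip in non_oestrus_clips])
    rng = random.Random(seed)        # fixed seed for a reproducible split
    rng.shuffle(samples)
    n_test = int(len(samples) * test_ratio)
    return samples[n_test:], samples[:n_test]   # training set, test set
```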
3. The sow oestrus detection method according to claim 1, wherein acquiring video images from the sow monitoring video and performing contour extraction on the video images to obtain the sow contour comprises:
collecting in advance a plurality of images of sows feeding, sleeping, and in oestrus as training samples;
annotating the sow contour in each image of the training samples with the open-source annotation tool Labelme, to form a training set and a test set;
training a Mask R-CNN network model, using the images in the training set as sample input data and the corresponding contour annotations as sample output data, to obtain a preliminary sow contour extraction model;
testing the preliminary sow contour extraction model with the images in the test set and their corresponding contour annotations, and adjusting the model according to the test results until the prediction results meet a preset accuracy condition, to obtain an optimal sow contour extraction model;
taking the sow monitoring video as input, preprocessing each video image in the sow monitoring video, inputting the preprocessed video images into the optimal sow contour extraction model, and obtaining the sow contour from the model's output.
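Claim 3 leaves the preprocessing step unspecified. A plausible minimal sketch is shown below; the centre crop, nearest-neighbour resize, target size, and [0, 1] scaling are all assumed choices, not details from the patent:

```python
import numpy as np


def preprocess_frame(frame, size=(256, 256)):
    """Illustrative preprocessing before contour extraction.

    Centre-crops the frame to a square, resizes it by nearest-neighbour
    sampling, and scales pixel values to [0, 1]. All choices are assumptions.
    """
    h, w = frame.shape[:2]
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    square = frame[top:top + side, left:left + side]
    ys = np.arange(size[0]) * side // size[0]    # nearest-neighbour row indices
    xs = np.arange(size[1]) * side // size[1]    # nearest-neighbour column indices
    resized = square[np.ix_(ys, xs)]
    return resized.astype(np.float32) / 255.0
```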
4. The sow oestrus detection method according to claim 1, wherein determining the sow posture from the sow contour and excluding the sleep posture comprises:
collecting contour images of sows in different standing and sleeping postures as training samples;
taking the posture annotations of the training samples as label data to form a training set and a test set; the posture annotations comprise a standing posture and a sleeping posture;
training a LeNet network model, using the contour images in the training set as sample input data and the corresponding label data as sample output data, to obtain a preliminary sow posture detection model; the preliminary sow posture detection model can exclude sows in a sleeping posture from the contour images;
testing the preliminary sow posture detection model with the contour images in the test set and their corresponding label data, and adjusting the model according to the test results until the prediction results meet a preset accuracy condition, to obtain an optimal sow posture detection model;
inputting the sow contour into the optimal sow posture detection model, and excluding the sleep posture according to the model's output.
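The posture-exclusion step of claim 4 relies on a trained LeNet model. As a hypothetical stand-in, a bounding-box aspect-ratio heuristic illustrates the same standing/sleeping filtering interface; the 1.8 ratio threshold is an invented example value, not from the patent:

```python
def bounding_box(contour):
    """Axis-aligned bounding box of a contour given as (x, y) points."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return min(xs), min(ys), max(xs), max(ys)


def classify_posture(contour, ratio_threshold=1.8):
    """Toy stand-in for the LeNet posture model.

    Calls a contour 'sleeping' when its bounding box is much wider than tall;
    the threshold is an illustrative assumption only.
    """
    x0, y0, x1, y1 = bounding_box(contour)
    w, h = x1 - x0, y1 - y0
    return "sleeping" if h == 0 or w / h > ratio_threshold else "standing"


def exclude_sleep_postures(contours, classify=classify_posture):
    """Keep only contours classified as standing, as in claim 4."""
    return [c for c in contours if classify(c) == "standing"]
```

A real system would replace `classify_posture` with the trained model's prediction function; only the filtering interface is what claim 4 describes.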
5. The sow oestrus detection method according to claim 1, wherein, if the sow is identified as exhibiting oestrus local behaviours, starting timing from the moment the sow is determined to be in the standing state based on the video images with the sleep posture excluded, judging whether the duration of the standing state exceeds a preset oestrus threshold, judging that the sow is in oestrus if it does, and otherwise re-determining the start time of the standing state and repeating the judgement, comprises:
judging whether oestrus local behaviours are present; if so, based on the video images with the sleep posture excluded, once the sow is determined to be in the standing state, performing sample modelling on the first acquired sow contour image and starting timing;
setting each newly acquired sow contour image as the foreground image and the previously acquired sow contour image as the background image, comparing the contours of the foreground and background images, and judging whether the contour has changed; if the contour change value is greater than a preset standing judgement threshold, re-modelling and restarting the timing; if the contour change value is less than or equal to the preset standing judgement threshold, continuing to accumulate the time and judging whether the accumulated time reaches the preset oestrus threshold; if it does, sending the result to a preset terminal to notify it that sow oestrus has been detected; if not, continuing to read sow contour images and repeating the above process until it ends.
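The foreground/background comparison loop of claim 5 can be sketched as follows. Here `contour_change` stands in for the contour-comparison metric, which the patent does not specify, and the contour stream is modelled as (timestamp, contour) pairs:

```python
def monitor_standing(contour_stream, contour_change, stand_threshold, oestrus_threshold_s):
    """Sketch of claim 5's loop.

    The first contour is the initial sample model; each new (foreground)
    contour is compared against the previous (background) one. A change above
    stand_threshold re-models and restarts timing; otherwise time accumulates
    until it reaches oestrus_threshold_s, at which point the detection time
    is returned (standing in for notifying the preset terminal).
    """
    prev, start_t = None, None
    for t, contour in contour_stream:
        if prev is None:
            prev, start_t = contour, t          # initial sample modelling, start timing
        elif contour_change(prev, contour) > stand_threshold:
            start_t = t                         # contour changed: re-model, restart timing
        elif t - start_t >= oestrus_threshold_s:
            return t                            # accumulated time reached the threshold
        prev = contour                          # newest image becomes the next background
    return None
```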
6. The sow estrus detection method according to any one of claims 1-5, wherein the preset estrus threshold is 120-150 seconds.
7. A sow oestrus detection device, comprising:
an acquisition module, configured to acquire a sow monitoring video; the sow monitoring video is captured under boar heat-checking conditions;
an identification module, configured to identify, from the sow monitoring video, whether the sow exhibits oestrus local behaviours; the oestrus local behaviours comprise the sow eating little or refusing feed during oestrus, a slightly arched back, the head raised and facing forward, erect ears, and the sow being in a dazed, standing-still state;
an extraction module, configured to, if the sow is identified as exhibiting oestrus local behaviours, acquire video images from the sow monitoring video and perform contour extraction on the video images to obtain the sow contour;
an exclusion module, configured to determine the sow posture from the sow contour, exclude the sleep posture, and confirm that the sow is in a non-sleep posture and in the standing state;
a detection module, configured to start timing, based on the video images with the sleep posture excluded, from the moment the sow is determined to be in the standing state, and to judge whether the duration of the standing state exceeds a preset oestrus threshold: if it does, the sow is judged to be in oestrus; otherwise, the start time of the standing state is re-determined and the judgement of whether the standing state exceeds the preset oestrus threshold is repeated; the video images with the sleep posture excluded are video images in which the sow is in a non-sleep posture and in the standing state.
8. The sow oestrus detection device according to claim 7, wherein the identification module is specifically configured to:
collect, during an oestrus detection stage, a first video of oestrus sows within a preset time period before oestrus and a second video of non-oestrus sows;
split the first video and the second video into video frames to obtain training samples, and divide the training samples into a training data set and a test data set;
train an initial machine learning model, using the training data set as sample input data and the presence or absence of oestrus local behaviours as sample label data, to obtain a preliminary sow oestrus local behaviour judgement model; the initial machine learning model comprises a CNN model for feature extraction and an LSTM model for classification learning based on the feature extraction results of the CNN model;
test the preliminary sow oestrus local behaviour judgement model with the test data set, and adjust its parameters according to the test results until the prediction results meet a preset accuracy condition, to obtain an optimal sow oestrus local behaviour judgement model;
input the acquired sow monitoring video into the optimal sow oestrus local behaviour judgement model, and judge whether the sow exhibits oestrus local behaviours according to the model's output.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the sow oestrus detection method as claimed in any one of claims 1 to 6 are carried out by the processor when said program is executed.
10. A non-transitory computer readable storage medium having stored thereon a computer program, characterized in that the computer program when executed by a processor implements the steps of the sow estrus detection method as claimed in any one of claims 1 to 6.
CN202010677045.8A 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium Active CN111914685B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010677045.8A CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010677045.8A CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111914685A CN111914685A (en) 2020-11-10
CN111914685B true CN111914685B (en) 2024-04-09

Family

ID=73280933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010677045.8A Active CN111914685B (en) 2020-07-14 2020-07-14 Sow oestrus detection method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111914685B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114041426A (en) * 2020-12-31 2022-02-15 重庆市六九畜牧科技股份有限公司 Backup sow management pigsty
CN113016657A (en) * 2021-03-05 2021-06-25 河南牧原智能科技有限公司 Pigsty sow oestrus identification system and application method thereof
CN113711944B (en) * 2021-08-27 2023-03-03 河南牧原智能科技有限公司 Sow estrus identification method, device and system
CN114403043B (en) * 2021-12-20 2022-11-29 北京市农林科学院智能装备技术研究中心 Sow oestrus searching method, device and system
CN114586701A (en) * 2022-04-15 2022-06-07 东南大学 Milk cow oestrus prediction device based on body temperature and exercise amount data

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1839621A1 (en) * 2006-03-31 2007-10-03 Walter Signorini Method and system to determine a physiological state of a sow
CN201504483U (en) * 2009-05-15 2010-06-16 广东广兴牧业机械设备公司 Sow oestrus monitoring device
CN103858791A (en) * 2012-12-13 2014-06-18 青岛金弘测控技术发展有限公司 Device capable of recognizing livestock estrus state automatically
CN104396865A (en) * 2014-10-29 2015-03-11 中国农业大学 Sow oestrus remote automatic monitoring system and method
CN107027650A (en) * 2017-03-21 2017-08-11 中国农业大学 A kind of boar abnormal state detection method and device based on PSO SVM
CN107133604A (en) * 2017-05-25 2017-09-05 江苏农林职业技术学院 A kind of pig abnormal gait detection method based on ellipse fitting and predictive neutral net
CN207600521U (en) * 2017-12-28 2018-07-10 重庆派森百橙汁有限公司 A kind of oestrus of sow automatic monitoring system
CN108717523A (en) * 2018-04-26 2018-10-30 华南农业大学 Oestrus of sow behavioral value method based on machine vision
CN108766075A (en) * 2018-05-31 2018-11-06 长春博立电子科技有限公司 A kind of individualized education analysis system and method based on video analysis
CN108830144A (en) * 2018-05-03 2018-11-16 华南农业大学 A kind of milking sow gesture recognition method based on improvement Faster-R-CNN
CN109984054A (en) * 2019-04-19 2019-07-09 广州影子科技有限公司 Oestrous detection method, oestrous detection device and oestrous detection system
CN110741963A (en) * 2019-10-16 2020-02-04 北京海益同展信息科技有限公司 Object state monitoring and sow oestrus monitoring method, device and system
CN110839557A (en) * 2019-10-16 2020-02-28 北京海益同展信息科技有限公司 Sow oestrus monitoring method, device and system, electronic equipment and storage medium
CN110866481A (en) * 2019-11-07 2020-03-06 北京小龙潜行科技有限公司 Sow oestrus detection method and device
CN110991222A (en) * 2019-10-16 2020-04-10 北京海益同展信息科技有限公司 Object state monitoring and sow oestrus monitoring method, device and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL2019762B1 (en) * 2017-10-19 2019-04-29 N V Nederlandsche Apparatenfabriek Nedap Method and system for determining a state of at least one pig in a cage.


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Oestrus and mating of replacement gilts; Lü Huixu; Today's Animal Husbandry and Veterinary Medicine (Issue 09); full text *
Causes and identification methods of false oestrus in sows; Ye Guiyu, Shi Zili; Henan Animal Husbandry and Veterinary Medicine (Issue 10); full text *
Feeding and management techniques for breeding sows on large-scale pig farms; Cheng Jianyun; Zhou Liming; Inner Mongolia Agricultural Science and Technology (Issue 03); full text *

Also Published As

Publication number Publication date
CN111914685A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
CN111914685B (en) Sow oestrus detection method and device, electronic equipment and storage medium
CN110866481B (en) Sow oestrus detection method and device
CN108717523B (en) Sow oestrus behavior detection method based on machine vision
CN110839557B (en) Sow oestrus monitoring method, device and system, electronic equipment and storage medium
CN107077626B (en) Non-invasive multi-modal biometric identification system for animals
CN108549852B (en) Specific scene downlink person detector automatic learning method based on deep network enhancement
CN112734731B (en) Livestock temperature detection method, device, equipment and storage medium
CN110532899B (en) Sow antenatal behavior classification method and system based on thermal imaging
CN114004866B (en) Mosquito recognition system and method based on image similarity difference
KR102296501B1 (en) System to determine sows' estrus and the right time to fertilize sows using depth image camera and sound sensor
CN114596448A (en) Meat duck health management method and management system thereof
CN108491807B (en) Real-time monitoring method and system for oestrus of dairy cows
CN112580552A (en) Method and device for analyzing behavior of rats
WO2018155856A1 (en) System for determining mounting behavior of bull or cow
CN111767794A (en) Cage-rearing poultry abnormal behavior detection method and detection system based on machine vision
CN110197130A (en) A kind of live pig abnormal gait detection device and system
KR102527058B1 (en) Apparatus for detecting mounting behavior of cattle
CN114581948A (en) Animal face identification method
EP4402657A1 (en) Systems and methods for the automated monitoring of animal physiological conditions and for the prediction of animal phenotypes and health outcomes
CN110751085B (en) Mouse behavior recognition method
CN113780207A (en) System and method for goat face recognition
CN110263753B (en) Object statistical method and device
KR102424901B1 (en) method for detecting estrus of cattle based on object detection algorithm
CN205608801U (en) Automatic device of grading of milk cow body condition
CN114022831A (en) Binocular vision-based livestock body condition monitoring method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant