CN110908340A - Smart home control method and device - Google Patents
- Publication number: CN110908340A (application CN201811075428.7A)
- Authority
- CN
- China
- Prior art keywords
- target object
- type
- model
- classification
- image information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G05B19/4183 — Total factory control characterised by data acquisition, e.g. workpiece identification
- G05B15/02 — Systems controlled by a computer, electric
- G05B19/418 — Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B2219/2642 — Domotique, domestic, home control, automation, smart house
Abstract
The application belongs to the field of smart home and provides a smart home control method comprising the following steps: acquiring image information of a target object obtained by shooting; identifying the image information and classifying the identification result using at least one classification model to determine the type of the target object, where each classification model is trained on historical image data containing at least one type of feature data for objects of different types; and controlling the home appliance based on the type of the target object. The application solves the technical problem that, when controlling home appliances, a user must manually adjust their running state, which results in a poor user experience.
Description
Technical Field
The application relates to the field of smart home, in particular to a control method and device for smart home.
Background
In the prior art, when a user controls a home appliance, the user must manually select the desired running state through a mobile phone, a remote controller, or buttons on the appliance itself. This wastes the user's time, cannot automatically set the appropriate running state for the user, and results in a poor user experience.
Disclosure of Invention
The embodiments of the present application provide a smart home control method and device that at least solve the technical problem that a user must manually adjust the running state of home appliances, resulting in a poor user experience.
According to an aspect of an embodiment of the present application, a smart home control method is provided, including: acquiring image information of a target object obtained by shooting; identifying the image information and classifying the identification result using at least one classification model to determine the type of the target object, where each classification model is trained on historical image data containing at least one type of feature data for objects of different types; and controlling the home appliance based on the type of the target object.
Optionally, the classification model comprises at least one of: an SVM model, a Bayesian model, a deep learning model, a neural network model, a decision tree model, and a KNN model.
Optionally, in the case where a plurality of classification models are used, identifying the image information and determining the type of the target object by classifying the identification result with at least one classification model includes: identifying the image information to obtain an identification result, where the identification result comprises at least one item of feature data of the target object; classifying the identification result with each classification model to obtain a classification result for each model, where each classification model corresponds to a different classification rule; and determining the type of the target object based on the classification results of all the models.
Optionally, determining the type of the target object based on the classification result corresponding to each classification model includes: determining the type indicated by each model's classification result; counting the number of classification results of each type; and determining the type of the target object based on the counts.
Optionally, determining the type of the target object based on the statistical result includes: comparing the number of classification results of each type, and taking the type with the largest count as the type of the target object.
Optionally, controlling the home appliance based on the type of the target object includes: based on the type of the target object, inquiring a control instruction corresponding to the type; and controlling the running mode of the household appliance based on the control instruction obtained by the query.
Optionally, when there are a plurality of home appliances, the queried control instruction includes a plurality of sub-control instructions, each carrying identification information of a corresponding home appliance, so that the different sub-control instructions can be sent to the corresponding appliances.
Optionally, in the case where the home appliances are deployed in different spaces, if the target object is detected entering a space, the home appliance located in that space is started, and the shooting device is started to capture image information of the target object.
Optionally, whether the target object enters the corresponding space is detected by sensing the target object with sensors arranged in the different spaces, where the sensors include at least one of a monitor, a camera, a voice monitoring device, and a somatosensory monitoring device.
According to an aspect of the embodiments of the present application, there is provided a control device for smart home, the device including: the acquisition module is used for acquiring the image information of the shot target object; the determining module is used for identifying the image information and classifying the identification result by using at least one classification model to determine the type of the target object, wherein the classification model is obtained by training based on historical image data, and the historical image data comprises at least one type of characteristic data of different types of objects; and the control module is used for controlling the household appliance based on the type of the target object.
According to an aspect of the embodiments of the present application, a storage medium is provided, where the storage medium includes a stored program, and when the program runs, a device where the storage medium is located is controlled to execute the above-mentioned control method for smart home.
According to an aspect of the embodiments of the present application, a processor is provided, and is characterized in that the processor is configured to execute a program, where the program executes the control method for smart home as described above.
In the embodiment of the application, image information of a target object obtained by shooting is collected; the image information is identified, and the identification result is classified using at least one classification model to determine the type of the target object, where each classification model is trained on historical image data containing at least one type of feature data for objects of different types; and the home appliance is controlled based on the type of the target object. By determining the type of the target object from the captured image information and automatically adjusting the running state of the corresponding home appliances according to that type, the method improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an alternative smart home control method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an alternative control device for smart home according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any inventive step based on the embodiments in the present application, shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of an embodiment of the present application, a smart home control method is provided. As shown in fig. 1, the method comprises at least steps S102-S106.
Step S102, acquiring image information of a shot target object;
wherein the target object may be a user.
In some embodiments of the present application, an image capturing device may be disposed in a living room to capture a target object and obtain image information of the target object. The image capturing device includes, but is not limited to, an infrared image capturing device, a panoramic image capturing device, and the like.
Step S104, identifying image information, classifying the identification result by using at least one classification model, and determining the type of the target object, wherein the classification model is obtained by training based on historical image data, and the historical image data comprises at least one type of characteristic data of objects of different types;
in some optional embodiments of the present application, the classification model comprises at least one of: the model comprises an SVM model, a Bayesian model, a deep learning model, a neural network model, a decision tree model and a KNN model.
In the case where a plurality of classification models are used, identifying the image information and determining the type of the target object by classifying the identification result with at least one classification model includes: identifying the image information to obtain an identification result, where the identification result comprises at least one item of feature data of the target object; classifying the identification result with each classification model to obtain a classification result for each model, where each classification model corresponds to a different classification rule; and determining the type of the target object based on the classification results of all the models.
In an alternative embodiment, the feature data of the target object may include the user's clothing style, hair color, hair length, height, gender, age, and so on. The classification rules are the rules corresponding to each classification model.
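As a concrete illustration of classifying such feature data, the following is a minimal sketch of one of the listed models, a KNN classifier, over hypothetical (height, age) feature vectors and labels; the actual feature encoding and training data are not specified by the application.

```python
import math

# Hypothetical historical feature data: (height_cm, age) feature vectors
# with type labels. Real encodings are not specified by the application.
HISTORY = [
    ((175, 25), "young"),
    ((168, 30), "young"),
    ((160, 72), "old"),
    ((165, 68), "old"),
]

def knn_classify(features, history=HISTORY, k=3):
    """Classify a feature vector by the majority label among its k nearest
    neighbours in the historical data (the KNN model named above)."""
    neighbours = sorted(history, key=lambda item: math.dist(features, item[0]))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)
```

Any of the other listed models (SVM, Bayesian, decision tree, and so on) would fill the same role: map one or more items of feature data to a type label.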
Classifying the identification result with each classification model can be implemented as follows: classify the one or more items of feature data of the target object using an SVM model, a Bayesian model, a deep learning model, a neural network model, a decision tree model, and a KNN model, respectively. In an optional embodiment, the age information of user A is classified by each of these six models.
After the age information of user A is classified, a classification result is obtained for each model; for example, the SVM, Bayesian, deep learning, and neural network models may classify the user as young, while the decision tree and KNN models classify the user as old.
In some optional embodiments of the present application, determining the type of the target object based on the classification result corresponding to each classification model includes: determining the type of the classification result based on the classification result corresponding to each classification model; the number of classification results of each type is counted, and based on the statistical results, the type of the target object is determined.
In some optional embodiments of the present application, determining the type of the target object based on the statistical result includes: comparing the number of classification results of each type, and taking the type with the largest count as the type of the target object.
In an optional embodiment, when the SVM, Bayesian, deep learning, and neural network models classify the user as young and the decision tree and KNN models classify the user as old, four models vote young and two vote old; since four is greater than two, user A is determined to be young.
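The majority vote described above can be sketched in a few lines (the label strings are illustrative, not mandated by the application):

```python
from collections import Counter

def determine_type(classification_results):
    """Return the type with the most votes across the classification models."""
    return Counter(classification_results).most_common(1)[0][0]
```

With the example above, four models voting "young" and two voting "old", `determine_type` returns "young".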
Step S106, based on the type of the target object, controlling the household appliance.
In some optional embodiments of the present application, controlling the home appliance based on the type of the target object includes: based on the type of the target object, inquiring a control instruction corresponding to the type; and controlling the running mode of the household appliance based on the control instruction obtained by the query.
In some optional embodiments in the present application, in a case that there are a plurality of home appliances, the queried control instruction includes at least a plurality of sub-control instructions, where each sub-control instruction carries identification information of a corresponding home appliance, so that different sub-control instructions are sent to the corresponding home appliances.
In an optional embodiment, when the type of the current user is young, a control instruction corresponding to young users may be queried from a memory of the smart home system, and the operating mode of the home appliances may be controlled based on the queried instruction. The smart home system may be an independent device separate from the home appliances, or it may be one of the appliances themselves. If the current user is young and multiple home appliances correspond to that type, the queried control instruction may consist of sub-control instructions for the different appliances; in that case, each sub-control instruction carries the identification information of its appliance so that the different sub-control instructions are sent to the corresponding appliances.
Different users correspond to different preferred operating modes of the home appliances; for example, since the elderly are more sensitive to cold, the air-conditioner temperature for an elderly user is set higher than that for a young user. The correspondence between users and appliance running states can be stored in advance in a memory of the smart home system.
In another optional embodiment, when the type of the current user is identified as elderly, a control instruction corresponding to elderly users may be queried from the memory of the smart home system and used to control the operating mode of the home appliances. For example, if the air-conditioner temperature corresponding to the elderly user is 27 degrees, the queried control instruction adjusts the air conditioner to 27 degrees; the control instruction may also include an instruction to switch the air conditioner on.
In another optional embodiment, when the type of the current user is identified as young, a control instruction corresponding to young users may be queried from the memory of the smart home system and used to control the operating mode of the home appliances. For example, if the television program corresponding to the young user is the juvenile channel, the queried control instruction sets the television to the juvenile channel; the control instruction may also include an instruction to switch the television on.
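The type-to-instruction lookup in the examples above could be sketched as a simple table stored in the smart home system's memory. The appliance identifiers, command names, and values below are hypothetical, chosen only to mirror the examples:

```python
# Hypothetical per-type control table: each sub-control instruction carries
# the identification of its target appliance, as described above.
CONTROL_TABLE = {
    "old": [
        {"appliance_id": "ac_livingroom", "command": "power_on"},
        {"appliance_id": "ac_livingroom", "command": "set_temperature", "value": 27},
    ],
    "young": [
        {"appliance_id": "tv_livingroom", "command": "power_on"},
        {"appliance_id": "tv_livingroom", "command": "set_channel", "value": "juvenile"},
    ],
}

def query_control_instructions(user_type):
    """Look up the sub-control instructions stored for a user type."""
    return CONTROL_TABLE.get(user_type, [])
```

Each returned sub-control instruction can then be dispatched to the appliance named by its `appliance_id`.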
In some optional embodiments in the present application, in a case that the home appliances are deployed in different spaces, if it is detected that the target object enters the corresponding space, the home appliance located in the space is started, and the shooting device is started to shoot image information of the target object.
In some optional embodiments in the present application, whether a target object enters a corresponding space is detected by any one of the following manners: sensing the target object by sensors disposed in different spaces, the sensors including at least one of: monitor, camera, pronunciation monitoring facilities and body sensing monitoring facilities.
In some alternative embodiments of the present application, home appliances may be deployed in each room; for example, a range hood in the kitchen and air conditioners in the living room and bedroom. Sensors may be installed in each room to sense the target object, and may include at least one of a monitor, a camera, a voice monitoring device, and a somatosensory monitoring device. Specifically, an inductive sensor may be placed at the door of each room to sense whether the user enters; or monitors may be installed in each room so that the smart home system can observe whether the user enters; or voice detection devices in each room may determine entry through voice recognition; or somatosensory monitoring devices in each room may determine whether the user has entered.
In some optional embodiments of the present application, the image capture device in the living room may also photograph the target object at different times to obtain a sequence of images. The movement trend of the target object, that is, its tendency to move toward a particular room, can be determined from the object's position across this sequence. Based on the movement trend, the smart home system starts the home appliances in the corresponding room and adjusts their running state. In addition, if the latest captured image no longer contains the target object while an image captured within a preset time period beforehand does, and the object's last known position corresponds to the door of a room, it can be determined that the target object has entered that room; the smart home system can then start and adjust the appliances in that room. For example, if the current image does not contain the user but an image captured 3 seconds earlier does, and the last position corresponds to the bedroom door, the target object is determined to have entered the bedroom, and the system starts and adjusts the bedroom appliances.
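The "object disappeared near a room door" inference described above can be sketched as follows, assuming hypothetical door coordinates for each room and a hypothetical distance threshold; the application does not specify how positions are represented:

```python
import math

def room_entered(last_seen_pos, room_doors, max_dist=1.0):
    """If the target object was last seen near a room's door before it
    disappeared from the frame, assume it entered that room.

    room_doors maps room names to hypothetical (x, y) door coordinates;
    returns the room name, or None if no door is within max_dist."""
    room, door = min(room_doors.items(),
                     key=lambda item: math.dist(last_seen_pos, item[1]))
    return room if math.dist(last_seen_pos, door) <= max_dist else None
```

The smart home system would call this when the target object drops out of the captured images, then start and adjust the appliances in the returned room.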
In the embodiment of the application, image information of a target object obtained by shooting is collected; the image information is identified, and the identification result is classified using at least one classification model to determine the type of the target object, where each classification model is trained on historical image data containing at least one type of feature data for objects of different types; and the home appliance is controlled based on the type of the target object. By determining the type of the target object from the captured image information and automatically adjusting the running state of the corresponding home appliances according to that type, the method provides personalized service to users and improves the user experience.
According to an aspect of an embodiment of the present application, there is provided a control device for smart home, as shown in fig. 2: the device comprises: an acquisition module 22, a determination module 24, a control module 26; wherein:
the acquisition module 22 is used for acquiring the image information of the shot target object;
a determining module 24, configured to identify image information, and determine a type of the target object by performing classification processing on an identification result using at least one classification model, where the classification model is a model trained based on historical image data, and the historical image data includes at least one type of feature data of different types of objects;
and a control module 26 for controlling the home appliance based on the type of the target object.
It should be noted that, reference may be made to the relevant description of fig. 1 for a preferred implementation of the above-described embodiment, and details are not described here again.
According to an aspect of the embodiments of the present application, a storage medium is provided, where the storage medium includes a stored program, and when the program runs, a device where the storage medium is located is controlled to execute the above-mentioned control method for smart home.
According to an aspect of the embodiments of the present application, a processor is provided, and is characterized in that the processor is configured to execute a program, where the program executes the control method for smart home as described above.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, a division of a unit may be a division of a logic function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be provided in one place, or may be distributed over a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.
Claims (12)
1. A smart home control method, characterized by comprising the following steps:
acquiring image information of a target object captured by a shooting device;
identifying the image information, and classifying the identification result using at least one classification model to determine the type of the target object, wherein the classification model is trained on historical image data, and the historical image data comprises at least one type of feature data of objects of different types;
and controlling a household appliance based on the type of the target object.
2. The method of claim 1, wherein the classification model comprises at least one of: an SVM model, a Bayesian model, a deep learning model, a neural network model, a decision tree model, and a KNN model.
3. The method according to claim 2, wherein, in the case where a plurality of classification models are used, identifying the image information and classifying the identification result using the at least one classification model to determine the type of the target object comprises:
identifying the image information to obtain the identification result, wherein the identification result comprises at least one piece of feature data of the target object;
classifying the identification result using each classification model to obtain a classification result corresponding to each classification model, wherein the classification models correspond to different classification rules;
and determining the type of the target object based on the classification result corresponding to each classification model.
4. The method of claim 3, wherein determining the type of the target object based on the classification result corresponding to each classification model comprises:
determining the type of each classification result based on the classification result corresponding to each classification model;
and counting the number of classification results of each type, and determining the type of the target object based on the counting result.
5. The method of claim 4, wherein determining the type of the target object based on the counting result comprises:
comparing the numbers of classification results of the respective types;
and taking the type with the largest number of classification results as the type of the target object.
6. The method according to any one of claims 1 to 5, wherein controlling the household appliance based on the type of the target object comprises:
querying a control instruction corresponding to the type based on the type of the target object;
and controlling the operating mode of the household appliance based on the queried control instruction.
7. The method according to claim 6, wherein, when there are a plurality of household appliances, the queried control instruction comprises a plurality of sub-control instructions, each sub-control instruction carrying identification information of the corresponding household appliance, so that different sub-control instructions are sent to the corresponding household appliances.
8. The method according to claim 1, wherein, in the case that household appliances are deployed in different spaces, if it is monitored that the target object enters a corresponding space, the household appliance located in that space is started, and the shooting device is started to capture the image information of the target object.
9. The method according to claim 8, wherein whether the target object enters the corresponding space is detected in the following manner: sensing the target object by sensors deployed in the different spaces, the sensors comprising at least one of: a monitor, a camera, a voice monitoring device, and a somatosensory monitoring device.
10. A smart home control device, characterized by comprising:
an acquisition module, configured to acquire image information of a target object captured by a shooting device;
a determining module, configured to identify the image information and classify the identification result using at least one classification model to determine the type of the target object, wherein the classification model is trained on historical image data, and the historical image data comprises at least one type of feature data of objects of different types;
and a control module, configured to control a household appliance based on the type of the target object.
11. A storage medium, characterized in that the storage medium comprises a stored program, wherein, when the program runs, a device on which the storage medium is located is controlled to execute the smart home control method according to any one of claims 1 to 9.
12. A processor, characterized in that the processor is configured to run a program, wherein, when running, the program executes the smart home control method according to any one of claims 1 to 9.
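The classification-and-voting scheme of claims 3–5 and the per-appliance dispatch of claims 6–7 can be sketched as follows. This is a minimal illustration, not the patented implementation: the three stand-in classifiers, the feature names, the type labels, the device identifiers, and the commands in the lookup table are all hypothetical.

```python
from collections import Counter

# Hypothetical stand-ins for the trained classification models of claim 2
# (SVM, Bayesian, deep-learning, neural-network, decision-tree, KNN models).
# Each takes feature data extracted from the image (claim 3) and returns a
# type label for the target object.
def svm_model(features):   return "juvenile" if features["height_cm"] < 150 else "adult"
def bayes_model(features): return "juvenile" if features["face_area"] < 90 else "adult"
def knn_model(features):   return "adult"

def determine_type(features, models):
    """Claims 3-5: classify with every model, count the classification
    results of each type, and take the most frequent type as the type
    of the target object."""
    labels = [model(features) for model in models]
    return Counter(labels).most_common(1)[0][0]

# Claims 6-7: lookup table from object type to sub-control instructions,
# each carrying the identification information of its household appliance.
CONTROL_TABLE = {
    "juvenile": [
        {"device_id": "tv-01", "command": "enable_parental_mode"},
        {"device_id": "ac-01", "command": "set_temp_26"},
    ],
    "adult": [
        {"device_id": "tv-01", "command": "resume_normal_mode"},
        {"device_id": "ac-01", "command": "set_temp_24"},
    ],
}

def control_appliances(object_type, send):
    """Query the control instruction for the type and send each
    sub-instruction to the appliance it identifies."""
    for sub in CONTROL_TABLE.get(object_type, []):
        send(sub["device_id"], sub["command"])

# Example run: two of the three models vote "juvenile", so the juvenile
# sub-instructions are dispatched.
features = {"height_cm": 132, "face_area": 80}
object_type = determine_type(features, [svm_model, bayes_model, knn_model])
sent = []
control_appliances(object_type, lambda dev, cmd: sent.append((dev, cmd)))
```

Majority voting over independently trained classifiers is a standard ensemble technique; the claims leave the tie-breaking and per-model rules open, so this sketch simply takes the most common label.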
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811075428.7A CN110908340A (en) | 2018-09-14 | 2018-09-14 | Smart home control method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110908340A true CN110908340A (en) | 2020-03-24 |
Family
ID=69813006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811075428.7A Pending CN110908340A (en) | 2018-09-14 | 2018-09-14 | Smart home control method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110908340A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101178773A (en) * | 2007-12-13 | 2008-05-14 | 北京中星微电子有限公司 | Image recognition system and method based on characteristic extracting and categorizer |
CN102171517A (en) * | 2008-10-06 | 2011-08-31 | 日立空调·家用电器株式会社 | Air conditioner |
WO2014119900A1 (en) * | 2013-01-29 | 2014-08-07 | Lee Jong Sik | Smart device having user interface based on human emotions or inclinations, and user interface method |
CN104864557A (en) * | 2015-04-30 | 2015-08-26 | 广东美的制冷设备有限公司 | Air conditioner control method, intelligent control equipment and air conditioner system |
CN105068520A (en) * | 2015-07-31 | 2015-11-18 | 惠而浦(中国)股份有限公司 | Novel smart home system and installation and control method |
CN107560090A (en) * | 2017-09-20 | 2018-01-09 | 珠海格力电器股份有限公司 | Air supply control method and device of air conditioner and terminal |
CN107642877A (en) * | 2017-09-26 | 2018-01-30 | 广东美的制冷设备有限公司 | Air conditioning control method, device and air conditioner |
CN108050674A (en) * | 2017-10-30 | 2018-05-18 | 珠海格力电器股份有限公司 | Control method and device of air conditioning equipment and terminal |
CN108131791A (en) * | 2017-12-04 | 2018-06-08 | 广东美的制冷设备有限公司 | Control method, device and the server of home appliance |
CN108131787A (en) * | 2017-11-06 | 2018-06-08 | 珠海格力电器股份有限公司 | Air conditioner control method and device |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112507756A (en) * | 2019-08-26 | 2021-03-16 | 阿里巴巴集团控股有限公司 | Detection method, control method and device |
CN111720924A (en) * | 2020-05-27 | 2020-09-29 | 华帝股份有限公司 | Kitchen air conditioner and control method applying same |
CN114063459A (en) * | 2020-08-10 | 2022-02-18 | 青岛海信电子产业控股股份有限公司 | Terminal and intelligent household control method |
CN114063459B (en) * | 2020-08-10 | 2024-03-15 | 海信集团控股股份有限公司 | Terminal and intelligent home control method |
CN112230555A (en) * | 2020-10-12 | 2021-01-15 | 珠海格力电器股份有限公司 | Intelligent household equipment, control method and device thereof and storage medium |
CN112488590A (en) * | 2020-12-21 | 2021-03-12 | 青岛海尔科技有限公司 | Target object classification method and device, storage medium and electronic device |
WO2023005706A1 (en) * | 2021-07-29 | 2023-02-02 | 华为技术有限公司 | Device control method, electronic device, and storage medium |
CN115701032A (en) * | 2021-07-29 | 2023-02-07 | 华为技术有限公司 | Device control method, electronic device, and storage medium |
CN115701032B (en) * | 2021-07-29 | 2024-11-15 | 华为技术有限公司 | Device control method, electronic device and storage medium |
CN114266419A (en) * | 2022-01-12 | 2022-04-01 | 华中科技大学 | Cigar tobacco leaf process stage prediction method, system and medium based on data fusion |
CN114815645A (en) * | 2022-04-19 | 2022-07-29 | 青岛海尔科技有限公司 | Control method and device of Internet of things equipment, storage medium and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110908340A (en) | Smart home control method and device | |
US10769914B2 (en) | Informative image data generation using audio/video recording and communication devices | |
CN107883541B (en) | Air conditioner control method and device | |
CN114237069B (en) | Indoor environment control method, device, system and storage medium | |
CN110799978B (en) | Face recognition in a residential environment | |
CN108131787B (en) | Air conditioner control method and device | |
CN108427298A (en) | User's inlet/outlet intelligent home control system based on image recognition | |
CN107860101A (en) | Parameter adjusting method and device of air conditioning equipment and air conditioning equipment | |
CN112991585B (en) | Access personnel management method and computer readable storage medium | |
CN107654406B (en) | Fan air supply control device, fan air supply control method and device | |
CN108061359B (en) | Air conditioner control method and device | |
CN110186167B (en) | Control method and device of air conditioner, air conditioner and storage medium | |
CN109358546B (en) | Control method, device and system of household appliance | |
CN111063067A (en) | Intelligent access control system based on voice control | |
CN113091245B (en) | Control method and device for air conditioner and air conditioner | |
CN108181837B (en) | Control method and control device | |
CN108427310A (en) | Intelligent home furnishing control method, device and computer readable storage medium | |
CN112327645A (en) | Control method and device for household appliance and household appliance | |
CN113339965A (en) | Method and device for air conditioner control and air conditioner | |
CN114859749B (en) | Intelligent home management method and system based on Internet of things | |
US10834366B1 (en) | Audio/video recording and communication doorbell devices with power control circuitry | |
US20190327128A1 (en) | Using a local hub device as a substitute for an unavailable backend device | |
CN112394647A (en) | Control method, device and equipment of household equipment and storage medium | |
CN111158258A (en) | Environment monitoring method and system | |
CN108088027A (en) | Air conditioner auxiliary equipment, air conditioner control method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||