Detailed Description
In order to make the objectives, technical solutions, and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used only to distinguish two entities or parameters that share the same name. "First" and "second" are merely for convenience of description and should not be construed as limiting the embodiments of the present invention; this is not repeated in the following embodiments.
In a first aspect of the embodiments of the present invention, an interaction method for a light-emitting device is provided, which can, to a certain extent, solve the problem that prior-art light-emitting devices rely on a single judgment logic. Fig. 1 is a schematic flow chart of an embodiment of the interaction method of a light-emitting device according to the present invention.
The interaction method of the light-emitting device comprises the following steps:
Step 101: acquiring the activity level of an activity factor, wherein the activity factor is selected from at least one of the following parameters: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within a preset interaction period.
Step 102: determining the active state according to the activity level of the activity factor.
Here, the active state depends on the activity level of the selected activity factor; specifically, reference may be made to the tables below.
As an embodiment of the present invention, when the current time is selected as the activity factor, the activity distribution over different time periods (hr) is as follows:
TABLE 1 relationship between Current time and Activity
| Time period (hr) | 00-02 | 02-04 | 04-06 | 06-08 | 08-10 | 10-12 | 12-14 | 14-16 | 16-18 | 18-20 | 20-22 | 22-24 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Activity level | -10 | -20 | -10 | 0 | +5 | +5 | +5 | +10 | +10 | +10 | +20 | 0 |
At this time, according to the activity level corresponding to the time period, the active states may be classified into six types: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, when the current time is 23:00, the corresponding activity level is 0 and the current active state is "inactive".
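As a concrete illustration, the Table 1 lookup can be sketched as follows. This is a minimal sketch only: the bucket layout assumes the two-hour periods above, and the function and variable names are illustrative, not part of the embodiments.

```python
# Table 1 as a lookup over two-hour buckets; index = hour // 2.
# Values follow Table 1; the names are illustrative.
TIME_ACTIVITY = [-10, -20, -10, 0, +5, +5, +5, +10, +10, +10, +20, 0]

def time_activity(hour: int) -> int:
    """Return the activity level contributed by the current hour (0-23)."""
    if not 0 <= hour <= 23:
        raise ValueError("hour must be in 0-23")
    return TIME_ACTIVITY[hour // 2]
```

For example, `time_activity(23)` returns 0, matching the "inactive" example above.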
As an embodiment of the present invention, when the current light intensity is selected as the activity factor, the activity distribution over different light conditions (%) is as follows:
TABLE 2 Relationship between light intensity and activity
| Light intensity (%) | 0-10 | 10-30 | 30-50 | 50-70 | 70-90 | 90-100 |
|---|---|---|---|---|---|---|
| Activity level | +20 | +10 | 0 | +5 | -10 | -20 |
At this time, according to the activity level corresponding to the light intensity, the active states may be classified into six types: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, if the current light intensity ratio is 5%, the corresponding activity level is +20 and the current active state is "extremely active".
It should be noted that the light intensity in Table 2 is a percentage: the ratio of the current light intensity to a reference light intensity. The reference light intensity may be the light intensity collected at 12:00 noon on the previous day; when the time reaches 12:00 noon on the current day, the light intensity detected in real time is updated as the new reference, so that the daily light-intensity percentage is refreshed in real time and better matches actual conditions. Alternatively, the reference light intensity may be a predetermined value, such as a factory default or a user-defined value.
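The daily-updated percentage described above might be computed as follows. This is a sketch under the assumption that the current reading and the reference share the same units; the function name is illustrative.

```python
# Percentage for Table 2: current reading over the noon reference, clamped to 0-100.
def light_percentage(current: float, reference: float) -> float:
    if reference <= 0:          # no valid reference captured yet
        return 0.0
    return max(0.0, min(100.0, 100.0 * current / reference))
```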
As an embodiment of the present invention, when the current or same-day weather condition is selected as the activity factor, the activity distribution over different weather conditions is as shown in Table 3:
TABLE 3 Relationship between weather conditions and activity
At this time, according to the activity level corresponding to the weather condition, the active states may be classified into five types: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), and active (+10). For example, when the current or same-day weather condition is heavy rain, the corresponding activity level is -15 and the current active state is "very inactive".
As an embodiment of the present invention, when the time interval between the current time and the last interaction is selected as the activity factor, the activity distribution over different time intervals is as follows:
TABLE 4 Relationship between interaction time interval and activity
| Interaction time interval | >48 h | 24-48 h | 12-24 h | 4-12 h | 30 min-4 h | 15-30 min | <15 min |
|---|---|---|---|---|---|---|---|
| Activity level | +30 | +20 | +10 | 0 | -10 | -15 | -20 |
At this time, according to the activity level corresponding to the interaction time interval, the active states may be classified into seven types: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), active (+10), very active (+20), and extremely active (+30). For example, if more than 48 hours have passed since the last interaction, the corresponding activity level is +30 and the current active state is "extremely active".
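The Table 4 bands can be realized as a sorted-boundary lookup over the elapsed minutes. This is an illustrative sketch: the boundary handling (left-closed bands) is one reasonable choice, and the names are assumptions.

```python
import bisect

# Table 4 as a band lookup over minutes elapsed since the last interaction.
INTERVAL_BOUNDS = [15, 30, 240, 720, 1440, 2880]      # minutes, ascending
INTERVAL_ACTIVITY = [-20, -15, -10, 0, +10, +20, +30]  # one value per band

def interval_activity(minutes_since_last: float) -> int:
    """Activity level for the time since the last interaction."""
    return INTERVAL_ACTIVITY[bisect.bisect_right(INTERVAL_BOUNDS, minutes_since_last)]
```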
As an embodiment of the present invention, when the usage duration within the preset interaction period (e.g., 24 hours or 48 hours) is selected as the activity factor, the activity distribution over different usage durations is as follows:
TABLE 5 Relationship between usage duration and activity
| Cumulative usage duration | 0 | 0-1 min | 1-5 min | 5-30 min | 30 min-2 h | >2 h |
|---|---|---|---|---|---|---|
| Activity level | +20 | +15 | +10 | 0 | -10 | -20 |
At this time, according to the activity level corresponding to the usage duration within the preset interaction period, the active states may be classified into six types: very inactive (-20), inactive (-10), normal (0), active (+10), very active (+15), and extremely active (+20). For example, when the usage duration within the preset interaction period is 31 minutes, the corresponding activity level is -10 and the current active state is "inactive".
It should be noted that, where the endpoints of two adjacent ranges coincide, the parameter ranges in the tables above can be disambiguated by a left-closed or right-closed convention. For example, the cumulative usage ranges 0-1 min and 1-5 min share the endpoint of 1 minute; either 0-1 min excludes the 1-minute endpoint, or 1-5 min excludes it, so the two ranges do not overlap. The parameter ranges of the other tables can be disambiguated by similar rules, which are not repeated here.
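One way to realize this convention is a sorted-boundary lookup. The sketch below applies it to Table 5; the band edges come from the table, the left-closed choice is one of the two options just described, and the names are illustrative.

```python
import bisect

# Table 5 with left-closed/right-open bands, so the shared 1-minute endpoint
# falls into exactly one band (it belongs to 1-5 min, not 0-1 min).
USAGE_BOUNDS = [0, 1, 5, 30, 120]                     # minutes; band edges
USAGE_ACTIVITY = [+20, +15, +10, 0, -10, -20]         # zero usage, then five bands

def usage_activity(minutes: float) -> int:
    """Activity level for the cumulative usage duration; bands are [low, high)."""
    if minutes <= 0:
        return USAGE_ACTIVITY[0]                      # exactly zero usage
    return USAGE_ACTIVITY[bisect.bisect_right(USAGE_BOUNDS, minutes)]
```

With this choice, `usage_activity(1)` falls into the 1-5 min band.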
Preferably, two or more activity factors may be selected from the five parameters (the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within the preset interaction period) to determine the active state.
For example, when the current time and the light intensity are selected, the activity levels from Tables 1 and 2 above may be added, and the sum is then mapped to an active state. For instance, when the current time is 21:00 the activity level is +20, and when the current light intensity is 5% the activity level is +20; the sum of the two is +40, which may be defined as "very active". When the sum takes other values, it can be classified correspondingly according to the active-state division described above, which is not repeated here.
Furthermore, in the step of acquiring the activity levels of the activity factors, the acquired activity factors preferably include at least the time interval between the current time and the last interaction and the usage duration within the preset interaction period, so that the active state is better reflected. More preferably, the acquired activity factors further include the light intensity, to reflect the active state still more accurately.
Preferably, as an embodiment of the present invention, the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within the preset interaction period are all used as activity factors to determine the active state.
Specifically, the calculation formula of the total activity A is as follows:
A=50+T+L+W+I+P
wherein the initial-state activity is 50, and T, L, W, I, and P respectively represent the activity levels of the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within the preset interaction period; the activity levels of these factors are given in Tables 1 to 5 and are not repeated here.
Thus, after calculating the total activity A, the activity status may be determined according to Table 6 below.
TABLE 6 relationship of Total Activity to active State
| Active state | Extremely inactive | Very inactive | Inactive | Normal | Active | Very active | Extremely active |
|---|---|---|---|---|---|---|---|
| Total activity interval | -∞~10 | 10~20 | 20~40 | 40~60 | 60~80 | 80~90 | 90~+∞ |
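The total-activity formula and the Table 6 classification can be sketched together as follows. The interval edges follow Table 6; the English state names are one consistent rendering of the seven levels, and the boundary handling at interval edges is an assumption.

```python
import bisect

# A = 50 + T + L + W + I + P, then classification per Table 6.
STATE_BOUNDS = [10, 20, 40, 60, 80, 90]
STATES = ["extremely inactive", "very inactive", "inactive",
          "normal", "active", "very active", "extremely active"]

def total_activity(t: int, l: int, w: int, i: int, p: int) -> int:
    """Total activity from the five factor activity levels."""
    return 50 + t + l + w + i + p

def active_state(a: int) -> str:
    """Map a total activity value onto the Table 6 intervals."""
    return STATES[bisect.bisect_right(STATE_BOUNDS, a)]
```

For example, all factors at 0 gives A = 50, which falls in the 40~60 interval ("normal").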
Optionally, the active state in the interaction method of the light-emitting device may be updated by periodic refresh, for example once per hour; the refresh may also be triggered when a parameter changes, for example, when a rainstorm suddenly begins, the activity level for the rainstorm condition is applied and the active state is refreshed accordingly.
In summary, under the active-state calculation rules of the above embodiments, the light-emitting device has a defined active state at any time.
Step 103: receiving an interactive operation instruction. Here, the interactive operation instruction may be an instruction generated for interacting with the light-emitting device; it mainly controls the light-emitting device to emit light with a certain lighting effect.
Optionally, the light-emitting device further includes an infrared sensing module, a brightness sensing module, and a sound pickup module; preferably, the infrared sensing module is a pyroelectric infrared sensor.
The receiving of the interactive operation instruction includes:
receiving data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module; and
determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
Different interactive operation instructions can be derived from the data collected by different modules, or from any combination of such data.
For example, if the brightness data collected by the brightness sensing module is lower than a first brightness threshold and the infrared data collected by the infrared sensing module is within a first preset range, the interactive operation instruction is determined to be a voice wake-up instruction. Of course, this is only an example; the specific combinations may be set according to actual needs.
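This example classification might be sketched as follows. The threshold value, the instruction label, and all names are illustrative assumptions, not values from the embodiments.

```python
from typing import Optional

# A dim brightness reading plus an in-range infrared reading yields a voice
# wake-up instruction; other combinations would map to other instructions.
FIRST_BRIGHTNESS_THRESHOLD = 30.0     # arbitrary units, illustrative

def classify_instruction(brightness: float, infrared_in_range: bool) -> Optional[str]:
    if brightness < FIRST_BRIGHTNESS_THRESHOLD and infrared_in_range:
        return "voice_wakeup"
    return None
```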
Step 104: determining the required lighting effect by combining the interactive operation instruction and the active state.
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
if the infrared sensing module, the brightness sensing module, and the sound pickup module collect no data, determining that the device is in an unused standby state (also a type of interactive operation instruction) and determining the lighting effect according to the active state alone.
When on unused standby, the lighting device adopts different lighting effects as its silent standby state according to the active state, as shown in Table 7 below.
TABLE 7 Distribution of lighting effects in the standby state
| Classification | Extremely inactive | Very inactive | Inactive | Normal | Active | Very active | Extremely active |
|---|---|---|---|---|---|---|---|
| Lighting effect 1 |   |   |   |   |   |   | √ |
| Lighting effect 2 |   |   |   |   |   | √ | √ |
| Lighting effect 3 |   |   |   |   | √ | √ | √ |
| Lighting effect 4 |   |   |   | √ | √ | √ |   |
| Lighting effect 5 |   |   | √ | √ | √ |   |   |
| Lighting effect 6 |   | √ | √ | √ |   |   |   |
| Lighting effect 7 | √ | √ | √ |   |   |   |   |
| Lighting effect 8 | √ | √ |   |   |   |   |   |
| Lighting effect 9 | √ |   |   |   |   |   |   |
As shown in Table 7, lighting effects 1-9 are different effects that express the active state of the lighting device; for example, effects 1-3 present a cheerful lighting effect in the most active states, while effects 7-9 present a quiet, subdued lighting effect in the most inactive states.
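The diagonal band of check marks in Table 7 can be read as a mapping from active state to candidate standby effects. The mapping below is derived directly from the check marks; the state names, the helper, and the middle-of-band default are illustrative assumptions.

```python
# Candidate standby lighting effects per active state, from the Table 7 marks.
STANDBY_EFFECTS = {
    "extremely inactive": [7, 8, 9],
    "very inactive":      [6, 7, 8],
    "inactive":           [5, 6, 7],
    "normal":             [4, 5, 6],
    "active":             [3, 4, 5],
    "very active":        [2, 3, 4],
    "extremely active":   [1, 2, 3],
}

def standby_effect(state: str) -> int:
    """Pick the middle effect of the band as a simple default choice."""
    band = STANDBY_EFFECTS[state]
    return band[len(band) // 2]
```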
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
if the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, determining that the interactive operation instruction is a voice wake-up instruction and starting voice interaction. Here, the first brightness threshold may be set as needed; a reasonable criterion is the brightness value at which the ambient light is considered dim under normal conditions. The preset range may be the range of infrared data detectable when a person walks by under normal conditions; its value may be chosen according to actual conditions and is not limited here. In this case, dim brightness combined with in-range infrared data replaces the function of a wake-up word: the voice interaction function no longer needs to be woken by a wake-up word before voice interaction, which saves operation steps and is convenient for the user.
Alternatively, if the voice data collected by the sound pickup module is detected to be a wake-up word, the interactive operation instruction is determined to be a voice wake-up instruction and voice interaction is started. Optionally, the wake-up word may be a default or set by the user; its content is not specifically limited here.
After voice interaction is started, the light-emitting device can be controlled by voice to emit light with a corresponding lighting effect, or other voice-controlled interactions with the light-emitting device can be carried out.
Further, after voice interaction is started, the interaction method of the light-emitting device may further include:
determining the required lighting effect as a wake-up light effect by combining the voice wake-up instruction and the active state; and
outputting a lighting control instruction to control the light-emitting device to emit light with the wake-up light effect.
Thus, after the voice wake-up instruction is received, the light-emitting device emits light with a wake-up light effect chosen according to the active state, so that the user can tell from this distinctive effect that the device is currently in a voice-interactive state. On the one hand, this reminds the user that the voice interaction function is active and that the device can be addressed by voice at any time; on the other hand, if the wake-up was accidental, the user can close the function in time to avoid further misoperation.
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
determining whether the current time falls within a sleep period (e.g., from 23:00 at night to 5:00 in the morning);
if the current time is within the sleep period, the brightness data collected by the brightness sensing module is lower than a second brightness threshold, and a light-on instruction is received, setting the lighting effect to a night-light effect and outputting a lighting control instruction to control the light-emitting device to emit light with the night-light effect.
Here, the second brightness threshold may be set as needed; a reasonable criterion is the light intensity typical of when people are asleep. The second brightness threshold may be the same as or different from the first brightness threshold, depending on the situation. The night-light effect is yellowish and dim overall; its specific color temperature, brightness, and so on may be chosen according to actual needs and are not limited here.
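The three-part night-light condition can be sketched as follows. The 23:00-05:00 window is the example value from the text; the threshold value and all names are illustrative assumptions.

```python
from datetime import time

# Night light fires when: inside the sleep period, darker than the second
# threshold, and a light-on instruction has been received.
SLEEP_START, SLEEP_END = time(23, 0), time(5, 0)
SECOND_BRIGHTNESS_THRESHOLD = 10.0    # arbitrary units, illustrative

def use_night_light(now: time, brightness: float, light_on: bool) -> bool:
    in_sleep = now >= SLEEP_START or now < SLEEP_END   # window wraps past midnight
    return in_sleep and brightness < SECOND_BRIGHTNESS_THRESHOLD and light_on
```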
According to its active state, the lighting device can also present different lighting effects (including bright illumination light and color-changing light) when interacting with people under different conditions, as shown in Table 8 below.
TABLE 8 Relationship between interactive operation instructions, active state, and lighting effects
As can be seen from Table 8, the interactive operation instructions include at least the following:
1) If the brightness data collected by the brightness sensing module is higher than a third brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light intensity is bright and the infrared signal detects that someone is passing by; one of the corresponding lighting effects 10-16 is then determined in combination with the active state. Here, the third brightness threshold may be set as needed; a reasonable criterion is the brightness value at which the ambient light is considered bright under normal conditions. In some cases it may equal the first brightness threshold, depending on the situation; it is not limited here.
2) If the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light intensity is dim and the infrared signal detects that someone is passing by; the interactive operation instruction is determined to be a voice wake-up instruction, voice interaction is started, and one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state.
3) If the voice data collected by the sound pickup module is detected to be a wake-up word, indicating that the user is waking the voice interaction function by voice, the interactive operation instruction is determined to be a voice wake-up instruction, voice interaction is started, and one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state.
4) If the voice data collected by the sound pickup module is detected to be three consecutive sounds within a certain time interval, indicating that the user has clapped three times, one of the corresponding lighting effects 17-23 is determined in combination with the active state.
5) If the brightness data collected by the brightness sensing module is detected to be oscillating (e.g., a sudden drop in brightness, or repeated high-low swings), indicating that the user is waving a hand over the top of the device, one of the corresponding lighting effects 24-30 is determined in combination with the active state.
6) If the voice data collected by the sound pickup module is detected to be oscillating (e.g., a sudden change in frequency or amplitude), indicating that the user is blowing at the top of the device, one of the corresponding lighting effects 31-37 is determined in combination with the active state.
7) If the current time is within the sleep period, the brightness data collected by the brightness sensing module is lower than the second brightness threshold, and a light-on instruction is received, the night light needs to be turned on: the lighting effect is the night-light effect, and a lighting control instruction is output to control the light-emitting device to emit light with the night-light effect. Optionally, the light-on instruction may be issued by voice control, a physical switch key, mobile-phone control, or the like.
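The banded triggers above can be dispatched as a base-plus-offset lookup. This sketch covers items 1) and 4)-6); the effect-number ranges follow the text, while the trigger names and the assumption that each seven-wide band is ordered from most inactive to most active state are illustrative.

```python
# Each trigger owns a seven-effect band indexed by active state
# (0 = extremely inactive .. 6 = extremely active).
TRIGGER_EFFECT_BASE = {
    "bright_presence": 10,   # 1) bright + someone passing -> effects 10-16
    "triple_clap":     17,   # 4) three claps              -> effects 17-23
    "hand_wave":       24,   # 5) brightness oscillation   -> effects 24-30
    "blow":            31,   # 6) audio oscillation        -> effects 31-37
}

def pick_effect(trigger: str, state_index: int) -> int:
    """Select the effect number for a trigger at a given active-state index."""
    if not 0 <= state_index <= 6:
        raise ValueError("state_index must be 0-6")
    return TRIGGER_EFFECT_BASE[trigger] + state_index
```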
Step 105: outputting a lighting control instruction to control the light-emitting device to emit light with the determined lighting effect. Here, through the correspondence among the interactive operation instruction, the active state, and the lighting effect described above, a corresponding lighting control instruction is obtained to drive the light-emitting device. The lighting control instruction includes light parameters such as light intensity and color temperature, and different lighting effects correspond to different light parameters.
As an embodiment of the present invention, the light-emitting device further includes a sound-generating device; optionally, the sound-generating device is a loudspeaker, and the sound pickup module is a microphone pickup module. The interaction method of the light-emitting device further comprises the following steps:
determining the required sound effect by combining the interactive operation instruction and the active state; and
outputting a sound control instruction to control the sound-generating device to produce the sound effect.
In particular, the sound effect can be combined with the aforementioned lighting effects to provide a richer interactive effect.
For example, different acousto-optic (sound-and-light) effects are used as the silent standby state when the device is on unused standby, as shown in Table 9 below.
TABLE 9 distribution of Acousto-optic effects in Standby State
| Classification | Extremely inactive | Very inactive | Inactive | Normal | Active | Very active | Extremely active |
|---|---|---|---|---|---|---|---|
| Acousto-optic effect 1 |   |   |   |   |   |   | √ |
| Acousto-optic effect 2 |   |   |   |   |   | √ | √ |
| Acousto-optic effect 3 |   |   |   |   | √ | √ | √ |
| Acousto-optic effect 4 |   |   |   | √ | √ | √ |   |
| Acousto-optic effect 5 |   |   | √ | √ | √ |   |   |
| Acousto-optic effect 6 |   | √ | √ | √ |   |   |   |
| Acousto-optic effect 7 | √ | √ | √ |   |   |   |   |
| Acousto-optic effect 8 | √ | √ |   |   |   |   |   |
| Acousto-optic effect 9 | √ |   |   |   |   |   |   |
As shown in Table 9, acousto-optic effects 1-9 are different combinations of sound and light that express the active state of the lighting device; for example, effects 1-3 present cheerful, lively sound and light in the most active states, while effects 7-9 present quiet, subdued sound and light in the most inactive states.
For another example, according to its active state, the lighting device can present different acousto-optic effects when interacting with people under different conditions, as shown in Table 10 below.
TABLE 10 Relationship between interactive operation instructions, active state, and acousto-optic effects
As can be seen from Table 10, the interactive operation instructions include at least the following:
1) If the brightness data collected by the brightness sensing module is higher than the third brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light intensity is bright and the infrared signal detects that someone is passing by; one of the corresponding acousto-optic effects 10-16 is then determined in combination with the active state. Here, the third brightness threshold may be set as needed; a reasonable criterion is the brightness value at which the ambient light is considered bright under normal conditions. In some cases it may equal the first brightness threshold, depending on the situation; it is not limited here.
2) If the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light intensity is dim and the infrared signal detects that someone is passing by; the interactive operation instruction is determined to be a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state, and a matching sound effect may accompany the wake-up light color.
3) If the voice data collected by the sound pickup module is detected to be a wake-up word, indicating that the user is waking the voice interaction function by voice, the interactive operation instruction is determined to be a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state, and a matching sound effect may accompany the wake-up light color.
4) If the voice data collected by the sound pickup module is detected to be three consecutive sounds within a certain time interval, indicating that the user has clapped three times, one of the corresponding acousto-optic effects 17-23 is determined in combination with the active state.
5) If the brightness data collected by the brightness sensing module is detected to be oscillating (e.g., a sudden drop in brightness, or repeated high-low swings), indicating that the user is waving a hand over the top of the device, one of the corresponding acousto-optic effects 24-30 is determined in combination with the active state.
6) If the voice data collected by the sound pickup module is detected to be oscillating (e.g., a sudden change in frequency or amplitude), indicating that the user is blowing at the top of the device, one of the corresponding acousto-optic effects 31-37 is determined in combination with the active state.
7) If the current time is within the sleep period, the brightness data collected by the brightness sensing module is lower than the second brightness threshold, and a light-on instruction is received, the night light needs to be turned on: the lighting effect is the night-light effect, and a lighting control instruction is output to control the light-emitting device to emit light with the night-light effect. Optionally, the light-on instruction may be issued by voice control, a physical switch key, mobile-phone control, or the like; optionally, a matching sound effect may also accompany the night light to enrich the interaction.
It can be seen that, according to the active state, the device presents different acousto-optic feedback when the user interacts with it. The feedback for the same operation therefore differs with the time, light conditions, weather conditions, interaction interval, and cumulative usage duration, so the device appears genuinely understanding and richly expressive, surprising the user rather than feeling dull.
It can be seen from the foregoing embodiments that, in the interaction method for a light-emitting device according to the embodiments of the present invention, the active state is determined according to the activity levels of the activity factors and is then combined with the received interactive operation instruction to determine the corresponding lighting effect. The judgment logic for the lighting effect is therefore no longer single-valued and, compared with a light-emitting device with a single judgment logic, misoperation problems are less likely.
Furthermore, when two or more of the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within the preset interaction period are selected as activity factors, the active state can be determined more accurately, enabling better interaction with the user. The active-state judgment mechanism provided by the embodiments of the present invention gives the light-emitting device its own mode of judging activity: it can show different feedback for different interaction triggers rather than a fixed interactive expression, shedding the rigid feel of electronic devices, enriching its expressiveness and appeal, and exceeding the user's expectations.
Furthermore, when the interactive operation instruction is derived from data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module, a multi-modal interactive light-emitting device is formed: an intelligent device that can recognize, understand, and interact with human behavior. At different activity levels, different acousto-optic feedback and judgment mechanisms are provided for each trigger (voice wake-up by the user, infrared detection of a person in bright light, infrared detection of a person in dim light, three claps, a hand wave over the top of the device, blowing at the top of the device, a light-on operation during the sleep period in dim conditions, and so on), each handled by a one-to-one logic decision. The overall multi-modal interactive judgment framework can handle complex combinations of conditions rather than a single one-to-one judgment mode; through comprehensive judgment of the ambient light, the current time, and the user's behavior, it gives the most appropriate feedback under the current conditions, improving the device's fault tolerance, keeping the false-trigger rate very low, and producing a good linkage effect.
According to the interaction method of the light-emitting device provided by the embodiments of the present invention, the input information (the current time, the current light condition, whether the infrared sensor currently detects a person, the current weather condition, the time of the device's last interaction with the user, the accumulated usage time within about 24 hr, the current voice command, and the user's current behavior) is comprehensively analyzed to determine an appropriate interaction feedback mode, expressed through combined sound and light effects, thereby meeting the interaction requirements of various scenarios.
In a second aspect of the embodiments of the present invention, a light emitting device is provided, which can solve the problem of single judgment logic of the light emitting device in the prior art to a certain extent. Fig. 2 is a schematic structural diagram of a light-emitting device according to an embodiment of the present invention.
The light emitting device includes:
an interaction triggering unit 201, configured to receive an interaction operation instruction;
a light emitting unit 203 for emitting light according to the light emission control instruction;
a processing unit 202 configured to:
acquiring the activities of at least two activity factors; wherein each activity factor is selected from: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within a preset interaction period;
determining an active state according to the activities of the at least two activity factors;
determining a required lighting effect by combining the interactive operation instruction and the active state; and
outputting a light emission control instruction according to the lighting effect to control the light emitting unit 203 to emit light.
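The processing unit's pipeline above (combine the interactive operation instruction with the active state, then emit a control instruction) could be sketched as a lookup table; the instruction names, effect names and fallback choice are illustrative assumptions, not part of the specification:

```python
# Hypothetical decision table: (interaction instruction, active state) -> effect.
EFFECT_TABLE = {
    ("voice_wake", "high"): ("bright_white", "fast_fade_in"),
    ("voice_wake", "low"):  ("warm_dim", "slow_fade_in"),
    ("hand_wave", "high"):  ("color_cycle", "fast_fade_in"),
    ("hand_wave", "low"):   ("warm_dim", "slow_fade_in"),
}

def lighting_control(instruction: str, active_state: str) -> dict:
    """Map an interaction instruction and active state to a light emission
    control instruction for the light emitting unit (203 in Fig. 2)."""
    color, transition = EFFECT_TABLE.get(
        (instruction, active_state),
        ("warm_dim", "slow_fade_in"),  # gentle fallback for unlisted cases
    )
    return {"unit": 203, "color": color, "transition": transition}
```

Because the effect depends on both keys, the same gesture can yield different feedback at different activity levels, which is the sense in which the judgment logic is "no longer single".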
As can be seen from the foregoing embodiments, the light-emitting device provided by the embodiments of the present invention determines the active state according to the activity of the activity factors and then combines it with the received interactive operation instruction to determine the corresponding lighting effect. The judgment logic for the lighting effect is therefore no longer single, and compared with a light-emitting device having single judgment logic, this device is less prone to misoperation.
As an embodiment of the present invention, the light-emitting device further includes a sound-emitting unit 304 for emitting sound according to the sound-emission control instruction;
the processing unit 202 is configured to:
determining a required sound effect by combining the interactive operation instruction and the active state;
outputting a sound production control instruction according to the sound effect to control the sound emitting unit 304 to produce sound, so that the sound effect can be combined with the aforementioned lighting effect to provide more diversified interaction.
As an embodiment of the present invention, the interaction triggering unit 201 includes an infrared sensing module, a brightness sensing module and a sound pickup module;
the processing unit 202 is configured to:
receiving the data collected by the infrared sensing module, the brightness sensing module and/or the sound pickup module; and
determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module and/or the sound pickup module.
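The mapping from raw sensor data to an interactive operation instruction might be sketched as follows; the lux threshold, wake keyword and instruction names are illustrative assumptions rather than values from the specification:

```python
from typing import Optional

def classify_interaction(ir_triggered: bool, lux: float,
                         audio_keyword: Optional[str]) -> Optional[str]:
    """Derive an interactive operation instruction from the data of the
    infrared sensing, brightness sensing and sound pickup modules."""
    if audio_keyword == "wake":          # pickup module recognized the wake word
        return "voice_wake"
    if ir_triggered and lux >= 50.0:     # person detected, ambient light bright
        return "person_present_bright"
    if ir_triggered and lux < 50.0:      # person detected, ambient light dim
        return "person_present_dim"
    return None                          # no interaction recognized
```

Distinguishing the bright and dim cases at this stage is what lets the later judgment give different feedback for "person present with light bright" versus "person present with light dim".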
When the interactive operation instruction is obtained by analyzing data collected by the infrared sensing module, the brightness sensing module and/or the sound pickup module, a multi-modal interactive light-emitting device is formed, that is, an intelligent device that can recognize human behavior, understand it and interact with it. Under different activity levels, distinct acousto-optic feedback and judgment mechanisms are provided for each interaction, such as the user waking the device by voice, the infrared sensor detecting a person while the light is bright, the infrared sensor detecting a person while the light is dim, the user clapping three times, the user waving a hand over the top of the device, the user blowing toward the top of the device, and the user turning the light on during the sleep period while the surroundings are dim, with a logic judgment decision made for each case. The overall multi-modal interaction judgment framework can handle complex conditional logic rather than simple one-to-one mappings: by jointly evaluating the ambient light, the current time and the user's behavior, it gives the most appropriate feedback for the current situation, which improves the fault tolerance of the device, keeps the false-trigger rate very low and yields good linkage between modalities.
In view of the above object, a third aspect of the embodiments of the present invention provides an embodiment of an apparatus for performing the interaction method. Fig. 3 is a schematic hardware structure diagram of an embodiment of an apparatus for performing the interaction method according to the present invention.
As shown in fig. 3, the apparatus includes:
one or more processors 301 and a memory 302; one processor 301 is taken as an example in Fig. 3.
The apparatus for performing the interaction method may further include: an input device 303 and an output device 304.
The processor 301, the memory 302, the input device 303 and the output device 304 may be connected by a bus or in other ways; connection by a bus is taken as an example in Fig. 3.
The memory 302 is a non-volatile computer-readable storage medium and can be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the program instructions/modules corresponding to the interaction method in the embodiments of the present application (for example, the interaction triggering unit 201, the light emitting unit 203 and the processing unit 202 shown in Fig. 2). By running the non-volatile software programs, instructions and modules stored in the memory 302, the processor 301 executes the various functional applications and data processing of the device, that is, implements the interaction method of the above method embodiments.
The memory 302 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function, and the data storage area may store data created according to the use of the light-emitting device, and the like. Further, the memory 302 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some embodiments, the memory 302 may optionally include memory located remotely from the processor 301, which may be connected to the light-emitting device via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 303 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the light emitting device. The output means 304 may comprise a display device such as a display screen.
The one or more modules are stored in the memory 302 and, when executed by the one or more processors 301, perform the interaction method of any of the method embodiments described above. The technical effect of the embodiment of the device for executing the interaction method is the same as or similar to that of any method embodiment.
Embodiments of the present application provide a non-transitory computer storage medium storing computer-executable instructions that can execute the interaction method in any of the above method embodiments. The technical effect of the embodiments of the non-transitory computer storage medium is the same as or similar to that of any of the method embodiments described above.
Finally, it should be noted that, as will be understood by those skilled in the art, all or part of the processes in the methods of the above embodiments may be implemented by a computer program that can be stored in a computer-readable storage medium and that, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like. The technical effect of the embodiment of the computer program is the same as or similar to that of any of the method embodiments described above.
Furthermore, the apparatuses, devices, etc. described in the present disclosure may be various electronic terminal devices, such as a mobile phone, a Personal Digital Assistant (PDA), a tablet computer (PAD), a smart television, etc., and may also be large terminal devices, such as a server, etc., and therefore the scope of protection of the present disclosure should not be limited to a specific type of apparatus, device. The client disclosed by the present disclosure may be applied to any one of the above electronic terminal devices in the form of electronic hardware, computer software, or a combination of both.
Furthermore, the method according to the present disclosure may also be implemented as a computer program executed by a CPU, which may be stored in a computer-readable storage medium. The computer program, when executed by the CPU, performs the above-described functions defined in the method of the present disclosure.
Further, the above method steps and system elements may also be implemented using a controller and a computer readable storage medium for storing a computer program for causing the controller to implement the functions of the above steps or elements.
Further, it should be appreciated that the computer-readable storage media (e.g., memory) described herein can be volatile memory or non-volatile memory, or can include both volatile and non-volatile memory. By way of example, and not limitation, non-volatile memory can include Read-Only Memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which can act as external cache memory. By way of example and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The storage devices of the disclosed aspects are intended to comprise, without being limited to, these and other suitable types of memory.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as software or hardware depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with the following components designed to perform the functions described herein: a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the disclosure herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or those wireless technologies are included in the definition of medium. Disk and disc, as used herein, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
It should be noted, however, that various changes and modifications could be made to the disclosed exemplary embodiments herein without departing from the scope of the disclosure as defined by the appended claims. The functions, steps and/or actions of the method claims in accordance with the disclosed embodiments described herein need not be performed in any particular order. Furthermore, although elements of the disclosure may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.
It should be understood that, as used herein, the singular forms "a," "an," "the" are intended to include the plural forms as well, unless the context clearly supports the exception. It should also be understood that "and/or" as used herein is meant to include any and all possible combinations of one or more of the associated listed items.
The above-mentioned serial numbers of the embodiments of the present disclosure are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is merely exemplary and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples. Within the spirit of the embodiments of the present invention, technical features in the above embodiments or in different embodiments may also be combined, and many other variations of the different aspects described above exist that are not provided in detail for the sake of brevity. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made within the spirit and principles of the embodiments of the present invention are intended to be included within the scope of the embodiments of the present invention.