WO2023034631A1 - Systems for interpreting a digit-to-digit gesture by a user differently based on roll values of a wrist-wearable device worn by the user, and methods of use thereof - Google Patents
- Publication number
- WO2023034631A1 (PCT application PCT/US2022/042612)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wrist
- digit
- wearable device
- user
- target device
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Definitions
- This application relates generally to interpretation of digit-to-digit gestures at a wrist-wearable device, and, more specifically, to interpreting a digit-to-digit gesture (e.g., one finger touching a thumb) by a user differently based on roll values of the wrist-wearable device (e.g., different input commands are caused to be performed at a device controlled by the wrist-wearable device if the digit-to-digit gesture is performed while the wrist-wearable device has a first roll value as compared to when the wrist-wearable device has a second roll value).
- IMUs of consumer electronic devices are able to provide yaw, pitch, and roll values for the consumer electronic devices. These values can be used in conjunction with games on the consumer electronic devices, e.g., to allow a user to move the device around in three-dimensional space to cause different changes in direction of a race car, as one example.
- the systems and methods described herein address one or more of the above-mentioned drawbacks by allowing a user to provide digit-to-digit gestures detected at a wrist-wearable device while the wrist-wearable device has different roll values (e.g., roll values determined by an inertial measurement unit of the wrist-wearable device).
- If the wrist-wearable device detects that a user is intending to perform a digit-to-digit gesture (e.g., intending to cause their index finger to contact their thumb) while the wrist-wearable device has a first roll value (e.g., the display is pointing toward a ceiling), then the wrist-wearable device can cause performance of a first command (either at the wrist-wearable device itself or at a device that is being controlled by the wrist-wearable device, such as a television).
- If, however, the digit-to-digit gesture is detected while the wrist-wearable device has a second roll value that is distinct from the first roll value, then the wrist-wearable device can cause performance of a second command that is distinct from the first command (either at the wrist-wearable device itself or at a device that is being controlled by the wrist-wearable device, such as a television).
- This example technique has been determined to provide an expanded set of gestures (e.g., the same digit-to-digit command can be interpreted as a different gesture based on the roll value of the wrist-wearable device, essentially turning one digit-to-digit gesture into multiple different digit-to-digit gestures with associated roll values) and to make it seamless and efficient for users to control devices using a wrist-wearable device (e.g., a user can seamlessly provide various digit-to-digit commands to control a television without needing to look down at the wrist-wearable device to allow them to perform the gestures).
- a method of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device is provided.
- the method is performed at a wrist-wearable device including a display and one or more sensors configured to detect yaw, pitch, and roll values (e.g., angular values as discussed in more detail below) for the wearable device.
- the method includes receiving an indication at a first point in time that a user donning the wrist-wearable device is providing a digit-to-digit gesture in which one of the user’s digits touches another of the user’s digits without contacting the display of the wearable device.
- the method further includes, in accordance with a determination that the digit-to-digit gesture is provided while data from the one or more sensors indicates that the wrist-wearable device has a first roll value, causing a target device that is in communication with the wearable device to perform a first input command.
- the method further includes, receiving another indication at a second point in time that is after the first point in time that the user is providing the digit-to-digit gesture again, and in accordance with a determination that the digit-to-digit gesture is provided again while data from the one or more sensors indicates that the wearable device has a second roll value that is distinct from the first roll value, causing the target device to perform a second input command that is distinct from the first input command.
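- To make the roll-based disambiguation concrete, the following is a minimal illustrative sketch (in Python) of how a detected digit-to-digit gesture could be dispatched to different input commands depending on the device's current roll value. The roll bands, command names, and the send_command callable are assumptions for illustration, not details taken from the specification.

```python
# Illustrative sketch only: dispatch a digit-to-digit gesture to different
# input commands based on the wrist-wearable device's roll value.
# The roll bands, command names, and send_command callable are assumptions.
from typing import Callable, Optional

ROLL_BANDS = [
    ((-45.0, 45.0), "pause"),         # display roughly facing the ceiling
    ((45.0, 135.0), "fast_forward"),  # wrist rotated ~90 degrees clockwise
]

def command_for_roll(roll_deg: float) -> Optional[str]:
    """Return the input command associated with the current roll band, if any."""
    for (low, high), command in ROLL_BANDS:
        if low <= roll_deg < high:
            return command
    return None

def on_digit_to_digit_gesture(roll_deg: float, send_command: Callable[[str], None]) -> None:
    """Called when the device detects (or predicts) a digit-to-digit gesture."""
    command = command_for_roll(roll_deg)
    if command is not None:
        send_command(command)  # e.g., forwarded to the target device (a television)

# The same gesture maps to different commands at different roll values:
on_digit_to_digit_gesture(10.0, print)   # prints "pause"
on_digit_to_digit_gesture(90.0, print)   # prints "fast_forward"
```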
- FIG. 1C depicts performance of a digit-to-digit gesture by user 110 while the one or more sensors (e.g., an IMU) indicate that the wearable device 114 has a first roll value, which then causes a target device (e.g., television 102 in Figure 1C) that is communicatively coupled to the wearable device to perform a first input command (e.g., a pause command as shown in Figure 1C).
- the indication received at the first point in time and the other indication received at the second point in time are both detected based on neuromuscular signals sensed by an Electromyography (EMG) sensor of the one or more sensors of the wearable device.
- the indication received at the first point in time and the other indication received at the second point in time are both detected based on vibration signals sensed by an Inertial Measurement Unit (IMU) of the one or more sensors of the wrist-wearable device.
- the target device is separate and distinct from the wearable device.
- the method includes establishing a communication channel with the target device, without requiring any input from the user. For example, establishing a communication channel without requiring any input from the user helps to establish an improved man-machine interface in which the user is able to provide fewer inputs to achieve the desired outcome of controlling the target device. In this example, the user provides no inputs in order to achieve that desired outcome, thereby offering a significant improvement in the man-machine interface and helping to ensure that users are able to maintain a sustained interaction with the wearable device and also with the target device.
- an ultra-wideband (UWB) antenna of the wrist-wearable device is used for establishing the communication channel.
- the wearable device has UWB antennas that are used to transmit data and localize antennas.
- the device 114 can make use of the UWB antennas to search for and pair with the target device, in addition to exchanging authentication information in certain embodiments or circumstances.
- For example, a user may be signed into a social-media account (e.g., a Facebook account) on the target device, and the target device is already set up to authenticate with the device 114 that is also signed into the same social-media account. In this way, the communication channel can be quickly established via the UWB antennas.
- the method further includes — in response to detecting that the wrist-wearable device has moved closer to a second target device than to the target device — establishing a second communication channel with the second device instead of the target device, without requiring any input from the user.
- the first communication channel is then ceased or terminated.
- the method further includes, receiving an additional indication at a third point in time that the user intends to perform the digit-to-digit gesture once again.
- the method includes causing the second device to perform a third input command, using the second communication channel.
- An example of this is shown in Figure 1E, which is described in more detail below.
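- As a rough sketch of the proximity-based switching described above (and illustrated in Figure 1E), the device could keep a channel open to whichever paired target is currently nearest and switch automatically, without user input. The TargetDevice and ChannelManager classes and the meter values below are illustrative assumptions; in practice the range estimates could come from UWB ranging.

```python
# Illustrative sketch: keep a channel open to the nearest paired target device
# and switch without user input. Classes, method names, and distances are
# assumptions; real ranging data would come from UWB hardware.
class TargetDevice:
    def __init__(self, name):
        self.name = name

    def connect(self):
        print(f"Connection established with {self.name}")

    def disconnect(self):
        print(f"Connection to {self.name} closed")


class ChannelManager:
    def __init__(self):
        self.current = None

    def update(self, distances):
        """distances: dict mapping TargetDevice -> estimated range in meters."""
        nearest = min(distances, key=distances.get)
        if nearest is not self.current:
            if self.current is not None:
                self.current.disconnect()  # cease the first communication channel
            nearest.connect()              # establish the new channel automatically
            self.current = nearest


tv_living_room = TargetDevice("television 102")
tv_bedroom = TargetDevice("television 104")
manager = ChannelManager()
manager.update({tv_living_room: 2.0, tv_bedroom: 8.0})  # pairs with television 102
manager.update({tv_living_room: 9.0, tv_bedroom: 1.5})  # switches to television 104
```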
- the target device is the wrist-wearable device. Examples in which the target device is the wrist-wearable device are provided with reference to Figures 5A-5B.
- the method also includes controlling a user interface on the display of the wrist-wearable device based on additional sensor data from an Inertial Measurement Unit (IMU) of the one or more sensors of the wrist-wearable device.
- controlling the user interface on the display of the wrist-wearable device includes changing zoom levels on a map based on yaw values of the wrist-wearable device.
- (A14) In some embodiments of the method of any of (A1)-(A13), the method includes selecting different input commands as the first and second input commands based on whether biometric data sensed at the wrist-wearable device indicates that the user is a first user or a second user. For example, when the wrist-wearable device is used as a gaming controller in an artificial-reality environment, the commands used for different users can be configured differently. For instance, a first user might prefer that a jump command is executed when the first gesture is received while the wearable device has the first roll value, but a different second user might prefer that a sprinting command is executed when the first gesture is received while the wearable device is worn by the different, second user and the wearable device has the first roll value. This is further shown and described with reference to Figures 4A and 4B.
- the method also includes selecting sensed-value thresholds that are specific to the user, the sensed-value thresholds used by the wrist-wearable device for detecting one or both of (i) the digit-to-digit gesture and (ii) the first and second roll values for different users of the wrist-wearable device.
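- The following is a hedged sketch of how per-user command mappings and per-user sensed-value thresholds might be selected once the wearer is identified from biometric data. The user identifiers, threshold values, command names, and identify_user callable are assumptions for illustration rather than the claimed implementation.

```python
# Hedged sketch: biometric identification selects both the gesture-detection
# threshold and the roll-band-to-command mapping for the current wearer.
# User identifiers, thresholds, commands, and identify_user are assumptions.
USER_PROFILES = {
    "user_a": {
        "emg_threshold": 0.60,   # per-user sensed-value threshold for the gesture
        "commands": {"roll_flat": "jump", "roll_rotated": "crouch"},
    },
    "user_b": {
        "emg_threshold": 0.45,
        "commands": {"roll_flat": "sprint", "roll_rotated": "jump"},
    },
}

def select_profile(biometric_sample, identify_user):
    """Pick thresholds and command mappings for whoever is wearing the device."""
    user_id = identify_user(biometric_sample)  # e.g., a biometric matcher
    return USER_PROFILES[user_id]

# With a stand-in identifier, the same gesture/roll combination yields
# different commands for different wearers:
profile_a = select_profile("sample-a", lambda _s: "user_a")
profile_b = select_profile("sample-b", lambda _s: "user_b")
print(profile_a["commands"]["roll_flat"])   # -> jump
print(profile_b["commands"]["roll_flat"])   # -> sprint
```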
- the first input command is selected based on a device state of the target device.
- the device state includes information regarding an application that is executing on the target device when the indication is received.
- the application is one that is currently executing in a foreground such that a user interface for the application is currently visible on a screen of the target device, and the device state can provide information concerning what is currently displayed within the application while the wrist-wearable device receives the indication that the user intends to perform the digit-to-digit gesture.
- while the target device is displaying a first application, the first input command causes the target device to perform a first action; and while the target device is displaying a second application, the first input command also causes the target device to perform the first action.
- the same first action can be performed for two different applications.
- Alternatively, while the target device is displaying a first application, the first input command causes the target device to perform a first action; while the target device is displaying a second application, the first input command causes the target device to perform a second action distinct from the first action.
- different actions can be performed for two different applications.
- the first action may be going to a previous page in a web application and the second action may be causing an avatar to jump in a gaming application. Whether to perform same or different actions can be a preference that is configured by the user.
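- A small sketch of the device-state-dependent selection described above follows; the application names, action names, and the per-user preference flag are illustrative assumptions rather than the patent's implementation.

```python
# Illustrative sketch: select the input command based on which application is
# in the target device's foreground, with a user preference controlling
# whether the same action applies across applications. Names are assumptions.
DEFAULT_ACTION = "previous_page"   # e.g., go to a previous page in a web application

STATE_ACTIONS = {
    "web_browser": "previous_page",
    "game": "avatar_jump",          # e.g., cause an avatar to jump in a gaming application
}

def command_for_state(foreground_app: str, same_action_for_all_apps: bool) -> str:
    """Return the action the first input command should trigger for this device state."""
    if same_action_for_all_apps:
        return DEFAULT_ACTION       # same first action for different applications
    return STATE_ACTIONS.get(foreground_app, DEFAULT_ACTION)

print(command_for_state("web_browser", same_action_for_all_apps=False))  # previous_page
print(command_for_state("game", same_action_for_all_apps=False))         # avatar_jump
print(command_for_state("game", same_action_for_all_apps=True))          # previous_page
```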
- (B1) A wrist-wearable device for controlling devices, the wrist-wearable device configured to perform or cause performance of the method of any of (A1)-(A19).
- (B2) A capsule portion of a wrist-wearable device, wherein: the capsule is configured to couple with a band to form a wrist-wearable device, and the capsule includes one or more processors configured to perform or cause performance of the method of any of (A1)-(A19).
- (B3) A non-transitory, computer-readable storage medium including instructions that, when executed by a wrist-wearable device, cause the wrist-wearable device to perform or cause performance of the method of any of (A1)-(A19).
- (B4) A system including a wrist-wearable device and a target device, wherein the wrist-wearable device is configured to perform or cause performance of the method of any of (A1)-(A19).
- a wrist-wearable device that includes means for performing or causing performance of the methods of any of (A1)-(A19).
- Figures 1A-1E graphically depict a sequence in which a user is able to provide digit-to-digit gestures that are detected by a wrist-wearable device while it has different roll values and to then cause performance of commands at a target device (e.g., a television), in accordance with some embodiments.
- Figure 1A depicts the user with the wrist-wearable device being too far away from the target device to establish a connection, in accordance with some embodiments.
- Figure 1B depicts the user having moved closer to the target device to then establish a connection, in accordance with some embodiments.
- Figure 1C next depicts that the user performs a digit-to-digit gesture while the wrist-wearable device has a first roll value, which then causes the target device to execute a video-playback command, in accordance with some embodiments.
- Figure 1D next depicts that the user performs the same digit-to-digit gesture while the wrist-wearable device has a second roll value, which then causes the target device to execute a different video-playback command, in accordance with some embodiments.
- Figure 1E depicts that once the user moves closer to a different target device, a connection with that different target device is then established, in accordance with some embodiments.
- Figures 2A-2B depict that digit-to-digit gestures detected by the wrist-wearable device, while it has various roll values, can also cause performance of different commands related to navigation through photos at a target device, in accordance with some embodiments.
- Figures 3A-3B depict that digit-to-digit gestures detected by the wrist-wearable device, while it has various roll values, can also cause performance of different commands related to controlling a gaming interface at the target device, in accordance with some embodiments.
- Figures 4A-4B depict that digit-to-digit gestures detected by the wrist-wearable device, while it has the same roll values, can also cause performance of different commands depending on the user. Specifically, Figure 4A depicts photos navigation control and Figure 4B depicts controlling a gaming interface at the target device, in accordance with some embodiments.
- Figures 5A-5B depict that digit-to-digit gestures detected by the wrist-wearable device, while it has various yaw values, can also cause performance of different commands related to altering zoom levels in a maps application, at the target device, in accordance with some embodiments.
- Figures 6A-6D show a flow diagram of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device, in accordance with some embodiments.
- Figures 7A and 7B illustrate an example wrist-wearable device, in accordance with some embodiments.
- Figure 8 is a block diagram of a wrist-wearable device system, in accordance with some embodiments.
- Figure 9 is a flow diagram of an exemplary method for sensing gestures via vibration-sensitive wearables donned by users of artificial-reality systems, in accordance with some embodiments.
- First setting 100 may include first environment 120 and second environment 122, which can be different rooms of a house or office building.
- Wrist-wearable device 114 can be configured to interpret a digit-to-digit gesture (e.g., one finger touching a thumb) into many different available gestures, by disambiguating which of the different gestures was intended by reference to a roll value at the device 114 when the digit-to-digit gesture was provided. In this way, a user can simply rotate their wrist to different positions (causing the IMU to detect different roll values for the device 114) and then provide digit-to-digit gestures while the device 114 has the different roll values, thereby opening up a rich set of new gestures.
- Detection of the digit-to-digit gesture itself can be done using Electromyography (EMG) sensors (e.g., EMG Sensor 1046 in Figure 8) configured to detect a user’s intention to perform a gesture prior to the user performing the digit-to-digit gesture, as well as (or as alternatives to) using data from an Inertial Measurement Unit (IMU) sensor(s) (e.g., IMU sensor 1042 in Figure 8).
- the device 114 can also make use of yaw and/or pitch values from the IMU sensor 1042 to enable additional interactions at the device 114.
- one digit-to-digit gesture can allow for a number of different gestures, simply by looking at the roll values when the gesture was provided. This also allows for an improved man-machine interface, in which the user 110 is able to provide fewer inputs to achieve desired outcomes. In some instances, the desired outcome may be controlling a target device through a wrist-wearable device with minimal user input (and, additionally or alternatively, without a user needing to look down at the device 114 while performing the digit-to-digit gesture).
- the digit-to-digit gesture can include any action in which one of the user’s fingers is intended to touch another of the user’s fingers without contacting the display of the wearable device (e.g., an index finger touching a thumb also known as a digit-to-digit pinch gesture).
- Example digit-to-digit gestures that are performed while the device 114 has different roll values are described below, in particular in reference to Figures 1C (digit-to-digit gesture performed while the roll value of the device 114 is such that the display of the device 114 is facing toward a ceiling); 1D (digit-to-digit gesture performed while the roll value of the device 114 is such that the display of the device 114 is not facing toward the ceiling); 2A-2B (examples of the same digit-to-digit gesture performed while device 114 has different roll positions, resulting in performance of different photo-navigation-related commands); 3A-3B (examples of the same digit-to-digit gesture performed while device 114 has different roll positions, resulting in performance of different gaming-related commands); among others.
- FIG. 1B depicts the user, having moved closer to the target device, then establishing a connection, in accordance with some embodiments.
- a communication channel 101 may be established with any target device present in first environment 120 (e.g., television 102, or any other target device such as a computer or mobile phone).
- the target device may be distinct from the wrist-wearable device 114 or may be the wrist-wearable device 114.
- Establishing a communication channel 101 without requiring any input from the user helps to ensure an improved man-machine interface and sustained interactions, in which the user 110 is able to provide fewer inputs to achieve desired outcomes (e.g., controlling the target device, such as television 102).
- the user 110 provides no inputs in order to establish a communication channel 101 with the target device (e.g., television 102), thereby offering a significant improvement to the man-machine interface and helping to ensure that users are able to seamlessly maintain a sustained interaction with the wrist-wearable device 114 and also with television 102, as will be described in more detail below.
- Although the target device is a television 102 in this example, the target device can be any device that is separate and distinct from the wrist-wearable device 114.
- Target devices can include any device with communication channel connectivity capabilities.
- target devices can include a television 102 in first environment 120 or television 104 in second environment 122. Broken lines leading from wrist-wearable device 114 to target devices (e.g., television 102) are used to indicate an established communication channel 101.
- ultra-wideband (UWB) antennas may be used for establishing the communication channel.
- communication channels 101 may be established based on a distance between the wrist-wearable device 114 and the target device, without requiring any input from the user. For example, in response to the wrist-wearable device 114 detecting that the user 110 is closer to a first target device than to a second target device, the wrist-wearable device 114 will establish a communication channel 101 with the first target device instead of the second target device.
- an application on the wrist-wearable device 114 may offer a list of paired devices, and the user 110 may select which device to control with the wearable device 114.
- User 110 may be wearing wrist-wearable device 114 on their wrist.
- the wrist-wearable device 114 may be used to control other devices, such as target devices, based on inputs at the wearable device interface itself, such as digit-to-digit gestures that can be detected based on sensed neuromuscular signals and/or IMU sensor data.
- the digit-to-digit gestures provided at the wrist-wearable device 114 can allow for controlling a mouse or pointer on a target device (such as a mouse or pointer displayed on a television or computer monitor).
- the target device may be separate and distinct from the wrist-wearable device or it may be the same device as the wearable device 114.
- the wrist-wearable device 114 can also function as a human-interface device for any electronics that supports the combined keyboard and mouse input modality (e.g., all Android and iOS devices, computers running Windows, Linux, Mac, certain embedded systems, etc.) once the wearable device has paired with them.
- FIG. 1C next depicts that the user performs a digit-to-digit gesture while the wrist-wearable device has a first roll value, which then causes the target device to execute a video-playback command, in accordance with some embodiments.
- the wrist-wearable device 114 may receive an indication at a first point in time, for example at 12:00 PM, that the user 110 intends to perform a digit-to-digit gesture 106.
- the intention to perform the digit-to-digit gesture 106 may be sensed based on neuromuscular signals that are detected using an EMG sensor 1046.
- the neuromuscular signals can allow the device 114 to determine that a user is going to perform the digit-to-digit gesture before the motor actions to cause the digit-to-digit gesture actually occur.
- data from an IMU sensor 1042 (such as a vibration sensor that can pick up vibrations at the wrist-wearable device 114 that are caused by the user's performance of the digit-to-digit gesture) can be used to detect actual performance of the digit-to-digit gesture; additional details regarding how IMU sensor data can be used to detect digit-to-digit gestures are provided below in reference to Figure 9.
- Representations of sensed values detected by the EMG sensors are shown in EMG sensor plot 132 and in IMU sensor vibration data plot 134.
- the DtD-0 threshold need not be a fixed quantity, but can reflect a process in which sensed values are analyzed using a machine-learning model to make a determination that a digit-to-digit gesture is intended or actually being performed (the threshold, in this sense, can then reflect the machine-learning model's determination that the data that was input to the model, which can be EMG signals and/or vibration data, indicates that a digit-to-digit gesture is intended or being performed).
- multiple different thresholds are stored to allow for detection of different types of digit-to-digit gestures, such as one threshold for the index finger contacting a user's thumb, a different threshold for the middle finger contacting a user's thumb, etc.
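- The threshold idea above can be sketched as follows: windows of EMG and vibration samples are scored (here by a simple stand-in for what could be a trained machine-learning model), and a gesture is reported when the score crosses a per-gesture threshold. The threshold values and the scoring function are assumptions for illustration only.

```python
# Illustrative sketch: per-gesture thresholds applied to a score computed from
# EMG and vibration windows. Thresholds and the scoring rule are assumptions;
# in practice the score could come from a trained machine-learning model.
GESTURE_THRESHOLDS = {
    "index_to_thumb": 0.7,   # one threshold for the index finger contacting the thumb
    "middle_to_thumb": 0.8,  # a different threshold for the middle finger contacting the thumb
}

def score_window(emg_window, vibration_window):
    """Stand-in for a model mapping sensor windows to a confidence in [0, 1]."""
    emg_mean = sum(emg_window) / len(emg_window)
    vib_mean = sum(vibration_window) / len(vibration_window)
    return 0.5 * emg_mean + 0.5 * vib_mean

def detect_gesture(emg_window, vibration_window):
    """Return the first gesture whose threshold is crossed, or None."""
    score = score_window(emg_window, vibration_window)
    for gesture, threshold in GESTURE_THRESHOLDS.items():
        if score >= threshold:
            return gesture
    return None

print(detect_gesture([0.9, 0.8, 0.85], [0.7, 0.75, 0.8]))  # -> "index_to_thumb"
```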
- the digit-to-digit gesture 106 that is detected is a digit-to-digit pinch gesture in which the user 110 has moved their index finger to contact their thumb.
- the wrist-wearable device 114 may cause the television 102 to perform a first input command (e.g., a pause command as shown on television 102 in Figure 1C). As shown on television 102, this input command may control video playback on the target device.
- Figure 1D next depicts that the user 110 performs the same digit-to-digit gesture 106 while the wrist-wearable device 114 has a second roll value, which then causes the target device (e.g., television 102) to execute a different video-playback command, in accordance with some embodiments.
- Yaw, pitch, and roll plot 130 provides a rough illustration showing that the IMU sensor 1042 of the device 114 can provide angular estimates for each of the yaw, pitch, and roll values. These values change as the user moves their wrist around to different positions.
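- One common way an IMU-based system can derive a roll estimate is from the gravity vector measured by the accelerometer; the short sketch below illustrates that idea. The axis convention is an assumption, and a real device would typically fuse gyroscope and accelerometer data rather than relying on a static gravity reading alone.

```python
# Rough sketch of estimating roll from a static accelerometer reading.
# Axis convention is assumed; production systems would fuse gyro + accel data.
import math

def roll_from_accel(ax: float, ay: float, az: float) -> float:
    """Roll in degrees from a gravity measurement (device-frame axes assumed)."""
    return math.degrees(math.atan2(ay, az))

print(roll_from_accel(0.0, 0.0, 1.0))   # display facing up          -> 0.0 degrees
print(roll_from_accel(0.0, 1.0, 0.0))   # wrist rotated ~90 degrees  -> 90.0 degrees
```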
- an indication may be received by the EMG sensors that the user 110 intends to perform the gesture 106 once again after having already performed the gesture 106 at the first point in time that was before the second point in time.
- the intention to perform the digit-to-digit gesture 106 may be sensed based on neuromuscular signals sensed using an EMG sensor 1046 (and/or based on vibration data from an IMU sensor 1042).
- Yaw, pitch, and roll plot 130 also shows in Figure 1D that a roll value of the device 114 has shifted to be a second roll value that is different from the first roll value.
- When the digit-to-digit gesture 106 is performed in the example of Figure 1D, it is performed after the user has rotated their left wrist in a clockwise direction such that the display of the device 114 is no longer pointing toward the ceiling.
- the wrist-wearable device 114 may cause the target device to perform a second input command that is distinct from the first input command (e.g., the second input command can be a fast-forward command as shown on television 102 in Figure 1D). As shown on television 102, this input command may control video playback on the target device.
- one digit-to-digit gesture 106 may allow for a number of different gestures, causing performance of different input commands, simply by having the device 114 look at the roll values when each digit-to-digit gesture is detected. This is reflected between Figures 1C and 1D, where one digit-to-digit gesture allows for different video playback commands simply because of the differing roll values.
- Figure 1E depicts that once the user moves closer to a different target device, a connection with that different target device is then established, in accordance with some embodiments.
- wrist-wearable device 114 may disconnect the communication channel with TV 102 in first environment 120 and establish a new communication channel with a second target device due to wrist-wearable device 114 detecting that it has moved closer to a second television 104 in second environment 122.
- the determinations as to how close a user is to different target devices can be performed using data exchanged via ultra-wideband (UWB) antennas (of the device 114 and of the target device or devices).
- digit-to-digit gestures performed at the device 114 cause performance of commands at the new target device (e.g., second television 104) instead of the target device (e.g., television 102).
- the new target device presents a message indicating as much (e.g., a message of “Connection Established” is displayed on new target device 104 once the new communication channel has been established).
- While the examples of Figures 1A-1E relate to control of video playback, other input commands can also be executed based on detected digit-to-digit gestures.
- Figures 2A-2B depict that digit-to-digit gestures detected by the wrist-wearable device while it has various roll values can also cause performance of different commands related to navigation through photos at a target device, in accordance with some embodiments.
- the input commands to be performed responsive to digit-to-digit gestures can be based on a device state for the target device (such as a state indicating which application is displayed in a foreground of the target device, such that appropriate input commands are chosen for the current foreground application).
- user 110 may need to access photos on television 102.
- the wrist-wearable device 114 may receive an indication at a first point in time, for example at 1:00 PM, that the user 110 intends to perform a pinch gesture.
- the digit-to-digit gesture 106 can be detected based on sensor values from an EMG sensor 1046 and/or an IMU sensor 1042 (as was explained above).
- the plots in Figures 2A-2B also show that sensor values have crossed the thresholds indicating that a digit-to-digit gesture has been detected.
- Because the digit-to-digit gesture in the example of Figure 2A is detected while the one or more sensors (e.g., IMU data that can be used to estimate yaw, pitch, and roll values) indicate that the wrist-wearable device 114 has a first roll value (e.g., watch facing upwards or facing the sky), the wrist-wearable device 114 causes the television 102 to perform a first input command (e.g., a photos application selection in Figure 2A).
- Figure 2B next depicts that the same digit-to-digit gesture 106 from the example of Figure 2A (e.g., movement of the user’s index finger and thumb such that they contact one another) is detected by the wrist-wearable device while it has a second roll value different from the first roll value in Figure 2A, in accordance with some embodiments.
- an indication may be received by the EMG sensors that the user 110 intends to perform the gesture 106. Because the digit-to-digit gesture 106 is detected while the device 114 has the second roll value that is different from the first roll value (e.g., the display of device 114 is no longer facing toward the ceiling and has instead been rotated about 90 degrees in a clockwise direction in Figure 2B relative to its position in Figure 2A), the target device is caused to perform a different input command (relative to the first input command of the Figure 2A example), which causes navigation through the photos at the target device.
- the second input command is a photos navigation command as shown on television 102 in Figure 2B, in which the target device (e.g., television 102) is caused to navigate through the user’s photos (different than the photo selection command depicted in Figure 2A in which an individual photo was selected for display on the target device).
- one digit-to-digit gesture 106 can allow for a number of different gestures, using roll values as a disambiguating variable to interpret the same digit-to-digit gesture 106 to cause performance of many different input commands.
- Figures 3A-3B provide one additional example.
- Figures 3A-3B depict that digit-to-digit gestures detected by the wrist-wearable device 114 while it has various roll values can also cause performance of different commands related to controlling a gaming interface at the target device.
- user 110 may wish to interact with a gaming interface on television 102.
- the wrist-wearable device 114 may receive an indication at a first point in time, for example at 2:00 PM, that the user 110 intends to perform a digit-to-digit gesture 106.
- the target device is caused to perform a different input command (relative to the first input command of the Figure 3A example), which causes a user in a gaming interface to run.
- the second input command causes a user in a gaming interface to run (different than the user selection command depicted in Figure 3A, in which a user was selected for display on the target device).
- the device 114 can also be a device that is shared among different users. In such circumstances, interpretation of digit-to-digit gestures at the device 114 can be customized to each of these different users.
- Figures 4A-4B depict that performance of the same digit-to-digit gesture by different users (a first user in Figure 4A and a different second user in Figure 4B), while the device 114 has the same roll value (e.g., when the first user performs the digit-to-digit gesture in the example of Figure 4A, the device 114 has a certain roll value; when the second user performs the digit-to-digit gesture in the example of Figure 4B, the device 114 has the same certain roll value), can also cause performance of different commands depending on which user is wearing the wrist-wearable device. In this way, the wrist-wearable device 114 can personalize its interpretation of digit-to-digit gestures for specific users.
- Figure 4A depicts a first user 150 controlling photos navigation at target device 160 and Figure 4B depicts a second user 152 controlling a gaming interface at target device 162.
- both first user 150 and second user 152 are performing the same digit-to-digit gesture while the device 114 has the same roll value, yet the input command that results from performance of the gesture is different and specific to which user performed the gesture.
- Figures 4A-4B show that the two users are using different applications on the target device (e.g., photos application in Figure 4A; settings application in Figure 4B), it will be appreciated that this user-specific command interpretation also applies while different users interact with the same application. For example, if the first and second users were both using the same photos application (at different points in time), then the first user’s performance of the digit-to-digit gesture while the device 114 has a certain roll value can cause performance of the first photos input command (e.g., selection of an individual photo), while the second user’s performance of the digit-to-digit gesture while the device 114 has the same certain roll value can cause performance of a second photos input command (e.g., navigation through different photos).
- the mapping of digit-to-digit gestures to performance of certain input commands can be configured by each user during a setup or configuration process available at the device 114. As discussed in more detail below, determination of which user is using the device 114 can be based on biometric signals detected by the device 114, such that the device is able to determine which user is wearing the device 114 by analyzing these biometric signals to allow the device to select appropriate thresholds (for detecting digit-to-digit gestures for each individual user) and to select appropriate input commands based on each user’s configured preferences.
- Use of digit-to-digit gestures to cause performance of input commands is not limited to performance of such commands at remote or separate target devices.
- the device 114 itself can be the target device in certain embodiments or circumstances, such that the example input commands discussed above with respect to a target device that is a separate television device can instead be performed at the device 114 itself.
- yaw and pitch values can allow for control of other features at the device 114 (and also at remote or separate target devices).
- a user 402 moves their arm away from their head, which causes the yaw value of the wrist-wearable device 404 to move along the yaw axis approximately 45 degrees, resulting in a zoom-out on a map that is displayed on the display of the wrist-wearable device, in accordance with some embodiments.
- Figure 5B depicts user 402 moving their arm back toward the user's head, which causes the yaw value of the device 114 to move back toward its original position before the movement from Figure 5A, which results in a zoom-in on the map displayed on the display of the wrist-wearable device, in accordance with some embodiments.
- Figures 5A and 5B indicate that various yaw values may control changing zoom levels on a map of the wrist-wearable device 404.
- Figure 5A depicts user 402 moving wrist-wearable device 404 along the yaw axis approximately 45 degrees, resulting in a zoom level for the map of 50% (which can be a reduction from an initial zoom level of 100% while the device was closer to the user's head) on the wrist-wearable device 404.
- the arrow near the user’s body in Figure 5A illustrates that the user has moved their hand in a direction away from the user’s head, thereby causing the change in the estimated yaw value for the device 114.
- Figure 5B depicts user 402 moving wrist-wearable device 404 along the yaw axis approximately -35 degrees, resulting in a zoom level of 75% on the wrist-wearable device 404.
- the arrow near the user’s body in Figure 5B illustrates that the user has moved their hand in a direction back toward the user’s head, thereby causing the change in the estimated yaw value for the device 114.
- pitch changes at the device can also result in changing what is displayed at the device 114 (e.g., a tilt level can be adjusted based on changes in pitch values). In this way, the user can move their wrist around to cause corresponding changes in how the map (and what part of the map) is presented on the display of device 114.
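- The yaw-to-zoom behavior of Figures 5A-5B can be sketched as a simple mapping from the yaw angle to a zoom percentage; the linear form and its constants below are assumptions chosen only so that a ~45-degree yaw change corresponds to the 100%-to-50% change described above.

```python
# Illustrative sketch: map yaw (degrees away from the starting pose) to a map
# zoom percentage. The linear mapping and clamping limits are assumptions.
def zoom_for_yaw(yaw_deg: float, base_zoom: float = 100.0) -> float:
    zoom = base_zoom - (yaw_deg * 50.0 / 45.0)  # ~45 degrees of yaw halves the zoom
    return max(10.0, min(base_zoom, zoom))      # clamp to a sensible range

print(zoom_for_yaw(0.0))    # arm near the head       -> 100.0 (initial zoom level)
print(zoom_for_yaw(45.0))   # arm moved ~45 degrees   -> 50.0  (zoomed out, as in Figure 5A)
```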
- the use of yaw and/or pitch values can be reflected both at the device 114 and at the separate target device with which a communication channel has been established.
- the user is able to both control how an image or map is presented (as two nonlimiting examples), while also using digit-to-digit gestures to then drive performance of various input commands (at either or both of the device 114 and a separate target device).
- Figures 6A-6D show a flow diagram of operations associated with a method of interpreting a digit-to-digit gesture based on roll values for a wrist-wearable device, in accordance with some embodiments.
- a method 600 of controlling devices using a wrist-wearable device 114 (Figures 1A-4B) is provided.
- the method 600 can be performed at a wrist-wearable device 114 including a display and one or more sensors configured to detect yaw, pitch, and roll values for the wrist-wearable device 114.
- the method 600 includes receiving (602) an indication at a first point in time that a user donning the wrist-wearable device 114 is providing a digit-to-digit gesture in which one of the user's digits would touch another of the user's digits without contacting the display of the wrist-wearable device 114.
- a digit-to-digit gesture can include the user touching a finger to their thumb.
- the method 600 includes, in accordance with a determination that the digit-to-digit gesture is provided while data from the one or more sensors indicates that the wrist-wearable device 114 has a first roll value, causing (604) a target device that is in communication with the wrist-wearable device to perform a first input command.
- the target device is (606) separate and distinct from the wrist-wearable device 114.
- the target device is (612) the wrist-wearable device 114.
- the method 600 includes receiving (610) another indication at a second point in time that is after the first point in time that the user is providing the digit-to-digit gesture again.
- the indication received (612) at the first point in time and the other indication received at the second point in time are both detected based on neuromuscular signals sensed by Electromyography (EMG) sensors of the one or more sensors of the wrist-wearable device 114.
- the indication received (614) at the first point in time and the other indication received at the second point in time are both detected based on vibration signals sensed by Inertial Measurement Units (IMUs) of the one or more sensors of the wrist-wearable device 114.
- the method 600 includes, in accordance with a determination that the digit-to-digit gesture is provided again while data from the one or more sensors indicates that the wrist-wearable device 114 has a second roll value that is distinct from the first roll value, causing (615) the target device to perform a second input command that is distinct from the first input command.
- each of the first input command and the second input command controls (616) video playback at the target device.
- the method 600 includes controlling (622) a user interface on the display of the wrist-wearable device 114 based on additional sensor data from Inertial Measurement Units (IMUs) of the one or more sensors of the wrist-wearable device 114.
- controlling (624) the user interface on the display of the wrist-wearable device 114 includes changing zoom levels on a map based on yaw values of the wrist-wearable device 114.
- the method 600 includes selecting (626) different input commands as the first and second input commands based on whether biometric data sensed at the wrist-wearable device 114 indicates that the user is a first user or a second user.
- the commands used for different users can be configured differently. For instance, a first user might prefer that a jump command is executed when the first pinch gesture is received while the wrist-wearable device 114 has the first roll value, but a different second user might prefer that a sprinting command is executed when the first pinch gesture is received while the wrist-wearable device 114 is worn by the different second user and the wrist-wearable device 114 has the first roll value.
- the method 600 includes selecting (628) sensed-value thresholds that are specific to the user, the sensed-value thresholds used by the wrist-wearable device 114 for detecting one or both of (i) the digit-to-digit gesture and (ii) the first and second roll values for different users of the wrist-wearable device 114. Additional examples of the input commands and sensed-value thresholds are provided above in reference to Figures 1A-5B.
- the first input command is selected (630) based on a device state of the target device.
- the device state includes (632) information regarding an application that is executing on the target device when the indication is received.
- while the target device is displaying a first application, the first input command causes the target device to perform a first action (e.g., go to a previous page in a web application); while the target device is displaying a second application, the first input command also causes the target device to perform the first action (e.g., go to a previous page in a web application).
- Alternatively, while the target device is displaying a first application, the first input command causes the target device to perform a first action (e.g., go to a previous page in a web application); while the target device is displaying a second application, the first input command causes the target device to perform a second action distinct from the first action (e.g., cause an avatar to jump in a gaming application).
- the method 600 includes establishing (638) a communication channel with the target device, without requiring any input from the user. Establishing a communication channel without requiring any input from the user helps to establish an improved man-machine interface in which the user is able to provide fewer inputs to achieve the desired outcome of controlling the target device. In this example, the user provides no inputs in order to achieve that desired outcome, thereby offering a significant improvement in the man-machine interface and helping to ensure that users are able to maintain a sustained interaction with the wrist-wearable device 114 and also with the target device.
- an ultra-wideband (UWB) antenna of the wrist-wearable device 114 is used (640) for establishing the communication channel.
- the wrist-wearable device 114 has UWB antennas that are used to transmit data and localize antennas.
- the UWB antennas are used to search for and pair with the target device and exchange authentication information.
- a user may be signed into a Facebook account on the target device, and the target device is already set up to authenticate with the wrist-wearable device 114 that is signed into the same account.
- Bluetooth is used for controlling the target device.
- in response to detecting that the wrist-wearable device 114 has moved closer to a second target device distinct from the target device, the method 600 includes establishing (642-a) a second communication channel with the second target device instead of the target device, without requiring any input from the user. In some embodiments, the method 600 further includes receiving (642-b) an additional indication at a third point in time that the user intends to perform the digit-to-digit gesture once again. In some embodiments, the method 600 also includes, in accordance with a determination that the gesture that is once again provided is provided while the one or more sensors indicate that the wrist-wearable device 114 has a third roll value, causing the second target device to perform a third input command, using the second communication channel.
- FIGs 7A and 7B illustrate an example wrist-wearable device 950, in accordance with some embodiments.
- the wearable device 114 shown and described in reference to Figures 1A-5B can be an instance of the wrist-wearable device 950.
- Figure 7A illustrates a perspective view of the wrist-wearable device 950 that includes a watch body 954 decoupled from a watch band 962.
- Watch body 954 and watch band 962 can have a substantially rectangular or circular shape and can be configured to allow a user to wear the wrist-wearable device 950 on a body part (e.g., a wrist).
- the wrist-wearable device 950 can include a retaining mechanism 963 (e.g., a buckle, a hook and loop fastener, etc.) for securing watch band 962 to the user’s wrist.
- the wrist-wearable device 950 can also include a coupling mechanism 960 (e.g., a cradle) for detachably coupling watch body (or capsule) 954 (via a coupling surface 956 of the watch body 954) to watch band 962.
- Functions executed by the wrist-wearable device 950 can include, without limitation, display of visual content to the user (e.g., visual content displayed on display 115), sensing user input (e.g., sensing a touch on button 958, sensing biometric data on sensor 964, sensing neuromuscular signals on neuromuscular sensor 965, etc.), messaging (e.g., text, speech, video, etc.), image capture, wireless communications (e.g., cellular, near field, WiFi, personal area network, etc.), location determination, financial transactions, providing haptic feedback, alarms, notifications, biometric authentication, health monitoring, sleep monitoring, etc.
- the wrist-wearable device 950 can further perform any functions described above in reference to Figures 1A-6D.
- functions can be executed on the wrist- wearable device 950 in conjunction with an artificial-reality environment which includes, but is not limited to, virtual-reality (VR) environments (including non-immersive, semi-immersive, and fully immersive VR environments), augmented-reality environments (including marker-based augmented-reality environments, markerless augmented-reality environments, location-based augmented-reality environments, and projection-based augmented-reality environments), hybrid reality, and other types of mixed-reality environments.
- the watch band 962 can be configured to be worn by a user such that an inner surface of the watch band 962 is in contact with the user’s skin.
- when the wrist-wearable device 950 is worn, sensor 964 is in contact with the user’s skin.
- the sensor 964 can be a biosensor that senses a user’s heart rate, saturated oxygen level, temperature, sweat level, muscle intentions, or a combination thereof (any of these can be examples of the biometric sensor described above and used in conjunction with the positional-state determinations described herein, and can also be associated with the capsule portion instead of the band portion).
- the watch band 962 can include multiple sensors 964 that can be distributed on an inside and/or an outside surface of the watch band 962.
- the watch body 954 can include the same or different sensors than the watch band 962 (or the watch band 962 can include no sensors at all in some embodiments). For example, multiple sensors can be distributed on an inside and/or an outside surface of watch body 954.
- the watch body 954 (e.g., a capsule portion) can include, without limitation, a magnetic field sensor, an antenna return-loss sensor, a front-facing image sensor 925A and/or a rear-facing image sensor 925B, a biometric sensor, an IMU sensor 1042, a heart rate sensor, a saturated oxygen sensor, one or more neuromuscular sensors, an altimeter sensor, a temperature sensor, a bio-impedance sensor, a pedometer sensor, an optical sensor, a touch sensor (e.g., capacitive sensor 1089 shown in Figure 8), a sweat sensor, etc.
- the sensor 964 can also include a sensor that provides data about a user’s environment, including a user’s motion (e.g., an IMU sensor 1042), altitude, location, orientation, gait, or a combination thereof.
- the sensor 964 can also include a light sensor (e.g., an infrared light sensor, a visible light sensor) that is configured to track a position and/or motion of watch body 954 and/or watch band 962.
- Watch band 962 can transmit the data acquired by the sensor 964 to watch body 954 using a wired communication method (e.g., a universal asynchronous receiver/transmitter (UART), a universal serial bus (USB) transceiver, etc.) and/or a wireless communication method (e.g., near field communication, Bluetooth™, etc.).
- Watch band 962 can be configured to operate (e.g., to collect data using sensor 964) independent of whether watch body 954 is coupled to or decoupled from watch band 962.
- the watch band 962 and/or watch body 954 can include a haptic device 966 (e.g., a vibratory haptic actuator) that is configured to provide haptic feedback (e.g., a cutaneous and/or kinesthetic sensation, etc.) to the user’s skin.
- the sensor 964 and/or haptic device 966 can be configured to operate in conjunction with multiple applications including, without limitation, health monitoring, social media, game playing, and artificial reality (e.g., the applications associated with artificial reality).
- the watch band 962 can include a neuromuscular sensor 965 (e.g., an Electromyography (EMG) sensor, a mechanomyogram sensor, a sonomyography sensor, etc.).
- Neuromuscular sensor 965 can sense a user’s intention to perform certain motor actions (this neuromuscular sensor 965 can be another example of a sensor used as the biometric sensor in conjunction with the positional-state determinations described herein).
- the sensed muscle intention can be used to control certain user interfaces displayed on the display 115 of the wrist-wearable device 950 and/or can be transmitted to a device responsible for rendering an artificial-reality environment (e.g., a head-mounted display) to perform an action in an associated artificial-reality environment, such as to control the motion of a virtual device displayed to the user.
- Signals from neuromuscular sensor 965 can be used to provide a user with an enhanced interaction with a physical object and/or a virtual object in an artificial-reality application generated by an artificial-reality system (e.g., user interface objects presented on the display 115, or another computing device 650 (e.g., a head-mounted display)). Signals from neuromuscular sensor 965 can be obtained (e.g., sensed and recorded) by one or more neuromuscular sensors 965 of watch band 962.
- although Figure 7A shows one neuromuscular sensor 965, watch band 962 can include a plurality of neuromuscular sensors 965 arranged circumferentially on an inside surface of watch band 962 such that the plurality of neuromuscular sensors 965 contact the skin of the user.
- Neuromuscular sensor 965 can sense and record neuromuscular signals from the user as the user performs muscular activations (e.g., movements, gestures, etc.).
- the muscular activations performed by the user can include static gestures, such as placing the user’s hand palm down on a table; dynamic gestures, such as grasping a physical or virtual object; and covert gestures that are imperceptible to another person, such as slightly tensing a joint by co-contracting opposing muscles or using sub-muscular activations.
- the muscular activations performed by the user can include symbolic gestures (e.g., gestures mapped to other gestures, interactions, or commands, for example, based on a gesture vocabulary that specifies the mapping of gestures to commands).
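As an illustration of such a gesture vocabulary, the sketch below maps recognized gesture labels to command strings. The labels and command names are hypothetical placeholders invented for the example, not identifiers used by the wrist-wearable device 950.

```python
from typing import Optional

# Hypothetical gesture vocabulary: each recognized activation label maps to a
# command; both sides of the mapping are illustrative assumptions.
GESTURE_VOCABULARY = {
    "thumb_to_index_tap": "select",
    "thumb_to_middle_tap": "go_back",
    "fist_clench_hold": "open_menu",
    "palm_down_static": "dismiss_notification",
}

def resolve_command(gesture_label: str) -> Optional[str]:
    """Look up the command mapped to a recognized gesture label, if any."""
    return GESTURE_VOCABULARY.get(gesture_label)

print(resolve_command("thumb_to_index_tap"))    # -> select
print(resolve_command("unrecognized_gesture"))  # -> None
```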
- the wrist-wearable device 950 can include a coupling mechanism (also referred to as a cradle) for detachably coupling watch body 954 to watch band 962.
- a user can detach watch body 954 from watch band 962 in order to reduce the encumbrance of the wrist-wearable device 950 to the user.
- the wrist-wearable device 950 can include a coupling surface 956 on the watch body 954 and/or watch band coupling mechanism(s) 960 (e.g., a cradle, a tracker band, a support base or a clasp).
- a user can perform any type of motion to couple watch body 954 to watch band 962 and to decouple watch body 954 from watch band 962. For example, a user can twist, slide, turn, push, pull, or rotate watch body 954 relative to watch band 962, or a combination thereof, to attach watch body 954 to watch band 962 and to detach watch body 954 from watch band 962.
- watch band coupling mechanism 960 can include a type of frame or shell that allows watch body 954 coupling surface 956 to be retained within watch band coupling mechanism 960.
- Watch body 954 can be detachably coupled to watch band 962 through a friction fit, magnetic coupling, a rotation-based connector, a shear-pin coupler, a retention spring, one or more magnets, a clip, a pin shaft, a hook and loop fastener, or a combination thereof.
- watch body 954 can be decoupled from watch band 962 by actuation of release mechanism 970.
- the release mechanism 970 can include, without limitation, a button, a knob, a plunger, a handle, a lever, a fastener, a clasp, a dial, a latch, or a combination thereof.
- the wrist-wearable device 950 can include a single release mechanism 970 or multiple release mechanisms 970 (e.g., two release mechanisms 970 positioned on opposing sides of the wrist-wearable device 950, such as spring-loaded buttons).
- the release mechanism 970 can be positioned on watch body 954 and/or watch band coupling mechanism 960.
- although Figure 7A shows release mechanism 970 positioned at a corner of watch body 954 and at a corner of watch band coupling mechanism 960, the release mechanism 970 can be positioned anywhere on watch body 954 and/or watch band coupling mechanism 960 that is convenient for a user of wrist-wearable device 950 to actuate.
- a user of the wrist-wearable device 950 can actuate the release mechanism 970 by pushing, turning, lifting, depressing, shifting, or performing other actions on the release mechanism 970.
- Actuation of the release mechanism 970 can release (e.g., decouple) the watch body 954 from the watch band coupling mechanism 960 and the watch band 962, allowing the user to use the watch body 954 independently from watch band 962.
- decoupling the watch body 954 from the watch band 962 can allow the user to capture images using rear-facing image sensor 925B.
- Figure 7B is a side view of the wrist-wearable device 950.
- the wrist-wearable device 950 of Figure 7B can include a watch body interface 980 (another example of a cradle for the capsule portion of the wrist-wearable device 114).
- the watch body 954 can be detachably coupled to the watch body interface 980.
- Watch body 954 can be detachably coupled to watch body interface 980 through a friction fit, magnetic coupling, a rotation-based connector, a shear-pin coupler, a retention spring, one or more magnets, a clip, a pin shaft, a hook and loop fastener, or a combination thereof.
- watch body 954 can be decoupled from watch body interface 980 by actuation of a release mechanism.
- the release mechanism can include, without limitation, a button, a knob, a plunger, a handle, a lever, a fastener, a clasp, a dial, a latch, or a combination thereof.
- the wristband system functions can be executed independently in watch body 954, independently in watch body interface 980, and/or in communication between watch body 954 and watch body interface 980.
- Watch body interface 980 can be configured to operate independently (e.g., execute functions independently) from watch body 954. Additionally, or alternatively, watch body 954 can be configured to operate independently (e.g., execute functions independently) from watch body interface 980.
- Watch body interface 980 and/or watch body 954 can each include the independent resources required to independently execute functions.
- watch body interface 980 and/or watch body 954 can each include a power source (e.g., a battery), a memory, data storage, a processor (e.g., a CPU), communications, a light source, and/or input/output devices.
- watch body interface 980 can include all of the electronic components of watch band 962.
- one or more electronic components can be housed in watch body interface 980 and one or more other electronic components can be housed in portions of watch band 962 away from watch body interface 980.
- Figure 8 is a block diagram of a wrist-wearable device system 1000, according to at least one embodiment of the present disclosure.
- the wrist-wearable device 114 described in detail above is an example of the wrist-wearable device system 1000, so the wrist-wearable device 114 will be understood to include the components shown and described for the wrist-wearable device system 1000 below.
- the wrist-wearable device system 1000 can have a split architecture (e.g., a split mechanical architecture or a split electrical architecture) between a watch body 1004 (e.g., a capsule 954) and a watch band 1012 (e.g., a band portion 962/band portion 108), which was described above in reference to Figures 7A and 7B.
- Each of watch body 1004 and watch band 1012 can have a power source, a processor, a memory, sensors, a charging device, and a communications device that enables each of watch body 1004 and watch band 1012 to execute computing, controlling, communication, and sensing functions independently in watch body 1004, independently in watch band 1012, and/or in communication between watch body 1004 and watch band 1012.
- watch body 1004 can include capacitive sensor 1089, magnetic field sensor 1095, antenna return-loss (RL) sensor 124, biometric sensor 126, battery 1028, CPU 1026, storage 1002, heart rate sensor 1058, EMG sensor 1046, SpO2 sensor 1054, altimeter 1048, IMU sensor 1042, random access memory 1003, charging input 1030, and communication devices NFC 1015, LTE 1018, and WiFi/Bluetooth™ 1020.
- watch band 1012 can include battery 1038, microcontroller unit 1052, memory 1050, heart rate sensor 1058, EMG sensor 1046, SpO2 sensor 1054, altimeter 1048, IMU sensor 1042, charging input 1034 and wireless transceiver 1040.
- Memory 1050 may further include device state table 1091, an example of which is shown in Figure 1C.
- a level of functionality of at least one of watch band 1012 or watch body 1004 can be modified when watch body 1004 is detached from watch band 1012.
- the level of functionality that can be modified can include the functionality of at least one sensor (e.g., heart rate sensor 1058, EMG sensor 1046, etc.).
- Each of watch body 1004 and watch band 1012 can execute instructions stored in storage 1002 and memory 1050, respectively, that enable at least one sensor (e.g., heart rate sensor 1058, EMG sensor 1046, etc.) in watch band 1012 to acquire data both when watch band 1012 is detached from watch body 1004 and when watch band 1012 is attached to watch body 1004.
- Watch body 1004 and watch band 1012 can further execute instructions stored in storage 1002 and memory 1050, respectively, that enable watch band 1012 to transmit the acquired data to watch body 1004 (or another computing device such as a head-mounted display or other computing device 350; Figure 3) using wired communications 1027 and/or wireless transceiver 1040.
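The behavior just described, where the band keeps acquiring sensor data regardless of attachment and then chooses a wired or wireless path for transmission, could be organized along the lines of the following sketch. The function names and the simulated sensor values are assumptions made for illustration, not firmware interfaces from the disclosure.

```python
import random
from typing import Callable, Dict

def read_band_sensors() -> Dict[str, float]:
    # Stand-in for sampling the band's sensors (heart rate, EMG, etc.);
    # real firmware would read the hardware here.
    return {
        "heart_rate_bpm": float(random.randint(55, 100)),
        "emg_rms_uv": round(random.uniform(5.0, 40.0), 1),
    }

def send_wired(sample: Dict[str, float]) -> None:
    print("wired (UART) ->", sample)

def send_wireless(sample: Dict[str, float]) -> None:
    print("wireless (BLE) ->", sample)

def band_acquisition_loop(is_attached: Callable[[], bool], cycles: int = 3) -> None:
    """Keep acquiring data whether or not the body is attached, and pick a
    wired or wireless transmission path based on the attachment state."""
    for _ in range(cycles):
        sample = read_band_sensors()
        transmit = send_wired if is_attached() else send_wireless
        transmit(sample)

band_acquisition_loop(is_attached=lambda: False)  # detached: wireless path
```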
- watch body 1004 can display visual content to a user on touchscreen display 1013 (e.g., an instance of display 115) and play audio content on speaker 125.
- Watch body 1004 can receive user inputs such as audio input from microphone 127 and touch input from buttons 1024.
- Watch body 1004 can also receive inputs associated with a user’s location and/or surroundings. For example, watch body 1004 can receive location information from GPS 1016 and/or altimeter 1048 of watch band 1012.
- Watch body 1004 can receive image data from at least one image sensor 135 (e.g., a camera).
- Image sensor 135 can include front-facing image sensor 925A (Figure 7A) and/or rear-facing image sensor 925B (Figure 7B).
- Front-facing image sensor 925A and/or rear-facing image sensor 925B can capture wide-angle images of the area surrounding front-facing image sensor 925A and/or rear-facing image sensor 925B, such as hemispherical images (e.g., at least hemispherical, substantially spherical, etc.), 180-degree images, 360-degree area images, panoramic images, ultra-wide area images, or a combination thereof.
- front-facing image sensor 925A and/or rear-facing image sensor 925B can be configured to capture images having a range between 45 degrees and 360 degrees.
- certain input information received by watch body 1004 (e.g., user inputs, etc.) can be communicated to watch band 1012, and certain input information received by watch band 1012 (e.g., acquired sensor data, neuromuscular sensor data, etc.) can be communicated to watch body 1004.
- Watch body 1004 and watch band 1012 can receive a charge using a variety of techniques.
- watch body 1004 and watch band 1012 can use a wired charging assembly (e.g., power cords) to receive the charge.
- watch body 1004 and/or watch band 1012 can be configured for wireless charging.
- a portable charging device can be designed to mate with a portion of watch body 1004 and/or watch band 1012 and wirelessly deliver usable power to a battery of watch body 1004 and/or watch band 1012.
- Watch body 1004 and watch band 1012 can have independent power and charging sources to enable each to operate independently. Watch body 1004 and watch band 1012 can also share power (e.g., one can charge the other) via power management IC 1032 in watch body 1004 and power management IC 1036 in watch band 1012. Power management IC 1032 and power management IC 1036 can share power over power and ground conductors and/or over wireless charging antennas.
- Wrist-wearable device system 1000 can operate in conjunction with a health-monitoring application that acquires biometric and activity information associated with the user.
- the health-monitoring application can be designed to provide information to a user that is related to the user’s health.
- wrist-wearable device system 1000 can monitor a user’s physical activity by acquiring data from IMU sensor 1042 while simultaneously monitoring the user’s heart rate via heart rate sensor 1058 and saturated blood oxygen levels via SpO2 sensor 1054.
- CPU 1026 can process the acquired data and display health related information to the user on touchscreen display 1013.
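As a rough illustration of this kind of aggregation, the sketch below averages a few hypothetical IMU, heart-rate, and SpO2 samples into a summary that a device could then display. The sample values and field names are invented for the example.

```python
from statistics import mean

# Hypothetical raw samples standing in for IMU activity, heart-rate, and SpO2 readings.
imu_samples = [0.1, 0.4, 1.2, 0.9]      # activity magnitude (g)
heart_rate_samples = [72, 75, 78, 80]   # beats per minute
spo2_samples = [97, 96, 98, 97]         # percent

summary = {
    "avg_activity_g": round(mean(imu_samples), 2),
    "avg_heart_rate_bpm": round(mean(heart_rate_samples)),
    "avg_spo2_pct": round(mean(spo2_samples)),
}
# A real device would render this on its touchscreen display; here we print it.
print(summary)
```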
- Wrist-wearable device system 1000 can detect when watch body 1004 and watch band 1012 are connected to one another (e.g., mechanically connected and/or electrically or magnetically connected) or detached from one another.
- pin(s), power/ground connections 1060, wireless transceiver 1040, and/or wired communications 1027 can detect whether watch body 1004 and watch band 1012 are mechanically and/or electrically or magnetically connected to one another (e.g., detecting a disconnect between the one or more electrical contacts of power/ground connections 1060 and/or wired communications 1027).
- when watch body 1004 and watch band 1012 are mechanically and/or electrically disconnected from one another (e.g., watch body 1004 has been detached from watch band 1012 as described with reference to Figures 7A and 7B), watch body 1004 and/or watch band 1012 can operate with a modified level of functionality (e.g., reduced functionality) as compared to when watch body 1004 and watch band 1012 are mechanically and/or electrically connected to one another.
- the modified level of functionality (e.g., switching from full functionality to reduced functionality, or from reduced functionality to full functionality) can occur when the wrist-wearable device system 1000 determines that watch body 1004 and watch band 1012 are mechanically and/or electrically disconnected from one another, or connected to one another, respectively.
- Modifying the level of functionality can reduce power consumption in battery 1028 and/or battery 1038.
- for example, any of the sensors (e.g., heart rate sensor 1058, EMG sensor 1046, SpO2 sensor 1054, altimeter 1048, etc.), processors (e.g., CPU 1026, microcontroller unit 1052, etc.), communications elements (e.g., NFC 1015, GPS 1016, LTE 1018, WiFi/Bluetooth™ 1020, antennas 1093, etc.), or actuators (e.g., haptics 1022, 1049, etc.) can operate with reduced functionality while watch body 1004 and watch band 1012 are disconnected from one another.
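One way to think about this attach/detach-driven change in functionality is sketched below: a coupling event drops a list of subsystems to a reduced mode and restores them when the parts are reattached. The subsystem names and the two-level power model are simplifying assumptions for the example, not the device's actual power-management scheme.

```python
from enum import Enum

class PowerMode(Enum):
    FULL = "full"
    REDUCED = "reduced"

class Subsystem:
    def __init__(self, name: str):
        self.name = name
        self.mode = PowerMode.FULL

    def set_mode(self, mode: PowerMode) -> None:
        if self.mode is not mode:
            self.mode = mode
            print(f"{self.name}: {mode.value} functionality")

def on_coupling_change(attached: bool, subsystems: list) -> None:
    # When the body detaches from the band, drop the listed subsystems to a
    # reduced mode to save battery; restore full functionality on reattach.
    mode = PowerMode.FULL if attached else PowerMode.REDUCED
    for subsystem in subsystems:
        subsystem.set_mode(mode)

subsystems = [Subsystem("heart_rate_sensor"), Subsystem("LTE_radio"), Subsystem("haptics")]
on_coupling_change(attached=False, subsystems=subsystems)  # detach -> reduced
on_coupling_change(attached=True, subsystems=subsystems)   # reattach -> full
```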
- Watch body 1004 and watch band 1012 can return to full functionality when watch body 1004 and watch band 1012 are mechanically and/or electrically connected to one another.
- for example, the level of functionality of the sensors (e.g., heart rate sensor 1058, EMG sensor 1046, SpO2 sensor 1054, altimeter 1048, etc.), processors (e.g., CPU 1026, microcontroller unit 1052, etc.), and communications elements can likewise return to full functionality when watch body 1004 and watch band 1012 are reconnected.
- wrist-wearable device system 1000 can detect when watch body 1004 and watch band 1012 are coupled to one another (e.g., mechanically connected and/or electrically connected) or decoupled from one another.
- watch body 1004 can modify a level of functionality (e.g., activate and/or deactivate certain functions) based on whether watch body 1004 is coupled to watch band 1012.
- CPU 1026 can execute instructions that detect when watch body 1004 and watch band 1012 are coupled to one another and activate front-facing image sensor 925A.
- CPU 1026 can activate front-facing image sensor 925A based on receiving user input (e.g., a user touch input from touchscreen display 1013, a user voice command from microphone 127, a user gesture recognition input from EMG sensor 1046, etc.).
- CPU 1026 can modify a level of functionality (e.g., activate and/or deactivate additional functions). For example, CPU 1026 can detect when watch body 1004 and watch band 1012 are decoupled from one another and activate rear-facing image sensor 925B.
- CPU 1026 can activate rear-facing image sensor 925B automatically (e.g., without user input) and/or based on receiving user input (e.g., a touch input, a voice input, an intention detection, etc.). Automatically activating rear-facing image sensor 925B can allow a user to take wide-angle images without having to provide user input to activate rear-facing image sensor 925B.
- rear-facing image sensor 925B can be activated based on an image capture criterion (e.g., an image quality, an image resolution, etc.).
- rear-facing image sensor 925B can receive an image (e.g., a test image).
- CPU 1026 and/or rear-facing image sensor 925B can analyze the received test image data and determine whether the test image data satisfies the image capture criterion (e.g., the image quality exceeds a threshold, the image resolution exceeds a threshold, etc.).
- Rear-facing image sensor 925B can be activated when the test image data satisfies the image capture criterion.
- rear-facing image sensor 925B can be deactivated when the test image data fails to satisfy the image capture criterion.
- CPU 1026 can detect when watch body 1004 is coupled to watch band 1012 and deactivate rear-facing image sensor 925B.
- CPU 1026 can deactivate rear-facing image sensor 925B automatically (e.g., without user input) and/or based on receiving user input (e.g., a touch input, a voice input, an intention detection, etc.). Deactivating rear-facing image sensor 925B can automatically (e.g., without user input) reduce the power consumption of watch body 1004 and increase the battery charge time in watch body 1004.
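The test-image criterion described above could be structured as in the following sketch, where the rear-facing sensor is kept active only when the body is decoupled and a test image clears resolution and quality thresholds. The threshold values, field names, and the quality score itself are assumptions for illustration, not values from the disclosure.

```python
def satisfies_capture_criterion(test_image: dict,
                                min_resolution: int = 1920 * 1080,
                                min_quality: float = 0.6) -> bool:
    # Hypothetical criterion: both resolution and a quality score must clear
    # thresholds before the rear-facing sensor stays active.
    return (test_image["resolution"] >= min_resolution
            and test_image["quality_score"] >= min_quality)

def update_rear_camera(decoupled: bool, test_image: dict) -> str:
    if not decoupled:
        return "rear camera deactivated (body coupled to band)"
    if satisfies_capture_criterion(test_image):
        return "rear camera activated"
    return "rear camera deactivated (test image below criterion)"

print(update_rear_camera(decoupled=True,
                         test_image={"resolution": 3840 * 2160, "quality_score": 0.8}))
print(update_rear_camera(decoupled=False,
                         test_image={"resolution": 3840 * 2160, "quality_score": 0.8}))
```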
- wrist-wearable device system 1000 can include a coupling sensor 1007 that senses whether watch body 1004 is coupled to or decoupled from watch band 1012.
- Coupling sensor 1007 (e.g., a proximity sensor) can be included in any of watch body 1004, watch band 1012, or watch band coupling mechanism 960 of Figures 7A and 7B.
- Coupling sensor 1007 can include, without limitation, an inductive proximity sensor, a limit switch, an optical proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an ultrasonic proximity sensor, or a combination thereof.
- CPU 1026 can detect when watch body 1004 is coupled to watch band 1012 or decoupled from watch band 1012 by reading the status of coupling sensor 1007.
- Figure 9 shows an example technique for using data from an inertial measurement unit to detect digit-to-digit gestures.
- a vibration sensor (e.g., IMU sensor 1042 shown in Figure 8) of a wrist-wearable device (e.g., wrist-wearable device 114 shown in Figure 1A) may detect vibration that arrives via a body (e.g., a skin surface and/or bone structure) of the user.
- the wrist of user 110 may experience vibration that arrived via the skin and/or bone of user 110.
- IMU sensor 1042 incorporated into wrist- wearable device 114 may measure, detect, and/or sense the acceleration caused by a vibration.
- the vibration sensor may generate an electrical response that corresponds to the vibration that arrived at the wearable via the body of the user.
- the vibration sensor may measure vibration across various degrees of freedom.
- the vibration sensor may generate an electrical response that is commensurate with and/or representative of the vibration.
- a processing device may determine that the user has made a specific gesture with at least one body part based at least in part on the electrical response generated by the vibration sensor. For example, a processing device incorporated into a head-mounted display of the artificial-reality system and/or a processing device incorporated into the wrist-wearable device 114 may determine that user 110 has made a specific gesture with their hand or fingers, based at least in part on the electrical response generated by the vibration sensor.
- a processing device may generate an input command for the artificial-reality system based at least in part on the specific gesture made by the user.
- a processing device and/or the processing device incorporated into the wrist-wearable device 114 may modify one or more graphics included in image frames presented to the user via the head-mounted display. Such modifications to those graphics may account for the specific gesture made by the user’s hand and/or fingers.
- the wrist-wearable device 114 may be able to detect certain gestures made by a user 110 of the artificial-reality system.
- the wrist-wearable device 114 may constitute a wristband that includes a vibration sensor.
- the vibration sensor may detect and/or sense a vibration resulting from that gesture.
- the vibration sensor may then generate an electrical response consistent with that vibration.
- the wrist-wearable device 114 may be able to identify which particular gesture was made by the user 110 by analyzing the electrical response generated by the vibration sensor.
- the wrist-wearable device 114 may cause the artificial-reality system to modify the user’s artificial reality experience to account for that gesture.
- Figure 9 is an example of how IMU sensor data can be used to detect digit-to-digit gestures.
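A very simple way to flag a digit-to-digit tap from IMU data, in the spirit of Figure 9, is to look for a short burst of vibration energy, as in the sketch below. The windowed-RMS approach, window size, and threshold are illustrative assumptions rather than the detection technique actually used by the device.

```python
import math

def rms(window):
    """Root-mean-square energy of a window of accelerometer magnitudes."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_tap(accel_samples, window=4, threshold=0.8):
    """Slide a window over accelerometer magnitudes and flag a tap when the
    short-term RMS energy rises above a threshold (both are tuning guesses)."""
    for i in range(len(accel_samples) - window + 1):
        if rms(accel_samples[i:i + window]) > threshold:
            return True
    return False

quiet = [0.02, 0.03, 0.01, 0.02, 0.04, 0.02]
tap = [0.02, 0.03, 1.40, 1.10, 0.60, 0.05]  # burst from a thumb-to-finger tap
print(detect_tap(quiet))  # False
print(detect_tap(tap))    # True
```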
- although the terms first, second, etc. are, in some instances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first widget could be termed a second widget and, similarly, a second widget could be termed a first widget, without departing from the scope of the various described implementations. The first widget and the second widget are both widgets, but they are not the same widget unless explicitly stated as such.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280059968.4A CN118076940A (en) | 2021-09-03 | 2022-09-06 | System for differently interpreting finger gestures of a user's finger based on roll values of a wrist-wearable device worn by the user and method of using the same |
EP22782616.1A EP4396657A1 (en) | 2021-09-03 | 2022-09-06 | Systems for interpreting a digit-to-digit gesture by a user differently based on roll values of a wrist- wearable device worn by the user, and methods of use thereof |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163240810P | 2021-09-03 | 2021-09-03 | |
US63/240,810 | 2021-09-03 | | |
US17/900,809 | 2022-08-31 | | |
US17/900,809 US12093464B2 (en) | 2021-09-03 | 2022-08-31 | Systems for interpreting a digit-to-digit gesture by a user differently based on roll values of a wrist-wearable device worn by the user, and methods of use thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023034631A1 true WO2023034631A1 (en) | 2023-03-09 |
Family
ID=83505740
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/042612 WO2023034631A1 (en) | 2021-09-03 | 2022-09-06 | Systems for interpreting a digit-to-digit gesture by a user differently based on roll values of a wrist- wearable device worn by the user, and methods of use thereof |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023034631A1 (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090054067A1 (en) * | 2007-08-23 | 2009-02-26 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method for gesture-based command and control of targets in wireless network |
US20120127070A1 (en) * | 2010-11-22 | 2012-05-24 | Electronics And Telecommunications Research Institute | Control signal input device and method using posture recognition |
US20130317648A1 (en) * | 2012-05-25 | 2013-11-28 | California Institute Of Technology | Biosleeve human-machine interface |
US20140098018A1 (en) * | 2012-10-04 | 2014-04-10 | Microsoft Corporation | Wearable sensor for tracking articulated body-parts |
US20150084860A1 (en) * | 2013-09-23 | 2015-03-26 | Thalmic Labs Inc. | Systems, articles, and methods for gesture identification in wearable electromyography devices |
US20170123487A1 (en) * | 2015-10-30 | 2017-05-04 | Ostendo Technologies, Inc. | System and methods for on-body gestural interfaces and projection displays |
US20170199586A1 (en) * | 2016-01-08 | 2017-07-13 | 16Lab Inc. | Gesture control method for interacting with a mobile or wearable device utilizing novel approach to formatting and interpreting orientation data |
US20180164892A1 (en) * | 2015-06-26 | 2018-06-14 | Intel Corporation | Technologies for micro-motion-based input gesture control of wearable computing devices |
US20200073483A1 (en) * | 2018-08-31 | 2020-03-05 | Ctrl-Labs Corporation | Camera-guided interpretation of neuromuscular signals |
EP3660633A1 (en) * | 2019-07-31 | 2020-06-03 | Taurum Technologies SL | Hand-worn data-input device |
US20210064132A1 (en) * | 2019-09-04 | 2021-03-04 | Facebook Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US20210089131A1 (en) * | 2019-09-23 | 2021-03-25 | Apple Inc. | Finger-Mounted Input Devices |
US20210232226A1 (en) * | 2020-01-28 | 2021-07-29 | Pison Technology, Inc. | Determining a geographical location based on human gestures |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22782616; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 202280059968.4; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 2022782616; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2022782616; Country of ref document: EP; Effective date: 20240403 |