CN112114688A - Electronic device for rotating a graphical object presented on a display and corresponding method
- Publication number
- CN112114688A (application CN201910536301.9A)
- Authority
- CN
- China
- Prior art keywords
- touch
- touch input
- sensitive display
- processors
- graphical object
- Prior art date
- Legal status
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/133—Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
- G02F1/1333—Constructional arrangements; Manufacturing methods
- G02F1/13338—Input devices, e.g. touch panels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Nonlinear Science (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- Chemical & Material Sciences (AREA)
- Crystallography & Structural Chemistry (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to an electronic device and a corresponding method for rotating a graphical object presented on a display. An electronic device includes a housing defining a first major surface separated from a second major surface by one or more minor surfaces. A touch-sensitive display is located on the first major surface. A touch-sensitive surface is located on the second major surface. One or more processors cause the touch-sensitive display to present a graphical object that visually represents a rotatable item. When the touch-sensitive display detects a first touch input that occurs in a first direction on the touch-sensitive display and the touch-sensitive surface detects a second touch input that occurs on the touch-sensitive surface in a second direction opposite the first direction, the one or more processors may rotate the rotatable item in response to the first touch input and the second touch input. Translation of the rotatable item may also occur.
Description
Background
Technical Field
The present disclosure relates generally to electronic devices and, more particularly, to electronic devices having front and rear displays.
Background
The use of portable electronic devices has become very widespread. Most people carry a smartphone, tablet computer, or laptop computer every day to communicate with others, keep up with messages, consume entertainment, and manage their lives.
As the technology incorporated into these portable electronic devices becomes more advanced, so do their feature sets. A modern smartphone has more computing power than a desktop computer of only a few years ago. Furthermore, while early portable electronic devices included physical keyboards, most modern portable electronic devices include touch-sensitive displays. It would be advantageous to have an improved electronic device that allows for more intuitive use of these new features.
Drawings
Fig. 1 illustrates an exemplary electronic device in accordance with one or more embodiments of the present disclosure.
Fig. 2 illustrates one exemplary method in accordance with one or more embodiments of the present disclosure.
Fig. 3 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 4 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 5 illustrates another exemplary method according to one or more embodiments of the present disclosure.
Fig. 6 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 7 illustrates another exemplary method according to one or more embodiments of the present disclosure.
Fig. 8 illustrates yet another exemplary method in accordance with one or more embodiments of the present disclosure.
Fig. 9 illustrates one or more functions suitably performed in response to a graphical object visually appearing to rotate in accordance with one or more embodiments of the present disclosure.
Fig. 10 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 11 illustrates one or more method steps in accordance with one or more embodiments of the present disclosure.
Fig. 12 illustrates one or more embodiments of the present disclosure.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of embodiments of the present disclosure.
Detailed description of the drawings
Before describing in detail embodiments that are in accordance with the present disclosure, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to presenting graphical objects representing a rotatable article on a display, detecting touch input on both a front display and a rear display, and visually causing the rotatable article to appear to rotate in response to the touch input. Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process.
Alternative embodiments are included, and it will be apparent that the functions may be performed in an order different from that illustrated or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. Accordingly, the apparatus components and method steps have been represented, where appropriate, by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
Embodiments of the present disclosure do not describe the implementation of any general business method intended to process business information, nor apply known business processes to the particular technical environment of the internet. Furthermore, embodiments of the present disclosure do not create or alter associations that use general computer functionality and conventions of conventional network operation. Rather, embodiments of the present disclosure employ approaches that, when applied to electronic devices and/or user interface technologies, improve the functionality of the electronic devices themselves and improve the overall user experience to overcome problems that arise particularly in the art relating to user interaction with electronic devices.
It will be appreciated that embodiments of the disclosure described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions described herein of detecting a first touch input on a touch-sensitive display, detecting a second touch input on a touch-sensitive surface or second display, and rotating a graphical object configured as a rotatable item in response to the first touch input and the second touch input. The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices.
As such, these functions may be interpreted as steps of a method that, using one or more processors operable with the touch-sensitive display and the touch-sensitive surface, translates the rotatable item in a first direction from a first position to a second position displaced relative to the first position, and visually causes the rotatable item to appear to rotate into and/or out of the touch-sensitive display. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more Application Specific Integrated Circuits (ASICs), in which each function, or some combinations of certain of the functions, are implemented as custom logic. Of course, a combination of these two approaches may be used. Thus, methods and means for these functions have been described herein. Moreover, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs and ASICs with minimal experimentation.
Embodiments of the present disclosure are described in detail below. Referring to the drawings, like numbers indicate like parts throughout the views. As used in the specification herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of "a", "an", and "the" includes plural references, and the meaning of "in" includes "in" and "on". Relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
As used herein, directional terms such as "upward," "downward," "vertical," "horizontal," and the like are intended to refer to the environment of the described electronic device. For example, a graphical object representing a rotatable item may be presented on a touch-sensitive display or surface, where the touch-sensitive display is shown in an elevation view relative to defined X, Y, and Z axes. In these examples, the X-Y plane defines a horizontal plane, where the direction out of the page is defined as the negative Y direction and the direction into the page is defined as the positive Y direction. Upward is defined as the positive Z direction and downward as the negative Z direction. Thus, as described below, when the rotatable article visually rotates "into the display" or "into the device" or "into the touch-sensitive surface," this means that the visual rotation of the rotatable article occurs about an axis that lies in the X-Z plane (when presented in an elevation view), or about an axis in the X-Y plane but tilted with respect to the Y axis (when presented in a perspective view), such that a portion of the rotatable article on one side of the axis appears to move in the positive Y direction, while the portion on the other side of the axis appears to move in the negative Y direction, and so on.
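To make this convention concrete, the short sketch below (an illustration added here, not part of the original disclosure) applies a standard rotation about the X axis, which is one example of an axis lying in the X-Z plane, and shows that points on opposite sides of the axis move in opposite Y directions, i.e., one side appears to rotate out of the display while the other rotates into it.

```python
import math

def rotate_about_x(point, theta):
    """Rotate a 3D point (x, y, z) by angle theta (radians) about the X axis.

    Under the convention above, negative Y points out of the page and
    positive Y points into the page.
    """
    x, y, z = point
    y_new = y * math.cos(theta) - z * math.sin(theta)
    z_new = y * math.sin(theta) + z * math.cos(theta)
    return (x, y_new, z_new)

# A point above the axis (positive Z) moves toward negative Y (out of the display),
# while a point below the axis (negative Z) moves toward positive Y (into the display).
print(rotate_about_x((0.0, 0.0, 1.0), math.radians(10)))   # Y component becomes negative
print(rotate_about_x((0.0, 0.0, -1.0), math.radians(10)))  # Y component becomes positive
```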
As used herein, components may be "operably coupled" when information may be transmitted between the components, even though one or more intermediary or intervening components may be present between, or along, the connection paths. The terms "substantially," "essentially," "approximately," "about," or any other form thereof are defined as being close to, as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled," as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. Further, reference numerals shown in parentheses herein denote components shown in figures other than the figure under discussion. For example, referring to a device (10) while discussing figure A would refer to an element 10 shown in a figure other than figure A.
Embodiments of the present disclosure contemplate the ability to control objects presented on a display of an electronic device using multi-touch actions and gestures to increase the functionality of the device. Furthermore, embodiments of the present disclosure contemplate a strong need for the ability to use touch input delivered at multiple locations to control graphical objects, user actuation targets, and other items presented on a display, as it serves to improve the overall user experience to overcome problems that arise in the art particularly in connection with user interaction with electronic devices.
Accordingly, embodiments of the present disclosure advantageously provide novel and useful multi-touch actions applicable to electronic devices having a touch-sensitive display located on a first major surface of the electronic device and a second touch-sensitive surface located on a second major surface of the electronic device. In one or more embodiments, the touch-sensitive surface is configured as a second, rear-facing touch-sensitive display. In other embodiments, the touch-sensitive surface may be configured as a touch sensor without graphical indicia rendering capabilities.
In one or more embodiments, the graphical object is presented on a touch-sensitive display. In one or more embodiments, the graphical object represents a mechanically rotatable item. In one or more embodiments, the axis of rotation about which the rotatable item rotates is aligned with an axis that lies, for example, in the X-Z plane, i.e., in the plane defined by the display. In other embodiments, such as when the rotatable item is presented in an isometric view, the axis of rotation is aligned in the X-Y plane but oriented obliquely with respect to the Y axis. Other orientations of the rotation axis are described below. Still others will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, a touch sensitive display detects a first touch input. In one or more embodiments, a touch-sensitive display detects a first touch input that occurs in a first direction. In one or more embodiments, the first touch input at least partially traverses through the graphical object. In one or more embodiments, the touch-sensitive surface detects a second touch input. In one or more embodiments, the touch-sensitive surface detects a second touch input that occurs in a second direction. In one or more embodiments, the second direction is opposite the first direction.
In one or more embodiments, one or more processors operable with the touch-sensitive display and the touch-sensitive surface visually cause the graphical object to appear to rotate in response to the first touch input and the second touch input. Where the graphical object is configured as, for example, a crown of a watch, in one or more embodiments, the one or more processors may rotate the crown into or out of the display in response to the first touch input and the second touch input.
Thus, in accordance with one or more embodiments of the present disclosure, a touch-sensitive display may present graphical objects configured as "dial widgets" similar to a scroll wheel, a crown, the big wheel from the "Showcase Showdown™" on the game show "The Price is Right™," or other rotatable item. In one or more embodiments, a user may touch the front of the electronic device and the back of the electronic device to deliver a first touch input and a second touch input that "grab" the dial widget and rotate it into or out of the display in response to the first touch input and the second touch input. Returning to the illustration of the watch crown, the methods and apparatus described below utilize dual-surface multi-touch user input to allow a user to visually simulate rotation of the watch crown into or out of the front touch-sensitive display with front/back surface touch input.
The methods, systems, and devices described herein advantageously provide a more intuitive rotational user input experience than systems that allow an object to rotate about an axis oriented parallel to the Y-axis (orthogonal to the display). In contrast to these "Y-axis only" rotation systems, embodiments of the present disclosure advantageously allow a person to perform rotation of graphical objects presented on a touch-sensitive display using familiar hand motions, where the thumb is located at the front of the device and the fingers are located at the back of the device.
In addition to rotating the graphical object into or out of the display in response to the first touch input and the second touch input, embodiments of the present disclosure may also facilitate a panning operation. In one or more embodiments, the touch-sensitive display and the touch-sensitive surface may present a graphical object representing the rotatable item at a first location on the touch-sensitive display. Thereafter, in one or more embodiments, the touch-sensitive display may detect a first touch input at least partially through the graphical object. In one or more embodiments, the first touch input occurs in a first direction.
Also, in one or more embodiments, the touch-sensitive surface may detect a second touch input. In one or more embodiments, the second touch input also occurs in the first direction. In one or more embodiments, upon occurrence of the first and second touch inputs, the one or more processors operable with the touch-sensitive display and the touch-sensitive surface may translate the rotatable item from a first position to a second position displaced from the first position in the first direction. Continuing with the example of a watch crown, this detection and translation occurrence may allow a user to virtually grasp and "pull" or "push" the crown stem from the graphical object depicting the watch.
Thereafter, the touch-sensitive display may detect a third touch input that at least partially traverses the graphical object. In one or more embodiments, the third touch input occurs in a second direction different from the first direction of the first touch input and the second touch input. In one or more embodiments, the touch-sensitive surface may then detect a fourth touch input that occurs in a third direction that is opposite the second direction. In one or more embodiments where this occurs, the one or more processors may cause the graphical object to visually appear to rotate in response to the third touch input and the fourth touch input. Thus, after virtually "pulling" the crown out of the virtual representation of the watch, the user can rotate the crown into or out of the display and then push the crown back to return it to its idle position. Other uses for embodiments of the present disclosure will be apparent to those of ordinary skill in the art having the benefit of the present disclosure.
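The interaction just described can be summarized in a brief sketch. The following is a hypothetical illustration only (the function name, the (dx, dy) direction-vector representation, and the 0.5 agreement threshold are assumptions, not taken from the disclosure): front and rear swipes that agree in direction are treated as a translation of the rotatable item, while swipes in opposite directions are treated as a rotation into or out of the display.

```python
def classify_dual_surface_gesture(front_vec, rear_vec, threshold=0.5):
    """Classify a pair of front/rear swipe direction vectors (dx, dy).

    Returns "translate" when the swipes point the same way, "rotate" when they
    point in (at least partially) opposite directions, and "none" otherwise.
    """
    fx, fy = front_vec
    rx, ry = rear_vec
    f_len = (fx ** 2 + fy ** 2) ** 0.5
    r_len = (rx ** 2 + ry ** 2) ** 0.5
    if f_len == 0 or r_len == 0:
        return "none"
    # Normalized dot product: +1 means identical directions, -1 means opposite.
    dot = (fx * rx + fy * ry) / (f_len * r_len)
    if dot >= threshold:
        return "translate"   # same direction: pull or push the crown stem
    if dot <= -threshold:
        return "rotate"      # opposite directions: turn the crown into or out of the display
    return "none"

print(classify_dual_surface_gesture((0, 1), (0, 1)))    # translate
print(classify_dual_surface_gesture((0, 1), (0, -1)))   # rotate
```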
Turning next to fig. 1, illustrated therein is an exemplary electronic device 100 configured in accordance with one or more embodiments of the present disclosure. It should be noted that the electronic device 100 may be one of various types of devices. In one embodiment, the electronic device 100 is a portable electronic device, one example of which is a smartphone, which will be used in the figures for illustration purposes. However, it will be apparent to those of ordinary skill in the art having the benefit of this disclosure that the electronic device 100 may also be other types of devices, including a palm top computer, a tablet computer, a gaming device, a media player, a wearable device, or other portable wireless communication device. Other devices will also be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Also shown in fig. 1 is an exemplary schematic block diagram 102 of the exemplary electronic device 100 of fig. 1. It should be understood that fig. 1 is provided for exemplary purposes only and to illustrate components of one electronic device 100 in accordance with embodiments of the present disclosure, and is not intended to be a complete schematic block diagram 102 of the various components that may be included in electronic device 100. Accordingly, other electronic devices in accordance with embodiments of the present disclosure may include various other components not shown in fig. 1, or may include a combination of two or more components, or divide a particular component into two or more separate components, and still be within the scope of the present disclosure.
In one or more embodiments, the schematic block diagram 102 is configured as a printed circuit board assembly disposed within a housing 103 of the electronic device 100. The various components may be electrically coupled together by conductors or buses arranged along one or more printed circuit boards.
The exemplary schematic block diagram 102 of fig. 1 includes many different components. Embodiments of the present disclosure contemplate that the number and arrangement of these components may vary depending on the particular application. Accordingly, an electronic device configured according to embodiments of the present disclosure may include some components not shown in fig. 1, and may not require other components that are shown, which may accordingly be omitted.
In one or more embodiments, the housing 103 of the electronic device 100 defines a first major surface 104 and a second major surface 105. In one or more embodiments, the first major surface 104 and the second major surface 105 are separated by one or more minor surfaces 106, 107. In one or more embodiments, the user interface of the electronic device 100 includes a touch-sensitive display 101 located on the first major surface 104 of the electronic device 100. In one or more embodiments, the user interface further includes a touch-sensitive surface 108 located on the second major surface 105 of the electronic device 100.
In one or more embodiments, the touch-sensitive surface 108 comprises a second touch-sensitive display. So configured, information, graphical objects, user actuation targets, and other graphical indicia may be presented on the front side of the electronic device 100 using the touch-sensitive display 101, or on the back side of the electronic device using the second touch-sensitive display. However, the touch-sensitive surface 108 may take other forms, as would be apparent to one of ordinary skill in the art having the benefit of this disclosure. In other embodiments, for example, the touch-sensitive surface 108 may be a surface that does not present graphical indicia but simply detects touch input received from a user's finger, stylus, or other object.
In one or more embodiments, with respect to being touch sensitive, each of touch sensitive display 101 and touch sensitive surface 108 includes a respective touch sensor. As shown in FIG. 1, in one or more embodiments, the touch sensitive display 101 includes a first touch sensor 109 and the touch sensitive surface 108 includes a second touch sensor 110.
In one or more embodiments, each of the first touch sensor 109 and the second touch sensor 110 can include any one of a capacitive touch sensor, an infrared touch sensor, a resistive touch sensor, another touch sensitive technology, or a combination thereof. A capacitive touch sensitive device includes a plurality of capacitive sensors, such as electrodes, disposed along a substrate. So configured, each capacitive sensor may be configured, in conjunction with associated control circuitry, e.g., one or more processors 112 operable with the touch-sensitive display 101 and the touch-sensitive surface 108, to detect an object that is proximate to, or touching, the surface of the touch-sensitive display 101 and/or the surface of the touch-sensitive surface 108 by establishing electric field lines between the pair of capacitive sensors and then detecting perturbations of the electric field lines.
The electric field lines may be established according to a periodic waveform such as a square wave, a sine wave, a triangular wave, or other periodic waveform emitted by one sensor and detected by another sensor. For example, a capacitive sensor may be formed by arranging indium tin oxide patterned as electrodes on a substrate. Indium tin oxide can be used in such systems because it is transparent and conductive. Furthermore, it can be deposited in thin layers by a printing process. The capacitive sensor may also be deposited on the substrate by electron beam evaporation, physical vapor deposition, or other various sputter deposition techniques.
In one or more embodiments, a user may communicate user input to touch-sensitive display 101 and/or touch-sensitive surface 108 by communicating touch input from a finger, stylus, or other object disposed proximate to touch-sensitive display 101 and/or touch-sensitive surface 108. In one embodiment, the touch sensitive display 101, and optionally the touch sensitive surface 108 when configured as a second touch sensitive display, is configured as an Active Matrix Organic Light Emitting Diode (AMOLED) display. However, it should be noted that other types of displays, including liquid crystal displays, are suitable for use with the user interface and will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In addition to the touch-sensitive display 101 and/or the touch-sensitive surface 108, other features may be located on either the first major surface 104 or the second major surface 105. User interface components, such as buttons or other control devices, for example, may also be disposed on the first major surface 104 or the second major surface 105 to facilitate additional control of the electronic device 100. Other features may be added and may be located at the front of the housing 103, at the sides of the housing 103, or at the rear of the housing 103. For example, in one or more embodiments, the image capture device 117 or speaker 118 can be located on the first major surface 104 or the second major surface 105.
In one embodiment, the electronic device includes one or more processors 112. In one embodiment, the one or more processors 112 may include an application processor, and optionally one or more auxiliary processors. One or both of the application processor or the auxiliary processor may include one or more processors. One or both of the application processor or the auxiliary processor may be a microprocessor, a set of processing components, one or more ASICs, programmable logic, or other types of processing devices.
The application processor and the auxiliary processor may operate with various components in the schematic block diagram 102. Each of the application processor and the auxiliary processor may be configured to process and execute executable software code to implement various functions of the electronic device with which the schematic block diagram 102 operates. For example, in one embodiment, the one or more processors 112 include one or more circuits operable to present display information, such as images, text, and video, on the touch-sensitive display 101, and optionally on the touch-sensitive surface 108 configured as a touch-sensitive display. A storage device, such as memory 111, may optionally store executable software code used by the one or more processors 112 during operation.
As shown in FIG. 1, one or more processors 112 present content 119 on touch-sensitive display 101. The illustrated content 119 includes one or more graphical objects, such as graphical object 120, which is an image of a dog. In one or more embodiments, such content 119 is retrieved from one or more remote servers using the communication circuitry 113. The content 119 may also be retrieved locally. Content 119 may include one or more user actuation targets 121 that a user may touch to perform operations such as launching an application, opening a web page, navigating to a different screen, and the like.
In the exemplary embodiment, schematic block diagram 102 also includes communication circuitry 113, which may be configured for wired or wireless communication with one or more other devices or networks. The networks may include a wide area network, a local area network, and/or a personal area network. Examples of wide area networks include GSM, CDMA, W-CDMA, CDMA-2000, iDEN, TDMA, 2.5 generation 3GPP GSM networks, 3rd generation 3GPP WCDMA networks, 3GPP Long Term Evolution (LTE) networks, 3GPP2 CDMA communication networks, UMTS networks, E-UTRA networks, GPRS networks, iDEN networks, and others. The communication circuitry 113 may also communicate using wireless techniques such as, but not limited to, point-to-point or ad hoc network communications such as home radio frequency (HomeRF), Bluetooth, and IEEE 802.11 (a, b, g, or n), and other forms of wireless communication such as infrared technology. The communication circuitry 113 may include a wireless communication circuit, one of a receiver, a transmitter, or a transceiver, and one or more antennas.
In one embodiment, one or more processors 112 may be responsible for performing the primary functions of the electronic device with which the schematic block diagram 102 may operate. For example, in one embodiment, the one or more processors 112 include one or more circuits operable with the touch-sensitive display 101, and optionally the touch-sensitive surface 108 configured as a rear-facing display, to present display information to a user. Executable software code used by the one or more processors 112 may be configured as one or more modules 114 operable with the one or more processors 112. Such modules 114 may store instructions, control algorithms, and the like.
The audio input/processor may operate with one or more predefined authentication references stored in the memory 111. The predefined authentication reference may include a representation of a base speech model, a representation of a trained speech model, or other representation of a predefined audio sequence used by the audio input/processor to receive and recognize speech commands received with audio input captured by the audio capture device. In one embodiment, the audio input/processor may include a speech recognition engine. Regardless of the specific implementation used in various embodiments, the audio input/processor may access various speech models stored with predefined authentication references to recognize the voice command. The audio input/processor may also include one or more audio input devices, such as one or more microphones.
In one or more embodiments, the other components 115 can include various sensors operable with the one or more processors 112. These sensors may include a geo-locator that functions as a location detector, an orientation detector that determines an orientation and/or direction of movement of the electronic device 100 in three-dimensional space, an imager, a face analyzer, an environment analyzer, and a gaze detector.
The context engine 116 may then operate with the other components 115 to detect, infer, capture, and otherwise determine people and actions occurring in the environment surrounding the electronic device 100. For example, where included, one embodiment of the context engine 116 uses an adjustable algorithm that employs contextual evaluation of information, data, and events to determine the context and framework of evaluation. These evaluations may be learned by repeated data analysis. Alternatively, the user may use the user interface to enter various parameters, constructs, rules, and/or paradigms that command or otherwise direct the context engine 116 to detect multimodal social cues, emotional states, emotions, and other contextual information. In one or more embodiments, the context engine 116 may include an artificial neural network or other similar technology.
Now that the various hardware components have been described, attention will be directed to methods of using an electronic device in accordance with one or more embodiments of the present disclosure. Turning now to FIG. 2, illustrated therein is an exemplary method 200 for using the electronic device (100) of FIG. 1 to quickly, easily, and simply visually rotate a graphical object representing a rotatable item, having an axis of rotation that is one of: in an X-Z plane, parallel to the X-Z plane, or in an X-Y plane and tilted with respect to the Y axis, in response to touch input applied to a touch-sensitive display (101) and a touch-sensitive surface (108), in accordance with one or more embodiments of the present disclosure.
Beginning at step 201, method 200 includes presenting a graphical object representing a rotatable item with a touch-sensitive display located on a first major surface of an electronic device. In one or more embodiments, the electronic device further includes another touch-sensitive display located on the rear major surface of the electronic device. So configured, i.e., where the touch-sensitive surface on the back of the electronic device is configured as another touch-sensitive display, step 201 includes causing the other touch-sensitive display to present another graphical object visually representing the rotatable item on the other touch-sensitive display.
In one or more embodiments, step 201 includes presenting the rotatable article as configured to rotate about an axis of rotation passing through the right and left sides of the rotatable article. In one or more embodiments, the rotatable item is presented on the touch-sensitive display in one of a front view, a top view, a bottom view, a back view, or an isometric view. However, in one or more embodiments, the rotatable item is never presented in a right or left view.
By presenting the rotatable item in this manner at step 201, the axis of rotation is presented in one of lying in an X-Z plane, parallel to the X-Z plane, or lying in an X-Y plane and tilted with respect to the Y axis, thereby allowing the rotatable item to visually appear to rotate into and/or out of the display in response to user input, rather than parallel to the plane defined by the display. This will be explained in more detail below with reference to fig. 3 to 4.
The rotatable article may take any of a variety of forms. As mentioned above, the rotatable article may be a generic rotatable gadget, game show wheel, casino game, lever, dial, crown, knob, or other object. As will be shown in more detail below with reference to figs. 3, 4, 6, and 10, in one or more embodiments, the rotatable item comprises a representation of a graduated cylindrical object on the touch-sensitive display. As will be described in greater detail below with reference to fig. 11, in one or more embodiments, where the presentation of the graphical object occurring at step 201 includes presenting a representation of a watch on the touch-sensitive display and the touch-sensitive surface, the rotatable article may include a crown of the watch.
Other rotatable articles will be apparent to those of ordinary skill in the art having the benefit of this disclosure. For example, in another fanciful embodiment, the rotatable item may be an item that must be rotated to engage with another item, such as a key that rotates when inserted into a lock, a light bulb that rotates into a socket, or a cap that rotates when coupled or decoupled from a bottle.
At step 202, method 200 detects, with a touch-sensitive display, a first touch input occurring at the touch-sensitive display. In one or more embodiments, step 202 includes determining a direction in which the first touch input occurred. In one or more embodiments, step 202 includes detecting a first touch input at least partially interacting with or passing through a graphical object representing a rotatable item. In one or more embodiments, step 202 optionally includes determining a first distance that the first touch input occurred along the direction in which the first touch input occurred.
At step 203, method 200 includes detecting, with a touch-sensitive surface located on a second major surface of the electronic device, a second touch input occurring at the touch-sensitive surface. In the exemplary embodiment of fig. 2, the touch sensitive surface that detects the touch input is configured as another touch sensitive display. As will be described in more detail below with reference to fig. 5-6, in other embodiments, the touch-sensitive surface is a surface without graphical indicia display capability. Where the touch-sensitive surface that detects touch input at step 203 is configured as a touch-sensitive display, as is the case in the exemplary embodiment of fig. 2, step 201 may include not only the presentation of graphical objects representing rotatable items on the touch-sensitive display, but also the presentation of graphical objects on the touch-sensitive surface.
In one or more embodiments, step 203 includes determining a direction in which the second touch input occurred. In one or more embodiments, step 203 includes detecting a second touch input at least partially interacting with or passing through the graphical object representing the rotatable item. In one or more embodiments, step 203 optionally includes determining a second distance that the second touch input occurred along the direction in which the second touch input occurred.
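As a rough sketch of how steps 202 and 203 might derive a direction and a distance from raw touch samples, consider the following; the list-of-coordinates input model and the function name are assumptions used purely for illustration.

```python
def gesture_direction_and_distance(samples):
    """Given a list of (x, y) touch samples from one gesture, return a unit
    direction vector and the distance between the first and last sample."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance == 0:
        return (0.0, 0.0), 0.0
    return (dx / distance, dy / distance), distance

# A roughly upward swipe captured as three samples.
direction, distance = gesture_direction_and_distance([(10, 200), (12, 150), (15, 90)])
print(direction, distance)
```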
In one or more embodiments, decision 204 determines whether a first direction of a first touch input occurring on the touch-sensitive display and a second direction of a second touch input occurring on the touch-sensitive surface comprise direction vectors oriented in opposite directions. In other words, in one or more embodiments, decision 204 includes determining whether the first direction of the first touch input occurring on the touch-sensitive display and the second direction of the second touch input occurring on the touch-sensitive surface are at least partially opposite or completely opposite. In one or more embodiments, where they are at least partially or fully opposite, the method moves to optional decision 205. Where they are not, a control operation is performed at step 206 according to a common direction in which the first touch input on the touch-sensitive display and the second touch input on the touch-sensitive surface occur.
At optional decision 205, method 200 optionally includes determining, with the one or more processors, whether at least a portion of the first touch input detected at step 202 and at least a portion of the second touch input detected at step 203 occur simultaneously. In the event that this optional decision 205 is included and a portion of the first touch input detected at step 202 and a portion of the second touch input detected at step 203 do not occur simultaneously, a control operation is performed at step 206 in accordance with the sequentially occurring first and second touch inputs. However, in the event that this optional decision 205 is included and a portion of the first touch input detected at step 202 and a portion of the second touch input detected at step 203 do occur simultaneously, the method moves to step 207.
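Optional decision 205 amounts to a test for temporal overlap. A minimal sketch, assuming each touch input carries a start and an end timestamp, is shown below.

```python
def inputs_overlap(first_start, first_end, second_start, second_end):
    """Return True when at least a portion of the two touch inputs occurs
    simultaneously, i.e., when their time intervals overlap."""
    return max(first_start, second_start) <= min(first_end, second_end)

print(inputs_overlap(0.00, 0.40, 0.25, 0.60))  # True: the inputs partially overlap
print(inputs_overlap(0.00, 0.20, 0.30, 0.50))  # False: the inputs occur sequentially
```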
In step 207, in one or more embodiments, one or more processors of the electronic device cause the graphical object to visually appear to rotate. In one or more embodiments, where step 202 comprises determining a distance at which the first touch input occurred and step 203 comprises determining another distance at which the second touch input occurred, step 207 comprises the one or more processors visually appearing to rotate the graphical object by an amount of rotation that is a function of the first distance, the second distance, or a combination thereof.
By way of illustration, in one or more embodiments, step 207 may comprise the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to the first distance. In another embodiment, step 207 may include the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to the second distance. In yet another embodiment, step 207 may include the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to an average of the first distance and the second distance. In yet another embodiment, step 207 may include the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to a weighted average of the first distance and the second distance. Other metrics for determining how much the one or more processors should visually make the graphical object appear to rotate will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
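The mapping from swipe distance to rotation amount is left open by step 207. The sketch below illustrates the four alternatives just listed (first distance, second distance, average, and weighted average); the scale factor and weighting are assumed values chosen only for illustration.

```python
def rotation_amount(first_distance, second_distance, mode="average",
                    front_weight=0.7, scale=0.5):
    """Compute an on-screen rotation, in degrees, from the two swipe distances.

    mode selects which distance (or combination of distances) drives the
    rotation; scale and front_weight are illustrative assumptions.
    """
    if mode == "first":
        basis = first_distance
    elif mode == "second":
        basis = second_distance
    elif mode == "weighted":
        basis = front_weight * first_distance + (1 - front_weight) * second_distance
    else:  # "average"
        basis = (first_distance + second_distance) / 2.0
    return scale * basis

print(rotation_amount(120.0, 80.0, mode="average"))   # 50.0 degrees
print(rotation_amount(120.0, 80.0, mode="weighted"))  # 54.0 degrees
```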
In one or more embodiments, step 208 includes one or more processors of the electronic device performing the control operations. By way of illustration, in one or more embodiments, step 208 comprises one or more processors performing the adjustment operation. Turning briefly to fig. 9, some adjustments that may be made at step 208 are illustrated.
In one or more embodiments, one or more processors of the electronic device may perform a volume adjustment operation by adjusting the volume 901 of the electronic device to perform an adjustment operation. By way of example, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a volume knob that is turned in response to the first touch input and the second touch input. In this case, the one or more processors may adjust the volume 901 up or down depending on the direction of each of the first touch input or the second touch input. Further, in one or more embodiments, the one or more processors may adjust the volume 901 by an adjustment magnitude that is proportional to the amount of rotation, which, as described above, may be proportional to the first distance, the second distance, an average of the first distance and the second distance, a weighted average of the first distance and the second distance, or other mathematical combination of the first distance and the second distance. Other techniques for determining the magnitude of the adjustment will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, the one or more processors of the electronic device may perform the adjustment operation by adjusting the time setting 902 of the electronic device. By way of example, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a crown that is turned in response to the first touch input and the second touch input. Alternatively, this may represent a dial of a digital watch that is turned in response to the first touch input and the second touch input. In yet other embodiments, this may represent rotation of a graduated digit wheel (dial wheel), where one graduated digit wheel represents the hours, another represents the tens digit of the minutes, and another represents the units digit of the minutes, shown as X:YZ, where X represents the hours, Y the tens digit of the minutes, and Z the units digit of the minutes. In other embodiments, the tens digit and the units digit of the minutes may be combined into a single graduated digit wheel.
In one or more embodiments, where one of these scenarios is presented and a first touch input and a second touch input are received, the one or more processors can adjust the time setting 902 up or down according to the direction of each of the first touch input or the second touch input. Further, in one or more embodiments, the one or more processors can adjust the time setting 902 by an adjustment magnitude that is proportional to the amount of rotation. For example, where the amount of rotation is proportional to the first distance, the second distance, an average of the first and second distances, a weighted average of the first and second distances, or some other mathematical combination of the first and second distances, the hours, the tens digit of the minutes, the units digit of the minutes, or the combined minutes may be adjusted in proportion to the magnitudes of the first and second distances.
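A hypothetical sketch of the X:YZ digit-wheel arrangement follows; the function name, the 12-hour wrap, and the step sizes are assumptions. Each wheel advances by a number of steps derived from the rotation amount, and the time display is reassembled from the hour wheel, the tens-of-minutes wheel, and the units-of-minutes wheel.

```python
def adjust_time(hour, minute, wheel, steps):
    """Advance one graduated digit wheel of an X:YZ time display.

    wheel is "hour", "tens", or "units"; steps may be negative to turn the
    wheel the other way. Each wheel wraps the way a physical wheel would.
    """
    tens, units = divmod(minute, 10)
    if wheel == "hour":
        hour = (hour + steps) % 12 or 12  # 12-hour wheel: 1..12
    elif wheel == "tens":
        tens = (tens + steps) % 6         # tens-of-minutes wheel: 0..5
    else:
        units = (units + steps) % 10      # units-of-minutes wheel: 0..9
    return hour, tens * 10 + units

print(adjust_time(9, 47, "tens", 2))   # (9, 7)  -> displayed as 9:07
print(adjust_time(9, 47, "hour", -1))  # (8, 47) -> displayed as 8:47
```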
In one or more embodiments, the one or more processors of the electronic device may perform an adjustment operation by adjusting the display brightness 903 of the electronic device in response to the first touch input and the second touch input, thereby performing a light emission output adjustment operation. By way of example, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a rotatable dimmer switch that turns in response to the first touch input and the second touch input. In this case, the one or more processors may adjust the display brightness 903 or illumination level of the display up or down depending on the direction of each of the first touch input or the second touch input. As described above, in one or more embodiments, the one or more processors can adjust the display brightness 903 by an adjustment magnitude proportional to the amount of rotation as mentioned above.
In one or more embodiments, the one or more processors of the electronic device may perform the adjustment operation by performing zoom adjustment operation 904 with an imager of the electronic device. By way of illustration, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a traditional cylindrical lens that rotates in response to the first touch input and the second touch input. In this case, the one or more processors may cause the imager of the electronic device to zoom in or out depending on the direction of each of the first touch input or the second touch input. In one or more embodiments, the one or more processors may adjust the amount of zoom by an adjustment magnitude that is proportional to the amount of rotation, as described above.
In one or more embodiments, one or more processors of the electronic device may perform the adjustment operation by adjusting the alert output 905 of the electronic device. By way of example, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a control knob that is turned in response to the first touch input and the second touch input. In this case, the one or more processors may adjust the output level of the alert upward or downward depending on the direction of each of the first touch input or the second touch input, the alert being visual, audible, tactile, or a combination thereof.
In reviewing the above, in one or more embodiments, the electronic device can include various output devices such as a video output component, an auxiliary device, an audio output component, and/or a mechanical output component. In one or more embodiments, the output level, i.e., brightness, strobe rate, color or other visual output, sound selection, volume, sound tempo, timbre, tone or other audible output, or tactile intensity, tactile vibration signal, tactile tempo or other mechanical output, varies as a function of the first touch input and the second touch input. In one or more embodiments, the one or more processors may adjust the alert output 905 by an adjustment magnitude proportional to the amount of rotation, as described above.
In one or more embodiments, the one or more processors of the electronic device may perform the adjustment operation by adjusting the content selection 906 presented by the electronic device, thereby performing the content selection adjustment operation. By way of example, if the electronic device is outputting music content, video content, or other information content, in one or more embodiments, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a selector knob that is turned in response to the first touch input and the second touch input. In this case, the one or more processors may switch the music content selection, the video content selection, or the other information content selection to another content selection in response to the first touch input or the second touch input.
In one or more embodiments, one or more processors of the electronic device may perform the adjustment operation by adjusting a scroll operation 907 of the electronic device. By way of example, when one or more processors of the electronic device cause the graphical object to visually appear to rotate, this may represent a scroll knob that is turned in response to the first touch input and the second touch input. In this case, the one or more processors may scroll through one or more pictures, titles, songs, movies, television programs, dates of a calendar, email selections, text message selections, or other content selections in a forward scroll action, or a reverse scroll action, depending on the direction of each of the first touch input or the second touch input.
In one or more embodiments, the one or more processors may perform the scrolling operation 907 by an adjustment magnitude proportional to the amount of rotation, which, as described above, may be proportional to the first distance, the second distance, an average of the first distance and the second distance, a weighted average of the first distance and the second distance, or other mathematical combination of the first distance and the second distance. Other techniques for determining the magnitude of the adjustment will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
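The control operations of fig. 9 share a common pattern: select a setting, then adjust it in a direction given by the rotation sense and by a magnitude proportional to the rotation amount. The dispatcher below is a sketch of that pattern; the setting names, ranges, and per-degree gains are assumptions, not values from the disclosure.

```python
SETTINGS = {
    # setting: (minimum, maximum, gain per degree of rotation) -- illustrative values
    "volume":     (0,   100, 0.5),
    "brightness": (0,   255, 1.0),
    "zoom":       (1.0, 8.0, 0.02),
    "scroll":     (0,   999, 0.25),
}

def apply_rotation(setting, current, rotation_degrees):
    """Adjust `setting` from `current` by a magnitude proportional to the
    rotation amount; the sign of rotation_degrees gives the direction."""
    low, high, gain = SETTINGS[setting]
    return max(low, min(high, current + gain * rotation_degrees))

print(apply_rotation("volume", 40, 30.0))       # 55.0
print(apply_rotation("brightness", 250, 20.0))  # 255, clamped at the maximum
```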
It should be noted that the various control operations shown in fig. 9 are intended to illustrate only some of the myriad of control operations that may be performed in response to the method (200) shown in fig. 2, in which, after presenting a graphical object that visually represents the rotatable item, the touch-sensitive display detects a first touch input occurring in a first direction on the touch-sensitive display, the touch-sensitive surface detects a second touch input occurring in a second direction on the touch-sensitive surface, and thereafter the one or more processors rotate the rotatable item in response to the first touch input and the second touch input occurring in opposite directions. Many other control operations will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Returning now to FIG. 2, in one or more embodiments that include optional decision 205, step 208 occurs only when at least a portion of the first user input detected at step 202 and at least a portion of the second user input detected at step 203 occur simultaneously. Thus, in one or more embodiments, step 208 comprises: the graphical object is made to visually appear to rotate only when at least a portion of the first user input and at least a portion of the second user input occur simultaneously.
Turning now to fig. 3 to 4, there is shown an electronic device employing the method described above with reference to fig. 2. Beginning with fig. 3, an electronic device 100 is shown therein that includes a housing 103, the housing 103 defining a first major surface 104 separated from a second major surface 105 by one or more minor surfaces. The touch sensitive display 101 is located on the first major surface 104 and the touch sensitive surface 108 is located on the second major surface 105. As described above with reference to fig. 1, one or more processors (112) may operate with touch-sensitive display 101 and touch-sensitive surface 108.
As shown in step 301 of FIG. 3, in the exemplary embodiment, one or more processors (112) cause touch-sensitive display 101 to present graphical object 303 that visually represents the rotatable item. In this embodiment, the rotatable item comprises a graduated cylindrical object 304 on the touch sensitive display 101.
In the exemplary embodiment, rendered graphical object 303 is configured to rotate about a rotation axis 309 that passes through the right and left sides of graphical object 303. In this exemplary embodiment, the rotatable item is presented in a front view on the touch sensitive display 101. Thus, the rotation axis 309 is aligned in a plane defined by the touch sensitive display 101, e.g., the rotation axis 309 lies in the X-Z plane. Alternatively, in another embodiment, the axis of rotation 309 may lie in a plane parallel to the X-Z plane.
In this exemplary embodiment, touch-sensitive surface 108 is configured as a second touch-sensitive display located on second major surface 105 of electronic device 100. Accordingly, the one or more processors (112) cause the second touch-sensitive display to present another graphical object 313, which graphical object 313 in this example also represents the back side of the graphical object 303 on the back side of the electronic device 100.
In this exemplary embodiment, the touch sensitive display 101 presents the front of the rotatable item, while the touch sensitive surface 108 configured as a touch sensitive display presents the rear of the rotatable item. In this exemplary embodiment, the front portion of the rotatable article comprises the numbers 1, 2, 3, 4, 5, each located between corresponding graduations, while the rear portion of the rotatable article comprises the numbers 6, 7, 8, 9, 0, also located between corresponding graduations, respectively.
As shown in step 301, the touch sensitive display 101 detects a first touch input 305 occurring on the touch sensitive display 101. As seen by comparing step 301 and step 302, in this exemplary embodiment, the first touch input 305 comprises a gesture input moving in a first direction 307 or having a direction vector oriented in the first direction 307.
As shown in step 301, the touch-sensitive surface detects a second touch input 306 that occurs on the touch-sensitive surface 108. As seen by comparing step 301 and step 302, in this exemplary embodiment, the second touch input 306 comprises another gesture input moving in a second direction 308 or having a direction vector oriented in the second direction 308.
In one or more embodiments, the one or more processors (112) confirm that the first touch input 305 and the second touch input 306 at least partially interact with, interface with, and/or traverse the graphical object 303. In fig. 3, this occurs when the first touch input 305 at least partially overlaps the graphical object 303 at the beginning and end of the gesture input. Similarly, the second touch input 306 at least partially overlaps another graphical object 313 at the beginning and end of the gesture input.
Thus, in this example, as the first touch input 305 and the second touch input 306 move in the first direction 307 and the second direction 308, they pass at least partially through the graphical object 303 and the further graphical object 313, as if the first touch input 305 and the second touch input 306 were physically gripping and turning the rotatable item, with an upper portion of the rotatable item (above the rotation axis 309) moving out of the touch sensitive display 101 and a lower portion of the rotatable item (below the rotation axis 309) moving into the touch sensitive display 101.
In one or more embodiments, the one or more processors (112) rotate the rotatable item only when the first touch input 305 and/or the second touch input 306 at least partially pass through, interact with, or engage the rotatable item. Thus, in one or more embodiments, the rotatable item will rotate only if the first touch input 305 and the second touch input 306 virtually "grab" the rotatable item.
In one or more embodiments, the one or more processors (112) determine whether the movement is in the opposite direction by determining whether at least a portion of the first touch input 305 and at least a portion of the second touch input 306 pass through locations 310, 311 of the touch-sensitive display 101 and the touch-sensitive surface 108, wherein the locations 310, 311 are defined by a reference axis 312 that is orthogonally aligned with one or both of the touch-sensitive display 101 and/or the touch-sensitive surface 108. This effectively determines whether the first touch input 305 and the second touch input 306 "cross" at the reference axis 312 or "overlap" along the reference axis 312, as would be the case if the first touch input 305 and the second touch input 306 actually rotated the rotatable item into the touch sensitive display 101.
In one or more embodiments, the one or more processors (112) cause the graphical object 303 to visually appear to rotate only when at least a portion of the first touch input 305 and at least a portion of the second touch input 306 pass through the locations 310, 311. In other words, in one or more embodiments, the one or more processors (112) cause the graphical object 303 to rotate only when the first touch input 305 and the second touch input 306 overlap at some point at a reference axis 312 orthogonally oriented to the touch-sensitive display 101 during their movement or gesture along the touch-sensitive display 101 and the touch-sensitive surface 108, respectively. This occurs in this illustrative example, as shown at step 302. Thus, the one or more processors (112) rotate the rotatable item into the touch-sensitive display 101 in response to the first touch input 305 and the second touch input 306 occurring in opposite directions.
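A minimal Python sketch of this "cross at the reference axis" test follows, assuming the two major surfaces report touches in a shared coordinate frame and that the reference axis pierces both surfaces at a single (x, y) location; the sampling, tolerance, and function names are assumptions for illustration.

import math

def passes_through(path: list, ref_xy: tuple, tolerance: float = 15.0) -> bool:
    """True if any sampled point of the gesture lies within `tolerance`
    of the location defined by the reference axis."""
    return any(math.hypot(x - ref_xy[0], y - ref_xy[1]) <= tolerance for (x, y) in path)

def inputs_cross(front_path: list, rear_path: list, ref_xy: tuple) -> bool:
    """Both the front gesture and the rear gesture must pass through the
    reference location for the rotation to be applied."""
    return passes_through(front_path, ref_xy) and passes_through(rear_path, ref_xy)

front = [(100, 200), (160, 210), (210, 222)]
rear = [(300, 230), (250, 226), (195, 215)]
print(inputs_cross(front, rear, ref_xy=(200, 220)))  # -> True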
As seen by comparing step 301 and step 302, this rotation causes the rotatable item presented on the touch sensitive display 101 at step 301 to change from presenting the numbers 1, 2, 3, 4, 5 between respective scales to presenting the numbers 2, 3, 4, 5, 6 between respective scales at step 302. Similarly, this causes the touch-sensitive surface 108 at step 301 to change from presenting the numbers 6, 7, 8, 9, 0 between the respective scales to presenting the numbers 1, 0, 9, 8, 7 between the respective scales at step 302. As shown, at step 302, the graphical object 303 has been rotated into the touch-sensitive display, with the upper portion of the rotatable item (above the axis of rotation 309) moving out of the touch-sensitive display 101 and the lower portion of the rotatable item (below the axis of rotation 309) moving into the touch-sensitive display 101.
In one or more embodiments, the one or more processors (112) optionally determine whether the first touch input 305 and the second touch input 306 occur simultaneously. Although the first touch input 305 and the second touch input 306 may occur sequentially, with the rotatable item virtually pushed with the first touch input 305 and virtually pulled with the second touch input 306, in some embodiments, rotation occurs only when the first touch input 305 and the second touch input 306 virtually grip the rotatable item at the same time. Thus, in one or more embodiments, the one or more processors (112) rotate the rotatable item only when at least a portion of the first touch input 305 occurs simultaneously with at least a portion of the second touch input 306, as is the case with steps 301 and 302 in fig. 3.
In one or more embodiments, the one or more processors (112) and/or touch-sensitive display 101 determine a distance 314 that a gesture defined by first touch input 305 occurred across touch-sensitive display 101. Similarly, the one or more processors (112) and/or the touch-sensitive surface 108 may determine a distance 315 that a gesture defined by the second touch input 306 occurs across the touch-sensitive surface 108. In one or more embodiments, where this occurs, the one or more processors (112) may cause graphical object 303 and/or additional graphical object 313 to visually appear to be rotated about rotational axis 309 by an amount of rotation 316, the amount of rotation 316 being a function of first distance 314, second distance 315, or a combination thereof.
In the exemplary embodiment of fig. 3, the one or more processors (112) visually make the graphical object 303 and the further graphical object 313 appear to be rotated by an amount of rotation 316, the amount of rotation 316 being proportional to an average of the first distance 314 and the second distance 315. However, embodiments of the present disclosure are not limited thereto. In other embodiments, rotation amount 316 is only proportional to first distance 314. In yet other embodiments, amount of rotation 316 is only proportional to second distance 315. In other embodiments, rotation amount 316 is proportional to a weighted average of first distance 314 and second distance 315. Other metrics for determining how much the one or more processors should visually make the graphical object 303 and/or the further graphical object 313 appear to be rotated will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
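For illustration only, the following Python sketch maps the two gesture distances to a rotation angle and then to a number of whole scale graduations; the effective cylinder radius, weighting, and graduation count are assumed values and are not taken from the figures.

import math

def rotation_angle(first_distance: float, second_distance: float,
                   cylinder_radius_px: float = 120.0,
                   first_weight: float = 0.5) -> float:
    """Angle in radians that the cylinder appears to turn, here proportional
    to a weighted average of the first and second gesture distances."""
    combined = first_weight * first_distance + (1.0 - first_weight) * second_distance
    return combined / cylinder_radius_px  # arc length divided by radius

def graduations_advanced(angle_radians: float, graduations_per_turn: int = 10) -> int:
    """Number of whole graduations that rotate past the front of the display."""
    return int((angle_radians / (2.0 * math.pi)) * graduations_per_turn)

# Example: two 100-pixel gestures turn the cylinder about 0.83 radians,
# advancing the visible scale by one graduation under these assumed constants.
print(graduations_advanced(rotation_angle(100.0, 100.0)))  # -> 1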
Turning now to fig. 4, there is shown an electronic device 100, again comprising a housing 103, the housing 103 defining a first major surface 104 separated from a second major surface 105 by one or more minor surfaces. The touch sensitive display 101 is again located on the first major surface 104, while the touch sensitive surface 108 is located on the second major surface 105. One or more processors (112) may operate with touch-sensitive display 101 and touch-sensitive surface 108.
As shown in step 401, in the exemplary embodiment, one or more processors (112) cause touch-sensitive display 101 to present graphical object 404 that visually represents the rotatable item. In this embodiment, the rotatable item comprises a graduated cylindrical object 405 depicted on the touch sensitive display 101.
As with the embodiment of FIG. 3 described above, in FIG. 4, touch-sensitive surface 108 is configured as a second touch-sensitive display located on second major surface 105 of electronic device 100. Accordingly, the one or more processors (112) cause the second touch-sensitive display to present another graphical object 414, which graphical object 414, in this example, also represents the back side of the graphical object 404 on the back side of the electronic device 100.
In this exemplary embodiment, the rotatable item is presented on the touch-sensitive display 101 from a front, left perspective (as viewed on the touch-sensitive display 101). Thus, the axis of rotation 406 is oriented in the X-Y plane, but the axis of rotation 406 is aligned in a direction oblique to the Y axis. The touch sensitive display 101 presents the front, left portion of the rotatable article, while the touch sensitive surface 108 configured as a touch sensitive display presents the rear, right portion of the rotatable article. In this exemplary embodiment, the front portion of the rotatable article comprises the numbers 1, 2, 3, 4, each located between corresponding graduations, while the rear portion of the rotatable article comprises the numbers 6, 7, 8, 9, each also located between corresponding graduations.
As with the embodiment of fig. 3, the graphical object 404 presented in fig. 4 is configured to rotate about a rotation axis 406 passing through the right and left sides of the graphical object 404. As shown in steps 402, 403, when the graphical object 404 is rotated, the portion of the rotatable item above the axis of rotation 406 appears to move in the positive Y-direction, while the portion of the rotatable item below the axis of rotation 406 appears to move in the negative Y-direction.
As shown in step 402, the touch sensitive display 101 detects a first touch input 407 occurring on the touch sensitive display 101. In this exemplary embodiment, the touch-sensitive surface 108 simultaneously detects a second touch input 408 that occurs on the touch-sensitive surface 108. As shown in step 402, the first touch input 407 virtually grips the rotatable item in a first position, while the second touch input 408 virtually grips the rotatable item in a second position.
In the transition from step 402 to step 403, the first touch input 407 virtually rotates the rotatable item out of the touch-sensitive display 101, while the second touch input 408 virtually rotates the rotatable item out of the touch-sensitive surface 108. In particular, in the transition from step 402 to step 403, the first touch input 407 moves down and to the right along the touch sensitive display 101, travels a first distance 409 along the touch sensitive display 101 while engaging (and thus passing through) the graphical object 404. Similarly, the second touch input 408 moves up and to the left along the touch-sensitive surface 108, traveling a second distance 415 while engaging (and thus passing through) another graphical object 414.
In doing so, the first touch input 407 and the second touch input 408 move in opposite directions. Further, the first touch input 407 and the second touch input 408 pass through locations 410, 411 of the touch sensitive display 101 and the touch sensitive surface 108, the locations 410, 411 each being defined as an area around a reference axis 412, the reference axis 412 being aligned orthogonally to one or both of the touch sensitive display 101 and/or the touch sensitive surface 108. This confirms that the first touch input 407 and the second touch input 408 "cross" the reference axis 412 or "overlap" each other along a minor dimension of the electronic device 100. Thus, the one or more processors (112) rotate the rotatable article about the axis of rotation 406.
This causes the rotatable item presented on the touch sensitive display 101 at step 402 to change from presenting the numbers 1, 2, 3, 4 between the respective scales to presenting the numbers 8, 9, 0, 1 between the respective scales at step 403, as seen by comparing step 402 and step 403. In the exemplary embodiment, the amount of rotation is a function of an average of a first distance 409 of the first touch input 407 through the touch-sensitive display 101 and a second distance 415 of the second touch input 408 through the touch-sensitive surface 108. The graphical object 404 is thus rotated into the touch sensitive display 101, as shown in step 403, wherein the upper part of the rotatable item (above the rotation axis 406) is moved to the right and out of the touch sensitive display 101, while the lower part of the rotatable item (below the rotation axis 406) is moved to the left and into the touch sensitive display 101.
Turning next to fig. 5, another method 500 configured in accordance with one or more embodiments of the present disclosure is illustrated. Beginning at step 501, method 500 includes presenting a graphical object representing a rotatable item with a touch-sensitive display located on a first major surface of an electronic device. In one or more embodiments, the rotatable article is presented as being rotatable about an axis of rotation passing through the right and left sides of the rotatable article.
As previously mentioned, the rotatable article may be presented with a rotational axis oriented in one of several ways: in the X-Z plane (front or rear view, or top or bottom view); parallel to the X-Z plane (front or rear view, or top or bottom view, but visually set back within the visual presentation of the display); or lying in the X-Y plane and tilted with respect to the Y axis (front, right isometric view; front, left isometric view; top, right isometric view; top, left isometric view; bottom, right isometric view; bottom, left isometric view; rear, right isometric view; right, front isometric view; right, rear isometric view; left, rear isometric view; right, top isometric view; right, bottom isometric view; left, top isometric view; or left, bottom isometric view). By presenting the rotatable article in this manner at step 501, the rotatable article visually appears to rotate into and/or out of the touch-sensitive display in response to user input, rather than parallel to the plane defined by the touch-sensitive display, as would be the case if the axis of rotation were along or parallel to the Y-axis.
As previously mentioned, the rotatable article may take any of a variety of forms. These include a generic rotatable gadget, game show wheel, casino game, light bulb, bottle cap, lever, dial, crown, graduated cylindrical object, knob, or other object. Other rotatable articles will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
At step 502, method 500 detects, with a touch-sensitive display, a first touch input occurring at the touch-sensitive display. In one or more embodiments, step 502 includes determining a direction of movement of the first touch input. In one or more embodiments, step 502 includes detecting a first touch input at least partially interacting with or passing through a graphical object representing a rotatable item. In one or more embodiments, step 502 optionally includes determining a first distance that the first touch input occurred along the direction in which the first touch input occurred.
At step 503, method 500 includes detecting, with the touch-sensitive surface on the second major surface of the electronic device, a second touch input occurring at the touch-sensitive surface. In contrast to the method (200) of fig. 2 described above, where the rotatable item is presented on both the touch-sensitive display and the touch-sensitive surface, the method 500 of fig. 5 is designed for the case where there is no display of the rotatable item on the touch-sensitive surface. This may be due to the touch-sensitive surface not having graphical indicia display capabilities in one or more embodiments. In other embodiments, this may be the case where the touch-sensitive surface is configured as a touch-sensitive display but does not present graphical indicia to save power or for other reasons. Other configurations of electronic devices in which the rear-facing touch-sensitive surface may not present graphical objects representing rotatable items will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, step 503 includes determining a direction of movement of the second touch input. In one or more embodiments, step 503 optionally includes determining a second distance that the second touch input occurred along the direction in which the second touch input occurred.
In one or more embodiments, step 503 further includes detecting whether the second touch input (i.e., the touch input detected on the touch-sensitive surface) occurs at a location corresponding to the first touch input detected at step 502. In other words, where the rotatable article is presented on the touch-sensitive display at step 501, and where step 502 comprises detecting a first touch input at least partially interacting with or passing through the graphical object representing the rotatable article, in one or more embodiments step 503 comprises determining whether the second touch input occurs at a location on the touch-sensitive surface that can reasonably be associated with interacting with or passing through the rotatable article, as if the rotatable article had also been presented on the touch-sensitive surface, or whether the second touch input instead occurs at a location that cannot reasonably be correlated with the first touch input.
This can occur in one of several different ways. In one or more embodiments, step 503 includes determining, with the one or more processors, whether at least a portion of the first user input and at least a portion of the second user input pass through locations of the touch-sensitive display and the touch-sensitive surface that correspond to a location where the rotatable object is or should be (if the touch-sensitive surface is a touch-sensitive display that presents another graphical object representing the rotatable item). In one or more embodiments, this includes determining whether at least a portion of the first user input and at least a portion of the second user input pass through a location of the touch-sensitive display and the touch-sensitive surface defined by a reference axis (which may have a varying width), the reference axis being aligned orthogonally to the touch-sensitive surface. Other techniques for determining whether the location of the second touch input corresponds to a location on the touch-sensitive surface that may be reasonably associated with interacting with or passing through the rotatable item will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
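One possible hit test for this correspondence check is sketched below in Python; the rectangle representation, the left-right mirroring between the rear surface and the front display, and the coordinate values are assumptions for the example.

from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def mirror_x(x: float, surface_width: float) -> float:
    """Map a rear-surface x coordinate into front-display coordinates, assuming
    the rear sensor mirrors the front display left to right."""
    return surface_width - x

def rear_touch_corresponds(touch_x: float, touch_y: float,
                           item_bounds_front: Rect, surface_width: float) -> bool:
    """True if the rear touch lands where the rotatable item would be located
    had it also been presented on the touch-sensitive surface."""
    x = mirror_x(touch_x, surface_width)
    return (item_bounds_front.left <= x <= item_bounds_front.right and
            item_bounds_front.top <= touch_y <= item_bounds_front.bottom)

bounds = Rect(left=100, top=300, right=400, bottom=500)
print(rear_touch_corresponds(touch_x=250, touch_y=350,
                             item_bounds_front=bounds, surface_width=480))  # -> True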
In one or more embodiments, a determination 504 is made whether a first direction of a first touch input occurring on the touch-sensitive display and a second direction of a second touch on the touch-sensitive surface comprise direction vectors oriented in opposite directions. In one or more embodiments, this decision 504 includes determining whether a first direction of a first touch input occurring on the touch-sensitive display and a second direction of a second touch input on the touch-sensitive surface are at least partially opposite or completely opposite.
In one or more embodiments, where they are at least partially or fully opposite, the method moves to optional decision 505. In the event they are not, a control operation is performed in step 506 according to a common direction in which a first touch input on the touch sensitive display and a second touch input on the touch sensitive surface occur.
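As a purely illustrative sketch in Python, decision 504 could classify the two gesture directions as fully opposite, partially opposite, or common by comparing unit direction vectors; the vectors are assumed to be expressed in a shared coordinate frame, and the angular tolerance is an assumed tuning value.

import math

def direction_vector(start: tuple, end: tuple) -> tuple:
    """Unit direction vector of a gesture from its start point to its end point."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    norm = math.hypot(dx, dy) or 1.0
    return (dx / norm, dy / norm)

def classify_directions(d1: tuple, d2: tuple, opposite_tolerance_deg: float = 20.0) -> str:
    """Return 'opposite' when the vectors point within a tolerance of 180 degrees
    apart, 'partially_opposite' when their dot product is negative, else 'common'."""
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    if angle >= 180.0 - opposite_tolerance_deg:
        return "opposite"
    if dot < 0.0:
        return "partially_opposite"
    return "common"

print(classify_directions(direction_vector((0, 0), (100, 0)),
                          direction_vector((50, 5), (-60, 5))))  # -> opposite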
At optional decision 505, method 500 optionally includes determining, with one or more processors, whether at least a portion of the first touch input detected at step 502 and at least a portion of the second touch input detected at step 503 occur simultaneously. In one or more embodiments, where step 503 includes determining whether the second touch input occurred at a location on the touch-sensitive surface that may be reasonably correlated with a location of interaction with or passage through the rotatable item, decision 505 may include verifying whether the second touch input occurred at such a location.
In the event that this optional decision 505 is included and a portion of the first touch input detected at step 502 and a portion of the second touch input detected at step 503 do not occur simultaneously, or the second touch input does not occur at a location on the touch-sensitive surface that may be reasonably correlated with a location of interaction with or passage through the rotatable item, a control operation is performed at step 506 in accordance with the first touch input and the second touch input, which occur sequentially. However, where this optional decision 505 is included and a portion of the first touch input detected at step 502 and a portion of the second touch input detected at step 503 do occur simultaneously, and, optionally, the second touch input occurs at a location on the touch-sensitive surface that may be reasonably correlated with a location of interaction with or passage through the rotatable article, for example, when at least a portion of the first user input and at least a portion of the second user input pass through a location identified by a reference axis oriented perpendicular to the touch-sensitive display, the method proceeds to step 507.
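A minimal Python sketch of the simultaneity test of optional decision 505 follows; the timestamp representation, and the assumption that both touch controllers share a common clock, are introduced for the example.

from dataclasses import dataclass

@dataclass
class TouchStroke:
    start_ms: int  # timestamp of the first sample of the touch input
    end_ms: int    # timestamp of the last sample of the touch input

def occur_simultaneously(first: TouchStroke, second: TouchStroke) -> bool:
    """True when at least a portion of the first touch input and at least a
    portion of the second touch input overlap in time."""
    return first.start_ms <= second.end_ms and second.start_ms <= first.end_ms

# A front gesture spanning 0-400 ms and a rear gesture spanning 250-600 ms
# overlap, so the rotation of step 507 would be taken in this sketch.
print(occur_simultaneously(TouchStroke(0, 400), TouchStroke(250, 600)))  # -> True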
At step 507, in one or more embodiments, one or more processors of the electronic device cause the graphical object to visually appear to rotate. In one or more embodiments, where step 502 includes determining a distance at which the first touch input occurred and step 503 includes determining another distance at which the second touch input occurred, step 507 includes the one or more processors visually causing the graphical object to appear to rotate by an amount of rotation that is a function of the first distance, the second distance, or a combination thereof. In one or more embodiments, where step 503 comprises determining whether the second touch input occurs at a location on the touch-sensitive surface that can reasonably correlate to a location of interaction with or traversal of the rotatable item, and/or decision 505 comprises confirming whether the second touch input occurs at a location on the touch-sensitive surface that can reasonably correlate to a location of interaction with or traversal of the rotatable item, step 507 comprises causing the graphical object to visually appear to rotate only when at least a portion of the first user input and at least a portion of the second user input traverse locations defined by axes perpendicular to the touch-sensitive display.
In one or more embodiments, step 507 includes the one or more processors visually making the graphical object appear to be rotated by an amount of rotation proportional to the first distance. In another embodiment, step 507 may include the one or more processors visually making the graphical object appear to be rotated by an amount of rotation proportional to the second distance. In yet another embodiment, step 507 may include the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to an average of the first distance and the second distance. In yet another embodiment, step 507 may include the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation proportional to a weighted average of the first distance and the second distance. Other metrics for determining how much the one or more processors should visually make the graphical object appear to rotate will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, step 508 includes one or more processors of the electronic device performing the control operations. By way of illustration, in one or more embodiments, step 508 includes one or more processors performing the adjustment operation. Examples of the adjustment operation and other operations that may be performed at step 508 are shown and described above with reference to fig. 9. Others will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
Turning now to fig. 6, there is shown an electronic device 100, again comprising a housing 103, the housing 103 defining a first major surface 104 separated from a second major surface 105 by one or more minor surfaces. The touch sensitive display 101 is again located on the first major surface 104, while the touch sensitive surface 108 is located on the second major surface 105. One or more processors (112) may operate with touch-sensitive display 101 and touch-sensitive surface 108.
As shown in step 601, in the exemplary embodiment, one or more processors (112) cause touch-sensitive display 101 to present graphical object 604 that visually represents the rotatable item. In this embodiment, the rotatable item comprises a graduated cylindrical object 605 depicted on the touch sensitive display 101.
In contrast to fig. 4-5 described above, in this example, the touch-sensitive surface 108 is not configured to present graphical indicia. The absence of graphical objects or marker display operations may occur for various reasons. In one illustrative example, the touch-sensitive surface 108 may be simply configured as a touch sensor without graphical indicia display capability. In other embodiments, the touch-sensitive surface 108 may actually be configured as a touch-sensitive display. However, the touch-sensitive display may not present graphical objects or indicia to save power or for other reasons. Other reasons for not presenting graphical information by the touch-sensitive surface 108 will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
As with FIG. 4 described above, in this exemplary embodiment, the rotatable item is presented on the touch-sensitive display 101 from a front, left perspective (as viewed on the touch-sensitive display 101). Thus, the rotation axis 606 is oriented in the X-Y plane, but the rotation axis 606 is aligned in a direction oblique to the Y axis. Thus, the touch sensitive display 101 presents the front, left portion of the rotatable item. In this exemplary embodiment, the front of the rotatable article comprises the numbers 1, 2, 3, 4, each number being located between the respective graduations. Nothing is presented on the touch-sensitive surface 108.
The graphical object 604 presented in fig. 6 is configured to rotate around a rotation axis 606 passing through the right and left sides of the graphical object 604. As shown by steps 602, 603, when the graphical object 604 is rotated, the portion of the rotatable item above the axis of rotation 606 appears to move in the positive Y-direction, while the portion of the rotatable item below the axis of rotation 606 appears to move in the negative Y-direction.
As shown in step 602, the touch sensitive display 101 detects a first touch input 607 occurring on the touch sensitive display 101. In this exemplary embodiment, the touch-sensitive surface 108 simultaneously detects a second touch input 608 that occurs on the touch-sensitive surface 108. As shown in step 602, the first touch input 607 virtually grips the rotatable item in a first position, while the second touch input 608 virtually grips the rotatable item by occurring at the position where the rotatable item could also be physically gripped if it were a real rotatable item whose rotation axis 606 passed through the electronic device 100. Note that although the rotatable item is not graphically presented on the touch-sensitive surface 108, the second touch input 608 occurs at this location.
In the transition from step 602 to step 603, the combination of the first touch input 607 and the second touch input 608 virtually rotates the portion of the rotatable item above the rotation axis 606 into the touch-sensitive display 101, while the second touch input 608 virtually rotates the rotatable item into the touch-sensitive surface 108. In particular, in the transition from step 602 to step 603, the first touch input 607 moves up and to the right along the touch sensitive display 101, traveling a first distance 609 along the touch sensitive display 101 while engaging (and thus passing through) the graphical object 604. Similarly, the second touch input 608 moves down and to the left along the touch-sensitive surface 108 to travel the second distance 613.
In doing so, the first touch input 607 and the second touch input 608 move in opposite directions. Further, the first touch input 607 and the second touch input 608 pass through locations 610, 611 of the touch sensitive display 101 and the touch sensitive surface 108, the locations 610, 611 each being defined by a reference axis 612, the reference axis 612 being orthogonally aligned with one or both of the touch sensitive display 101 and/or the touch sensitive surface 108. This confirms that the first touch input 607 and the second touch input 608 "cross" or "overlap" each other in the secondary dimension of the electronic device 100 along the reference axis 612. Accordingly, the one or more processors (112) rotate the rotatable article about the axis of rotation 606.
This causes the rotatable item presented on the touch sensitive display 101 at step 602 to change from presenting the numbers 1, 2, 3, 4 between the respective scales to presenting the numbers 8, 9, 0, 1 between the respective scales at step 603, as seen by comparing step 602 and step 603. Note that if the direction of the first 607 and second 608 touch inputs were opposite to the direction shown, the rotatable item would rotate in the opposite direction, possibly going from presenting the numbers 1, 2, 3, 4 between the respective scales of step 602 to presenting the numbers 6, 7, 8, 9 between the respective scales of step 603.
In one or more embodiments, the amount of rotation is a function of an average of a first distance 609 of the first touch input 607 through the touch-sensitive display 101 and a second distance 613 of the second touch input 608 through the touch-sensitive surface 108. The graphical object 604 is thus rotated into the touch sensitive display 101, as shown in step 603, wherein the upper part of the rotatable item (above the rotation axis 606) is moved to the right and out of the touch sensitive display 101, while the lower part of the rotatable item (below the rotation axis 606) is moved to the left and into the touch sensitive display 101.
Turning now to fig. 7-8, methods 700, 800 are illustrated for manipulating a graphical object representing a rotatable article presented at least on a touch sensitive display of an electronic device having touch sensors on a rear major face by communicating touch input to the touch sensitive display and the touch sensitive surface. The methods 700, 800 of fig. 7-8 are particularly applicable where the rotatable article is configured as a crown, four-wheel drive/differential lock control, tuner/bass/treble control, or other rotatable control object that may be rotated and translated laterally.
Beginning with FIG. 7, at step 701, method 700 includes presenting a graphical object with a touch-sensitive display located on a first major surface of an electronic device. In one or more embodiments, step 701 includes presenting a graphical object representing the rotatable item. In one or more embodiments, step 701 includes presenting the rotatable item at a first location on the touch-sensitive display.
At step 702, method 700 includes detecting, with a touch-sensitive display, a first touch input on the touch-sensitive display. In one or more embodiments, step 702 includes detecting a first touch input occurring in a first direction across the touch-sensitive display. In one or more embodiments, step 702 includes detecting a first touch input at least partially crossing, overlapping, engaging, or otherwise interacting with a graphical object.
At step 703, method 700 includes detecting a second touch input with the touch-sensitive surface on the second major surface of the electronic device. In one or more embodiments, step 703 includes detecting a second touch input occurring in a second direction across the touch-sensitive surface.
In some embodiments, the touch-sensitive surface will be capable of presenting graphical objects, images, and other information. For example, in one or more embodiments, the touch-sensitive surface is configured as a touch-sensitive display. In this case, step 703 may optionally include detecting a second touch input that at least partially crosses, overlaps, joins or otherwise interacts with the graphical object.
In other embodiments, the touch-sensitive surface will either be capable of presenting graphical objects, images, and other information but optionally refrain from doing so, or alternatively will not be capable of presenting graphical objects, images, and other information. In this case, if the graphical object had been presented on the touch-sensitive surface, with front and rear views representing how the rotatable article would appear as a single unitary physical object, step 703 may comprise detecting whether the second touch input occurred at a location that would at least partially pass through, overlap, engage, or otherwise interact with the graphical object.
Thus, in one or more embodiments, if the rotatable object were presented on the touch-sensitive surface at a location corresponding to the location at which the rotatable object is presented on the touch-sensitive display, step 703 may include determining whether the second touch input occurred at a location on the touch-sensitive surface that may reasonably be associated with a location of interaction with or passage through the rotatable object. In one or more embodiments, step 703 includes determining whether at least a portion of the first user input and at least a portion of the second user input pass through a location defined by an axis perpendicular to the touch-sensitive display.
In one or more embodiments, decision 704 determines whether the first direction of the first touch input detected at step 702 and the second direction of the second touch input detected at step 703 occur in the same direction. In the case where these directions are opposite directions, the method 700 moves to step 705, which rotates the rotatable item as previously described. However, in case the first direction of the first touch input detected in step 702 and the second direction of the second touch input detected in step 703 are the same direction, the method 700 moves to step 706.
In one or more embodiments, step 706 includes translating, with the one or more processors operable with the touch-sensitive display and the touch-sensitive surface, the rotatable item from the first position to the second position. In one or more embodiments, the second position of step 706 is shifted from the first position of step 701. In one or more embodiments, the second position of step 706 is shifted from the first position of step 701 in a common direction of the first direction detected at step 702 and the second direction detected at step 703.
Thus, in one or more embodiments, where the first touch input detected at step 702 and the second touch input detected at step 703 occur at a plurality of locations so as to virtually "grip" the rotatable item, and further occur in a common, e.g., the same, direction, step 706 includes virtually translating the rotatable item across the touch-sensitive display in the common direction defined by the first direction detected at step 702 and the second direction detected at step 703.
For example, where the rotatable item is a crown, step 706 may include virtually pulling the crown out of the case when the first touch input detected at step 702 and the second touch input detected at step 703 are in a direction away from the case. In contrast, step 706 may comprise virtually inserting the crown into the watch case when the direction of the first touch input detected at step 702 and the second touch input detected at step 703 are towards the watch case, and so on.
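To illustrate step 706 in a non-limiting way, the Python sketch below slides the gripped item by the shared horizontal displacement of the two gestures and clamps it between a pushed-in and a pulled-out position, much like the crown example above; the coordinate convention, the clamp limits, and the assumption that both gestures are reported in the front display's coordinate frame are introduced for the sketch.

def translate_item(current_x: float, front_dx: float, rear_dx: float,
                   in_position_x: float = 0.0, out_position_x: float = 80.0) -> float:
    """Slide the rotatable item by the average horizontal displacement of the
    front and rear gestures, keeping it between the fully pushed-in position
    and the fully pulled-out position."""
    shared_dx = (front_dx + rear_dx) / 2.0
    return max(in_position_x, min(out_position_x, current_x + shared_dx))

# Both gestures move about 60 pixels to the right: the item is pulled out.
x_after_pull = translate_item(0.0, front_dx=62.0, rear_dx=58.0)
print(x_after_pull)  # -> 60.0
# Both gestures move left: the item is pushed back toward its original position.
print(translate_item(x_after_pull, front_dx=-70.0, rear_dx=-66.0))  # -> 0.0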
In one or more embodiments, after step 706 occurs, method 700 detects a third touch input with the touch-sensitive display at step 707. In one or more embodiments, step 707 includes detecting a third touch input occurring in a third direction along the touch-sensitive display. In one or more embodiments, step 707 includes detecting a third touch input occurring in a third direction a predetermined distance across the touch-sensitive display. In one or more embodiments, step 707 includes detecting a third touch input that occurs in a third direction and that at least partially crosses, overlaps, joins, or otherwise interacts with the graphical object. In one or more embodiments, the third direction is different from the first direction detected at step 702. In one or more embodiments, the third direction is different from the second direction detected at step 703.
In one or more embodiments, step 708 includes detecting a fourth touch input with the touch-sensitive surface. In one or more embodiments, step 708 includes detecting a fourth touch input that occurs in a fourth direction along the touch-sensitive surface. In one or more embodiments, step 708 includes detecting a fourth touch input occurring in a fourth direction a predetermined distance across the touch-sensitive surface. In one or more embodiments, the fourth direction is different from the first direction detected at step 702. In one or more embodiments, the fourth direction is different from the second direction detected at step 703.
As noted above, in some embodiments, the touch-sensitive surface will be capable of presenting graphical objects, images, and other information. In other embodiments, the touch-sensitive surface will either be capable of presenting graphical objects, images, and other information but optionally refrain from doing so, or alternatively will not be capable of presenting graphical objects, images, and other information.
Thus, in one or more embodiments, if the rotatable object is presented on the touch-sensitive surface at a location that corresponds to the location at which the rotatable object is presented on the touch-sensitive display, step 708 may include determining whether the fourth touch input occurred at a location on the touch-sensitive surface that may reasonably be associated with a location of interaction with or passage through the rotatable object. In one or more embodiments, step 708 includes determining whether at least a portion of the fourth touch input and at least a portion of the third touch input pass through a location defined by an axis perpendicular to the touch-sensitive display.
In one or more embodiments, decision 709 determines whether the third direction detected at step 707 and the fourth direction detected at step 708 occur in a common direction, e.g., the same direction, or whether they occur in different directions, e.g., opposite directions. For example, where decision 709 determines that the third direction of the third touch input detected at step 707 and the fourth direction of the fourth touch input detected at step 708 occur in the same direction, the method moves to step 710, where the rotatable item is translated as described above with reference to step 706. In contrast, where decision 709 determines that the third direction of the third touch input detected at step 707 and the fourth direction of the fourth touch input detected at step 708 occur in opposite directions, the method moves to step 711.
At step 711, in one or more embodiments, the one or more processors of the electronic device cause the graphical object to visually appear to rotate. In one or more embodiments, where step 707 includes determining a distance spanned by an occurrence of a third touch input, and step 708 includes determining another distance spanned by an occurrence of a fourth touch input, step 711 includes the one or more processors visually making the graphical object appear to be rotated by an amount of rotation that is a function of the third distance, the fourth distance, or a combination thereof. Other measures for determining the amount of rotation are described above. Other metrics for determining how much the one or more processors should visually make the graphical object appear to rotate will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, when step 708 includes determining whether the fourth touch input occurred at a location on the touch-sensitive surface that can reasonably be associated with a location of interaction with or passage through the rotatable object, step 711 includes visually rotating the graphical object only when at least a portion of the third touch input and at least a portion of the fourth touch input pass through locations defined by axes perpendicular to the touch-sensitive display.
In one or more embodiments, step 712 then comprises one or more processors of the electronic device performing the control operation. By way of illustration, in one or more embodiments, step 712 includes one or more processors performing the adjustment operation. Examples of the adjustment operation, as well as other operations that may be performed at step 712, are shown and described above with reference to fig. 9. Others will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In practice, method 700 of FIG. 7 shows how a rotatable item presented as a graphical object on a touch-sensitive display and/or touch-sensitive surface can be translated, e.g., pulled left or right, and then rotated, in much the same way that a crown is pulled from a watch case and rotated to set the watch. Turning now to fig. 8, a complementary method 800 is depicted in which the rotatable article is again translated and rotated, perhaps to adjust another setting or perform another control operation in the electronic device.
Beginning with step 801, in one or more embodiments, method 800 includes detecting, with a touch-sensitive display, a fifth touch input on the touch-sensitive display. In one or more embodiments, step 801 includes detecting a fifth touch input occurring in a fifth direction across the touch-sensitive display. In one or more embodiments, step 801 includes detecting a fifth touch input that at least partially crosses, overlaps, engages, or otherwise interacts with the graphical object.
At step 802, method 800 includes detecting a sixth touch input with the touch-sensitive surface. In one or more embodiments, step 802 includes detecting a sixth touch input that occurs in a sixth direction across the touch-sensitive surface.
As noted above, in some embodiments, the touch-sensitive surface will be capable of presenting graphical objects, images, and other information. In other embodiments, the touch-sensitive surface will either be capable of presenting graphical objects, images, and other information but optionally refrain from doing so, or alternatively will not be capable of presenting graphical objects, images, and other information.
Thus, in one or more embodiments, if the rotatable object is presented on the touch-sensitive surface at a location that corresponds to the location at which the rotatable object is presented on the touch-sensitive display, step 802 may optionally include determining whether the sixth touch input occurred at a location on the touch-sensitive surface that may reasonably be associated with a location of interaction with or passage through the rotatable object. In one or more embodiments, step 802 optionally includes determining whether at least a portion of the fifth touch input and at least a portion of the sixth touch input pass through a location defined by an axis perpendicular to the touch sensitive display.
In one or more embodiments, the decision 803 determines whether the fifth direction of the fifth touch input detected at step 801 and the sixth direction of the sixth touch input detected at step 802 occurred in the same direction. Where these directions are opposite directions, the method 800 moves to step 804 where the rotatable article is rotated as previously described. However, in case the fifth direction of the fifth touch input detected at step 801 and the sixth direction of the sixth touch input detected at step 802 are the same direction, the method 800 moves to step 805.
In one or more embodiments, step 805 comprises translating, with the one or more processors operable with the touch-sensitive display and the touch-sensitive surface, the rotatable item from a first position (e.g., the second position of step (706) above) to a second position (e.g., the first position of step (701) above). In one or more embodiments, the second location of step 805 is shifted from the first location that occurred at the beginning of method 800, which in one or more embodiments follows step (712) of FIG. 7. In one or more embodiments, the second position of step 805 is shifted from the first position that occurs when method 800 starts in a common direction of the fifth direction detected at step 801 and the sixth direction detected at step 802.
Thus, in one or more embodiments, where the fifth touch input detected at step 801 and the sixth touch input detected at step 802 occur at multiple locations so as to virtually "grip" the rotatable item, and further where the fifth touch input detected at step 801 and the sixth touch input detected at step 802 occur in a common direction, step 805 may include virtually translating the rotatable item across the touch-sensitive display in the common direction defined by the fifth direction detected at step 801 and the sixth direction detected at step 802.
For example, in the case where the rotatable item is a crown, step 805 may include virtually pushing the crown into the case when the fifth direction detected at step 801 and the sixth direction detected at step 802 are directed toward the case. Thus, if the method (700) of fig. 7 includes pulling the crown out of the case and setting the watch, steps 801 to 805 may include pushing the crown back into the case.
In one or more embodiments, after step 805 occurs, method 800 optionally moves to step 806. In one or more embodiments, step 806 detects a seventh touch input with the touch-sensitive display. In one or more embodiments, step 806 includes detecting a seventh touch input that occurs in a seventh direction along the touch-sensitive display. In one or more embodiments, step 806 includes detecting a seventh touch input across the touch-sensitive display at the predetermined distance that occurs in a seventh direction.
In one or more embodiments, step 806 includes detecting a seventh touch input that occurs in a seventh direction and at least partially crosses, overlaps, joins, or otherwise interacts with the graphical object. In one or more embodiments, the seventh direction is different from the fifth direction detected at step 801. In one or more embodiments, the seventh direction is different from the sixth direction detected at step 802.
In one or more embodiments, step 807 includes detecting an eighth touch input with the touch-sensitive surface. In one or more embodiments, step 807 includes detecting an eighth touch input occurring in an eighth direction along the touch-sensitive surface.
In one or more embodiments, step 807 includes detecting an eighth touch input occurring in an eighth direction across the touch-sensitive surface a predetermined distance. In one or more embodiments, the eighth direction is different from the fifth direction detected at step 801. In one or more embodiments, the eighth direction is different from the sixth direction detected at step 802.
As noted above, in some embodiments, the touch-sensitive surface will be capable of presenting graphical objects, images, and other information. In other embodiments, the touch-sensitive surface will either be capable of presenting graphical objects, images, and other information but optionally refrain from doing so, or alternatively will not be capable of presenting graphical objects, images, and other information.
Thus, in one or more embodiments, if the rotatable object is presented on the touch-sensitive surface at a location that corresponds to the location at which the rotatable object is presented on the touch-sensitive display, step 807 may include determining whether the eighth touch input occurred at a location on the touch-sensitive surface that may reasonably correlate to a location of interaction with or passage through the rotatable item. In one or more embodiments, step 807 includes determining whether at least a portion of the eighth touch input and at least a portion of the seventh touch input pass through a location defined by an axis perpendicular to the touch-sensitive display.
In one or more embodiments, the decision 808 determines whether the seventh direction detected at step 806 and the eighth direction detected at step 807 occur in a common direction, e.g., in the same direction, or whether they occur in different directions, e.g., in opposite directions. For example, where the decision 808 determines that the seventh direction of the seventh touch input detected at step 806 and the eighth direction of the eighth touch input detected at step 807 occur in the same direction, the method moves to step 809, where the rotatable item is translated as described above with reference to step 805. In contrast, where the decision 808 determines that the seventh direction of the seventh touch input detected at step 806 and the eighth direction of the eighth touch input detected at step 807 occur in opposite directions, the method moves to step 810.
At step 810, in one or more embodiments, one or more processors of the electronic device cause the graphical object to visually appear to rotate. In one or more embodiments, where step 806 comprises determining a distance spanned by an occurring seventh touch input, and step 807 comprises determining another distance spanned by an occurring eighth touch input, step 810 comprises the one or more processors visually causing the graphical object to appear to be rotated by an amount of rotation that is a function of the seventh distance, the eighth distance, or a combination thereof. Other measures for determining the amount of rotation are described above. Other metrics for determining how much the one or more processors should visually make the graphical object appear to rotate will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In one or more embodiments, when step 807 comprises determining whether the eighth touch input occurred at a location on the touch-sensitive surface that can reasonably be associated with a location of interaction with or passage through the rotatable object, step 810 comprises visually rotating the graphical object only when at least a portion of the seventh user input and at least a portion of the eighth user input pass through a location defined by an axis perpendicular to the touch-sensitive display.
In one or more embodiments, step 811 then includes one or more processors of the electronic device performing the control operation. By way of illustration, in one or more embodiments, step 811 comprises one or more processors performing the adjustment operation. Examples of the adjustment operation, as well as other operations that may be performed at step 811, are shown and described above with reference to fig. 9. Others will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
In fact, method 800 of FIG. 8 shows how a rotatable item presented as a graphical object on a touch-sensitive display and/or touch-sensitive surface can be translated, e.g., pulled left or right, and then rotated, much in the same way that a crown is pushed into the case and then rotated to wind a watch after setup. Turning now to fig. 10, there is shown one or more method steps illustrating how a rotatable article is translated, in accordance with a portion of the method of fig. 7-8 described above.
Beginning with step 1001, the electronic device 100 includes a touch-sensitive display 101 located on a first major surface 104 of the electronic device 100. As shown in step 1001, touch sensitive display 101 presents a graphical object representing rotatable item 1003.
In this exemplary embodiment, rotatable article 1003 includes a control knob that may be rotated in accordance with embodiments of the present disclosure to perform adjustment operations in electronic device 100. In this particular example, rotatable article 1003 is capable of performing two different adjustment operations: a first adjustment operation when the rotatable article 1003 is in the first position and rotated, and a second adjustment operation when the rotatable article 1003 is in the second position and rotated. Fig. 10 shows how the translation between the first position and the second position takes place.
In one or more embodiments, the rotatable item 1003 is initially presented at a first location 1004 on the touch-sensitive display 101, as shown in step 1001. Thereafter, the touch sensitive display 101 detects a first touch input 1005 on the touch sensitive display 101. In one or more embodiments, touch sensitive display 101 detects a first touch input 1005 occurring in a first direction 1006. In one or more embodiments, touch sensitive display 101 detects a first touch input 1005 occurring in a first direction 1006 and at least partially traversing or otherwise interacting with rotatable item 1003, as shown in step 1001.
In one or more embodiments, the touch-sensitive surface is located on the second major surface of the electronic device 100. While the touch-sensitive surface is not shown in fig. 10, in one embodiment it may be similar to the touch-sensitive surface (108) of fig. 3. In another embodiment, it may be similar to the touch-sensitive surface (108) of FIG. 6.
In one or more embodiments, the touch-sensitive surface detects a second touch input 1007 that occurs on the touch-sensitive surface. In one or more embodiments, the touch-sensitive surface detects the second touch input 1007 occurring in a second direction 1008. Where the touch-sensitive surface is configured similarly to the touch-sensitive surface (108) of FIG. 3, i.e., as a touch-sensitive display, in one or more embodiments the touch-sensitive surface detects the second touch input 1007 occurring in the second direction 1008 and at least partially passing through, or otherwise interacting with, rotatable item 1003.
As shown in step 1002, where the first direction 1006 and the second direction 1008 comprise direction vectors aligned in the same direction, in one or more embodiments the one or more processors (112) of electronic device 100 translate rotatable item 1003 from the first position 1004 shown in step 1001 to the second position 1009 shown in step 1002. In this exemplary embodiment, the second position 1009 is displaced from the first position 1004 in the first direction 1006.
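A minimal sketch of how such a translate-versus-rotate decision might be made from the two direction vectors is shown below; the cosine threshold and the helper names are illustrative assumptions rather than details taken from the disclosure.

```python
# Minimal sketch: classify a pair of front/rear swipes as a translation
# (both direction vectors aligned in the same direction) or a rotation
# (vectors aligned in opposite directions). The 0.7 cosine threshold is an
# assumed tolerance, not something taken from the disclosure.
import math

def _unit(vx: float, vy: float):
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm) if norm else (0.0, 0.0)

def classify_gesture(front_vec, rear_vec, threshold: float = 0.7) -> str:
    fx, fy = _unit(*front_vec)
    rx, ry = _unit(*rear_vec)
    cos_angle = fx * rx + fy * ry          # cosine of the angle between them
    if cos_angle > threshold:
        return "translate"                  # same direction on both surfaces
    if cos_angle < -threshold:
        return "rotate"                     # opposite directions
    return "ignore"                         # ambiguous; do nothing

print(classify_gesture((40, 0), (38, 3)))   # translate
print(classify_gesture((40, 0), (-35, 2)))  # rotate
```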
As noted above, rotatable item 1003 can be used to perform two different adjustment operations: a first adjustment operation when rotatable item 1003 is in the first position and rotated, and a second adjustment operation when rotatable item 1003 is in the second position and rotated.
To reflect this in FIG. 10, the control indicia of rotatable item 1003 change as rotatable item 1003 translates from the first position 1004 to the second position 1009. In the first position 1004, the adjustment scale reads 1, 2, 3, 4, and so forth, while in the second position the adjustment scale reads low, medium, and high. Thus, in one or more embodiments, rotatable item 1003 prompts the user to rotate it using the back and front surfaces of electronic device 100 by presenting an adjustment scale or other indicia on rotatable item 1003.
By way of example, if rotatable item 1003 is configured as an audio control knob, the first adjustment operation may be a volume adjustment operation while the second adjustment operation is a treble adjustment operation, and so forth. Similarly, if rotatable item 1003 is configured as a crown, the first operation may include winding the watch or setting its date, while the second adjustment operation sets the time. If the rotatable item is configured as a four-wheel-drive/differential-lock control, the first operation may be a switch from two-wheel drive to four-wheel drive with an unlocked differential, and the second adjustment operation may be a switch from two-wheel drive to four-wheel drive with a locked differential, and so forth.
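The position-dependent behaviour described above can be pictured as a simple dispatch on the item's current position, as in the following illustrative Python sketch. The volume/treble handlers and the degrees-per-step scale are hypothetical stand-ins for whatever adjustment operations a given product assigns to each position.

```python
# Minimal sketch: a rotatable control whose rotation is routed to a different
# adjustment operation depending on which position it currently occupies.
# The concrete operations here (volume vs. treble) are illustrative only.
class RotatableControl:
    def __init__(self):
        self.position = 0            # 0 = first position, 1 = second position
        self.volume = 5
        self.treble = 5

    def rotate(self, degrees: float):
        # Each 30 degrees of apparent rotation maps to one adjustment step
        # (an assumed scale, not specified by the disclosure).
        steps = int(degrees // 30)
        if self.position == 0:
            self.volume += steps     # first position: volume adjustment
        else:
            self.treble += steps     # second position: treble adjustment

knob = RotatableControl()
knob.rotate(90)          # adjusts volume while in the first position
knob.position = 1        # translated to the second position
knob.rotate(60)          # the same gesture now adjusts treble
print(knob.volume, knob.treble)  # 8 7
```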
In still other embodiments, rotatable item 1003 may be in an inactive state when in the first position 1004. In such embodiments, rotatable item 1003 may be activated when it is translated to the second position 1009. Thus, in one or more embodiments, rotatable item 1003 is activated when it is "pulled out" by dragging across the front and back surfaces of electronic device 100, as shown in step 1002. Conversely, rotatable item 1003 may be deactivated by pushing it back from the second position 1009, and so forth. It should be noted that while the methods of FIGS. 7-8 and the method steps of FIG. 10 describe and illustrate translation of the rotatable item between a first position and a second position, these methods and method steps may be extended to allow translation of the rotatable item among three, four, or more positions, each corresponding to a different function, and so forth.
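A minimal sketch of such an activation scheme, generalized to an arbitrary number of positions, might look as follows; the position names and class structure are assumptions made only for illustration.

```python
# Minimal sketch: a rotatable item that is inactive in its first position and
# becomes active only after being "pulled out" to another position; pushing
# it back in deactivates it again. Position names are illustrative.
class PullOutItem:
    POSITIONS = ("stowed", "set_time", "set_date")   # extendable to N positions

    def __init__(self):
        self.index = 0

    @property
    def active(self) -> bool:
        return self.index > 0            # inactive only while fully pushed in

    def pull_out(self):
        self.index = min(self.index + 1, len(self.POSITIONS) - 1)

    def push_in(self):
        self.index = max(self.index - 1, 0)

item = PullOutItem()
print(item.active)                               # False: stowed
item.pull_out()
print(item.active, item.POSITIONS[item.index])   # True set_time
item.push_in()
print(item.active)                               # False again
```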
Turning next to FIG. 11, to further highlight the combination of translation and rotation operations, a watch 1105 is presented as a graphical object on the touch-sensitive display 101 of electronic device 100, where watch 1105 has a crown 1106 as its rotatable item. Beginning at step 1101, the touch-sensitive display 101 presents watch 1105 on the first major surface 104 of the electronic device 100. Attached to watch 1105 in this exemplary embodiment is a graphical object representing a rotatable item, in this case crown 1106. As shown in step 1101, crown 1106 is initially positioned at a first position 1108 on the touch-sensitive display 101.
At step 1102, the touch-sensitive display 101 detects a first touch input 1109 that occurs in a first direction 1110 and at least partially passes through crown 1106. Also at step 1102, a touch-sensitive surface located on the second major surface of the electronic device 100 detects a second touch input 1111 occurring in the first direction 1110. In this example, since the first touch input 1109 and the second touch input 1111 together grasp crown 1106 and move in the first direction 1110, i.e., to the right, the one or more processors (112) of the electronic device 100 translate crown 1106 from the first position 1108 to a second position 1112. In this example, the second position 1112 is displaced from the first position 1108 in the first direction 1110.
Thereafter, at step 1103, the touch-sensitive display 101 detects a third touch input 1113. In this example, the third touch input 1113 occurs in a second direction 1114 and at least partially passes through crown 1106.
Simultaneously, the touch-sensitive surface detects a fourth touch input 1115. In this example, the fourth touch input 1115 occurs in a third direction 1116 that is opposite the second direction 1114. Accordingly, the one or more processors (112) of the electronic device cause crown 1106 to visually appear to rotate in response to the third touch input 1113 and the fourth touch input 1115.
In the exemplary embodiment, in addition to causing crown 1106 to visually appear to rotate in response to the third touch input 1113 and the fourth touch input 1115, the one or more processors (112) of electronic device 100 also perform a control operation that adjusts a first mode of operation of electronic device 100 while crown 1106 visually rotates. In this illustration, the adjustment is setting the time. As can be seen by comparing step 1102 and step 1103, the time setting of electronic device 100 changes from twelve o'clock to two o'clock. This is but one example of how adjustments may be made by rotating a graphical object configured in accordance with one or more embodiments of the present disclosure. Many others will be apparent to those of ordinary skill in the art having the benefit of this disclosure.
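By way of a non-limiting illustration, the sketch below advances a stored time in proportion to the apparent rotation of the crown; the mapping of 30 degrees to one hour is an assumed scale chosen only to reproduce the twelve-o'clock-to-two-o'clock example above.

```python
# Minimal sketch: perform a control operation (setting the displayed time)
# while the crown graphic visually rotates. The mapping of 30 degrees of
# rotation to one hour is an assumed, illustrative scale.
import datetime

def adjust_time(current: datetime.time, rotation_degrees: float) -> datetime.time:
    hours = int(rotation_degrees // 30)
    new_hour = (current.hour + hours) % 24
    return current.replace(hour=new_hour)

# Rotating the crown through 60 degrees moves the display from twelve
# o'clock to two o'clock, matching the example in the description.
print(adjust_time(datetime.time(hour=12), 60))  # 14:00:00
```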
As shown in step 1104, the touch-sensitive display 101 then detects a fifth touch input 1117 that occurs in a fourth direction 1118 and at least partially passes through crown 1106. At the same time, the touch-sensitive surface detects a sixth touch input 1119 that occurs in the fourth direction 1118. In one or more embodiments, the one or more processors (112) of electronic device 100 then translate crown 1106 from the second position 1112 back to the first position 1108.
In one or more embodiments, step 1103 may then be repeated with crown 1106 located at the first position 1108. For example, the touch-sensitive display 101 may detect a seventh touch input that occurs in the second direction 1114 and at least partially passes through crown 1106 while the touch-sensitive surface simultaneously detects an eighth touch input that occurs in the third direction 1116. Where this occurs, in one or more embodiments, the one or more processors (112) of electronic device 100 cause crown 1106 to visually appear to rotate in response to the seventh touch input and the eighth touch input. The one or more processors (112) may also optionally perform another control operation that adjusts a second mode of operation of the electronic device 100 while crown 1106 visually rotates, an example of which is setting the date of the electronic device 100.
Thus, FIG. 11 illustrates an electronic device 100 that can perform multi-touch sensing on both its front surface and its back surface. In one or more embodiments, a rotatable item, in this example crown 1106, prompts the user to rotate crown 1106 using the back and front surfaces, thereby mimicking the rotational movement of a physical crown. In one or more embodiments, crown 1106 can be activated by pulling crown 1106 with touch inputs passing along the front and back surfaces, thereby mimicking pulling out a physical crown to set the time. In other embodiments, crown 1106 may perform a first control function when in the first position 1108 and a second control function when in the second position 1112. Moreover, as described above, these positions may be extended from the first position 1108 and the second position 1112 to any number of positions as desired for a particular application.
Thus, in the foregoing specification, specific embodiments of the present disclosure have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Thus, while the preferred embodiments of the disclosure have been illustrated and described, it will be clear that the disclosure is not so limited. Numerous modifications, changes, variations, substitutions and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present disclosure as defined by the appended claims.
Turning next to FIG. 12, various embodiments of the present disclosure are illustrated. Beginning at 1201, a method presents, with a touch-sensitive display located on a first major surface of an electronic device, a graphical object representing a rotatable item. At 1201, the method includes detecting, with the touch-sensitive display, a first touch input that occurs in a first direction and at least partially passes through the graphical object.
At 1201, the method includes detecting, with a touch-sensitive surface on a second major surface of the electronic device, a second touch input occurring in a second direction opposite the first direction. At 1201, the method includes visually making the graphical object appear to rotate in response to the first touch input and the second touch input using one or more processors operable with the touch-sensitive display and the touch-sensitive surface.
At 1202, the method of 1201 includes determining, with the touch-sensitive display, a first distance that the first touch input occurred in the first direction. At 1202, the method includes determining, with the touch-sensitive surface, a second distance that the second touch input occurred in the second direction. At 1202, the method includes causing, with the one or more processors, the graphical object to visually appear to be rotated by an amount of rotation that is a function of the first distance, the second distance, or a combination thereof. At 1203, the amount of rotation of 1202 is proportional to the first distance.
At 1204, the method of 1202 further includes performing, with the one or more processors, an adjustment operation having an adjustment magnitude proportional to the amount of rotation. At 1205, the adjustment operation of 1204 comprises one of a volume adjustment operation, a light output operation, or a scrolling operation.
At 1206, the method of 1201 further includes determining, with the one or more processors, whether at least a portion of the first touch input and at least a portion of the second touch input occur simultaneously. At 1206, causing the graphical object to visually appear to rotate occurs only when at least a portion of the first touch input and at least a portion of the second touch input occur simultaneously.
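A simple way to test this simultaneity requirement is to check whether the time intervals of the two touch inputs overlap, as in the following illustrative sketch; the TouchInput record and its timestamp fields are assumptions made only for illustration.

```python
# Minimal sketch: rotation is only rendered when at least part of the first
# touch input and at least part of the second touch input occur at the same
# time, i.e. their time intervals overlap.
from collections import namedtuple

TouchInput = namedtuple("TouchInput", "start end")   # timestamps in seconds

def occur_simultaneously(a: TouchInput, b: TouchInput) -> bool:
    return a.start <= b.end and b.start <= a.end     # interval overlap test

front = TouchInput(start=0.10, end=0.65)
rear = TouchInput(start=0.40, end=0.90)
print(occur_simultaneously(front, rear))   # True: they overlap from 0.40 to 0.65

late_rear = TouchInput(start=0.80, end=1.20)
print(occur_simultaneously(TouchInput(0.10, 0.65), late_rear))  # False
```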
At 1207, the method of 1201 further includes determining, with the one or more processors, whether at least a portion of the first touch input and at least a portion of the second touch input pass through a location of the touch-sensitive display and the touch-sensitive surface defined by a reference axis orthogonally aligned with the touch-sensitive surface. At 1207, the graphical object is made to visually appear to rotate only when at least a portion of the first touch input and at least a portion of the second touch input pass through the location.
At 1208, presenting the graphical object of 1201 comprises presenting a representation of a graduated cylindrical object on the touch-sensitive display. At 1209, presenting the graphical object of 1201 comprises presenting a representation of a watch on the touch-sensitive display, the rotatable item comprising a crown of the watch.
At 1210, an electronic device includes a housing defining a first major surface separated from a second major surface by one or more minor surfaces. At 1210, the electronic device includes a touch-sensitive display on a first major surface. At 1210, the electronic device includes a touch-sensitive surface on the second major surface. At 1210, an electronic device includes one or more processors operable with a touch-sensitive display and a touch-sensitive surface.
At 1210, the one or more processors cause the touch-sensitive display to present a graphical object that visually represents the rotatable item. At 1210, the touch-sensitive display detects a first touch input occurring in a first direction on the touch-sensitive display. At 1210, the touch-sensitive surface detects a second touch input occurring in a second direction on the touch-sensitive surface that is opposite the first direction. At 1210, the one or more processors rotate the rotatable item in response to the first touch input and the second touch input occurring in opposite directions.
At 1211, the one or more processors of 1210 cause the rotatable item to rotate only when the first touch input at least partially passes through the rotatable item. At 1212, the one or more processors of 1211 rotate the rotatable item only when the first touch input and the second touch input overlap at a reference axis oriented orthogonally to the touch-sensitive display.
At 1213, the one or more processors of 1212 rotate the rotatable item only when at least some of the first touch inputs coincide with at least some of the second touch inputs. At 1214, the rotatable item of 1213 comprises a crown.
At 1215, the touch-sensitive surface of 1212 comprises another touch-sensitive display, and the one or more processors cause the other touch-sensitive display to present another graphical object that visually represents the rotatable item on the other touch-sensitive display. At 1216, the one or more processors of 1215 cause the rotatable item to rotate only when the first touch input at least partially passes through the graphical object and the second touch input at least partially passes through the other graphical object.
At 1217, a method in an electronic device includes presenting, with a touch-sensitive display located on a first major surface of the electronic device, a graphical object representing a rotatable item at a first position on the touch-sensitive display. At 1217, the method includes detecting, with the touch-sensitive display, a first touch input that occurs in a first direction and at least partially passes through the graphical object.
At 1217, the method includes detecting, with a touch-sensitive surface located on a second major surface of the electronic device, a second touch input occurring in the first direction. At 1217, the method includes translating, with one or more processors operable with the touch-sensitive display and the touch-sensitive surface, the rotatable item from the first position to a second position displaced from the first position in the first direction.
Thereafter, the method of 1217 includes detecting, with the touch-sensitive display, a third touch input that occurs in the second direction and that at least partially passes through the graphical object. The method of 1217 then includes detecting, with the touch-sensitive surface, a fourth touch input that occurs in a third direction opposite the second direction. Then, the method of 1217 includes visually making the graphical object appear to rotate in response to the third touch input and the fourth touch input, using the one or more processors.
At 1218, the method of 1217 further includes performing, with the one or more processors, a control operation that adjusts a first mode of operation of the electronic device as the graphical object visually rotates. At 1219, the method of 1217 further comprises detecting, with the touch-sensitive display, a fifth touch input that occurs in a fourth direction and at least partially passes through the graphical object. At 1219, the method includes detecting, with the touch-sensitive surface, a sixth touch input that occurs in the fourth direction. At 1219, the method includes translating, with the one or more processors, the rotatable item from the second position to the first position.
At 1220, the method of 1219 includes detecting, with the touch-sensitive display, a seventh touch input that occurs in the second direction and at least partially passes through the graphical object. At 1220, the method includes detecting, with the touch-sensitive surface, an eighth touch input that occurs in the third direction.
At 1220, the method includes causing, with the one or more processors, the graphical object to visually appear to rotate in response to the seventh touch input and the eighth touch input. At 1220, the method includes executing, with the one or more processors, another control operation that adjusts a second mode of operation of the electronic device while the graphical object visually rotates.
The specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims.
Claims (20)
1. A method in an electronic device, comprising:
presenting, with a touch-sensitive display located on a first major surface of the electronic device, a graphical object representing a rotatable item;
detecting, with the touch-sensitive display, a first touch input that occurs in a first direction and at least partially through the graphical object;
detecting, with a touch-sensitive surface located on a second major surface of the electronic device, a second touch input occurring in a second direction opposite the first direction; and
causing, with one or more processors, the graphical object to visually appear to rotate in response to the first touch input and the second touch input, the one or more processors being operable with the touch-sensitive display and the touch-sensitive surface.
2. The method of claim 1, further comprising:
determining, with the touch-sensitive display, a first distance that the first touch input occurred in the first direction;
determining, with the touch-sensitive surface, a second distance that the second touch input occurred in the second direction; and
causing, with the one or more processors, the graphical object to visually appear to be rotated by an amount of rotation that is a function of the first distance, the second distance, or a combination thereof.
3. The method of claim 2, wherein the amount of rotation is proportional to the first distance.
4. The method of claim 2, further comprising, with the one or more processors, performing an adjustment operation having an adjustment magnitude proportional to the amount of rotation.
5. The method of claim 4, the adjustment operation comprising one of a volume adjustment operation, a light output operation, or a scrolling operation.
6. The method of claim 1, further comprising: determining, with the one or more processors, whether at least a portion of the first touch input and at least a portion of the second touch input occur simultaneously, wherein causing the graphical object to visually appear to rotate occurs only when the at least a portion of the first touch input and the at least a portion of the second touch input occur simultaneously.
7. The method of claim 1, further comprising: determining, with the one or more processors, whether at least a portion of the first touch input and at least a portion of the second touch input pass through a location of the touch-sensitive display and the touch-sensitive surface defined by a reference axis orthogonally aligned with the touch-sensitive surface, wherein causing the graphical object to visually appear to rotate occurs only when the at least a portion of the first touch input and the at least a portion of the second touch input pass through the location.
8. The method of claim 1, wherein presenting the graphical object comprises presenting a representation of a graduated cylindrical object on the touch-sensitive display.
9. The method of claim 1, wherein presenting the graphical object comprises presenting a representation of a watch on the touch-sensitive display, the rotatable item comprising a crown of the watch.
10. An electronic device, comprising:
a housing defining a first major surface separated from a second major surface by one or more minor surfaces;
a touch sensitive display on the first major surface;
a touch-sensitive surface on the second major surface; and
one or more processors operable with the touch-sensitive display and the touch-sensitive surface;
the one or more processors cause the touch-sensitive display to present a graphical object that visually represents a rotatable item;
the touch-sensitive display detecting a first touch input occurring on the touch-sensitive display in a first direction;
the touch-sensitive surface detects a second touch input occurring on the touch-sensitive surface in a second direction that is opposite the first direction; and
the one or more processors rotate the rotatable item in response to the first touch input and the second touch input occurring in opposite directions.
11. The electronic device of claim 10, the one or more processors to rotate the rotatable item only when the first touch input at least partially passes through the rotatable item.
12. The electronic device of claim 11, the one or more processors to rotate the rotatable item only when the first touch input and the second touch input overlap at a reference axis orthogonally oriented to the touch-sensitive display.
13. The electronic device of claim 12, the one or more processors to rotate the rotatable item only when at least some of the first touch inputs coincide with at least some of the second touch inputs.
14. The electronic device of claim 13, the rotatable item comprising a crown.
15. The electronic device of claim 12, the touch-sensitive surface comprising another touch-sensitive display, the one or more processors causing the other touch-sensitive display to present another graphical object that visually represents the rotatable item on the other touch-sensitive display.
16. The electronic device of claim 15, the one or more processors to rotate the rotatable item only when the first touch input at least partially passes through the graphical object and the second touch input at least partially passes through the other graphical object.
17. A method in an electronic device, the method comprising:
presenting, with a touch-sensitive display located on a first major surface of the electronic device, a graphical object representing a rotatable item at a first position on the touch-sensitive display;
detecting, with the touch-sensitive display, a first touch input that occurs in a first direction and at least partially through the graphical object;
detecting, with a touch-sensitive surface located on a second major surface of the electronic device, a second touch input occurring in the first direction;
translating, with one or more processors operable with the touch-sensitive display and the touch-sensitive surface, the rotatable item in the first direction from the first position to a second position displaced relative to the first position; and
thereafter:
detecting, with the touch-sensitive display, a third touch input that occurs in a second direction and at least partially through the graphical object;
detecting, with the touch-sensitive surface, a fourth touch input occurring in a third direction opposite the second direction; and
causing, with the one or more processors, the graphical object to visually appear to rotate in response to the third touch input and the fourth touch input.
18. The method of claim 17, further comprising: performing, with the one or more processors, a control operation that adjusts a first mode of operation of the electronic device while the graphical object visually rotates.
19. The method of claim 17, further comprising:
detecting, with the touch-sensitive display, a fifth touch input that occurs in a fourth direction and at least partially through the graphical object;
detecting, with the touch-sensitive surface, a sixth touch input that occurs in the fourth direction; and
translating, with the one or more processors, the rotatable item from the second position to the first position.
20. The method of claim 19, further comprising:
detecting, with the touch-sensitive display, a seventh touch input that occurs in the second direction and at least partially passes through the graphical object;
detecting, with the touch-sensitive surface, an eighth touch input that occurs in the third direction;
causing, with the one or more processors, the graphical object to visually appear to rotate in response to the seventh touch input and the eighth touch input; and
executing, with the one or more processors, another control operation that adjusts a second mode of operation of the electronic device while the graphical object visually rotates.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910536301.9A CN112114688A (en) | 2019-06-20 | 2019-06-20 | Electronic device for rotating a graphical object presented on a display and corresponding method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112114688A (en) | 2020-12-22
Family
ID=73795755
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910536301.9A Pending CN112114688A (en) | 2019-06-20 | 2019-06-20 | Electronic device for rotating a graphical object presented on a display and corresponding method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112114688A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102362249A (en) * | 2009-03-24 | 2012-02-22 | 微软公司 | Bimodal touch sensitive digital notebook |
US20130147833A1 (en) * | 2011-12-09 | 2013-06-13 | Ident Technology Ag | Electronic Device with a User Interface that has more than Two Degrees of Freedom, the User Interface Comprising a Touch-Sensitive Surface and Contact-Free Detection Means |
CN105025630A (en) * | 2015-07-28 | 2015-11-04 | 广东欧珀移动通信有限公司 | Brightness adjusting method and intelligent watch |
CN109661645A (en) * | 2016-09-06 | 2019-04-19 | 微软技术许可有限责任公司 | Sign language for the equipment with multiple touch-control surfaces |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11740727B1 (en) | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback | |
JP6254147B2 (en) | Object control method in device having transparent display, device and recording medium | |
CN105164714A (en) | User terminal device and controlling method thereof | |
CN103502923A (en) | Touch and non touch based interaction of a user with a device | |
CN109976654A (en) | A kind of screen content method of adjustment, device, mobile terminal and storage medium | |
CN112114688A (en) | Electronic device for rotating a graphical object presented on a display and corresponding method | |
US11240358B2 (en) | Electronic devices and methods for moving content presentation on one or more displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||