KR102019116B1 - Terminal and method for controlling the same - Google Patents
- Publication number
- KR102019116B1 (application KR1020120072804A)
- Authority
- KR
- South Korea
- Prior art keywords
- touch
- screen
- controller
- icon
- module
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention provides a mobile terminal in which a touch UI (user interface) for navigating function icons within the screen is presented to the user on a touch screen, so that the user can select and execute a desired icon by manipulating the touch UI, and a method of controlling such a terminal.
Description
The present invention relates to a portable terminal, and a method of controlling the same, that allow the terminal to be used with greater user convenience.
Terminals may be divided into mobile/portable terminals and stationary terminals depending on whether they can be moved. Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can carry them directly.
As terminal functions diversify, such a terminal is being implemented as a multimedia player with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.
In order to support and expand the functions of such a terminal, improvements to the structural parts and/or the software parts of the terminal may be considered.
Currently, the touch-screen displays of mobile terminals are growing in size in order to provide more functions to the user.
However, as the screen of the portable terminal becomes larger, it becomes inconvenient for a user holding the terminal in one hand to touch the screen with a finger of that same hand.
SUMMARY OF THE INVENTION An object of the present invention is to provide a portable terminal, and a control method thereof, in which a touch UI (user interface) capable of navigating objects within the screen is provided to the user on a touch screen, so that the user can select and execute a desired object by manipulating the touch UI.
According to an aspect of the present invention, a portable terminal includes: a touch screen displaying a screen including two or more objects; and a controller configured to display a touch UI for selecting and executing a desired object among the objects within the screen, to select any one of the objects according to a first touch action input on the touch UI, and to execute a function assigned to the selected object according to a second touch action input on the touch UI.
In addition, a method for controlling a mobile terminal according to the present invention includes: displaying a screen including two or more objects; displaying a touch user interface (UI) for moving a highlight between the objects within the screen; moving the highlight between the objects according to a first touch action input on the touch UI; and executing a function assigned to the object on which the moved highlight is located, according to a second touch action input on the touch UI.
A portable terminal and a control method according to the present invention have the effect that a touch UI capable of navigating function icons within the screen is provided to the user on a touch screen, and the user can select and execute a desired function icon by manipulating the touch UI.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a front perspective view of a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a front view of a portable terminal for explaining an operation state of the portable terminal according to the present invention.
FIG. 4 is a flowchart illustrating a process of selecting and executing function icons on a touch screen using a touch UI displayed on the touch screen according to the present invention.
FIGS. 5 to 14 are explanatory views illustrating a process of selecting and executing function icons on a touch screen using a touch UI displayed on the touch screen according to the present invention.
Hereinafter, a portable terminal according to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for components used in the following description are given or used merely for ease of describing the specification, and do not themselves carry distinct meanings or roles.
The portable terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to portable terminals.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190.
Hereinafter, the components will be described in order.
The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system or network, and may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast related information from an external broadcast management server through a broadcast channel.
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network; in this case, it may be received by the mobile communication module 112.
The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network.
The wireless internet module 113 refers to a module for wireless internet access, and may be built into or external to the mobile terminal 100.
The short-range communication module 114 refers to a module for short-range communication.
The location information module 115 is a module for obtaining the location of the mobile terminal, a representative example of which is a GPS (Global Positioning System) module.
Referring to FIG. 1, the A/V input unit 120 is for inputting an audio or video signal, and may include a camera 121 and a microphone 122.
The image frames processed by the camera 121 may be stored in the memory 160 or transmitted externally through the wireless communication unit 110.
The output unit 150 is for generating output related to sight, hearing, or touch, and may include the display unit 151, the sound output module 152, the alarm module 153, the haptic module 154, and the projector module 155.
The display unit 151 displays (outputs) information processed by the mobile terminal 100, and may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.
Some of these displays may be configured to be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display, a representative example of which is the TOLED (Transparent OLED). The rear structure of the display unit 151 may also be configured as a light-transmissive structure.
There may be two or more display units 151 depending on the implementation form of the mobile terminal 100.
When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 151 may be used as an input device as well as an output device.
The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151, or a change in capacitance generated at a specific portion of the display unit 151, into an electrical input signal.
If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180, so that the controller 180 can know which area of the display unit 151 has been touched.
A proximity sensor 141 may be disposed in an inner area of the mobile terminal surrounded by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays.
Examples of the proximity sensor include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the touch screen is configured to detect the proximity of the pointer by the change of the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch is made.
The proximity sensor detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.
The sound output module 152 outputs audio data received from the wireless communication unit 110 or stored in the memory 160.
The alarm module 153 outputs a signal for notifying the occurrence of an event of the mobile terminal 100.
The haptic module 154 generates various tactile effects that the user can feel, a representative example of which is vibration; in addition to vibration, the haptic module 154 may generate various other tactile effects.
The memory 160 may store a program for the operation of the controller 180, and may temporarily store input/output data.
In detail, the memory 160 may include at least one type of storage medium, such as flash memory, a hard disk, card-type memory, RAM, SRAM, ROM, EEPROM, or PROM.
The interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100: it receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device.
The identification module is a chip that stores various types of information for authenticating the use authority of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
The interface unit may also serve as a passage through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or a passage through which various command signals input from the cradle by the user are transferred to the mobile terminal 100.
The controller 180 typically controls the overall operation of the mobile terminal 100, performing, for example, related control and processing for voice calls, data communication, and video calls. The controller 180 may include a multimedia module 181 for multimedia playback.
The power supply unit 190 receives external and internal power under the control of the controller 180 and supplies the power required for the operation of each component.
Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, the described embodiments may be implemented by the controller 180 itself.
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. The software code may be implemented as a software application written in a suitable programming language, stored in the memory 160, and executed by the controller 180.
FIG. 2 is a front perspective view of an example of a mobile terminal according to the present invention.
The disclosed mobile terminal 100 has a bar-shaped terminal body. However, the present invention is not limited thereto, and may be applied to various structures, such as slide, folder, swing, and swivel types, in which two or more bodies are coupled so as to be movable relative to each other.
The body includes a case (casing, housing, cover, etc.) that forms the exterior. In this embodiment, the case may be divided into a front case 101 and a rear case 102, with various electronic components built into the space formed between them.
The cases may be formed by injecting synthetic resin or may be formed of a metal material, for example, a metal material such as stainless steel (STS) or titanium (Ti).
The display unit 151, the sound output module 152, the camera 121, the user input unit 130, the microphone 122, the interface unit 170, and the like may be disposed in the terminal body, mainly in the front case 101.
The display unit 151 occupies most of the main surface of the front case 101.
The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100, and may include a plurality of manipulation units 131 and 132.
The content input by the first or second manipulation units 131 and 132 may be variously set; for example, the first manipulation unit 131 may receive commands such as start, end, and scroll, and the second manipulation unit 132 may receive commands such as adjusting the volume of sound output from the sound output module 152.
The antenna 116 for receiving a broadcast signal may be additionally disposed on the side of the terminal body. The antenna 116 constituting a part of the broadcast receiving module 111 (refer to FIG. 1) may be installed to be pulled out of the terminal body.
The terminal body is equipped with a power supply unit 190 for supplying power to the portable terminal 100. The power supply unit 190 may be built into the terminal body, or may be configured to be directly detachable from the outside of the terminal body.
Hereinafter, the mutual operation of the display unit 151 and the touch pad 135 will be described with reference to FIG. 3.
FIG. 3 is a front view of a portable terminal for explaining an operation state of the portable terminal according to the present invention.
Various types of visual information may be displayed on the display unit 151, in the form of letters, numbers, symbols, graphics, or icons.
In order to input such information, at least one of the letters, numbers, symbols, graphics, or icons may be displayed in a predetermined arrangement so as to be implemented in the form of a keypad. Such a keypad may be called a 'virtual keypad'.
FIG. 3 illustrates receiving a touch applied to the virtual keypad through the front of the terminal body.
The display unit 151 may operate as one whole area, or may be divided into a plurality of areas that operate in association with each other. For example, an output window and an input window may be displayed on the upper and lower portions of the display unit 151, respectively; when the virtual keypad displayed in the input window is touched, the content corresponding to the touched keys is displayed in the output window.
When the display unit 151 (touch screen) and the touch pad 135 are touched together within a predetermined time range, one function of the terminal may be executed. The case of being touched together may occur, for example, when the user clamps the terminal body between thumb and index finger. The function executed in this case may be, for example, activation or deactivation of the display unit 151 or the touch pad 135.
For convenience of description, it is assumed hereinafter that the display unit 151 is a touch screen.
In addition, a graphic in the form of a highlight, an arrow, or a finger for pointing at a specific object or selecting a menu on the display unit is referred to as a pointer or a cursor.
However, the term "pointer" is often also used to mean a finger, a stylus pen, or the like used for a touch operation. Therefore, in this specification, to clearly distinguish the two, the graphic displayed on the display unit is referred to as a cursor, and a physical means for performing a touch, a proximity touch, or a gesture, such as a finger or a stylus pen, is referred to as a pointer.
Hereinafter, referring to FIGS. 4 to 14, the process in which, according to the present invention, a touch UI for navigating objects within the screen is provided to the user on a touch screen, and the user manipulates the touch UI to select and execute a desired object, will be described in detail.
FIG. 4 is a flowchart illustrating a process of selecting and executing function icons on a touch screen using a touch UI displayed on the touch screen according to the present invention.
FIGS. 5 to 14 are explanatory views illustrating a process of selecting and executing function icons on a touch screen using a touch UI displayed on the touch screen according to the present invention.
Referring to FIG. 4, the controller 180 first displays a screen including two or more objects on the touch screen [S110].
The screen may be a web page including two or more hyperlinks, a home screen or standby screen including two or more application icons, a menu screen including two or more menu icons, a screen including two or more folders, or a screen including two or more files.
In addition, the objects may be arranged and displayed in a grid within the screen.
The controller 180 then detects whether a touch gesture for calling the touch UI is input on the screen.
If a touch gesture for calling the touch UI is input on the screen [S120], the controller 180 displays the touch UI within the screen [S130].
In this case, as illustrated in FIG. 5, the touch gesture for calling the touch UI may be set by the user.
That is, as shown in (a) of FIG. 5, the user selects a touch gesture setting menu 210 for calling the touch UI from among the menus provided in the mobile terminal 100.
In this case, the touch gesture setting menu 210 includes a touch gesture setting window 211 for calling the touch UI.
When the user inputs a desired touch gesture into the touch gesture setting window 211, the controller 180 assigns the touch-UI call function to the input touch gesture and stores the gesture in the memory 160.
Thereafter, when the pattern of a touch gesture input on the screen matches the pattern of the touch gesture stored in the memory 160, the controller 180 displays the touch UI within the screen.
Meanwhile, when the touch UI is displayed on the screen, the controller 180 may display the touch UI on the remaining area of the screen except for the objects, and, when the touch UI overlaps at least one of the objects, the controller 180 may display the touch UI transparently.
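The specification describes the call step S120 only as matching the pattern of an input gesture against the pattern stored in the memory 160, without specifying a matching algorithm. The following Python sketch illustrates one plausible implementation; the direction-sequence representation and the function names are assumptions made for illustration, not part of the disclosed invention:

```python
# Illustrative sketch of step S120: comparing an input touch gesture with the
# stored touch-UI call gesture by reducing both to a sequence of dominant
# stroke directions. The representation is an assumption; the specification
# only requires that the two patterns "match".

def to_directions(points):
    """Reduce a list of (x, y) touch samples to a sequence of the four
    dominant stroke directions, collapsing consecutive repeats."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            d = "right" if dx >= 0 else "left"
        else:
            d = "down" if dy >= 0 else "up"   # screen y grows downward
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return dirs

def gesture_matches(input_points, stored_points):
    """True if both gestures reduce to the same direction sequence."""
    return to_directions(input_points) == to_directions(stored_points)

# An L-shaped call gesture: drag right, then down. A roughly similar drawn
# gesture matches even though the raw coordinates differ.
stored = [(0, 0), (10, 0), (20, 1), (20, 10)]
drawn = [(5, 5), (18, 6), (30, 5), (31, 20)]
assert gesture_matches(drawn, stored)
```

Normalizing to a direction sequence makes the comparison tolerant of where on the screen the user draws the gesture and how large it is, which matches the user-defined-gesture behaviour described for FIG. 5.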
FIGS. 6A to 6C illustrate examples of a screen including two or more objects on which the touch UI is displayed.
As shown in FIG. 6A, when the touch gesture for calling the touch UI is input on the screen, the controller 180 displays the touch UI within the screen.
As shown in FIGS. 6B and 6C, the controller 180 may display the touch UI on the remaining area of the screen except for the objects, and, if all or part of the touch UI overlaps the objects, the controller 180 may display the touch UI transparently so that the overlapped objects remain visible.
Referring back to FIG. 4, when a first touch action is input on the touch UI, the controller 180 moves a highlight between the objects according to the first touch action [S140].
If a second touch action is input on the touch UI, the controller 180 executes the function assigned to the object on which the moved highlight is located [S150].
Hereinafter, the process of S140 and S150 will be described in detail with reference to FIGS. 7 to 14.
First, FIGS. 7 to 9 illustrate a process of selecting a desired object on the screen, and executing the function of the object, using a trackball-shaped touch UI.
FIGS. 7 to 9 show examples of a screen including two or more objects on which the trackball-shaped touch UI is displayed.
As shown in FIG. 7A, when a first point on the trackball-shaped touch UI is touched, the controller 180 positions a highlight on one of the objects in the screen. For example, in FIG. 7A, the first point corresponds to the upper-left area of the trackball, and the highlight is initially positioned on the object at the upper left of the screen.
Subsequently, as shown in FIG. 7B, when a drag touch action from the first point to a second point is input on the trackball, the controller 180 moves the highlight between the objects in the direction of the drag. For example, in FIG. 7B the drag from the first point to the second point is directed to the right, and the highlight accordingly moves to the object to the right.
Subsequently, as illustrated in FIG. 7C, when a drag touch action from the second point to a third point is input, the controller 180 again moves the highlight in the direction of the drag. For example, in FIG. 7C the drag from the second point to the third point is directed downward, and the highlight moves to the object below.
In addition, although not shown in FIG. 7, whenever the highlight is positioned on an icon, the controller 180 may provide feedback indicating the currently highlighted icon.
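For objects arranged in a grid, the highlight movement of step S140 described above can be sketched as follows. This is an illustrative Python sketch only; the grid representation, the direction heuristic, and the function names are assumptions, not part of the disclosed invention:

```python
# Illustrative sketch of step S140: moving the highlight through a grid of
# icons according to the dominant direction of a drag on the trackball UI.

def drag_direction(p0, p1):
    """Dominant direction of a drag from p0 to p1 (screen y grows downward)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    return "down" if dy >= 0 else "up"

def move_highlight(row, col, direction, rows, cols):
    """Move the highlight one cell in `direction`, clamped to the grid bounds."""
    steps = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
    dr, dc = steps[direction]
    return (min(max(row + dr, 0), rows - 1),
            min(max(col + dc, 0), cols - 1))

# A drag to the right on the trackball moves the highlight one icon to the right.
pos = (0, 0)
pos = move_highlight(*pos, drag_direction((100, 100), (140, 105)), rows=4, cols=4)
assert pos == (0, 1)
```

Clamping to the grid bounds means a drag past the edge of the icon grid simply leaves the highlight on the edge icon, which is one reasonable way to handle the boundary cases the figures do not show.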
FIGS. 8 and 9 illustrate a process of executing the function assigned to the object to which the highlight has been moved through the trackball-shaped touch UI.
As shown in FIG. 8, when the drag touch action on the trackball is released, the controller 180 executes the application corresponding to the object on which the highlight is located at the time of release, and, as shown in FIG. 8C, the controller 180 may display an execution screen of the executed application in thumbnail form within the trackball.
Next, as shown in (a) of FIG. 9, in the state where the highlight is placed on a desired object, when a second touch action is input on the trackball, the controller 180 likewise executes the function assigned to the highlighted object.
Next, FIGS. 10 and 11 illustrate a process of selecting a desired object on the screen, and executing the function of the object, using the upper, lower, left, and right regions of the touch UI.
As illustrated in FIGS. 10A and 10B, when a first region among the upper, lower, left, and right regions of the touch UI is touched, the controller 180 moves the highlight to the object adjacent in the direction corresponding to the touched first region.
Next, as illustrated in FIGS. 10B and 10C, when a second region among the upper, lower, left, and right regions of the touch UI is touched, the controller 180 again moves the highlight to the object adjacent in the direction corresponding to the touched second region.
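The four-region variant described above requires classifying a tap by which region of the touch UI it lands in. The following Python sketch shows one way to do this; the diagonal split of the UI rectangle and the function name are assumptions made for illustration:

```python
# Illustrative sketch of the four-region touch UI of FIG. 10: a tap inside the
# UI rectangle is classified as up/down/left/right by which of the four
# triangular regions (split along the diagonals) it falls in.

def touched_region(x, y, w, h):
    """Classify a tap at (x, y) inside a w-by-h touch UI into the upper,
    lower, left, or right region."""
    # Normalize to [-1, 1] with the origin at the UI centre.
    nx, ny = 2 * x / w - 1, 2 * y / h - 1
    if abs(nx) >= abs(ny):
        return "right" if nx >= 0 else "left"
    return "down" if ny >= 0 else "up"   # screen y grows downward

assert touched_region(90, 50, 100, 100) == "right"
assert touched_region(50, 10, 100, 100) == "up"
```

The returned direction would then drive the same one-cell highlight movement described for FIG. 10.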
Next, FIG. 11 illustrates a process of executing the function assigned to the object to which the highlight has been moved through the upper, lower, left, and right regions of the touch UI.
As shown in FIG. 11A, in the state where the highlight is positioned on a desired object, when a second touch action is input on the touch UI, the controller 180 executes the function assigned to the highlighted object.
FIGS. 12 to 14 illustrate a process of selecting a desired object on the screen, and executing the function of the object, using a touch UI whose internal coordinates are mapped to positions within the screen.
As shown in FIG. 12A, when a specific first point on the touch UI is touched, the controller 180 positions the highlight on the object located at the screen position corresponding to the coordinates (x, y) of the touched first point. For example, if the coordinates (x, y) of the first point correspond to the upper-left portion of the touch UI, the highlight is positioned on the object in the upper-left portion of the screen.
Next, as illustrated in FIG. 12B, when a specific second point on the touch UI is touched, the controller 180 moves the highlight to the object located at the screen position corresponding to the coordinates (x, y) of the touched second point.
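This proportional mapping from a touch coordinate inside the touch UI to an icon in the screen's grid can be sketched as follows. The grid dimensions and the function name are illustrative assumptions; the specification describes only the coordinate correspondence, not a concrete formula:

```python
# Illustrative sketch of FIG. 12: the touch UI acts as a miniature map of the
# screen, so the coordinates (x, y) of a touch inside the UI select the grid
# cell at the proportionally corresponding screen position.

def cell_for_touch(x, y, ui_w, ui_h, rows, cols):
    """Map a touch at (x, y) inside a ui_w-by-ui_h touch UI to the (row, col)
    of the corresponding cell in a rows-by-cols icon grid."""
    col = min(int(x / ui_w * cols), cols - 1)
    row = min(int(y / ui_h * rows), rows - 1)
    return row, col

# A touch near the upper-left corner of the UI highlights the upper-left icon;
# a touch near the lower-right corner highlights the lower-right icon.
assert cell_for_touch(5, 5, 100, 100, rows=4, cols=4) == (0, 0)
assert cell_for_touch(95, 95, 100, 100, rows=4, cols=4) == (3, 3)
```

Compared with the trackball and four-region variants, this mapping lets the user jump the highlight directly to any icon with a single touch rather than stepping through intermediate icons.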
Next, FIGS. 13 and 14 illustrate a process of executing the function assigned to the object to which the highlight has been moved through the coordinate-mapped touch UI.
As shown in FIGS. 13A and 14A, in the state where the highlight is located on a desired object, when a second touch action is input on the touch UI, the controller 180 executes the function assigned to the highlighted object.
It will be apparent to those skilled in the art that the present invention can be embodied in other specific forms without departing from the spirit and essential features of the present invention.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves (e.g., transmission over the Internet). The computer may also include the controller 180 of the terminal.
Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
The above-described mobile terminal and its control method are not limited to the configurations and methods of the embodiments described above; rather, all or part of the embodiments may be selectively combined so that various modifications can be made.
100: mobile terminal 110: wireless communication unit
111: broadcast receiving unit 112: mobile communication module
113: wireless internet module 114: short-range communication module
115: location information module 120: A / V input unit
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output unit
151: display unit 152: sound output module
153: alarm module 154: haptic module
155: Projector Module 160: Memory
170: interface unit 180: control unit
181: multimedia module 190: power supply
Claims (15)
A mobile terminal comprising: a touch screen configured to display a screen including two or more objects; a memory configured with a touch gesture to which a function for calling a touch UI for selecting and executing a desired object among the objects is assigned; and
a controller configured to display the touch UI within the screen when the pattern of a touch gesture input on the screen matches the pattern of the touch gesture set in the memory.
The control unit,
Display the touch UI in a trackball shape,
The highlight is moved to any one of the objects according to a drag touch action having a specific movement direction input on the trackball,
When the drag touch action for the trackball is released, an application corresponding to the object where the highlight is located at the time when the drag touch action is released, is executed.
And displaying an execution screen of the executed application in a thumbnail form in the trackball.
The controller may display the touch UI on the remaining area except for the objects in the screen.
The controller may be configured to transparently display the touch UI when the touch UI and at least one of the objects overlap each other.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120072804A KR102019116B1 (en) | 2012-07-04 | 2012-07-04 | Terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120072804A KR102019116B1 (en) | 2012-07-04 | 2012-07-04 | Terminal and method for controlling the same |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20140005481A KR20140005481A (en) | 2014-01-15 |
KR102019116B1 true KR102019116B1 (en) | 2019-09-06 |
Family
ID=50140867
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120072804A KR102019116B1 (en) | 2012-07-04 | 2012-07-04 | Terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR102019116B1 (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012077273A1 (en) | 2010-12-07 | 2012-06-14 | Panasonic Corporation | Electronic device
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101842198B1 (en) * | 2010-04-08 | 2018-03-26 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
KR101718026B1 (en) * | 2010-09-06 | 2017-04-04 | 엘지전자 주식회사 | Method for providing user interface and mobile terminal using this method |
- 2012-07-04: Application KR1020120072804A filed; patent KR102019116B1 granted (active, IP Right Grant)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012077273A1 (en) | 2010-12-07 | 2012-06-14 | Panasonic Corporation | Electronic device
Also Published As
Publication number | Publication date |
---|---|
KR20140005481A (en) | 2014-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101788051B1 (en) | Mobile terminal and method for controlling thereof | |
KR101740439B1 (en) | Mobile terminal and method for controlling thereof | |
KR101802760B1 (en) | Mobile terminal and method for controlling thereof | |
KR101788049B1 (en) | Mobile terminal and method for controlling thereof | |
KR101952682B1 (en) | Mobile terminal and method for controlling thereof | |
KR101078929B1 (en) | Terminal and internet-using method thereof | |
KR20110113844A (en) | Mobile terminal and method for controlling thereof | |
KR20100125635A (en) | The method for executing menu in mobile terminal and mobile terminal using the same | |
KR20120063694A (en) | Mobile terminal and method for controlling thereof | |
KR20100042005A (en) | Mobile terminal and method for controlling display thereof | |
KR20120060358A (en) | Mobile terminal and method for controlling thereof | |
KR20110045659A (en) | Method for controlling icon display in mobile terminal and mobile terminal thereof | |
KR20120009546A (en) | Mobile terminal and method for controlling thereof | |
KR20110107059A (en) | Mobile terminal and method for controlling thereof | |
KR101878141B1 (en) | Mobile terminal and method for controlling thereof | |
KR20110111877A (en) | Mobile terminal and method for controlling thereof | |
KR101842198B1 (en) | Mobile terminal and method for controlling thereof | |
KR20100104562A (en) | Mobile terminal and method for controlling wallpaper display thereof | |
KR20110080315A (en) | Mobile terminal | |
KR20100099587A (en) | Mobile terminal and method for controlling the same | |
KR20120028532A (en) | Mobile terminal and method for controlling thereof | |
KR20110061235A (en) | Mobile terminal and control for it | |
KR20110123380A (en) | Mobile terminal and method for controlling thereof | |
KR20110116383A (en) | Mobile terminal and quick access operation control method thereof | |
KR20110065748A (en) | Mobile terminal and method for controlling thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |