WO2021107200A1 - Mobile terminal and method for controlling mobile terminal
- Publication number: WO2021107200A1
- Application number: PCT/KR2019/016632
- Authority: WIPO (PCT)
- Prior art keywords: application, mobile terminal, cloud server, splash, image
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04M—TELEPHONIC COMMUNICATION
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/725—Cordless telephones
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M2201/34—Microprocessors
- H04M2201/36—Memories
- H04M2201/38—Displays
- H04M2250/06—Details of telephonic subscriber devices including a wireless LAN interface
Definitions
- the terminal may be divided into a mobile/portable terminal and a stationary terminal according to whether the terminal can be moved.
- the mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal depending on whether the user can carry it directly.
- mobile terminals are diversifying. For example, there are functions for data and voice communication, photo and video shooting through a camera, voice recording, music file playback through a speaker system, and image or video output to the display. Some terminals add an electronic game play function or perform a multimedia player function. In particular, recent mobile terminals can receive multicast signals that provide broadcast and visual content such as video or television programs.
- a user interface for controlling a device by detecting a motion or gesture of a user based on 3D vision technology is also developing.
- the three-dimensional vision-based UI can supplement the existing two-dimensional touch-based UI and be applied to various applications.
- an object can be controlled in three dimensions, and the device can be controlled even when it is in a location that the user cannot touch, or when touching is difficult, for example because the user's hand is dirty or gloved. Accordingly, three-dimensional vision-based gesture recognition technology is in the spotlight.
- the mobile terminal overcomes the capacity limit of local storage through the cloud environment, and users can easily share files on multiple devices through the cloud environment.
- An object of the present invention is to shorten the time until a splash image of an application installed on a cloud server is output to a mobile terminal.
- the mobile terminal may include a display unit, a communication unit communicating with a cloud server in which an application is installed, a local storage including a splash image of the application, and a processor that transmits a request to execute the application to the cloud server, outputs the splash image stored in the local storage to the display unit, receives screen information configured by the executed application from the cloud server, and outputs a screen corresponding to the received screen information to the display unit.
- the cloud server executes the application
- the splash image may include a first splash image fixed in response to the application or a second splash image corresponding to a screen at a time when the execution of the application is stopped.
- when the executed application is terminated, the processor receives the first splash image from the cloud server and updates the splash image stored in the local storage to the first splash image.
- when the executed application is stopped, the processor receives the second splash image from the cloud server and updates the splash image stored in the local storage to the second splash image.
- when the application is first executed, the splash image may be received from the cloud server in response to the application execution request and stored in the local storage.
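The splash-image caching behavior described above (a fixed first splash image, a last-screen second splash image, and a first-launch fetch from the cloud server) can be sketched as follows; the class and method names are illustrative stand-ins, not part of the patent:

```python
class SplashCache:
    """Sketch of the claimed splash-image handling (names are illustrative)."""

    def __init__(self):
        self.storage = {}  # app_id -> splash image bytes (local-storage stand-in)

    def on_launch(self, app_id, fetch_from_cloud):
        # First execution: no cached splash yet, so receive it from the cloud
        # server in response to the execution request and store it locally.
        if app_id not in self.storage:
            self.storage[app_id] = fetch_from_cloud(app_id)
        # The cached image is shown immediately while the cloud app starts.
        return self.storage[app_id]

    def on_terminated(self, app_id, first_splash):
        # App fully terminated: cache the fixed (first) splash image.
        self.storage[app_id] = first_splash

    def on_stopped(self, app_id, last_screen):
        # App stopped mid-session: cache the screen at the stop time
        # (second splash image) so the next launch resumes visually.
        self.storage[app_id] = last_screen
```

Because the splash image is read from local storage rather than fetched over the network at launch, the display delay the patent targets is avoided.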
- the method for controlling a mobile terminal may include receiving an application execution request, transmitting the execution request to a cloud server in which the application is installed, outputting a splash image of the application stored in local storage to a display unit, receiving screen information configured by the executed application from the cloud server, and outputting a screen corresponding to the received screen information to the display unit.
- the cloud server may include an application code for executing the application.
- the splash image may include a first splash image that is fixed in response to the application or a second splash image that corresponds to a screen when the execution of the application is stopped.
- the method may further include receiving the first splash image from the cloud server and updating the splash image stored in the local storage to the first splash image.
- the method may further include receiving the second splash image from the cloud server and updating the splash image stored in the local storage to the second splash image.
- when the application is first executed, the splash image may be received from the cloud server in response to the application execution request and stored in the local storage.
- the mobile terminal may prevent delay in outputting a splash image when an application installed on a cloud server is executed.
- 1A is a block diagram of a mobile terminal according to an embodiment.
- FIGS. 1B and 1C are conceptual views of a mobile terminal viewed from different directions according to an exemplary embodiment.
- FIG. 2 illustrates a cloud system according to an embodiment.
- FIG. 3 is a block diagram of a mobile terminal according to an embodiment.
- FIG. 4 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 5 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 6 illustrates a main splash window according to an embodiment.
- FIG. 7 illustrates a previous screen splash window according to an embodiment.
- FIG. 8 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 9 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 10 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 11 illustrates a process of executing an application in a mobile terminal according to an embodiment.
- FIG. 12 illustrates a process of acquiring a splash image stored in a local storage of a mobile terminal according to an embodiment.
- the mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system, a slate PC, a tablet PC, an ultrabook, and a wearable device (for example, a watch-type terminal (smartwatch), a glass-type terminal (smart glass), or a head mounted display (HMD)).
- FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention
- FIGS. 1B and 1C are conceptual views of an example of the mobile terminal related to the present invention viewed from different directions.
- the mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, a power supply unit 190, and the like.
- the components shown in FIG. 1A are not essential for implementing the mobile terminal, so the mobile terminal described in this specification may have more or fewer components than those listed above.
- the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
- the wireless communication unit 110 may include at least one of a broadcast reception module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
- the input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, etc.) for receiving information from a user.
- the voice data or image data collected by the input unit 120 may be analyzed and processed as a user's control command.
- the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, information on the environment surrounding the mobile terminal, and user information.
- for example, the sensing unit 140 may include at least one of a proximity sensor 141, an illuminance sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.).
- the mobile terminal disclosed in the present specification may combine and utilize information sensed by at least two or more of these sensors.
- the output unit 150 is for generating an output related to the visual, auditory, or tactile sense, and may include at least one of a display 151, a sound output unit 152, a haptic module 153, and an optical output unit.
- the display 151 may implement a touch screen by forming a layer structure with the touch sensor or being integrally formed therewith. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may provide an output interface between the mobile terminal 100 and the user.
- the interface unit 160 serves as a passage with various types of external devices connected to the mobile terminal 100 .
- Such an interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
- the mobile terminal 100 may perform appropriate control related to the connected external device.
- the memory 170 stores data supporting various functions of the mobile terminal 100 .
- the memory 170 may store a plurality of application programs (or applications) driven in the mobile terminal 100, as well as data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server through wireless communication. In addition, at least some of these application programs may exist on the mobile terminal 100 from the time of shipment for basic functions of the mobile terminal 100 (e.g., receiving and placing calls, receiving and sending messages). Meanwhile, an application program may be stored in the memory 170, installed on the mobile terminal 100, and driven by the controller 180 to perform an operation (or function) of the mobile terminal.
- in addition to operations related to the application programs, the controller 180 generally controls the overall operation of the mobile terminal 100.
- the controller 180 may provide or process appropriate information or functions to the user by processing signals, data, information, etc. input or output through the above-described components or by driving an application program stored in the memory 170 .
- controller 180 may control at least some of the components discussed with reference to FIG. 1A in order to drive an application program stored in the memory 170 . Furthermore, in order to drive the application program, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 with each other.
- the power supply unit 190 receives external power and internal power under the control of the controller 180 to supply power to each component included in the mobile terminal 100 .
- the power supply 190 includes a battery, and the battery may be a built-in battery or a replaceable battery.
- At least some of the respective components may operate cooperatively with each other to implement an operation, control, or control method of a mobile terminal according to various embodiments to be described below.
- the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170 .
- FIG. 1B and 1C illustrate basic features of a foldable mobile terminal in an unfolded state.
- the mobile terminal 100 may be provided with a display 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second operation units 123a and 123b, a microphone 122, the interface unit 160, and the like.
- as an example, the following description refers to the mobile terminal 100 in which the first camera 121a and the first operation unit 123a are disposed on the front surface of the terminal body, the second operation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side of the terminal body, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body.
- first operation unit 123a may not be provided on the front surface of the terminal body
- second sound output unit 152b may be provided on the side surface of the terminal body instead of the rear surface of the terminal body.
- the display 151 displays (outputs) information processed by the mobile terminal 100 .
- the display 151 may display execution screen information of an application program driven in the mobile terminal 100 or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.
- the display 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional display (3D display), and an e-ink display.
- two or more displays 151 may exist according to an implementation form of the mobile terminal 100 .
- a plurality of displays may be spaced apart or disposed integrally on one surface, or may be respectively disposed on different surfaces.
- the display 151 may include a touch sensor for sensing a touch on the display 151 so as to receive a control command input by a touch method. Using this, when a touch is made on the display 151 , the touch sensor detects the touch, and the controller 180 may generate a control command corresponding to the touch based thereon.
- the content input by the touch method may be letters or numbers, or menu items that can be instructed or designated in various modes.
- the touch sensor may be configured in the form of a film having a touch pattern and disposed between the window 151a covering the display 151 and the plurality of layers constituting the display 151, or may be a metal wire patterned directly on the back surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or provided inside the display.
- the display 151 may form a touch screen together with the touch sensor, and in this case, the touch screen may function as the user input unit 123 (refer to FIG. 1A ). In some cases, the touch screen may replace at least some functions of the first operation unit 123a.
- the first sound output unit 152a may be implemented as a receiver that transmits a call sound to the user's ear, and the second sound output unit 152b may be implemented in the form of a loudspeaker that outputs various alarm sounds or multimedia reproduction sounds.
- a sound hole for emitting sound generated from the first sound output unit 152a may be formed in the window 151a of the display 151 .
- however, the present invention is not limited thereto, and the sound may be emitted along an assembly gap between structures (e.g., the gap between the window 151a and the front case 101). In this case, the appearance of the mobile terminal 100 may be simpler because the hole independently formed for sound output is invisible or hidden.
- the light output unit is configured to output light to notify the occurrence of an event.
- Examples of the event may include a message reception, a call signal reception, a missed call, an alarm, a schedule notification, an email reception, and information reception through an application.
- when the user's confirmation of the event is detected, the controller 180 may control the light output unit to end the output of light.
- the first camera 121a processes an image frame of a still image or a moving image obtained by an image sensor in a shooting mode or a video call mode.
- the processed image frame may be displayed on the display 151 and stored in the memory 170 .
- the first and second manipulation units 123a and 123b are an example of the user input unit 123 manipulated to receive a command for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion.
- the first and second operation units 123a and 123b may be of any type as long as they are manipulated in a tactile manner, such as by touch, push, or scroll, while providing the user with a tactile feeling.
- the first and second manipulation units 123a and 123b may also be manipulated in a manner that does not require a tactile feeling, such as a proximity touch or a hovering touch.
- the first operation unit 123a is illustrated as a touch key, but the present invention is not limited thereto.
- the first operation unit 123a may be a push key (mechanical key) or may be configured as a combination of a touch key and a push key.
- the contents input by the first and second operation units 123a and 123b may be variously set.
- for example, the first operation unit 123a may receive commands such as menu, home key, cancel, and search, and the second operation unit 123b may receive commands such as adjusting the volume of the sound output from the first or second sound output units 152a and 152b and switching the display 151 to a touch recognition mode.
- a rear input unit (not shown) may be provided on the rear surface of the terminal body.
- the rear input unit is manipulated to receive a command for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, commands such as power on/off, start, end, and scroll, adjustment of the volume of the sound output from the first and second sound output units 152a and 152b, and switching the display 151 to a touch recognition mode may be input.
- the rear input unit may be implemented in a form capable of input by a touch input, a push input, or a combination thereof.
- the rear input unit may be disposed to overlap the display 151 on the front side in the thickness direction of the terminal body.
- the rear input unit may be disposed on the upper rear portion of the terminal body so that the user can easily manipulate the terminal body by using the index finger when holding the terminal body with one hand.
- the present invention is not necessarily limited thereto, and the position of the rear input unit may be changed.
- when the rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be implemented.
- when the aforementioned touch screen or rear input unit replaces at least some functions of the first operation unit 123a provided on the front surface of the terminal body, so that the first operation unit 123a is not disposed on the front surface of the terminal body, the display 151 may be configured with a larger screen.
- the mobile terminal 100 may be provided with a fingerprint recognition sensor 143 for recognizing a user's fingerprint, and the controller 180 may use fingerprint information detected through the fingerprint recognition sensor 143 as an authentication means.
- the fingerprint recognition sensor may be built into the display 151 or the user input unit 123, or may be provided at a separate location.
- the microphone 122 is configured to receive a user's voice, other sounds, and the like.
- the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
- the interface unit 160 serves as a passage through which the mobile terminal 100 can be connected to an external device.
- the interface unit 160 may be at least one of a connection terminal for connection with another device (e.g., an earphone or an external speaker), a port for short-range communication (e.g., an infrared port (IrDA port), a Bluetooth port, a wireless LAN port, etc.), and a power supply terminal for supplying power to the mobile terminal 100.
- the interface unit 160 may be implemented in the form of a socket for accommodating an external card such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for information storage.
- a second camera 121b may be disposed on the rear side of the terminal body.
- the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.
- the second camera 121b may include a plurality of lenses arranged along at least one line.
- the plurality of lenses may be arranged in a matrix form.
- Such a camera may be referred to as an 'array camera'.
- an image may be captured in various ways using a plurality of lenses, and an image of better quality may be obtained.
- the flash 124 may be disposed adjacent to the second camera 121b.
- the flash 124 illuminates light toward the subject when the subject is photographed by the second camera 121b.
- a second sound output unit 152b may be additionally disposed on the terminal body.
- the second sound output unit 152b may implement a stereo function together with the first sound output unit 152a, and may be used to implement a speakerphone mode during a call.
- At least one antenna for wireless communication may be provided in the terminal body.
- the antenna may be built into the terminal body or formed in the case.
- an antenna forming a part of the broadcast reception module 111 (refer to FIG. 1A ) may be configured to be withdrawable from the terminal body.
- the antenna may be formed in a film type and attached to the inner surface of the rear cover 103 , or a case including a conductive material may be configured to function as an antenna.
- the terminal body is provided with a power supply unit 190 (refer to FIG. 1A ) for supplying power to the mobile terminal 100 .
- the power supply unit 190 may include a battery 191 that is built into the terminal body or is detachably configured from the outside of the terminal body.
- the battery 191 may be configured to receive power through a power cable connected to the interface unit 160 .
- the battery 191 may be configured to be wirelessly charged through a wireless charging device.
- the wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).
- the rear cover 103 is coupled to the rear case 102 to cover the battery 191, thereby limiting the detachment of the battery 191 and protecting the battery 191 from external shocks and foreign substances.
- the rear cover 103 may be detachably coupled to the rear case 102 .
- the camera (the camera 121 shown in FIG. 1 and the camera 121a shown in FIG. 1A) applied to the present invention is located around the touch screen of the mobile device. Accordingly, it is possible to detect depth information on an object (eg, a user's finger) within a predetermined distance from the touch screen of the mobile device. Such a camera may be referred to as a depth camera.
- the first method is a method of using a multi-camera (or lens), which captures visible light using two or more cameras, and generates a 3D image using depth information.
- the second method is a method of mounting a separate sensor for depth sensing in a camera module, and more specifically, structured light (SL) and time of flight (ToF) methods are applied.
- in the SL method, a laser of a specific pattern, such as a straight line or a grid pattern, is projected onto the object to be photographed, and the way the pattern is deformed according to the shape of the object's surface is analyzed to calculate depth information. A 3D-based shooting result is then derived by synthesizing the depth information with a picture taken by an image sensor. To implement this, a laser infrared (IR) projector that emits a specific pattern, an infrared depth sensor, an image sensor, and a 3D processor may be used.
- in the ToF method, depth information is calculated by measuring the time it takes for a laser to travel to the photographing target and be reflected back, and the depth information is then synthesized with a photograph taken by an image sensor to derive a 3D-based shooting result. To implement this, a laser infrared (IR) projector, a receiving sensor, an image sensor, a 3D processor, and the like may be used.
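The ToF timing described above reduces to a simple relation: since the measured time covers the round trip to the target and back, the depth is d = c·t/2, with c the speed of light. A minimal illustration (not taken from the patent):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_depth(round_trip_seconds: float) -> float:
    """Depth from a time-of-flight measurement: the light travels to the
    object and back, so the one-way distance is half the round trip."""
    return C * round_trip_seconds / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m of depth,
# which is why ToF sensors need picosecond-to-nanosecond timing precision.
```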
- FIG. 2 illustrates a cloud system according to an embodiment.
- the cloud system may include a mobile terminal 200 , a communication network 300 , and a cloud server 400 .
- the mobile terminal 200 is a device capable of communicating with the cloud server 400 and may be a cloud terminal.
- unlike a terminal that processes application execution itself, the cloud terminal is a terminal in which most of the provided services are processed by the cloud server 400, and the terminal itself may serve as an input/output device.
- the cloud terminal may be a terminal that executes an application installed in a virtual file system (VFS)-based cloud server through an operating system (OS).
- the VFS may be an abstraction layer above the actual file system.
- the purpose of the VFS is to allow applications to access multiple file systems in the same way. For example, with the VFS, applications can access local storage and cloud servers in the same manner, so they may not notice the difference between local storage and a cloud server. Likewise, the file system can be accessed without noticing differences between operating systems.
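The uniform-access idea can be sketched as a common interface over interchangeable backends; the classes below are hypothetical stand-ins, not the patent's implementation:

```python
from abc import ABC, abstractmethod

class FileSystem(ABC):
    """Common interface a VFS-style layer exposes to applications."""
    @abstractmethod
    def read(self, path: str) -> bytes: ...

class LocalStorage(FileSystem):
    def __init__(self, files: dict):
        self.files = files          # stand-in for on-device storage
    def read(self, path: str) -> bytes:
        return self.files[path]

class CloudStorage(FileSystem):
    def __init__(self, server: dict):
        self.server = server        # stand-in for a network fetch
    def read(self, path: str) -> bytes:
        return self.server[path]

def open_file(fs: FileSystem, path: str) -> bytes:
    # The caller cannot tell (and does not care) which backend serves the file.
    return fs.read(path)
```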
- the communication network 300 may be a single or complex network that enables communication by connecting the mobile terminal 200 and the cloud server 400 through wired, wireless and/or the Internet.
- the cloud server 400 is a storage device on the communication network 300 , and may be a storage device that the mobile terminal 200 can access through the communication network 300 at any time and anywhere.
- FIG. 3 is a block diagram of a mobile terminal 200 according to an embodiment. Specifically, the mobile terminal 200 of FIG. 3 may represent the controller 180 of FIG. 1 .
- the mobile terminal 200 may include an Operating System (OS) 210 , a Virtual File System (VFS) 220 , a cloud file manager 230 , and a cloud file cache 240 .
- the OS 210 is an operating system installed in the mobile terminal 200 and may include an Android-based OS.
- the VFS 220 may be used to mediate communication between the OS 210 and the cloud file manager 230 while defining an interface or standard between the kernel and the actual file system.
- the cloud file manager 230 may be for mediating communication with the OS when the mobile terminal 200 executes an application installed on the cloud server 400 on the communication network 300 .
- the cloud file cache 240 may represent an internal cache area of the cloud file manager 230 .
- the cloud file manager 230 may store a file downloaded from the cloud server 400 as a cache file in the cloud file cache 240 .
- the cloud file manager 230 may request the cloud server 400 to execute an application through the mobile terminal 200 .
- the cloud file manager 230 may temporarily or semi-permanently store the file received from the cloud server 400 in the cloud file cache 240 .
- the file stored in the cloud file cache 240 may include a screen image (hereinafter, referred to as a splash window) output during the time required to execute the application.
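The relationship between the cloud file manager and its cache area can be sketched as a read-through cache; the names and structure below are illustrative only:

```python
class CloudFileManager:
    """Sketch: downloads from the cloud server go through an internal cache."""

    def __init__(self, cloud_server: dict):
        self.cloud_server = cloud_server  # stand-in for the remote store
        self.cache = {}                   # the "cloud file cache" area
        self.downloads = 0                # counts actual server round trips

    def get(self, name: str) -> bytes:
        if name not in self.cache:        # cache miss: download once
            self.cache[name] = self.cloud_server[name]
            self.downloads += 1
        return self.cache[name]           # later reads are served locally
```

A splash window stored this way can be displayed without waiting for the network, which is the delay the patent aims to eliminate.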
- FIG. 4 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 4 shows a process of executing an application stored in the local storage of the mobile terminal 200.
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S301).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may identify the application requested to be executed and read the application code from the local storage (S302).
- the mobile terminal 200 may read the splash window corresponding to the application executed through the application code from the local storage and output the splash window to the display (S303).
- a time required to output the splash window after the application execution request may be defined as the first time t1.
- the first time t1 may be the time required for the first screen responding to the application execution request (the splash window) to be output.
- the mobile terminal 200 may output a splash window and initialize data (S304).
- the step of initializing the data may include the step of initializing the application code.
- the mobile terminal 200 may configure the first screen of application execution (S305) and output the configured first screen to the display (S306).
- a time required to output the first screen after the application execution request may be defined as a second time t2.
- the second time t2 may be a time required for the application to respond to the application execution request.
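- The timeline of FIG. 4 can be sketched numerically. The durations below are made-up assumptions, not values from the patent; the sketch only illustrates why t1 (request to splash window) is always shorter than t2 (request to responsive first screen): the splash is a stored image, while the first screen additionally requires data initialization and screen configuration.

```python
# Assumed step durations in seconds (illustrative only)
READ_CODE = 0.05     # S302: read the application code from local storage
SHOW_SPLASH = 0.01   # S303: read and output the splash window
INIT_DATA = 0.30     # S304: initialize data (including the application code)
BUILD_SCREEN = 0.10  # S305/S306: configure and output the first screen

# t1: delay from the execution request until the splash window is visible
t1 = READ_CODE + SHOW_SPLASH
# t2: delay from the execution request until the first screen responds
t2 = READ_CODE + SHOW_SPLASH + INIT_DATA + BUILD_SCREEN

assert t1 < t2  # the splash window always appears before the first screen
```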
- FIG. 5 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 5 shows a process of executing an application stored in the local storage of the mobile terminal 200 .
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S401).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may identify the application requested to be executed and read the application code from the local storage (S402).
- the mobile terminal 200 may determine whether the application requested to be executed is in the running state (S403). The mobile terminal 200 may determine this through the application code or separately stored data.
- the mobile terminal 200 may output a main splash window when the application requested to be executed is not in the running state (S403, No) (S404).
- the main splash window may be a representative image provided when the application is first started.
- for an example of the main splash window, refer to FIG. 6 .
- the mobile terminal 200 may output a main splash window and initialize data (S405).
- the step of initializing the data may include the step of initializing the application code.
- the mobile terminal 200 may configure the first screen of application execution (S406) and output the configured first screen to the display (S407).
- the mobile terminal 200 may output a previous screen splash window when the execution-requested application is in the running state (S403, Yes) (S408).
- the previous screen splash window may be an image corresponding to the execution screen of the application when execution is stopped.
- for an example of the previous screen splash window, refer to FIG. 7 .
- the previous screen splash window may be stored in local storage.
- the previous screen splash window may be updated while the application is running, so as to correspond to the current execution screen.
- the mobile terminal 200 may output the previous screen splash window and re-read the data (S409).
- the step of re-reading the data may include detecting a point where the application code stops while executing.
- the mobile terminal 200 may configure the previous screen stopped during application execution (S410), and output the configured previous screen to the display (S411).
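- The branch in FIG. 5 (steps S403/S404/S408) can be sketched as a small selection function. The function name and state strings are illustrative assumptions, not taken from the patent: if the requested application is already running (suspended), the previous screen splash window is shown; otherwise the main splash window is shown.

```python
def select_splash(app_state, main_splash, previous_splash):
    """Return the splash window to display for the requested application.

    app_state: "suspended" if the application is in the running state,
    otherwise any other value (e.g. "terminated"). Illustrative only.
    """
    if app_state == "suspended":   # S403 Yes -> previous screen splash (S408)
        return previous_splash
    return main_splash             # S403 No -> main splash (S404)
```

For example, a phone application stopped mid-dial would resume with the previous-screen splash, while a freshly launched one would show its representative image.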
- FIG. 6 illustrates a main splash window according to an embodiment.
- FIG. 6(a) illustrates the main splash window output when the phone application is first executed.
- FIG. 6(b) illustrates the main splash window output when the message application is first executed.
- FIG. 6(c) illustrates the main splash window output when the contact application is first executed.
- FIG. 7 illustrates a previous screen splash window according to an embodiment.
- FIG. 7(a) shows an embodiment in which the phone application was stopped while a specific phone number (010-XXX-XXXX) was being input, and the previous screen splash window stored in the local storage is output.
- FIG. 7(b) shows an embodiment in which the message application was stopped while a message window with a specific person (Chaeyeon) was open, and the previous screen splash window stored in the local storage is output.
- FIG. 7(c) shows an embodiment in which the contact application was stopped while the contact window of a specific person (Chaeyeon) was open, and the previous screen splash window stored in the local storage is output.
- FIG. 8 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 8 illustrates a process in which the mobile terminal 200 executes an application stored in the cloud server 400 .
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S501).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may transmit the application execution request to the cloud server 400 (S502).
- the cloud file manager 230 may transmit an application execution request to the cloud server 400 through the communication network 300 .
- the cloud server 400 may receive the application execution request (S503) and read the execution-requested application code (S504).
- the cloud server 400 may generate, from the read code, a splash window corresponding to the execution-requested application (S505).
- the mobile terminal 200 may receive the splash window from the cloud server 400 (S506), and may output the received splash window (S507).
- the cloud server 400 may transmit the splash window (S506) to the mobile terminal 200 and initialize the data (S508).
- the step of initializing the data may include the step of initializing the application code.
- the cloud server 400 may configure the first screen of application execution (S509) and generate the first screen (S510).
- the mobile terminal 200 may output the received first screen to the display (S512).
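- The server-side ordering in FIG. 8 can be sketched as an event sequence: the cloud server sends the splash window first so the terminal has something to display, then initializes data, and finally sends the first screen. The function name and event tuples are illustrative assumptions; a list stands in for the network channel.

```python
def cloud_launch(app_code):
    """Simulate the cloud server's launch sequence for one application."""
    events = []
    splash = f"splash:{app_code}"              # S505: generate the splash window
    events.append(("send_splash", splash))     # S506: transmit it to the terminal
    events.append(("init_data", app_code))     # S508: initialize data / app code
    first_screen = f"screen:{app_code}"        # S509/S510: configure and generate
    events.append(("send_screen", first_screen))  # transmit the first screen
    return events
```

The splash is transmitted before initialization begins, which is why the user sees feedback well before the first screen arrives.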
- FIG. 9 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 9 illustrates a process in which the mobile terminal 200 executes an application stored in the cloud server 400 .
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S601).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may transmit the application execution request to the cloud server 400 (S602).
- the cloud file manager 230 may transmit an application execution request to the cloud server 400 through the communication network 300 .
- the cloud server 400 may receive the application execution request (S603) and read the execution-requested application code (S604).
- the cloud server 400 may determine whether the application requested to be executed is in the running state (S605).
- the running state may include a suspended state.
- the state contrasting with the running state may be an end (terminated) state.
- when the execution-requested application is in the end state, a main splash window may be generated.
- the main splash window may be a representative image of the application requested to be executed.
- when the execution-requested application is in the suspended state, a previous screen splash window may be generated.
- the previous screen splash window may be an image corresponding to the screen at the point when execution of the application was previously stopped.
- the cloud server 400 may transmit a splash window to the mobile terminal 200 in response to the state of the application requested to be executed. Specifically, when the execution-requested application is in an end state (S605, No), the main splash window may be transmitted to the mobile terminal 200 (S608). In addition, when the execution-requested application is in a suspended state (S605, Yes), the previous screen splash window may be transmitted to the mobile terminal 200 (S609).
- the mobile terminal 200 may output the splash window to the display unit (S610).
- the cloud server 400 may transmit the main splash window to the mobile terminal 200 and initialize data (S611).
- the step of initializing the data may include the step of initializing the application code.
- the cloud server 400 may configure the first screen of the application execution (S612) and generate the first screen (S613).
- the cloud server 400 may transmit the previous screen splash window to the mobile terminal 200 and read the data again (S615).
- the step of re-reading the data may include detecting a point where the application code stops while executing.
- the cloud server 400 may configure the previous screen stopped during application execution (S616) and generate the previous screen (S617).
- the mobile terminal 200 may output the received screen on the display unit (S619).
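- The server-side branch of FIG. 9 can be sketched as two alternative step sequences. The function name and step tuples are illustrative assumptions: a terminated application gets the main splash followed by data initialization and a fresh first screen, while a suspended one gets the previous-screen splash followed by re-reading the data from the stop point and the reconstructed previous screen.

```python
def serve_launch(state, app_id):
    """Return the cloud server's step sequence for the given app state."""
    steps = []
    if state == "terminated":
        steps.append(("send", f"main_splash:{app_id}"))      # S608
        steps.append(("init_data", app_id))                  # S611
        steps.append(("send", f"first_screen:{app_id}"))     # S612/S613
    else:  # suspended: resume from where execution stopped
        steps.append(("send", f"previous_splash:{app_id}"))  # S609
        steps.append(("reread_data", app_id))                # S615
        steps.append(("send", f"previous_screen:{app_id}"))  # S616/S617
    return steps
```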
- FIG. 10 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 10 illustrates a process in which the mobile terminal 200 executes an application stored in the cloud server 400 .
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S701).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may transmit the application execution request to the cloud server 400 (S702).
- the cloud file manager 230 may transmit an application execution request to the cloud server 400 through the communication network 300 .
- before, at the same time as, or after transmitting the application execution request to the cloud server 400, the mobile terminal 200 may acquire a splash window from the local storage and output the acquired splash window to the display unit (S703).
- the splash window may be pre-stored in the local storage in response to the application requested to be executed.
- the mobile terminal does not receive the splash window from the cloud server 400; instead, it directly outputs the splash window stored in the local storage, thereby reducing the time taken to output the splash window in response to the application execution request.
- the mobile terminal may output the splash window stored in the local storage from the time execution of the corresponding application is requested until the first screen received from the cloud server 400 is output (S710).
- the cloud server 400 may receive the application execution request (S702) and read the execution-requested application code (S704).
- the cloud server 400 may transmit the splash window to the mobile terminal 200 and initialize data (S706).
- the step of initializing the data may include the step of initializing the application code.
- the cloud server 400 may configure the first screen of the application execution (S707) and generate the first screen (S708).
- the mobile terminal 200 may output the received first screen to the display (S710).
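- The idea of FIG. 10 can be sketched as a client-side function: the terminal shows a locally cached splash immediately, in parallel with the server request, and replaces it when the first screen arrives. All names are illustrative assumptions; the display is modeled as a list of frames and the server request as a callable.

```python
def launch_with_local_splash(local_cache, app_id, request_server):
    """Display a locally stored splash, then the server-built first screen.

    local_cache: dict mapping app_id -> splash image (local storage sketch)
    request_server: callable app_id -> first screen (S702..S709 on the server)
    """
    frames = []
    splash = local_cache.get(app_id)       # S703: read splash from local storage
    if splash is not None:
        frames.append(splash)              # shown with no network round trip
    first_screen = request_server(app_id)  # server builds the first screen
    frames.append(first_screen)            # S710: first screen replaces the splash
    return frames
```

If no splash was cached (e.g. the very first launch), the terminal simply waits for the first screen, which matches the fallback described above.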
- FIG. 11 illustrates a process of executing an application in the mobile terminal 200 according to an embodiment. Specifically, FIG. 11 illustrates a process in which the mobile terminal 200 executes an application stored in the cloud server 400 .
- the mobile terminal 200 may receive a request to execute an application from the user through the input device (S801).
- the input may include at least one of a touch signal applied to the touch panel, a motion signal, and a voice signal.
- the mobile terminal 200 may transmit the application execution request to the cloud server 400 (S802).
- the cloud file manager 230 may transmit an application execution request to the cloud server 400 through the communication network 300 .
- before, at the same time as, or after transmitting the application execution request to the cloud server 400, the mobile terminal 200 may acquire a splash window from the local storage and output the acquired splash window to the display unit (S803).
- the splash window may be pre-stored in the local storage in response to the application requested to be executed.
- the mobile terminal does not receive the splash window from the cloud server 400; instead, it directly outputs the splash window stored in the local storage, thereby reducing the time taken to output the splash window in response to the application execution request.
- the mobile terminal may output the splash window stored in the local storage from the time execution of the corresponding application is requested until the first screen received from the cloud server 400 is output (S815).
- the cloud server 400 may receive the application execution request (S802) and read the execution-requested application code (S804).
- the cloud server 400 may determine whether the application requested to be executed is in the running state (S806).
- the running state may include a suspended state.
- the state contrasting with the running state may be an end (terminated) state.
- when the execution-requested application is in the end state, the cloud server 400 may transmit a main splash window to the mobile terminal 200 and initialize data (S807).
- the step of initializing the data may include the step of initializing the application code.
- the cloud server 400 may configure the first screen of the application execution (S808) and generate the first screen (S809).
- when the execution-requested application is in the suspended state, the cloud server 400 may transmit the previous screen splash window to the mobile terminal 200 and re-read the data (S810).
- the step of re-reading the data may include detecting a point where the application code stops while executing.
- the cloud server 400 may configure the previous screen stopped during application execution (S811) and generate the previous screen (S812).
- when the mobile terminal 200 receives the first screen or the previous screen generated by the cloud server 400 (S813, S814), it may output the received screen on the display unit (S815).
- FIG. 12 illustrates a process of acquiring a splash image to be stored in the local storage of the mobile terminal 200 according to an embodiment.
- a user may use the application executed in the cloud server 400 through the mobile terminal 200 (S901).
- the mobile terminal 200 may transmit data corresponding to application use to the cloud server 400 (S902).
- the cloud server 400 may receive data corresponding to application use and synchronize the data (S903).
- the cloud server 400 may configure an execution screen in response to data synchronization (S904) and generate an execution screen (S905).
- the mobile terminal 200 may receive the execution screen from the cloud server 400 and output the execution screen corresponding to the use of the application to the display unit (S907).
- the user may stop or terminate the use of the application through the mobile terminal 200 (S908). Stopping use of the application does not terminate its execution; rather, it is a transition to a state in which another application is temporarily executed and used. Ending use of the application transitions the application to the end state.
- the cloud server 400 may receive, through the mobile terminal 200, a signal to stop or terminate the use of the application (S909). The cloud server 400 may determine whether the signal maintains the execution state of the currently running application (S910).
- the cloud server 400 may generate a main splash window when the running application is switched to the end state (S910, No) (S911).
- the cloud server 400 may transmit the generated main splash window to the mobile terminal 200, and the mobile terminal may store the received main splash window in the local storage (S915).
- when the running application maintains the running state (S910, Yes), the cloud server 400 may generate a previous screen splash window for the interrupted time point (S912).
- the cloud server 400 may transmit the generated previous screen splash window to the mobile terminal 200, and the mobile terminal may store the received previous screen splash window in local storage (S915).
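- The stop/terminate handling of FIG. 12 can be sketched as a single decision function. The function and parameter names are illustrative assumptions: on termination the server builds a main splash, on a mere stop it builds a previous-screen splash, and in either case the terminal stores the result for the next launch.

```python
def on_stop_or_terminate(signal, build_previous, build_main, store, app_id):
    """Generate the appropriate splash and store it on the terminal.

    signal: "terminate" (app moves to end state) or "stop" (app suspended)
    build_previous / build_main: callables app_id -> splash image
    store: dict standing in for the terminal's local storage
    """
    if signal == "terminate":            # S910 No -> main splash (S911)
        splash = build_main(app_id)
    else:                                # S910 Yes -> previous screen splash (S912)
        splash = build_previous(app_id)
    store[app_id] = splash               # S915: terminal stores the received splash
    return splash
```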
- the local storage may update and store the stored splash window. That is, when storing a new splash window, the previously stored splash window may be deleted. This can reduce the capacity required for local storage.
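- The single-slot update policy above can be sketched as follows; the class name and methods are illustrative assumptions. Local storage keeps only the most recent splash per application, so storing a new splash overwrites (deletes) the old one and the storage footprint stays bounded at one image per application.

```python
class SplashStore:
    """Local storage holding exactly one splash window per application."""
    def __init__(self):
        self._splash = {}   # app_id -> the single stored splash image

    def update(self, app_id, image):
        # overwriting deletes the previously stored splash for this app,
        # keeping the capacity required for local storage small
        self._splash[app_id] = image

    def load(self, app_id):
        # return the stored splash, or None if none has been stored yet
        return self._splash.get(app_id)
```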
- the splash window initially stored in the local storage may be received from the cloud server 400 when the mobile terminal 200 transmits an application execution request to the cloud server 400 for the first time.
- the mobile terminal 200 executes the application installed in the cloud server 400 and outputs a splash window stored in the local storage before receiving the execution screen, thereby shortening the execution time felt by the user.
- when the mobile terminal 200 stops or terminates execution of the running application, it may receive a splash window from the cloud server 400, update the splash window stored in the local storage, and output the updated splash window at the next execution.
- the mobile terminal 200 may receive the splash window that is initially stored in the local storage from the cloud server while requesting execution of the application for the first time, and store it in the local storage.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Telephone Function (AREA)
Abstract
A mobile terminal according to one embodiment may be characterized in that it comprises: a display unit; a communication unit that communicates with a cloud server on which an application is installed; a local memory that contains a standby image of the application; and a processor that transmits an execution request for the application to the cloud server, receives screen information formed by the application executed on the cloud server, and outputs a screen corresponding to the received screen information on the display unit. Upon receiving the execution request for the application, the processor thereby controls the display unit to output the standby image until the screen corresponding to the screen information received from the cloud server is output on the display.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020190155929A KR102668752B1 (ko) | 2019-11-28 | 2019-11-28 | Mobile terminal and mobile terminal control method |
KR10-2019-0155929 | 2019-11-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021107200A1 true WO2021107200A1 (fr) | 2021-06-03 |
Family
ID=76130592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/016632 WO2021107200A1 (fr) | Mobile terminal and mobile terminal control method |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102668752B1 (fr) |
WO (1) | WO2021107200A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102572383B1 (ko) * | 2021-08-24 | 2023-08-29 | Samsung SDS Co., Ltd. | Application management method and apparatus |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100590690B1 (ko) * | 2005-01-06 | 2006-06-19 | SK Telecom Co., Ltd. | Mobile communication system and method for real-time reception of standby-mode-based information-channel broadcasting |
KR20090040967A (ko) * | 2007-10-23 | 2009-04-28 | LG Telecom Co., Ltd. | Mobile communication terminal displaying content on an idle screen and control method thereof |
KR20090048802A (ko) * | 2007-11-12 | 2009-05-15 | LG Telecom Co., Ltd. | Method and apparatus for directly playing multimedia content through an RTS function in a mobile terminal idle-screen service |
KR20160098822A (ko) * | 2015-02-11 | 2016-08-19 | Entrix Co., Ltd. | Cloud streaming service system, image cloud streaming service method based on degree of image-quality degradation, and apparatus therefor |
JP2019067355A (ja) * | 2017-09-29 | 2019-04-25 | Dwango Co., Ltd. | Server and terminal |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10315108B2 (en) * | 2015-08-19 | 2019-06-11 | Sony Interactive Entertainment America Llc | Local application quick start with cloud transitioning |
2019
- 2019-11-28 KR KR1020190155929A patent/KR102668752B1/ko active IP Right Grant
- 2019-11-28 WO PCT/KR2019/016632 patent/WO2021107200A1/fr active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220244043A1 (en) * | 2019-06-27 | 2022-08-04 | Kubota Corporation | Bending angle calculation method and calculation apparatus |
US12044524B2 (en) * | 2019-06-27 | 2024-07-23 | Kubota Corporation | Bending angle calculation method and calculation apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR20210066546A (ko) | 2021-06-07 |
KR102668752B1 (ko) | 2024-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2016003036A1 (fr) | Proximity illuminance sensor module and mobile terminal using same | |
- WO2019194346A1 (fr) | PCB laminated structure and mobile terminal comprising same | |
- WO2016076474A1 (fr) | Mobile terminal and control method therefor | |
- WO2017018603A1 (fr) | Mobile terminal and control method therefor | |
- WO2016167437A1 (fr) | Mobile terminal | |
- WO2020004685A1 (fr) | Mobile terminal | |
- WO2020256463A1 (fr) | Electronic device comprising secondary screen and method of operation thereof | |
- WO2017007101A1 (fr) | Smart device and control method therefor | |
- WO2018135675A1 (fr) | Electronic device | |
- WO2018135679A1 (fr) | Mobile terminal | |
- WO2018101508A1 (fr) | Mobile terminal | |
- WO2015190668A1 (fr) | Mobile terminal | |
- WO2022154470A1 (fr) | Electronic apparatus comprising microphone module | |
- WO2017030236A1 (fr) | Mobile terminal and control method therefor | |
- WO2021070982A1 (fr) | Content-sharing electronic device and control method therefor | |
- WO2015108287A1 (fr) | Mobile terminal | |
- WO2019160190A1 (fr) | Mobile terminal comprising metal case and method for manufacturing metal case | |
- WO2021107200A1 (fr) | Mobile terminal and mobile terminal control method | |
- WO2022085940A1 (fr) | Method and apparatus for controlling display of plurality of objects on electronic device | |
- WO2016195144A1 (fr) | Camera module and mobile terminal comprising same | |
- WO2018131747A1 (fr) | Mobile terminal and control method therefor | |
- WO2021167167A1 (fr) | Mobile terminal | |
- WO2015122551A1 (fr) | Mobile terminal and control method therefor | |
- WO2021145473A1 (fr) | Mobile terminal and control method therefor | |
- WO2018084338A1 (fr) | Mobile terminal | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19954194 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19954194 Country of ref document: EP Kind code of ref document: A1 |