KR20170011915A - Mobile terminal and method for controlling the same - Google Patents
Info
- Publication number
- KR20170011915A (application KR1020150105381A)
- Authority
- KR
- South Korea
- Prior art keywords
- drones
- mobile terminal
- area
- control unit
- security level
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 44
- 230000033001 locomotion Effects 0.000 claims abstract description 33
- 238000013459 approach Methods 0.000 claims abstract description 14
- 230000008859 change Effects 0.000 claims description 15
- 238000003672 processing method Methods 0.000 claims description 4
- 238000004891 communication Methods 0.000 description 33
- 230000006870 function Effects 0.000 description 26
- 230000008569 process Effects 0.000 description 11
- 238000010295 mobile communication Methods 0.000 description 10
- 238000012545 processing Methods 0.000 description 8
- 238000010586 diagram Methods 0.000 description 7
- 230000000694 effects Effects 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 6
- 230000007774 longterm Effects 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 5
- 230000000007 visual effect Effects 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 4
- 238000001514 detection method Methods 0.000 description 4
- 239000010408 film Substances 0.000 description 3
- 239000011521 glass Substances 0.000 description 3
- 239000011159 matrix material Substances 0.000 description 3
- 229910052751 metal Inorganic materials 0.000 description 3
- 239000002184 metal Substances 0.000 description 3
- 239000004984 smart glass Substances 0.000 description 3
- 230000005236 sound signal Effects 0.000 description 3
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 239000003086 colorant Substances 0.000 description 2
- 238000012790 confirmation Methods 0.000 description 2
- 238000010168 coupling process Methods 0.000 description 2
- 230000014509 gene expression Effects 0.000 description 2
- 230000006698 induction Effects 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 229910052710 silicon Inorganic materials 0.000 description 2
- 239000010703 silicon Substances 0.000 description 2
- 239000007787 solid Substances 0.000 description 2
- 229910001220 stainless steel Inorganic materials 0.000 description 2
- 239000010935 stainless steel Substances 0.000 description 2
- 229920003002 synthetic resin Polymers 0.000 description 2
- 239000000057 synthetic resin Substances 0.000 description 2
- 239000010936 titanium Substances 0.000 description 2
- RTAQQCXQSZGOHL-UHFFFAOYSA-N Titanium Chemical compound [Ti] RTAQQCXQSZGOHL-UHFFFAOYSA-N 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 229910052782 aluminium Inorganic materials 0.000 description 1
- XAGFODPZIPBFFR-UHFFFAOYSA-N aluminium Chemical compound [Al] XAGFODPZIPBFFR-UHFFFAOYSA-N 0.000 description 1
- 239000004020 conductor Substances 0.000 description 1
- 230000005684 electric field Effects 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- VJYFKVYYMZPMAB-UHFFFAOYSA-N ethoprophos Chemical compound CCCSP(=O)(OCC)SCCC VJYFKVYYMZPMAB-UHFFFAOYSA-N 0.000 description 1
- 230000003760 hair shine Effects 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 230000001939 inductive effect Effects 0.000 description 1
- 238000002347 injection Methods 0.000 description 1
- 239000007924 injection Substances 0.000 description 1
- 238000001746 injection moulding Methods 0.000 description 1
- 230000009545 invasion Effects 0.000 description 1
- 238000001646 magnetic resonance method Methods 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000003205 muscle Anatomy 0.000 description 1
- 238000011017 operating method Methods 0.000 description 1
- 230000010355 oscillation Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 230000000149 penetrating effect Effects 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 238000009774 resonance method Methods 0.000 description 1
- 238000007789 sealing Methods 0.000 description 1
- 230000035807 sensation Effects 0.000 description 1
- 239000010454 slate Substances 0.000 description 1
- 238000005507 spraying Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
- 239000000758 substrate Substances 0.000 description 1
- 230000001502 supplementing effect Effects 0.000 description 1
- 239000010409 thin film Substances 0.000 description 1
- 229910052719 titanium Inorganic materials 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Images
Classifications
- H04M1/72533—
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Telephone Function (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
BACKGROUND OF THE INVENTION
A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.
The functions of mobile terminals are diversifying. Examples include data and voice communication, photographing and video capture through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals additionally provide an electronic game function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, and television programs.
As these functions become diversified, such a terminal is implemented in the form of a multimedia device having complex functions such as capturing still images or video, playing music or video files, playing games, and receiving broadcasts.
In order to support and enhance the functionality of such a terminal, improvement of the structural and/or software aspects of the terminal may be considered.
The terminal can remotely control an external electronic device by executing a specific application. At this time, the terminal can not only display a control key for controlling an external electronic device, but also receive data from an external electronic device and display it on the display unit.
In particular, when a drone is controlled using the terminal, the heading of the drone can be controlled using the drone's position information. However, with the miniaturization of drones and the improvement of their built-in cameras, the operation or photographing of drones may be restricted in certain areas, such as military areas. A method of controlling a drone in an area where operation or photographing is restricted therefore needs to be studied.
The present invention is directed to solving the above-mentioned problems and other problems. Another object is to provide a mobile terminal, and a method for controlling the same, that receives location information of a drone; displays, based on that location information, the location information of the drone, a map of the area in which the drone can move, and the security level of each area on the map; and, when the location information of the drone is close to an area for which a security level is set, transmits to the drone a control signal requesting at least one of a movement restriction or a photographing restriction according to that security level.
According to an aspect of the present invention, there is provided a mobile terminal including a display unit and a control unit. The display unit displays, based on received location information of a drone, the location information of the drone, a map of the movable area of the drone, and the security level of each area on the map. The control unit transmits, to the drone, a control signal requesting at least one of a movement restriction of the drone or a photographing restriction of the drone according to the security level when the location information of the drone is close to an area where the security level is set.
According to another aspect of the present invention, there is provided a method for controlling a mobile terminal, comprising the steps of: receiving location information of a drone; displaying, based on the location information of the drone, the location information of the drone, a map of the movable area of the drone, and the security level of each area on the map; and transmitting, to the drone, a control signal requesting at least one of a movement restriction of the drone or a photographing restriction of the drone according to the security level when the location information of the drone is close to an area where the security level is set.
Effects of the mobile terminal and the control method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, the possibility of privacy invasion due to the operation of a drone can be reduced.
In addition, according to at least one of the embodiments of the present invention, the operation or photographing state of the drone can be controlled differentially according to the security level of the area.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention viewed from different directions.
FIG. 2 is a flowchart illustrating an exemplary embodiment of a control method of a mobile terminal according to the present invention.
FIGS. 3 to 12 are diagrams for explaining embodiments of a control method of a mobile terminal according to the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory, are intended to provide further explanation of the invention as claimed, and that the invention covers all modifications, equivalents, and alternatives falling within its spirit and scope.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" or "having" are intended to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention viewed from different directions.
The
The
The
The
The
The
The
In addition, the
In addition to the operations related to the application program, the
In addition, the
The
At least some of these components may operate in cooperation with one another to implement an operation, control, or control method of the mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
Hereinafter, the various components of the
First, referring to the
The
The wireless signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
The
The short-
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without the object contacting the touch screen is referred to as a "proximity touch," and the act of an object actually contacting the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or a touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch screen, the pressure at the time of the touch, the capacitance at the time of the touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180.
On the other hand, the
On the other hand, the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches applied to the touch screen, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
Also, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The
The
In addition to vibration, the
The
The
The signal output from the
The
The identification module is a chip that stores various kinds of information for authenticating the authority to use the mobile terminal 100.
The
The
The
Meanwhile, as described above, the
In addition, the
The
In addition, the
As another example, the
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed
Here, the terminal body can be understood as a concept of referring to the
The
A
In some cases, electronic components may also be mounted on the
As shown, when the
These
The
Meanwhile, the
The
1B and 1C, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the
The
The
In addition, the
The
The touch sensor may be a film having a touch pattern and disposed between the
In this way, the
The first
The
The
The
The first and
In this figure, the
The contents input by the first and
On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap with the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. When the
Meanwhile, the
The
The
And a
The
The
And a second
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
The
The
The
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
FIG. 2 is a flowchart illustrating an exemplary embodiment of a control method for a mobile terminal according to the present invention. FIGS. 3 to 12 are views for explaining an exemplary embodiment of a control method for a mobile terminal according to the present invention.
2, the
Specifically, the mobile terminal may execute a specific application to perform pairing with a drone. Here, the specific application refers to an application that performs a pairing between a mobile terminal and a drone, and provides a screen for controlling the operation of the drone in the paired mobile terminal.
The control unit of the mobile terminal can receive location information from the paired drone and display, on the execution screen of the specific application, the received location information of the drone, a map of the movable area of the drone, and the security level of each area on the map. The control unit may display the location information of the drone at the corresponding point on the map with a specific icon. The control unit can display the range of the movable-area map differently according to the current location information of the drone, and can display the security level of each area on the map. In this case, the security level may be divided into a plurality of classes, such as 'Level 1', 'Level 2', and 'Level 3'. For example, the control unit may display the map of the movable area over the entire area of the display unit based on the location information of the drone, and display the area where a security level is set so that it is distinguished from other areas. Further, the control unit may display a specific icon on the map at a position corresponding to the location information of the drone.
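For illustration only, the Kotlin sketch below models the data that the description above says is shown on the application screen: the drone's reported position, the movable-area map, and per-area security levels. All type and property names (GeoPoint, SecurityZone, DroneState, DroneMapModel) are assumptions made for the example, not terms from the patent.

```kotlin
// Illustrative sketch only: models the map data described above.
// All names are assumptions, not part of the patent disclosure.

enum class SecurityLevel { LEVEL_1, LEVEL_2, LEVEL_3, NONE }

data class GeoPoint(val latitude: Double, val longitude: Double)

// An area on the map with an assigned security level (modeled here as a circle).
data class SecurityZone(val center: GeoPoint, val radiusMeters: Double, val level: SecurityLevel)

// The drone's last reported state, received over the paired link.
data class DroneState(val position: GeoPoint, val altitudeMeters: Double, val headingDegrees: Double)

// The terminal keeps the movable-area zones and the latest drone state,
// and redraws the drone icon whenever a new position report arrives.
class DroneMapModel(val zones: List<SecurityZone>) {
    var latestState: DroneState? = null
        private set

    fun onPositionReport(state: DroneState, redraw: (DroneState, List<SecurityZone>) -> Unit) {
        latestState = state
        redraw(state, zones) // e.g. place the drone icon and shade each zone by its level
    }
}
```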
The
More specifically, when the received location information of the drone approaches the area where a security level is set and comes within a predetermined distance, the control unit can determine that the drone is close to the set area.
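A minimal sketch of the proximity test described here, reusing the illustrative types from the previous sketch and assuming circular security zones with a haversine distance; the 100-meter default stands in for the "predetermined distance" and is purely an assumption.

```kotlin
import kotlin.math.*

// GeoPoint and SecurityZone are the illustrative types from the earlier sketch.

// Great-circle distance in meters between two points (haversine formula).
fun distanceMeters(a: GeoPoint, b: GeoPoint): Double {
    val earthRadius = 6_371_000.0
    val dLat = Math.toRadians(b.latitude - a.latitude)
    val dLon = Math.toRadians(b.longitude - a.longitude)
    val h = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(a.latitude)) * cos(Math.toRadians(b.latitude)) * sin(dLon / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(h))
}

// The drone counts as "close" to a zone once it is within thresholdMeters of the zone boundary.
fun isNearZone(position: GeoPoint, zone: SecurityZone, thresholdMeters: Double = 100.0): Boolean {
    val distanceToBoundary = distanceMeters(position, zone.center) - zone.radiusMeters
    return distanceToBoundary <= thresholdMeters
}
```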
If it is determined that the new location information does not approach the security area, the
The
The movement restriction of the drone includes prohibition of movement, and can be classified into movement prohibition or a movement restriction in which the drone moves at an altitude higher than a set altitude. The photographing restriction includes prohibition of photographing, and can be classified into photographing prohibition or a photographing restriction in which image processing is applied to the photographed image.
The control unit may display, on the display unit, the moving path or the photographed image of a drone that has received the control signal requesting the movement restriction or the photographing restriction.
The
The
The
For example, when the security level is 'Level 1', the control unit prohibits both the movement and the photographing of the drone. When the security level is 'Level 2', the control unit limits the movement of the drone and allows photographing. When the security level is 'Level 3', the control unit permits movement and restricts only photographing.
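The level-to-restriction mapping in this paragraph could be expressed roughly as follows; the Restriction type and the text form of the control signal are illustrative assumptions, not the patent's actual signal format.

```kotlin
// Restrictions requested from the drone for a given security level (SecurityLevel as sketched earlier).
data class Restriction(val movementProhibited: Boolean, val shootingProhibited: Boolean)

fun restrictionFor(level: SecurityLevel): Restriction = when (level) {
    SecurityLevel.LEVEL_1 -> Restriction(movementProhibited = true,  shootingProhibited = true)  // both prohibited
    SecurityLevel.LEVEL_2 -> Restriction(movementProhibited = true,  shootingProhibited = false) // movement limited, photographing allowed
    SecurityLevel.LEVEL_3 -> Restriction(movementProhibited = false, shootingProhibited = true)  // movement allowed, photographing restricted
    SecurityLevel.NONE    -> Restriction(movementProhibited = false, shootingProhibited = false)
}

// Build the control signal the terminal would send to the drone (illustrative text payload).
fun buildControlSignal(level: SecurityLevel): String {
    val r = restrictionFor(level)
    return "RESTRICT move=${r.movementProhibited} shoot=${r.shootingProhibited}"
}
```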
When the position information of the dron approaches the region in which the movement restriction of the dron is set, the
When the position information of the dron approaches the region where the movement restriction of the dron is set, the
When the position information of the dron approaches the area where the shooting restriction of the dron is set, the
If the location information of the drone is close to the region where the photographing restriction of the drone is set, the control unit can display, in a pop-up window, a user interface for selecting either a first option for turning off the camera of the drone or a second option for applying a set image processing method to the image captured by the camera.
When receiving change information about the security level of each area on the map from the server, the control unit may update the map according to the change information and display the updated information on the display unit.
If the location information of the drone moves out of the area where the security level is set, the control unit may transmit a control signal to the drone so that it returns to the operating mode of the previous state. That is, when the drone leaves the area where the security level is set, the control unit can control the drone in a state in which there is no restriction on movement or photographing.
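A hedged sketch of the restore-on-exit behavior described here, assuming the terminal caches the drone's unrestricted operating mode before the first restriction is applied; the command strings and the RestrictionManager name are placeholders.

```kotlin
// Illustrative: remember the operating mode that was active before a restriction was applied,
// and restore it once the drone has left every security zone.
class RestrictionManager(private val send: (String) -> Unit) {
    private var savedMode: String? = null

    fun onEnterZone(level: SecurityLevel, currentMode: String) {
        if (savedMode == null) savedMode = currentMode   // cache the pre-restriction mode once
        send(buildControlSignal(level))                  // request the restriction for this level
    }

    fun onExitAllZones() {
        savedMode?.let { mode ->
            send("RESTORE mode=$mode")                   // return to the previous operating state
            savedMode = null
        }
    }
}
```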
Hereinafter, specific embodiments will be described with reference to FIGS. 3 to 12.
FIG. 3 is a view for explaining a pairing method of a mobile terminal and a drone relating to the present invention, and FIG. 4 is a view for explaining an example of movement restriction or photographing restriction according to the security level of the drones according to the present invention.
Referring to FIG. 3, the
The
Referring to FIG. 4, the control unit of the mobile terminal can display the area where the security level is set on the map displayed on the display unit. At this time, when the security level of each area is set to be different from each other, the color, brightness and the like of each area can be displayed to be distinguished from each other.
The security level can be divided into
The control unit combines the operation control level and the image control level to map the area on the map to the
FIG. 5 is a diagram showing, in a mobile terminal according to the present invention, location information of a drone, a map of the movable area of the drone, and an area for which a security level is set on the map. FIG. 6 is a diagram showing a map display area, a shot image display area, and a control key display area.
Referring to FIG. 5, the
The control unit displays a specific icon (D) on the displayed map at a location corresponding to the current location information of the drones and displays a security level (
The control unit can display the driving direction (dr) of the drone on the map as the depth information of the drone becomes deeper.
6, the
Specifically, the control unit can display the map of the movable area of the drone, the current location information of the drone, and the security area in the map display area (A1), based on the location information received from the drone. The control unit can display the image photographed at the current position of the drone in the shot image display area (A2), and display the control keys of the drone in the control key display area (A3). When real-time location information and a photographed image are received from the drone, the control unit may change the position of the specific icon (D) displayed in the map display area (A1) and the image in the shot image display area (A2) based on the information received in real time.
FIGS. 7 to 10 are diagrams for explaining a drone control method used when the drone enters an operation-restricted region, in the mobile terminal according to the present invention.
7, the
Also, although not shown, when the position information of the drone approaches the area PA in which the movement restriction of the drone is set, the control unit can automatically select the corresponding option according to the security level and transmit the control signal to the drone.
8 to 9, the
The
The
If the first option is selected through the user interface (W2), the control unit can adjust the flight altitude of the drone (FIG. 8(b)). Specifically, when the drone enters the boundary of the security area (PA), the control unit adjusts the flight altitude of the drone to a predetermined height, and controls the flight altitude of the drone to be restored to the previous altitude afterwards.
When the drone is operating with its altitude adjusted, the control unit displays content indicating this in the pop-up window (W1) and displays the traveling direction of the drone on the map (FIG. 9(a)).
If the second option is selected through the user interface (W2), the control unit can adjust the travel path of the drone (FIG. 8(c)). Specifically, the control unit can change the travel route of the drone so that it does not pass through the security area (PA).
When the drone is traveling on the detour route, the control unit can display content indicating this in the pop-up window (W1) and display the detour route (dr) on the map (FIG. 9(b)).
If the third option is selected through the user interface (W2), the control unit can control the drone to return to the user's current position (FIG. 8(d)). Specifically, the control unit can control the drone to change direction when it reaches the boundary of the security area (PA) and return to the current position of the user.
When the drone is returning to the user's position, the control unit displays content indicating this in the pop-up window (W1) and displays the drone's travel information on the map (FIG. 9(c)).
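The three user-selectable responses of FIGS. 8 and 9 could be dispatched roughly as in the sketch below; the option names and command strings are assumptions rather than the patent's wording.

```kotlin
// The three choices offered in the pop-up user interface (W2) when a movement-restricted zone is near.
enum class MovementOption { RAISE_ALTITUDE, DETOUR, RETURN_TO_USER }

// Dispatch the selected option to an illustrative drone command.
fun handleMovementOption(option: MovementOption, send: (String) -> Unit) {
    when (option) {
        MovementOption.RAISE_ALTITUDE -> send("SET_ALTITUDE min=150")        // climb above the zone
        MovementOption.DETOUR         -> send("REROUTE avoid=security_zone") // plan a path around the zone
        MovementOption.RETURN_TO_USER -> send("RETURN_TO_HOME")              // turn back at the zone boundary
    }
}
```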
10, when the
Specifically, when the mobile terminal and the drone are paired using a short-range communication method, if the drone comes within a set distance of the boundary line of the region (R1) in which communication with the mobile terminal is possible, the control unit can control the drone in advance, before communication with the drone is no longer possible. For example, in preparation for the case where communication with the drone over the short-range communication method becomes impossible, the control unit may display a user interface (W) including options such as returning to the user's position, holding the current position, and control in manual mode.
FIGS. 11 and 12 are views for explaining a drone control method used when the drone enters a photographing-restricted region, in the mobile terminal according to the present invention.
11, when the position information of the dron approaches the region where the shooting limit of the dron is set, the
Specifically, when the location information of the drone is close to the area (PA) in which the photographing restriction is set, the control unit can display a user interface (W2) for selecting either a first option for turning off the camera of the drone or a second option for applying a set image processing method to the image captured by the camera.
Although not shown, when the position information of the drone is close to the area PA in which the photographing restriction is set, the control unit can automatically select the corresponding option according to the security level and transmit the control signal to the camera mounted on the drones. For example, when the security level is
Referring to FIG. 12, the
Specifically, when the first option is selected, the control unit can display a notification window (W) indicating that the drone is in a photographing-prohibited area (FIG. 12(a)).
In addition, when the second option is selected, the control unit can display, on the display unit 151, a processed image obtained by processing the photographed image with a predetermined image processing method (FIG. 12(b) to (d)). For example, if the photographed image shows a building, the control unit may perform no image processing, and if the photographed image includes a person, the control unit may be set to apply mosaic processing only to the person (FIG. 12(b) and (c)). In addition, the control unit may be set to blur the entire photographed image (FIG. 12(d)).
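A sketch of the second-option image handling described above; the PersonDetector interface is a hypothetical stand-in for whatever detection the terminal or drone would actually use, and the mosaic/blur decision simply mirrors the policy this paragraph describes.

```kotlin
// How a captured frame is treated inside a photographing-restricted zone.
enum class ImagePolicy { NONE, MOSAIC_PEOPLE, BLUR_ALL }

// Hypothetical stand-in for whatever person detection would actually be used.
fun interface PersonDetector { fun containsPerson(frame: ByteArray): Boolean }

// Decide how to process a frame: blur everything, mosaic detected people, or leave it untouched.
fun policyForFrame(frame: ByteArray, detector: PersonDetector, blurEverything: Boolean): ImagePolicy =
    when {
        blurEverything                 -> ImagePolicy.BLUR_ALL      // FIG. 12(d): blur the whole image
        detector.containsPerson(frame) -> ImagePolicy.MOSAIC_PEOPLE // FIG. 12(b)-(c): mosaic only the people
        else                           -> ImagePolicy.NONE          // e.g. a building: no processing needed
    }
```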
According to the present invention, when a drone is operated using the mobile terminal, a method of restricting its operation or limiting its photographing can be selected automatically or manually as the drone enters a security area, so that the operation and photographing of the drone can be controlled automatically.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit 180 of the terminal.
100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit
Claims (8)
A mobile terminal comprising: a display unit that displays, based on location information of a drone, the location information of the drone, a map of the movable area of the drone, and the security level of each area on the map; and a control unit that transmits, to the drone, a control signal requesting at least one of a movement restriction of the drone or a photographing restriction of the drone according to the security level when the location information of the drone is close to an area where the security level is set.
Wherein the control unit further displays at least one of a control key of the drone or a photographed image of the drone on the display unit so as to be distinguished from the area where the map is displayed.
Wherein the control unit selects at least one of the movement restriction of the drone or the photographing restriction of the drone according to the security level.
Wherein, when the location information of the drone approaches the region in which the movement restriction of the drone is set, the control unit displays, in a pop-up window, a user interface for selecting one of a first option for adjusting the altitude and a second option for bypassing the route.
Wherein, when the location information of the drone approaches the region where the photographing restriction of the drone is set, the control unit displays, in a pop-up window, a user interface for selecting one of a first option for turning off the camera of the drone and a second option for applying a set image processing method to the image photographed by the camera.
Wherein, when receiving change information about the security level of each area on the map from a server, the control unit updates the map according to the change information and displays the updated information on the display unit.
Wherein the control unit transmits a control signal to the drone so that the drone returns to the operating mode of the previous state when the location information of the drone moves out of the area where the security level is set.
A method for controlling a mobile terminal, the method comprising the steps of: receiving location information of a drone; displaying, based on the location information of the drone, the location information of the drone, a map of the movable area of the drone, and the security level of each area on the map; and
transmitting, to the drone, a control signal requesting at least one of a movement restriction of the drone or a photographing restriction of the drone according to the security level, when the location information of the drone is close to an area where the security level is set.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150105381A KR20170011915A (en) | 2015-07-24 | 2015-07-24 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150105381A KR20170011915A (en) | 2015-07-24 | 2015-07-24 | Mobile terminal and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170011915A true KR20170011915A (en) | 2017-02-02 |
Family
ID=58151913
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150105381A KR20170011915A (en) | 2015-07-24 | 2015-07-24 | Mobile terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170011915A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023195734A1 (en) * | 2022-04-05 | 2023-10-12 | 주식회사 아르고스다인 | Method and system for providing self-drone shooting service |
-
2015
- 2015-07-24 KR KR1020150105381A patent/KR20170011915A/en unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20170024846A (en) | Mobile terminal and method for controlling the same | |
KR20160019145A (en) | Mobile terminal and method for controlling the same | |
KR20180023310A (en) | Mobile terminal and method for controlling the same | |
KR20160016400A (en) | Mobile terminal and method for controlling the same | |
KR101677644B1 (en) | Mobile terminal and method of controlling the same | |
KR101510704B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR20170021514A (en) | Display apparatus and controlling method thereof | |
KR20170019071A (en) | Mobile terminal and method for controlling the same | |
KR20170029756A (en) | Mobile terminal and method for controlling the same | |
KR20170011240A (en) | Mobile terminal and method for controlling the same | |
KR20160084208A (en) | Mobile terminal and method for controlling the same | |
KR20160096271A (en) | Mobile terminal and method for controlling the same | |
KR20170011915A (en) | Mobile terminal and method for controlling the same | |
KR20160077907A (en) | Mobile terminal and method for controlling the same | |
KR20170029330A (en) | Mobile terminal and method for controlling the same | |
KR20170016700A (en) | Mobile terminal and method for controlling the same | |
KR20160138805A (en) | Mobile terminal and method for controlling the same | |
KR101669210B1 (en) | Mobile terminal | |
KR20160087306A (en) | Mobile terminal | |
KR101641565B1 (en) | Mobile terminal and method for controlling the same | |
KR20170064763A (en) | Mobile terminal and method for controlling the same | |
KR20180032402A (en) | Mobile terminal | |
KR20170064765A (en) | Mobile terminal and method for controlling the same | |
KR20170068066A (en) | Mobile terminal and method for controlling the same | |
KR20170022217A (en) | Mobile terminal and method for controlling the same |