US20170220242A1 - Home security system with touch-sensitive control panel - Google Patents
- Publication number
- US20170220242A1 (application US 15/430,506)
- Authority
- US
- United States
- Prior art keywords
- security
- control panel
- display
- screen
- home screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19682—Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19689—Remote control of cameras, e.g. remote orientation or image zooming control for a PTZ camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/23—Pc programming
- G05B2219/23377—Touch screen, with representation of buttons, machine on screen
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- The present disclosure relates to electronically controlled systems. More particularly, embodiments of the present disclosure relate to control panels and interfaces for use with electronically controlled systems, including automation systems in a home or commercial setting. More particularly still, embodiments of the present disclosure relate to touch-screen interfaces and control panels that are intuitive to use and operate.
- HVAC: heating, ventilation, and air conditioning
- The operation of multiple devices and systems may be collectively managed using automated control systems.
- Examples may include universal remote controls for operating multiple types of entertainment systems, as well as home security and thermostat systems that can control HVAC and security of a residential or commercial building.
- Controls for electronic devices may be for individual devices or for centralized control of multiple devices.
- User interfaces may be provided to enable a user to interact with the electronic device(s).
- A remote control may, for instance, allow a user to power multiple electronic entertainment devices on or off, or to change volume, input sources, audio/video quality, and the like.
- The interface may allow a user to arm or disarm a security device or change a home heating/cooling scheme or temperature, among other features.
- The user interface may include a control panel having a set of buttons (e.g., numerical, alphabetic, task-specific, etc.).
- The buttons, when depressed, can cause an actuator to generate an electronic signal. That signal may be transferred through circuitry in the control panel and/or to a device being monitored or controlled.
- An air conditioning unit or furnace may, for instance, be turned on or off in response to a user pressing a button.
- An alarm on a door or window may be armed or disarmed depending on the button that is depressed.
- Buttons on a control panel may limit the overall or intuitive operation of the control panel itself.
- Each button generally has one or two primary functions, and added functions may require advanced, non-intuitive operations that force the user to consult a menu or operation manual.
- Such limited functions for the control buttons may make the control panel difficult to use.
- A button-based interface may make a control panel impractical for centralized control of many different systems and components, either because the control panel requires a large number of buttons, or because a limited set of buttons must each carry multiple different operations, making the control interface difficult for the user to learn.
- The security system includes a control panel that monitors the operation and/or status of a sensor for detecting a security condition at the residential or commercial location.
- Input related to the security system may be provided at the control panel, which may include a touch-sensitive display.
- The touch-sensitive display may receive and detect multiple simultaneous touches and perform actions based on the multiple touches received.
- The control panel may be equipped to differentiate between types of touches.
- The control panel may identify the respective locations of each of multiple touches. Gestures associated with one or more of the multiple touches may additionally or alternatively be identified, as may the number of touches. Types of touches detected may include a tap, double tap, drag, flick, rotate, spread, or pinch, among other things.
- Based on the detected touches, the control panel may perform any of a number of different actions, including pan, zoom, scroll, and rotate.
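The differentiation between touch types described above can be sketched in code. The following is an illustrative sketch only, not the patent's implementation; the `Touch` structure, the pixel threshold, and the gesture names are assumptions.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Touch:
    """One tracked contact point: start (x0, y0) and current (x1, y1)."""
    x0: float
    y0: float
    x1: float
    y1: float

def classify_two_finger_gesture(a: Touch, b: Touch, threshold: float = 10.0) -> str:
    """Distinguish pinch, spread, and two-finger drag from two simultaneous touches.

    If the separation between the contacts shrinks it is a pinch, if it
    grows it is a spread, and if it stays roughly constant while both
    points move it is treated as a drag (pan).
    """
    d_start = hypot(a.x0 - b.x0, a.y0 - b.y0)  # initial finger separation
    d_end = hypot(a.x1 - b.x1, a.y1 - b.y1)    # final finger separation
    if d_end < d_start - threshold:
        return "pinch"   # fingers moved together
    if d_end > d_start + threshold:
        return "spread"  # fingers moved apart
    return "drag"        # separation unchanged
```

A real gesture recognizer would also track timing (for taps versus double taps) and angular change (for rotate), but the separation test above captures the core pinch/spread/drag distinction.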
- The control panel may include a haptic response component.
- The haptic response component may provide a tactile response to input received at the control panel.
- A tactile response may be provided based on input received at the touch-sensitive display or at components separate from the display, such as a home and/or emergency input.
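One way such a haptic component could distinguish input sources is a per-source vibration pattern. The mapping below is purely hypothetical (the patent does not specify patterns); it only illustrates the idea that a dedicated emergency input might get a more distinctive tactile confirmation than an ordinary screen tap.

```python
def haptic_pattern(source: str) -> list[int]:
    """Return a vibration pattern (alternating on/off durations in ms)
    for a given input source. All values here are illustrative."""
    patterns = {
        "touchscreen": [20],            # brief pulse confirming a tap
        "home_button": [30, 50, 30],    # two short pulses
        "emergency":   [100, 50, 100],  # strong double pulse, easy to feel
    }
    return patterns.get(source, [20])   # default: brief confirmation pulse
```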
- The security system may include intrusion, fire, flood, or carbon monoxide sensors.
- The security system may also include a camera. If a camera is provided, still or video images may be displayed on the control panel. A map or camera image on the display may be manipulated using the touch screen, which is capable of recognizing multiple simultaneous touches. Input may be used to rotate, pan, scroll, or zoom in or out on such images.
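For the zoom manipulation just described, a common approach (assumed here, not taken from the patent) is to scale the map or camera image by the ratio of final to initial finger separation, clamped to a sensible range:

```python
from math import hypot

def zoom_factor(p1_start, p2_start, p1_end, p2_end, lo=0.25, hi=8.0):
    """Scale factor for an image from a pinch/spread gesture.

    Each argument is an (x, y) point. The factor is the ratio of the
    final finger separation to the initial separation, clamped to
    [lo, hi]; the clamp limits are illustrative choices.
    """
    d0 = hypot(p1_start[0] - p2_start[0], p1_start[1] - p2_start[1])
    d1 = hypot(p1_end[0] - p2_end[0], p1_end[1] - p2_end[1])
    factor = d1 / d0 if d0 else 1.0  # guard against coincident touches
    return max(lo, min(hi, factor))
```

Fingers spreading to twice their initial separation would thus double the displayed scale of the map or image.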
- The security system may include a security component, such as a sensor, for detecting a potential security threat.
- A control panel may have a communication link with the security component and monitor its status.
- A display may be included in the control panel and can show information related to the status of the security component.
- The display may be touch-sensitive so as to receive and recognize input with multiple, simultaneous touches, and may differentiate between touches, including gestures associated therewith.
- The security system may include a haptic response component to provide tactile feedback when input is received at the control panel.
- Embodiments disclosed herein relate to methods for providing security at a residential or commercial location.
- Security may be provided using a control panel dedicated for use at a particular location.
- The control panel may monitor a security system including at least one sensor. Information relating to the operation or status of the sensor may be displayed on the control panel. Input may also be received at the control panel, including multi-touch input on a touch-sensitive display.
- The touch-sensitive display may be a capacitive touch screen.
- Sensors may detect intrusion, and can include cameras.
- A camera may provide still or video images that can be displayed on the display of the control panel.
- A map or other image may also be provided from a camera or other device and displayed on the control panel.
- In response to input, the control panel may select, open, close, delete, zoom, pan, scroll, change, rotate, or otherwise manipulate an image or object on the display.
- A control panel of an automation system including security features or systems may also, for instance, monitor entertainment systems within a residential or commercial location. Additional or alternative systems monitored or controlled using the control panel may include HVAC systems (e.g., thermostat, heating, air conditioning, etc.), lighting systems, sprinkler systems, and telephone systems.
- FIG. 1 is a schematic diagram of an example security system, according to one example embodiment of the present disclosure;
- FIG. 2 is a schematic diagram of an exemplary control panel of the security system of FIG. 1;
- FIGS. 3 and 4 illustrate an example control panel for use within a security system;
- FIG. 5 illustrates an exemplary control panel configured to receive and recognize expand gestures;
- FIG. 6 illustrates an exemplary control panel configured to receive and recognize pinch gestures;
- FIGS. 7 and 8 illustrate an exemplary control panel configured with zooming functions on a map;
- FIGS. 9 and 10 illustrate an exemplary control panel allowing zooming functions on an image;
- FIG. 11 illustrates an exemplary control panel configured to receive and recognize rotate gestures;
- FIGS. 12 and 13 illustrate an exemplary control panel allowing rotate functions;
- FIG. 14 illustrates an exemplary control panel configured to receive and recognize drag and/or flick gestures;
- FIGS. 15 and 16 illustrate an exemplary control panel allowing pan functions;
- FIGS. 17 and 18 illustrate an exemplary control panel allowing flick functions; and
- FIG. 19 illustrates an exemplary control panel providing a haptic response.
- Systems, devices and methods according to the present disclosure are configured for use in connection with residential and/or commercial security systems.
- A home or business may use a security system for added safety of residents or patrons, or to protect valuable property.
- The security system may also include capabilities for operating in connection with other systems.
- Systems under centralized control with the security system may include lighting components, sprinkler systems, HVAC components, audio/video systems, and the like.
- Referring to FIG. 1, an embodiment of an exemplary security system is shown, including a distributed system 100 for managing and monitoring security-related issues of a residence or business.
- The operation of the system 100 may include a network 102 facilitating communication between a network operations center 104 and one or more control panels 106.
- The network 102 may be capable of carrying electronic communications.
- The Internet, local area networks, wide area networks, virtual private networks (VPN), other communication networks or channels, or any combination of the foregoing may be represented by the network 102.
- The network 102 may operate in any number of different manners, and may include different components distributed at different locations.
- The network 102 may include a wireless communication system provided by a mobile phone provider, although wired communication may also be used.
- Although a single network 102 is illustrated, such a component may be illustrative of multiple devices or components.
- The network 102 may include multiple networks interconnected to facilitate communication.
- The network operations center 104 may monitor the operation of the control panel 106, which may be associated with a security system. For instance, the network operations center 104 may monitor the control panel 106 to ensure it is operating and communicating properly, to update software or firmware on the control panel 106, and the like. In addition, the network operations center 104 may monitor signals received by the control panel 106. For instance, if a control panel 106 receives a signal indicative of a breach at an armed door or window of a building, the network operations center 104 may be notified of such an event over the network 102. The network operations center 104 may then perform some security-related function (e.g., notify the police, make a telephone call to the owner of the building, etc.). Of course, the network operations center 104 may provide any number of other functions, and may be distributed among multiple devices, components or facilities.
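The operations-center behavior just described (receive a panel event, choose a security-related response) can be sketched as a simple dispatch function. The event fields and action names below are illustrative assumptions; the patent names only the example responses (notify police, call the owner).

```python
def handle_panel_event(event: dict) -> list[str]:
    """Choose operations-center actions for an event reported by a panel.

    `event` is a dict such as {"type": "breach", "armed": True}.
    Returns the list of actions to take, in order.
    """
    actions = []
    if event.get("type") == "breach" and event.get("armed", False):
        # A breach at an armed opening warrants the strongest response.
        actions.append("notify_police")
        actions.append("call_owner")
    elif event.get("type") == "heartbeat_missed":
        # Panel stopped communicating: a maintenance concern, not an alarm.
        actions.append("schedule_service_check")
    return actions
```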
- A control panel 106 may be located at, or otherwise associated with, a particular location such as a home or business. At the respective locations, users may manually operate the control panel 106 to provide security-related functions.
- Electronic devices 108, 110 remote from the control panel 106 may send signals over the network 102 to control operation of the control panel 106, so that manual operation at the control panel 106 is not required.
- The control panel 106 may monitor the operations of a number of different systems, components or appliances. As shown in FIG. 1, example components monitored by the control panel 106 may include an entertainment system 112, an HVAC system 114, a lighting system 116, a security system 118, a sprinkler system 120 and/or a telephone system 122. Other components, appliances, systems, and the like may also be provided, as indicated by the illustrated ellipses. Furthermore, the types of security and/or automation systems 124 monitored by additional control panels may include the same or other components.
- The system 100 of the present disclosure is implemented as a communication system in which the operations of electronic components may be monitored through communication links.
- The communication links may be wired or wireless, or may include a combination of wired and wireless links. Regardless of the particular mode of communication, the status or operation of devices and components within the system 100 may be reported or otherwise communicated to a corresponding control panel 106, network operations center 104, or one of the electronic devices 108, 110.
- The monitored systems 112-124 may therefore include any number of different types of components that provide or receive electronic signals.
- The entertainment system 112 may include components such as televisions, recordable media players (e.g., DVD players, Blu-ray players, digital video recorders, VCRs, etc.), projectors, speakers, stereos, and the like.
- The HVAC system 114 may include thermostats, air conditioners, furnaces, temperature sensors, etc.
- The lighting system 116 may include light fixtures, switches, sensors (e.g., motion sensors), or additional components.
- The security system 118 may include sensors and/or detectors (e.g., motion sensors, magnetic sensors, intrusion sensors, vibration sensors, infrared sensors, ultrasonic detectors, microwave detectors, contact sensors, photoelectric beam detectors, smoke detectors, temperature sensors, carbon monoxide detectors, etc.), video or still cameras, speakers, microphones, or other components.
- The sprinkler system 120 may include valves, actuators, sensors (e.g., flow rate sensors, proximity sensors, etc.), sprinklers, pumps, and the like.
- The telephone system 122 may include telephones, answering machines, call forwarding components, and the like.
- The system 100 is illustrative of an example system that may provide distributed functionality.
- Electronic devices 108, 110 may communicate with the control panel 106 and/or the network operations center 104.
- A control panel 106 may be at a home. The home owner may be away from home and may use electronic devices 108, 110 to communicate over the network 102 with the control panel 106.
- The user may provide input to the electronic devices 108, 110 to control the signals the control panel 106 sends to components of the security system 118, HVAC system 114, lighting system 116, or the like.
- The control panel 106 may provide information to the electronic devices 108, 110.
- A camera of the security system 118 may provide video or still images that can be communicated over the network 102 to allow the user of the electronic devices 108, 110 to see what is happening at the home. Similar information may be provided to the network operations center 104, or the network operations center 104 may provide control signals to the control panel 106.
- The system 100 of FIG. 1 is but one example of a suitable system for embodiments of the present disclosure.
- Functions may occur at one or more locations (e.g., sensor monitoring at the control panel 106, alarm/alert handling at the network operations center 104, etc.).
- Data may be processed or interpreted in other manners.
- Components within the system 100 may communicate continuously, by using pushed communications, by using pull communications, or by using some other communication system, or any combination of the foregoing.
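The push versus pull distinction can be illustrated with a toy communication link. This is a sketch under assumed names, not anything specified by the patent: pushed events are delivered to subscribers as they occur, while pulled events wait in a buffer until a consumer polls for them.

```python
import queue

class PanelLink:
    """Toy link supporting both push (subscriber callbacks) and pull
    (explicit polling) delivery of events. Names are illustrative."""

    def __init__(self):
        self._events = queue.Queue()
        self._subscribers = []

    def subscribe(self, callback):
        """Register a callback for push delivery."""
        self._subscribers.append(callback)

    def publish(self, event):
        """Push the event to subscribers and buffer it for pull consumers."""
        for cb in self._subscribers:
            cb(event)
        self._events.put(event)

    def poll(self):
        """Pull: return the next buffered event, or None if there is none."""
        try:
            return self._events.get_nowait()
        except queue.Empty:
            return None
```

A control panel might push alarm events to the operations center immediately, while the center pulls routine status updates on its own schedule.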
- FIG. 2 illustrates one example embodiment of a control panel 200 that may be used in the system 100; however, it should be appreciated that control panels may include any number of different features, components or capabilities. Thus, FIG. 2 and the description thereof should not be considered limiting of the present disclosure.
- The control panel 200 may include multiple components interacting together over one or more communication channels.
- One or more processors 202 may communicate with input/output devices 204, a communication interface 206, memory 208 and/or a mass storage device 210 via a communication bus 212.
- The processors 202 may generally include one or more processing components, including a central processing unit, a graphics processing unit, or the like, any of which may be capable of executing computer-executable instructions received or stored by the security control panel 200.
- The processors 202 may communicate with the input/output devices 204 using the communication bus 212.
- The input/output devices 204 may include various components, including a touch-sensitive display 214, one or more sensors 216, tactile output components 218, one or more ports 220, or other components. Such components may include, for instance, buttons or keypads, a mouse, scanners, printers, cameras, global positioning system (GPS) units, biometric input systems (e.g., iris scanners, fingerprint readers, etc.), other components, or any combination of the foregoing.
- The communication interface 206 may receive or transmit communications via a network (e.g., network 102 of FIG. 1), and received communications may be provided over the communication bus 212 and processed by the one or more processors 202.
- The security control panel 200 may also include memory 208 and a mass storage device 210.
- The memory 208 may include both persistent and non-persistent storage; in the illustrated embodiment, the memory 208 is shown as including random access memory (RAM) 222 and read only memory (ROM) 224.
- Other types of memory or storage may also be included.
- The mass storage device 210 may generally comprise persistent storage in a number of different forms. Such forms may include a hard drive, flash-based storage, optical storage devices, magnetic storage devices, or other forms which are either permanently or removably coupled to the security control panel 200.
- The mass storage device 210 may store an operating system 226 defining the general operating functions of the security control panel 200.
- The operating system 226 may be executed by the processors 202.
- Other components stored in the mass storage device 210 may include drivers 228 (e.g., to facilitate communication between the processors 202 and the input/output devices 204), a browser 230 (e.g., to access or display information obtained over a network, including mark-up pages and information), a gesture application 232 (e.g., for use with the touch-sensitive display 214), and application programs.
- Application programs may generally include any program or application used in the operation of the security control panel 200 .
- Examples of application programs may include modules specifically designed for a home security and/or automation system (e.g., security application 234 ), or more general use applications.
- Examples of more general use applications may include word processing applications, spreadsheet applications, games, calendaring applications, weather forecast applications, sports scores applications, and other applications.
- The security application 234 may include applications or modules capable of being used by the security control panel 200 in connection with a security or automation system.
- The modules 236-248 may provide similar functions, but for different systems monitored using the security control panel 200.
- The security application 234 may include an audio module 236.
- The audio module 236 may generally control how audio components of a security and/or automation system operate. Such audio components may be part of an entertainment system (e.g., speakers for a television or stereo), a security system (e.g., an audible alarm), a telephone system (e.g., an intercom or speaker phone), or any other system.
- The audio module 236 may also allow audio communication (e.g., between different control panels, between the security control panel 200 and a network operations center 104, or between the control panel 200 and a telephone, etc.).
- An additional application or module within the security application 234 may be an HVAC module 238.
- The HVAC module 238 may control, monitor or interface with an HVAC system, which may include a thermostat, air conditioner, furnace, hot water heater, or other similar components.
- A lighting module 240 may have similar functions, but may control, monitor or interface with lighting components including switches, lighting fixtures, and the like.
- The security module 242 may control, monitor, or interface with security-related components such as intrusion detection components, cameras, global positioning system (GPS) components, and safety components (e.g., fire, flood, carbon monoxide or radon detectors).
- The sprinkler module 244 may automate a sprinkler system, monitor operation of the system (e.g., verify water flow rates at one or more locations), and the like.
- The telephone module 246 may interface with a telephone system. For instance, if a user is away from a residential or commercial location, the telephone module 246 may communicate with the telephone system to automatically forward calls, route them to another person, or the like.
- A video module 248 may be used in connection with video functions within a security and/or automation system. The video module 248 may monitor video feeds from security cameras, interface with video entertainment devices, provide other video-related functions, or any combination of the foregoing.
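The per-subsystem modules described above suggest a registry pattern: each module registers under a subsystem name, and the panel routes commands by name. The class, method, and module names below are illustrative assumptions, not the patent's design.

```python
class SecurityApplication:
    """Minimal registry in the spirit of modules 236-248: one handler
    per monitored subsystem, looked up by name at dispatch time."""

    def __init__(self):
        self._modules = {}

    def register(self, name, handler):
        """Associate a subsystem name with a command handler."""
        self._modules[name] = handler

    def dispatch(self, name, command, *args):
        """Route a command to the named subsystem's handler."""
        handler = self._modules.get(name)
        if handler is None:
            raise KeyError(f"no module registered for {name!r}")
        return handler(command, *args)

# Example wiring with stub handlers standing in for real modules.
app = SecurityApplication()
app.register("hvac", lambda cmd, *a: f"hvac:{cmd}")
app.register("lighting", lambda cmd, *a: f"lighting:{cmd}")
```

A real panel would register richer handler objects (the audio, HVAC, lighting, security, sprinkler, telephone, and video modules), but the lookup-and-route structure would be the same.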
- the security application 234 optionally includes a remote access module 250 .
- the remote access module 250 may allow the security control panel 200 to be accessed using remote devices (e.g., devices 108 , 110 of FIG. 1 ), and to potentially have communications relayed through the control panel 200 either from or to the remote device.
- a user of a remote device could potentially set or view audio, HVAC, lighting, security, sprinkler, telephone, video or other settings remotely, and/or monitor audio and/or video feeds from the secured location.
- the security application 234 may also include additional or other modules or components, including authentication, settings, preferences, emergency override, updating, and other modules.
- the control panel 200 of FIG. 2 may provide an intuitive interface by which a user may monitor or control one or more systems within a residential or commercial location.
- the security application 234 (and the modules 236 - 250 ) may be monitored and controlled using the touch-sensitive display 214 .
- the touch-sensitive display 214 may include a capacitive, pressure sensitive, or other touch-screen display.
- Each module 236 - 250 of the security application 234 may provide information which may be displayed on the touch-sensitive display 214 for operation of a corresponding system or device.
- the touch-sensitive display 214 may be operated in connection with a gesture application 232 to allow multiple touches, gestures, and other commands to be received directly at the touch-sensitive display 214 .
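The gesture application's role can be pictured as a thin dispatch layer between recognized gestures and the actions the panel performs. The sketch below is purely illustrative — the gesture and action names are assumptions for exposition, not taken from the disclosure:

```python
# Illustrative mapping from recognized gestures to panel actions.
# Names are hypothetical; the disclosure does not define this table.
GESTURE_ACTIONS = {
    "tap": "select",
    "spread": "zoom_in",
    "pinch": "zoom_out",
    "rotate": "rotate_object",
    "drag": "pan",
    "flick": "change_interface",
}

def dispatch(gesture: str) -> str:
    """Return the action associated with a recognized gesture name."""
    try:
        return GESTURE_ACTIONS[gesture]
    except KeyError:
        raise ValueError(f"unrecognized gesture: {gesture}")
```

A table like this also makes it easy for different interfaces (maps, camera feeds, sprinkler zones) to override individual gesture bindings.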
- FIG. 3 illustrates an example of a control panel 300 incorporating a display 302 that may be operated in accordance with embodiments of the present disclosure.
- the display 302 may display an interface 304 a .
- the interface 304 a may include a view presented by software, firmware, or other components stored on computer readable media in the control panel 300 , or otherwise accessible thereto.
- the control panel 300 may further include additional components. Examples of additional components, illustrated in FIG. 3 , may include an audio component 306 and input buttons 308 , 310 .
- the audio component 306 may include an optional speaker and/or microphone component. For instance, the audio component 306 may allow a sound to be presented when a certain event occurs. For example, an alarm may sound when the control panel 300 is notified of a breach at an armed door, window or other location.
- the control panel 300 may be used in a home entertainment context. Music or other entertainment programming may be accessed and audio data may be provided through the audio component 306 .
- the audio component 306 may act as an intercom.
- a person wishing to enter the residential or commercial location may speak into a microphone and that information can be transmitted to the control panel 300 and output via the audio component 306 .
- Additional two-way communications may be provided.
- the button 308 may be an “emergency” button designed to be pressed when an emergency occurs. In some cases, pressing the button 308 may cause the control panel 300 to contact a remote party such as an emergency response provider (e.g., police, fire, medical, hospital, etc.) or a network operations center. Two-way communication with the remote provider may be facilitated by the audio component 306 as well as by communication systems (e.g., telephone connections, wireless communication, VOIP, etc.) within the control panel 300 .
- an additional button 310 may also be provided as shown on the control panel 300 in FIG. 3 .
- the button 310 is a “home” button. When pressed, the button 310 may trigger a response where the display 302 illustrates a predetermined, home interface.
- the interface 304 a may be a home interface. As a user navigates through different interfaces and views, the button 310 may therefore return the user to the interface 304 a.
- the buttons 308 , 310 may therefore act as input components. In response to user input, an action may be triggered.
- the control panel 300 may include additional or other interfaces. For instance, additional, different or fewer buttons may be provided, or different types of inputs (e.g., switches, toggles, etc.) may be provided.
- the display 302 may act as an additional input mechanism. More particularly, the display 302 may be touch-sensitive (e.g., a touchscreen). The display 302 may recognize a touch from a user, stylus, or the like. When received, the location and/or manner in which the touch is received may trigger a response.
- the interface 304 a displays two selectable options, namely a security option 312 and a home services option 314 . If the user taps or otherwise touches the display 302 at a location corresponding to the position of the security option 312 , an additional interface may be presented to allow the user to select, view, or change security settings (e.g., to arm a security component).
- An example security interface is shown in FIG. 4 , as interface 304 b .
- additional options may be presented. Examples of additional home services options may include options identified in FIG. 2 and the discussion related thereto, as well as in other portions of this disclosure.
- the display 302 of FIGS. 3 and 4 may be configured to recognize any number of different types of inputs. For instance, as discussed herein, a user may tap the display 302 to select a corresponding option (e.g., options 312 , 314 in FIG. 3 or the “arm”, “menu” or “status” options in FIG. 4 ). Further inputs recognized by the display 302 may include multiple, simultaneous touches of the display 302 and/or gestures. Gestures may generally include a touch of the display 302 (e.g., a touch-point), optionally accompanied by a corresponding movement of the finger, stylus or other element touching the display 302 . Gestures may be combined with multiple touch-points or may be used with a single touch-point of the display 302 .
- multiple touch-point and gesture inputs may be recognized when the display 302 of the control panel 300 is touched.
- a user may use their fingers to touch the display 302 at two touch-points, 316 , 318 .
- the user may then move their fingers in a predetermined manner recognizable by the control panel 300 (e.g., using a gesture recognition application as part of an operating system or as a separate application as shown in FIG. 2 ).
- the predetermined patterns of gestures may include spreading and pinching gestures.
- In the spreading gesture shown in FIG. 5 , the touch-points 316 , 318 may spread and move further apart to the positions illustrated as touch-points 316 a and 318 a .
- FIG. 6 illustrates effectively the opposite gesture.
- the user may touch the display 302 at touch-points 316 , 318 and move the touch-points closer together. This is represented in FIG. 6 using the arrows and the moved touch-points shown at touch-points 316 b and 318 b.
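The spread gesture of FIG. 5 and the pinch gesture of FIG. 6 can be distinguished by comparing the distance between the two touch-points before and after the movement. A minimal sketch, assuming pixel coordinates and an illustrative jitter threshold:

```python
import math

def classify_two_finger_gesture(p1_start, p2_start, p1_end, p2_end,
                                threshold=10.0):
    """Classify a two-touch-point gesture as 'spread' or 'pinch' by the
    change in distance between the touch-points. Coordinates are (x, y)
    pixels; `threshold` filters out incidental jitter."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    delta = dist(p1_end, p2_end) - dist(p1_start, p2_start)
    if delta > threshold:
        return "spread"   # fingers moved apart, as in FIG. 5
    if delta < -threshold:
        return "pinch"    # fingers moved together, as in FIG. 6
    return "none"
```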
- the control panel 300 may recognize the gesture, identify an action associated with the gesture, and perform the action. Performance of the action may be dependent upon the gesture; however, the action may also be dependent on the touch-points 316 , 318 . For instance, if multiple objects are displayed on the display 302 , the action associated with the gesture may affect all objects or some objects (e.g., a single object).
- FIGS. 7-10 illustrate exemplary actions that may be performed based on the gestures performed in FIGS. 5 and 6 .
- FIG. 7 illustrates interface 304 c on the display 302 of the control panel 300 .
- the interface 304 c may display a map 320 .
- the map 320 may be a floor plan or other similar map of a residential or commercial building, although a map may be of other locations (e.g., an entire lot, a neighborhood, a city, etc.).
- the map 320 may serve any of a number of purposes.
- the map 320 may show which building entrances (e.g., windows or doors) associated with a security device may be armed, disarmed, or otherwise configured.
- the map 320 may have other purposes, such as locating a particular person, pet, set of keys, etc. (e.g., using GPS tracking or other mechanisms).
- the control panel 300 and interface 304 c may display an initial presentation of the map 320 . If, however, the user touches the display 302 and performs a gesture (e.g., the gesture in FIG. 5 ), the map 320 may change within the interface 304 c . As an example, by touching two or more places on the display 302 and then spreading the touch-points, a user may zoom-in on the map 320 .
- FIG. 8 illustrates an example in which the control panel 300 causes the interface 304 c to zoom-in on the map 320 .
- the user may zoom-out from the map 320 in the interface 304 c to a desired magnification level.
- such an action may be performed by using an opposite gesture (e.g., the pinch gesture of FIG. 6 ).
- the gesture performed may be repeated multiple times.
- Repeating a spread gesture may, for instance, allow the magnification level to be repeatedly increased on an object or set of objects within the interface 304 c .
- repeating a pinch gesture may repeatedly decrease the magnification level.
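Repeated spread or pinch gestures adjusting a clamped magnification level might be modeled as follows; the step factor and the minimum/maximum levels are illustrative assumptions, not values from the disclosure:

```python
def apply_zoom(level, gesture, step=1.25, min_level=1.0, max_level=8.0):
    """Adjust a magnification level in response to a zoom gesture.
    Each spread multiplies the level by `step`; each pinch divides it.
    The result is clamped so repeated gestures cannot zoom past limits."""
    if gesture == "spread":
        level *= step
    elif gesture == "pinch":
        level /= step
    return max(min_level, min(max_level, level))
```

Clamping matters on a panel interface: a user repeatedly pinching a floor-plan map should land at a stable full view rather than an unusable sub-unit magnification.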
- An example of zooming-out on an object is shown in going from the view of map 320 in FIG. 8 to the view in FIG. 7 .
- a gesture may be used to zoom in or zoom out on an object. More particularly, in some embodiments as shown in FIGS. 9 and 10 , interface 304 d may be displayed on the display 302 of the control panel 300 .
- the interface 304 d may be a view from a camera (e.g., a security camera). The camera may provide a view illustrating a particular object, room, or other location. In this embodiment, FIG. 9 illustrates a view of a particular room.
- the user may touch the display 302 and perform a gesture corresponding to a zoom-in action (e.g., the spreading gesture of FIG. 5 , double-tapping the display, etc.).
- Conversely, if the user performs a gesture corresponding to a zoom-out action (e.g., the pinching gesture of FIG. 6 ), the control panel 300 may zoom out and provide a broader view of a location.
- FIG. 10 illustrates an example where the camera feed is zoomed out, which exposes the room shown in FIG. 9 , as well as an outdoor area.
- the displayed location is merely illustrative, and a camera feed may be located at any location inside or outside a residential building, commercial building, or other premises to provide a number of different views.
- Providing for the ability to zoom in and zoom out on the display 302 may allow a user to zoom in or zoom out to see a number of different objects, whether the interface displays a camera feed, map, or some other object. For instance, a user searching for his keys may zoom in on various locations within a building to find and see the keys from the display 302 . Similarly, a user may search for a person carrying a GPS-equipped phone or a pet carrying a GPS transponder. The user could zoom out in a map or camera-feed to find the general location of the person or pet, and then zoom in on the location to find more precise coordinates.
- performing a gesture may change the magnification level as shown in FIGS. 7-10 .
- magnification may be performed directly at the control panel 300 by, for instance, increasing the magnification of an object or set of objects within the display 302 .
- the gestures performed may cause the control panel 300 to perform another action.
- the control panel 300 may communicate with the security camera to cause the camera itself to zoom in or out. Similar capabilities may be provided to allow a user to move the focal point of the camera around (e.g., using a drag gesture as described hereafter with respect to FIGS. 14-16 ). Additional gestures may even provide the ability to change between map views or camera feeds (e.g., using a flick gesture as described with respect to FIGS. 14, 17 and 18 ).
- the pinch and spread gestures may be associated with other functions as well.
- FIG. 11 illustrates another example embodiment in which a user may touch the display 302 of the control panel 300 at touch-points 316 , 318 and move the touch-points in a predetermined manner.
- FIG. 11 illustrates a rotational movement in which the touch-points 316 and 318 move to corresponding positions at touch-points 316 c and 318 c by following rotational or curved paths.
- the particular paths are illustrated as being in a clockwise direction, but they could also be in other directions, including counter-clockwise.
- the rotational gesture illustrated in FIG. 11 may be associated with one or more actions that may change the size, shape, type, number or other characteristics of the objects displayed in the display 302 .
- FIG. 12 illustrates the display 302 as including an interface 304 e having thereon a map 322 .
- the map 322 may be a neighborhood map. If the user touches the display 302 and performs a particular gesture (e.g., the rotational gesture of FIG. 11 ), the map 322 may change.
- FIGS. 12 and 13 illustrate an example in which the map 322 rotates clockwise from a horizontal position ( FIG. 12 ) to a vertical position ( FIG. 13 ) as a result of a rotational gesture.
- a rotational gesture may cause an interface of the display 302 to change and rotate one or more objects in a predetermined direction.
- the direction may always be the same, or may be based on the gesture. For instance, rather than rotating one or more touch-points clockwise, the user may rotate touch points in a counter-clockwise direction. As a result, the map 322 could rotate in a counter-clockwise direction.
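The direction of a rotational gesture can be recovered from the sign of the 2D cross product between a touch-point's start and end radius vectors about a rotation center. A sketch, assuming screen coordinates with y increasing downward (so a positive cross product corresponds to clockwise motion on screen):

```python
def rotation_direction(center, start, end):
    """Determine whether a touch-point moving from `start` to `end`
    rotates clockwise or counter-clockwise about `center`, using the
    sign of the 2D cross product of the two radius vectors. Screen
    coordinates with y increasing downward are assumed."""
    v1 = (start[0] - center[0], start[1] - center[1])
    v2 = (end[0] - center[0], end[1] - center[1])
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    if cross > 0:
        return "clockwise"
    if cross < 0:
        return "counter-clockwise"
    return "none"
```

With two moving touch-points, the same test can be applied to each point about their midpoint; with one held touch-point, the held point serves as the center.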
- the gesture of FIG. 11 could perform other actions, or could be replaced with other gestures to rotate an object.
- a user could touch the display 302 at a single point and move the touch-point along a curved path.
- the curved path could indicate a desire to rotate an object.
- more than two touch-points may be used.
- two touch-points may be used; however, only one touch-point may be moved in a rotational direction (e.g., the second touch-point is held in place).
- FIG. 14 illustrates still another set of example gestures that may be performed on, and recognized by, the display 302 of the control panel 300 .
- the gestures may include a single touch-point 316 moved in a particular direction (e.g., to touch-point 316 d or 316 e ). Depending on the speed with which the touch-point 316 is moved, the gesture may be considered a dragging motion or a flicking motion. More particularly, if the control panel 300 recognizes that the motion occurs over a very short time (e.g., less than half a second, less than a quarter second, less than a tenth of a second, etc.), the control panel 300 may determine the gesture is a flick gesture, whereas movements over longer periods of time may be considered dragging gestures. Dragging and flicking gestures may have different associated actions in the display 302 .
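The duration-based distinction between drag and flick described above might be sketched as follows; the time cutoff and the minimum travel distance are illustrative choices, since the disclosure mentions several possible thresholds:

```python
import math

FLICK_MAX_SECONDS = 0.25  # illustrative cutoff; other values are possible

def classify_single_touch(start, end, duration_s, min_distance=20.0):
    """Classify a single-touch-point motion as a 'flick' (fast) or a
    'drag' (slower), based on how long the motion took. Motions that
    travel less than `min_distance` pixels are treated as taps."""
    if math.hypot(end[0] - start[0], end[1] - start[1]) < min_distance:
        return "tap"
    return "flick" if duration_s < FLICK_MAX_SECONDS else "drag"
```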
- FIGS. 15 and 16 illustrate an exemplary action of the control panel 300 in response to a drag or flick gesture.
- the display 302 may show an interface 304 f which may be a map 324 .
- the map 324 may be a map of a room, house, street, neighborhood, city, or other location.
- the interface 304 f may display one or more other objects.
- the interface 304 f may display any object in a security system or home automation system, including views from one or more cameras.
- a panning action may be associated with a drag gesture. Therefore, if the user performs an action similar to the gesture from touch-point 316 to touch-point 316 d in FIG. 14 , and does so at a rate recognized as a drag gesture, the map 324 may pan within the interface 304 f in a corresponding direction.
- an upward drag gesture may cause the map 324 to pan upward and show additional portions of the map 324 , as shown by the change in the view of the display 302 from FIG. 15 to FIG. 16 .
- An opposite, or downward, drag gesture may cause the control panel 300 to pan the map 324 downward in an opposite direction (e.g., from the view in FIG. 16 to the view in FIG. 15 ).
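Panning a map by a drag delta, with the offset clamped so the view never scrolls past the map edges, might look like the following sketch. The coordinate convention (y increasing downward, offset measured from the map's top-left corner) and the sizes are assumptions:

```python
def pan_map(offset, drag_start, drag_end, map_size, view_size):
    """Pan a map view by a drag delta. An upward drag (negative dy in
    screen coordinates) increases the offset and reveals portions
    further down the map, mirroring the FIG. 15 -> FIG. 16 change.
    The offset is clamped to keep the view within the map bounds."""
    dx = drag_end[0] - drag_start[0]
    dy = drag_end[1] - drag_start[1]
    new_x = offset[0] - dx
    new_y = offset[1] - dy
    max_x = max(0, map_size[0] - view_size[0])
    max_y = max(0, map_size[1] - view_size[1])
    return (min(max(new_x, 0), max_x), min(max(new_y, 0), max_y))
```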
- Drag gestures may be in any direction such as, for example, diagonal, horizontal, vertical, linear, non-linear, or curved. Furthermore, in some embodiments, a drag gesture may include changes in direction such that the objects in the display 302 pan or scroll about in real-time with the gesture.
- a drag gesture as shown in FIGS. 15 and 16 may generally be used to pan or scroll and change the position of one or more objects within the interface 304 f of the display 302 .
- actions performed in response to gestures may change the displayed interface itself, rather than the position or other characteristics of objects or images therein.
- FIGS. 17 and 18 illustrate a control panel 300 having a display 302 where the interface may change from a first interface 304 g ( FIG. 17 ) to a second interface 304 h ( FIG. 18 ).
- a flick gesture such as that described with respect to FIG. 14 , may change the displayed interface.
- the first interface 304 g may generally provide information about a sprinkler system managed or monitored using the control panel 300 .
- the first interface 304 g may provide information about a particular zone of a sprinkler system (e.g., flower beds).
- Information such as the current status of the zone, a schedule for the zone if the sprinkler system is automated, an option to change the schedule, an option to start the zone, a map of sprinklers or the coverage area of the zone, information on whether any problems are detected in the zone, and the like may also be displayed.
- a sprinkler system may include multiple zones.
- a second zone may, therefore, also have corresponding information displayed in an associated interface.
- FIG. 18 illustrates a second interface 304 h for displaying information on the second zone (e.g., for a front yard).
- the second zone is shown to be currently turned on, and other information displayed includes the automated schedule, the time remaining in a current program, an option to change the schedule, a map of the zone, and an option to stop the sprinklers of the second zone.
- FIG. 14 provides an example in which a user may rapidly move a contact point on the display 302 from a first touch-point 316 to a second touch-point 316 e . If the control panel 300 of FIG. 17 recognizes the gesture as a flick, the control panel 300 may determine the user wishes to change interfaces rather than scroll objects in the same interface. Thus, the control panel 300 may change from first interface 304 g to second interface 304 h . If the user wants to scroll to an interface corresponding to yet another zone, the user could repeat the flick gesture. A flick gesture in an opposite direction may cause the control panel 300 to move back to a previous interface.
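Flick-driven switching between per-zone interfaces can be modeled as a simple carousel over a list of zones, as in the FIG. 17 to FIG. 18 sprinkler example. The zone names and the clamping-at-the-ends behavior are illustrative assumptions:

```python
class ZoneCarousel:
    """Cycle through per-zone interfaces in response to flick gestures."""

    def __init__(self, zones):
        self.zones = list(zones)
        self.index = 0  # start at the first zone's interface

    def flick(self, direction):
        """A 'left' flick advances to the next zone's interface; a
        'right' flick returns to the previous one. The index is
        clamped at the ends of the zone list."""
        if direction == "left":
            self.index = min(self.index + 1, len(self.zones) - 1)
        elif direction == "right":
            self.index = max(self.index - 1, 0)
        return self.zones[self.index]
```

A vertical flick could similarly index into a separate list of interface types (sprinkler, lighting, HVAC, cameras, and so on).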
- a flick gesture may be generally horizontal, but may also be in other directions. Different directions of flick gestures may have different corresponding actions. For instance, a vertical (up or down) flick gesture may cause the display 302 to change from one type of interface (e.g., sprinkler system) to another type (e.g., system menu, entertainment, lighting, HVAC, cameras, alarms, etc.).
- the tap, press, flick, drag, rotate, pinch and spread gestures described herein, as well as the particular actions associated therewith, are intended to merely illustrate examples of some types of gestures that may be recognized by a control panel of a security system. Additional or other gestures or inputs may also be provided. For instance, combinations of the above may be included. As an example, a gesture may be associated with a combined tap and drag motion. Additional gestures may include double tap gestures or tap and hold gestures. In addition, while the illustrated gestures include one or two touch-points, even more touch-points may be monitored. By way of illustration, the display 302 of the control panel 300 may recognize three or more simultaneous touch-points.
- actions associated with a gesture may also be varied, and need not include or be limited to select, zoom, rotate, pan, scroll and interface change actions.
- Other actions associated with gestures may include deleting, copying, moving, bundling, closing, opening, drawing, and other actions.
- One characteristic of buttons, toggles, switches, and other mechanical controls is that as the control is activated, there is generally an associated movement of a portion of the control (e.g., a depressed button physically moves downward). This movement can be sensed by a user, and the user obtains a real-time, tactile response that an action has been carried out.
- In contrast, touch-screens and non-mechanical buttons (e.g., pressure-sensitive or capacitive contact surfaces), which do not rely on movement of mechanical components, generally do not provide such a response. Instead, the user must wait for a visual or audio confirmation that an action has been received.
- FIG. 19 illustrates an example in which the control panel 300 may provide a user with a haptic response and tactile feedback even when non-mechanical input components are used.
- the display 302 of the control panel 300 may show an interface 304 i .
- the user may touch the display 302 to provide an input (e.g., to enter a PIN or other credentials to arm or disarm an entrance, to select an available menu option, etc.).
- the control panel 300 may include a tactile output component (see FIG. 2 ) configured to respond when input is received at the display 302 . For instance, a sensor may detect contact on the display 302 and communicate with the tactile output component.
- the tactile output component may vibrate, rotate, move, or otherwise provide a mechanical stimulus. For instance, a vibration may be generated causing the entire control panel 300 to vibrate as reflected by the arrows and motion lines in FIG. 19 .
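The tactile output path might be sketched as a handler that fires a short vibration pulse whenever touch input is detected at the display. The `haptics` interface, the event shape, and the pulse length are hypothetical, chosen only to illustrate the sensor-to-actuator flow described above:

```python
def on_touch(event, haptics, pulse_ms=50):
    """When a touch event arrives from the display sensor, trigger a
    short vibration through the tactile output component so the user
    feels immediate confirmation of the input. `haptics` is any object
    exposing a `pulse(ms)` method (a hypothetical interface)."""
    if event.get("type") in ("tap", "press"):
        haptics.pulse(pulse_ms)
        return True
    return False
```

In practice the same handler could be shared by the display and any non-mechanical buttons, with a per-component pulse length to differentiate them.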
- Vibration of the control panel 300 in response to input at the display 302 is merely one example of haptic feedback.
- the display 302 may move or rotate.
- An example may include making the entire display 302 movable relative to the control panel 300 so that when the display 302 is touched, the display 302 moves.
- haptic feedback may be provided in response to input at the optional buttons 308 , 310 .
- input at the buttons 308 , 310 , which are separate from the display 302 , may result in haptic feedback.
- the haptic feedback for the buttons 308 , 310 may be the same as that provided for input at the display 302 , or different therefrom.
- haptic feedback may be provided for one, but not both, of the buttons 308 , 310 .
- Embodiments of the present disclosure may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory.
- Embodiments within the scope of the present disclosure also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
- Computer-readable media that store computer-executable instructions are physical storage media.
- Computer-readable media that carry computer-executable instructions are transmission media.
- embodiments of the disclosure may comprise at least two distinctly different kinds of computer-readable media, including at least computer storage media and/or transmission media.
- Examples of computer storage media include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash-based storage, solid-state storage, or any other non-transmission medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- a “communication network” may generally be defined as one or more data links that enable the transport of electronic data between computer systems, modules, engines, and/or other electronic devices. Transmission media can include a communication network and/or data links, carrier waves, wireless signals, and the like, which can be used to carry desired program or template code means or instructions in the form of computer-executable instructions or data structures within, to, or from a communication network. Combinations of storage media and transmission media should also be included within the scope of computer-readable media.
- program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to storage media (or vice versa).
- program code means in the form of computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system.
- computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer (e.g., a security system control panel), or special purpose processing device to perform a certain function or group of functions.
- the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- embodiments may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, programmable logic machines, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, tablet computing devices, minicomputers, security system control panels, security system network operations centers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
- Embodiments may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
- program modules may be located in both local and remote memory storage devices.
- embodiments of the present disclosure may be practiced in special-purpose, dedicated or other computing devices integrated within or particular to a particular residence, business, company, government agency, or other entity, and that such devices may operate using a network connection, wireless connection, or hardwire connection. Examples may include residential or commercial buildings in connection with security systems configured to monitor local conditions (i.e., at the same building or location), remote conditions (i.e., at a different building or location), or some combination thereof.
Description
- The present Application is a continuation of U.S. patent application Ser. No. 14/211,018, titled “Home Security System with Touch-Sensitive Control Panel,” filed on Mar. 14, 2014, which claims priority to U.S. Provisional Patent Application No. 61/791,142, titled “Home Security System with Touch-Screen Interface,” filed on Mar. 15, 2013. The disclosures of each application listed above are incorporated by reference herein in their entireties.
- The present disclosure relates to electronically controlled systems. More particularly, embodiments of the present disclosure relate to control panels and interfaces for use with electronically controlled systems, including automation systems in a home or commercial setting. More particularly still, embodiments of the present disclosure relate to touch-screen interfaces and control panels that are intuitive to use and operate.
- At an increasing rate, people are purchasing electronic devices and appliances that can be used in connection with certain functions of residential or commercial buildings. Many of these devices are used separately for one purpose or another. For instance, home security systems, lighting control systems, security cameras, sprinkler systems, telephone systems, entertainment (e.g., audio, video, television, etc.) systems, heating, ventilation, and air conditioning (HVAC) systems, and the like may each be controlled electronically using various controllers.
- In some cases, operation of multiple devices and systems may be collectively managed using automated control systems. Examples may include universal remote controls for operating multiple types of entertainment systems, as well as home security and thermostat systems that can control HVAC and security of a residential or commercial building.
- Whether controls for electronic devices are for individual devices or for centralized control of multiple devices, user interfaces may be provided to enable a user to interact with the electronic device(s). A remote control may, for instance, allow a user to power on or off multiple electronic entertainment devices, or to change volume, input sources, audio/video quality, and the like. In the context of a security and thermostat system, the interface may allow a user to arm or disarm a security device or change a home heating/cooling scheme or temperature, among other features.
- Such interfaces typically operate using a combination of mechanical and electrical components. For instance, the user interface may include a control panel having a set of buttons (e.g., numerical, alphabetic, task-specific, etc.). The buttons, when depressed, can cause an actuator to generate an electronic signal. That signal may be transferred through circuitry in the control panel and/or to a device being monitored or controlled. An air conditioning unit or furnace may, for instance, be turned on or off in response to a user pressing a button. Similarly, an alarm on a door or window may be armed or disarmed depending on the button that is depressed.
- Buttons on a control panel, while useful for certain functions, may limit the overall operation or intuitive operation of the control panel itself. In particular, each button generally has one or two primary functions, and added functions may require advanced operations that can require the user to consult a menu or operation manual, which operations are non-intuitive. Such limited functions for the control buttons may make the control panel difficult to use. In addition, a button-based interface may make a control panel impractical for centralized use in controlling many different systems and components. This may be the case because the control panel may either require a large number of buttons, or because limited buttons may have multiple different associated operations, making it difficult for the user to learn how to use the control interface.
- In accordance with aspects of the present disclosure, embodiments of methods, systems, software, computer-program products, and the like are described which relate to use of a security system for a residential or commercial location. The security system includes a control panel monitoring operation and/or status of a sensor for detecting a security condition at the residential or commercial location. Input related to the security system may be provided at the control panel, which control panel may include a touch-sensitive display. The touch-sensitive display may receive and detect multiple simultaneous touches and perform actions based on the multiple touches received.
- In accordance with some embodiments, the control panel may be equipped to differentiate between types of touches. The control panel may identify respective locations of each of multiple touches. Gestures associated with one or more of the multiple touches may additionally or alternatively be identified. The number of multiple touches may also be determined. Types of touches detected may include a tap, double tap, drag, flick, rotate, spread, or pinch, among other things.
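The differentiation between touch types described above can be pictured with a brief sketch. This is a hypothetical illustration only (the function name, pixel and timing thresholds, and coordinate convention are assumptions, not taken from this disclosure): a touch trace that barely moves is treated as a tap or double tap, while a moving trace is split into flick or drag by how quickly the stroke completes.

```python
import math

# Hypothetical sketch; names and thresholds are assumptions, not from the
# disclosure. Classifies a single touch-point trace into some of the touch
# types a control panel might differentiate.
TAP_MAX_MOVE_PX = 10      # movement below this is treated as a tap
FLICK_MAX_SECONDS = 0.25  # quick strokes are flicks; slower ones are drags
DOUBLE_TAP_WINDOW = 0.3   # two taps within this interval form a double tap

def classify_touch(start_xy, end_xy, duration_s, prev_tap_age_s=None):
    """Return 'tap', 'double_tap', 'flick', or 'drag' for one touch trace."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    distance = math.hypot(dx, dy)
    if distance < TAP_MAX_MOVE_PX:
        if prev_tap_age_s is not None and prev_tap_age_s <= DOUBLE_TAP_WINDOW:
            return "double_tap"
        return "tap"
    # Moving touches are split by how quickly the stroke completed.
    return "flick" if duration_s <= FLICK_MAX_SECONDS else "drag"

print(classify_touch((100, 100), (102, 101), 0.1))  # tap
print(classify_touch((100, 100), (300, 100), 0.1))  # flick
print(classify_touch((100, 100), (300, 100), 0.8))  # drag
```

A full gesture recognizer would also track multiple simultaneous traces for pinch, spread, and rotate; this sketch covers only single-touch types.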
- Based on the types and/or numbers of simultaneous touches, the control panel may perform any of a number of different actions. Actions may include pan, zoom, scroll, rotate, and other actions. In some embodiments, the control panel may include a haptic response component. The haptic response component may provide a tactile response to input received at the control panel. A tactile response may be provided based on input received at the touch-sensitive display or at components separate from the display. Some components separate from the display may include a home and/or emergency input.
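One hedged way to picture performing actions based on the types and numbers of simultaneous touches is a lookup table from a (gesture, touch count) pair to an action, with a haptic pulse issued whenever a recognized action fires. The particular mapping and names below are illustrative assumptions; the disclosure does not fix any specific assignment:

```python
# Illustrative sketch only; the gesture-to-action mapping and the names
# are assumptions, not from the disclosure.
ACTIONS = {
    ("drag", 1): "pan",
    ("flick", 1): "scroll",
    ("spread", 2): "zoom_in",
    ("pinch", 2): "zoom_out",
    ("rotate", 2): "rotate_view",
}

def dispatch(gesture, touch_count, haptics_enabled=True):
    """Map a recognized gesture to a panel action plus optional haptic cue."""
    action = ACTIONS.get((gesture, touch_count), "ignore")
    feedback = "haptic_pulse" if haptics_enabled and action != "ignore" else None
    return action, feedback

print(dispatch("spread", 2))                      # ('zoom_in', 'haptic_pulse')
print(dispatch("tap", 3, haptics_enabled=False))  # ('ignore', None)
```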
- Multiple components and systems may be monitored by a control panel within a security system. The security system may include intrusion, fire, flood, or carbon monoxide sensors. The security system may also include a camera. If a camera is provided, still or video images may be provided and displayed on the control panel. A map or camera image on the display may be manipulated using the touch-screen, which is capable of recognizing multiple simultaneous touches. Input may be used to rotate, pan, scroll, zoom in, or zoom out on such images.
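As a minimal sketch of the zoom manipulation (function names and zoom limits are assumptions added for illustration), the magnification of a map or camera image can be scaled by the ratio of the distances between two touch-points before and after the gesture:

```python
import math

# Minimal sketch with assumed names and limits: derive a new magnification
# level for a displayed map or camera image from the change in distance
# between two simultaneous touch-points.
MIN_ZOOM, MAX_ZOOM = 1.0, 8.0

def _distance(p, q):
    return math.hypot(q[0] - p[0], q[1] - p[1])

def apply_pinch_zoom(zoom, start_points, end_points):
    """Spreading the touch-points apart zooms in; pinching zooms out."""
    scale = _distance(*end_points) / _distance(*start_points)
    return max(MIN_ZOOM, min(MAX_ZOOM, zoom * scale))

# Two fingers spread from 100 px apart to 200 px apart: zoom doubles.
print(apply_pinch_zoom(2.0, [(100, 100), (200, 100)], [(50, 100), (250, 100)]))  # 4.0
```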
- In accordance with other embodiments of the present disclosure, a security system is described. The security system may include a security component, such as a sensor, for detecting a potential security threat. A control panel may have a communication link with the security component and may monitor a status thereof. A display may be included in the control panel and can display information related to the status of the security component. The display may be touch-sensitive to receive and recognize input with multiple, simultaneous touches. The display may differentiate between touches, including gestures associated therewith. The security system may include a haptic response component to provide tactile feedback when input is received at the control panel.
- Embodiments disclosed herein relate to methods for providing security at a residential or commercial location. Security may be provided using a control panel dedicated for use at a particular location. The control panel may monitor a security system including at least one sensor. Information relating to the operation or status of the sensor may be displayed on the control panel. Input may also be received at the control panel, which input may include multi-touch input on a touch-sensitive display. The touch-sensitive display may be a capacitive touch-screen.
- When providing security, a number of different types of sensors may be used. Sensors may detect intrusion, and can include cameras. A camera may provide still or video images that can be displayed on the display of the control panel. A map or other image may also be provided from a camera or other device and displayed on the control panel. Using multi-touch and/or gesture recognition, the control panel may select, open, close, delete, zoom, pan, scroll, change, rotate, or otherwise manipulate an image or object on the display.
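The rotate manipulation mentioned above can be illustrated by computing a signed angle between two touch-points. The helper below is an assumption added for illustration, not part of the disclosure; note also that the sign convention shown is for standard mathematical coordinates and would flip on a screen where y grows downward:

```python
import math

# Hedged sketch (helper name is an assumption): decide how far, and in
# which direction, to rotate an on-screen map when two touch-points move
# along a curved path.
def rotation_degrees(start_points, end_points):
    """Signed angle between the two inter-touch vectors; positive means
    counter-clockwise in standard math coordinates."""
    (ax, ay), (bx, by) = start_points
    (cx, cy), (dx, dy) = end_points
    v1 = (bx - ax, by - ay)
    v2 = (dx - cx, dy - cy)
    # atan2 of the cross and dot products gives the signed angle between vectors.
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.atan2(cross, dot))

# Touch-points rotated a quarter turn: a horizontal pair becomes vertical.
angle = rotation_degrees([(0, 0), (100, 0)], [(0, 0), (0, 100)])
print(round(angle))  # 90
```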
- Further embodiments may relate to security systems included as part of home or other automation systems. A control panel of an automation system including security features or systems may also, for instance, monitor entertainment systems within a residential or commercial location. Additional or alternative systems monitored or controlled using the control panel may include HVAC systems (e.g., thermostat, heating, air conditioning, etc.), lighting systems, sprinkler systems, and telephone systems.
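A hypothetical sketch of such centralized monitoring (all names here are assumptions): the control panel could keep a registry of subsystem modules, echoing the audio, HVAC, lighting, security, sprinkler, telephone, and video modules described with respect to FIG. 2, and route status queries to whichever module owns a given subsystem:

```python
# Sketch under assumed names: a registry of subsystem modules that a
# control panel could consult when reporting status for a subsystem.
class Module:
    def __init__(self, name):
        self.name = name

    def status(self):
        # A real module would query its sensors or appliances here.
        return f"{self.name}: ok"

REGISTRY = {name: Module(name) for name in
            ("audio", "hvac", "lighting", "security", "sprinkler",
             "telephone", "video")}

def panel_status(subsystem):
    """Route a status query to the owning module, if one is registered."""
    module = REGISTRY.get(subsystem)
    return module.status() if module else f"{subsystem}: unknown"

print(panel_status("security"))  # security: ok
print(panel_status("vacuum"))    # vacuum: unknown
```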
- Other aspects, as well as the features and advantages of various aspects, of the present disclosure will become apparent to those of ordinary skill in the art through consideration of the ensuing description, the accompanying drawings and the appended claims.
- In order to describe the manner in which features and other aspects of the present disclosure can be obtained, a more particular description of certain subject matter will be rendered by reference to example embodiments illustrated in the appended drawings. These drawings depict only example embodiments and therefore are not considered to be limiting in scope, nor drawn to scale for all embodiments. Various embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 is a schematic diagram of an example security system, according to one example embodiment of the present disclosure;
- FIG. 2 is a schematic diagram of an exemplary control panel of the security system of FIG. 1;
- FIGS. 3 and 4 illustrate an example control panel for use within a security system;
- FIG. 5 is an exemplary control panel configured to receive and recognize expand gestures;
- FIG. 6 illustrates an exemplary control panel configured to receive and recognize pinch gestures;
- FIGS. 7 and 8 illustrate an exemplary control panel configured with zooming functions on a map;
- FIGS. 9 and 10 illustrate an exemplary control panel allowing zooming functions on an image;
- FIG. 11 illustrates an exemplary control panel configured to receive and recognize rotate gestures;
- FIGS. 12 and 13 illustrate an exemplary control panel allowing rotate functions;
- FIG. 14 illustrates an exemplary control panel configured to receive and recognize drag and/or flick gestures;
- FIGS. 15 and 16 illustrate an exemplary control panel allowing pan functions;
- FIGS. 17 and 18 illustrate an exemplary control panel allowing flick functions; and
- FIG. 19 illustrates an exemplary control panel providing a haptic response.
- Systems, devices and methods according to the present disclosure are configured for use in connection with residential and/or commercial security systems. Without limiting the scope of the present disclosure, a home or business may use a security system for added safety of residents or patrons, or to protect valuable property. Optionally, the security system may also include capabilities for operating in connection with other systems. For example, systems under centralized control with the security system may include lighting components, sprinkler systems, HVAC components, audio/video systems, and the like.
- Turning now to FIG. 1, an embodiment of an exemplary security system is shown, including a distributed system 100 for managing and monitoring security-related issues of a residence or business. The operation of the system 100 may include a network 102 facilitating communication between a network operations center 104 and one or more control panels 106.
- The network 102 may be capable of carrying electronic communications. The Internet, local area networks, wide area networks, virtual private networks (VPN), other communication networks or channels, or any combination of the foregoing may be represented by the network 102. Thus, it should be understood that the network 102 may operate in any number of different manners, and may include different components which may be distributed at different locations. For instance, the network 102 may include a wireless communication system provided by a mobile phone provider, although wired communication may also be used. Moreover, while a single network 102 is illustrated, such a component may be illustrative of multiple devices or components. For instance, the network 102 may include multiple networks interconnected to facilitate communication.
- The network operations center 104 may monitor the operation of the control panel 106, which may be associated with a security system. For instance, the network operations center 104 may monitor the control panel 106 to ensure it is operating and communicating properly, to update software or firmware on the control panel 106, and the like. In addition, the network operations center 104 may monitor signals received by the control panel 106. For instance, if a control panel 106 receives a signal indicative of a breach at an armed door or window of a building, the network operations center 104 may be notified of such an event over the network 102. The network operations center 104 may then perform some security-related function (e.g., notify the police, make a telephone call to the owner of the building, etc.). Of course, the network operations center 104 may provide any number of other functions, and may be distributed among multiple devices, components or facilities.
- In at least one embodiment, the control panel 106 may be located at, or otherwise associated with, a particular location such as a home or business. At the respective locations, users may manually operate the control panel 106 to provide security-related functions. In the same or other embodiments, electronic devices remote from the control panel 106 may send signals over the network 102 to control operation of the control panel 106 so that manual operation at the control panel 106 is not required.
- The
control panel 106 may monitor the operations of a number of different systems, components or appliances. As shown in FIG. 1, example components monitored by the control panel 106 may include an entertainment system 112, an HVAC system 114, a lighting system 116, a security system 118, a sprinkler system 120 and/or a telephone system 122. Other components, appliances, systems, and the like may also be provided, as indicated by the illustrated ellipses. Furthermore, the types of security and/or automation systems 124 monitored by additional control panels may include the same or other components.
- The system 100 of the present disclosure is implemented as a communication system in which the operations of electronic components may be monitored through communication links. The communication links may be wired or wireless, or may include a combination of wired and wireless links. Regardless of the particular mode of communication, the status or operation of devices and components within the system 100 may be reported or otherwise communicated to a corresponding control panel 106, the network operations center 104, or one of the electronic devices.
- For example, the entertainment system 112 may include components such as televisions, recordable media players (e.g., DVD player, Blu-Ray player, digital video recorders, VCR, etc.), projectors, speakers, stereos, and the like. The HVAC system 114 may include thermostats, air conditioners, furnaces, temperature sensors, etc. The lighting system 116 may include light fixtures, switches, sensors (e.g., motion sensors), or additional components. The security system 118 may include sensors and/or detectors (e.g., motion sensors, magnetic sensors, intrusion sensors, vibration sensors, infrared sensors, ultrasonic detectors, microwave detectors, contact sensors, photoelectric beam detectors, smoke detectors, temperature sensors, carbon monoxide detectors, etc.), video or still cameras, speakers, microphones, or other components. The sprinkler system 120 may include valves, actuators, sensors (e.g., flow rate sensors, proximity sensors, etc.), sprinklers, pumps, and the like. The telephone system 122 may include telephones, answering machines, call forwarding components, and the like.
- The system 100 is illustrative of an example system that may provide distributed functionality. As discussed above, for instance, electronic devices may communicate with the control panel 106 and/or the network operations center 104. For instance, a control panel 106 may be at a home. The home owner may be away from home and may use electronic devices to communicate over the network 102 with the control panel 106. The user may provide input to the electronic devices, which input the control panel 106 sends to components of the security system 118, HVAC system 114, lighting system 116, or the like. Alternatively, the control panel 106 may provide information to the electronic devices. For instance, the security system 118 may provide video or still images that can be communicated over the network 102 to allow the user of the electronic devices to view them. Similar information may be provided to the network operations center 104, or the network operations center 104 may provide control signals to the control panel 106.
- The system 100 of FIG. 1 is but one example of a suitable system for embodiments of the present disclosure. In the illustrated embodiment, functions may occur at one or more locations (e.g., sensor monitoring at the control panel 106, alarm/alert handling at the network operations center 104, etc.). In other embodiments, data may be processed or interpreted in other manners. Further, components within the system 100 may communicate continuously, by using pushed communications, by using pull communications, or using some other communication system or any combination of the foregoing.
- The various components, systems and devices of the
system 100 may have varying capabilities, processing power, storage abilities, and the like. FIG. 2 illustrates one example embodiment of a control panel 200 that may be used in the system 100; however, it should be appreciated that control panels may include any number of different features, components or capabilities. Thus, FIG. 2 and the description thereof should not be considered limiting of the present disclosure.
- In FIG. 2, the control panel 200 may include multiple components interacting together over one or more communication channels. In one embodiment, one or more processors 202 may communicate with input/output devices 204, a communication interface 206, memory 208 and/or a mass storage device 210 via a communication bus 212. The processors 202 may generally include one or more processing components, including a central processing unit, a graphics processing unit, or the like, any of which may be capable of executing computer-executable instructions received or stored by the security control panel 200.
- The processors 202 may communicate with the input/output devices 204 using the communication bus 212. The input/output devices 204 may include various components, including a touch-sensitive display 214, one or more sensors 216, tactile output components 218, one or more ports 220, or other components. Such components may include, for instance, buttons or keypads, a mouse, scanners, printers, cameras, global positioning system (GPS) units, biometric input systems (e.g., iris scanners, fingerprint readers, etc.), other components, or any combination of the foregoing. The communication interface 206 may receive or transmit communications via a network (e.g., network 102 of FIG. 1), and received communications may be provided over the communication bus 212 and processed by the one or more processors 202.
- The security control panel 200 may also include memory 208 and mass storage device 210. In general, the memory 208 may include both persistent and non-persistent storage, and in the illustrated embodiment, the memory 208 is shown as including random access memory (RAM) 222 and read only memory (ROM) 224. Other types of memory or storage may also be included. The mass storage device 210 may generally be comprised of persistent storage in a number of different forms. Such forms may include a hard drive, flash-based storage, optical storage devices, magnetic storage devices, or other forms which are either permanently or removably coupled to the security control panel 200. In some embodiments, the mass storage device 210 may store an operating system 226 defining the general operating functions of the security control panel 200. In some embodiments, the operating system 226 may be executed by the processors 202. Other components stored in the mass storage device 210 may include drivers 228 (e.g., to facilitate communication between the processors 202 and the input/output devices 204), a browser 230 (e.g., to access or display information obtained over a network, including mark-up pages and information), a gesture application 232 (e.g., for use with the touch-sensitive display 214), and application programs.
- Application programs may generally include any program or application used in the operation of the security control panel 200. Examples of application programs may include modules specifically designed for a home security and/or automation system (e.g., security application 234), or more general use applications. Examples of more general use applications may include word processing applications, spreadsheet applications, games, calendaring applications, weather forecast applications, sports scores applications, and other applications.
- As shown in
FIG. 2, in at least one embodiment, the security application 234 may include applications or modules capable of being used by the security control panel 200 in connection with a security or automation system. The modules 236-248 may provide similar functions, but for different systems monitored using the security control panel 200. For instance, the security application 234 may include an audio module 236. The audio module 236 may generally control how audio components of a security and/or automation system operate. Such audio components may be part of an entertainment system (e.g., speakers for a television or stereo), a security system (e.g., an audible alarm), a telephone system (e.g., an intercom or speaker phone), or any other system. The audio module 236 may also allow audio communication (e.g., between different control panels, between the security control panel 200 and a network operations center 104, or between the control panel 200 and a telephone, etc.).
- An additional application or module within the security application 234 may include an HVAC module 238. The HVAC module 238 may control, monitor or interface with an HVAC system which may include a thermostat, air conditioner, furnace, hot water heater, or other similar components. A lighting module 240 may have similar functions, but may control, monitor or interface with lighting components including switches, lighting fixtures, and the like.
- In some embodiments, the security module 242 may control, monitor, or interface with security-related components such as intrusion detection components, cameras, global positioning system (GPS) components, and safety components (e.g., fire, flood, carbon monoxide or radon detectors). The sprinkler module 244 may automate a sprinkler system, monitor operation of the system (e.g., verify water flow rates at one or more locations), and the like. The telephone module 246 may interface with a telephone system. For instance, if a user is away from a residential or commercial location, the telephone module 246 may communicate with the telephone system to automatically forward calls, route them to another person, or the like. A video module 248 may be used in connection with video functions within a security and/or automation system. The video module 248 may monitor video feeds from security cameras, interface with video entertainment devices, or provide other video-related functions, or any combination of the foregoing.
- As also shown in FIG. 2, the security application 234 optionally includes a remote access module 250. The remote access module 250 may allow the security control panel 200 to be accessed using remote devices (e.g., the electronic devices of FIG. 1), and to potentially have communications relayed through the control panel 200 either from or to the remote device. Thus, a user of a remote device could potentially set or view audio, HVAC, lighting, security, sprinkler, telephone, video or other settings remotely, and/or monitor audio and/or video feeds from the secured location. The security application 234 may also include additional or other modules or components, including authentication, settings, preferences, emergency override, updating, and other modules.
- In one embodiment, the control panel 200 of FIG. 2 may provide an intuitive interface by which a user may monitor or control one or more systems within a residential or commercial location. For example, the security application 234 (and the modules 236-248) may be monitored using the touch-sensitive display 214. For instance, the touch-sensitive display 214 may include a capacitive, pressure sensitive, or other touch-screen display. Each module 236-250 of the security application 234 may provide information which may be displayed on the touch-sensitive display 214 for operation of a corresponding system or device. In some embodiments, the touch-sensitive display 214 may be operated in connection with a gesture application 232 to allow multiple touches, gestures, and other commands to be received directly at the touch-sensitive display 214.
-
FIG. 3 illustrates an example of a control panel 300 incorporating a display 302 that may be operated in accordance with embodiments of the present disclosure. The display 302 may display an interface 304a. The interface 304a may include a view presented by software, firmware, or other components stored on computer readable media in the control panel 300, or otherwise accessible thereto.
- The control panel 300 may further include additional components. Examples of additional components, illustrated in FIG. 3, may include an audio component 306 and input buttons 308, 310. The audio component 306 may include an optional speaker and/or microphone component. For instance, the audio component 306 may allow a sound to be presented when a certain event occurs. For example, an alarm may sound when the control panel 300 is notified of a breach at an armed door, window or other location. As another example, the control panel 300 may be used in a home entertainment context. Music or other entertainment programming may be accessed and audio data may be provided through the audio component 306.
- In the same or other embodiments, the audio component 306 may act as an intercom. A person wishing to enter the residential or commercial location may speak into a microphone, and that information can be transmitted to the control panel 300 and output via the audio component 306. Additional two-way communications may be provided. For instance, the button 308 may be an "emergency" button designed to be pressed when an emergency occurs. In some cases, pressing the button 308 may cause the control panel 300 to contact a remote party such as an emergency response provider (e.g., police, fire, medical, hospital, etc.) or a network operations center. Two-way communication with the remote provider may be facilitated by the audio component 306 as well as by communication systems (e.g., telephone connections, wireless communication, VOIP, etc.) within the control panel 300.
- In some embodiments, an additional button 310 may also be provided as shown on the control panel 300 in FIG. 3. In this embodiment, the button 310 is a "home" button. When pressed, the button 310 may trigger a response where the display 302 illustrates a predetermined, home interface. As an example, the interface 304a may be a home interface. As a user navigates through different interfaces and views, the button 310 may therefore return the user to the interface 304a.
- The buttons 308, 310 are merely illustrative, and the control panel 300 may include additional or other interfaces. For instance, additional, different or fewer buttons may be provided, or different types of inputs (e.g., switches, toggles, etc.) may be provided. Additionally, in the control panel 300 of FIG. 3, the display 302 may act as an additional input mechanism. More particularly, the display 302 may be touch-sensitive (e.g., a touchscreen). The display 302 may recognize a touch from a user, stylus, or the like. When received, the location and/or manner in which the touch is received may trigger a response. As an illustration, the interface 304a displays two selectable options, namely a security option 312 and a home services option 314. If the user taps or otherwise touches the display 302 at a location corresponding to the position of the security option 312, an additional interface may be presented to allow the user to select, view, or change security settings (e.g., to arm a security component). An example security interface is shown in FIG. 4, as interface 304b. Alternatively, by tapping or otherwise touching the display 302 at a location corresponding to the position of the home services option 314, additional options may be presented. Examples of additional home services options may include options identified in FIG. 2 and the discussion related thereto, as well as in other portions of this disclosure.
- The
display 302 of FIGS. 3 and 4 may be configured to recognize any number of different types of inputs. For instance, as discussed herein, a user may tap the display 302 to select a corresponding option (e.g., options 312, 314 in FIG. 3 or the "arm", "menu" or "status" options in FIG. 4). Further inputs recognized by the display 302 may include multiple, simultaneous touches of the display 302 and/or gestures. Gestures may generally include a touch of the display 302 (e.g., a touch-point), optionally accompanied by a corresponding movement of the finger, stylus or other element touching the display 302. Gestures may be combined with multiple touch-points or may be used with a single touch-point of the display 302.
- In some embodiments, as shown in FIGS. 5 and 6, multiple touch-point and gesture inputs may be recognized when the display 302 of the control panel 300 is touched. In FIG. 5, for instance, a user may use their fingers to touch the display 302 at two touch-points 316, 318. Upon touching the display 302, the user may then move their fingers in a predetermined manner recognizable by the control panel 300 (e.g., using a gesture recognition application as part of an operating system or as a separate application as shown in FIG. 2).
- In exemplary embodiments as shown in FIGS. 5 and 6, the predetermined patterns of gestures may include spreading and pinching gestures. In particular, the spreading gesture, shown in FIG. 5, may include touching the display 302 at touch-points 316, 318 and moving the touch-points apart, as represented in FIG. 5 using the illustrated arrows. The touch-points 316, 318 may thus move apart from each other. FIG. 6 illustrates effectively the opposite gesture. In FIG. 6, the user may touch the display 302 at touch-points 316, 318 and move them toward each other, as represented in FIG. 6 using the arrows and the moved touch-points shown at touch-points 316b and 318b.
- When multiple touch-points and gestures are detected, the control panel 300 may recognize the gesture, identify an action associated with the gesture, and perform the action. Performance of the action may be dependent upon the gesture; however, the action may also be dependent on the touch-points. For instance, depending on the location of the touch-points on the display 302, the action associated with the gesture may affect all objects or some objects (e.g., a single object).
-
FIGS. 7-10 illustrate exemplary actions that may be performed based on the gestures performed in FIGS. 5 and 6. For example, FIG. 7 illustrates interface 304c on the display 302 of the control panel 300. The interface 304c may display a map 320. The map 320 may be a floor plan or other similar map of a residential or commercial building, although a map may be of other locations (e.g., an entire lot, a neighborhood, a city, etc.). The map 320 may serve any of a number of purposes. In some embodiments, for instance in FIG. 7, the map 320 may show which building entrances (e.g., windows or doors) associated with a security device may be armed, disarmed, or otherwise configured. In other embodiments, the map 320 may have other purposes, such as locating a particular person, pet, set of keys, etc. (e.g., using GPS tracking or other mechanisms).
- In some embodiments, as shown in FIG. 7, the control panel 300 and interface 304c may display an initial presentation of the map 320. If, however, the user touches the display 302 and performs a gesture (e.g., the gesture in FIG. 5), the map 320 may change within the interface 304c. As an example, by touching two or more places on the display 302 and then spreading the touch-points, a user may zoom in on the map 320. FIG. 8 illustrates an example in which the control panel 300 causes the interface 304c to zoom in on the map 320.
- Alternatively, the user may zoom out from the map 320 in the interface 304c to a desired magnification level. In some embodiments, such an action may be performed by using an opposite gesture (e.g., the pinch gesture of FIG. 6). In either case, the gesture performed may be repeated multiple times. Repeating a spread gesture may, for instance, allow the magnification level to be repeatedly increased on an object or set of objects within the interface 304c. Similarly, repeating a pinch gesture may repeatedly decrease the magnification level. An example of zooming out on an object is shown in going from the view of map 320 in FIG. 8 to the view in FIG. 7.
- In further embodiments, a gesture may be used to zoom in or zoom out on an object. More particularly, in some embodiments as shown in
FIGS. 9 and 10, interface 304d may be displayed on the display 302 of the control panel 300. The interface 304d may be a view from a camera (e.g., a security camera). The camera may provide a view illustrating a particular object, room, or other location. In this embodiment, FIG. 9 illustrates a view of a particular room.
- If the user wants a closer view of the particular object or location, the user may touch the display 302 and perform a gesture corresponding to a zoom-in action (e.g., the spreading gesture of FIG. 5, double-tapping the display, etc.). Similarly, by performing a gesture corresponding to a zoom-out action (e.g., the pinching gesture of FIG. 6), the control panel 300 may zoom out and provide a broader view of a location. FIG. 10 illustrates an example where the camera feed is zoomed out, which exposes the room shown in FIG. 9 as well as an outdoor area. Of course, the displayed location is merely illustrative, and a camera feed may be located at any location inside or outside a residential building, commercial building, or other premises to provide a number of different views.
- The ability to zoom in and zoom out on the display 302 may allow a user to see a number of different objects, whether the interface displays a camera feed, map, or some other object. For instance, a user searching for his keys may zoom in on various locations within a building to find and see the keys from the display 302. Similarly, a user may search for a person carrying a GPS-equipped phone or a pet carrying a GPS transponder. The user could zoom out in a map or camera feed to find the general location of the person or pet, and then zoom in on the location to find more precise coordinates.
- In some embodiments, performing a gesture may change the magnification level as shown in FIGS. 7-10. Such magnification may be performed directly at the control panel 300 by, for instance, increasing the magnification of an object or set of objects within the display 302. In other embodiments, however, the gestures performed may cause the control panel 300 to perform another action. In the example of a camera feed, for instance, when a gesture corresponding to a zoom function is received, the control panel 300 may communicate with the security camera to cause the camera itself to zoom in or out. Similar capabilities may be provided to allow a user to move the focal point of the camera around (e.g., using a drag gesture as described hereafter with respect to FIGS. 14-16). Additional gestures may even provide the ability to change between map views or camera feeds (e.g., using a flick gesture as described with respect to FIGS. 14, 17 and 18). The pinch and spread gestures may be associated with other functions as well.
- In some embodiments, other gestures may manipulate the interface of the display of the
control panel 300.FIG. 11 , for instance, illustrates another example embodiment in which a user may touch thedisplay 302 of thecontrol panel 300 at touch-points FIG. 11 illustrates a rotational movement in which the touch-points - The rotational gesture illustrated in
FIG. 11 may be associated with one or more actions that may change the size, shape, type, number or other characteristics of the objects displayed in the display 302. For instance, FIG. 12 illustrates the display 302 as including an interface 304 e having thereon a map 322. In some instances, the map 322 may be a neighborhood map. If the user touches the display 302 and performs a particular gesture (e.g., the rotational gesture of FIG. 11), the map 322 may change. FIGS. 12 and 13 illustrate an example in which the map 322 rotates clockwise from a horizontal position (FIG. 12) to a vertical position (FIG. 13) as a result of a rotational gesture. - In accordance with some embodiments, a rotational gesture may cause an interface of the
display 302 to change and rotate one or more objects in a predetermined direction. The direction may always be the same, or may be based on the gesture. For instance, rather than rotating one or more touch-points clockwise, the user may rotate the touch-points in a counter-clockwise direction. As a result, the map 322 could rotate in a counter-clockwise direction. - In view of the disclosure herein, one skilled in the art should also appreciate that the gesture of
FIG. 11 could perform other actions, or could be replaced with other gestures to rotate an object. As an example, a user could touch the display 302 at a single point and move the touch-point along a curved path. The curved path could indicate a desire to rotate an object. In other embodiments, more than two touch-points may be used. In still further embodiments, two touch-points may be used; however, only one touch-point may be moved in a rotational direction (e.g., the second touch-point is held in place). -
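The two-point gestures discussed above (pinch and spread for zoom, rotation for turning an object such as the map 322) both reduce to geometry on the pair of touch-point trajectories: the change in distance between the points gives a magnification factor, and the change in angle of the line joining them gives a signed rotation. A minimal Python sketch; the function and variable names are illustrative, not taken from the patent:

```python
import math

def two_touch_transform(p1_start, p2_start, p1_end, p2_end):
    """Derive zoom and rotation from two touch-point trajectories.

    Spreading the fingers yields a zoom factor > 1, pinching < 1.
    The angle is the signed rotation (degrees) of the line joining
    the two touch-points, positive counter-clockwise in standard
    x/y coordinates, normalized to [-180, 180).
    """
    vx0, vy0 = p2_start[0] - p1_start[0], p2_start[1] - p1_start[1]
    vx1, vy1 = p2_end[0] - p1_end[0], p2_end[1] - p1_end[1]
    d0 = math.hypot(vx0, vy0)
    d1 = math.hypot(vx1, vy1)
    zoom = d1 / d0 if d0 else 1.0  # degenerate touch: leave scale alone
    angle = math.degrees(math.atan2(vy1, vx1) - math.atan2(vy0, vx0))
    angle = (angle + 180.0) % 360.0 - 180.0  # keep the shortest arc
    return zoom, angle

# Fingers spread from 100 px to 200 px apart while the joining line
# turns a quarter turn counter-clockwise:
zoom, angle = two_touch_transform((0, 0), (100, 0), (0, 0), (0, 200))
```

A clockwise rotation simply comes out negative, which is one way the counter-clockwise variant described above could rotate the map in the opposite direction.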
FIG. 14 illustrates still another set of example gestures that may be performed on, and recognized by, the display 302 of the control panel 300. The gestures may include a single touch-point 316 moved in a particular direction (e.g., to touch-point 316 e). Depending on the manner in which the touch-point 316 is moved, the gesture may be considered a dragging motion or a flicking motion. More particularly, if the control panel 300 recognizes that the motion occurs over a very short time (e.g., less than half a second, less than a quarter second, less than a tenth of a second, etc.), the control panel 300 may determine the gesture is a flick gesture, whereas movements over longer periods of time may be considered dragging gestures. Dragging and flicking gestures may have different associated actions in the display 302. - For instance,
FIGS. 15 and 16 illustrate an exemplary action of the control panel 300 in response to a drag or flick gesture. The display 302 may show an interface 304 f, which may include a map 324. The map 324 may be a map of a room, house, street, neighborhood, city, or other location. In some embodiments, the interface 304 f may display one or more other objects. For example, the interface 304 f may display any object in a security system or home automation system, including views from one or more cameras. - In some embodiments, a panning action may be associated with a drag gesture. Therefore, if the user performs an action similar to the gesture from touch-
point 316 to a second touch-point in FIG. 14, and does so at a rate recognized as a drag gesture, the map 324 may pan within the interface 304 f in a corresponding direction. In this particular embodiment, an upward drag gesture may cause the map 324 to pan upward and show additional portions of the map 324, as shown by the change in the view of the display 302 from FIG. 15 to FIG. 16. An opposite, or downward, drag gesture may cause the control panel 300 to pan the map 324 in the opposite direction (e.g., from the view in FIG. 16 to the view in FIG. 15). Drag gestures may be in any direction such as, for example, diagonal, horizontal, linear, non-linear, curved, or any other direction. Furthermore, in some embodiments, a drag gesture may include changes in direction such that the objects in the display 302 pan or scroll about in real-time with the gesture. - A drag gesture as shown in
FIGS. 15 and 16 may generally be used to pan or scroll and change the position of one or more objects within the interface 304 f of the display 302. In other embodiments, actions performed in response to gestures may change the displayed interface itself, rather than the position or other characteristics of objects or images therein. -
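The panning behavior described for FIGS. 15 and 16 can be sketched as shifting a viewport origin opposite the finger's motion, so the map appears to follow the finger in real time; sampling a curved drag incrementally works the same way. A minimal Python sketch with illustrative names:

```python
def pan_view(viewport_origin, drag_start, drag_end):
    """Shift a viewport origin in response to a drag gesture.

    The origin moves opposite the finger delta, so dragging upward
    (decreasing y in screen coordinates) reveals map content above
    the current view; diagonal and horizontal drags pan the same way.
    """
    dx = drag_end[0] - drag_start[0]
    dy = drag_end[1] - drag_start[1]
    return (viewport_origin[0] - dx, viewport_origin[1] - dy)

# Finger dragged 100 px upward: the viewport origin shifts so that
# content above the previous view is exposed.
origin = pan_view((0, 0), (50, 200), (50, 100))
```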
FIGS. 17 and 18, for instance, illustrate a control panel 300 having a display 302 where the interface may change from a first interface 304 g (FIG. 17) to a second interface 304 h (FIG. 18). In some embodiments, a flick gesture, such as that described with respect to FIG. 14, may change the displayed interface. For example, in FIG. 17, the first interface 304 g may generally provide information about a sprinkler system managed or monitored using the control panel 300. In this embodiment, the first interface 304 g may provide information about a particular zone of a sprinkler system (e.g., flower beds). Information such as the current status of the zone, a schedule for the zone if the sprinkler system is automated, an option to change the schedule, an option to start the zone, a map of sprinklers or the coverage area of the zone, information on whether any problems are detected in the zone, and the like may also be displayed. - A sprinkler system may include multiple zones. A second zone may, therefore, also have corresponding information displayed in an associated interface.
FIG. 18 illustrates a second interface 304 h for displaying information on the second zone (e.g., for a front yard). In this embodiment, the second zone is shown to be currently turned on, and other information displayed includes the automated schedule, the time remaining in a current program, an option to change the schedule, a map of the zone, and an option to stop the sprinklers of the second zone. - One embodiment of the present disclosure contemplates changing between the
first interface 304 g and second interface 304 h using a gesture, for instance, a flick gesture. FIG. 14 provides an example in which a user may rapidly move a contact point on the display 302 from a first touch-point 316 to a second touch-point 316 e. If the control panel 300 of FIG. 17 recognizes the gesture as a flick, the control panel 300 may determine the user wishes to change interfaces rather than scroll objects in the same interface. Thus, the control panel 300 may change from the first interface 304 g to the second interface 304 h. If the user wants to scroll to an interface corresponding to yet another zone, the user could repeat the flick gesture. A flick gesture in an opposite direction may cause the control panel 300 to move back to a previous interface. - A flick gesture may be generally horizontal, but may also be in other directions. Different directions of flick gestures may have different corresponding actions. For instance, a vertical (up or down) flick gesture may cause the
display 302 to change from one type of interface (e.g., sprinkler system) to another type (e.g., system menu, entertainment, lighting, HVAC, cameras, alarms, etc.). - The tap, press, flick, drag, rotate, pinch and spread gestures described herein, as well as the particular actions associated therewith, are intended merely to illustrate examples of some types of gestures that may be recognized by a control panel of a security system. Additional or other gestures or inputs may also be provided. For instance, combinations of the above may be included. As an example, a gesture may be associated with a combined tap and drag motion. Additional gestures may include double tap gestures or tap and hold gestures. In addition, while the illustrated gestures include one or two touch-points, even more touch-points may be monitored. By way of illustration, the
display 302 of the control panel 300 may recognize three or more simultaneous touch-points. - The types of actions associated with a gesture may also be varied, and need not include or be limited to select, zoom, rotate, pan, scroll and interface change actions. Other actions associated with gestures may include deleting, copying, moving, bundling, closing, opening, drawing, and other actions.
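The flick-versus-drag distinction described with respect to FIG. 14 is essentially a timing threshold on the completed motion, with the resulting action depending on both speed and direction. A minimal sketch; the threshold value and action names are illustrative assumptions, not values fixed by the patent:

```python
FLICK_MAX_SECONDS = 0.25  # illustrative; the disclosure mentions
                          # thresholds such as 0.5 s, 0.25 s, or 0.1 s

def interpret_single_touch(elapsed_seconds, direction):
    """Map a completed single-touch motion to a panel action.

    Motions finishing within the threshold are flicks: horizontal
    flicks step between interfaces (e.g., sprinkler zones) and
    vertical flicks switch the interface type, while slower motions
    are drags that pan the current view.
    """
    if elapsed_seconds >= FLICK_MAX_SECONDS:
        return "pan"
    if direction in ("left", "right"):
        return "change_interface"
    return "change_interface_type"

interpret_single_touch(0.08, "right")  # quick flick: change interfaces
interpret_single_touch(0.60, "up")     # slow motion: pan instead
```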
- Some or all embodiments of the present disclosure may also include other features to facilitate use or desirability of the control panel in a security system. One aspect of buttons, toggles, switches, and other mechanical controls is that as the control is activated, there is generally an associated movement of a portion of the control (e.g., a depressed button physically moves downward). This movement can be sensed by a user, and the user obtains a real-time, tactile response that an action has been carried out. In contrast, touch-screens and non-mechanical buttons (e.g., pressure-sensitive or capacitive contact surfaces), which do not rely on movement of mechanical components, generally do not provide such a response. Instead, the user must wait for a visual or audio confirmation that an action has been received.
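One way a touch-screen panel can close this feedback gap is to drive a tactile output component directly from the contact sensor, so the user feels a pulse the moment the touch registers. A minimal Python sketch with stand-in classes; nothing here is an actual hardware API:

```python
class TactileOutput:
    """Stand-in for a tactile output component (e.g., a vibration
    motor): counts pulses so behavior is observable without hardware."""

    def __init__(self):
        self.pulses = 0

    def pulse(self):
        self.pulses += 1


def on_display_contact(contact_detected, tactile):
    """Fire the tactile component as soon as the touch sensor reports
    contact, giving an immediate mechanical confirmation rather than
    a delayed visual or audio one."""
    if contact_detected:
        tactile.pulse()
    return contact_detected

motor = TactileOutput()
on_display_contact(True, motor)  # user feels one pulse on touch
```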
- In some embodiments, as shown in
FIG. 19, the illustrated example control panel 300 may provide a user with a haptic response and tactile feedback even when non-mechanical input components are used. In some embodiments, the display 302 of the control panel 300 may show an interface 304 i. According to some embodiments, the user may touch the display 302 to provide an input (e.g., to enter a PIN or other credentials to arm or disarm an entrance, to select an available menu option, etc.). The control panel 300 may include a tactile output component (see FIG. 2) configured to respond when input is received at the display 302. For instance, a sensor may detect contact on the display 302 and communicate with the tactile output component. When the tactile output component receives an indication of contact, the tactile output component may vibrate, rotate, move, or otherwise provide a mechanical stimulus. For instance, a vibration may be generated causing the entire control panel 300 to vibrate, as reflected by the arrows and motion lines in FIG. 19. - Vibration of the
control panel 300 in response to input at the display 302 is merely one example of haptic feedback. In other embodiments, for instance, the display 302 may move or rotate. An example may include making the entire display 302 movable relative to the control panel 300 so that when the display 302 is touched, the display 302 moves. In still other embodiments, haptic feedback may be provided in response to input at the optional buttons 308, 310. Input at the buttons 308, 310, like input at the display 302, may result in haptic feedback. The haptic feedback may be the same as for the control panel 300, or different therefrom. In some embodiments, haptic feedback may be provided for one, but not both, of the buttons 308, 310 (or for one button). - Embodiments of the present disclosure may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory. Embodiments within the scope of the present disclosure also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the disclosure may comprise at least two distinctly different kinds of computer-readable media, including at least computer storage media and/or transmission media.
- Examples of computer storage media include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash-based storage, solid-state storage, or any other non-transmission medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- When information is transferred or provided over a communication network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computing device, the computing device properly views the connection as a transmission medium. A "communication network" may generally be defined as one or more data links that enable the transport of electronic data between computer systems and/or modules, engines, and/or other electronic devices, and transmission media can include a communication network and/or data links, carrier waves, wireless signals, and the like, which can be used to carry desired program or template code means or instructions in the form of computer-executable instructions or data structures within, to or from a communication network. Combinations of storage media and transmission media should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer (e.g., a security system control panel), or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above, nor performance of the described acts or steps by the components described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the embodiments may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, programmable logic machines, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, tablet computing devices, minicomputers, security system control panels, security system network operations centers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
- Embodiments may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
- Those skilled in the art will also appreciate that embodiments of the present disclosure may be practiced in special-purpose, dedicated or other computing devices integrated within or particular to a particular residence, business, company, government agency, or other entity, and that such devices may operate using a network connection, wireless connection, or hardwire connection. Examples may include residential or commercial buildings in connection with security systems configured to monitor local conditions (i.e., at the same building or location), remote conditions (i.e., at a different building or location), or some combination thereof.
- Although the foregoing description contains many specifics, these should not be construed as limiting the scope of the disclosure or of any of the appended claims, but merely as providing information pertinent to some specific embodiments that may fall within the scopes of the disclosure and the appended claims. Various embodiments are described, some of which incorporate differing features. The features illustrated or described relative to one embodiment are interchangeable and/or may be employed in combination with features of any other embodiment herein. In addition, other embodiments of the disclosure may also be devised which lie within the scopes of the disclosure and the appended claims. The scope of the disclosure is, therefore, indicated and limited only by the appended claims and their legal equivalents. All additions, deletions and modifications to the disclosure, as disclosed herein, that fall within the meaning and scopes of the claims are to be embraced by the claims.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/430,506 US20170220242A1 (en) | 2013-03-15 | 2017-02-12 | Home security system with touch-sensitive control panel |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361791142P | 2013-03-15 | 2013-03-15 | |
US14/211,018 US9568902B2 (en) | 2013-03-15 | 2014-03-14 | Home security system with touch-sensitive control panel |
US15/430,506 US20170220242A1 (en) | 2013-03-15 | 2017-02-12 | Home security system with touch-sensitive control panel |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/211,018 Continuation US9568902B2 (en) | 2013-03-15 | 2014-03-14 | Home security system with touch-sensitive control panel |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170220242A1 true US20170220242A1 (en) | 2017-08-03 |
Family
ID=51525300
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/211,018 Active US9568902B2 (en) | 2013-03-15 | 2014-03-14 | Home security system with touch-sensitive control panel |
US15/430,506 Abandoned US20170220242A1 (en) | 2013-03-15 | 2017-02-12 | Home security system with touch-sensitive control panel |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/211,018 Active US9568902B2 (en) | 2013-03-15 | 2014-03-14 | Home security system with touch-sensitive control panel |
Country Status (1)
Country | Link |
---|---|
US (2) | US9568902B2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107367970A (en) * | 2017-08-15 | 2017-11-21 | 意诺科技有限公司 | A kind of control panel, control system and control method |
US10037637B1 (en) * | 2017-01-30 | 2018-07-31 | Kyocera Document Solutions Inc. | Security system |
US10181230B2 (en) * | 2017-04-20 | 2019-01-15 | Sensormatic Electronics, LLC | System and method for controlling access at access point |
US11467711B2 (en) | 2017-12-21 | 2022-10-11 | Ademco Inc. | Systems and methods for displaying and associating context images with zones of a security system |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2659773T3 (en) * | 2013-03-15 | 2018-03-19 | Vivint, Inc | Using a control panel as a wireless access point |
US10564813B2 (en) * | 2013-06-18 | 2020-02-18 | Samsung Electronics Co., Ltd. | User terminal apparatus and management method of home network thereof |
KR102318442B1 (en) | 2013-06-18 | 2021-10-28 | 삼성전자주식회사 | User terminal device and method of managing home network thereof |
CN105474580B (en) | 2013-06-18 | 2019-02-15 | 三星电子株式会社 | The management method of subscriber terminal equipment and its home network |
US9890967B2 (en) | 2013-08-28 | 2018-02-13 | Trane International Inc. | Systems and methods for HVAC and irrigation control |
US20150081106A1 (en) * | 2013-09-18 | 2015-03-19 | Trane International Inc. | Systems and Methods for HVAC and Irrigation Control |
TWD169765S (en) * | 2014-07-18 | 2015-08-11 | 財團法人資訊工業策進會 | A portion of a cloud infrared controller |
USD746165S1 (en) * | 2014-12-22 | 2015-12-29 | Chuango Security Technology Corporation | Home security keypad |
USD746166S1 (en) * | 2014-12-23 | 2015-12-29 | Chuango Security Technology Corporation | Electronic control unit for a security system |
US9514636B2 (en) | 2014-12-30 | 2016-12-06 | Google Inc. | Premises management system with prevention measures |
JP6229816B2 (en) * | 2015-03-27 | 2017-11-15 | 日本電気株式会社 | Mobile monitoring device, program, and control method |
CN105446302A (en) * | 2015-12-25 | 2016-03-30 | 惠州Tcl移动通信有限公司 | Smart terminal-based smart home equipment instruction interaction method and system |
US10713873B1 (en) * | 2015-12-31 | 2020-07-14 | Vivint, Inc. | Traveling automation preferences |
US10490003B2 (en) | 2015-12-31 | 2019-11-26 | Vivint, Inc. | Guest mode access |
US11176808B2 (en) * | 2016-01-06 | 2021-11-16 | Johnson Controls Fire Protection LP | Interface actuator device and method of use |
EP3736787B1 (en) * | 2018-07-11 | 2023-04-12 | Honeywell International Inc. | System and method for device address assignment in an alarm system using interactive address assignment for faster commissioning |
USD886656S1 (en) * | 2018-11-06 | 2020-06-09 | Frontpoint Security Solutions, LLC | Keypad for a security system |
USD887301S1 (en) * | 2018-11-06 | 2020-06-16 | Frontpoint Security Solutions, LLC | Control panel for a security system |
US20200279473A1 (en) * | 2019-02-28 | 2020-09-03 | Nortek Security & Control Llc | Virtual partition of a security system |
US11626010B2 (en) * | 2019-02-28 | 2023-04-11 | Nortek Security & Control Llc | Dynamic partition of a security system |
EP3973229A4 (en) * | 2019-05-23 | 2022-07-06 | Alarm.com Incorporated | Advanced monitoring of an hvac system |
US11881093B2 (en) | 2020-08-20 | 2024-01-23 | Denso International America, Inc. | Systems and methods for identifying smoking in vehicles |
US12017506B2 (en) | 2020-08-20 | 2024-06-25 | Denso International America, Inc. | Passenger cabin air control systems and methods |
US11932080B2 (en) | 2020-08-20 | 2024-03-19 | Denso International America, Inc. | Diagnostic and recirculation control systems and methods |
US11813926B2 (en) | 2020-08-20 | 2023-11-14 | Denso International America, Inc. | Binding agent and olfaction sensor |
US11636870B2 (en) | 2020-08-20 | 2023-04-25 | Denso International America, Inc. | Smoking cessation systems and methods |
US11760169B2 (en) | 2020-08-20 | 2023-09-19 | Denso International America, Inc. | Particulate control systems and methods for olfaction sensors |
US11828210B2 (en) | 2020-08-20 | 2023-11-28 | Denso International America, Inc. | Diagnostic systems and methods of vehicles using olfaction |
US11760170B2 (en) | 2020-08-20 | 2023-09-19 | Denso International America, Inc. | Olfaction sensor preservation systems and methods |
US12132869B2 (en) * | 2020-08-26 | 2024-10-29 | Damon Curry | Phone interface system |
US11756531B1 (en) | 2020-12-18 | 2023-09-12 | Vivint, Inc. | Techniques for audio detection at a control system |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070032225A1 (en) * | 2005-08-03 | 2007-02-08 | Konicek Jeffrey C | Realtime, location-based cell phone enhancements, uses, and applications |
US20090288011A1 (en) * | 2008-03-28 | 2009-11-19 | Gadi Piran | Method and system for video collection and analysis thereof |
US20110199495A1 (en) * | 2010-02-12 | 2011-08-18 | Honeywell International Inc. | Method of manipulating assets shown on a touch-sensitive display |
US20110199314A1 (en) * | 2010-02-12 | 2011-08-18 | Honeywell International Inc. | Gestures on a touch-sensitive display |
US20120006883A1 (en) * | 2009-03-16 | 2012-01-12 | Kawasaki Jukogyo Kabushiki Kaisha | Apparatus and method for friction stir welding |
US20120028297A1 (en) * | 2007-11-20 | 2012-02-02 | 3M Innovative Properties Company | Environmental sampling articles and methods |
US20120194336A1 (en) * | 2011-01-31 | 2012-08-02 | Honeywell International Inc. | User interfaces for enabling information infusion to improve situation awareness |
US20120242850A1 (en) * | 2011-03-21 | 2012-09-27 | Honeywell International Inc. | Method of defining camera scan movements using gestures |
US20140005809A1 (en) * | 2012-06-27 | 2014-01-02 | Ubiquiti Networks, Inc. | Method and apparatus for configuring and controlling interfacing devices |
US20140068486A1 (en) * | 2012-08-31 | 2014-03-06 | Verizon Patent And Licensing Inc. | Connected home user interface systems and methods |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5086385A (en) * | 1989-01-31 | 1992-02-04 | Custom Command Systems | Expandable home automation system |
US20080088587A1 (en) | 2001-02-22 | 2008-04-17 | Timothy Pryor | Compact rtd instrument panels and computer interfaces |
US20030038849A1 (en) | 2001-07-10 | 2003-02-27 | Nortel Networks Limited | System and method for remotely interfacing with a plurality of electronic devices |
US20040260407A1 (en) * | 2003-04-08 | 2004-12-23 | William Wimsatt | Home automation control architecture |
US20060028428A1 (en) * | 2004-08-05 | 2006-02-09 | Xunhu Dai | Handheld device having localized force feedback |
US8686952B2 (en) * | 2008-12-23 | 2014-04-01 | Apple Inc. | Multi touch with multi haptics |
US20110314515A1 (en) * | 2009-01-06 | 2011-12-22 | Hernoud Melanie S | Integrated physical and logical security management via a portable device |
US8429565B2 (en) * | 2009-08-25 | 2013-04-23 | Google Inc. | Direct manipulation gestures |
US20110113360A1 (en) * | 2009-11-12 | 2011-05-12 | Bank Of America Corporation | Facility monitoring and control system interface |
US9405395B2 (en) * | 2010-08-04 | 2016-08-02 | Crestron Electronics, Inc. | Wall-mounted control system for a portable touch screen device |
US8489065B2 (en) * | 2011-05-03 | 2013-07-16 | Robert M Green | Mobile device controller application for any security system |
US20120282886A1 (en) * | 2011-05-05 | 2012-11-08 | David Amis | Systems and methods for initiating a distress signal from a mobile device without requiring focused visual attention from a user |
US9462210B2 (en) | 2011-11-04 | 2016-10-04 | Remote TelePointer, LLC | Method and system for user interface for interactive devices using a mobile device |
US9472072B2 (en) * | 2012-05-04 | 2016-10-18 | Honeywell International Inc. | System and method of post event/alarm analysis in CCTV and integrated security systems |
AU2013100081A4 (en) | 2012-10-22 | 2013-03-07 | Automate Projects Pty Ltd | Home Environment Automated Real Time (HEART) System |
US9839101B2 (en) * | 2015-03-06 | 2017-12-05 | Lutron Electronics Co., Inc. | Load control adjustment from a wearable wireless device |
- 2014-03-14: US application 14/211,018 filed; granted as US 9568902 B2 (Active)
- 2017-02-12: US application 15/430,506 filed; published as US 20170220242 A1 (Abandoned)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070032225A1 (en) * | 2005-08-03 | 2007-02-08 | Konicek Jeffrey C | Realtime, location-based cell phone enhancements, uses, and applications |
US20120028297A1 (en) * | 2007-11-20 | 2012-02-02 | 3M Innovative Properties Company | Environmental sampling articles and methods |
US20090288011A1 (en) * | 2008-03-28 | 2009-11-19 | Gadi Piran | Method and system for video collection and analysis thereof |
US20120006883A1 (en) * | 2009-03-16 | 2012-01-12 | Kawasaki Jukogyo Kabushiki Kaisha | Apparatus and method for friction stir welding |
US20110199495A1 (en) * | 2010-02-12 | 2011-08-18 | Honeywell International Inc. | Method of manipulating assets shown on a touch-sensitive display |
US20110199314A1 (en) * | 2010-02-12 | 2011-08-18 | Honeywell International Inc. | Gestures on a touch-sensitive display |
US20120194336A1 (en) * | 2011-01-31 | 2012-08-02 | Honeywell International Inc. | User interfaces for enabling information infusion to improve situation awareness |
US20120242850A1 (en) * | 2011-03-21 | 2012-09-27 | Honeywell International Inc. | Method of defining camera scan movements using gestures |
US20140005809A1 (en) * | 2012-06-27 | 2014-01-02 | Ubiquiti Networks, Inc. | Method and apparatus for configuring and controlling interfacing devices |
US20140068486A1 (en) * | 2012-08-31 | 2014-03-06 | Verizon Patent And Licensing Inc. | Connected home user interface systems and methods |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10037637B1 (en) * | 2017-01-30 | 2018-07-31 | Kyocera Document Solutions Inc. | Security system |
US20180218550A1 (en) * | 2017-01-30 | 2018-08-02 | Kyocera Document Solutions Inc. | Security system |
US10181230B2 (en) * | 2017-04-20 | 2019-01-15 | Sensormatic Electronics, LLC | System and method for controlling access at access point |
CN107367970A (en) * | 2017-08-15 | 2017-11-21 | 意诺科技有限公司 | A kind of control panel, control system and control method |
US11467711B2 (en) | 2017-12-21 | 2022-10-11 | Ademco Inc. | Systems and methods for displaying and associating context images with zones of a security system |
Also Published As
Publication number | Publication date |
---|---|
US9568902B2 (en) | 2017-02-14 |
US20140267112A1 (en) | 2014-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9568902B2 (en) | Home security system with touch-sensitive control panel | |
US11853646B2 (en) | User interfaces for audio media control | |
US20210385417A1 (en) | Camera and visitor user interfaces | |
US11513667B2 (en) | User interface for audio message | |
EP3241372B1 (en) | Contextual based gesture recognition and control | |
US20210349680A1 (en) | User interface for audio message | |
JP6302599B2 (en) | Grouping method, grouping device, program, and recording medium for smart devices in smart home system | |
US10911685B2 (en) | Monitoring apparatus and system which detects events occurred in a region of interest and counts a number of occurred events | |
US9760174B1 (en) | Haptic feedback as accessibility mode in home automation systems | |
US11444799B2 (en) | Method and system of controlling device using real-time indoor image | |
DK201670604A1 (en) | User interface for managing controllable external devices | |
WO2020096969A1 (en) | System and apparatus for a home security system | |
US11675832B2 (en) | Search apparatus | |
US20190116305A1 (en) | Monitoring apparatus and system | |
CN110995551A (en) | Storage medium, intelligent panel and interaction method thereof | |
AU2023204396B2 (en) | User interface for audio message |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VIVINT, INC., UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DUNN, ALEX J.;SANTIAGO, TODD MATTHEW;NYE, JAMES E.;AND OTHERS;SIGNING DATES FROM 20140226 TO 20140227;REEL/FRAME:041232/0208 |
|
AS | Assignment |
Owner name: BANK OF AMERICA N.A., NORTH CAROLINA Free format text: SUPPL. NO. 2 SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:047024/0048 Effective date: 20180906 Owner name: BANK OF AMERICA, N.A., NORTH CAROLINA Free format text: SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:047029/0304 Effective date: 20180906 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
AS | Assignment |
Owner name: WILMINGTON TRUST, NATIONAL ASSOCIATION, DELAWARE Free format text: SECURITY AGREEMENT;ASSIGNOR:VIVINT, INC.;REEL/FRAME:049283/0566 Effective date: 20190510 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STCB | Information on status: application discontinuation |
Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |
|
AS | Assignment |
Owner name: VIVINT, INC., UTAH Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:056832/0824 Effective date: 20210709 |