US20100020221A1 - Camera Interface in a Portable Handheld Electronic Device - Google Patents
- Publication number
- US20100020221A1 (application US12/508,534)
- Authority
- US
- United States
- Prior art keywords
- touch sensitive
- sensitive screen
- user
- selected area
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the disclosed embodiments relate generally to portable handheld electronic devices, such as cellular telephone handsets and digital cameras, and more particularly to a user interface having a touch sensitive screen for controlling camera functions.
- Portable handheld electronic devices such as the IPHONE multifunction device by Apple Inc.
- the IPHONE device has a touch sensitive screen as part of its user interface.
- the touch screen lets the user select a particular application program to be run, by performing a single finger gesture on the touch sensitive screen. For example, the user can point to (touch) the icon of a particular application, which results in the application being automatically launched in the device.
- the camera application in particular, allows the user to navigate amongst previously stored pictures taken using the camera directly on the touch screen.
- a shutter button icon that can be touched by the user to release the shutter and thereby take a picture of the scene that is before the camera.
- Other uses of the touch sensitive screen include navigating around a Web page that is being displayed by single finger gestures, and zooming into a displayed Web page by performing a so called multi-finger spread gesture on the touch sensitive screen. The user can also zoom out of the Web page, by performing a multi-finger pinch gesture.
- a method is performed at a handheld electronic device having a built-in digital camera and a touch sensitive screen.
- the method includes detecting a multi-finger gesture on the touch sensitive screen, wherein the touch sensitive screen is serving as part of an electronic viewfinder of the camera; storing coordinates of a location corresponding to the detected multi-finger gesture; translating the stored coordinates to a selected area of an image that is captured by the camera and that is being displayed on the touch sensitive screen; contracting or expanding the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, while the detected multi-finger gesture remains in contact with the touch sensitive screen; and applying an automatic image capture parameter adjustment process that gives priority to the selected area.
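The claimed flow can be sketched as a small coordinate-translation step. This is a minimal illustrative sketch, not the patent's implementation; all names (`PriorityArea`, `select_area`, the screen and image dimensions) are hypothetical.

```python
# Hypothetical sketch: touch-screen coordinates of a detected gesture are
# stored and translated to a selected area of the captured image, which the
# automatic image capture parameter adjustment process then prioritizes.

from dataclasses import dataclass

@dataclass
class PriorityArea:
    cx: float    # center x in image coordinates
    cy: float    # center y in image coordinates
    half: float  # half-width of the square selection (pixels)

def screen_to_image(x, y, screen_w, screen_h, img_w, img_h):
    """Map touch-screen coordinates to image coordinates by simple scaling."""
    return x * img_w / screen_w, y * img_h / screen_h

def select_area(touch_x, touch_y, screen=(320, 480), image=(1600, 2400),
                initial_half=100.0):
    """Store the gesture location and translate it to an image-space area."""
    ix, iy = screen_to_image(touch_x, touch_y, *screen, *image)
    return PriorityArea(ix, iy, initial_half)
```

A touch at the center of a 320x480 preview would map to the center of a 1600x2400 image; the resulting area is what the 3A process would weight.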
- This gives the user finer control of auto focus, auto exposure, and auto white balance (“3A”) adjustments in the camera.
- a handheld electronic device which comprises a touch sensitive screen; a detector configured to detect a multi-finger gesture on the touch sensitive screen and store coordinates of a location of the detected gesture; and a digital camera.
- the digital camera includes an image sensor, a lens to form an optical image on the image sensor, a viewfinder module configured to display on the touch sensitive screen a scene at which the lens is aimed, and a priority module coupled to the detector.
- the priority module is configured to translate the stored coordinates to a selected area of a digital image of the scene that is being displayed on the touch sensitive screen by the viewfinder module, contract or expand the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, and apply an automatic image capture parameter adjustment process that gives priority to the selected area for taking a picture of the scene.
- a multi-touch pinch or spread gesture may define the hint or priority area, for calculating exposure parameters.
- a method is performed at a handheld electronic device having a built-in digital camera and a touch sensitive screen.
- the method includes detecting an initial finger gesture by a user on the touch sensitive screen, wherein the touch sensitive screen serves as part of an electronic viewfinder of the camera; storing coordinates of the initial finger gesture; detecting a closed path on the touch sensitive screen that includes the location of the detected initial finger gesture, wherein the user's finger moves while remaining in contact with the touch sensitive screen to define the closed path, and storing coordinates of the closed path.
- the method also includes translating the stored coordinates of the closed path to a selected portion of an image that is captured by the camera and that is being displayed on the touch sensitive screen; and applying an automatic image capture parameter adjustment process that gives priority to the selected portion.
- an apparatus which comprises a handheld electronic device configured to operate at least in a digital camera mode and a mobile telephone mode.
- the digital camera mode is configured to permit a user of the apparatus to take a digital picture of a scene
- the mobile telephone mode is configured to permit the user of the apparatus to participate in a wireless telephone call and hear the call through a built-in receiver of the apparatus.
- the apparatus has a button exposed to the user that alternatively controls loudness of the built-in receiver when the apparatus is operating in the mobile telephone mode, and acts as a shutter button when the apparatus is operating in the digital camera mode.
- FIG. 1 shows a portable handheld device having a built-in digital camera and a touch sensitive screen, in the hands of its user undergoing a single finger gesture during a still image capture process.
- FIG. 2 is a flow diagram of operations in the electronic device during a still image capture process, in accordance with FIG. 1 .
- FIG. 3 shows the portable handheld electronic device undergoing a multi-finger gesture during a still image capture process.
- FIG. 4 is a flow diagram of operations in the electronic device during a still image capture process, in accordance with FIG. 3 .
- FIG. 5 illustrates another embodiment of the invention, where the user draws a polygon through a single finger touch gesture, to define the priority area for image capture.
- FIG. 6 is a flow diagram of still image capture in the electronic device, in accordance with the embodiment of FIG. 5 .
- FIG. 7 shows a block diagram of an example, portable handheld multifunction device in which an embodiment of the invention may be implemented.
- FIG. 1 shows a portable handheld electronic device 100 having a built-in digital camera and a touch sensitive screen 104 in the hand of its user, undergoing a finger gesture during a still image capture process.
- the portable device 100 is shown while it is held in the user's left hand 107 , and the user's right hand 109 is making the finger gesture on the touch screen.
- the device 100 may be an IPHONE device by Apple Inc., of Cupertino, Calif. Alternatively, it could be any other portable handheld electronic device that has a built-in digital camera and a touch sensitive screen.
- the built-in digital camera includes a lens 103 located in this example on the back face of the device 100 .
- the lens may be a fixed optical lens system or it may have focus and optical zoom capability.
- inside the device 100 are an electronic image sensor and associated hardware circuitry and software that can capture a digital image of a scene 102 that is before the lens 103 .
- the digital camera functionality of the device 100 includes an electronic or digital viewfinder (also referred to as a preview function).
- the viewfinder displays live, captured video of the scene 102 that is before the camera, on a portion of the touch sensitive screen 104 as shown.
- the digital camera also includes a soft or virtual shutter button whose icon 105 is displayed by the screen 104 , directly below the viewfinder image area.
- a physical shutter button may be implemented in the device 100 .
- the device 100 includes all of the needed circuitry and/or software for implementing the digital camera functions of the electronic viewfinder ( 726 , FIG. 7 ), shutter release, and automatic image capture parameter adjustment as described below.
- the user performs a single-finger gesture on the touch sensitive screen 104 as shown.
- the finger gesture is formed by the user's right hand 109 (although it could just as well be the user's left hand 107 ).
- the user positions the single-finger touch gesture on a preview portion of the touch screen.
- the device 100 has detected this touch down and has automatically drawn a marker 96 (in this case, the closed contour that has a box shape), centered around the location of the touch down.
- the user then moves her right hand 109 around the preview portion of the touch screen, to a location of the image of the scene 102 that corresponds to an object in the scene (or some portion of the scene) to which priority should be given when the digital camera adjusts the image capture parameters in preparation for taking a picture of the scene.
- the user may move the marker 96 from up above the mountains and the trees down towards a location near the ground or where a man is walking.
- the user may lift off her finger gesture, which in turn signals the camera to accept the final location of the marker and the underlying portion of the image as the priority area of the scene.
- A flow diagram of operations for taking the digital picture, in accordance with the above, is shown in FIG. 2 .
- a view finder function begins execution which displays video of the scene 102 that is before the camera lens 103 (block 22 ).
- the user aims the camera lens so that the desired portion of the scene appears on the preview portion of the screen 104 .
- a camera application or a touch screen application running in the device 100 detects a single-finger touch gesture and stores screen coordinates of its location (block 24 ).
- a marker 96 is then automatically displayed around the screen location of the touch gesture (block 26 ).
- the marker 96 is moved around the preview portion of the touch screen, in lock step with the user moving her finger touch gesture along the surface of the touch screen (block 28 ).
- An area of the image of the scene being shown in the preview portion and that underlies the final location of the marker is defined to be an area selected by the user for priority (block 29 ).
- This priority area may be finalized, for example, in response to the user lifting off her finger.
- the priority area of the image may be a fixed chunk of pixels that are about coextensive with the boundary of the marker 96 .
- the priority area may be an object in the scene located at or near the marker 96 , as detected by the camera application using digital image processing techniques.
- an automatic image capture parameter adjustment process is applied by the device 100 to give priority to the selected area (block 30 ). Additional details of this process will be explained below.
- the picture can be taken, for example, when the user gives the shutter release command (block 32 ). Several ways of defining the shutter release command are also described below. Thus, the process described above gives the user finer control of picture taking adjustments.
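The single-finger flow of FIG. 2 (touch down draws the marker, moving the finger drags it in lock step, lift-off finalizes the priority area) can be sketched as a small state machine. This is an assumed illustration; the class and method names are hypothetical, not from the patent.

```python
# Hypothetical sketch of the FIG. 2 marker behavior: the marker 96 is drawn
# at touch-down, follows the finger while it stays on the screen, and its
# final location on lift-off defines the user-selected priority area.

class MarkerTracker:
    def __init__(self):
        self.position = None    # current marker center (screen coordinates)
        self.finalized = False  # True once the priority area is accepted

    def touch_down(self, x, y):
        """Draw the marker centered on the initial touch location."""
        self.position = (x, y)
        self.finalized = False

    def touch_move(self, x, y):
        """Move the marker in lock step with the finger while in contact."""
        if self.position is not None and not self.finalized:
            self.position = (x, y)

    def lift_off(self):
        """Lift-off finalizes the marker location as the priority area."""
        self.finalized = True
        return self.position
```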
- the marker 96 is displayed in a variable state to indicate that one or more parameters are being adjusted.
- the marker 96 is displayed in an alternating sequence of colors, such as white, blue, white, blue, while the automatic image capture parameter adjustment process sets priority to the selected area (e.g., the marker 96 changes color while the camera is focusing).
- the display of marker 96 includes an animation of the boundary of the marker oscillating or “wiggling” on screen while the automatic image capture parameter adjustment process gives priority to the selected area under the location of the marker.
- display of marker 96 is terminated.
- In FIG. 3 , another embodiment of the invention is shown, where the user defines the priority area, this time by a multi-finger gesture.
- the multi-finger gesture is also formed by the user's right hand 109 (although it could just as well be the user's left hand 107 , while the device is held by the user's right hand 109 ).
- the thumb and index finger are brought close to each other or touch each other, simultaneously with their tips being in contact with the surface of the screen 104 to create two contact points thereon.
- the user positions this multi-touch gesture, namely the two contact points, at a location of the image of the scene 102 that corresponds to an object in the scene (or portion of the scene) to which priority should be given when the digital camera adjusts the image capture parameters in preparation for taking a picture of the scene.
- the user has selected the location where a person appears between a mountainside in the background and a tree in the foreground.
- the device 100 may cause a contour 106 , in this example, the outline of a box, to be displayed on the screen 104 , around the location of the detected multi-finger gesture.
- the contour 106 is associated, e.g. by software running in the device 100 , with a taken or selected priority area of the image (to which priority will be given in the image capture parameter adjustment process).
- the user can then contract or expand the size of the priority area, by making a pinching movement or a spreading movement, respectively, with her thumb and index fingers of her right hand 109 while the fingertips remain in contact with the touch sensitive screen 104 .
- the device 100 has the needed hardware and software to distinguish between a pinching movement and a spreading movement, and appropriately contracts or expands the size of the priority area.
- the user can command the digital camera to take a picture after adjusting the image capture parameters to give priority to the selected area. This may be done by, for example, lifting her fingers off of the touch sensitive display screen 104 and then actuating the shutter release button (e.g., touching and then lifting off the soft shutter button icon 105 ). If instead the user would like default image capture parameter values to be used, then she would simply actuate the generic shutter button icon 105 without first touching the preview of the scene that is being displayed.
- In FIG. 4 , a flow diagram of operations for taking a digital picture using the device 100 , in accordance with an embodiment of the invention, is shown.
- an electronic viewfinder function begins execution which displays video of the scene 102 that is before the camera lens 103 (block 402 ).
- the user can now aim the camera lens so that the desired portion of the scene appears on the touch sensitive screen 104 .
- a camera application or a separate touch screen application running in device 100 detects a multi-finger touch gesture, and stores screen coordinates of its location (block 404 ). These screen coordinates are then translated to refer to a corresponding area of an image of the scene (block 406 ).
- the screen location of the contour 106 is compared to the pixel content of the displayed image, within and near that contour (or underlying the contour), to determine that an object, in this case an image of a man walking, is present in that location.
- the pixels that make up the man thus become the selected or taken area of the image of the scene.
- the device 100 may contract or expand the selected area, in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively while remaining in contact with the touch sensitive screen (block 408 ).
- the user can expand the selected area, to include more pixels of the image, by spreading the index finger and thumb of the right hand 109 , while they are in contact with the screen. This may be reflected by the device 100 enlarging the contour 106 that is being displayed.
- the device 100 detects the spreading movement and in response allocates more pixels to the selected area, for example, equally in all directions.
- the device 100 will contract the selected area (i.e., allocate fewer pixels of the image to define the selected area) in response to detecting that the user's fingers are undergoing a pinching movement, that is, the thumb and index finger are moved closer towards each other.
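The pinch-versus-spread distinction the device must make can be illustrated by comparing the inter-finger distance at the start and end of a two-finger movement: a growing distance is a spread (expand the selected area), a shrinking one is a pinch (contract it). This is a hedged sketch; the function name and threshold are assumptions.

```python
# Hypothetical classifier for two-finger movements: spread if the contact
# points moved apart, pinch if they moved together, none if roughly static.

import math

def classify_two_finger_move(p0_start, p1_start, p0_end, p1_end, threshold=5.0):
    """Each argument is an (x, y) contact point; threshold is in pixels."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    delta = dist(p0_end, p1_end) - dist(p0_start, p1_start)
    if delta > threshold:
        return "spread"   # fingers moved apart -> expand the selected area
    if delta < -threshold:
        return "pinch"    # fingers moved together -> contract the area
    return "none"
```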
- the device 100 next applies an automatic image capture parameter adjustment process that gives priority to the selected area (block 410 ).
- This process may include making automatic adjustments to focus, exposure, and color correction (e.g., white balance) parameters. These are sometimes referred to as 3 A adjustments.
- the adjusted parameters will be applied by the camera when “taking the next picture” of a scene that is before it.
- Focus adjustment may include making movement adjustments to an optical lens system of the device 100 , so as to focus on the selected area of the scene.
- Exposure adjustments include changing the exposure time or integration time of the entire image sensor or portions of the image sensor, based upon characteristics of the selected area including its brightness (rather than that of the entire image).
- adjustments to the color correction parameters may include changing the parameters used to apply a white balance algorithm to a raw image obtained from the image sensor of the camera (that will ultimately become the “picture” of the scene).
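One way exposure could "give priority to the selected area" is to weight the priority region's luminance more heavily than the rest of the frame when metering. The sketch below is purely illustrative: the weighting scheme, the 0.8 weight, and the mid-gray target are assumptions, not the patent's method.

```python
# Hypothetical priority-weighted exposure metering: the mean luminance of the
# selected (priority) pixels dominates the metered value, so the exposure
# time is chosen mainly for the selected area rather than the whole image.

def metered_luminance(pixels, mask, priority_weight=0.8):
    """pixels and mask are equal-length flat lists; mask marks priority pixels."""
    pri = [p for p, m in zip(pixels, mask) if m]
    rest = [p for p, m in zip(pixels, mask) if not m]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return priority_weight * mean(pri) + (1 - priority_weight) * mean(rest)

def exposure_scale(pixels, mask, target=118.0):
    """Multiplicative exposure-time adjustment toward a mid-gray target."""
    lum = metered_luminance(pixels, mask)
    return target / lum if lum > 0 else 1.0
```

With a dark priority area (mean 50) against a bright background (mean 200), the metered value is pulled toward the dark area, so the exposure time is increased.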
- the marker 96 is displayed in a variable state to indicate that one or more parameters are being adjusted.
- the picture is taken when the user gives the shutter release command.
- the device 100 detecting that a shutter release command has been invoked (block 412 ).
- output from the image sensor may not be accepted until after having detected that the multi-finger gesture has been lifted off the touch sensitive screen.
- the camera takes the shot only after the user lifts her fingers off the touch screen.
- the picture is, of course, taken using the image capture parameters that have been adjusted to give priority to the selected area.
- the picture is taken only after expiration of a timer that was set upon the parameters having been adjusted.
- a shutter button may be depressed half-way by the user, to signify that the image capture parameters be adjusted to give priority to the selected area, and then after the parameters have been adjusted, the device 100 waits a predetermined time interval before accepting the user's command to take the shot (e.g., upon the user pressing the shutter button the rest of the way).
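The half-press sequence above (half-press triggers parameter adjustment; the shot is accepted only after a predetermined interval) can be sketched as a tiny timing check. This is an assumed illustration; the class and interval are hypothetical.

```python
# Hypothetical half-press shutter sequence: a half-press records when the
# image capture parameters were adjusted, and the full press is accepted
# only after the predetermined wait interval has expired.

import time

class ShutterButton:
    def __init__(self, wait_interval=0.0):
        self.wait_interval = wait_interval  # seconds to wait after adjustment
        self.adjusted_at = None             # time of the half-press, if any

    def half_press(self, now=None):
        """Half-press: parameters are adjusted; remember when."""
        self.adjusted_at = time.monotonic() if now is None else now

    def full_press(self, now=None):
        """Full press: return True if the shot is accepted."""
        if self.adjusted_at is None:
            return False  # no half-press occurred; nothing was adjusted
        now = time.monotonic() if now is None else now
        return (now - self.adjusted_at) >= self.wait_interval
```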
- the camera function of the device 100 tracks movement of an object in the scene, that has been captured in the selected area of the image, as the object and the device 100 move relative to each other and while the multi-touch gesture is still present on the touch screen.
- the camera could, for example, maintain focus on the moving object only so long as the multi-touch gesture is on the screen, and then at the moment the multi-touch gesture has lifted off the screen, a still picture of the moving object is taken.
- focus would be maintained even after the multi-touch gesture has lifted off, and then the picture of the moving object is taken when a separate virtual or physical shutter button is actuated by the user.
- the above-described process for the use of a multi-touch pinch or spread gesture to define the hint or priority area for calculating exposure parameters, may be extended to complex scenes in which there may be two or more distinct priority areas that the user would like to define. For example, the user may wish to select for priority both a dark shadow portion and a medium tone portion, but not a bright portion, of the scene.
- the device 100 captures and displays live video of the scene 102 to which the camera lens 103 is pointed, using a digital viewfinder function on the touch sensitive screen 104 (block 602 ). While showing this live video, the device 100 monitors the screen 104 for a single finger touch gesture. The initial single finger touch gesture is then detected and screen coordinates of its location are stored, while monitoring the screen (block 604 ).
- the closed path is detected on the touch sensitive screen and coordinates of the closed path are stored (block 606 ).
- the contour 306 is being drawn on the preview area of the touch screen that underlies the user's finger touch.
- the detected closed path may include the location of the detected initial finger gesture.
- the stored coordinates of the closed path are translated to a portion of the image that is being displayed on the screen 104 (block 608 ).
- the user's finger is tracing a closed path, which is depicted by a contour 306 , that is surrounding an image of a man walking in the scene.
- the stored screen coordinates of the closed path are translated to the selected area of the image of the scene, i.e. a graphical object representing the walking man (block 608 ).
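Translating the stored closed-path coordinates to a selected portion of the image amounts to deciding which pixels fall inside the traced contour. A standard way to do that is a ray-casting point-in-polygon test; this sketch is illustrative and not the patent's stated algorithm.

```python
# Hypothetical point-in-polygon test (ray casting): a pixel is inside the
# traced closed path if a ray from it crosses the path an odd number of times.

def point_in_closed_path(x, y, path):
    """path is a list of (x, y) vertices along the finger's traced contour."""
    inside = False
    n = len(path)
    for i in range(n):
        x1, y1 = path[i]
        x2, y2 = path[(i + 1) % n]          # wrap around to close the path
        if (y1 > y) != (y2 > y):            # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                 # crossing lies to the right
                inside = not inside
    return inside
```

Applying this test to each pixel of the displayed image yields the selected area to which the 3A process gives priority.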
- the rest of the process may be as described above, namely, the application of an automatic image capture parameter adjustment process that gives priority to the selected area defined within the contour or path 306 (block 610 ), and taking a picture of the scene using the parameters as they have been adjusted to give priority to the selected area of the scene, in response to detecting a shutter release command (block 612 ).
- the shutter release command may include simply the act of the user lifting off her finger, following the initial single finger touch gesture and the tracing of the contour 306 .
- the user may simply wish to accept the default image capture parameter values available in the device 100 , and so may take the picture by pressing the generic shutter button (e.g., icon 105 or physical menu button 108 of the device 100 ), without first touching the preview area of the touch screen 104 .
- the user can zoom in and zoom out of the preview of the scene, using multi-touch pinch and spread gestures, respectively.
- a multi-finger gesture is detected on the touch screen.
- the preview portion displays either a zoomed-in or zoomed-out version of the scene, in response to the user's fingers undergoing a spreading movement or a pinching movement on the touch screen.
- zooming in or zooming out is performed immediately prior to selecting the priority area of the previewed scene, and just prior to taking a picture of the scene (according to the zoom setting and priority area selected).
- zooming into or out of the scene may be implemented using an optical zoom capability of the device 100 , and/or a digital zoom capability.
- a volume control button of the device 100 can also be used as a shutter release button.
- the portable handheld electronic device 100 has at least two modes of operation, namely a digital camera mode and a mobile telephone mode.
- a digital camera mode a user of the device is to point the lens 103 of the device to a scene and then take a digital picture of the scene using the built-in camera functionality.
- the telephone mode the user is to participate in a wireless telephone call and will hear the call through a built-in receiver 112 (ear speaker phone) of the device.
- the device 100 also has a button 110 , in this case located on a side of the external enclosure of the device 100 as shown (as opposed to its front face, its back face, and its top and bottom sides).
- the button 110 is exposed to the user and in the telephone mode controls the loudness of the receiver 112 .
- the button 110 acts as a generic shutter button.
- the button 110 may be any one of a variety of different types, and generally is actuated by the user in different directions to increase and decrease, respectively, the loudness of the receiver 112 in the telephone mode. Shutter release occurs in this case when the button is actuated by the user in either direction (while in the camera mode).
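The dual-purpose side button can be summarized as a mode-dependent event handler: in telephone mode the two actuation directions raise and lower the receiver loudness, while in camera mode either direction releases the shutter. The function below is a hedged sketch; its name, the mode strings, and the volume range are assumptions.

```python
# Hypothetical handler for the dual-purpose button 110: volume control in
# telephone mode, shutter release (either direction) in camera mode.

def handle_side_button(mode, direction, volume):
    """Return (new_volume, shutter_released) for one button actuation."""
    if mode == "telephone":
        delta = 1 if direction == "up" else -1
        return max(0, min(10, volume + delta)), False  # clamp loudness
    if mode == "camera":
        return volume, True   # either direction releases the shutter
    return volume, False
```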
- shutter button 110 which also acts as the loudness or volume button in telephone mode
- virtual shutter button icon 105 which is positioned immediately below the preview image area of the touch screen 104 .
- the device 100 may be a personal computer, such as a laptop, tablet, or handheld computer.
- the device 100 may be a cellular phone handset, personal digital assistant (PDA), or a multi-function consumer electronic device, such as the IPHONE device.
- the device 100 has a processor 704 that executes instructions to carry out operations associated with the device 100 .
- the instructions may be retrieved from memory 720 and, when executed, control the reception and manipulation of input and output data between various components of device 100 .
- the memory 720 may store an operating system program that is executed by the processor 704 , and one or more application programs are said to run on top of the operating system to perform different functions described below.
- the touch sensitive screen 104 displays a graphical user interface (GUI) to allow a user of the device 100 to interact with various application programs running in the device 100 .
- the GUI displays icons or graphical images that represent application programs, files, and their associated commands on the screen 104 . These may include windows, fields, dialog boxes, menus, buttons, cursors, scrollbars, etc. During operation, the user can select and activate various graphical images to initiate functions associated therewith.
- the touch screen 104 also acts as an input device, to transfer data from the outside world into the device 100 .
- This input is received via, for example, the user's finger touching the surface of the screen 104 .
- the screen 104 and its associated circuitry recognize touches, as well as the position, and perhaps the magnitude and duration, of touches on the surface of the screen 104. This may be done by a gesture detector program 722 that may be executed by the processor 704.
- a dedicated processor may be provided to process touch inputs, in order to reduce demand for a main processor of the system.
- the touch sensing capability of the screen 104 may be based on technology such as capacitive sensing, resistive sensing, or other suitable solid state technologies.
- the touch sensing may be based on single point sensing or multi-point or multi-touch sensing. Single point touch sensing is capable of only distinguishing a single touch, while multi-point sensing is capable of distinguishing multiple touches that occur at the same time.
- the input device aspect of the touch screen 104 may be integrated with its display device.
- the input device may be positioned “on top of” the display device, so that the user can manipulate the GUI directly by, for example, placing her finger on top of an object that is being displayed, in order to control that object. Note that this is different than how a touchpad works, because in a touchpad there is no one-to-one relationship such as this. With touchpads, the input device is not aligned with the display device, and the two are sometimes in different planes altogether. Additional details concerning the touch sensitive screen 104 and operation of the gesture detector 722 to detect user gestures (in this case, single and multi-touch finger gestures) are described in U.S. Patent Application Publication No.
- the gesture detector 722 recognizes the occurrence of gestures and informs one or more software agents running in the device 100 of these gestures and/or what actions to take in response to such gestures.
- a gesture may be identified as a command for performing certain action in an application program, and in particular, a camera application as described below.
- a static gesture does not involve motion
- a dynamic gesture is one that includes motion, e.g. movement of a single or multi-touch point on the screen 104 .
- a continuous gesture is one that is performed in a single stroke in contact with the screen 104
- a segmented gesture is one that is performed in a sequence of distinct steps or strokes, including at least one lift off from the touch screen 104 .
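The four gesture categories above can be distinguished from a recorded touch sequence. Below is a hedged sketch; the sample format and the 5-pixel movement threshold are assumptions, not values from the disclosure.

```python
# Illustrative classifier for the gesture taxonomy described above.
# A sample is (x, y, touching); thresholds are assumptions.

def classify(samples, move_threshold=5.0):
    """Return (static|dynamic, continuous|segmented) for a touch sequence."""
    pts = [(x, y) for x, y, touching in samples if touching]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    # Dynamic: the touch point moved noticeably while in contact.
    dynamic = (max(xs) - min(xs) > move_threshold or
               max(ys) - min(ys) > move_threshold)
    # Segmented: at least one lift-off with touches before and after it.
    flags = [touching for _, _, touching in samples]
    segmented = any(not flags[i] and any(flags[:i]) and any(flags[i + 1:])
                    for i in range(len(flags)))
    return ("dynamic" if dynamic else "static",
            "segmented" if segmented else "continuous")
```

Under this sketch, a 30-pixel drag classifies as dynamic and continuous, while a tap, lift-off, and second tap classifies as static and segmented.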
- the device 100 may recognize a gesture and take an associated action at essentially the same time as the gesture, that is, the gesture and the action simultaneously occur side-by-side rather than being a two-step process.
- the graphical image of the screen moves in lock step with the finger motion.
- an object presented on the display device continuously follows the gesture that is occurring on the input device, that is, there is a one-to-one relationship between the gesture being performed and the object shown on the display portion.
- fingers may spread apart or close together (pinch) in order to cause the object shown on the display to zoom in during the spread and zoom out during the close or pinch.
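The spread/pinch distinction above reduces to the change in distance between the two fingertips. A short illustrative sketch; the function name and the 2-pixel threshold are assumptions.

```python
# Illustrative pinch/spread detector: compares the distance between the
# two fingertips before and after a movement.

import math

def pinch_or_spread(prev, curr, threshold=2.0):
    """prev, curr: ((x1, y1), (x2, y2)) fingertip pairs. Returns
    'spread' (zoom in), 'pinch' (zoom out), or None."""
    d_prev = math.dist(prev[0], prev[1])
    d_curr = math.dist(curr[0], curr[1])
    if d_curr - d_prev > threshold:
        return "spread"              # fingers moving apart
    if d_prev - d_curr > threshold:
        return "pinch"               # fingers closing together
    return None
```

The threshold suppresses jitter so that small, unintentional fingertip movements are not reported as zoom commands.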
- a solid state image sensor 706 is built into the device 100 and may be located at a focal plane of an optical system that includes the lens 103 .
- An optical image of a scene before the camera is formed on the image sensor 706 , and the sensor 706 responds by capturing the scene in the form of a digital image or picture consisting of pixels that will then be stored in memory 720 .
- the image sensor 706 may include a solid state image sensor chip with several options available for controlling how an image is captured. These options are set by image capture parameters that can be adjusted automatically, by the priority (camera) application 728 .
- the priority application 728 can make automatic adjustments, that is, adjustments without specific user input, to focus, exposure, and color correction parameters (sometimes referred to as 3A adjustments) based on a hint or priority portion of the scene that is to be imaged.
- This selected or target area may be computed by the priority application 728 , by translating the stored coordinates of the detected gesture to certain pixel coordinates of a digital image of the scene that is being displayed at the moment of the touch gesture occurring.
- the priority application 728 may contract or expand this selected area in response to receiving an indication from the gesture detector 722 that the user's fingers are undergoing a pinch or spread movement, respectively.
- the priority application 728 will apply an automatic image capture parameter adjustment process that adjusts one or more image capture parameters, to give priority to the selected area for taking a picture of the scene.
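The coordinate translation attributed to the priority application 728 above can be sketched as a simple scale mapping from screen space to image space. Function names, the linear mapping, and the square selection shape are all assumptions for illustration.

```python
# Hypothetical sketch of translating stored gesture coordinates to a
# selected area of the displayed image, as described above.

def screen_to_image(touch_xy, preview_origin, preview_size, image_size):
    """Map a touch in the preview area to image pixel coordinates."""
    px = (touch_xy[0] - preview_origin[0]) * image_size[0] / preview_size[0]
    py = (touch_xy[1] - preview_origin[1]) * image_size[1] / preview_size[1]
    return (int(px), int(py))

def selected_area(center, half=50):
    """A square region of image pixels centered on the gesture location."""
    cx, cy = center
    return (cx - half, cy - half, cx + half, cy + half)

# A touch at (160, 120) in a 320x240 preview of a 2048x1536 image.
center = screen_to_image((160, 120), (0, 0), (320, 240), (2048, 1536))
area = selected_area(center)
```

A pinch or spread would then shrink or grow the `half` extent before the 3A process consumes the area.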
- the device 100 may operate not just in a digital camera mode, but also in a mobile telephone mode. This is enabled by the following components of the device 100 .
- An integrated antenna 708 that is driven and sensed by RF circuitry 710 is used to transmit and receive cellular network communication signals from a nearby base station (not shown).
- a mobile phone application 724 executed by the processor 704 presents mobile telephony options on the touch sensitive screen 104 for the user, such as a virtual telephone keypad with call and end buttons.
- the mobile phone application 724 also controls, at a high level, the two-way conversation in a typical mobile telephone call, by allowing the user to speak into the built-in microphone 714 while at the same time being able to hear the other side of the conversation through the receiver or ear speaker 112.
- the mobile phone application 724 also responds to the user's selection of the receiver volume, by detecting actuation of the physical volume button 110 .
- the processor 704 may include a cellular base band processor that is responsible for much of the digital audio signal processing functions associated with a cellular phone call, including encoding and decoding the voice signals of the participants to the conversation.
- the device 100 may be placed in either the digital camera mode or the mobile telephone mode, in response to, for example, the user actuating a physical menu button 108 and then selecting an appropriate icon on the display device of the touch sensitive screen 104 .
- the mobile phone application 724 controls loudness of the receiver 112 , based on a detected actuation or position of the physical volume button 110 .
- the priority (camera) application 728 responds to actuation of the volume button 110 as if the latter were a physical shutter button (for taking pictures).
- This use of the volume button 110 as a physical shutter button may be an alternative to a soft or virtual shutter button whose icon is simultaneously displayed on the display device of the screen 104 during camera mode (see, e.g. FIG. 3 , where icon 105 may be a generic virtual shutter button (default exposure parameters) and is displayed below the preview portion of the display device of the touch sensitive screen 104 ).
- An embodiment of the invention may be a machine-readable medium having stored thereon instructions which program a processor to perform some of the operations described above. In other embodiments, some of these operations might be performed by specific hardware components that contain hardwired logic. Those operations might alternatively be performed by any combination of programmed computer components and custom hardware components.
- a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to Compact Disc Read-Only Memory (CD-ROMs), Read-Only Memory (ROMs), Random Access Memory (RAM), Erasable Programmable Read-Only Memory (EPROM), and transmission over the Internet.
- the multi-finger touch down may be defined as a set of one or more predetermined patterns detected in the input device of the touch sensitive screen 104 .
- a particular pattern may be defined for the joint tips of the index finger and thumb of the same hand, being pressed against the touch screen, for a certain interval of time.
- a pattern may be defined by the tips of the index finger and thumb being spaced apart from each other and held substantially in that position for a predetermined period of time.
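The touch-down patterns described above amount to checking that two contact points hold their positions for a minimum interval. An illustrative test follows; the sample format, duration, and drift threshold are assumptions.

```python
# Illustrative check for the multi-finger touch-down patterns described
# above: two fingertips held roughly in place for a minimum interval.

import math

def matches_pattern(samples, min_duration=0.3, max_drift=5.0):
    """samples: list of (t, (x1, y1), (x2, y2)) for two touch points.
    True if both points stay within max_drift of their initial positions
    for at least min_duration seconds."""
    if not samples:
        return False
    t0, p1_0, p2_0 = samples[0]
    for _, p1, p2 in samples:
        if math.dist(p1, p1_0) > max_drift or math.dist(p2, p2_0) > max_drift:
            return False
    return samples[-1][0] - t0 >= min_duration
```

The spaced-apart variant of the pattern would add a check that the initial fingertip separation exceeds some minimum distance.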
Abstract
In accordance with some embodiments, a method is performed at a handheld electronic device having a built-in digital camera and a touch sensitive screen. The method includes detecting a multi-finger gesture on the touch sensitive screen, wherein the touch sensitive screen is serving as part of an electronic viewfinder of the camera; storing coordinates of a location corresponding to the detected multi-finger gesture; translating the stored coordinates to a selected area of an image that is captured by the camera and that is being displayed on the touch sensitive screen; contracting or expanding the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, while the detected multi-finger gesture remains in contact with the touch sensitive screen; and applying an automatic image capture parameter adjustment process that gives priority to the selected area.
Description
- This application claims priority to U.S. Provisional Patent App. No. 61/083,455, “Camera Interface in a Portable Handheld Electronic Device,” filed Jul. 24, 2008, which is incorporated by reference herein in its entirety.
- The disclosed embodiments relate generally to portable handheld electronic devices, such as cellular telephone handsets and digital cameras, and more particularly to a user interface having a touch sensitive screen for controlling camera functions.
- Portable handheld electronic devices, such as the IPHONE multifunction device by Apple Inc., have a built-in digital camera, in addition to other functions such as cellular telephony and digital audio and video file playback. The IPHONE device, in particular, has a touch sensitive screen as part of its user interface. The touch screen lets the user select a particular application program to be run, by performing a single finger gesture on the touch sensitive screen. For example, the user can point to (touch) the icon of a particular application, which results in the application being automatically launched in the device. The camera application, in particular, allows the user to navigate amongst previously stored pictures taken using the camera directly on the touch screen. In addition, there is a shutter button icon that can be touched by the user to release the shutter and thereby take a picture of the scene that is before the camera. Other uses of the touch sensitive screen include navigating around a Web page that is being displayed by single finger gestures, and zooming into a displayed Web page by performing a so called multi-finger spread gesture on the touch sensitive screen. The user can also zoom out of the Web page, by performing a multi-finger pinch gesture.
- Several methods for operating a built-in digital camera of a portable, handheld electronic device are described. In some embodiments, a method is performed at a handheld electronic device having a built-in digital camera and a touch sensitive screen. The method includes detecting a multi-finger gesture on the touch sensitive screen, wherein the touch sensitive screen is serving as part of an electronic viewfinder of the camera; storing coordinates of a location corresponding to the detected multi-finger gesture; translating the stored coordinates to a selected area of an image that is captured by the camera and that is being displayed on the touch sensitive screen; contracting or expanding the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, while the detected multi-finger gesture remains in contact with the touch sensitive screen; and applying an automatic image capture parameter adjustment process that gives priority to the selected area. This gives the user finer control of auto focus, auto exposure, and auto white balance (“3A”) adjustments in the camera.
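The steps of the method summarized above can be sketched end to end: store the gesture coordinates, translate them to image pixels, resize the selected area per the pinch or spread, and hand the area to the 3A process. Every name and scale factor below is an illustrative assumption.

```python
# End-to-end sketch of the summarized method. Not the actual
# implementation; a toy model of the described data flow.

def take_priority_picture(gesture_xy, pinch_scale, screen_size, image_size,
                          half=40):
    """gesture_xy: gesture location on the screen; pinch_scale: >1 for a
    spread, <1 for a pinch. Returns the selected image area (l, t, r, b)."""
    # 1. Translate stored screen coordinates to image pixel coordinates.
    cx = gesture_xy[0] * image_size[0] // screen_size[0]
    cy = gesture_xy[1] * image_size[1] // screen_size[1]
    # 2. Contract or expand the selected area per the gesture.
    r = half * pinch_scale
    # 3. The 3A adjustment process would then give priority to this area.
    return (cx - r, cy - r, cx + r, cy + r)

# Spread gesture (scale 2.0) at screen (160, 120) on a 320x240 screen
# previewing a 640x480 image: area centered at image pixel (320, 240).
area = take_priority_picture((160, 120), 2.0, (320, 240), (640, 480))
```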
- In some embodiments, a handheld electronic device is provided which comprises a touch sensitive screen; a detector configured to detect a multi-finger gesture on the touch sensitive screen and store coordinates of a location of the detected gesture; and a digital camera. The digital camera includes an image sensor, a lens to form an optical image on the image sensor, a viewfinder module configured to display on the touch sensitive screen a scene at which the lens is aimed, and a priority module coupled to the detector. The priority module is configured to translate the stored coordinates to a selected area of a digital image of the scene that is being displayed on the touch sensitive screen by the viewfinder module, contract or expand the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, and apply an automatic image capture parameter adjustment process that gives priority to the selected area for taking a picture of the scene. Thus, a multi-touch pinch or spread gesture may define the hint or priority area, for calculating exposure parameters.
- In some embodiments, a method is performed at a handheld electronic device having a built-in digital camera and a touch sensitive screen. The method includes detecting an initial finger gesture by a user on the touch sensitive screen, wherein the touch sensitive screen serves as part of an electronic viewfinder of the camera; storing coordinates of the initial finger gesture; detecting a closed path on the touch sensitive screen that includes the location of the detected initial finger gesture, wherein the user's finger moves while remaining in contact with the touch sensitive screen to define the closed path, and storing coordinates of the closed path. The method also includes translating the stored coordinates of the closed path to a selected portion of an image that is captured by the camera and that is being displayed on the touch sensitive screen; and applying an automatic image capture parameter adjustment process that gives priority to the selected portion.
- In some embodiments, an apparatus is provided, which comprises a handheld electronic device configured to operate at least in a digital camera mode and a mobile telephone mode. The digital camera mode is configured to permit a user of the apparatus to take a digital picture of a scene, while the mobile telephone mode is configured to permit the user of the apparatus to participate in a wireless telephone call and hear the call through a built-in receiver of the apparatus. Further, the apparatus has a button exposed to the user that controls loudness of the built-in receiver when the apparatus is operating in the mobile telephone mode, and that acts as a shutter button when the apparatus is operating in the digital camera mode.
- Other embodiments are also described.
- The above summary does not include an exhaustive list of all aspects of the present invention. It is contemplated that the invention includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the claims filed with the application. Such combinations may have particular advantages not specifically recited in the above summary.
- The embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” embodiment of the invention in this disclosure are not necessarily to the same embodiment, and they mean at least one.
- FIG. 1 shows a portable handheld device having a built-in digital camera and a touch sensitive screen, in the hands of its user undergoing a single finger gesture during a still image capture process.
- FIG. 2 is a flow diagram of operations in the electronic device during a still image capture process, in accordance with FIG. 1.
- FIG. 3 shows the portable handheld electronic device undergoing a multi-finger gesture during a still image capture process.
- FIG. 4 is a flow diagram of operations in the electronic device during a still image capture process, in accordance with FIG. 3.
- FIG. 5 illustrates another embodiment of the invention, where the user draws a polygon through a single finger touch gesture, to define the priority area for image capture.
- FIG. 6 is a flow diagram of still image capture in the electronic device, in accordance with the embodiment of FIG. 5.
- FIG. 7 shows a block diagram of an example portable handheld multifunction device in which an embodiment of the invention may be implemented.
- In this section several preferred embodiments of this invention are explained with reference to the appended drawings. Whenever the shapes, relative positions and other aspects of the parts described in the embodiments are not clearly defined, the scope of the invention is not limited only to the parts shown, which are meant merely for the purpose of illustration.
-
FIG. 1 shows a portable handheld electronic device 100 having a built-in digital camera and a touch sensitive screen 104 in the hand of its user, undergoing a finger gesture during a still image capture process. In this example, the portable device 100 is shown while it is held in the user's left hand 107, and the user's right hand 109 is making the finger gesture on the touch screen. The device 100 may be an IPHONE device by Apple Inc., of Cupertino, Calif. Alternatively, it could be any other portable handheld electronic device that has a built-in digital camera and a touch sensitive screen. The built-in digital camera includes a lens 103 located in this example on the back face of the device 100. The lens may be a fixed optical lens system or it may have focus and optical zoom capability. Although not depicted in FIG. 1, inside the device 100 are an electronic image sensor and associated hardware circuitry and running software that can capture a digital image of a scene 102 that is before the lens 103. - The digital camera functionality of the
device 100 includes an electronic or digital viewfinder (also referred to as a preview function). The viewfinder displays live, captured video of the scene 102 that is before the camera, on a portion of the touch sensitive screen 104 as shown. In this case, the digital camera also includes a soft or virtual shutter button whose icon 105 is displayed by the screen 104, directly below the viewfinder image area. As an alternative or in addition, a physical shutter button may be implemented in the device 100. The device 100 includes all of the needed circuitry and/or software for implementing the digital camera functions of the electronic viewfinder (726, FIG. 7), shutter release, and automatic image capture parameter adjustment as described below. - In
FIG. 1, the user performs a single-finger gesture on the touch sensitive screen 104 as shown. In this example, the finger gesture is formed by the user's right hand 109 (although it could just as well be the user's left hand 107). The user positions the single-finger touch gesture on a preview portion of the touch screen. The device 100 has detected this touch down and has automatically drawn a marker 96 (in this case, a closed contour that has a box shape), centered around the location of the touch down. The user then moves her right hand 109 around the preview portion of the touch screen, to a location of the image of the scene 102 that corresponds to an object in the scene (or some portion of the scene) to which priority should be given when the digital camera adjusts the image capture parameters in preparation for taking a picture of the scene. For example, the user may move the marker 96 from up above the mountains and the trees down towards a location near the ground or where a man is walking. After the marker has been dragged to the desired portion of the scene where the user wants the camera to give priority, the user may lift off her finger gesture, which in turn signals the camera to accept the final location of the marker and the underlying portion of the image as the priority area of the scene. Once the user has finalized the selection of this priority area, she can command the digital camera to take a picture, after adjusting the image capture parameters to give priority to the selected area. This may be done by, for example, lifting her finger off the touch sensitive display screen, which not only finalizes the location of the hint area but also automatically signals the device to take the picture after adjusting the parameters. A flow diagram of operations for taking the digital picture, in accordance with the above, is shown in FIG. 2. - Referring now to
FIG. 2, after having powered on the device 100 and placed it in digital camera mode, a viewfinder function begins execution which displays video of the scene 102 that is before the camera lens 103 (block 22). The user aims the camera lens so that the desired portion of the scene appears on the preview portion of the screen 104. While monitoring the screen, a camera application (or a touch screen application) running in the device 100 detects a single-finger touch gesture and stores screen coordinates of its location (block 24). A marker 96 is then automatically displayed around the screen location of the touch gesture (block 26). The marker 96 is moved around the preview portion of the touch screen, in lock step with the user moving her finger touch gesture along the surface of the touch screen (block 28). An area of the image of the scene being shown in the preview portion and that underlies the final location of the marker is defined to be an area selected by the user for priority (block 29). This priority area may be finalized, for example, in response to the user lifting off her finger. The priority area of the image may be a fixed chunk of pixels that are about coextensive with the boundary of the marker 96. Alternatively, the priority area may be an object in the scene located at or near the marker 96, as detected by the camera application using digital image processing techniques. - Once the selected area has been determined, an automatic image capture parameter adjustment process is applied by the
device 100 to give priority to the selected area (block 30). Additional details of this process will be explained below. Once the parameters have been adjusted in block 30, the picture can be taken, for example, when the user gives the shutter release command (block 32). Several ways of defining the shutter release command are also described below. Thus, the process described above gives the user finer control of picture-taking adjustments. - In some embodiments, during the automatic image capture parameter adjustment process, the
marker 96 is displayed in a variable state to indicate that one or more parameters are being adjusted. - For example, in some embodiments, the
marker 96 is displayed in an alternating sequence of colors, such as white, blue, white, blue, while the automatic image capture parameter adjustment process sets priority to the selected area (e.g., the marker 96 changes color while the camera is focusing). In some embodiments, the display of marker 96 includes an animation of the boundary of the marker oscillating or “wiggling” on screen while the automatic image capture parameter adjustment process gives priority to the selected area under the location of the marker. - In some embodiments, after the automatic image capture parameter adjustment process is completed, display of
marker 96 is terminated. - In
FIG. 3, another embodiment of the invention is shown where the user defines the priority area, this time by a multi-finger gesture. In this example, the multi-finger gesture is also formed by the user's right hand 109 (although it could just as well be the user's left hand 107, while the device is held by the user's right hand 109). In particular, the thumb and index finger are brought close to each other or touch each other, simultaneously with their tips being in contact with the surface of the screen 104 to create two contact points thereon. The user positions this multi-touch gesture, namely the two contact points, at a location of the image of the scene 102 that corresponds to an object in the scene (or portion of the scene) to which priority should be given when the digital camera adjusts the image capture parameters in preparation for taking a picture of the scene. In this example, the user has selected the location where a person appears between a mountainside in the background and a tree in the foreground. - In response to detecting the multi-touch finger gesture, the
device 100 may cause a contour 106, in this example the outline of a box, to be displayed on the screen 104, around the location of the detected multi-finger gesture. The contour 106 is associated, e.g. by software running in the device 100, with a taken or selected priority area of the image (to which priority will be given in the image capture parameter adjustment process). The user can then contract or expand the size of the priority area, by making a pinching movement or a spreading movement, respectively, with the thumb and index finger of her right hand 109 while the fingertips remain in contact with the touch sensitive screen 104. The device 100 has the needed hardware and software to distinguish between a pinching movement and a spreading movement, and appropriately contracts or expands the size of the priority area. - Once the user has finalized the selected area to which priority is to be given, she can command the digital camera to take a picture after adjusting the image capture parameters to give priority to the selected area. This may be done by, for example, lifting her fingers off the touch
sensitive display screen 104 and then actuating the shutter release button (e.g., touching and then lifting off the soft shutter button icon 105). If instead the user would like default image capture parameter values to be used, then she would simply actuate the generic shutter button icon 105 without first touching the preview of the scene that is being displayed. Several alternatives to this process will now be described. - Turning now to
FIG. 4, a flow diagram of operations for taking a digital picture using the device 100, in accordance with an embodiment of the invention, is shown. After having powered on the device 100 and placed it in digital camera mode, an electronic viewfinder function begins execution which displays video of the scene 102 that is before the camera lens 103 (block 402). The user can now aim the camera lens so that the desired portion of the scene appears on the touch sensitive screen 104. While monitoring the screen, a camera application (or a separate touch screen application) running in the device 100 detects a multi-finger touch gesture, and stores screen coordinates of its location (block 404). These screen coordinates are then translated to refer to a corresponding area of an image of the scene (block 406). Thus, for example, referring back to FIG. 3, the screen location of the contour 106 is compared to the pixel content of the displayed image, within and near that contour (or underlying the contour), to determine that an object, in this case an image of a man walking, is present in that location. The pixels that make up the man thus become the selected or taken area of the image of the scene. - Next, the
device 100 may contract or expand the selected area, in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, while remaining in contact with the touch sensitive screen (block 408). Thus, in the example of FIG. 3, the user can expand the selected area, to include more pixels of the image, by spreading the index finger and thumb of the right hand 109, while they are in contact with the screen. This may be reflected by the device 100 enlarging the contour 106 that is being displayed. The device 100 detects the spreading movement and in response allocates more pixels to the selected area, for example, equally in all directions. In a similar manner, the device 100 will contract the selected area (i.e., allocate fewer pixels of the image to define the selected area) in response to detecting that the user's fingers are undergoing a pinching movement, that is, the thumb and index finger are moved closer towards each other. - With the selected area having been determined in this manner (block 408), the
device 100 next applies an automatic image capture parameter adjustment process that gives priority to the selected area (block 410). This process may include making automatic adjustments to focus, exposure, and color correction (e.g., white balance) parameters. These are sometimes referred to as 3A adjustments. The adjusted parameters will be applied by the camera when “taking the next picture” of a scene that is before it. Focus adjustment may include making movement adjustments to an optical lens system of the device 100, so as to focus on the selected area of the scene. Exposure adjustments include changing the exposure time or integration time of the entire image sensor or portions of the image sensor, based upon characteristics of the selected area including its brightness (rather than that of the entire image). Similarly, adjustments to the color correction parameters may include changing the parameters used to apply a white balance algorithm to a raw image obtained from the image sensor of the camera (that will ultimately become the “picture” of the scene). As described above, in some embodiments, during the automatic image capture parameter adjustment process, the marker 96 is displayed in a variable state to indicate that one or more parameters are being adjusted. - Once the parameters have been adjusted in
block 410, the picture is taken when the user gives the shutter release command. There are several ways of actually completing the process of taking the picture, in response to thedevice 100 detecting that a shutter release command has been invoked (block 412). For example, output from the image sensor may not be accepted until after having detected that the multi-finger gesture has been lifted off the touch sensitive screen. In other words, the camera takes the shot only after the user lifts her fingers off the touch screen. The picture is, of course, taken using the image capture parameters that have been adjusted to give priority to the selected area. - In another embodiment, the picture is taken only after expiration of a timer that was set upon the parameters having been adjusted. For example, a shutter button may be depressed half-way by the user, to signify that the image capture parameters be adjusted to give priority to the selected area, and then after the parameters have been adjusted, the
device 100 waits a predetermined time interval before accepting the user's command to take the shot (e.g., upon the user pressing the shutter button the rest of the way). - In another embodiment, the camera function of the
device 100 tracks movement of an object in the scene that has been captured in the selected area of the image, as the object and the device 100 move relative to each other and while the multi-touch gesture is still present on the touch screen. The camera could, for example, maintain focus on the moving object only so long as the multi-touch gesture is on the screen, and then at the moment the multi-touch gesture has lifted off the screen, a still picture of the moving object is taken. Alternatively, focus would be maintained even after the multi-touch gesture has lifted off, and then the picture of the moving object is taken when a separate virtual or physical shutter button is actuated by the user. - The above-described process, for the use of a multi-touch pinch or spread gesture to define the hint or priority area for calculating exposure parameters, may be extended to complex scenes in which there may be two or more distinct priority areas that the user would like to define. For example, the user may wish to select for priority both a dark shadow portion and a medium tone portion, but not a bright portion, of the scene.
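The exposure side of the 3A adjustment described above can be sketched briefly: meter only the user-selected area and scale the sensor integration time so that area, rather than the whole frame, lands near mid-gray. The Python below is a minimal illustration only; the function name, the mid-gray target of 118, and the plain-list frame representation are assumptions, not details from the patent.

```python
def exposure_gain_for_region(luma, region, target=118):
    """Return a multiplier for the sensor integration time so that the
    mean luminance of the selected region is pulled toward `target`.

    luma   -- 2-D list of per-pixel luminance values (0..255)
    region -- (x, y, w, h) rectangle, in pixel coordinates of the image
    """
    x, y, w, h = region
    samples = [luma[row][col]
               for row in range(y, y + h)
               for col in range(x, x + w)]
    mean = sum(samples) / len(samples)
    # A dark selected area (mean < target) yields a gain > 1, i.e. a
    # longer exposure, even if the rest of the frame is bright.
    return target / mean
```

Under these assumptions, a dark shadow region with mean luminance 59 produces a gain of 2.0, doubling the integration time regardless of the frame-wide average.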
- Referring now to
FIGS. 5 and 6, another embodiment of the invention is now described for the portable handheld electronic device having a built-in digital camera and touch sensitive screen. In this embodiment, the user can draw an arbitrarily sized and shaped contour 306 on the touch screen 104, around the priority or hint area of the scene that is being displayed by the viewfinder. Referring now to FIG. 6, the device 100 captures and displays live video of the scene 102 to which the camera lens 103 is pointed, using a digital viewfinder function on the touch sensitive screen 104 (block 602). While showing this live video, the device 100 monitors the screen 104 for a single finger touch gesture. The initial single finger touch gesture is then detected and screen coordinates of its location are stored, while monitoring the screen (block 604). Next, as the user's finger moves, while remaining in contact with the screen, to define a closed path, the closed path is detected on the touch sensitive screen and coordinates of the closed path are stored (block 606). Essentially simultaneously, the contour 306 is being drawn on the preview area of the touch screen that underlies the user's finger touch. The detected closed path may include the location of the detected initial finger gesture. The stored coordinates of the closed path are translated to a portion of the image that is being displayed on the screen 104 (block 608). Thus, as seen in FIG. 5, the user's finger is tracing a closed path, which is depicted by a contour 306, that is surrounding an image of a man walking in the scene. The stored screen coordinates of the closed path are translated to the selected area of the image of the scene, i.e. a graphical object representing the walking man (block 608).
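Translating the traced closed path into a selected area of the image (blocks 606-608) amounts to deciding which pixels fall inside the contour. One standard way to do that is a ray-casting point-in-polygon test; the sketch below is illustrative Python under that assumption, not the device's actual implementation.

```python
def inside_closed_path(pt, path):
    """Ray-casting test: True if point pt lies inside the closed path.

    pt   -- (x, y) pixel coordinate
    path -- list of (x, y) vertices of the traced contour, in order;
            the path is treated as closed (last vertex joins the first)
    """
    x, y = pt
    inside = False
    for i in range(len(path)):
        x1, y1 = path[i]
        x2, y2 = path[(i + 1) % len(path)]
        # Count edges that a horizontal ray from pt would cross.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Applying this test to every pixel (or a subsampled grid) of the previewed image yields the mask of the selected area, e.g. the region surrounding the walking man in FIG. 5.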
The rest of the process may be as described above, namely, the application of an automatic image capture parameter adjustment process that gives priority to the selected area defined within the contour or path 306 (block 610), and taking a picture of the scene using the parameters as they have been adjusted to give priority to the selected area of the scene, in response to detecting a shutter release command (block 612). - As above, the shutter release command may include simply the act of the user lifting off her finger, following the initial single finger touch gesture and the tracing of the
contour 306. As an alternative, the user may simply wish to accept the default image capture parameter values available in the device 100, and so may take the picture by pressing the generic shutter button (e.g., icon 105 or physical menu button 108 of the device 100), without first touching the preview area of the touch screen 104. - In another embodiment of the invention, the user can zoom in and zoom out of the preview of the scene, using multi-touch pinch and spread gestures, respectively. In other words, with the electronic view finder function of the camera running, so that the
touch screen 104 has a preview portion that is displaying live video of the scene to which the camera lens is pointed, a multi-finger gesture is detected on the touch screen. The preview portion then displays either a zoomed-in or zoomed-out version of the scene, in response to the user's fingers undergoing a spreading movement or a pinching movement on the touch screen. Thereafter, the user can lift off the multi-touch gesture and reinitiate a second multi-touch gesture or a single touch gesture, that selects an area of the zoomed in (or zoomed out) preview for purposes of 3A adjustment. In other words, zooming in or zooming out is performed immediately prior to selecting the priority area of the previewed scene, and just prior to taking a picture of the scene (according to the zoom setting and priority area selected). Note that zooming into or out of the scene may be implemented using an optical zoom capability of the device 100, and/or a digital zoom capability. - In yet another embodiment of the invention, a volume control button of the
device 100 can also be used as a shutter release button. Referring back to FIG. 1, the portable handheld electronic device 100 has at least two modes of operation, namely a digital camera mode and a mobile telephone mode. In the camera mode, a user of the device is to point the lens 103 of the device to a scene and then take a digital picture of the scene using the built-in camera functionality. In the telephone mode, the user is to participate in a wireless telephone call and will hear the call through a built-in receiver 112 (ear speaker phone) of the device. The device 100 also has a button 110, in this case, located on a side of the external enclosure of the device 100 as shown (as opposed to its front face, its back face, and its top and bottom sides). The button 110 is exposed to the user and in the telephone mode controls the loudness of the receiver 112. In the camera mode, however, the button 110 acts as a generic shutter button. The button 110 may be any one of a variety of different types, and generally is actuated by the user in different directions to increase and decrease, respectively, the loudness of the receiver 112 in the telephone mode. Shutter release occurs in this case when the button is actuated by the user in either direction (while in the camera mode). Thus, in camera mode, there may be two generic shutter buttons available for the user where either one can be used to take a picture, namely the shutter button 110 (which also acts as the loudness or volume button in telephone mode), and the virtual shutter button icon 105 which is positioned immediately below the preview image area of the touch screen 104. - Turning now to
FIG. 7, a block diagram of an example portable, handheld electronic device 100 is shown, in accordance with an embodiment of the invention. The device 100 may be a personal computer, such as a laptop, tablet, or handheld computer. Alternatively, the device 100 may be a cellular phone handset, personal digital assistant (PDA), or a multi-function consumer electronic device, such as the IPHONE device. - The
device 100 has a processor 704 that executes instructions to carry out operations associated with the device 100. The instructions may be retrieved from memory 720 and, when executed, control the reception and manipulation of input and output data between various components of device 100. Although not shown, the memory 720 may store an operating system program that is executed by the processor 704, and one or more application programs are said to run on top of the operating system to perform different functions described below. The touch sensitive screen 104 displays a graphical user interface (GUI) to allow a user of the device 100 to interact with various application programs running in the device 100. The GUI displays icons or graphical images that represent application programs, files, and their associated commands on the screen 104. These may include windows, fields, dialog boxes, menus, buttons, cursors, scrollbars, etc. During operation, the user can select and activate various graphical images to initiate functions associated therewith. - The
touch screen 104 also acts as an input device, to transfer data from the outside world into the device 100. This input is received via, for example, the user's finger touching the surface of the screen 104. The screen 104 and its associated circuitry recognizes touches, as well as the position and perhaps the magnitude of touches and their duration on the surface of the screen 104. These may be done by a gesture detector program 722 that may be executed by the processor 704. Note that a dedicated processor may be provided to process touch inputs, in order to reduce demand for a main processor of the system. The touch sensing capability of the screen 104 may be based on technology such as capacitive sensing, resistive sensing, or other suitable solid state technologies. The touch sensing may be based on single point sensing or multi-point or multi-touch sensing. Single point touch sensing is capable of only distinguishing a single touch, while multi-point sensing is capable of distinguishing multiple touches that occur at the same time. - The input device aspect of the
touch screen 104 may be integrated with its display device. The input device may be positioned “on top of” the display device, so that the user can manipulate the GUI directly by, for example, placing her finger on top of an object that is being displayed, in order to control that object. Note that this is different than how a touchpad works, because in a touchpad there is no one-to-one relationship such as this. With touchpads, the input device is not aligned with the display device, and the two are sometimes in different planes altogether. Additional details concerning the touch sensitive screen 104 and operation of the gesture detector 722 to detect user gestures (in this case, single and multi-touch finger gestures) are described in U.S. Patent Application Publication No. 2006/0026535, entitled “Mode-Based Graphical User Interfaces for Touch Sensitive Input Devices”. The gesture detector 722 recognizes the occurrence of gestures and informs one or more software agents running in the device 100 of these gestures and/or what actions to take in response to such gestures. A gesture may be identified as a command for performing a certain action in an application program, and in particular, a camera application as described below. - A wide range of different gestures may be defined and used. For example, a static gesture does not involve motion, while a dynamic gesture is one that includes motion, e.g. movement of a single or multi-touch point on the
screen 104. A continuous gesture is one that is performed in a single stroke in contact with the screen 104, whereas a segmented gesture is one that is performed in a sequence of distinct steps or strokes, including at least one lift off from the touch screen 104. In addition, the device 100 may recognize a gesture and take an associated action at essentially the same time as the gesture, that is, the gesture and the action simultaneously occur side-by-side rather than being a two-step process. For example, during a scrolling gesture, the graphical image of the screen moves in lock step with the finger motion. In another example, an object presented on the display device continuously follows the gesture that is occurring on the input device, that is, there is a one-to-one relationship between the gesture being performed and the object shown on the display portion. For example, during a zooming gesture, fingers may spread apart or close together (pinch) in order to cause the object shown on the display to zoom in during the spread and zoom out during the close or pinch. These are controlled by the processor 704 executing instructions that may be part of the gesture detector program 722, or another application program such as a priority (camera) application program 728. - Still referring to
FIG. 7, camera functionality of the device 100 may be enabled by the following components. A solid state image sensor 706 is built into the device 100 and may be located at a focal plane of an optical system that includes the lens 103. An optical image of a scene before the camera is formed on the image sensor 706, and the sensor 706 responds by capturing the scene in the form of a digital image or picture consisting of pixels that will then be stored in memory 720. The image sensor 706 may include a solid state image sensor chip with several options available for controlling how an image is captured. These options are set by image capture parameters that can be adjusted automatically, by the priority (camera) application 728. The priority application 728 can make automatic adjustments, that is without specific user input, to focus, exposure and color correction parameters (sometimes referred to as 3A adjustments) based on a hint or priority portion of the scene that is to be imaged. This selected or target area may be computed by the priority application 728, by translating the stored coordinates of the detected gesture to certain pixel coordinates of a digital image of the scene that is being displayed at the moment of the touch gesture occurring. The priority application 728 may contract or expand this selected area in response to receiving an indication from the gesture detector 722 that the user's fingers are undergoing a pinch or spread movement, respectively. Once the selected area (hint) has been finalized, the priority application 728 will apply an automatic image capture parameter adjustment process that adjusts one or more image capture parameters, to give priority to the selected area for taking a picture of the scene. - Still referring to
FIG. 7, the device 100 may operate not just in a digital camera mode, but also in a mobile telephone mode. This is enabled by the following components of the device 100. An integrated antenna 708 that is driven and sensed by RF circuitry 710 is used to transmit and receive cellular network communication signals from a nearby base station (not shown). A mobile phone application 724 executed by the processor 704 presents mobile telephony options on the touch sensitive screen 104 for the user, such as a virtual telephone keypad with call and end buttons. The mobile phone application 724 also controls at a high level the two-way conversation in a typical mobile telephone call, by allowing the user to speak into the built-in microphone 714 while at the same time being able to hear the other side of the conversation through the receiver or ear speaker 112. The mobile phone application 724 also responds to the user's selection of the receiver volume, by detecting actuation of the physical volume button 110. Although not shown, the processor 704 may include a cellular base band processor that is responsible for much of the digital audio signal processing functions associated with a cellular phone call, including encoding and decoding the voice signals of the participants to the conversation. - The
device 100 may be placed in either the digital camera mode or the mobile telephone mode, in response to, for example, the user actuating a physical menu button 108 and then selecting an appropriate icon on the display device of the touch sensitive screen 104. In the telephone mode, the mobile phone application 724 controls loudness of the receiver 112, based on a detected actuation or position of the physical volume button 110. In the camera mode, the priority (camera) application 728 responds to actuation of the volume button 110 as if the latter were a physical shutter button (for taking pictures). This use of the volume button 110 as a physical shutter button may be an alternative to a soft or virtual shutter button whose icon is simultaneously displayed on the display device of the screen 104 during camera mode (see, e.g. FIG. 3, where icon 105 may be a generic virtual shutter button (default exposure parameters) and is displayed below the preview portion of the display device of the touch sensitive screen 104). - An embodiment of the invention may be a machine-readable medium having stored thereon instructions which program a processor to perform some of the operations described above. In other embodiments, some of these operations might be performed by specific hardware components that contain hardwired logic. Those operations might alternatively be performed by any combination of programmed computer components and custom hardware components.
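The mode-dependent behavior of the volume button 110 can be summarized as a small dispatch routine: the same physical press increments or decrements receiver loudness in telephone mode, but releases the shutter (in either direction) in camera mode. This is a minimal sketch; the class and attribute names are illustrative, not taken from the patent.

```python
class SideButton:
    """Route the physical side button by device mode: volume control in
    telephone mode, generic shutter release in camera mode."""

    def __init__(self):
        self.mode = "telephone"   # or "camera"
        self.volume = 5           # receiver loudness, clamped to 0..10
        self.pictures_taken = 0

    def press(self, direction):
        """direction is +1 (button up) or -1 (button down)."""
        if self.mode == "camera":
            # Actuation in either direction acts as shutter release.
            self.pictures_taken += 1
        else:
            self.volume = max(0, min(10, self.volume + direction))
```

In camera mode this side button coexists with the virtual shutter icon 105 on the touch screen, so either control can take the picture.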
- A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, Compact Disc Read-Only Memory (CD-ROM), Read-Only Memory (ROM), Random Access Memory (RAM), Erasable Programmable Read-Only Memory (EPROM), and a transmission over the Internet.
- The invention is not limited to the specific embodiments described above. For example, the multi-finger touch down may be defined as a set of one or more predetermined patterns detected in the input device of the touch
sensitive screen 104. For example, a particular pattern may be defined for the joint tips of the index finger and thumb of the same hand, being pressed against the touch screen, for a certain interval of time. Alternatively, a pattern may be defined by the tips of the index finger and thumb being spaced apart from each other and held substantially in that position for a predetermined period of time. There are numerous other variations to different aspects of the invention described above, which in the interest of conciseness have not been provided in detail. Accordingly, other embodiments are within the scope of the claims.
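The predetermined touch-down patterns described above (fingertips pressed jointly, or held spaced apart, each for a certain interval of time) can be distinguished by two simple measurements: hold duration and inter-point distance. The Python below is an illustrative classifier only; the distance and duration thresholds are assumed values, since the patent does not specify them.

```python
def classify_two_finger_pattern(p1, p2, hold_ms,
                                joint_dist_px=40, min_hold_ms=300):
    """Classify a held two-finger touch-down as one of two patterns:
    'joint' (index finger and thumb tips pressed together) or 'spaced'
    (tips held apart). Returns None if the hold was too short.

    p1, p2  -- (x, y) screen coordinates of the two touch points
    hold_ms -- how long the points stayed substantially in place
    """
    if hold_ms < min_hold_ms:
        return None  # not held long enough to count as a pattern
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return "joint" if dist <= joint_dist_px else "spaced"
```

A gesture detector such as program 722 could map each classified pattern to a different camera command.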
Claims (21)
1. A method, comprising:
at a handheld electronic device having a built-in digital camera and a touch sensitive screen:
detecting a multi-finger gesture on the touch sensitive screen, wherein the touch sensitive screen is serving as part of an electronic viewfinder of the camera;
storing coordinates of a location corresponding to the detected multi-finger gesture;
translating the stored coordinates to a selected area of an image that is captured by the camera and that is being displayed on the touch sensitive screen;
contracting or expanding the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, while the detected multi-finger gesture remains in contact with the touch sensitive screen; and
applying an automatic image capture parameter adjustment process that gives priority to the selected area.
2. The method of claim 1 , wherein applying the automatic image capture parameter adjustment process includes making automatic adjustments to one or more parameters selected from the group consisting of focus, exposure, and color correction, and wherein the camera is configured to apply the automatic adjustments when taking a picture.
3. The method of claim 1 , further comprising:
displaying on the touch sensitive screen a contour around the location of the detected multi-finger gesture; and
associating the contour with the selected area of the image.
4. The method of claim 1 , further comprising:
tracking movement of an object captured in the selected area of the image, as the object and the device move relative to each other.
5. The method of claim 1 , further comprising:
taking a picture, using image capture parameters that have been adjusted to give priority to the selected area in response to detecting that the multi-finger gesture has lifted off the touch sensitive screen.
6. The method of claim 1 , further comprising:
taking a picture, using image capture parameters that have been adjusted to give priority to the selected area, in response to an expiration of a timer that was set upon the parameters having been adjusted.
7. The method of claim 1 , further comprising:
detecting a further multi-finger gesture on the touch sensitive screen;
storing coordinates of a location corresponding to the further multi-finger gesture; and
translating the stored coordinates to a further selected area of the image that is captured by the camera and that is being displayed on the touch sensitive screen,
wherein the automatic image capture parameter adjustment process is applied to give priority to both the selected area and the further selected area.
8. The method of claim 7 , wherein the selected area and the further selected area are two distinct user-defined priority areas.
9. The method of claim 8 , wherein the selected area corresponds to a dark shadow area of the image to be captured by the camera, and the further selected area corresponds to a medium tone area of the image to be captured by the camera.
10. The method of claim 1 , further comprising:
while the touch sensitive screen is displaying a scene at which the camera is pointed:
detecting a multi-finger gesture made by a user on the touch sensitive screen; and
zooming into or out of the scene, in response to detecting the user's fingers undergoing a spreading movement or a pinching movement on the touch sensitive screen.
11. The method of claim 10 , wherein the zooming into or out of the scene comprises performing a digital zoom.
12. The method of claim 10 , wherein the zooming into or out of the scene comprises performing an optical zoom.
13. The method of claim 1 , further comprising:
while the automatic image capture parameter adjustment process sets priority to the selected area, displaying the marker in a variable state to indicate that one or more parameters are being adjusted.
14. A handheld electronic device, comprising:
a touch sensitive screen;
a detector configured to detect a multi-finger gesture on the touch sensitive screen and store coordinates of a location of the detected gesture; and
a digital camera, including:
an image sensor,
a lens to form an optical image on the image sensor,
a viewfinder module configured to display on the touch sensitive screen a scene at which the lens is aimed, and
a priority module coupled to the detector, wherein the priority module is configured to:
translate the stored coordinates to a selected area of a digital image of the scene that is being displayed on the touch sensitive screen by the viewfinder module,
contract or expand the selected area in response to the user's fingers undergoing a pinching movement or a spreading movement, respectively, and
apply an automatic image capture parameter adjustment process that gives priority to the selected area for taking a picture of the scene.
15. The handheld electronic device of claim 14 , wherein the application of the automatic image capture parameter adjustment process includes making at least one automatic adjustment to one or more parameters selected from the group consisting of focus, exposure, and color correction, and wherein the camera is configured to apply the automatic adjustments when taking a picture.
16. The handheld electronic device of claim 14 , wherein the digital camera is configured to:
display a graphical object on the touch sensitive screen that is associated with a virtual shutter button of the digital camera; and
take the picture of the scene in accordance with image capture parameters, which are set to a default setting when the virtual shutter button is actuated, and adjusted to give priority to an area of the scene selected by the multi-finger touch gesture.
17. The handheld electronic device of claim 16 , wherein the graphical object representing the virtual shutter button is displayed below the preview.
18. A method, comprising:
at a handheld electronic device having a built-in digital camera and a touch sensitive screen:
detecting an initial finger gesture by a user on the touch sensitive screen, wherein the touch sensitive screen serves as part of an electronic viewfinder of the camera;
storing coordinates of the initial finger gesture;
detecting a closed path on the touch sensitive screen that includes the location of the detected initial finger gesture, wherein the user's finger moves while remaining in contact with the touch sensitive screen to define the closed path;
storing coordinates of the closed path;
translating the stored coordinates of the closed path to a selected portion of an image that is captured by the camera and that is being displayed on the touch sensitive screen; and
applying an automatic image capture parameter adjustment process that gives priority to the selected portion.
19. An apparatus, comprising:
a handheld electronic device configured to operate at least in a digital camera mode and a mobile telephone mode, wherein:
the digital camera mode is configured to permit a user of the apparatus to take a digital picture of a scene, and
the mobile telephone mode is configured to permit the user of the apparatus to participate in a wireless telephone call and hear the call through a built-in receiver of the apparatus,
wherein the apparatus has a button exposed to the user that alternatively:
controls loudness of the built-in receiver when the apparatus is operating in the mobile telephone mode, and
acts as a shutter button when the apparatus is operating in the digital camera mode.
20. The apparatus of claim 19 , wherein:
the button is configured to be actuated by the user in two different directions to increase and decrease, respectively, loudness in the mobile telephone mode; and
shutter release occurs when the button is actuated by the user in either one of said directions in the digital camera mode.
21. The apparatus of claim 19 , further comprising:
a built-in touch sensitive screen that serves as part of an electronic viewfinder in the digital camera mode, and
wherein the digital camera mode is configured to display a shutter release button on the touch sensitive screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/508,534 US20100020221A1 (en) | 2008-07-24 | 2009-07-23 | Camera Interface in a Portable Handheld Electronic Device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US8345508P | 2008-07-24 | 2008-07-24 | |
US12/508,534 US20100020221A1 (en) | 2008-07-24 | 2009-07-23 | Camera Interface in a Portable Handheld Electronic Device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20100020221A1 true US20100020221A1 (en) | 2010-01-28 |
Family
ID=41568300
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/479,705 Active 2030-03-21 US8237807B2 (en) | 2008-07-24 | 2009-06-05 | Image capturing device with touch screen for adjusting camera settings |
US12/508,534 Abandoned US20100020221A1 (en) | 2008-07-24 | 2009-07-23 | Camera Interface in a Portable Handheld Electronic Device |
US13/551,360 Active US8670060B2 (en) | 2008-07-24 | 2012-07-17 | Image capturing device with touch screen for adjusting camera settings |
US14/177,200 Active US9544495B2 (en) | 2008-07-24 | 2014-02-10 | Image capturing device with touch screen for adjusting camera settings |
US15/363,381 Active US10057481B2 (en) | 2008-07-24 | 2016-11-29 | Image capturing device with touch screen for adjusting camera settings |
US16/030,609 Active US10341553B2 (en) | 2008-07-24 | 2018-07-09 | Image capturing device with touch screen for adjusting camera settings |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/479,705 Active 2030-03-21 US8237807B2 (en) | 2008-07-24 | 2009-06-05 | Image capturing device with touch screen for adjusting camera settings |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/551,360 Active US8670060B2 (en) | 2008-07-24 | 2012-07-17 | Image capturing device with touch screen for adjusting camera settings |
US14/177,200 Active US9544495B2 (en) | 2008-07-24 | 2014-02-10 | Image capturing device with touch screen for adjusting camera settings |
US15/363,381 Active US10057481B2 (en) | 2008-07-24 | 2016-11-29 | Image capturing device with touch screen for adjusting camera settings |
US16/030,609 Active US10341553B2 (en) | 2008-07-24 | 2018-07-09 | Image capturing device with touch screen for adjusting camera settings |
Country Status (1)
Country | Link |
---|---|
US (6) | US8237807B2 (en) |
US20130088629A1 (en) * | 2011-10-06 | 2013-04-11 | Samsung Electronics Co., Ltd. | Mobile device and method of remotely controlling a controlled device |
US20130128091A1 (en) * | 2011-11-17 | 2013-05-23 | Samsung Electronics Co., Ltd | Method and apparatus for self camera shooting |
US20130155276A1 (en) * | 2011-10-12 | 2013-06-20 | Canon Kabushiki Kaisha | Image capturing apparatus, and control method and program therefor |
US20130250156A1 (en) * | 2012-03-22 | 2013-09-26 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US20130258160A1 (en) * | 2012-03-29 | 2013-10-03 | Sony Mobile Communications Inc. | Portable device, photographing method, and program |
US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
US20130271637A1 (en) * | 2012-04-17 | 2013-10-17 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling focus |
US20130293735A1 (en) * | 2011-11-04 | 2013-11-07 | Sony Corporation | Imaging control device, imaging apparatus, and control method for imaging control device |
WO2013169259A1 (en) | 2012-05-10 | 2013-11-14 | Intel Corporation | Gesture responsive image capture control and/or operation on image |
US8593555B1 (en) | 2013-02-28 | 2013-11-26 | Lg Electronics Inc. | Digital device and method for controlling the same |
US20130329114A1 (en) * | 2012-06-08 | 2013-12-12 | Samsung Electronics Co., Ltd. | Image magnifier for pin-point control |
US20140028893A1 (en) * | 2012-07-25 | 2014-01-30 | Samsung Electronics Co., Ltd. | Image capturing apparatus and method of controlling the same |
CN103562791A (en) * | 2011-04-18 | 2014-02-05 | 眼见360股份有限公司 | Apparatus and method for panoramic video imaging with mobile computing devices |
US20140036131A1 (en) * | 2012-08-06 | 2014-02-06 | Beijing Xiaomi Technology Co.,Ltd. | Method of capturing an image in a device and the device thereof |
US20140063313A1 (en) * | 2012-09-03 | 2014-03-06 | Lg Electronics Inc. | Mobile device and control method for the same |
WO2014033347A1 (en) * | 2012-08-27 | 2014-03-06 | Nokia Corporation | Method and apparatus for recording video sequences |
CN103795917A (en) * | 2012-10-30 | 2014-05-14 | 三星电子株式会社 | Imaging apparatus and control method |
US20140146076A1 (en) * | 2012-11-27 | 2014-05-29 | Samsung Electronics Co., Ltd. | Contour segmentation apparatus and method based on user interaction |
CN103841319A (en) * | 2012-11-23 | 2014-06-04 | 佳能株式会社 | Image pickup apparatus |
WO2014095782A1 (en) * | 2012-12-17 | 2014-06-26 | Connaught Electronics Ltd. | Method for white balance of an image presentation considering color values exclusively of a subset of pixels, camera system and motor vehicle with a camera system |
US20140181737A1 (en) * | 2012-12-20 | 2014-06-26 | Samsung Electronics Co., Ltd. | Method for processing contents and electronic device thereof |
US8773470B2 (en) | 2010-05-07 | 2014-07-08 | Apple Inc. | Systems and methods for displaying visual information on a device |
WO2014131168A1 (en) * | 2013-02-27 | 2014-09-04 | Motorola Mobility Llc | A viewfinder utility |
US20140300722A1 (en) * | 2011-10-19 | 2014-10-09 | The Regents Of The University Of California | Image-based measurement tools |
US20140362254A1 (en) * | 2008-10-01 | 2014-12-11 | Nintendo Co., Ltd. | Information processing device, information processing system, and launch program and storage medium storing the same |
CN104219518A (en) * | 2014-07-31 | 2014-12-17 | 小米科技有限责任公司 | Photometry method and device |
US8971623B2 (en) | 2012-03-06 | 2015-03-03 | Apple Inc. | Overlaid user interface tools for applying effects to image |
US20150062373A1 (en) * | 2013-08-28 | 2015-03-05 | Sony Corporation | Information processing device, imaging device, information processing method, and program |
US20150082236A1 (en) * | 2013-09-13 | 2015-03-19 | Sharp Kabushiki Kaisha | Information processing apparatus |
US20150103184A1 (en) * | 2013-10-15 | 2015-04-16 | Nvidia Corporation | Method and system for visual tracking of a subject for automatic metering using a mobile device |
US20150146925A1 (en) * | 2013-11-22 | 2015-05-28 | Samsung Electronics Co., Ltd. | Method for recognizing a specific object inside an image and electronic device thereof |
US20150163399A1 (en) * | 2013-12-05 | 2015-06-11 | Here Global B.V. | Method and apparatus for a shutter animation for image capture |
US9076224B1 (en) | 2012-08-08 | 2015-07-07 | Dolby Laboratories Licensing Corporation | Image processing for HDR images |
US20150205457A1 (en) * | 2014-01-22 | 2015-07-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9105121B2 (en) | 2012-03-06 | 2015-08-11 | Apple Inc. | Image editing with user interface controls overlaid on image |
US20150229849A1 (en) * | 2014-02-11 | 2015-08-13 | Samsung Electronics Co., Ltd. | Photographing method of an electronic device and the electronic device thereof |
US20150350587A1 (en) * | 2014-05-29 | 2015-12-03 | Samsung Electronics Co., Ltd. | Method of controlling display device and remote controller thereof |
US20150370443A1 (en) * | 2013-02-12 | 2015-12-24 | Inuitive Ltd. | System and method for combining touch and gesture in a three dimensional user interface |
US9250498B2 (en) | 2012-03-23 | 2016-02-02 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US20160057230A1 (en) * | 2014-08-19 | 2016-02-25 | Hand Held Products, Inc. | Mobile computing device with data cognition software |
CN105407306A (en) * | 2014-09-05 | 2016-03-16 | 三星电子株式会社 | Digital Image Processing Method And Apparatus |
US20160080639A1 (en) * | 2014-09-15 | 2016-03-17 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US9298363B2 (en) | 2011-04-11 | 2016-03-29 | Apple Inc. | Region activation for touch sensitive surface |
US9330322B2 (en) * | 2009-06-16 | 2016-05-03 | Intel Corporation | Controlled access to functionality of a wireless device |
CN105607860A (en) * | 2016-03-07 | 2016-05-25 | 上海斐讯数据通信技术有限公司 | Drag-photographing method and system |
US20160191809A1 (en) * | 2014-12-24 | 2016-06-30 | Canon Kabushiki Kaisha | Zoom control device, imaging apparatus, control method of zoom control device, and recording medium |
US20160188200A1 (en) * | 2010-02-18 | 2016-06-30 | Rohm Co., Ltd. | Touch-panel input device |
US9405429B1 (en) * | 2012-12-10 | 2016-08-02 | Amazon Technologies, Inc. | Collecting items with multi-touch gestures |
US20160255268A1 (en) * | 2014-09-05 | 2016-09-01 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
EP3065393A1 (en) * | 2015-03-06 | 2016-09-07 | Sony Corporation | System, method, and apparatus for controlling camera operations |
WO2016142757A1 (en) * | 2015-03-06 | 2016-09-15 | Sony Corporation | Method for independently determining exposure and focus settings of a digital camera |
US9454301B2 (en) * | 2014-08-01 | 2016-09-27 | Lg Electronics Inc. | Mobile terminal controlled by at least one touch and method of controlling therefor |
WO2017018612A1 (en) * | 2015-07-27 | 2017-02-02 | Samsung Electronics Co., Ltd. | Method and electronic device for stabilizing video |
WO2017023620A1 (en) | 2015-07-31 | 2017-02-09 | Sony Corporation | Method and system to assist a user to capture an image or video |
CN106462772A (en) * | 2014-02-19 | 2017-02-22 | 河谷控股Ip有限责任公司 | Invariant-based dimensional reduction of object recognition features, systems and methods |
US20170064192A1 (en) * | 2015-09-02 | 2017-03-02 | Canon Kabushiki Kaisha | Video Processing Apparatus, Control Method, and Recording Medium |
US20170058619A1 (en) * | 2015-08-24 | 2017-03-02 | Texas International Oilfield Tools, LLC | Actuator, Elevator with Actuator, and Methods of Use |
US20170085784A1 (en) * | 2015-09-17 | 2017-03-23 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Method for image capturing and an electronic device using the method |
US20170094162A1 (en) * | 2015-02-27 | 2017-03-30 | Google Inc. | Systems and methods for capturing images from a lock screen |
US20170123550A1 (en) * | 2015-10-29 | 2017-05-04 | Samsung Electronics Co., Ltd | Electronic device and method for providing user interaction based on force touch |
US20170142300A1 (en) * | 2010-08-03 | 2017-05-18 | Drake Rice | Camera for handheld device |
US9674454B2 (en) | 2014-07-31 | 2017-06-06 | Xiaomi Inc. | Light metering methods and devices |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US20170195555A1 (en) * | 2014-05-13 | 2017-07-06 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
WO2017151222A1 (en) * | 2016-03-02 | 2017-09-08 | Qualcomm Incorporated | Irregular-region based automatic image correction |
CN107526460A (en) * | 2016-06-15 | 2017-12-29 | 卡西欧计算机株式会社 | Output-controlling device, output control method and storage medium |
WO2018004967A1 (en) * | 2016-06-12 | 2018-01-04 | Apple Inc. | Digital touch on live video |
US9871962B2 (en) | 2016-03-04 | 2018-01-16 | RollCall, LLC | Movable user interface shutter button for camera |
EP3270582A1 (en) * | 2016-07-11 | 2018-01-17 | Samsung Electronics Co., Ltd | Object or area based focus control in video |
US9886931B2 (en) | 2012-03-06 | 2018-02-06 | Apple Inc. | Multi operation slider |
CN107924113A (en) * | 2016-06-12 | 2018-04-17 | 苹果公司 | User interface for camera effect |
US20180181275A1 (en) * | 2015-09-22 | 2018-06-28 | Samsung Electronics Co., Ltd. | Electronic device and photographing method |
US20180288310A1 (en) * | 2015-10-19 | 2018-10-04 | Corephotonics Ltd. | Dual-aperture zoom digital camera user interface |
US10110983B2 (en) | 2009-08-17 | 2018-10-23 | Harman International Industries, Incorporated | Ear sizing system and method |
US10122931B2 (en) | 2015-04-23 | 2018-11-06 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10200587B2 (en) | 2014-09-02 | 2019-02-05 | Apple Inc. | Remote camera user interface |
GB2565634A (en) * | 2017-06-29 | 2019-02-20 | Canon Kk | Imaging control apparatus and control method therefor |
US10225479B2 (en) | 2013-06-13 | 2019-03-05 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10250797B2 (en) | 2013-08-01 | 2019-04-02 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
EP3427472A4 (en) * | 2016-04-14 | 2019-04-03 | Samsung Electronics Co., Ltd. | Image capturing method and electronic device supporting the same |
US20190113997A1 (en) * | 2008-10-26 | 2019-04-18 | Microsoft Technology Licensing, Llc | Multi-touch manipulation of application objects |
US20190114022A1 (en) * | 2016-03-24 | 2019-04-18 | Hideep Inc. | Mobile terminal capable of easily capturing image and image capturing method |
US20190124257A1 (en) * | 2010-12-27 | 2019-04-25 | Samsung Electronics Co., Ltd. | Digital image photographing apparatus and method of controlling the same |
US10284780B2 (en) | 2015-09-06 | 2019-05-07 | Corephotonics Ltd. | Auto focus and optical image stabilization with roll compensation in a compact folded camera |
US10282055B2 (en) | 2012-03-06 | 2019-05-07 | Apple Inc. | Ordered processing of edits for a media editing application |
US10288896B2 (en) | 2013-07-04 | 2019-05-14 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera |
US10288897B2 (en) | 2015-04-02 | 2019-05-14 | Corephotonics Ltd. | Dual voice coil motor structure in a dual-optical module camera |
US10288840B2 (en) | 2015-01-03 | 2019-05-14 | Corephotonics Ltd | Miniature telephoto lens module and a camera utilizing such a lens module |
US10325394B2 (en) | 2008-06-11 | 2019-06-18 | Apple Inc. | Mobile communication terminal and data input method |
CN109960406A (en) * | 2019-03-01 | 2019-07-02 | 清华大学 | Based on the intelligent electronic device gesture capture acted between both hands finger and identification technology |
US10356332B2 (en) | 2015-08-13 | 2019-07-16 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US10372022B2 (en) | 2015-06-24 | 2019-08-06 | Corephotonics Ltd | Low profile tri-axis actuator for folded lens camera |
US10371928B2 (en) | 2015-04-16 | 2019-08-06 | Corephotonics Ltd | Auto focus and optical image stabilization in a compact folded camera |
US10379371B2 (en) | 2015-05-28 | 2019-08-13 | Corephotonics Ltd | Bi-directional stiffness for optical image stabilization in a dual-aperture digital camera |
US20190278996A1 (en) * | 2016-12-26 | 2019-09-12 | Ns Solutions Corporation | Information processing device, system, information processing method, and storage medium |
WO2019183900A1 (en) * | 2018-03-29 | 2019-10-03 | 深圳市大疆创新科技有限公司 | Photographing control method, photographing control apparatus, photographing system, and storage medium |
US10488631B2 (en) | 2016-05-30 | 2019-11-26 | Corephotonics Ltd. | Rotational ball-guided voice coil motor |
US10509209B2 (en) | 2014-08-10 | 2019-12-17 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US10523879B2 (en) | 2018-05-07 | 2019-12-31 | Apple Inc. | Creative camera |
US10534153B2 (en) | 2017-02-23 | 2020-01-14 | Corephotonics Ltd. | Folded camera lens designs |
US10578948B2 (en) | 2015-12-29 | 2020-03-03 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US20200081614A1 (en) * | 2012-05-09 | 2020-03-12 | Apple Inc. | Device and Method for Facilitating Setting Autofocus Reference Point in Camera Application User Interface |
EP3611914A4 (en) * | 2017-06-09 | 2020-03-25 | Huawei Technologies Co., Ltd. | Method and apparatus for photographing image |
US10616484B2 (en) | 2016-06-19 | 2020-04-07 | Corephotonics Ltd. | Frame syncrhonization in a dual-aperture camera system |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US10645286B2 (en) | 2017-03-15 | 2020-05-05 | Corephotonics Ltd. | Camera with panoramic scanning range |
US20200175298A1 (en) * | 2015-03-27 | 2020-06-04 | Nec Corporation | Mobile surveillance apparatus, program, and control method |
US10694168B2 (en) | 2018-04-22 | 2020-06-23 | Corephotonics Ltd. | System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems |
EP3672227A1 (en) * | 2012-07-20 | 2020-06-24 | BlackBerry Limited | Dynamic region of interest adaptation and image capture device providing same |
US10706518B2 (en) | 2016-07-07 | 2020-07-07 | Corephotonics Ltd. | Dual camera system with improved video smooth transition by image blending |
US10788927B2 (en) | 2014-09-02 | 2020-09-29 | Apple Inc. | Electronic communication based on user input and determination of active execution of application for playback |
US10845565B2 (en) | 2016-07-07 | 2020-11-24 | Corephotonics Ltd. | Linear ball guided voice coil motor for folded optic |
US10884321B2 (en) | 2017-01-12 | 2021-01-05 | Corephotonics Ltd. | Compact folded camera |
US10904512B2 (en) | 2017-09-06 | 2021-01-26 | Corephotonics Ltd. | Combined stereoscopic and phase detection depth mapping in a dual aperture camera |
JP2021009223A (en) * | 2019-07-01 | 2021-01-28 | キヤノン株式会社 | Control unit, optical device, control method, and program |
USRE48444E1 (en) | 2012-11-28 | 2021-02-16 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems |
US10936173B2 (en) | 2012-03-06 | 2021-03-02 | Apple Inc. | Unified slider control for modifying multiple image properties |
US10951834B2 (en) | 2017-10-03 | 2021-03-16 | Corephotonics Ltd. | Synthetically enlarged camera aperture |
EP3796142A1 (en) * | 2019-05-06 | 2021-03-24 | Apple Inc. | User interfaces for capturing and managing visual media |
US10976567B2 (en) | 2018-02-05 | 2021-04-13 | Corephotonics Ltd. | Reduced height penalty for folded camera |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11157725B2 (en) | 2018-06-27 | 2021-10-26 | Facebook Technologies, Llc | Gesture-based casting and manipulation of virtual content in artificial-reality environments |
US11204692B2 (en) | 2017-06-04 | 2021-12-21 | Apple Inc. | User interface camera effects |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11268830B2 (en) | 2018-04-23 | 2022-03-08 | Corephotonics Ltd | Optical-path folding-element with an extended two degree of freedom rotation range |
US11290636B2 (en) * | 2008-09-05 | 2022-03-29 | Telefonaktiebolaget Lm Ericsson (Publ) | Mobile terminal and method of performing multi-focusing and photographing image including plurality of objects using the same |
US11287081B2 (en) | 2019-01-07 | 2022-03-29 | Corephotonics Ltd. | Rotation mechanism with sliding joint |
EP3979060A1 (en) * | 2016-06-12 | 2022-04-06 | Apple Inc. | Digital touch on live video |
US11315276B2 (en) | 2019-03-09 | 2022-04-26 | Corephotonics Ltd. | System and method for dynamic stereoscopic calibration |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11333955B2 (en) | 2017-11-23 | 2022-05-17 | Corephotonics Ltd. | Compact folded camera structure |
US11350026B1 (en) | 2021-04-30 | 2022-05-31 | Apple Inc. | User interfaces for altering visual media |
US11363180B2 (en) | 2018-08-04 | 2022-06-14 | Corephotonics Ltd. | Switchable continuous display information system above camera |
US11368631B1 (en) | 2019-07-31 | 2022-06-21 | Corephotonics Ltd. | System and method for creating background blur in camera panning or motion |
WO2022126738A1 (en) * | 2020-12-14 | 2022-06-23 | 安徽鸿程光电有限公司 | Image acquisition method and apparatus, and device and medium |
US11468625B2 (en) | 2018-09-11 | 2022-10-11 | Apple Inc. | User interfaces for simulated depth effects |
US11531209B2 (en) | 2016-12-28 | 2022-12-20 | Corephotonics Ltd. | Folded camera structure with an extended light-folding-element scanning range |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US11635596B2 (en) | 2018-08-22 | 2023-04-25 | Corephotonics Ltd. | Two-state zoom folded camera |
US11637977B2 (en) | 2020-07-15 | 2023-04-25 | Corephotonics Ltd. | Image sensors and sensing methods to obtain time-of-flight and phase detection information |
US11640047B2 (en) | 2018-02-12 | 2023-05-02 | Corephotonics Ltd. | Folded camera with optical image stabilization |
US11659135B2 (en) | 2019-10-30 | 2023-05-23 | Corephotonics Ltd. | Slow or fast motion video using depth information |
WO2023101881A1 (en) * | 2021-12-03 | 2023-06-08 | Apple Inc. | Devices, methods, and graphical user interfaces for capturing and displaying media |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11693064B2 (en) | 2020-04-26 | 2023-07-04 | Corephotonics Ltd. | Temperature control for Hall bar sensor correction |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11740727B1 (en) | 2011-08-05 | 2023-08-29 | P4Tents1 Llc | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11770609B2 (en) | 2020-05-30 | 2023-09-26 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US11770618B2 (en) | 2019-12-09 | 2023-09-26 | Corephotonics Ltd. | Systems and methods for obtaining a smart panoramic image |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11832018B2 (en) | 2020-05-17 | 2023-11-28 | Corephotonics Ltd. | Image stitching in the presence of a full field of view reference image |
US11910089B2 (en) | 2020-07-15 | 2024-02-20 | Corephotonics Ltd. | Point of view aberrations correction in a scanning folded camera |
US11921975B2 (en) | 2015-03-08 | 2024-03-05 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11946775B2 (en) | 2020-07-31 | 2024-04-02 | Corephotonics Ltd. | Hall sensor—magnet geometry for large stroke linear position sensing |
US11949976B2 (en) | 2019-12-09 | 2024-04-02 | Corephotonics Ltd. | Systems and methods for obtaining a smart panoramic image |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US11968453B2 (en) | 2020-08-12 | 2024-04-23 | Corephotonics Ltd. | Optical image stabilization in a scanning folded camera |
US11977726B2 (en) | 2015-03-08 | 2024-05-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US12007671B2 (en) | 2021-06-08 | 2024-06-11 | Corephotonics Ltd. | Systems and cameras for tilting a focal plane of a super-macro image |
US12007668B2 (en) | 2020-02-22 | 2024-06-11 | Corephotonics Ltd. | Split screen feature for macro photography |
US12045451B2 (en) | 2012-05-09 | 2024-07-23 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US12050761B2 (en) | 2012-12-29 | 2024-07-30 | Apple Inc. | Device, method, and graphical user interface for transitioning from low power mode |
US12067229B2 (en) | 2012-05-09 | 2024-08-20 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
US12081856B2 (en) | 2021-03-11 | 2024-09-03 | Corephotonics Ltd. | Systems for pop-out camera |
US12101575B2 (en) | 2021-12-24 | 2024-09-24 | Corephotonics Ltd. | Video support in a multi-aperture mobile camera with a scanning zoom camera |
Families Citing this family (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2925705A1 (en) * | 2007-12-20 | 2009-06-26 | Thomson Licensing Sas | IMAGE CAPTURE ASSISTING DEVICE |
KR101544475B1 (en) * | 2008-11-28 | 2015-08-13 | 엘지전자 주식회사 | Controlling of Input/Output through touch |
US9148618B2 (en) * | 2009-05-29 | 2015-09-29 | Apple Inc. | Systems and methods for previewing newly captured image content and reviewing previously stored image content |
JP2011050038A (en) * | 2009-07-27 | 2011-03-10 | Sanyo Electric Co Ltd | Image reproducing apparatus and image sensing apparatus |
KR101589501B1 (en) * | 2009-08-24 | 2016-01-28 | 삼성전자주식회사 | Method and apparatus for controlling zoom using touch screen |
KR101605771B1 (en) * | 2009-11-19 | 2016-03-23 | 삼성전자주식회사 | Digital photographing apparatus, method for controlling the same, and recording medium storing program to execute the method |
JP2011114662A (en) * | 2009-11-27 | 2011-06-09 | Sony Corp | Image processing apparatus and method, program, and recording medium |
US8274592B2 (en) * | 2009-12-22 | 2012-09-25 | Eastman Kodak Company | Variable rate browsing of an image collection |
EP2355526A3 (en) | 2010-01-14 | 2012-10-31 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
JP5898842B2 (en) | 2010-01-14 | 2016-04-06 | 任天堂株式会社 | Portable information processing device, portable game device |
US20110196888A1 (en) * | 2010-02-10 | 2011-08-11 | Apple Inc. | Correlating Digital Media with Complementary Content |
JP5528856B2 (en) * | 2010-03-10 | 2014-06-25 | オリンパスイメージング株式会社 | Photography equipment |
JP5800501B2 (en) | 2010-03-12 | 2015-10-28 | 任天堂株式会社 | Display control program, display control apparatus, display control system, and display control method |
US8379098B2 (en) * | 2010-04-21 | 2013-02-19 | Apple Inc. | Real time video process control using gestures |
CN102238331B (en) * | 2010-04-21 | 2015-05-13 | 奥林巴斯映像株式会社 | Photographic device |
US8384770B2 (en) | 2010-06-02 | 2013-02-26 | Nintendo Co., Ltd. | Image display system, image display apparatus, and image display method |
EP2395768B1 (en) | 2010-06-11 | 2015-02-25 | Nintendo Co., Ltd. | Image display program, image display system, and image display method |
JP5647819B2 (en) | 2010-06-11 | 2015-01-07 | 任天堂株式会社 | Portable electronic devices |
JP5739674B2 (en) | 2010-09-27 | 2015-06-24 | 任天堂株式会社 | Information processing program, information processing apparatus, information processing system, and information processing method |
US8854356B2 (en) * | 2010-09-28 | 2014-10-07 | Nintendo Co., Ltd. | Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method |
US8760563B2 (en) | 2010-10-19 | 2014-06-24 | Hand Held Products, Inc. | Autofocusing optical imaging device |
WO2012078161A1 (en) * | 2010-12-09 | 2012-06-14 | Intel Corporation | Light metering in cameras for backlit scenes |
US8773568B2 (en) * | 2010-12-20 | 2014-07-08 | Samsung Electronics Co., Ltd | Imaging apparatus and method for improving manipulation of view finders |
US8692927B2 (en) * | 2011-01-19 | 2014-04-08 | Hand Held Products, Inc. | Imaging terminal having focus control |
JP5743679B2 (en) * | 2011-04-26 | 2015-07-01 | 京セラ株式会社 | Portable terminal, invalid area setting program, and invalid area setting method |
CN103563352A (en) | 2011-06-10 | 2014-02-05 | 国际商业机器公司 | Adapted digital device and adapter for digital device |
US9336240B2 (en) * | 2011-07-15 | 2016-05-10 | Apple Inc. | Geo-tagging digital images |
KR101784523B1 (en) * | 2011-07-28 | 2017-10-11 | 엘지이노텍 주식회사 | Touch-type portable terminal |
KR101832959B1 (en) * | 2011-08-10 | 2018-02-28 | 엘지전자 주식회사 | Mobile device and control method for the same |
US9026951B2 (en) | 2011-12-21 | 2015-05-05 | Apple Inc. | Device, method, and graphical user interface for selection of views in a three-dimensional map based on gesture inputs |
JP2013130761A (en) * | 2011-12-22 | 2013-07-04 | Sony Corp | Imaging device, method for controlling the same, and program |
JP2013130762A (en) * | 2011-12-22 | 2013-07-04 | Sony Corp | Imaging device, method for controlling the same, and program |
US9208698B2 (en) | 2011-12-27 | 2015-12-08 | Apple Inc. | Device, method, and graphical user interface for manipulating a three-dimensional map view based on a device orientation |
JP5613187B2 (en) * | 2012-01-27 | 2014-10-22 | オリンパスイメージング株式会社 | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND SUSTAINABLE COMPUTER-READABLE MEDIUM CONTAINING CODE FOR CAUSING COMPUTER TO CONTROL IMAGING DEVICE |
JP5816571B2 (en) * | 2012-02-21 | 2015-11-18 | 京セラ株式会社 | Mobile terminal, shooting key control program, and shooting key control method |
US9385324B2 (en) * | 2012-05-07 | 2016-07-05 | Samsung Electronics Co., Ltd. | Electronic system with augmented reality mechanism and method of operation thereof |
KR101868352B1 (en) * | 2012-05-14 | 2018-06-19 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US10171727B1 (en) | 2012-05-29 | 2019-01-01 | Promanthan Brains Llc, Series Click Only | Resetting single-control apparatus |
US9313304B1 (en) * | 2012-05-29 | 2016-04-12 | Oliver Markus Haynold | Single-control image-taking apparatus |
US8875060B2 (en) * | 2012-06-04 | 2014-10-28 | Sap Ag | Contextual gestures manager |
US9131143B2 (en) | 2012-07-20 | 2015-09-08 | Blackberry Limited | Dynamic region of interest adaptation and image capture device providing same |
US9286509B1 (en) * | 2012-10-19 | 2016-03-15 | Google Inc. | Image optimization during facial recognition |
EP2933998A4 (en) * | 2012-12-28 | 2016-08-24 | Nubia Technology Co Ltd | Pick-up device and pick-up method |
CN103092514A (en) * | 2013-01-09 | 2013-05-08 | 中兴通讯股份有限公司 | Method for operating large-screen hand-held device by single hand and hand-held device |
JP6248412B2 (en) * | 2013-05-13 | 2017-12-20 | ソニー株式会社 | Imaging apparatus, imaging method, and program |
US9635246B2 (en) * | 2013-06-21 | 2017-04-25 | Qualcomm Incorporated | Systems and methods to super resolve a user-selected region of interest |
GB201312893D0 (en) * | 2013-07-18 | 2013-09-04 | Omg Plc | Outdoor exposure control of still image capture |
US20150312472A1 (en) * | 2013-10-03 | 2015-10-29 | Bae Systems Information And Electronic Systems Integration Inc. | User friendly interfaces and controls for targeting systems |
KR20150068740A (en) * | 2013-12-12 | 2015-06-22 | 삼성전자주식회사 | Image sensor for adjusting number of oversampling and image data processing system |
JP6146293B2 (en) * | 2013-12-25 | 2017-06-14 | ソニー株式会社 | Control device, control method, and control system |
US9313397B2 (en) | 2014-05-30 | 2016-04-12 | Apple Inc. | Realtime capture exposure adjust gestures |
KR102206386B1 (en) * | 2014-09-15 | 2021-01-22 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US20160087806A1 (en) * | 2014-09-19 | 2016-03-24 | STEVEN Thomas GENTER | Mobile device audio/video legal application software |
CN104243827A (en) * | 2014-09-23 | 2014-12-24 | 深圳市中兴移动通信有限公司 | Shooting method and device |
CN104469167B (en) * | 2014-12-26 | 2017-10-13 | 小米科技有限责任公司 | Atomatic focusing method and device |
KR102328098B1 (en) * | 2015-02-13 | 2021-11-17 | 삼성전자주식회사 | Apparatus and method for focusing of carmea device or an electronic device having a camera module |
US10108269B2 (en) * | 2015-03-06 | 2018-10-23 | Align Technology, Inc. | Intraoral scanner with touch sensitive input |
US11240421B2 (en) | 2015-04-10 | 2022-02-01 | Qualcomm Incorporated | Methods and apparatus for defocus reduction using laser autofocus |
US10002449B2 (en) * | 2015-04-16 | 2018-06-19 | Sap Se | Responsive and adaptive chart controls |
CN105635575B (en) * | 2015-12-29 | 2019-04-12 | 宇龙计算机通信科技(深圳)有限公司 | Imaging method, imaging device and terminal |
JP6711396B2 (en) * | 2016-03-29 | 2020-06-17 | ソニー株式会社 | Image processing device, imaging device, image processing method, and program |
CN107613208B (en) * | 2017-09-29 | 2020-11-06 | 努比亚技术有限公司 | Focusing area adjusting method, terminal and computer storage medium |
US10594925B2 (en) | 2017-12-04 | 2020-03-17 | Qualcomm Incorporated | Camera zoom level and image frame capture control |
US10645272B2 (en) | 2017-12-04 | 2020-05-05 | Qualcomm Incorporated | Camera zoom level and image frame capture control |
KR102645340B1 (en) * | 2018-02-23 | 2024-03-08 | 삼성전자주식회사 | Electronic device and method for recording thereof |
JP7154064B2 (en) * | 2018-08-13 | 2022-10-17 | キヤノン株式会社 | IMAGING CONTROL DEVICE, CONTROL METHOD THEREOF, PROGRAM AND RECORDING MEDIUM |
CA3114040A1 (en) * | 2018-09-26 | 2020-04-02 | Guardian Glass, LLC | Augmented reality system and method for substrates, coated articles, insulating glass units, and/or the like |
JP7210229B2 (en) * | 2018-11-07 | 2023-01-23 | キヤノン株式会社 | DISPLAY CONTROL DEVICE, CONTROL METHOD AND PROGRAM FOR DISPLAY CONTROL DEVICE |
CN109542230B (en) * | 2018-11-28 | 2022-09-27 | 北京旷视科技有限公司 | Image processing method, image processing device, electronic equipment and storage medium |
CN109831622B (en) | 2019-01-03 | 2021-06-22 | 华为技术有限公司 | Shooting method and electronic equipment |
US11178085B2 (en) * | 2019-02-27 | 2021-11-16 | A Social Company | Social media platform for sharing reactions to videos |
CN110740265B (en) * | 2019-10-31 | 2021-03-12 | 维沃移动通信有限公司 | Image processing method and terminal equipment |
EP3833000A1 (en) * | 2019-12-06 | 2021-06-09 | Microsoft Technology Licensing, LLC | Computing device, method and computer program for capture of an image |
JP7431609B2 (en) * | 2020-02-18 | 2024-02-15 | キヤノン株式会社 | Control device and its control method and program |
US12020435B1 (en) | 2020-08-12 | 2024-06-25 | Crop Search Llc | Digital image-based measurement and analysis tools and methods for agriculture |
CN112162684A (en) * | 2020-09-25 | 2021-01-01 | 维沃移动通信有限公司 | Parameter adjusting method and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060026521A1 (en) * | 2004-07-30 | 2006-02-02 | Apple Computer, Inc. | Gestures for touch sensitive input devices |
US20080146275A1 (en) * | 2005-02-23 | 2008-06-19 | Frank Tofflinger | Combination Device |
US7551899B1 (en) * | 2000-12-04 | 2009-06-23 | Palmsource, Inc. | Intelligent dialing scheme for telephony application |
US20110019655A1 (en) * | 2007-10-25 | 2011-01-27 | Nokia Corporation | Method for fast transmission type selection in wcdma umts |
Family Cites Families (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0795136B2 (en) | 1988-01-28 | 1995-10-11 | オリンパス光学工業株式会社 | Auto focus device |
JP3535693B2 (en) * | 1997-04-30 | 2004-06-07 | キヤノン株式会社 | Portable electronic device, image processing method, imaging device, and computer-readable recording medium |
JP4178484B2 (en) * | 1998-04-06 | 2008-11-12 | 富士フイルム株式会社 | Camera with monitor |
JPH11355617A (en) * | 1998-06-05 | 1999-12-24 | Fuji Photo Film Co Ltd | Camera with image display device |
JP3820497B2 (en) | 1999-01-25 | 2006-09-13 | 富士写真フイルム株式会社 | Imaging apparatus and correction processing method for automatic exposure control |
US6952229B1 (en) * | 1999-04-13 | 2005-10-04 | Seiko Epson Corporation | Digital camera having input devices and a display capable of displaying a plurality of set information items |
US7271838B2 (en) * | 2002-05-08 | 2007-09-18 | Olympus Corporation | Image pickup apparatus with brightness distribution chart display capability |
US7551223B2 (en) * | 2002-12-26 | 2009-06-23 | Sony Corporation | Apparatus, method, and computer program for imaging and automatic focusing |
JP2004282229A (en) * | 2003-03-13 | 2004-10-07 | Minolta Co Ltd | Imaging apparatus |
JP4374574B2 (en) * | 2004-03-30 | 2009-12-02 | 富士フイルム株式会社 | Manual focus adjustment device and focus assist program |
AU2005273948B2 (en) * | 2004-08-09 | 2010-02-04 | The Nielsen Company (Us), Llc | Methods and apparatus to monitor audio/visual content from various sources |
KR101058011B1 (en) * | 2004-10-01 | 2011-08-19 | 삼성전자주식회사 | How to Operate Digital Camera Using Touch Screen |
JP2006119471A (en) * | 2004-10-22 | 2006-05-11 | Fujinon Corp | Autofocus system |
JP3829144B2 (en) | 2004-11-25 | 2006-10-04 | シャープ株式会社 | Mobile device with focusing area adjustment camera |
JP4245185B2 (en) * | 2005-02-07 | 2009-03-25 | パナソニック株式会社 | Imaging device |
JP4441879B2 (en) * | 2005-06-28 | 2010-03-31 | ソニー株式会社 | Signal processing apparatus and method, program, and recording medium |
JP4929630B2 (en) * | 2005-07-06 | 2012-05-09 | ソニー株式会社 | Imaging apparatus, control method, and program |
JP4241709B2 (en) * | 2005-10-11 | 2009-03-18 | ソニー株式会社 | Image processing device |
US20070097088A1 (en) * | 2005-10-31 | 2007-05-03 | Battles Amy E | Imaging device scrolling touch pad with tap points |
US7728903B2 (en) * | 2005-11-30 | 2010-06-01 | Nikon Corporation | Focus adjustment device, focus adjustment method and camera |
JP2007178576A (en) * | 2005-12-27 | 2007-07-12 | Casio Comput Co Ltd | Imaging apparatus and program therefor |
JP2007256907A (en) * | 2006-02-10 | 2007-10-04 | Fujifilm Corp | Digital camera |
KR100846498B1 (en) * | 2006-10-18 | 2008-07-17 | 삼성전자주식회사 | Image analysis method and apparatus, motion segmentation system |
KR100805293B1 (en) * | 2006-11-16 | 2008-02-20 | 삼성전자주식회사 | Mobile device and photographing method thereof |
JP2008167363A (en) * | 2007-01-05 | 2008-07-17 | Sony Corp | Information processor and information processing method, and program |
JP5188071B2 (en) * | 2007-02-08 | 2013-04-24 | キヤノン株式会社 | Focus adjustment device, imaging device, and focus adjustment method |
JP5004816B2 (en) * | 2007-02-08 | 2012-08-22 | キヤノン株式会社 | Imaging apparatus and control method |
KR101354669B1 (en) * | 2007-03-27 | 2014-01-27 | 삼성전자주식회사 | Method and apparatus for detecting dead pixel in image sensor, method and apparatus for capturing image from image sensor |
US20080284900A1 (en) * | 2007-04-04 | 2008-11-20 | Nikon Corporation | Digital camera |
JP4883413B2 (en) * | 2007-06-28 | 2012-02-22 | ソニー株式会社 | Imaging apparatus, image display control method, and program |
US8497928B2 (en) * | 2007-07-31 | 2013-07-30 | Palm, Inc. | Techniques to automatically focus a digital camera |
US20090174674A1 (en) * | 2008-01-09 | 2009-07-09 | Qualcomm Incorporated | Apparatus and methods for a touch user interface using an image sensor |
US8577216B2 (en) * | 2008-02-13 | 2013-11-05 | Qualcomm Incorporated | Auto-focus calibration for image capture device |
JP4530067B2 (en) * | 2008-03-27 | 2010-08-25 | ソニー株式会社 | Imaging apparatus, imaging method, and program |
JP5268433B2 (en) * | 2008-06-02 | 2013-08-21 | キヤノン株式会社 | IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD |
US8237807B2 (en) | 2008-07-24 | 2012-08-07 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US20100166404A1 (en) * | 2008-12-31 | 2010-07-01 | Lombardi Michael J | Device and Method Using a Touch-Detecting Surface |
US8830339B2 (en) * | 2009-04-15 | 2014-09-09 | Qualcomm Incorporated | Auto-triggered fast frame rate digital video recording |
- 2009
  - 2009-06-05 US US12/479,705 patent/US8237807B2/en active Active
  - 2009-07-23 US US12/508,534 patent/US20100020221A1/en not_active Abandoned
- 2012
  - 2012-07-17 US US13/551,360 patent/US8670060B2/en active Active
- 2014
  - 2014-02-10 US US14/177,200 patent/US9544495B2/en active Active
- 2016
  - 2016-11-29 US US15/363,381 patent/US10057481B2/en active Active
- 2018
  - 2018-07-09 US US16/030,609 patent/US10341553B2/en active Active
Non-Patent Citations (3)
Title |
---|
Apple, "iPhone User's Guide", July 2, 2007, pp. 10, 19, 24-41, 73-79 * |
AT&T, "Pantech C3b User Guide", February 10, 2007, pp. 3, 5, 24-31, 40-43 *
Jeremy Peters, "Long-Awaited iPhone Goes on Sale", nytimes.com, June 29, 2007 * |
Cited By (504)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9037995B2 (en) | 2007-01-07 | 2015-05-19 | Apple Inc. | Application programming interfaces for scrolling operations |
US20080168478A1 (en) * | 2007-01-07 | 2008-07-10 | Andrew Platzer | Application Programming Interfaces for Scrolling |
US8661363B2 (en) | 2007-01-07 | 2014-02-25 | Apple Inc. | Application programming interfaces for scrolling operations |
US11449217B2 (en) | 2007-01-07 | 2022-09-20 | Apple Inc. | Application programming interfaces for gesture operations |
US10175876B2 (en) | 2007-01-07 | 2019-01-08 | Apple Inc. | Application programming interfaces for gesture operations |
US9760272B2 (en) | 2007-01-07 | 2017-09-12 | Apple Inc. | Application programming interfaces for scrolling operations |
US20080168402A1 (en) * | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
US8429557B2 (en) | 2007-01-07 | 2013-04-23 | Apple Inc. | Application programming interfaces for scrolling operations |
US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
US10481785B2 (en) | 2007-01-07 | 2019-11-19 | Apple Inc. | Application programming interfaces for scrolling operations |
US9665265B2 (en) | 2007-01-07 | 2017-05-30 | Apple Inc. | Application programming interfaces for gesture operations |
US20100325575A1 (en) * | 2007-01-07 | 2010-12-23 | Andrew Platzer | Application programming interfaces for scrolling operations |
US11954322B2 (en) | 2007-01-07 | 2024-04-09 | Apple Inc. | Application programming interface for gesture operations |
US10817162B2 (en) | 2007-01-07 | 2020-10-27 | Apple Inc. | Application programming interfaces for scrolling operations |
US10613741B2 (en) | 2007-01-07 | 2020-04-07 | Apple Inc. | Application programming interface for gesture operations |
US9639260B2 (en) | 2007-01-07 | 2017-05-02 | Apple Inc. | Application programming interfaces for gesture operations |
US9575648B2 (en) | 2007-01-07 | 2017-02-21 | Apple Inc. | Application programming interfaces for gesture operations |
US9529519B2 (en) | 2007-01-07 | 2016-12-27 | Apple Inc. | Application programming interfaces for gesture operations |
US9448712B2 (en) | 2007-01-07 | 2016-09-20 | Apple Inc. | Application programming interfaces for scrolling operations |
US8836652B2 (en) | 2008-03-04 | 2014-09-16 | Apple Inc. | Touch event model programming interface |
US20090228901A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Touch event model |
US9798459B2 (en) | 2008-03-04 | 2017-10-24 | Apple Inc. | Touch event model for web pages |
US9323335B2 (en) | 2008-03-04 | 2016-04-26 | Apple Inc. | Touch event model programming interface |
US11740725B2 (en) | 2008-03-04 | 2023-08-29 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US8723822B2 (en) | 2008-03-04 | 2014-05-13 | Apple Inc. | Touch event model programming interface |
US9389712B2 (en) | 2008-03-04 | 2016-07-12 | Apple Inc. | Touch event model |
US8717305B2 (en) | 2008-03-04 | 2014-05-06 | Apple Inc. | Touch event model for web pages |
US8174502B2 (en) | 2008-03-04 | 2012-05-08 | Apple Inc. | Touch event processing for web pages |
US8416196B2 (en) | 2008-03-04 | 2013-04-09 | Apple Inc. | Touch event model programming interface |
US20090225039A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Touch event model programming interface |
US20090225037A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Touch event model for web pages |
US9720594B2 (en) | 2008-03-04 | 2017-08-01 | Apple Inc. | Touch event model |
US8411061B2 (en) | 2008-03-04 | 2013-04-02 | Apple Inc. | Touch event processing for documents |
US10936190B2 (en) | 2008-03-04 | 2021-03-02 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
US9971502B2 (en) | 2008-03-04 | 2018-05-15 | Apple Inc. | Touch event model |
US10521109B2 (en) | 2008-03-04 | 2019-12-31 | Apple Inc. | Touch event model |
US8560975B2 (en) | 2008-03-04 | 2013-10-15 | Apple Inc. | Touch event model |
US8645827B2 (en) * | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US9690481B2 (en) | 2008-03-04 | 2017-06-27 | Apple Inc. | Touch event model |
US20090225038A1 (en) * | 2008-03-04 | 2009-09-10 | Apple Inc. | Touch event processing for web pages |
US10325394B2 (en) | 2008-06-11 | 2019-06-18 | Apple Inc. | Mobile communication terminal and data input method |
US8670060B2 (en) | 2008-07-24 | 2014-03-11 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US10057481B2 (en) | 2008-07-24 | 2018-08-21 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US20100020222A1 (en) * | 2008-07-24 | 2010-01-28 | Jeremy Jones | Image Capturing Device with Touch Screen for Adjusting Camera Settings |
US9544495B2 (en) | 2008-07-24 | 2017-01-10 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US8237807B2 (en) | 2008-07-24 | 2012-08-07 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US10341553B2 (en) | 2008-07-24 | 2019-07-02 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
US11290636B2 (en) * | 2008-09-05 | 2022-03-29 | Telefonaktiebolaget Lm Ericsson (Publ) | Mobile terminal and method of performing multi-focusing and photographing image including plurality of objects using the same |
US11601585B2 (en) | 2008-09-05 | 2023-03-07 | Telefonaktiebolaget Lm Ericsson (Publ) | Mobile terminal and method of performing multi-focusing and photographing image including plurality of objects using the same |
US20140362254A1 (en) * | 2008-10-01 | 2014-12-11 | Nintendo Co., Ltd. | Information processing device, information processing system, and launch program and storage medium storing the same |
US9630099B2 (en) * | 2008-10-01 | 2017-04-25 | Nintendo Co., Ltd. | Information processing device, information processing system, and launch program and storage medium storing the same providing photographing functionality |
US20190113997A1 (en) * | 2008-10-26 | 2019-04-18 | Microsoft Technology Licensing, Llc | Multi-touch manipulation of application objects |
US20100156941A1 (en) * | 2008-12-19 | 2010-06-24 | Samsung Electronics Co., Ltd | Photographing method using multi-input scheme through touch and key manipulation and photographing apparatus using the same |
US20100171863A1 (en) * | 2009-01-08 | 2010-07-08 | Samsung Electronics Co., Ltd | Method to enlarge and change displayed image and photographing apparatus using the same |
US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US20100235118A1 (en) * | 2009-03-16 | 2010-09-16 | Bradford Allen Moore | Event Recognition |
US10719225B2 (en) | 2009-03-16 | 2020-07-21 | Apple Inc. | Event recognition |
US8566044B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US11163440B2 (en) | 2009-03-16 | 2021-11-02 | Apple Inc. | Event recognition |
US20110179387A1 (en) * | 2009-03-16 | 2011-07-21 | Shaffer Joshua L | Event Recognition |
US9311112B2 (en) | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
US8285499B2 (en) | 2009-03-16 | 2012-10-09 | Apple Inc. | Event recognition |
US9965177B2 (en) | 2009-03-16 | 2018-05-08 | Apple Inc. | Event recognition |
US9483121B2 (en) | 2009-03-16 | 2016-11-01 | Apple Inc. | Event recognition |
US20110179380A1 (en) * | 2009-03-16 | 2011-07-21 | Shaffer Joshua L | Event Recognition |
US8428893B2 (en) | 2009-03-16 | 2013-04-23 | Apple Inc. | Event recognition |
US9285908B2 (en) | 2009-03-16 | 2016-03-15 | Apple Inc. | Event recognition |
US8682602B2 (en) | 2009-03-16 | 2014-03-25 | Apple Inc. | Event recognition |
US20110179386A1 (en) * | 2009-03-16 | 2011-07-21 | Shaffer Joshua L | Event Recognition |
US11755196B2 (en) | 2009-03-16 | 2023-09-12 | Apple Inc. | Event recognition |
US8294682B2 (en) * | 2009-04-28 | 2012-10-23 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. | Displaying system and method thereof |
US20100271318A1 (en) * | 2009-04-28 | 2010-10-28 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. | Displaying system and method thereof |
US9330322B2 (en) * | 2009-06-16 | 2016-05-03 | Intel Corporation | Controlled access to functionality of a wireless device |
US8723988B2 (en) * | 2009-07-17 | 2014-05-13 | Sony Corporation | Using a touch sensitive display to control magnification and capture of digital images by an electronic device |
US20110013049A1 (en) * | 2009-07-17 | 2011-01-20 | Sony Ericsson Mobile Communications Ab | Using a touch sensitive display to control magnification and capture of digital images by an electronic device |
US10110983B2 (en) | 2009-08-17 | 2018-10-23 | Harman International Industries, Incorporated | Ear sizing system and method |
US20110069180A1 (en) * | 2009-09-23 | 2011-03-24 | Microsoft Corporation | Camera-based scanning |
US8704896B2 (en) | 2009-09-23 | 2014-04-22 | Microsoft Corporation | Camera-based scanning |
US8345106B2 (en) * | 2009-09-23 | 2013-01-01 | Microsoft Corporation | Camera-based scanning |
US20110141145A1 (en) * | 2009-12-15 | 2011-06-16 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd | Electronic device and method capable of zooming images |
US20110164128A1 (en) * | 2010-01-06 | 2011-07-07 | Verto Medical Solutions, LLC | Image capture and earpiece sizing system and method |
US9843855B2 (en) * | 2010-01-06 | 2017-12-12 | Harman International Industries, Incorporated | Image capture and earpiece sizing system and method |
US10123109B2 (en) | 2010-01-06 | 2018-11-06 | Harman International Industries, Incorporated | Image capture and earpiece sizing system and method |
US20150271588A1 (en) * | 2010-01-06 | 2015-09-24 | Harman International Industries, Inc. | Image capture and earpiece sizing system and method |
US9050029B2 (en) * | 2010-01-06 | 2015-06-09 | Harman International Industries, Inc. | Image capture and earpiece sizing system and method |
US10732997B2 (en) | 2010-01-26 | 2020-08-04 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
US12061915B2 (en) | 2010-01-26 | 2024-08-13 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US20110193778A1 (en) * | 2010-02-05 | 2011-08-11 | Samsung Electronics Co., Ltd. | Device and method for controlling mouse pointer |
US8957857B2 (en) * | 2010-02-05 | 2015-02-17 | Samsung Electronics Co., Ltd | Device and method for controlling mouse pointer |
US9760280B2 (en) * | 2010-02-18 | 2017-09-12 | Rohm Co., Ltd. | Touch-panel input device |
US20160188200A1 (en) * | 2010-02-18 | 2016-06-30 | Rohm Co., Ltd. | Touch-panel input device |
US20110211073A1 (en) * | 2010-02-26 | 2011-09-01 | Research In Motion Limited | Object detection and selection using gesture recognition |
US8379134B2 (en) * | 2010-02-26 | 2013-02-19 | Research In Motion Limited | Object detection and selection using gesture recognition |
US8817158B2 (en) | 2010-03-15 | 2014-08-26 | Canon Kabushiki Kaisha | Image pickup apparatus and control method for image pickup apparatus with touch operation member control |
CN102196178A (en) * | 2010-03-15 | 2011-09-21 | 佳能株式会社 | Image pickup apparatus and its control method |
CN104539849A (en) * | 2010-03-15 | 2015-04-22 | 佳能株式会社 | Image pickup apparatus and its control method |
EP2367345A1 (en) * | 2010-03-15 | 2011-09-21 | Canon Kabushiki Kaisha | Image pickup apparatus and its control method |
US20110221948A1 (en) * | 2010-03-15 | 2011-09-15 | Canon Kabushiki Kaisha | Image pickup apparatus and its control method |
US8643749B2 (en) * | 2010-04-01 | 2014-02-04 | Olympus Imaging Corp. | Imaging device, display device, control method, and method for controlling area change |
US20110242396A1 (en) * | 2010-04-01 | 2011-10-06 | Yoshinori Matsuzawa | Imaging device, display device, control method, and method for controlling area change |
US9277116B2 (en) * | 2010-04-01 | 2016-03-01 | Olympus Corporation | Imaging device, display device, control method, and method for controlling area change |
US20140111679A1 (en) * | 2010-04-01 | 2014-04-24 | Olympus Imaging Corp. | Imaging device, display device, control method, and method for controlling area change |
US20110273473A1 (en) * | 2010-05-06 | 2011-11-10 | Bumbae Kim | Mobile terminal capable of providing multiplayer game and operating method thereof |
US8648877B2 (en) * | 2010-05-06 | 2014-02-11 | Lg Electronics Inc. | Mobile terminal and operation method thereof |
US8773470B2 (en) | 2010-05-07 | 2014-07-08 | Apple Inc. | Systems and methods for displaying visual information on a device |
US8849355B2 (en) * | 2010-06-04 | 2014-09-30 | Lg Electronics Inc. | Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal |
US20110300910A1 (en) * | 2010-06-04 | 2011-12-08 | Kyungdong Choi | Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal |
US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
US10216408B2 (en) | 2010-06-14 | 2019-02-26 | Apple Inc. | Devices and methods for identifying user interface objects based on view hierarchy |
US20170142300A1 (en) * | 2010-08-03 | 2017-05-18 | Drake Rice | Camera for handheld device |
US20200045209A1 (en) * | 2010-08-03 | 2020-02-06 | Drake Rice | Camera for handheld device |
JP2012108511A (en) * | 2010-11-15 | 2012-06-07 | Leica Microsystems (Schweiz) Ag | Microscope having touch screen |
US20120120223A1 (en) * | 2010-11-15 | 2012-05-17 | Leica Microsystems (Schweiz) Ag | Portable microscope |
US20120120222A1 (en) * | 2010-11-15 | 2012-05-17 | Leica Microsystems (Schweiz) Ag | Operator control unit for a microscope |
US20190124257A1 (en) * | 2010-12-27 | 2019-04-25 | Samsung Electronics Co., Ltd. | Digital image photographing apparatus and method of controlling the same |
US10681262B2 (en) * | 2010-12-27 | 2020-06-09 | Samsung Electronics Co., Ltd. | Digital image photographing apparatus and method of controlling the same |
US20120212661A1 (en) * | 2011-02-22 | 2012-08-23 | Sony Corporation | Imaging apparatus, focus control method, and program |
JP2012186698A (en) * | 2011-03-07 | 2012-09-27 | Ricoh Co Ltd | Image photographing apparatus |
US8213916B1 (en) * | 2011-03-17 | 2012-07-03 | Ebay Inc. | Video processing system for identifying items in video frames |
WO2012128835A1 (en) * | 2011-03-21 | 2012-09-27 | Apple Inc. | Gesture-based configuration of image processing techniques |
US20120257069A1 (en) * | 2011-04-08 | 2012-10-11 | Nokia Corporation | Imaging |
US9204047B2 (en) * | 2011-04-08 | 2015-12-01 | Nokia Technologies Oy | Imaging |
US9298363B2 (en) | 2011-04-11 | 2016-03-29 | Apple Inc. | Region activation for touch sensitive surface |
CN103562791A (en) * | 2011-04-18 | 2014-02-05 | 眼见360股份有限公司 | Apparatus and method for panoramic video imaging with mobile computing devices |
US20120262600A1 (en) * | 2011-04-18 | 2012-10-18 | Qualcomm Incorporated | White balance optimization with high dynamic range images |
US8947555B2 (en) * | 2011-04-18 | 2015-02-03 | Qualcomm Incorporated | White balance optimization with high dynamic range images |
WO2012146273A1 (en) * | 2011-04-26 | 2012-11-01 | Better4Drive Ug (Haftungsbeschränkt) | Method and system for video marker insertion |
US9678657B2 (en) * | 2011-07-07 | 2017-06-13 | Olympus Corporation | Imaging apparatus, imaging method, and computer-readable storage medium providing a touch panel display user interface |
US9075459B2 (en) * | 2011-07-07 | 2015-07-07 | Olympus Corporation | Imaging apparatus, imaging method, and computer-readable storage medium providing a touch panel display user interface |
US20150248208A1 (en) * | 2011-07-07 | 2015-09-03 | Olympus Corporation | Imaging apparatus, imaging method, and computer-readable storage medium providing a touch panel display user interface |
US20130010170A1 (en) * | 2011-07-07 | 2013-01-10 | Yoshinori Matsuzawa | Imaging apparatus, imaging method, and computer-readable storage medium |
US11740727B1 (en) | 2011-08-05 | 2023-08-29 | P4Tents1 Llc | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
EP2555097A1 (en) * | 2011-08-05 | 2013-02-06 | prisma - Gesellschaft für Projektmanagement & Informationssysteme mbH | Method and device for determining a section of an image and triggering imaging using a single touch-based gesture |
WO2013020943A1 (en) | 2011-08-05 | 2013-02-14 | Prisma - Gesellschaft Für Projektmanagement & Informationssysteme Mbh | Method and apparatus for determining an image detail and initiating image captures by means of a single touch-based gesture |
US20130050565A1 (en) * | 2011-08-24 | 2013-02-28 | Sony Mobile Communications Ab | Image focusing |
US9172860B2 (en) * | 2011-08-24 | 2015-10-27 | Sony Corporation | Computational camera and method for setting multiple focus planes in a captured image |
US9106836B2 (en) * | 2011-09-09 | 2015-08-11 | Canon Kabushiki Kaisha | Imaging apparatus, control method for the same, and recording medium, where continuous shooting or single shooting is performed based on touch |
US20130063645A1 (en) * | 2011-09-09 | 2013-03-14 | Canon Kabushiki Kaisha | Imaging apparatus, control method for the same, and recording medium |
EP2574041A3 (en) * | 2011-09-20 | 2013-10-23 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US8810709B2 (en) | 2011-09-20 | 2014-08-19 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
CN103024261A (en) * | 2011-09-20 | 2013-04-03 | 佳能株式会社 | Image capturing apparatus and control method thereof |
JP2013068671A (en) * | 2011-09-20 | 2013-04-18 | Canon Inc | Imaging apparatus, control method thereof, program, and storage medium |
US20130076959A1 (en) * | 2011-09-22 | 2013-03-28 | Panasonic Corporation | Imaging Device |
US9118907B2 (en) * | 2011-09-22 | 2015-08-25 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device enabling automatic taking of photo when pre-registered object moves into photographer's intended shooting distance |
US20130088629A1 (en) * | 2011-10-06 | 2013-04-11 | Samsung Electronics Co., Ltd. | Mobile device and method of remotely controlling a controlled device |
US20130155276A1 (en) * | 2011-10-12 | 2013-06-20 | Canon Kabushiki Kaisha | Image capturing apparatus, and control method and program therefor |
US9696897B2 (en) * | 2011-10-19 | 2017-07-04 | The Regents Of The University Of California | Image-based measurement tools |
US20140300722A1 (en) * | 2011-10-19 | 2014-10-09 | The Regents Of The University Of California | Image-based measurement tools |
US20130293735A1 (en) * | 2011-11-04 | 2013-11-07 | Sony Corporation | Imaging control device, imaging apparatus, and control method for imaging control device |
US10154199B2 (en) | 2011-11-17 | 2018-12-11 | Samsung Electronics Co., Ltd. | Method and apparatus for self camera shooting |
US11368625B2 (en) | 2011-11-17 | 2022-06-21 | Samsung Electronics Co., Ltd. | Method and apparatus for self camera shooting |
US10652469B2 (en) | 2011-11-17 | 2020-05-12 | Samsung Electronics Co., Ltd. | Method and apparatus for self camera shooting |
US20130128091A1 (en) * | 2011-11-17 | 2013-05-23 | Samsung Electronics Co., Ltd | Method and apparatus for self camera shooting |
US9041847B2 (en) * | 2011-11-17 | 2015-05-26 | Samsung Electronics Co., Ltd. | Method and apparatus for self camera shooting |
US10936173B2 (en) | 2012-03-06 | 2021-03-02 | Apple Inc. | Unified slider control for modifying multiple image properties |
US10282055B2 (en) | 2012-03-06 | 2019-05-07 | Apple Inc. | Ordered processing of edits for a media editing application |
US9105121B2 (en) | 2012-03-06 | 2015-08-11 | Apple Inc. | Image editing with user interface controls overlaid on image |
US8971623B2 (en) | 2012-03-06 | 2015-03-03 | Apple Inc. | Overlaid user interface tools for applying effects to image |
US10545631B2 (en) | 2012-03-06 | 2020-01-28 | Apple Inc. | Fanning user interface controls for a media editing application |
US10552016B2 (en) | 2012-03-06 | 2020-02-04 | Apple Inc. | User interface tools for cropping and straightening image |
US9299168B2 (en) | 2012-03-06 | 2016-03-29 | Apple Inc. | Context aware user interface for image editing |
US9886931B2 (en) | 2012-03-06 | 2018-02-06 | Apple Inc. | Multi operation slider |
US9569078B2 (en) | 2012-03-06 | 2017-02-14 | Apple Inc. | User interface tools for cropping and straightening image |
US10942634B2 (en) | 2012-03-06 | 2021-03-09 | Apple Inc. | User interface tools for cropping and straightening image |
US11119635B2 (en) | 2012-03-06 | 2021-09-14 | Apple Inc. | Fanning user interface controls for a media editing application |
US11481097B2 (en) | 2012-03-06 | 2022-10-25 | Apple Inc. | User interface tools for cropping and straightening image |
US9159144B2 (en) | 2012-03-06 | 2015-10-13 | Apple Inc. | Color adjustors for color segments |
US9041727B2 (en) | 2012-03-06 | 2015-05-26 | Apple Inc. | User interface tools for selectively applying effects to image |
US20130250156A1 (en) * | 2012-03-22 | 2013-09-26 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US10187562B2 (en) | 2012-03-23 | 2019-01-22 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US20170289442A1 (en) * | 2012-03-23 | 2017-10-05 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US11582377B2 (en) | 2012-03-23 | 2023-02-14 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US9250498B2 (en) | 2012-03-23 | 2016-02-02 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US10863076B2 (en) * | 2012-03-23 | 2020-12-08 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling auto focus function in electronic device |
US9007508B2 (en) * | 2012-03-29 | 2015-04-14 | Sony Corporation | Portable device, photographing method, and program for setting a target region and performing an image capturing operation when a target is detected in the target region |
US20130258160A1 (en) * | 2012-03-29 | 2013-10-03 | Sony Mobile Communications Inc. | Portable device, photographing method, and program |
US20130271637A1 (en) * | 2012-04-17 | 2013-10-17 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling focus |
US9131144B2 (en) * | 2012-04-17 | 2015-09-08 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling focus |
US12045451B2 (en) | 2012-05-09 | 2024-07-23 | Apple Inc. | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
US20200081614A1 (en) * | 2012-05-09 | 2020-03-12 | Apple Inc. | Device and Method for Facilitating Setting Autofocus Reference Point in Camera Application User Interface |
US11947724B2 (en) | 2012-05-09 | 2024-04-02 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
US12067229B2 (en) | 2012-05-09 | 2024-08-20 | Apple Inc. | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
WO2013169259A1 (en) | 2012-05-10 | 2013-11-14 | Intel Corporation | Gesture responsive image capture control and/or operation on image |
CN104220975A (en) * | 2012-05-10 | 2014-12-17 | 英特尔公司 | Gesture responsive image capture control and/or operation on image |
EP2847649A4 (en) * | 2012-05-10 | 2015-12-16 | Intel Corp | Gesture responsive image capture control and/or operation on image |
US20130329114A1 (en) * | 2012-06-08 | 2013-12-12 | Samsung Electronics Co., Ltd. | Image magnifier for pin-point control |
EP3672227A1 (en) * | 2012-07-20 | 2020-06-24 | BlackBerry Limited | Dynamic region of interest adaptation and image capture device providing same |
US20140028893A1 (en) * | 2012-07-25 | 2014-01-30 | Samsung Electronics Co., Ltd. | Image capturing apparatus and method of controlling the same |
US9338359B2 (en) * | 2012-08-06 | 2016-05-10 | Xiaomi Inc. | Method of capturing an image in a device and the device thereof |
US20140036131A1 (en) * | 2012-08-06 | 2014-02-06 | Beijing Xiaomi Technology Co.,Ltd. | Method of capturing an image in a device and the device thereof |
US9374589B2 (en) | 2012-08-08 | 2016-06-21 | Dolby Laboratories Licensing Corporation | HDR images with multiple color gamuts |
US9076224B1 (en) | 2012-08-08 | 2015-07-07 | Dolby Laboratories Licensing Corporation | Image processing for HDR images |
US9467704B2 (en) | 2012-08-08 | 2016-10-11 | Dolby Laboratories Licensing Corporation | Adaptive ratio images in HDR image representation |
US9426408B2 (en) | 2012-08-27 | 2016-08-23 | Nokia Technologies Oy | Method and apparatus for recording video sequences |
WO2014033347A1 (en) * | 2012-08-27 | 2014-03-06 | Nokia Corporation | Method and apparatus for recording video sequences |
US9137437B2 (en) * | 2012-09-03 | 2015-09-15 | Lg Electronics Inc. | Method for changing displayed characteristics of a preview image |
US20140063313A1 (en) * | 2012-09-03 | 2014-03-06 | Lg Electronics Inc. | Mobile device and control method for the same |
CN103795917A (en) * | 2012-10-30 | 2014-05-14 | 三星电子株式会社 | Imaging apparatus and control method |
CN103841319A (en) * | 2012-11-23 | 2014-06-04 | 佳能株式会社 | Image pickup apparatus |
US10186062B2 (en) * | 2012-11-27 | 2019-01-22 | Samsung Electronics Co., Ltd. | Contour segmentation apparatus and method based on user interaction |
US20140146076A1 (en) * | 2012-11-27 | 2014-05-29 | Samsung Electronics Co., Ltd. | Contour segmentation apparatus and method based on user interaction |
USRE48444E1 (en) | 2012-11-28 | 2021-02-16 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems |
USRE49256E1 (en) | 2012-11-28 | 2022-10-18 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems |
USRE48945E1 (en) | 2012-11-28 | 2022-02-22 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems |
USRE48477E1 (en) | 2012-11-28 | 2021-03-16 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems
USRE48697E1 (en) | 2012-11-28 | 2021-08-17 | Corephotonics Ltd. | High resolution thin multi-aperture imaging systems |
US9405429B1 (en) * | 2012-12-10 | 2016-08-02 | Amazon Technologies, Inc. | Collecting items with multi-touch gestures |
WO2014095782A1 (en) * | 2012-12-17 | 2014-06-26 | Connaught Electronics Ltd. | Method for white balance of an image presentation considering color values exclusively of a subset of pixels, camera system and motor vehicle with a camera system |
US20140181737A1 (en) * | 2012-12-20 | 2014-06-26 | Samsung Electronics Co., Ltd. | Method for processing contents and electronic device thereof |
US12050761B2 (en) | 2012-12-29 | 2024-07-30 | Apple Inc. | Device, method, and graphical user interface for transitioning from low power mode |
US20150370443A1 (en) * | 2013-02-12 | 2015-12-24 | Inuitive Ltd. | System and method for combining touch and gesture in a three dimensional user interface |
US9571722B2 (en) | 2013-02-27 | 2017-02-14 | Google Technology Holdings LLC | Viewfinder utility |
WO2014131168A1 (en) * | 2013-02-27 | 2014-09-04 | Motorola Mobility Llc | A viewfinder utility |
US8804022B1 (en) | 2013-02-28 | 2014-08-12 | Lg Electronics Inc. | Digital device and method for controlling the same |
US8964091B2 (en) | 2013-02-28 | 2015-02-24 | Lg Electronics Inc. | Digital device and method for controlling the same |
US8593555B1 (en) | 2013-02-28 | 2013-11-26 | Lg Electronics Inc. | Digital device and method for controlling the same |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
US11429190B2 (en) | 2013-06-09 | 2022-08-30 | Apple Inc. | Proxy gesture recognizer |
US12069371B2 (en) | 2013-06-13 | 2024-08-20 | Corephotonics Ltd. | Dual aperture zoom digital camera
US11470257B2 (en) | 2013-06-13 | 2022-10-11 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10904444B2 (en) | 2013-06-13 | 2021-01-26 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10326942B2 (en) | 2013-06-13 | 2019-06-18 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10841500B2 (en) | 2013-06-13 | 2020-11-17 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10225479B2 (en) | 2013-06-13 | 2019-03-05 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US11838635B2 (en) | 2013-06-13 | 2023-12-05 | Corephotonics Ltd. | Dual aperture zoom digital camera |
US10288896B2 (en) | 2013-07-04 | 2019-05-14 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera |
US11852845B2 (en) | 2013-07-04 | 2023-12-26 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera |
US10620450B2 (en) | 2013-07-04 | 2020-04-14 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera
US11614635B2 (en) | 2013-07-04 | 2023-03-28 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera |
US11287668B2 (en) | 2013-07-04 | 2022-03-29 | Corephotonics Ltd. | Thin dual-aperture zoom digital camera |
US10250797B2 (en) | 2013-08-01 | 2019-04-02 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US11991444B2 (en) | 2013-08-01 | 2024-05-21 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US10469735B2 (en) | 2013-08-01 | 2019-11-05 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US11470235B2 (en) | 2013-08-01 | 2022-10-11 | Corephotonics Ltd. | Thin multi-aperture imaging system with autofocus and methods for using same |
US11716535B2 (en) | 2013-08-01 | 2023-08-01 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US11856291B2 (en) | 2013-08-01 | 2023-12-26 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US10694094B2 (en) | 2013-08-01 | 2020-06-23 | Corephotonics Ltd. | Thin multi-aperture imaging system with auto-focus and methods for using same |
US9578247B2 (en) * | 2013-08-28 | 2017-02-21 | Sony Corporation | Information processing device, imaging device, information processing method, and program |
US20150062373A1 (en) * | 2013-08-28 | 2015-03-05 | Sony Corporation | Information processing device, imaging device, information processing method, and program |
CN104427322A (en) * | 2013-08-28 | 2015-03-18 | 索尼公司 | Information processing device, imaging device, information processing method, and program |
JP2015046742A (en) * | 2013-08-28 | 2015-03-12 | ソニー株式会社 | Information processing apparatus, imaging apparatus, information processing method, and program |
US20150082236A1 (en) * | 2013-09-13 | 2015-03-19 | Sharp Kabushiki Kaisha | Information processing apparatus |
US20150103184A1 (en) * | 2013-10-15 | 2015-04-16 | Nvidia Corporation | Method and system for visual tracking of a subject for automatic metering using a mobile device |
US9767359B2 (en) * | 2013-11-22 | 2017-09-19 | Samsung Electronics Co., Ltd | Method for recognizing a specific object inside an image and electronic device thereof |
US10115015B2 (en) * | 2013-11-22 | 2018-10-30 | Samsung Electronics Co., Ltd | Method for recognizing a specific object inside an image and electronic device thereof |
US20150146925A1 (en) * | 2013-11-22 | 2015-05-28 | Samsung Electronics Co., Ltd. | Method for recognizing a specific object inside an image and electronic device thereof |
US11113523B2 (en) | 2013-11-22 | 2021-09-07 | Samsung Electronics Co., Ltd | Method for recognizing a specific object inside an image and electronic device thereof |
US20170351917A1 (en) * | 2013-11-22 | 2017-12-07 | Samsung Electronics Co., Ltd | Method for recognizing a specific object inside an image and electronic device thereof |
US9210321B2 (en) * | 2013-12-05 | 2015-12-08 | Here Global B.V. | Method and apparatus for a shutter animation for image capture |
US20150163399A1 (en) * | 2013-12-05 | 2015-06-11 | Here Global B.V. | Method and apparatus for a shutter animation for image capture |
US20150205457A1 (en) * | 2014-01-22 | 2015-07-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US9874998B2 (en) * | 2014-01-22 | 2018-01-23 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
KR20150094289A (en) * | 2014-02-11 | 2015-08-19 | 삼성전자주식회사 | Photogrpaphing method of electronic apparatus and electronic apparatus thereof |
US9894275B2 (en) * | 2014-02-11 | 2018-02-13 | Samsung Electronics Co., Ltd. | Photographing method of an electronic device and the electronic device thereof |
KR102154802B1 (en) | 2014-02-11 | 2020-09-11 | 삼성전자주식회사 | Photographing method of electronic apparatus and electronic apparatus thereof
US20150229849A1 (en) * | 2014-02-11 | 2015-08-13 | Samsung Electronics Co., Ltd. | Photographing method of an electronic device and the electronic device thereof |
CN106462772A (en) * | 2014-02-19 | 2017-02-22 | 河谷控股Ip有限责任公司 | Invariant-based dimensional reduction of object recognition features, systems and methods |
US10419660B2 (en) | 2014-05-13 | 2019-09-17 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10659678B2 (en) | 2014-05-13 | 2020-05-19 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9942469B2 (en) * | 2014-05-13 | 2018-04-10 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US10863080B2 (en) | 2014-05-13 | 2020-12-08 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US20170195555A1 (en) * | 2014-05-13 | 2017-07-06 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US20150350587A1 (en) * | 2014-05-29 | 2015-12-03 | Samsung Electronics Co., Ltd. | Method of controlling display device and remote controller thereof |
CN104219518A (en) * | 2014-07-31 | 2014-12-17 | 小米科技有限责任公司 | Photometry method and device |
US9674454B2 (en) | 2014-07-31 | 2017-06-06 | Xiaomi Inc. | Light metering methods and devices |
US9454301B2 (en) * | 2014-08-01 | 2016-09-27 | Lg Electronics Inc. | Mobile terminal controlled by at least one touch and method of controlling therefor |
EP3175333A4 (en) * | 2014-08-01 | 2018-03-14 | LG Electronics Inc. | Mobile terminal controlled by at least one touch and method of controlling therefor |
US11042011B2 (en) | 2014-08-10 | 2021-06-22 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US10976527B2 (en) | 2014-08-10 | 2021-04-13 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US11002947B2 (en) | 2014-08-10 | 2021-05-11 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US10571665B2 (en) | 2014-08-10 | 2020-02-25 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US11543633B2 (en) | 2014-08-10 | 2023-01-03 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US10509209B2 (en) | 2014-08-10 | 2019-12-17 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US12007537B2 (en) | 2014-08-10 | 2024-06-11 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens
US11703668B2 (en) | 2014-08-10 | 2023-07-18 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US11982796B2 (en) | 2014-08-10 | 2024-05-14 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
US11262559B2 (en) | 2014-08-10 | 2022-03-01 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens
US20160057230A1 (en) * | 2014-08-19 | 2016-02-25 | Hand Held Products, Inc. | Mobile computing device with data cognition software |
US12003584B2 (en) | 2014-08-19 | 2024-06-04 | Hand Held Products, Inc. | Mobile computing device with data cognition software |
US11546428B2 (en) * | 2014-08-19 | 2023-01-03 | Hand Held Products, Inc. | Mobile computing device with data cognition software |
US11579721B2 (en) | 2014-09-02 | 2023-02-14 | Apple Inc. | Displaying a representation of a user touch input detected by an external device |
US10200587B2 (en) | 2014-09-02 | 2019-02-05 | Apple Inc. | Remote camera user interface |
US10788927B2 (en) | 2014-09-02 | 2020-09-29 | Apple Inc. | Electronic communication based on user input and determination of active execution of application for playback |
US9706106B2 (en) * | 2014-09-05 | 2017-07-11 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
CN105407306A (en) * | 2014-09-05 | 2016-03-16 | 三星电子株式会社 | Digital Image Processing Method And Apparatus |
CN112672041A (en) * | 2014-09-05 | 2021-04-16 | 三星电子株式会社 | Image processing method and image processing apparatus |
US9984435B2 (en) * | 2014-09-05 | 2018-05-29 | Samsung Electronics Co., Ltd. | Digital image processing method, non-transitory computer-readable recording medium having recorded thereon a program for executing the digital image processing method, and digital image processing apparatus |
US20160255268A1 (en) * | 2014-09-05 | 2016-09-01 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US20160080639A1 (en) * | 2014-09-15 | 2016-03-17 | Lg Electronics Inc. | Mobile terminal and control method thereof |
CN106210184A (en) * | 2014-09-15 | 2016-12-07 | Lg电子株式会社 | Mobile terminal and control method thereof |
US9955064B2 (en) * | 2014-09-15 | 2018-04-24 | Lg Electronics Inc. | Mobile terminal and control method for changing camera angles with respect to the fixed body |
US10419683B2 (en) | 2014-12-24 | 2019-09-17 | Canon Kabushiki Kaisha | Zoom control device, imaging apparatus, control method of zoom control device, and recording medium |
US20160191809A1 (en) * | 2014-12-24 | 2016-06-30 | Canon Kabushiki Kaisha | Zoom control device, imaging apparatus, control method of zoom control device, and recording medium |
US10015406B2 (en) * | 2014-12-24 | 2018-07-03 | Canon Kabushiki Kaisha | Zoom control device, imaging apparatus, control method of zoom control device, and recording medium |
US10288840B2 (en) | 2015-01-03 | 2019-05-14 | Corephotonics Ltd. | Miniature telephoto lens module and a camera utilizing such a lens module
US11994654B2 (en) | 2015-01-03 | 2024-05-28 | Corephotonics Ltd. | Miniature telephoto lens module and a camera utilizing such a lens module |
US11125975B2 (en) | 2015-01-03 | 2021-09-21 | Corephotonics Ltd. | Miniature telephoto lens module and a camera utilizing such a lens module |
US9888172B2 (en) * | 2015-02-27 | 2018-02-06 | Google Llc | Systems and methods for capturing images from a lock screen |
US20170094162A1 (en) * | 2015-02-27 | 2017-03-30 | Google Inc. | Systems and methods for capturing images from a lock screen |
US10122917B2 (en) | 2015-02-27 | 2018-11-06 | Google Llc | Systems and methods for capturing images from a lock screen |
US9641743B2 (en) | 2015-03-06 | 2017-05-02 | Sony Corporation | System, method, and apparatus for controlling timer operations of a camera |
WO2016142757A1 (en) * | 2015-03-06 | 2016-09-15 | Sony Corporation | Method for independently determining exposure and focus settings of a digital camera |
EP3065393A1 (en) * | 2015-03-06 | 2016-09-07 | Sony Corporation | System, method, and apparatus for controlling camera operations |
US11977726B2 (en) | 2015-03-08 | 2024-05-07 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US11921975B2 (en) | 2015-03-08 | 2024-03-05 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11550471B2 (en) | 2015-03-19 | 2023-01-10 | Apple Inc. | Touch input cursor manipulation |
US20200175298A1 (en) * | 2015-03-27 | 2020-06-04 | Nec Corporation | Mobile surveillance apparatus, program, and control method |
US11644968B2 (en) | 2015-03-27 | 2023-05-09 | Nec Corporation | Mobile surveillance apparatus, program, and control method |
US10288897B2 (en) | 2015-04-02 | 2019-05-14 | Corephotonics Ltd. | Dual voice coil motor structure in a dual-optical module camera |
US10558058B2 (en) | 2015-04-02 | 2020-02-11 | Corephotonics Ltd. | Dual voice coil motor structure in a dual-optical module camera
US10571666B2 (en) | 2015-04-16 | 2020-02-25 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US10371928B2 (en) | 2015-04-16 | 2019-08-06 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera
US10459205B2 (en) | 2015-04-16 | 2019-10-29 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera
US10613303B2 (en) | 2015-04-16 | 2020-04-07 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US10962746B2 (en) | 2015-04-16 | 2021-03-30 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US10656396B1 (en) | 2015-04-16 | 2020-05-19 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US11808925B2 (en) | 2015-04-16 | 2023-11-07 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US11711614B2 (en) | 2015-04-23 | 2023-07-25 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10122931B2 (en) | 2015-04-23 | 2018-11-06 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US11490017B2 (en) | 2015-04-23 | 2022-11-01 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US11102414B2 (en) | 2015-04-23 | 2021-08-24 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10616490B2 (en) | 2015-04-23 | 2020-04-07 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10379371B2 (en) | 2015-05-28 | 2019-08-13 | Corephotonics Ltd. | Bi-directional stiffness for optical image stabilization in a dual-aperture digital camera
US10670879B2 (en) | 2015-05-28 | 2020-06-02 | Corephotonics Ltd. | Bi-directional stiffness for optical image stabilization in a dual-aperture digital camera |
US11835985B2 (en) | 2015-06-07 | 2023-12-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US11681429B2 (en) | 2015-06-07 | 2023-06-20 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10372022B2 (en) | 2015-06-24 | 2019-08-06 | Corephotonics Ltd. | Low profile tri-axis actuator for folded lens camera
US9860448B2 (en) | 2015-07-27 | 2018-01-02 | Samsung Electronics Co., Ltd. | Method and electronic device for stabilizing video |
WO2017018612A1 (en) * | 2015-07-27 | 2017-02-02 | Samsung Electronics Co., Ltd. | Method and electronic device for stabilizing video |
JP2018530177A (en) * | 2015-07-31 | 2018-10-11 | ソニー株式会社 | Method and system for assisting a user in capturing an image or video |
WO2017023620A1 (en) | 2015-07-31 | 2017-02-09 | Sony Corporation | Method and system to assist a user to capture an image or video |
CN107710736A (en) * | 2015-07-31 | 2018-02-16 | 索尼公司 | Aid in the method and system of user's capture images or video |
EP3304884A4 (en) * | 2015-07-31 | 2018-12-26 | Sony Corporation | Method and system to assist a user to capture an image or video |
US11740785B2 (en) | 2015-08-10 | 2023-08-29 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US11770616B2 (en) | 2015-08-13 | 2023-09-26 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US11350038B2 (en) | 2015-08-13 | 2022-05-31 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US11546518B2 (en) | 2015-08-13 | 2023-01-03 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US10567666B2 (en) | 2015-08-13 | 2020-02-18 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US10356332B2 (en) | 2015-08-13 | 2019-07-16 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US12022196B2 (en) | 2015-08-13 | 2024-06-25 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US10917576B2 (en) | 2015-08-13 | 2021-02-09 | Corephotonics Ltd. | Dual aperture zoom camera with video support and switching / non-switching dynamic control |
US20170058619A1 (en) * | 2015-08-24 | 2017-03-02 | Texas International Oilfield Tools, LLC | Actuator, Elevator with Actuator, and Methods of Use |
US20170064192A1 (en) * | 2015-09-02 | 2017-03-02 | Canon Kabushiki Kaisha | Video Processing Apparatus, Control Method, and Recording Medium |
US10205869B2 (en) * | 2015-09-02 | 2019-02-12 | Canon Kabushiki Kaisha | Video processing apparatus, control method, and recording medium |
US10498961B2 (en) | 2015-09-06 | 2019-12-03 | Corephotonics Ltd. | Auto focus and optical image stabilization with roll compensation in a compact folded camera |
US10284780B2 (en) | 2015-09-06 | 2019-05-07 | Corephotonics Ltd. | Auto focus and optical image stabilization with roll compensation in a compact folded camera |
US20170085784A1 (en) * | 2015-09-17 | 2017-03-23 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Method for image capturing and an electronic device using the method |
US20180181275A1 (en) * | 2015-09-22 | 2018-06-28 | Samsung Electronics Co., Ltd. | Electronic device and photographing method |
EP3352449A4 (en) * | 2015-09-22 | 2018-12-05 | Samsung Electronics Co., Ltd. | Electronic device and photographing method |
US10503390B2 (en) * | 2015-09-22 | 2019-12-10 | Samsung Electronics Co., Ltd. | Electronic device and photographing method |
US20180288310A1 (en) * | 2015-10-19 | 2018-10-04 | Corephotonics Ltd. | Dual-aperture zoom digital camera user interface |
US20170123550A1 (en) * | 2015-10-29 | 2017-05-04 | Samsung Electronics Co., Ltd | Electronic device and method for providing user interaction based on force touch |
CN107066195A (en) * | 2015-10-29 | 2017-08-18 | 三星电子株式会社 | Electronic equipment and method for providing the user mutual based on pressure sensitivity touch-control |
US11599007B2 (en) | 2015-12-29 | 2023-03-07 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US10578948B2 (en) | 2015-12-29 | 2020-03-03 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US10935870B2 (en) | 2015-12-29 | 2021-03-02 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US11314146B2 (en) | 2015-12-29 | 2022-04-26 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US11726388B2 (en) | 2015-12-29 | 2023-08-15 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
US11392009B2 (en) | 2015-12-29 | 2022-07-19 | Corephotonics Ltd. | Dual-aperture zoom digital camera with automatic adjustable tele field of view |
WO2017151222A1 (en) * | 2016-03-02 | 2017-09-08 | Qualcomm Incorporated | Irregular-region based automatic image correction |
US9838594B2 (en) | 2016-03-02 | 2017-12-05 | Qualcomm Incorporated | Irregular-region based automatic image correction |
US9871962B2 (en) | 2016-03-04 | 2018-01-16 | RollCall, LLC | Movable user interface shutter button for camera |
CN109155821A (en) * | 2016-03-04 | 2019-01-04 | 罗尔卡尔有限责任公司 | The mobile user interface shutter release button of camera |
WO2017149517A3 (en) * | 2016-03-04 | 2018-08-23 | RollCall, LLC | Movable user interface shutter button for camera |
CN105607860A (en) * | 2016-03-07 | 2016-05-25 | 上海斐讯数据通信技术有限公司 | Drag-photographing method and system |
US20190114022A1 (en) * | 2016-03-24 | 2019-04-18 | Hideep Inc. | Mobile terminal capable of easily capturing image and image capturing method |
EP3427472A4 (en) * | 2016-04-14 | 2019-04-03 | Samsung Electronics Co., Ltd. | Image capturing method and electronic device supporting the same |
US11977210B2 (en) | 2016-05-30 | 2024-05-07 | Corephotonics Ltd. | Rotational ball-guided voice coil motor |
US11650400B2 (en) | 2016-05-30 | 2023-05-16 | Corephotonics Ltd. | Rotational ball-guided voice coil motor |
US10488631B2 (en) | 2016-05-30 | 2019-11-26 | Corephotonics Ltd. | Rotational ball-guided voice coil motor |
US11150447B2 (en) | 2016-05-30 | 2021-10-19 | Corephotonics Ltd. | Rotational ball-guided voice coil motor |
US11165949B2 (en) | 2016-06-12 | 2021-11-02 | Apple Inc. | User interface for capturing photos with different camera magnifications |
US11641517B2 (en) | 2016-06-12 | 2023-05-02 | Apple Inc. | User interface for camera effects |
EP3979060A1 (en) * | 2016-06-12 | 2022-04-06 | Apple Inc. | Digital touch on live video |
WO2018004967A1 (en) * | 2016-06-12 | 2018-01-04 | Apple Inc. | Digital touch on live video |
US11245837B2 (en) | 2016-06-12 | 2022-02-08 | Apple Inc. | User interface for camera effects |
US11962889B2 (en) | 2016-06-12 | 2024-04-16 | Apple Inc. | User interface for camera effects |
US10602053B2 (en) | 2016-06-12 | 2020-03-24 | Apple Inc. | User interface for camera effects |
US10136048B2 (en) | 2016-06-12 | 2018-11-20 | Apple Inc. | User interface for camera effects |
CN107924113A (en) * | 2016-06-12 | 2018-04-17 | 苹果公司 | User interface for camera effect |
US10656826B2 (en) | 2016-06-15 | 2020-05-19 | Casio Computer Co., Ltd. | Output control apparatus for controlling output of contents, output control method, and storage medium |
CN107526460A (en) * | 2016-06-15 | 2017-12-29 | 卡西欧计算机株式会社 | Output-controlling device, output control method and storage medium |
US10616484B2 (en) | 2016-06-19 | 2020-04-07 | Corephotonics Ltd. | Frame synchronization in a dual-aperture camera system
US11172127B2 (en) | 2016-06-19 | 2021-11-09 | Corephotonics Ltd. | Frame synchronization in a dual-aperture camera system |
US11689803B2 (en) | 2016-06-19 | 2023-06-27 | Corephotonics Ltd. | Frame synchronization in a dual-aperture camera system |
US10706518B2 (en) | 2016-07-07 | 2020-07-07 | Corephotonics Ltd. | Dual camera system with improved video smooth transition by image blending |
US11977270B2 (en) | 2016-07-07 | 2024-05-07 | Corephotonics Ltd. | Linear ball guided voice coil motor for folded optic
US11048060B2 (en) | 2016-07-07 | 2021-06-29 | Corephotonics Ltd. | Linear ball guided voice coil motor for folded optic |
US11550119B2 (en) | 2016-07-07 | 2023-01-10 | Corephotonics Ltd. | Linear ball guided voice coil motor for folded optic |
US10845565B2 (en) | 2016-07-07 | 2020-11-24 | Corephotonics Ltd. | Linear ball guided voice coil motor for folded optic |
EP3270582A1 (en) * | 2016-07-11 | 2018-01-17 | Samsung Electronics Co., Ltd | Object or area based focus control in video |
US10477096B2 (en) | 2016-07-11 | 2019-11-12 | Samsung Electronics Co., Ltd. | Object or area based focus control in video |
US20190278996A1 (en) * | 2016-12-26 | 2019-09-12 | Ns Solutions Corporation | Information processing device, system, information processing method, and storage medium |
US10755100B2 (en) * | 2016-12-26 | 2020-08-25 | Ns Solutions Corporation | Information processing device, system, information processing method, and storage medium |
US11531209B2 (en) | 2016-12-28 | 2022-12-20 | Corephotonics Ltd. | Folded camera structure with an extended light-folding-element scanning range |
US12092841B2 (en) | 2016-12-28 | 2024-09-17 | Corephotonics Ltd. | Folded camera structure with an extended light-folding-element scanning range |
US12038671B2 (en) | 2017-01-12 | 2024-07-16 | Corephotonics Ltd. | Compact folded camera |
US10884321B2 (en) | 2017-01-12 | 2021-01-05 | Corephotonics Ltd. | Compact folded camera |
US11693297B2 (en) | 2017-01-12 | 2023-07-04 | Corephotonics Ltd. | Compact folded camera |
US11809065B2 (en) | 2017-01-12 | 2023-11-07 | Corephotonics Ltd. | Compact folded camera |
US11815790B2 (en) | 2017-01-12 | 2023-11-14 | Corephotonics Ltd. | Compact folded camera |
US10670827B2 (en) | 2017-02-23 | 2020-06-02 | Corephotonics Ltd. | Folded camera lens designs |
US10534153B2 (en) | 2017-02-23 | 2020-01-14 | Corephotonics Ltd. | Folded camera lens designs |
US10571644B2 (en) | 2017-02-23 | 2020-02-25 | Corephotonics Ltd. | Folded camera lens designs |
US11671711B2 (en) | 2017-03-15 | 2023-06-06 | Corephotonics Ltd. | Imaging system with panoramic scanning range |
US10645286B2 (en) | 2017-03-15 | 2020-05-05 | Corephotonics Ltd. | Camera with panoramic scanning range |
US11687224B2 (en) | 2017-06-04 | 2023-06-27 | Apple Inc. | User interface camera effects |
US11204692B2 (en) | 2017-06-04 | 2021-12-21 | Apple Inc. | User interface camera effects |
EP3611914A4 (en) * | 2017-06-09 | 2020-03-25 | Huawei Technologies Co., Ltd. | Method and apparatus for photographing image |
US11425309B2 (en) | 2017-06-09 | 2022-08-23 | Huawei Technologies Co., Ltd. | Image capture method and apparatus |
US10645278B2 (en) | 2017-06-29 | 2020-05-05 | Canon Kabushiki Kaisha | Imaging control apparatus and control method therefor |
GB2565634A (en) * | 2017-06-29 | 2019-02-20 | Canon Kk | Imaging control apparatus and control method therefor |
GB2565634B (en) * | 2017-06-29 | 2020-12-02 | Canon Kk | Imaging control apparatus and control method therefor |
US10904512B2 (en) | 2017-09-06 | 2021-01-26 | Corephotonics Ltd. | Combined stereoscopic and phase detection depth mapping in a dual aperture camera |
US10951834B2 (en) | 2017-10-03 | 2021-03-16 | Corephotonics Ltd. | Synthetically enlarged camera aperture |
US11695896B2 (en) | 2017-10-03 | 2023-07-04 | Corephotonics Ltd. | Synthetically enlarged camera aperture |
US11619864B2 (en) | 2017-11-23 | 2023-04-04 | Corephotonics Ltd. | Compact folded camera structure |
US12007672B2 (en) | 2017-11-23 | 2024-06-11 | Corephotonics Ltd. | Compact folded camera structure |
US11333955B2 (en) | 2017-11-23 | 2022-05-17 | Corephotonics Ltd. | Compact folded camera structure |
US11809066B2 (en) | 2017-11-23 | 2023-11-07 | Corephotonics Ltd. | Compact folded camera structure |
US11686952B2 (en) | 2018-02-05 | 2023-06-27 | Corephotonics Ltd. | Reduced height penalty for folded camera |
US10976567B2 (en) | 2018-02-05 | 2021-04-13 | Corephotonics Ltd. | Reduced height penalty for folded camera |
US12007582B2 (en) | 2018-02-05 | 2024-06-11 | Corephotonics Ltd. | Reduced height penalty for folded camera |
US11977731B2 (en) | 2018-02-09 | 2024-05-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11640047B2 (en) | 2018-02-12 | 2023-05-02 | Corephotonics Ltd. | Folded camera with optical image stabilization |
WO2019183900A1 (en) * | 2018-03-29 | 2019-10-03 | SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司) | Photographing control method, photographing control apparatus, photographing system, and storage medium |
US10911740B2 (en) | 2018-04-22 | 2021-02-02 | Corephotonics Ltd. | System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems |
US10694168B2 (en) | 2018-04-22 | 2020-06-23 | Corephotonics Ltd. | System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems |
US11268829B2 (en) | 2018-04-23 | 2022-03-08 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US11359937B2 (en) | 2018-04-23 | 2022-06-14 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US11976949B2 (en) | 2018-04-23 | 2024-05-07 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US11867535B2 (en) | 2018-04-23 | 2024-01-09 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US12085421B2 (en) | 2018-04-23 | 2024-09-10 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US11733064B1 (en) | 2018-04-23 | 2023-08-22 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US11268830B2 (en) | 2018-04-23 | 2022-03-08 | Corephotonics Ltd. | Optical-path folding-element with an extended two degree of freedom rotation range |
US10523879B2 (en) | 2018-05-07 | 2019-12-31 | Apple Inc. | Creative camera |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US11178335B2 (en) | 2018-05-07 | 2021-11-16 | Apple Inc. | Creative camera |
US11157725B2 (en) | 2018-06-27 | 2021-10-26 | Facebook Technologies, Llc | Gesture-based casting and manipulation of virtual content in artificial-reality environments |
US11363180B2 (en) | 2018-08-04 | 2022-06-14 | Corephotonics Ltd. | Switchable continuous display information system above camera |
US11852790B2 (en) | 2018-08-22 | 2023-12-26 | Corephotonics Ltd. | Two-state zoom folded camera |
US11635596B2 (en) | 2018-08-22 | 2023-04-25 | Corephotonics Ltd. | Two-state zoom folded camera |
US11468625B2 (en) | 2018-09-11 | 2022-10-11 | Apple Inc. | User interfaces for simulated depth effects |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11895391B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11669985B2 (en) | 2018-09-28 | 2023-06-06 | Apple Inc. | Displaying and editing images with depth information |
US11287081B2 (en) | 2019-01-07 | 2022-03-29 | Corephotonics Ltd. | Rotation mechanism with sliding joint |
US12025260B2 (en) | 2019-01-07 | 2024-07-02 | Corephotonics Ltd. | Rotation mechanism with sliding joint |
CN109960406A (en) * | 2019-03-01 | 2019-07-02 | 清华大学 | Based on the intelligent electronic device gesture capture acted between both hands finger and identification technology |
US11315276B2 (en) | 2019-03-09 | 2022-04-26 | Corephotonics Ltd. | System and method for dynamic stereoscopic calibration |
US11527006B2 (en) | 2019-03-09 | 2022-12-13 | Corephotonics Ltd. | System and method for dynamic stereoscopic calibration |
EP3839715A1 (en) * | 2019-05-06 | 2021-06-23 | Apple Inc. | User interfaces for capturing and managing visual media |
US10791273B1 (en) | 2019-05-06 | 2020-09-29 | Apple Inc. | User interfaces for capturing and managing visual media |
US10652470B1 (en) | 2019-05-06 | 2020-05-12 | Apple Inc. | User interfaces for capturing and managing visual media |
US11223771B2 (en) | 2019-05-06 | 2022-01-11 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
EP3796142A1 (en) * | 2019-05-06 | 2021-03-24 | Apple Inc. | User interfaces for capturing and managing visual media |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US10681282B1 (en) | 2019-05-06 | 2020-06-09 | Apple Inc. | User interfaces for capturing and managing visual media |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US10674072B1 (en) | 2019-05-06 | 2020-06-02 | Apple Inc. | User interfaces for capturing and managing visual media |
US10735643B1 (en) | 2019-05-06 | 2020-08-04 | Apple Inc. | User interfaces for capturing and managing visual media |
US10735642B1 (en) | 2019-05-06 | 2020-08-04 | Apple Inc. | User interfaces for capturing and managing visual media |
JP7301633B2 (en) | 2019-07-01 | 2023-07-03 | キヤノン株式会社 | OPTICAL DEVICE, CONTROL METHOD OF OPTICAL DEVICE, AND PROGRAM |
JP2021009223A (en) * | 2019-07-01 | 2021-01-28 | キヤノン株式会社 | Control unit, optical device, control method, and program |
US11368631B1 (en) | 2019-07-31 | 2022-06-21 | Corephotonics Ltd. | System and method for creating background blur in camera panning or motion |
US11659135B2 (en) | 2019-10-30 | 2023-05-23 | Corephotonics Ltd. | Slow or fast motion video using depth information |
US11949976B2 (en) | 2019-12-09 | 2024-04-02 | Corephotonics Ltd. | Systems and methods for obtaining a smart panoramic image |
US12075151B2 (en) | 2019-12-09 | 2024-08-27 | Corephotonics Ltd. | Systems and methods for obtaining a smart panoramic image |
US11770618B2 (en) | 2019-12-09 | 2023-09-26 | Corephotonics Ltd. | Systems and methods for obtaining a smart panoramic image |
US12007668B2 (en) | 2020-02-22 | 2024-06-11 | Corephotonics Ltd. | Split screen feature for macro photography |
US11693064B2 (en) | 2020-04-26 | 2023-07-04 | Corephotonics Ltd. | Temperature control for Hall bar sensor correction |
US11832018B2 (en) | 2020-05-17 | 2023-11-28 | Corephotonics Ltd. | Image stitching in the presence of a full field of view reference image |
US12096150B2 (en) | 2020-05-17 | 2024-09-17 | Corephotonics Ltd. | Image stitching in the presence of a full field of view reference image |
US11962901B2 (en) | 2020-05-30 | 2024-04-16 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US11770609B2 (en) | 2020-05-30 | 2023-09-26 | Corephotonics Ltd. | Systems and methods for obtaining a super macro image |
US11617022B2 (en) | 2020-06-01 | 2023-03-28 | Apple Inc. | User interfaces for managing media |
US11330184B2 (en) | 2020-06-01 | 2022-05-10 | Apple Inc. | User interfaces for managing media |
US12081862B2 (en) | 2020-06-01 | 2024-09-03 | Apple Inc. | User interfaces for managing media |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11832008B2 (en) | 2020-07-15 | 2023-11-28 | Corephotonics Ltd. | Image sensors and sensing methods to obtain time-of-flight and phase detection information |
US11910089B2 (en) | 2020-07-15 | 2024-02-20 | Corephotonics Ltd. | Point of view aberrations correction in a scanning folded camera |
US12003874B2 (en) | 2020-07-15 | 2024-06-04 | Corephotonics Ltd. | Image sensors and sensing methods to obtain Time-of-Flight and phase detection information |
US11637977B2 (en) | 2020-07-15 | 2023-04-25 | Corephotonics Ltd. | Image sensors and sensing methods to obtain time-of-flight and phase detection information |
US11946775B2 (en) | 2020-07-31 | 2024-04-02 | Corephotonics Ltd. | Hall sensor—magnet geometry for large stroke linear position sensing |
US11968453B2 (en) | 2020-08-12 | 2024-04-23 | Corephotonics Ltd. | Optical image stabilization in a scanning folded camera |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
WO2022126738A1 (en) * | 2020-12-14 | 2022-06-23 | 安徽鸿程光电有限公司 | Image acquisition method and apparatus, and device and medium |
US12081856B2 (en) | 2021-03-11 | 2024-09-03 | Corephotonics Ltd. | Systems for pop-out camera |
US11350026B1 (en) | 2021-04-30 | 2022-05-31 | Apple Inc. | User interfaces for altering visual media |
US11416134B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
US11418699B1 (en) | 2021-04-30 | 2022-08-16 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US12007671B2 (en) | 2021-06-08 | 2024-06-11 | Corephotonics Ltd. | Systems and cameras for tilting a focal plane of a super-macro image |
WO2023101881A1 (en) * | 2021-12-03 | 2023-06-08 | Apple Inc. | Devices, methods, and graphical user interfaces for capturing and displaying media |
US12101575B2 (en) | 2021-12-24 | 2024-09-24 | Corephotonics Ltd. | Video support in a multi-aperture mobile camera with a scanning zoom camera |
US12108151B2 (en) | 2022-12-11 | 2024-10-01 | Corephotonics Ltd. | Point of view aberrations correction in a scanning folded camera |
US12101567B2 (en) | 2023-07-31 | 2024-09-24 | Apple Inc. | User interfaces for altering visual media |
US12105267B2 (en) | 2023-08-09 | 2024-10-01 | Corephotonics Ltd. | Auto focus and optical image stabilization in a compact folded camera |
US12105268B2 (en) | 2024-04-28 | 2024-10-01 | Corephotonics Ltd. | Zoom dual-aperture camera with folded lens |
Also Published As
Publication number | Publication date |
---|---|
US20130063644A1 (en) | 2013-03-14 |
US8670060B2 (en) | 2014-03-11 |
US8237807B2 (en) | 2012-08-07 |
US20180316849A1 (en) | 2018-11-01 |
US20140152883A1 (en) | 2014-06-05 |
US20170078564A1 (en) | 2017-03-16 |
US10341553B2 (en) | 2019-07-02 |
US10057481B2 (en) | 2018-08-21 |
US20100020222A1 (en) | 2010-01-28 |
US9544495B2 (en) | 2017-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100020221A1 (en) | Camera Interface in a Portable Handheld Electronic Device | |
JP7169383B2 (en) | Capture and user interface using night mode processing | |
KR101373333B1 (en) | Portable terminal having touch sensing based image photographing function and image photographing method therefor | |
US10148886B2 (en) | Method for photographing control and electronic device thereof | |
KR101799223B1 (en) | Realtime capture exposure adjust gestures | |
US20120120277A1 (en) | Multi-point Touch Focus | |
AU2014221568B2 (en) | Apparatus and method for positioning image area using image sensor location | |
CN103945113B (en) | The method and apparatus for shooting of portable terminal | |
KR101690786B1 (en) | Device and method for performing multi-tasking | |
JP5743847B2 (en) | Mobile terminal and low sensitivity area setting program | |
WO2012128835A1 (en) | Gesture-based configuration of image processing techniques | |
US20140022433A1 (en) | Dynamic region of interest adaptation and image capture device providing same | |
CA2820575A1 (en) | Dynamic region of interest adaptation and image capture device providing same | |
EP2887648B1 (en) | Method of performing previewing and electronic device for implementing the same | |
KR20110006243A (en) | Apparatus and method for manual focusing in portable terminal | |
US20120306786A1 (en) | Display apparatus and method | |
KR20130092196A (en) | Apparatus and method for dispalying shutter key of a camera | |
US9582179B2 (en) | Apparatus and method for editing image in portable terminal | |
KR20140019215A (en) | Camera cursor system | |
KR20130116013A (en) | Camera apparatus and method for controlling thereof | |
WO2022156774A1 (en) | Focusing method and apparatus, electronic device, and medium | |
KR101324809B1 (en) | Mobile terminal and controlling method thereof | |
KR20140088669A (en) | Apparatus and method for shooting a image in device hamving a camera | |
KR101465653B1 (en) | Mobile terminal apparatus and control method using semitransparency toolbar | |
KR20120040031A (en) | Photographing potable device having optical pointing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |