WO2012110809A1 - Display apparatus and method therefor
- Publication number: WO2012110809A1 (PCT/GB2012/050337)
- Authority: WIPO (PCT)
- Prior art keywords
- hand
- control apparatus
- display
- operative
- held control
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Definitions
- the present invention relates to display apparatus, which comprises hand-held control apparatus and a display arrangement comprising a display surface, and a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus.
- An electronic whiteboard typically comprises a projector, a computer, such as a Personal Computer (PC), a display surface and a marker pen like device.
- the computer is operative to provide image data to the projector, which is operative to project an image based on the image data onto the display surface.
- the marker pen like device is held by a user and moved over the display surface to perform functions, such as control of what is displayed on the display surface by selecting an appropriate icon displayed as part of the image or annotation of the displayed image.
- a recent electronic whiteboard apparatus is described in WO 2010/129102.
- a pen like device emits acoustic signals which are detected by an ultrasonic detector located beside the display surface with time of flight calculations being used to determine the location of the pen like device on the display surface.
- the Wiimote Whiteboard is another example, in which a Wiimote located beside the display surface is used to determine the location of an infra-red pen on the display surface.
- the present inventor has become appreciative of shortcomings in known electronic whiteboard apparatus, such as the apparatus of WO 2010/129102 and the Wiimote Whiteboard.
- the present invention has been devised in the light of the appreciation of such shortcomings. It is therefore an object for the present invention to provide a display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface, in which the hand-held control apparatus is operative to determine a position of a control point on the display surface.
- display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface
- the hand-held control apparatus comprising: a chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions,
- the display apparatus being configured to determine: in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface;
- a position of a control point on the display surface in relation to a display surface reference point on the display surface, in dependence on the determined relative position of the operative location.
- the position determining apparatus determines of itself a change in position of the hand-held control apparatus in at least two dimensions.
- the position determining apparatus determines the change in position of itself.
- the position determining apparatus may determine the change in position without the assistance of further apparatus which is external to the hand-held apparatus and which is operative to provide an external reference.
- the position determining apparatus determines the change in position by itself.
- the apparatus of WO 2010/129102 and the Wiimote Whiteboard each have hand-held control apparatus that relies on external apparatus which is operative to provide an external reference. More specifically, the apparatus of WO 2010/129102 relies on external apparatus in the form of an ultrasonic detector located beside the display surface and the Wiimote Whiteboard relies on external apparatus in the form of a Wiimote located beside the display surface.
- hand held apparatus comprising a Global Positioning System (GPS) receiver is operative to determine its position with the assistance of signals transmitted from plural satellites whereas the position determining apparatus of the present invention is operative by itself.
- an optical mouse for a personal computer is operative to determine a change in position of the optical mouse by way of light reflected from a surface, such as a mouse mat.
- the surface constitutes external apparatus which is operative to provide a reference for determining a change in position of the mouse.
- the present invention requires no such reference providing external apparatus.
- the determination of the position of the control point on the display surface in dependence on the display surface and control apparatus reference points and position determining apparatus which is operative to determine of itself a change in position, enables the hand-held control apparatus to be used spaced apart from the display surface.
- the hand-held control apparatus may be employed by a user facing away from the display surface.
- the apparatus of WO 2010/129102 and the Wiimote Whiteboard require the hand-held control apparatus to be used on or proximate the display surface and such that the hand-held control apparatus is within range of the external apparatus.
- configuration of the display apparatus according to the present invention to determine the changed position of the operative location on the hand-held control apparatus in first and second orthogonal directions relative to the control apparatus reference point and despite movement of the operative location in a third direction which is orthogonal to each of the first and second directions enables the hand-held control apparatus to be used in free space, i.e. without movement of the hand-held control apparatus being constrained, e.g. by supporting the hand-held control apparatus or the hand or arm holding the hand-held control apparatus on a surface.
- the hand-held control apparatus may be used afar from the display surface in a virtual plane generally parallel to the display surface or even in a virtual plane diverging from the plane of the display surface, such as in a classroom where the display surface is at the front of the classroom and the teacher operates the hand-held control apparatus from the rear of the classroom.
- the display apparatus is operative to determine a control apparatus reference point at a location spaced apart from the display surface.
- the control apparatus reference point is determined in dependence on operation of the hand-held control apparatus, e.g. as a consequence of actuation of a user operable input, such as a push button, on the hand-held control apparatus.
- the control apparatus reference point may be operative as an internally generated reference for determination of at least one of an extent and a direction of movement of the operative location.
- the internal reference may thus take the place of an external reference provided by the prior art examples given above.
- a display surface reference point on the display surface is used by the display apparatus to determine the position of the control point on the display surface.
- the display surface reference point may be stored by the display apparatus.
- the control apparatus reference point is at a location spaced apart from the display surface and the display surface reference point is on the display surface. Therefore the control apparatus reference point and display surface reference point are at different locations. A changed position of the operative location on the hand-held control apparatus relative to the control apparatus reference point is determined.
- the changed position may comprise a displacement in each of first and second orthogonal directions. Then the position of the control point relative to the display surface reference point is determined in dependence on the changed position of the operative location relative to the control apparatus reference point.
- the display apparatus may therefore be operative to determine a change in position of the control point in a first two coordinate space and in dependence thereon to determine a position of the control point in a second two coordinate space.
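The mapping from the first two coordinate space (device movement relative to the control apparatus reference point) to the second (control point position relative to the display surface reference point) can be sketched as follows. This is an illustrative sketch, not taken from the patent; the `scale` parameter relating device displacement to display pixels is an assumed illustrative value.

```python
def control_point(display_ref, displacement, scale=1.0):
    """Map a displacement of the operative location, measured relative to
    the control apparatus reference point, onto a control point position
    relative to the display surface reference point.

    display_ref: (x, y) display surface reference point in display pixels.
    displacement: (dx, dy) device displacement in two orthogonal directions.
    scale: assumed factor converting device movement to display pixels.
    """
    dx, dy = displacement
    rx, ry = display_ref
    return (rx + scale * dx, ry + scale * dy)
```

For example, a 10-unit rightward, 5-unit upward movement of the device relative to its reference point moves the control point by the scaled amounts from the display surface reference point.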
- the display arrangement may be operative to display an image of no more than two dimensions on the display surface.
- the display surface itself may define a non-planar surface, e.g. the display surface may be slightly curved. Therefore the display apparatus may be operative to determine the position of the control point in a two dimensional display plane.
- the display apparatus may be configured such that, in use, the display plane and a plane defined by the first and second directions, e.g. a virtual plane, diverge from each other.
- the control apparatus reference point may lie on the virtual plane.
- the display arrangement may comprise at least one of: a passive display surface, such as a wall or screen on which images are projected by projection apparatus; and an active display surface, such as forms part of a monitor.
- the display apparatus may be operative to form a data structure representing the determined relative position of the operative location on the hand-held control apparatus in relation to the display surface reference point.
- the data structure may comprise further data, such as data relating to plural display surface reference points and successive control point locations.
- the hand-held control apparatus may be used to perform one or more functions.
- the hand-held control apparatus may be used to draw or write with the thus formed drawing or writing being displayed on the display surface.
- the hand-held control apparatus may be hand-held pen apparatus.
- the hand-held control apparatus may be used to perform control functions, such as scrolling of an image shown on the display surface by selecting and moving a cursor displayed on the display surface.
- the position determining apparatus may be configured to determine of itself a change in position of the hand-held control apparatus in three dimensions.
- the display apparatus may be operative to transform movement of the hand-held control apparatus in three dimensions into movement of the control point on the display surface in two dimensions. Determination of a change in position of the hand-held control apparatus in three dimensions allows for movement of the hand-held control apparatus out of a plane, e.g. virtual plane, in which a control action is being effected.
- the hand-held control apparatus may not need to be maintained in the virtual plane in which the control action is being effected such that it is possible, for example, for the hand-held control apparatus to be lifted from and subsequently returned to the virtual plane in which control is being effected.
- the position determining apparatus may comprise inertial sensing apparatus.
- the inertial sensing apparatus may be operative to sense motion in dependence on at least one reference internal to the hand-held apparatus. More specifically, the inertial sensing apparatus may comprise an accelerometer. For example, where the position determining apparatus is operative in two dimensions, the inertial sensing apparatus may comprise a two axis accelerometer. Where the position determining apparatus is operative in three dimensions the inertial sensing apparatus may comprise a three axis accelerometer. An output from the accelerometer may be integrated twice with respect to time to thereby determine a relative location, e.g. in terms of displacement in each of two orthogonal directions, of the hand-held control apparatus.
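The double integration of a two axis accelerometer output described above can be sketched as below. This is a minimal illustration, not the patent's implementation: it uses trapezoidal numerical integration and assumes bias-free samples, whereas a real device would also need bias removal and drift correction.

```python
import numpy as np

def integrate_acceleration(accel, dt, v0=(0.0, 0.0), p0=(0.0, 0.0)):
    """Twice-integrate two-axis accelerometer samples to estimate displacement.

    accel: sequence of (ax, ay) samples in m/s^2, taken at interval dt seconds.
    v0, p0: initial velocity and position. Returns positions, shape (N, 2).
    Trapezoidal rule; an illustrative sketch only (no bias/drift handling).
    """
    a = np.asarray(accel, dtype=float)                       # (N, 2)
    # First integration: acceleration -> velocity.
    v = np.asarray(v0) + np.cumsum((a[:-1] + a[1:]) / 2 * dt, axis=0)
    v = np.vstack([np.asarray(v0, dtype=float), v])          # (N, 2)
    # Second integration: velocity -> position.
    p = np.asarray(p0) + np.cumsum((v[:-1] + v[1:]) / 2 * dt, axis=0)
    return np.vstack([np.asarray(p0, dtype=float), p])       # (N, 2)
```

Under constant acceleration of 1 m/s² along one axis, the estimated displacement after 1 s is the expected 0.5 m.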
- the hand-held control apparatus may comprise orientation determining apparatus, which is configured to determine of itself a change in orientation of the hand-held control apparatus about each of at least two mutually orthogonal axes. More specifically, the orientation determining apparatus may be configured to determine of itself a change in orientation of the hand-held control apparatus about each of three mutually orthogonal axes.
- the orientation determining apparatus may comprise a gyroscope.
- the hand-held control apparatus comprises a gyroscope
- an output from the gyroscope may be integrated with respect to time to determine an orientation of the hand-held control apparatus.
- the hand-held control apparatus may comprise heading determining apparatus, such as a magnetometer.
- an output from the accelerometer may be changed in dependence on the determined orientation before the changed accelerometer output is integrated.
- the orientation may be multiplied by an output from the accelerometer.
- the changed accelerometer output may be aligned properly with a fundamental coordinate frame, i.e. the coordinate frame when the hand-held control apparatus is initialised.
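The step of changing the accelerometer output in dependence on the gyroscope-derived orientation, so that it is aligned with the fundamental (initialisation) coordinate frame before integration, can be sketched in the planar case as below. This is a simplified illustration (single-axis rate gyro, 2D rotation), not the patent's implementation.

```python
import math

def world_frame_accel(gyro_z, accel_xy, dt, theta0=0.0):
    """Rotate body-frame accelerometer samples into the fundamental frame.

    gyro_z: angular rate samples (rad/s) about the axis normal to the plane.
    accel_xy: matching (ax, ay) body-frame accelerometer samples.
    The gyro output is integrated with respect to time to track orientation,
    and each accelerometer sample is rotated by that orientation.
    """
    theta = theta0
    out = []
    for wz, (ax, ay) in zip(gyro_z, accel_xy):
        theta += wz * dt                                   # integrate gyro rate
        c, s = math.cos(theta), math.sin(theta)
        out.append((c * ax - s * ay, s * ax + c * ay))     # rotate into world frame
    return out
```

The rotated samples can then be passed to the double integration described earlier, so that the computed displacement stays aligned with the coordinate frame in which the hand-held control apparatus was initialised.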
- the orientation of the hand-held control apparatus may be changed but without such a change in orientation being intended to effect a change of location of the control point.
- the hand-held control apparatus might be rotated by the hand of the user about the operative location of the hand-held control apparatus such that the centre of the apparatus where the position determining apparatus is located moves up or down.
- the display apparatus may be operative to determine the location of the control point in dependence on operation of the position determining apparatus and the orientation determining apparatus.
- the display apparatus may take account of a change of orientation of the hand-held control apparatus in determining a location of the control point.
- the operative location on the hand-held control apparatus may be spaced apart from the position determining apparatus.
- the orientation of the hand-held control apparatus may be changed without there being any translation of the operative location of the hand-held control apparatus.
- the hand-held control apparatus may be rotated about the operative location such that the position determining apparatus describes an arc.
- Such a movement may be detected by the position determining apparatus. Therefore, the position of a control point may be determined in dependence on operation of the position determining apparatus and the orientation determining apparatus and a distance between the position determining apparatus and the operative location on the hand-held control apparatus.
- Determination of movement of the position determining apparatus along three axes and of orientation of the hand-held control apparatus about three axes may enable a location of the operative location on the hand-held control apparatus to be determined along three axes where the location of the operative location in relation to the position determining apparatus is known.
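The relationship described above is the standard rigid-body one: the operative location (e.g. a pen tip) equals the sensed position of the position determining apparatus plus the known body-frame offset rotated by the device orientation. A minimal sketch, assuming the orientation is available as a 3x3 rotation matrix:

```python
import numpy as np

def tip_position(sensor_pos, rotation, tip_offset):
    """Locate the operative location from the sensor position and orientation.

    sensor_pos: (x, y, z) position of the position determining apparatus.
    rotation: 3x3 rotation matrix for the device orientation (from gyro data).
    tip_offset: fixed body-frame vector from the sensor to the operative
    location. Illustrative sketch of the rigid-body relation only.
    """
    return np.asarray(sensor_pos, float) + np.asarray(rotation, float) @ np.asarray(tip_offset, float)
```

This is why a pure rotation about the operative location (the sensor describing an arc) produces no net translation of the tip: the sensor movement and the rotated offset cancel.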
- the hand-held control apparatus may comprise first and second position determining apparatus, each of the first and second position determining apparatus being operative to determine of itself a change in position of the hand-held control apparatus in at least two dimensions. More specifically, the display apparatus may be operative to determine the changed position of the operative location on the hand-held control apparatus in dependence on at least one of: a difference between outputs from the first and second position determining apparatus; and an average of outputs from the first and second position determining apparatus. Hence, common mode errors may be reduced where a difference is used and greater accuracy may be achieved where an average is used. The display apparatus may be operative to determine a difference between outputs from the first and second position determining apparatus after an integration of each output from the first and second position determining apparatus.
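The averaging and differencing of the two position determining apparatus outputs can be sketched as follows. This is an illustrative sketch of the fusion step only; the patent leaves the exact combination open.

```python
def fuse_dual_sensors(disp_a, disp_b):
    """Combine displacement outputs from two independent position sensors.

    disp_a, disp_b: (dx, dy) integrated outputs from the first and second
    position determining apparatus. The average improves accuracy; the
    difference (taken after integration of each output) exposes
    common-mode error or inconsistency between the sensors.
    """
    avg = tuple((a + b) / 2 for a, b in zip(disp_a, disp_b))
    diff = tuple(a - b for a, b in zip(disp_a, disp_b))
    return avg, diff
```

A large difference could, for example, be used to trigger the reset procedure discussed later, while the average is used as the working displacement.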
- the display apparatus may be operative to detect at least one predetermined movement, for example an unintentional movement, such as may arise from dropping the hand-held control apparatus. More specifically, the at least one predetermined movement may be determined in dependence on a comparison of a characteristic of movement with a threshold value, e.g. when the velocity of movement or the acceleration is greater than a value that is representative of a shock event such as is liable to arise when the hand-held control apparatus is dropped.
- the hand-held control apparatus may be used to determine a control apparatus reference point by positioning the operative location on the hand-held control apparatus at a desired location. Therefore, the hand-held control apparatus may further comprise an input device mounted on the chassis, the input device being operative to initiate determination of the control apparatus reference point.
- the input device may be operative to determine the display surface reference point. More specifically, the hand-held control device may be held such that the operative location on the hand-held control apparatus is at or proximate a desired location on the display surface with the input device being operative thereupon to determine the display surface reference point.
- the input device may be user operable.
- the input device may comprise a user operable actuator which is operated to initiate a determination.
- the actuator may comprise a switch, such as a push-button switch, or a selector, such as a wheel.
- the user may move the hand-held control apparatus such that its operative location is on or proximate the display surface and operate the actuator to thereby initiate determination of the display surface reference point.
- the display apparatus may be operative to display an object at a predetermined location on the display surface and the hand-held control apparatus may be held at or proximate the object before the input device is operative.
- the user may locate the hand-held control apparatus at a location spaced apart from the display surface and oriented such that the operative location is directed at the desired location on the display surface or otherwise held at a location in free space in which control movements are to be made and then operate the actuator to thereby initiate determination of the control apparatus reference point.
- the input device may comprise a sensor, e.g. image sensor, which is operative upon reception of a stimulus to initiate at least one of: determination of the display surface reference point; and determination of the control apparatus reference point.
- the input device may operate in dependence on reception of the stimulus either with or without manual operation.
- the hand-held control apparatus may comprise an image sensor mounted on the chassis.
- the image sensor may be mounted on the chassis such that it is at or proximate the operative location on the hand-held control apparatus.
- the image sensor may be configured to acquire colour images.
- the display apparatus may be configured to adjust at least one characteristic of an image acquired by the image sensor.
- the at least one characteristic may comprise at least one of: luminance; shade; hue; tint; Red, Green and Blue (RGB) values; Hue, Saturation and Lightness (HSL) values; and Hue, Saturation and Value (HSV) values.
- a characteristic of a grey scale image may be adjusted if a grey scale image sensor is employed or a characteristic of a colour image may be adjusted if a colour image sensor is employed.
- the display apparatus may be configured to adjust at least one characteristic of the image by controlling the image sensor.
- the display apparatus may be operative to analyse an image acquired by the image sensor with the at least one characteristic of a further image being adjusted in dependence thereon.
- the analysis may comprise comparing at least one characteristic of the acquired image with a predetermined characteristic.
- the analysis may comprise comparing the acquired image with a stored image having predetermined characteristics.
- the acquired image may be a representation of a predetermined object which is, for example, displayed on the display surface.
- the image sensor and associated processing apparatus may be calibrated for proper representation of acquired colour or grey scale images.
- the display apparatus may adjust for a change in ambient light level or for the effect of a change in ambient light colour in addition to compensating for improper image acquisition.
- the image sensor may be comprised in the input device.
- the display surface reference point may be determined in dependence on an image sensed by the image sensor.
- the display apparatus may be operative to display an object at a predetermined location on the display surface and the hand-held apparatus may be held at or proximate the object such that the object may be sensed by the image sensor.
- the displayed object may be of predetermined form.
- the display apparatus may be operative to recognise the form of the sensed object and to initiate determination of the display surface reference point in dependence thereon and without manual operation.
- the display surface reference point may be determined in dependence on both manual operation and sensing of the predetermined object. Hence, greater precision and reliability of determination of the display surface reference point may be achieved.
- the image sensor may be operative to acquire at least one image and the display apparatus may be operative to change a location of the control point in dependence on the acquired at least one image.
- the image sensor may be operative to acquire first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the display apparatus may be operative to: determine the relative locations of the acquired first and second images within an image frame of the image sensor; and move the control point in dependence on the determined relative locations. If the control point is unable to follow movement of the hand-held control apparatus on the basis of operation of the position determining apparatus, e.g. where the position determining apparatus is subject to error, the first and second images may be spaced apart from each other within the image frame.
- the display apparatus may be further operative to move the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame.
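The correction step described above can be sketched as follows: the control point is moved by the offset between the two acquired images of the cursor within the image frame. This is an illustrative sketch which assumes the image-frame offset has already been converted into display pixels.

```python
def correct_control_point(point, img1_pos, img2_pos):
    """Re-track the control point using two time-spaced cursor images.

    point: current (x, y) control point on the display surface.
    img1_pos, img2_pos: locations of the cursor in the first and second
    acquired images, assumed already expressed in display pixels.
    Moves the control point by the extent to which the two images are
    spaced apart, correcting drift in the position determining apparatus.
    """
    dx = img2_pos[0] - img1_pos[0]
    dy = img2_pos[1] - img1_pos[1]
    return (point[0] + dx, point[1] + dy)
```

If the position determining apparatus has accumulated no error, the two images coincide and the control point is left unchanged.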
- the display apparatus may be configured such that when the hand-held control apparatus is held so as to direct the image sensor towards an object, e.g. a cursor, at a predetermined location on the display surface, the image sensor may be operative thereupon to acquire an image of the object and the display apparatus may be operative to move the control point in dependence on the acquired image and the predetermined location.
- the control point may be moved to the location of the object.
- a fresh control apparatus reference point may be determined upon acquisition of the image of the object.
- the hand-held control apparatus may be redirected so as to allow for acquisition of the object by the image sensor and the display apparatus may be operative to correct the position on the display surface of the control point.
- the hand-held control apparatus may further comprise a lens arrangement disposed in relation to the image sensor so as to alter a field of view of the image sensor. More specifically, the lens arrangement may widen the field of view of the image sensor.
- the lens arrangement may comprise a fish-eye lens.
- the fish eye lens may provide a wide angle of view of the display surface and thereby reduce the effect of shadow and variation in light intensity arising from an object, such as the user's hand, obscuring the display surface.
- the lens arrangement may distort an image of an object acquired with the image sensor. Therefore, the display apparatus may be configured to transform an image acquired by the image sensor so as to reduce if not substantially remove distortion caused by the lens arrangement.
- a fish eye lens is liable to introduce barrel distortion and the display apparatus may be configured to apply a transformation that reduces such barrel distortion. More specifically, the display apparatus may store a predetermined lens transformation with the predetermined lens transformation being applied to acquired image data.
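A barrel-distortion correction of the kind described can be sketched with the common single-coefficient radial model. This is an illustrative first-order sketch, not the patent's stored transformation; the coefficient `k1` stands in for the predetermined lens calibration.

```python
def undistort_point(xd, yd, k1, cx=0.0, cy=0.0):
    """Correct radial (barrel) distortion for one image point.

    xd, yd: distorted point coordinates; cx, cy: distortion centre.
    k1: single radial coefficient from a stored lens calibration
    (assumed; a real calibration typically uses more coefficients).
    Applies x_u = x_d * (1 + k1 * r^2) about the distortion centre.
    """
    x, y = xd - cx, yd - cy
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2
    return (cx + x * factor, cy + y * factor)
```

In practice such a transformation would be applied to every acquired image point (or precomputed as a remapping table) before the image is compared against the displayed object.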
- the display apparatus may be configured to transform an image acquired by the image sensor in dependence upon a distance between the hand held control apparatus and the display screen.
- the image may be resized in dependence on the distance with such resizing being useful for comparison of the image with another image such as during a position correction procedure.
- the display apparatus may be configured to transform an image acquired by the image sensor in dependence on an orientation of the hand-held control apparatus in relation to the display surface.
- An orientation of the hand-held control apparatus in relation to the display surface may be determined in dependence on operation of at least one of the position determining apparatus and the orientation determining apparatus.
- the display apparatus may be configured to determine the display surface reference point when the hand-held control apparatus is held at a location on or proximate the display surface.
- the display surface reference point may be determined in dependence on operation of the input device, e.g. operation of a switch by the user.
- This form of the invention may be appropriate where the handheld apparatus is used in the same room as the display surface or in a room nearby.
- the display surface reference point may be determined by means other than the hand-held control apparatus. More specifically, the display surface reference point may be determined by control apparatus forming part of the display apparatus, such as a conventional mouse or the like forming part of or connected to a PC comprised in the display apparatus. Hence, the display surface reference point may be determined without the hand-held control apparatus being moved to the display surface.
- the display apparatus may be configured when the handheld control apparatus is held at plural spaced apart locations on or proximate the display surface to determine corresponding display surface reference points.
- the user may determine the dimensions of the display surface and thereby fit or perhaps even scale movement of the hand-held control apparatus when the handheld control apparatus is moved with respect to the control apparatus reference point.
- the display apparatus may be configured when the hand-held control apparatus is held at three different locations on or proximate the display surface, e.g. at top left, at top right and at bottom centre of the display surface, to determine three spaced apart display surface reference points. This form is appropriate for a display surface of rectangular form.
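Three point pairs determine an affine transform exactly, so the three display surface reference points described above suffice to fit (or scale) device movement onto a rectangular display surface. The sketch below solves for that transform; it is an illustration of the fitting step, not the patent's stated method.

```python
import numpy as np

def affine_from_three_points(device_pts, display_pts):
    """Fit the 2x3 affine map taking three device-space reference points
    onto the corresponding display surface reference points (e.g. top
    left, top right, bottom centre).

    Returns M such that (x', y') = M @ [x, y, 1]. Three non-collinear
    point pairs determine M exactly; more points (for a distorted
    surface) would need a least-squares or piecewise fit instead.
    """
    src = np.hstack([np.asarray(device_pts, float), np.ones((3, 1))])  # 3x3
    dst = np.asarray(display_pts, float)                               # 3x2
    return np.linalg.solve(src, dst).T                                 # 2x3
```

With more than three reference points, as suggested below for a distorted display surface, the same idea extends to a least-squares affine fit or a homography.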
- the display surface may be distorted, e.g. non-planar or curved.
- the display apparatus may be configured when the hand-held control apparatus is held at more than three spaced apart locations on or proximate the display surface, e.g. at each of all four corners of the display surface, to determine a corresponding number of display surface reference points.
- the number of display surface reference points determined may depend on the form of distortion with the user determining the presence of distortion by eye and selecting the number of display surface reference points to be determined.
- the hand-held control apparatus may comprise communications apparatus mounted on the chassis.
- the communications apparatus may be configured to provide for communication with the display arrangement.
- Communication with the display arrangement may be wireless, e.g. in accordance with Wi-Fi or in the 2.4 GHz radio frequency band.
- the display arrangement may comprise a Personal Computer (PC).
- the display arrangement may comprise a projector operative to project images onto the display surface, e.g. onto a wall.
- the display arrangement may comprise a display device, e.g. a monitor, which defines the display surface.
- the display arrangement comprises a PC
- the PC and projector or display device are in data communication.
- the PC and projector or display device may be of a form and may cooperate to display images on the display surface in a fashion that is well known to the reader skilled in the art.
- the PC may run a known operating system to provide this capability.
- the display apparatus may further comprise a communications device which is configured to electrically connect to the display arrangement and to provide for communication with the hand-held control apparatus.
- the communications device may be configured to be received in a port of the PC, such as a USB port.
- the PC may employ a USB driver, which runs on an operating system, such as Windows, to effect communication with the communications device.
- the communications device may comprise wireless communications apparatus, which is operative to provide for wireless communication with the hand-held control apparatus, e.g. by way of the wireless communications apparatus of the hand-held control apparatus.
- the communications device may comprise data storage, which is operative to store software instructions for configuring the display arrangement for operation with the hand-held control apparatus.
- the communications device and the display arrangement may be operative to install the software instructions stored in the communications device in the display arrangement. Installation may be initiated without manual intervention other than that required to effect electrical connection between the communications device and the display arrangement.
- the display apparatus may be operative to identify at least one predetermined movement of the hand-held control apparatus, e.g. in dependence on operation of the position determining apparatus.
- Such at least one predetermined movement may be of an unintended nature, e.g. the movement may arise from dropping of the hand-held control apparatus or the striking of the hand-held control apparatus against a surface.
- the display apparatus may initiate a reset procedure in dependence on identification of such a predetermined movement.
- the display apparatus may be operative to prompt for at least one of a fresh control apparatus reference point and a display surface reference point to be determined.
- at least one predetermined movement may be of an intended nature, e.g. the movement may be of the form of a double tap of the hand-held control apparatus in a virtual plane in which the handheld control apparatus is being used.
- the display apparatus may initiate the performance of a control operation in dependence on identification of a predetermined movement of an intended nature.
- the display apparatus may initiate a control operation in respect of the control location on the display surface.
- the predetermined movement may initiate the opening of the document that the icon represents.
- the hand-held control apparatus may comprise proximity detector apparatus, the proximity detector apparatus being configured to determine when the hand-held control apparatus is proximate the display surface.
- the proximity detector apparatus may be configured to determine an extent of separation between a location on the hand-held control apparatus and a location on the display surface.
- the proximity detector apparatus may comprise a light emitter and a light detector disposed in relation to each other and operative such that light emitted by the light emitter is reflected by the display surface and detected by the light detector.
- the light emitter may be an infra-red Light Emitting Diode (LED) and the light detector may be a photo-diode.
- the display apparatus may be operative in dependence on operation of the proximity detector apparatus to control operation of the hand-held control apparatus. More specifically, the display apparatus may be operative to provide for determination of the position of the control point on the display surface.
- the display apparatus may be operative to supply electrical power or reduce supply of electrical power to parts of the hand-held control apparatus. Therefore, the proximity detector apparatus may be operative to vary the supply of electrical power to the hand-held control apparatus in dependence on an extent of proximity of the hand-held control apparatus to the display surface to thereby conserve electrical power. Conservation of electrical power may be advantageous when the electrical power is provided to the hand-held control apparatus from an electrical battery.
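The proximity-dependent power management described above can be sketched as a simple state function. This is an illustrative sketch only: the function name, the three power states and the distance thresholds are assumptions, not taken from the patent.

```python
def power_state(separation_mm, activate_mm=150.0, full_power_mm=50.0):
    """Return a coarse power state for the camera and related parts.

    separation_mm -- separation reported by the proximity detector
                     between the apparatus and the display surface.
    Thresholds are illustrative placeholders.
    """
    if separation_mm <= full_power_mm:
        return "full"      # camera and pointer fully powered
    if separation_mm <= activate_mm:
        return "standby"   # image sensor brought into readiness
    return "sleep"         # conserve the battery far from the surface
```

A controller polling the proximity detector would call this on each reading and gate power to the image sensor accordingly.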
- the display apparatus may be operative to define a control region on the display surface within which the hand-held control apparatus is operative to determine a location of the control point.
- the display apparatus may be further operative to define an activation region on the display surface within which the hand-held control apparatus is activated, e.g. by the bringing into operation of predetermined parts of the hand-held control apparatus, such as the image sensor.
- Activation may be in dependence on operation of at least one of: the proximity detector apparatus; and the position determining apparatus. More specifically, the activation region may be a space within which the proximity detector is operable.
- the control region may lie within the activation region. More specifically, the activation region may extend around the control region.
- movement of the hand-held control apparatus into the activation region activates at least a part of the hand-held control apparatus, e.g. the image sensor, before the hand-held control apparatus is moved into the control region for movement of the control point within the control area.
- the control area may constitute the part of the display surface within which the control point may be moved.
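The nesting of the control region inside the surrounding activation region can be sketched as a point-classification routine. The rectangle representation and function names below are assumptions for illustration; the patent does not specify region shapes.

```python
def region_of(point, control, activation):
    """Classify a point relative to the nested regions.

    control, activation -- (x0, y0, x1, y1) rectangles; the control
    region is assumed to lie inside the activation region.
    """
    def inside(p, r):
        x, y = p
        x0, y0, x1, y1 = r
        return x0 <= x <= x1 and y0 <= y <= y1

    if inside(point, control):
        return "control"      # the control point may be moved here
    if inside(point, activation):
        return "activation"   # image sensor is activated here
    return "outside"          # apparatus parts may be powered down
```

Moving the apparatus inward would thus pass through "activation" (waking the image sensor) before reaching "control".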
- the hand-held control apparatus may be configured for user operation, e.g. by way of a switch, to provide for activation of the hand-held apparatus when used at a location spaced apart from the display surface, e.g. at a distance beyond the reach of the proximity detector apparatus.
- the hand-held control apparatus may comprise pointer apparatus, the pointer apparatus being configured to dispose a user discernible mark on the display surface when the hand-held control apparatus is directed towards the display surface.
- the pointer apparatus may be used to point at locations other than on the display surface.
- the pointer apparatus may comprise a laser device.
- the pointer apparatus may be operative to illuminate a part of the display surface.
- the display apparatus may be configured to change a focus of the imaging sensor. The focus may be changed by at least one of: processing of an image acquired by the imaging sensor; and changing a configuration of imaging apparatus. Processing of the image acquired by the imaging sensor may comprise applying a transformation to the acquired image.
- changing a configuration of the imaging apparatus may comprise changing a separation between the imaging sensor and the focusing lens.
- the display apparatus may be operative to change the focus in dependence on at least one of: a distance between the hand-held control apparatus and the display surface, e.g. as determined by the position determining apparatus; and a quality of an image acquired by the image sensor, e.g. by comparing an acquired image of an object with a stored image of the same object.
- the display apparatus may be configured to change the focus of the imaging sensor on an automatic basis, i.e. without manual intervention.
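The two refocusing triggers above (change of distance; degraded image quality) can be sketched as a single decision function. The thresholds and names here are illustrative assumptions, not values from the patent.

```python
def needs_refocus(distance_mm, last_focus_mm, quality, tol_mm=100.0, q_min=0.6):
    """Decide whether an automatic refocus is warranted.

    distance_mm   -- current separation from the display surface, e.g.
                     as determined by the position determining apparatus.
    last_focus_mm -- separation at which focus was last set.
    quality       -- similarity score (0..1) between an acquired image
                     of an object and a stored image of the same object.
    """
    moved = abs(distance_mm - last_focus_mm) > tol_mm
    degraded = quality < q_min
    return moved or degraded
```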
- the hand-held control apparatus may be elongate in form. More specifically, the hand-held control apparatus may have the form of a marker pen.
- the operative location on the hand-held control apparatus may be located at or proximate an end of the hand-held control apparatus.
- At least one of an image sensor, proximity detector apparatus and pointer apparatus may be disposed at or proximate an end of the hand-held control apparatus.
- a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the method comprising:
- control apparatus determining, in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface
- Embodiments of the second aspect of the present invention may comprise one or more features of the first aspect of the present invention.
- a computer program comprising program instructions for causing a computer, which is comprised in a display arrangement comprising a display surface, and a hand-held control apparatus according to the first aspect of the present invention to perform the method according to the second aspect of the present invention.
- the computer program may be one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal.
- the computer program may be embodied in a communications device which is configured to electrically connect with the computer, e.g. by way of a connector on the computer, such as a USB port.
- the hand-held control apparatus may be a mobile device, such as a smartphone or tablet device.
- the smartphone may comprise inertial navigation apparatus.
- the smartphone may further comprise at least one of an orientation determining apparatus and an image sensor.
- a computer system comprising: a display arrangement comprising a display surface; and handheld control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions; and program instructions for causing the computer system to perform the method according to the second aspect of the present invention.
- a computer in the form of a general purpose computer, such as a Personal Computer (PC), an embedded microcontroller or a microprocessor may form part of the display arrangement.
- the program instructions may be at least one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal. Further embodiments of the fourth aspect of the present invention may comprise one or more features of the first aspect of the present invention.
- display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface
- the hand-held control apparatus comprising: a chassis; an image sensor mounted on the chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the display apparatus being configured to:
- the display apparatus relies principally on the position determining apparatus to determine the position of the control point with the at least one acquired image being used to adjust the position of the control point.
- Position determining apparatus that is operative of itself, e.g. inertial navigation apparatus, may be liable to error over time. For example, position determinations may be liable to drift over time and thus register a progressively increasing level of error. Adjusting the control point by means of the at least one acquired image addresses such an error. Hence, the display apparatus may be operative to adjust the control point in dependence on the at least one acquired image periodically.
- the image sensor may be operative to acquire first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the display apparatus may be operative to: determine the relative locations of the acquired first and second images within an image frame of the image sensor; and move the control point in dependence on the determined relative locations.
- the hand-held control apparatus may be held with respect to the display surface to allow for acquisition of images from the display surface by the image sensor. If the control point is unable to follow movement of the hand-held control apparatus on the basis of operation of the position determining apparatus, e.g. where the position determining apparatus is subject to error.
- the first and second images may be spaced apart from each other within the image frame.
- the display apparatus may be further operative to move the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame.
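The drift correction described above can be sketched in a few lines: if the inertial estimate were exact, an object such as the cursor would appear at the same location in both acquired frames, so any separation between the first and second images measures accumulated drift and the control point is moved by the corresponding amount. Function names and the pixel-to-display scale factor are illustrative assumptions.

```python
def drift_correction(first_px, second_px, px_to_display=1.0):
    """Separation of an object between two image frames, scaled to a
    control-point correction on the display surface."""
    dx = (second_px[0] - first_px[0]) * px_to_display
    dy = (second_px[1] - first_px[1]) * px_to_display
    return dx, dy

def corrected_point(control_point, first_px, second_px, px_to_display=1.0):
    """Move the control point by the measured inter-frame separation."""
    dx, dy = drift_correction(first_px, second_px, px_to_display)
    return control_point[0] + dx, control_point[1] + dy
```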
- the display apparatus may be configured such that when the hand-held control apparatus is held so as to direct the image sensor towards an object, e.g. a cursor, at a predetermined location on the display surface, the image sensor may be operative thereupon to acquire an image of the object and the display apparatus may be operative to move the control point in dependence on the acquired image and the predetermined location. The control point may be moved to the location of the object.
- an error in operation of the position determining apparatus may be addressed, e.g. on a periodic basis and as the need arises. If, for example, the position determining apparatus is subject to an error which gives rise to a lack of coincidence between the actual location of the control point on the display surface and the desired location, the hand-held control apparatus may be moved so as to allow for acquisition of an image of the object by the image sensor and the display apparatus may be operative to correct the position of the control point on the display surface.
- Further embodiments of the fifth aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- According to a sixth aspect of the present invention there is provided a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis, an image sensor mounted on the chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the method comprising: determining, in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus after movement of the hand-held control apparatus by hand;
- Embodiments of the sixth aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- a seventh aspect of the present invention there is provided a computer program comprising program instructions for causing a computer, which is comprised in a display arrangement comprising a display surface, and a hand-held control apparatus according to the fifth aspect of the present invention to perform the method according to the sixth aspect of the present invention.
- Embodiments of the seventh aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- a computer system comprising: a display arrangement comprising a display surface; and handheld control apparatus comprising a chassis and position determining apparatus and an image sensor mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions and the image sensor being operative to acquire at least one image of an object on the display surface; and program instructions for causing the computer system to perform the method according to the sixth aspect of the present invention.
- Embodiments of the eighth aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- a display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface
- the hand-held control apparatus comprising: a chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine a change in position of the hand-held control apparatus in at least two dimensions, the display apparatus being configured: to determine a changed position of the hand-held control apparatus in dependence on operation of the position determining apparatus after movement by hand of the hand-held control apparatus; and to determine a position of a control point on the display surface in dependence on the determined changed position.
- Embodiments of the further aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine a change in position of the hand-held control apparatus in at least two dimensions, the method comprising: determining a changed position of the hand-held control apparatus in dependence on operation of the position determining apparatus and after movement by hand of the hand-held control apparatus; and determining a position of a control point on the display surface in dependence on the determined changed position.
- Embodiments of the yet further aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
- Figure 1 shows display apparatus according to the present invention;
- Figure 2 is a drawing in longitudinal section of the hand-held control apparatus of Figure 1;
- Figure 3 is a block diagram representation of the main components of the hand-held control apparatus of Figure 2;
- Figure 4 is a block diagram representation of the software components of the present invention;
- Figure 5 is a block diagram representation of operation of the present invention with regards to the inertial navigation apparatus;
- Figure 6 is a block diagram representation of operation of the present invention with regards to the proximity detector and image sensor;
- Figure 7 is a block diagram representation of processing of acquired image data;
- Figure 8 illustrates the initialisation process for the present invention;
- Figure 9 represents certain control functions provided by the present invention;
- Figure 10 represents certain error handling procedures; and
- Figure 11 shows the activation and control regions of a display surface.
- the display apparatus 10 comprises hand-held control apparatus 12 and a display arrangement.
- the display arrangement comprises a Personal Computer (PC) 14, a projector 16 and a display surface 18 formed on a wall.
- the PC 14, projector 16 and display surface 18 are of a form and cooperate to display images on the display surface in a fashion that is well known to the reader skilled in the art.
- the display apparatus 10 further comprises a communications device 20, which is received in and electrically connects with a USB port on the PC 14.
- the communications device 20 and the hand-held control apparatus 12 are configured to provide for wireless communication between the communications device and the hand-held control apparatus in the 2.4 GHz radio frequency band. The provision of a 2.4 GHz radio frequency channel between the hand-held control apparatus 12 and the communications device 20 will be readily within the grasp of the reader skilled in the art without resorting to any more than ordinary design skills.
- the hand-held control apparatus 30 is in the form of a cylinder which tapers towards one end such that the hand-held control apparatus is similar in form and size to a conventional marker pen.
- the hand-held control apparatus 30 has an outer shell 32 formed of a hard plastics material or similar such material, which contains or supports the components of the hand-held control apparatus.
- the outer shell comprises a section 34 of transparent plastics material.
- the section 34 of transparent plastics material has a tapered profile and defines an aperture 36 at its distal end. The distal end of the tapered section constitutes the operative location of the hand-held apparatus.
- the hand-held control apparatus 30 of Figure 2 comprises a chassis 38 with a first three-axis accelerometer 40, a second three-axis accelerometer, a gyroscope 42, a magnetometer and a colour camera 44 mounted thereon.
- the accelerometers 40 and the gyroscope 42 are MEMS devices of a type that will be readily available to the reader skilled in the art.
- the magnetometer is a three axis, digital magnetometer, namely part number MAG3110 from Freescale Semiconductor Inc., 6501 William Cannon Drive West, Austin, Texas 78735, USA.
- the colour camera 44 is of a type that will be readily available to the reader skilled in the art. The colour camera 44 is located towards the tapered end of the hand-held apparatus.
- the hand-held control apparatus 30 further comprises a fish-eye lens 48 located between the colour camera 44 and the aperture 36, and a laser pointer 50 and proximity detector 52, which are mounted side-by-side between the fish-eye lens 48 and the aperture 36.
- the laser pointer and the proximity detector obscure the field of view of the camera; attending to the effect of this obscuration is described below with reference to Figure 7.
- the laser pointer is of a type which is safe in use and readily available to the reader skilled in the art.
- the proximity detector 52 comprises an infra-red Light Emitting Diode (LED) and a photo-diode light detector and is operative when light emitted by the LED is reflected by a surface and detected by the photo-diode.
- the hand-held control apparatus 30 comprises a USB port 54, which provides for data communications with the electronic circuitry of the handheld control apparatus 30 to thereby provide for software updates, calibration and the like, a microprocessor 56, a wireless transceiver 58, a battery 60, which provides electrical power for the hand-held control apparatus 30, and control switches and buttons.
- the wireless transceiver 58 is configured to provide for wireless communication in the 2.4 GHz radio frequency band.
- the control switches and buttons comprise an on/off switch 62, a left mouse button 64, a right mouse button 66 and an auxiliary button 68.
- A block diagram representation 80 of the main components of the hand-held control apparatus and certain components of the display arrangement of Figures 1 and 2 is shown in Figure 3.
- the hand-held control apparatus comprises an electronics core 82, which comprises the microprocessor 56 and associated static and flash memory.
- the hand-held control apparatus further comprises the gyroscope 42, 84, the accelerometers 40, 86, the magnetometer, a temperature sensor 88, the colour camera 44, 90 and proximity detector 52, 92 with each component being in data communication with the electronics core 82.
- the temperature sensor 88 measures the temperature of the hand-held control apparatus, with the measurements being used to provide for temperature compensation.
- the hand-held control apparatus comprises the control switches and buttons 94, the USB port 54, 96 and the 2.4 GHz radio frequency transceiver 58, 98, which are each in communication with the electronics core 82.
- the block representing the control switches and buttons 94 in Figure 3 specifies four buttons instead of the three buttons shown in Figure 2 and a scroll wheel that is absent from Figure 2. This is because control mechanisms other than the control mechanism shown in Figure 2 are envisaged depending on the nature of the application intended for the hand-held control apparatus. For example, a scroll wheel is advantageous where it is intended to effect control operations which involve scrolling a control point up and down menu items shown on the display screen.
- a fourth button may be required to, for example, switch on and off the laser pointer 50, 100, which is represented in Figure 3.
- the hand-held control apparatus also comprises a battery 60, 102 and power management circuitry 104, which is operative to provide for routine regulation and control functions.
- Figure 3 also shows the PC 14, 106 and its 2.4 GHz radio frequency transceiver 20, 108.
- the hand-held control apparatus is constituted by a smartphone comprising accelerometers, a gyroscope, a magnetometer, a camera and a WiFi communications transceiver.
- smartphones are known and widely available.
- a laser pointer is attached by way of a releasable clip to the outside of the smartphone case.
- the smartphone is configured by software that is operative as described herein to function as described above and below with regards to the hand-held control apparatus of Figure 2.
- the accelerometer 40 provides a signal, A(t), indicating the perceived acceleration of the hand-held control apparatus 12, 30 along three axes which are aligned with the body of the hand-held control apparatus.
- the gyroscope 42 provides a signal G(t) which reflects the rate of rotation of the hand-held control apparatus 12, 30.
- the acceleration value A(t) is aligned according to the orientation O(t), which is obtained by integrating G(t). Hence, the orientation and acceleration are multiplied together to obtain a new acceleration vector, Atrue(t), which is aligned properly with the fundamental coordinate frame, i.e. the coordinate frame when the hand-held control apparatus is initialised.
- the operative location of the hand-held control apparatus is then determined by adding an additional offset, L, which relates the position of the accelerometer 40 to the position of the operative location:
- a coordinate reference frame, [C], is determined. The determination of the calibration points during the initialisation phase is described below.
- the coordinate reference frame is then used to convert the calculated position of the operative location of the hand-held control apparatus, T(t), into coordinates that specify the position of the control point on the display surface 18, S(t).
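The chain described above, from Atrue(t) = O(t)·A(t), through double integration and the offset L giving T(t), to the mapping of T(t) into display coordinates S(t) via [C], can be sketched in pure Python. This is a minimal illustrative sketch: the z-up gravity convention, the rectangular integration rule and the use of 3×3 matrices for O and [C] are assumptions, not specifics from the patent.

```python
G = (0.0, 0.0, 9.81)  # gravity as sensed at rest (z-up convention assumed)

def matvec(M, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return tuple(sum(M[i][j] * v[j] for j in range(3)) for i in range(3))

def step(pos, vel, O, A, dt):
    """One dead-reckoning step: Atrue(t) = O(t)·A(t) minus gravity,
    then integrate twice (rectangular rule) to velocity and position."""
    a_true = tuple(c - g for c, g in zip(matvec(O, A), G))
    vel = tuple(v + a * dt for v, a in zip(vel, a_true))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

def operative_location(pos, O, L):
    """T(t): accelerometer position plus the rotated body-frame offset L."""
    return tuple(p + d for p, d in zip(pos, matvec(O, L)))

def display_coords(C, T):
    """S(t): express T(t) in the calibration reference frame [C]."""
    return matvec(C, T)
```

With the apparatus at rest, the rotated reading cancels gravity and the position estimate stays fixed, which is what the initialisation phase relies on.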
- the display apparatus is configured to translate movement of the hand-held control apparatus in two orthogonal directions in relation to a control apparatus reference point into a change in position of the control point in two orthogonal directions, despite movement of the hand-held control apparatus in a third mutually orthogonal direction.
- unintended movement of the hand-held control apparatus, or indeed deliberate movement of the hand-held control apparatus, e.g. when the user momentarily lifts the hand-held control apparatus out of a virtual plane in which he is making some annotation, is not translated into a change in position of the control point on the display surface.
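Discarding the out-of-plane component can be expressed as a projection of the 3-D movement onto the two axes spanning the virtual plane. A minimal sketch, assuming orthonormal plane axes; the function name is illustrative.

```python
def to_control_delta(delta, plane_x, plane_y):
    """Project a 3-D movement onto the virtual writing plane.

    delta            -- 3-vector of movement since the reference point.
    plane_x, plane_y -- orthonormal 3-vectors spanning the plane.
    The component perpendicular to the plane (e.g. momentarily lifting
    the apparatus out of the plane) contributes nothing to the result.
    """
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    return dot(delta, plane_x), dot(delta, plane_y)
```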
- the software structure of the present invention is represented in block diagram form in Figure 4. Components in common with Figure 3 are designated by like reference numerals and therefore the reader's attention is directed to the description provided above with reference to Figure 3 for identification and description of such common components.
- the primary purpose of Figure 4 is to show the basic software structure employed according to an embodiment of the present invention.
- application software and firmware 110 resident in the hand-held control apparatus provides for higher level functionality.
- the application software and firmware 110 runs on an Operating System (OS) framework 112 hosted by the electronics core 82 of the hand-held control apparatus.
- data is communicated during normal use between the hand-held control apparatus and the PC 106 by way of a wireless channel operative in the 2.4 GHz radio frequency band.
- a wired communications channel can be established between the hand-held control apparatus and the PC 106 by an appropriate cable connecting the USB ports of the hand-held control apparatus and the PC 106.
- Such a wired communications channel is used to provide firmware updates for the hand-held control apparatus, change calibration coefficients for operative components of the hand-held control apparatus and like maintenance and control operations.
- application software 114 provides for higher level functionality, runs via an Application Programming Interface (API) 116 and employs a USB driver 118 to effect communication with the hand-held control apparatus.
- the further software is automatically installed in the PC when the communications device 20 is inserted in a USB port of the PC.
- Operation of the inertial navigation apparatus of the hand-held control apparatus is represented in block diagram form 140 in Figure 5.
- the output from each of the first and second accelerometers 142, 144 is filtered by a filter 146, 148 to remove undesired signals.
- each data packet from the first and second accelerometers is time stamped by way of a real-time clock 150.
- the orientation of the hand-held control apparatus (described further below) is multiplied by the output from each of the first and second accelerometers 142, 144. Each product is then integrated by a respective first integrator 152, 154 and again by a respective second integrator 156, 158 to provide a location in three dimensional space relative to a starting point. The two locations are used either to address common mode errors, by determining a difference between data from the first and second accelerometers, or to increase measurement accuracy, by determining an average of data from the first and second accelerometers.
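The dual-accelerometer combination and the double integration just described can be sketched as follows. This is an illustrative sketch only: a single scalar axis is shown, the rectangular integration rule stands in for the two integrator stages, and the real pipeline filters and time-stamps the data first.

```python
def combine(a1, a2, mode="average"):
    """Combine readings from the two accelerometers: averaging improves
    measurement accuracy; differencing exposes common-mode error."""
    if mode == "average":
        return tuple((p + q) / 2.0 for p, q in zip(a1, a2))
    return tuple(p - q for p, q in zip(a1, a2))

def double_integrate(samples, dt):
    """Twice integrate an acceleration series to a displacement,
    mirroring the first and second integrator stages (rectangular rule)."""
    vel = pos = 0.0
    for a in samples:
        vel += a * dt   # first integration: acceleration -> velocity
        pos += vel * dt  # second integration: velocity -> position
    return pos
```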
- the output from the gyroscope 160 is likewise filtered by a filter 162 to remove undesired signals.
- each data packet from the gyroscope is time stamped by way of the real-time clock 150. Then the rate of rotation measure provided by the gyroscope 160 is integrated by a gyroscope integrator 163 to provide the orientation of the hand-held control apparatus at any one time.
- an output from each of the filters 146, 148 connected to the first and second accelerometers 142, 144 is received directly by the application software 172.
- an output from the filter 162 connected to the gyroscope 160 is received directly by the application software 172.
- where a filter comprises a single first order low pass stage, the filter is operative to perform an integrating function.
- an output from the filter can be sufficient to provide data which is usable by the application software 172 without relying on an integrated signal. Furthermore the outputs from the filters yield further data of use to the application software 172.
- the application software can determine in dependence on the filter output data at least one of: errors, such as gravitational error; and how quickly a sensed property, such as orientation, has changed.
- An output from the magnetometer 165 is received by the application software 172.
- the magnetometer 165 provides data on the heading of the hand-held control apparatus.
- the application software 172 is operative to use the heading data provided by the magnetometer 165 in addition to the orientation data provided by the gyroscope 160 to determine the disposition of the hand-held control apparatus during use with the magnetometer 165 providing data of a supplementary nature to data provided by the gyroscope 160.
- the output from the temperature sensor 164 is used to compensate for the effects of temperature variation on the first and second accelerometers 142, 144 and the gyroscope 160. Differences are determined between: the filtered but un-integrated measurements from the first and second accelerometers 142, 144, 166; and the one time integrated measurements from the first and second accelerometers.
- unintended movements of a predetermined nature include movement caused by shock events, such as dropping the hand-held control apparatus, knocking the hand-held control apparatus against a hard surface, etc.
- Intended movements of a predetermined nature include the like of moving the hand-held control apparatus so as to perform a double tap motion analogous to the double click action performed with a conventional mouse when effecting certain control operations. Operation with regards to the colour camera and proximity detector of the hand-held control apparatus is represented in greater detail in block diagram form in Figure 6.
- the output from the camera 180 is filtered 182 and image data packets are time stamped 184. Thereafter the image is assembled 186 in dependence on the received image data. Lighting conditions, e.g. in respect of the brightness and colour of the ambient light, may change. Hence an image of a predetermined object displayed on the display screen is acquired by the camera 180, processed and then compared with a predetermined image of the same object stored in memory.
- the characteristics of the filter 182 may be adjusted to compensate for a change in lighting conditions.
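The lighting compensation just described, comparing an acquired image of a predetermined object with a stored image of the same object, can be sketched as a brightness-gain estimate used to adjust the filter. The function name and the use of a simple mean-brightness ratio are illustrative assumptions.

```python
def lighting_gain(acquired, reference):
    """Ratio of mean grey level between an acquired image of a known
    displayed object and the stored reference image of the same object;
    filter characteristics can be scaled by this gain to compensate
    for changed ambient brightness. Images are 2-D lists of grey levels."""
    mean = lambda img: sum(sum(row) for row in img) / (len(img) * len(img[0]))
    return mean(acquired) / mean(reference)
```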
- the fish eye lens 48 causes barrel distortion of the image acquired by the camera 180.
- the assembled image is transformed 188 to remove the distorting effect of the fish eye lens.
- the application uses the angle of orientation of the hand-held control apparatus to transform the acquired image and thereby reduce the distortion.
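The removal of barrel distortion can be sketched per point with a first-order radial model: the fish-eye maps an undistorted radius r to approximately r·(1 + k·r²) with k < 0, so each distorted point, normalised to the image centre, is rescaled back outwards. The coefficient k and the first-order approximate inverse are illustrative assumptions; a real system would calibrate the lens model and additionally correct for the viewing angle of the apparatus.

```python
def undistort_point(x, y, k=-0.2):
    """Approximate inverse of first-order radial (barrel) distortion.

    x, y -- point coordinates normalised to the image centre.
    k    -- illustrative distortion coefficient (negative for barrel).
    """
    r2 = x * x + y * y
    scale = 1.0 + k * r2   # < 1 near the edges when k < 0
    return x / scale, y / scale
```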
- Such and further image processing is described below with reference to Figure 7.
- the display apparatus is configured to automatically change a focus of the camera.
- the focus is changed by applying a transformation to an image acquired by the camera to thereby change an apparent focal length of the camera.
- the formation of a transformation and the application of the transformation to an acquired image so as to change an apparent focal length will be well known to the reader of ordinary skill.
- the focus is changed by changing a separation between a focusing lens (not shown in Figure 2) and the camera.
- the changing of the focus is in dependence on at least one of: a distance between the hand-held control apparatus and the display surface as determined by the accelerometers and gyroscope; and a quality of an acquired image as determined by comparing an acquired image of an object with a stored image of the same object.
- Image data is employed according to one of two approaches to adjust the location of the control point as determined by the inertial navigation apparatus to address errors in the inertial navigation apparatus 190.
- The camera acquires first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the application determines the relative locations of the acquired first and second images within an image frame of the camera. Then the application moves the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame.
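The first approach above, in which the control point is nudged by the displacement of the tracked object between two time-spaced camera frames, can be sketched as follows. The scale factor relating image pixels to display units is an assumed parameter of the sketch:

```python
def corrected_control_point(control_point, first_loc, second_loc,
                            pixels_per_unit=1.0):
    # displacement of the object (e.g. a cursor) between the two acquired
    # images, measured in the camera's image frame
    dx = second_loc[0] - first_loc[0]
    dy = second_loc[1] - first_loc[1]
    # move the control point by the corresponding amount on the display,
    # compensating accumulated inertial navigation error
    return (control_point[0] + dx / pixels_per_unit,
            control_point[1] + dy / pixels_per_unit)
```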
- The hand-held control apparatus is held so as to direct the camera towards an object, e.g. a cursor, on the display surface and the camera acquires an image of the object. Then the application moves the control point to the location of the object and establishes a fresh starting point for the inertial navigation apparatus.
- This form of approach is described further below with reference to Figure 7.
- The output from the proximity detector 192 is used by the application to activate the camera of the hand-held control apparatus when it is proximate the display surface. This feature is described further below with reference to Figure 11.
- An assembled image is transformed to address barrel distortion caused by the fish eye lens and by the orientation of the hand-held control apparatus at an angle to the display screen 194.
- The distance between the hand-held control apparatus and the display screen at the time of acquisition of the image by the camera is used to resize the image 194.
- The content of a colour table 196 is employed to correct for improper colour balance and the like.
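The colour table 196 correction can be illustrated as per-channel gains derived by comparing an acquired reference patch with the stored image of the same object. The gain-based form is an assumption made for this sketch:

```python
def colour_gains(acquired_rgb, reference_rgb):
    # one gain per channel mapping the acquired reference patch onto the
    # stored reference; such gains would populate the colour table
    return [r / a if a else 1.0 for a, r in zip(acquired_rgb, reference_rgb)]

def apply_gains(pixel, gains):
    # correct a pixel of a subsequently acquired image, clamped to 8 bits
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))
```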
- The transformed acquired image is compared with what is displayed on the computer screen 198 (which is used to drive the projector) by way of a fast pattern matching process 200 to thereby identify the object. For example, the acquired image may be of a cursor and the fast pattern matching process finds a matching cursor on the computer screen, whereupon the coordinates of the cursor on the computer screen are determined 202. Thereafter the location of the control point is moved to the determined location to thereby address whatever accumulated error there might be in the inertial navigation apparatus.
- The laser pointer and the proximity detector obscure the field of view of the camera. Also, the field of view may be otherwise obscured in part, e.g. by a person's hand present between the projector and the display screen. Hence and where there is obscuration, the success of the pattern matching and the location determination is dependent on there being sufficient useful information remaining in the acquired image. Where there is insufficient useful information remaining in the acquired image, a fresh image is acquired and the process repeated.
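The fast pattern matching process 200 can be illustrated with an exhaustive sum-of-absolute-differences search over the computer screen image. A production implementation would use a faster matcher, and the grey-scale representation here is an assumption of the sketch:

```python
def match_template(screen, template):
    """Return the (x, y) offset of the best match of template within screen.

    screen and template are 2-D lists of grey-scale values. The search
    minimises the sum of absolute differences (SAD) at every offset.
    """
    H, W = len(screen), len(screen[0])
    h, w = len(template), len(template[0])
    best_sad, best_xy = None, None
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            sad = sum(abs(screen[y + j][x + i] - template[j][i])
                      for j in range(h) for i in range(w))
            if best_sad is None or sad < best_sad:
                best_sad, best_xy = sad, (x, y)
    return best_xy
```

The returned coordinates would then serve as the determined cursor location 202 to which the control point is moved.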
- The initialisation process is represented in block diagram form in Figure 8.
- In a first step, all position and colour compensation data is cleared 210 before the inertial navigation apparatus and camera are operated 220.
- The operational location on the hand-held control apparatus is held at a first target on the display surface 230, e.g. at the top left of the display surface.
- Operation by the user of one of the control buttons on the hand-held control apparatus 240 initiates acquisition of a first image of the target by the camera and the determination of a first position reading by the inertial navigation apparatus 250.
- The hand-held control apparatus is then moved such that the operational location on the hand-held control apparatus is held at a second target on the display surface 260, e.g. at the top right of the display surface.
- Operation by the user of one of the control buttons on the hand-held control apparatus 270 initiates acquisition of a second image by the camera and the determination of a second position reading by the inertial navigation apparatus 280.
- The hand-held control apparatus is then moved such that the operational location on the hand-held control apparatus is held at a third target on the display surface 290, e.g. at the centre of the bottom of the display surface.
- Operation by the user of one of the control buttons on the hand-held control apparatus 300 initiates acquisition of a third image by the camera and the determination of a third position reading by the inertial navigation apparatus 310.
- The first to third images of the target and the first to third position readings are stored by the application as three calibration points (C0, C1, C2) for use in control point position determination during use of the hand-held control apparatus. Thereafter, the hand-held control apparatus is brought into use 330.
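The three calibration points (C0, C1, C2) suffice to fix an affine map from position readings to display surface coordinates. The following sketch solves that map directly from the three point pairs; the two-dimensional affine formulation is an assumption made for illustration:

```python
def affine_from_points(src, dst):
    """Build a map taking the three source points onto the three
    destination points, i.e. (x', y') = A @ (x, y) + t."""
    (x0, y0), (x1, y1), (x2, y2) = src
    (u0, v0), (u1, v1), (u2, v2) = dst
    # edge vectors of the source and destination triangles
    ax, ay, bx, by = x1 - x0, y1 - y0, x2 - x0, y2 - y0
    cx, cy, dx, dy = u1 - u0, v1 - v0, u2 - u0, v2 - v0
    det = ax * by - bx * ay  # non-zero for non-collinear calibration points
    # A maps the source edge vectors onto the destination edge vectors
    a11 = (cx * by - dx * ay) / det
    a12 = (dx * ax - cx * bx) / det
    a21 = (cy * by - dy * ay) / det
    a22 = (dy * ax - cy * bx) / det
    tx = u0 - a11 * x0 - a12 * y0
    ty = v0 - a21 * x0 - a22 * y0

    def to_display(x, y):
        return a11 * x + a12 * y + tx, a21 * x + a22 * y + ty
    return to_display
```

Any subsequent position reading can then be converted into a control point location on the display surface.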
- Where the display surface defines a non-rectangular footprint, a calibration point at each of the four corners of the display surface may be required.
- The number of calibration points, i.e. three or four, is selected by the user after the user has determined the extent and nature of the distortion by eye.
- Certain control functions are represented in block diagram form in Figure 9.
- A part of the application 410 regulates the effect of operation of the control buttons and scroll wheel.
- The first (left) button, second (right) button and scroll wheel 412 are configured to initiate the correction of the control point on the display in dependence on operation of the camera 420 and to perform control operations analogous to those performed by operation of the corresponding parts of a conventional mouse 430.
- The third button 432 is used to toggle the overlay mode, which is described further below, on and off 440.
- The fourth button 434 is used to enter and leave a remote mode 450.
- In the remote mode the hand-held control apparatus is usable at a location spaced apart from the display surface.
- When the remote mode is turned off, the hand-held control apparatus is usable near the display surface with the camera being turned on in dependence on operation of the inertial navigation system or the proximity detector.
- A first form of error arises when the hand-held control apparatus is subject to a shock, such as is sustained when the hand-held control apparatus is dropped.
- A second form of error arises when the position management application 500 records an unacceptably large position error. Irrespective of the form of error, the initialisation process 510 described above with reference to Figure 8 is executed thereupon to thereby restore the camera 520 and the inertial navigation apparatus 530 to normal operation.
- Figure 11 shows a side view and a facing view of the display surface 610.
- The proximity detector described above is operative to detect the presence of a surface, such as the display surface, which is capable of reflecting signals from the proximity detector, when the hand-held control apparatus is within a predetermined distance of the display surface.
- The proximity detector is operative to define with data from the inertial navigation apparatus an activation region 620 that extends over and above the display surface 610.
- The application is operative to define a control region 630 lying within the activation region within which the hand-held control apparatus is fully operative.
- The control region consists of the control area, i.e. the part of the display surface within which the control point is to be moved by the hand-held control apparatus, and space above the control area within which the camera is capable of operating.
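The classification of the apparatus into the activation region 620 and control region 630 can be sketched as follows. The threshold distance and the rectangular control area are illustrative assumptions, not values taken from the disclosure:

```python
def region_state(distance_to_surface, x, y, control_area,
                 max_activation_dist=0.15):
    """Classify the position of the hand-held control apparatus.

    control_area is (x0, y0, x1, y1) on the display surface;
    max_activation_dist is the proximity detector's range in metres
    (a hypothetical value).
    """
    if distance_to_surface > max_activation_dist:
        return "inactive"    # outside the activation region: camera stays off
    x0, y0, x1, y1 = control_area
    if x0 <= x <= x1 and y0 <= y <= y1:
        return "control"     # fully operative: control point moves
    return "activation"      # camera on, but outside the control area
```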
- The hand-held control apparatus can be used to annotate an existing image shown on the display surface or to draw or write on an effectively blank image by linking the control point to a conventional drawing application at the application level in the PC.
- The operative location of the hand-held control apparatus is held at the origin of a virtual plane in which the hand-held control apparatus will be used and a switch is actuated by the user to thereby record the location of a control apparatus reference point. Changes in position of the operative location are determined in relation to the control apparatus reference point with the control apparatus reference point being related to the calibration points on the display surface.
- Annotations made or drawings executed with the hand-held control apparatus can be configured at the application level to modify the image data itself.
- Annotations or drawings can have the form of an overlay over an existing image, which can be shown or hidden in dependence on operation of the control buttons of the hand-held control apparatus in a predetermined fashion.
- The hand-held control apparatus can also be used for control operations. For example, movement of the control point by way of the hand-held control apparatus to an icon or structure shown on the display surface allows for operation of the control buttons to perform control operations, such as the opening of a document or the scrolling of an already open document.
- Such applications of the present invention make use of functions and application software of a conventional nature that are either already resident in the PC or can be obtained readily elsewhere. The implementation of such applications is readily within the grasp of the reader of ordinary skill without resorting to any more than ordinary programming skill.
- The hand-held control apparatus may be used in proximity to the display surface such that the camera is capable of acquiring image data from the display screen for use in conjunction with position determinations made by the inertial navigation apparatus.
- The hand-held control apparatus may be used at a location spaced apart from the display surface such that position determinations are made by the inertial navigation apparatus alone.
Abstract
The present invention relates to display apparatus (10) which comprises hand-held control apparatus (12) and a display arrangement comprising a display surface (18). The hand-held control apparatus (12) comprises a chassis and position determining apparatus mounted on the chassis. The position determining apparatus is configured to determine of itself a change in position of the hand-held control apparatus (12) in at least two dimensions. The display apparatus is configured to determine, in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface (18). The display apparatus (10) is also configured to determine, in dependence on operation of the position determining apparatus (12), a changed position of an operative location on the hand-held control apparatus (12) in first and second orthogonal directions relative to the control apparatus reference point after movement of the hand-held control apparatus (12) by hand and despite movement of the operative location in a third direction orthogonal to each of the first and second directions. Furthermore the display apparatus is configured to determine, in dependence on the determined relative position, a position of a control point on the display surface (18) in relation to a display surface reference point on the display surface (18).
Description
Title of Invention: Display apparatus and method therefor Field of the Invention
The present invention relates to display apparatus, which comprises hand-held control apparatus and a display arrangement comprising a display surface, and a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus.
Background Art
Electronic whiteboards are known. An electronic whiteboard typically comprises a projector, a computer, such as a Personal Computer (PC), a display surface and a marker pen like device. The computer is operative to provide image data to the projector, which is operative to project an image based on the image data onto the display surface. The marker pen like device is held by a user and moved over the display surface to perform functions, such as control of what is displayed on the display surface by selecting an appropriate icon displayed as part of the image or annotation of the displayed image.
A recent electronic whiteboard apparatus is described in WO 2010/129102.
According to the electronic whiteboard apparatus of WO 2010/129102 a pen like device emits acoustic signals which are detected by an ultrasonic detector located beside the display surface with time of flight calculations being used to determine the location of the pen like device on the display surface. The Wiimote Whiteboard is another example, in which a Wiimote located beside the display surface is used to determine the location of an infra-red pen on the display surface.
The present inventor has become appreciative of shortcomings in known electronic whiteboard apparatus, such as the apparatus of WO 2010/129102 and the Wiimote Whiteboard.
The present invention has been devised in the light of the appreciation of such shortcomings. It is therefore an object for the present invention to provide a display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface, in which the hand-held control apparatus is operative to determine a position of a control point on the display surface.
It is a further object for the present invention to provide a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus, which is operative to determine a position of a control point on the display surface.
Statement of Invention
According to a first aspect of the present invention there is provided display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface,
the hand-held control apparatus comprising: a chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions,
the display apparatus being configured to determine:
in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface;
in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus in first and second orthogonal directions relative to the control apparatus reference point after movement of the hand-held control apparatus by hand and despite movement of the operative location in a third direction orthogonal to each of the first and second directions; and,
in dependence on the determined relative position of the operative location, a position of a control point on the display surface in relation to a display surface reference point on the display surface.
In use, the position determining apparatus determines of itself a change in position of the hand-held control apparatus in at least two dimensions. The position
determining apparatus determines the change in position of itself. Thus the position determining apparatus may determine the change in position without the assistance of further apparatus which is external to the hand-held apparatus and which is operative to provide an external reference. In other words the position determining apparatus determines the change in position by itself. This is in contrast to the apparatus of WO 2010/129102 and the Wiimote Whiteboard. Each of the apparatus of WO 2010/129102 and the Wiimote Whiteboard has hand-held control apparatus that relies on external apparatus which is operative to provide an external reference. More specifically, the apparatus of WO 2010/129102 relies on external apparatus in the form of an ultrasonic detector located beside the display surface and the Wiimote Whiteboard relies on external apparatus in the form of a Wiimote located beside the display surface. By way of another example hand held apparatus comprising a Global Positioning System (GPS) receiver is operative to determine its position with the assistance of signals transmitted from plural satellites whereas the position determining apparatus of the present invention is operative by itself. By way of further example an optical mouse for a personal computer is operative to determine a change in position of the optical mouse by way of light reflected from a surface, such as a mouse mat. Thus the surface constitutes external apparatus which is
operative to provide a reference for determining a change in position of the mouse. The present invention requires no such reference providing external apparatus.
Considering the present invention further, the determination of the position of the control point on the display surface in dependence on the display surface and control apparatus reference points and position determining apparatus, which is operative to determine of itself a change in position, enables the hand-held control apparatus to be used spaced apart from the display surface. Indeed, the hand-held control apparatus may be employed by a user facing away from the display surface. In contrast, the apparatus of WO 2010/129102 and the Wiimote Whiteboard require the hand-held control apparatus to be used on or proximate the display surface and such that the hand-held control apparatus is within range of the external apparatus.
Furthermore, configuration of the display apparatus according to the present invention to determine the changed position of the operative location on the hand- held control apparatus in first and second orthogonal directions relative to the control apparatus reference point and despite movement of the operative location in a third direction which is orthogonal to each of the first and second directions enables the hand-held control apparatus to be used in free space, i.e. without movement of the hand-held control apparatus being constrained, e.g. by supporting the hand-held control apparatus or the hand or arm holding the hand-held control apparatus on a surface. For example the hand-held control apparatus may be used afar from the display surface in a virtual plane generally parallel to the display surface or even in a virtual plane diverging from the plane of the display surface, such as in a classroom where the display surface is at the front of the classroom and the teacher operates the hand-held control apparatus from the rear of the classroom.
The display apparatus is operative to determine a control apparatus reference point at a location spaced apart from the display surface. The control apparatus reference point is determined in dependence on operation of the hand-held control apparatus, e.g. as a consequence of actuation of a user operable input, such as a push button, on the hand-held control apparatus. The control apparatus reference point may be operative as an internally generated reference for determination of at least one of an extent and a direction of movement of the operative location. The internal reference
may thus take the place of an external reference provided by the prior art examples given above. A display surface reference point on the display surface is used by the display apparatus to determine the position of the control point on the display surface. The display surface reference point may be stored by the display apparatus. The control apparatus reference point is at a location spaced apart from the display surface and the display surface reference point is on the display surface. Therefore the control apparatus reference point and display surface reference point are at different locations. A changed position of the operative location on the handheld control apparatus relative to the control apparatus reference point is
determined. The changed position may comprise a displacement in each of first and second orthogonal directions. Then the position of the control point relative to the display surface reference point is determined in dependence on the changed position of the operative location relative to the control apparatus reference point. The display apparatus may therefore be operative to determine a change in position of the control point in a first two coordinate space and in dependence thereon to determine a position of the control point in a second two coordinate space.
The display arrangement may be operative to display an image of no more than two dimensions on the display surface. Despite the two dimensional form of the image to be displayed, the display surface itself may define a non-planar surface, e.g. the display surface may be slightly curved. Therefore the display apparatus may be operative to determine the position of the control point in a two dimensional display plane. The display apparatus may be configured such that, in use, the display plane and a plane defined by the first and second directions, e.g. a virtual plane, diverge from each other. The control apparatus reference point may lie on the virtual plane. The display arrangement may comprise at least one of: a passive display surface, such as a wall or screen on which images are projected by projection apparatus; and an active display surface, such as forms part of a monitor. The display apparatus may be operative to form a data structure representing the determined relative position of the operative location on the hand-held control apparatus in relation to the display surface reference point. The data structure may comprise further data, such as data relating to plural display surface reference points and successive control point locations.
The hand-held control apparatus may be used to perform one or more functions. For example the hand-held control apparatus may be used to draw or write with the thus formed drawing or writing being displayed on the display surface. Accordingly, the hand-held control apparatus may be hand-held pen apparatus. Alternatively, the hand-held control apparatus may be used to perform control functions, such as scrolling of an image shown on the display surface by selecting and moving a cursor displayed on the display surface.
Alternatively or in addition, the position determining apparatus may be configured to determine of itself a change in position of the hand-held control apparatus in three dimensions. Hence, the display apparatus may be operative to transform movement of the hand-held control apparatus in three dimensions into movement of the control point on the display surface in two dimensions. Determination of a change in position of the hand-held control apparatus in three dimensions allows for movement of the hand-held control apparatus out of a plane, e.g. virtual plane, in which a control action is being effected. Thus the hand-held control apparatus may not need to be maintained in the virtual plane in which the control action is being effected such that it is possible, for example, for the hand-held control apparatus to be lifted from and subsequently returned to the virtual plane in which control is being effected.
Alternatively or in addition, the position determining apparatus may comprise inertial sensing apparatus. The inertial sensing apparatus may be operative to sense motion in dependence on at least one reference internal to the hand-held apparatus. More specifically, the inertial sensing apparatus may comprise an accelerometer. For example, where the position determining apparatus is operative in two
dimensions the inertial sensing apparatus may comprise a two axis accelerometer. Where the position determining apparatus is operative in three dimensions the inertial sensing apparatus may comprise a three axis accelerometer. An output from the accelerometer may be integrated twice with respect to time to thereby determine a relative location, e.g. in terms of displacement in each of two orthogonal directions, of the hand-held control apparatus.
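The double integration of the accelerometer output described above can be sketched per axis as follows. The rectangular integration rule is an illustrative choice; in practice the drift such integration accumulates is what the camera-based corrections described elsewhere address:

```python
def integrate_position(accel_samples, dt):
    # integrate acceleration once with respect to time to obtain velocity,
    # then again to obtain displacement along one axis
    velocity = 0.0
    displacement = 0.0
    for a in accel_samples:
        velocity += a * dt             # first integration
        displacement += velocity * dt  # second integration
    return displacement
```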
Alternatively or in addition, the hand-held control apparatus may comprise orientation determining apparatus, which is configured to determine of itself a change in orientation of the hand-held control apparatus about each of at least two mutually orthogonal axes. More specifically, the orientation determining apparatus may be configured to determine of itself a change in orientation of the hand-held control apparatus about each of three mutually orthogonal axes. For example, the orientation determining apparatus may comprise a gyroscope. Where the hand-held control apparatus comprises a gyroscope, an output from the gyroscope may be integrated with respect to time to determine an orientation of the hand-held control apparatus. Alternatively or in addition the hand-held control apparatus may comprise heading determining apparatus, such as a magnetometer. In addition and where the hand-held control apparatus comprises an accelerometer, an output from the accelerometer may be changed in dependence on the determined orientation before the changed accelerometer output is integrated. For example, the orientation may be multiplied by an output from the accelerometer. Hence, the changed accelerometer output may be aligned properly with a fundamental coordinate frame, i.e. the coordinate frame when the hand-held control apparatus is initialised. During movement of the hand-held control apparatus to change a location of the control point on the display surface, the orientation of the hand-held control apparatus may be changed but without such a change in orientation being intended to effect a change of location of the control point. For example, the hand-held control apparatus might be rotated by the hand of the user about the operative location of the hand-held control apparatus such that the centre of the apparatus where the position determining apparatus is located moves up or down. 
However, such a change in orientation may be detected by the position determining apparatus and a change in location of the control point determined mistakenly. Therefore, the display apparatus may be operative to determine the location of the control point in dependence on operation of the position determining apparatus and the orientation determining apparatus. Hence, the display apparatus may take account of a change of orientation of the hand-held control apparatus in determining a location of the control point.
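The scheme above, in which the gyroscope output is integrated to track orientation and the accelerometer output is rotated into the fundamental coordinate frame before integration, can be sketched in two dimensions. Restricting the sketch to a single yaw axis is an illustrative simplification:

```python
import math

def world_frame_accels(gyro_rates, body_accels, dt):
    # integrate the gyroscope's angular rate to obtain orientation, then
    # rotate each body-frame acceleration into the fundamental frame,
    # i.e. the coordinate frame at initialisation
    theta = 0.0
    rotated = []
    for rate, (ax, ay) in zip(gyro_rates, body_accels):
        theta += rate * dt  # gyroscope output integrated with respect to time
        c, s = math.cos(theta), math.sin(theta)
        rotated.append((c * ax - s * ay, s * ax + c * ay))
    return rotated
```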
The operative location on the hand-held control apparatus may be spaced apart from the position determining apparatus. The orientation of the hand-held control apparatus may be changed without there being any translation of the operative location of the hand-held control apparatus. For example, the hand-held control apparatus may be rotated about the operative location such that the position determining apparatus describes an arc. Such a movement may be detected by the position determining apparatus. Therefore, the position of a control point may be determined in dependence on operation of the position determining apparatus and the orientation determining apparatus and a distance between the position determining apparatus and the operative location on the hand-held control apparatus. Determination of movement of the position determining apparatus along three axes and of orientation of the hand-held control apparatus about three axes may enable a location of the operative location on the hand-held control apparatus to be determined along three axes where the location of the operative location in relation to the position determining apparatus is known.
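Combining the position and orientation determinations with the known distance between the position determining apparatus and the operative location can be sketched as a lever-arm correction, shown here in two dimensions for brevity:

```python
import math

def operative_location(sensor_position, orientation_rad, offset):
    # the operative location (e.g. the tip) sits at a fixed, known offset
    # from the position determining apparatus in the body frame; rotate
    # that offset by the current orientation and add it to the sensed
    # position of the apparatus
    c, s = math.cos(orientation_rad), math.sin(orientation_rad)
    ox, oy = offset
    return (sensor_position[0] + c * ox - s * oy,
            sensor_position[1] + s * ox + c * oy)
```

Rotating the apparatus about the operative location then changes the sensed position while the computed operative location stays put, as the surrounding text requires.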
Alternatively or in addition, the hand-held control apparatus may comprise first and second position determining apparatus, each of the first and second position determining apparatus being operative to determine of itself a change in position of the hand-held control apparatus in at least two dimensions. More specifically, the display apparatus may be operative to determine the changed position of the operative location on the hand-held control apparatus in dependence on at least one of: a difference between outputs from the first and second position determining apparatus; and an average of outputs from the first and second position determining apparatus. Hence, common mode errors may be reduced where a difference is used and greater accuracy may be achieved where an average is used. The display apparatus may be operative to determine a difference between outputs from the first and second position determining apparatus after an integration of each output from the first and second position determining apparatus. Therefore, the display apparatus may be operative to detect at least one predetermined movement, for example an unintentional movement, such as may arise from dropping the hand-held control apparatus. More specifically, the at least one predetermined movement may be determined in dependence on a comparison of a characteristic of movement with
a threshold value, e.g. when the velocity of movement or the acceleration is greater than a value that is representative of a shock event such as is liable to arise when the hand held control apparatus is dropped. The hand-held control apparatus may be used to determine a control apparatus reference point by positioning the operative location on the hand-held control apparatus at a desired location. Therefore, the hand-held control apparatus may further comprise an input device mounted on the chassis, the input device being operative to initiate determination of the control apparatus reference point.
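The first and second position determining apparatus scheme above can be sketched as follows: the average of the two outputs improves accuracy, while their divergence can flag a predetermined movement such as follows a shock. The threshold value here is an illustrative assumption:

```python
def fuse_dual_positions(p1, p2, divergence_threshold=0.05):
    # average the two independently determined positions for accuracy
    average = tuple((a + b) / 2.0 for a, b in zip(p1, p2))
    # a large disagreement between the two apparatus suggests an
    # unintentional movement, e.g. the apparatus being dropped
    divergence = max(abs(a - b) for a, b in zip(p1, p2))
    return average, divergence > divergence_threshold
```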
Alternatively or in addition, the input device may be operative to determine the display surface reference point. More specifically, the hand-held control device may be held such that the operative location on the hand-held control apparatus is at or proximate a desired location on the display surface with the input device being operative thereupon to determine the display surface reference point.
The input device may be user operable. The input device may comprise a user operable actuator which is operated to initiate a determination. For example, the actuator may comprise a switch, such as a push-button switch, or a selector, such as a wheel. Hence, the user may move the hand-held control apparatus such that its operative location is on or proximate the display surface and operate the actuator to thereby initiate determination of the display surface reference point. The display apparatus may be operative to display an object at a predetermined location on the display surface and the hand-held control apparatus may be held at or proximate the object before the input device is operative. Also, the user may locate the hand-held control apparatus at a location spaced apart from the display surface and oriented such that the operative location is directed at the desired location on the display surface or otherwise held at a location in free space in which control movements are to be made and then operate the actuator to thereby initiate determination of the control apparatus reference point.
Alternatively or in addition, the input device may comprise a sensor, e.g. image sensor, which is operative upon reception of a stimulus to initiate at least one of: determination of the display surface reference point; and determination of the control
apparatus reference point. Hence, the input device may operate in dependence on reception of the stimulus either with or without manual operation.
Alternatively or in addition, the hand-held control apparatus may comprise an image sensor mounted on the chassis. The image sensor may be mounted on the chassis such that it is at or proximate the operative location on the hand-held control apparatus. The image sensor may be configured to acquire colour images.
More specifically, the display apparatus may be configured to adjust at least one characteristic of an image acquired by the image sensor. The at least one characteristic may comprise at least one of: luminance; shade; hue; tint; Red, Green and Blue (RGB) values; Hue, Saturation and Lightness (HSL) values; and Hue, Saturation and Value (HSV) values. Hence, a characteristic of a grey scale image may be adjusted if a grey scale image sensor is employed or a characteristic of a colour image may be adjusted if a colour image sensor is employed. Furthermore, the display apparatus may be configured to adjust at least one characteristic of the image by controlling the image sensor. The display apparatus may be operative to analyse an image acquired by the image sensor with the at least one characteristic of a further image being adjusted in dependence thereon. The analysis may comprise comparing at least one characteristic of the acquired image with a predetermined characteristic. For example, the analysis may comprise comparing the acquired image with a stored image having predetermined characteristics. The acquired image may be a representation of a predetermined object which is, for example, displayed on the display surface. Hence, the image sensor and associated processing apparatus may be calibrated for proper representation of acquired colour or grey scale images. Thus the display apparatus may adjust for a change in ambient light level or for the effect of a change in ambient light colour in addition to compensating for improper image acquisition. More specifically, the image sensor may be comprised in the input device. The display surface reference point may be determined in dependence on an image sensed by the image sensor. Where the display surface reference point is being determined, the display apparatus may be operative to display an object at a
predetermined location on the display surface and the hand-held apparatus may be held at or proximate the object such that the object may be sensed by the image sensor. The displayed object may be of predetermined form. Hence, the display apparatus may be operative to recognise the form of the sensed object and to initiate determination of the display surface reference point in dependence thereon and without manual operation. Alternatively and where the input device is user operable, the display surface reference point may be determined in dependence on both manual operation and sensing of the predetermined object. Hence, greater precision and reliability of determination of the display surface reference point may be achieved.
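By way of illustration only, the colour calibration described above — comparing an acquired image of a predetermined displayed object with a stored reference image and adjusting image characteristics accordingly — might be sketched as follows in Python. The per-channel gain model and all names are illustrative assumptions, not part of the specification.

```python
# Illustrative sketch: derive per-channel RGB gains so that an acquired
# image of a known displayed object matches its stored reference.
def channel_means(pixels):
    """Mean R, G, B over a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def calibration_gains(acquired, reference):
    """Per-channel gain mapping acquired channel means onto the reference."""
    a = channel_means(acquired)
    r = channel_means(reference)
    return tuple(r[c] / a[c] for c in range(3))

def apply_gains(pixels, gains):
    """Apply the gains to every pixel, clamped to the 8-bit range."""
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]
```

A more elaborate implementation could adjust hue, saturation or lightness in the same comparative fashion; the principle of comparing against a stored reference is the same.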
Alternatively or in addition, the image sensor may be operative to acquire at least one image and the display apparatus may be operative to change a location of the control point in dependence on the acquired at least one image.
More specifically, the image sensor may be operative to acquire first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the display apparatus may be operative to: determine the relative locations of the acquired first and second images within an image frame of the image sensor; and move the control point in dependence on the determined relative locations. If the control point is unable to follow movement of the hand-held control apparatus on the basis of operation of the position determining apparatus, e.g. where the position determining apparatus is subject to error, the first and second images may be spaced apart from each other within the image frame. The display apparatus may be further operative to move the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame.
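The drift-correction step just described — measuring how far the cursor's image has shifted within the image frame between two time-spaced acquisitions and moving the control point by a corresponding amount — can be sketched as follows in Python. The scale factor mapping sensor-pixel offsets to display-surface coordinates is an illustrative assumption.

```python
# Illustrative sketch: correct control-point drift from the offset
# between two acquired images of the same displayed object (e.g. a
# cursor) within the image sensor's frame.
def drift_correction(first_pos, second_pos, scale):
    """Offset (dx, dy) of the object between the two acquisitions,
    scaled from sensor pixels to display-surface units."""
    dx = (second_pos[0] - first_pos[0]) * scale
    dy = (second_pos[1] - first_pos[1]) * scale
    return dx, dy

def corrected_control_point(control_point, first_pos, second_pos, scale):
    """Move the control point by the amount the images are spaced apart."""
    dx, dy = drift_correction(first_pos, second_pos, scale)
    return control_point[0] + dx, control_point[1] + dy
```

When the position determining apparatus tracks perfectly, the two acquired images coincide and the correction is zero.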
Alternatively or in addition, the display apparatus may be configured such that when the hand-held control apparatus is held so as to direct the image sensor towards an object, e.g. a cursor, at a predetermined location on the display surface, the image sensor may be operative thereupon to acquire an image of the object and the display apparatus may be operative to move the control point in dependence on the
acquired image and the predetermined location. The control point may be moved to the location of the object. Thus, a fresh control apparatus reference point may be determined upon acquisition of the image of the object. If, for example, the position determining apparatus is subject to an error which gives rise to a lack of coincidence between the actual location of the control point on the display surface and the desired location, the hand-held control apparatus may be redirected so as to allow for acquisition of the object by the image sensor and the display apparatus may be operative to correct the position on the display surface of the control point. Alternatively or in addition, the hand-held control apparatus may further comprise a lens arrangement disposed in relation to the image sensor so as to alter a field of view of the image sensor. More specifically, the lens arrangement may be
configured to increase a field of view of the image sensor. The lens arrangement may comprise a fish-eye lens. The fish eye lens may provide a wide angle of view of the display surface and thereby reduce the effect of shadow and variation in light intensity arising from an object, such as the user's hand, obscuring the display surface. The lens arrangement may distort an image of an object acquired with the image sensor. Therefore, the display apparatus may be configured to transform an image acquired by the image sensor so as to reduce if not substantially remove distortion caused by the lens arrangement. For example, a fish eye lens is liable to introduce barrel distortion and the display apparatus may be configured to apply a transformation that reduces such barrel distortion. More specifically, the display apparatus may store a predetermined lens transformation with the predetermined lens transformation being applied to acquired image data. Alternatively or in addition, the display apparatus may be configured to transform an image acquired by the image sensor in dependence upon a distance between the hand held control apparatus and the display screen. For example, the image may be resized in dependence on the distance with such resizing being useful for comparison of the image with another image such as during a position correction procedure.
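As an illustrative sketch of the lens-transformation idea above, a simple first-order radial model can reduce barrel distortion of the kind a fish-eye lens introduces. The model r_u = r_d(1 + k·r_d²) and the coefficient k are assumptions made for illustration and do not form part of the specification.

```python
# Illustrative sketch: first-order radial undistortion. A point at
# distorted position (x, y) is pushed outward from the distortion
# centre (cx, cy) to counteract barrel distortion (k > 0 assumed).
def undistort_point(x, y, k, cx=0.0, cy=0.0):
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy          # squared radius from centre
    scale = 1.0 + k * r2            # r_u = r_d * (1 + k * r_d**2)
    return cx + dx * scale, cy + dy * scale
```

In practice the coefficient would be predetermined for the particular lens arrangement and stored, as the passage above suggests, with the transformation applied to each acquired image before comparison.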
An image acquired by the image sensor when the hand-held control apparatus is held at an angle to the display surface may be distorted. Hence, alternatively or in addition, the display apparatus may be configured to transform an image acquired by
the image sensor in dependence on an orientation of the hand-held control apparatus in relation to the display surface. An orientation of the hand-held control apparatus in relation to the display surface may be determined in dependence on operation of at least one of the position determining apparatus and the orientation determining apparatus.
In a form, the display apparatus may be configured to determine the display surface reference point when the hand-held control apparatus is held at a location on or proximate the display surface. For example, the display surface reference point may be determined in dependence on operation of the input device, e.g. operation of a switch by the user. This form of the invention may be appropriate where the hand-held apparatus is used in the same room as the display surface or in a room nearby.
In another form, the display surface reference point may be determined by means other than the hand-held control apparatus. More specifically, the display surface reference point may be determined by control apparatus forming part of the display apparatus, such as a conventional mouse or the like forming part of or connected to a PC comprised in the display apparatus. Hence, the display surface reference point may be determined without the hand-held control apparatus being moved to the display surface.
Alternatively or in addition, the display apparatus may be configured when the hand-held control apparatus is held at plural spaced apart locations on or proximate the display surface to determine corresponding display surface reference points. Hence, the user may determine the dimensions of the display surface and thereby fit or perhaps even scale movement of the hand-held control apparatus when the hand-held control apparatus is moved with respect to the control apparatus reference point. In a form, the display apparatus may be configured when the hand-held control apparatus is held at three different locations on or proximate the display surface, e.g. at top left, at top right and at bottom centre of the display surface, to determine three spaced apart display surface reference points. This form is appropriate for a display surface of rectangular form. The display surface may be distorted, e.g. on account of a projection device being oriented at more than or less than ninety
degrees to the display surface or on account of distortion caused by optics in the projection device. Hence and in another form, the display apparatus may be configured when the hand-held control apparatus is held at more than three spaced apart locations on or proximate the display surface, e.g. at each of all four corners of the display surface, to determine a corresponding number of display surface reference points. The number of display surface reference points determined may depend on the form of distortion with the user determining the presence of distortion by eye and selecting the number of display surface reference points to be
determined, e.g. three or four, accordingly.
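The three-point calibration just described can be viewed as fitting an affine map from the hand-held apparatus's coordinate frame onto the display surface, since three non-collinear correspondences determine such a map exactly. The following Python sketch is illustrative; its names and the use of Cramer's rule are assumptions, not part of the specification.

```python
# Illustrative sketch: fit an affine map to three reference-point
# correspondences (e.g. top left, top right, bottom centre).
def _solve3(m, v):
    """Solve the 3x3 linear system m @ x = v by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    xs = []
    for col in range(3):
        mc = [row[:] for row in m]
        for r in range(3):
            mc[r][col] = v[r]
        xs.append(det(mc) / d)
    return xs

def affine_from_three_points(src, dst):
    """Return f(x, y) -> display coordinates, fitted so that each
    src point maps exactly onto the corresponding dst point."""
    m = [[sx, sy, 1.0] for sx, sy in src]
    ax, bx, cx = _solve3(m, [dx for dx, _ in dst])
    ay, by, cy = _solve3(m, [dy for _, dy in dst])
    return lambda x, y: (ax * x + bx * y + cx, ay * x + by * y + cy)
```

With four or more correspondences, as in the distorted-surface case above, a projective (perspective) map would be fitted instead of an affine one.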
Alternatively or in addition, the hand-held control apparatus may comprise
communications apparatus mounted on the chassis. The communications apparatus may be configured to provide for communication with the display arrangement.
Communication with the display arrangement may be wireless, e.g. in accordance with Wi-Fi or in the 2.4 GHz radio frequency band.
The display arrangement may comprise a Personal Computer (PC). The display arrangement may comprise a projector operative to project images onto the display surface, e.g. onto a wall. The display arrangement may comprise a display device, e.g. a monitor, which defines the display surface. Where the display arrangement comprises a PC, the PC and projector or display device are in data communication. The PC and projector or display device may be of a form and may cooperate to display images on the display surface in a fashion that is well known to the reader skilled in the art. Thus, the PC may run a known operating system to provide this capability.
The display apparatus may further comprise a communications device which is configured to electrically connect to the display arrangement and to provide for communication with the hand-held control apparatus. For example and where the display arrangement comprises a PC, the communications device may be configured to be received in a port of the PC, such as a USB port. The PC may employ a USB driver, which runs on an operating system, such as Windows, to effect
communication with the hand-held control apparatus. The communications device
may comprise wireless communications apparatus, which is operative to provide for wireless communication with the hand-held control apparatus, e.g. by way of the wireless communications apparatus of the hand-held control apparatus. The communications device may comprise data storage, which is operative to store software instructions for configuring the display arrangement for operation with the hand-held control apparatus. Upon electrical connection of the communications device and the display arrangement, the communications device and the display arrangement may be operative to install the software instructions stored in the communications device in the display arrangement. Installation may be initiated without manual intervention other than that required to effect electrical connection between the communications device and the display arrangement.
Alternatively or in addition, the display apparatus may be operative to identify at least one predetermined movement of the hand-held control apparatus, e.g. in
dependence on operation of at least one of the position determining apparatus and the orientation determining apparatus. Such at least one predetermined movement may be of an unintended nature, e.g. the movement may arise from dropping of the hand-held control apparatus or the striking of the hand-held control apparatus against a surface. The display apparatus may initiate a reset procedure in
dependence on identification of a predetermined movement of an unintended nature. More specifically, the display apparatus may be operative to prompt for at least one of a fresh control apparatus reference point and a display surface reference point to be determined. Alternatively or in addition, such at least one predetermined movement may be of an intended nature, e.g. the movement may be of the form of a double tap of the hand-held control apparatus in a virtual plane in which the hand-held control apparatus is being used. The display apparatus may initiate the performance of a control operation in dependence on identification of a
predetermined movement of an intended nature. More specifically, the display apparatus may initiate a control operation in respect of the control location on the display surface. For example, when the hand-held control apparatus has been used to point at an object on the display surface, such as an icon representing a link to a
document, the predetermined movement may initiate the opening of the document that the icon represents.
Alternatively or in addition, the hand-held control apparatus may comprise proximity detector apparatus, the proximity detector apparatus being configured to determine when the hand-held control apparatus is proximate the display surface.
Furthermore, the proximity detector apparatus may be configured to determine an extent of separation between a location on the hand-held control apparatus and a location on the display surface. The proximity detector apparatus may comprise a light emitter and a light detector disposed in relation to each other and operative such that light emitted by the light emitter is reflected by the display surface and detected by the light detector. For example, the light emitter may be an infra-red Light Emitting Diode (LED) and the light detector may be a photo-diode. The display apparatus may be operative in dependence on operation of the proximity detector apparatus to control operation of the hand-held control apparatus. More specifically, the display apparatus may be operative to provide for determination of the position of the control point on the display surface. Alternatively or in addition, the display apparatus may be operative to supply electrical power or reduce supply of electrical power to parts of the hand-held control apparatus. Therefore, the proximity detector apparatus may be operative to vary the supply of electrical power to the hand-held control apparatus in dependence on an extent of proximity of the hand-held control apparatus to the display surface to thereby conserve electrical power. Conservation of electrical power may be advantageous when the electrical power is provided to the hand-held control apparatus from an electrical battery.
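The proximity-dependent power management described above might be sketched as a simple threshold scheme, as follows in Python. The thresholds and state names are illustrative assumptions only.

```python
# Illustrative sketch: choose a power state from the separation
# reported by the proximity detector apparatus. Thresholds assumed.
def power_state(distance_mm, active_range_mm=50, standby_range_mm=300):
    if distance_mm <= active_range_mm:
        return "full"     # all parts powered, e.g. image sensor active
    if distance_mm <= standby_range_mm:
        return "standby"  # reduced supply to selected parts
    return "sleep"        # conserve the battery
```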
Alternatively or in addition, the display apparatus may be operative to define a control region on the display surface within which the hand-held control apparatus is operative to determine a location of the control point. The display apparatus may be further operative to define an activation region on the display surface within which the hand-held control apparatus is activated, e.g. by the bringing into operation of predetermined parts of the hand-held control apparatus, such as the image sensor. Activation may be in dependence on operation of at least one of: the proximity detector apparatus; and the position determining apparatus. More specifically,
the activation region may be a space within which the proximity detector is operable. The control region may lie within the activation region. More specifically, the activation region may extend around the control region. Hence, movement of the hand-held control apparatus into the activation region activates at least a part of the hand-held control apparatus, e.g. the image sensor, before the hand-held control apparatus is moved into the control region for movement of the control point within the control area. Thus the control area may constitute the part of the display surface within which the control point may be moved. The hand-held control apparatus may be configured for user operation, e.g. by way of a switch, to provide for activation of the hand-held apparatus when used at a location spaced apart from the display surface, e.g. at a distance beyond the reach of the proximity detector apparatus.
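The nesting of the control region within a surrounding activation region can be sketched as a simple classification of the apparatus's position, as follows in Python. The rectangular geometry and margin are illustrative assumptions.

```python
# Illustrative sketch: classify a position against a rectangular
# control region and an activation region extending `margin` units
# beyond it on every side.
def region(point, control_rect, margin):
    """control_rect = (x0, y0, x1, y1); returns which region holds point."""
    x, y = point
    x0, y0, x1, y1 = control_rect
    if x0 <= x <= x1 and y0 <= y <= y1:
        return "control"     # control point may be moved here
    if x0 - margin <= x <= x1 + margin and y0 - margin <= y <= y1 + margin:
        return "activation"  # wake parts such as the image sensor
    return "outside"
```

Entering the activation band before the control region gives the apparatus time to activate, e.g. to bring the image sensor into operation, before control movements begin.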
Alternatively or in addition, the hand-held control apparatus may comprise pointer apparatus, the pointer apparatus being configured to dispose a user discernible mark on the display surface when the hand-held control apparatus is directed towards the display surface. The pointer apparatus may be used to point at locations other than on the display surface. The pointer apparatus may comprise a laser device. Hence, the pointer apparatus may be operative to illuminate a part of the display surface. The display apparatus may be configured to change a focus of the imaging sensor. The focus may be changed by at least one of: processing of an image acquired by the imaging sensor; and changing a configuration of imaging apparatus. Processing of the image acquired by the imaging sensor may comprise applying a
transformation to the acquired image to thereby change an apparent focal length of the imaging sensor. Where the imaging apparatus comprises a focusing lens, changing a configuration of the imaging apparatus may comprise changing a separation between the imaging sensor and the focusing lens. The display apparatus may be operative to change the focus in dependence on at least one of: a distance between the hand-held control apparatus and the display surface, e.g. as determined by the position determining apparatus; and a quality of an image acquired by the image sensor, e.g. by comparing an acquired image of an object with a stored image of the same object. The display apparatus may be configured to
change the focus of the imaging sensor on an automatic basis, i.e. without manual intervention.
The hand-held control apparatus may be elongate in form. More specifically, the hand-held control apparatus may have the form of a marker pen. The operative location on the hand-held control apparatus may be located at or proximate an end of the hand-held control apparatus. At least one of an image sensor, proximity detector apparatus and pointer apparatus may be disposed at or proximate an end of the hand-held control apparatus.
According to a second aspect of the present invention there is provided a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the method comprising:
determining, in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface;
determining, in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus in first and second orthogonal directions relative to the control apparatus reference point after movement of the hand-held control apparatus by hand and despite movement of the operative location in a third direction which is orthogonal to each of the first and second directions; and
determining, in dependence on the determined relative position of the operative location, a position of a control point on the display surface in relation to a display surface reference point on the display surface. Embodiments of the second aspect of the present invention may comprise one or more features of the first aspect of the present invention.
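The method steps above can be sketched in Python as follows: the change in position is taken in the first and second orthogonal (in-plane) directions, the third (towards or away from the surface) is deliberately ignored, and the control point is located relative to the display surface reference point. The coordinate convention and scale factor are illustrative assumptions.

```python
# Illustrative sketch of the second-aspect method: project the hand
# movement onto the two in-plane axes and offset the display surface
# reference point by the (optionally scaled) result.
def control_point(displacement_3d, surface_ref, scale=1.0):
    dx, dy, _dz = displacement_3d  # third-direction movement ignored
    return (surface_ref[0] + scale * dx, surface_ref[1] + scale * dy)
```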
According to a third aspect of the present invention there is provided a computer program comprising program instructions for causing a computer, which is comprised in a display arrangement comprising a display surface, and a hand-held control apparatus according to the first aspect of the present invention to perform the method according to the second aspect of the present invention.
More specifically, the computer program may be one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal. For example, the computer program may be embodied in a communications device which is configured to electrically connect with the computer, e.g. by way of a connector on the computer, such as a USB port.
Alternatively or in addition, the hand-held control apparatus may be a mobile device, such as a smartphone or tablet device. The smartphone may comprise inertial navigation apparatus. The smartphone may further comprise at least one of an orientation determining apparatus and an image sensor.
Further embodiments of the third aspect of the present invention may comprise one or more features of the first aspect of the present invention.
According to a fourth aspect of the present invention there is provided a computer system comprising: a display arrangement comprising a display surface; and hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions; and program instructions for causing the computer system to perform the method according to the second aspect of the present invention. A computer in the form of a general purpose computer, such as a Personal Computer (PC), an embedded microcontroller or a microprocessor may form part of the display arrangement.
More specifically, the program instructions may be at least one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal. Further embodiments of the fourth aspect of the present invention may comprise one or more features of the first aspect of the present invention.
The present inventors have appreciated that the feature of the image sensor is of wider application than hitherto described. Therefore and according to a fifth aspect of the present invention, there is provided display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface,
the hand-held control apparatus comprising: a chassis; an image sensor mounted on the chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the display apparatus being configured to:
determine, in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus after movement of the hand-held control apparatus by hand;
acquire with the image sensor at least one image of an object on the display surface; and
determine a position of a control point on the display surface in dependence on the changed position of the operative location on the hand-held control apparatus and adjust the position of the control point in dependence on the at least one acquired image.
In use, the display apparatus relies principally on the position determining apparatus to determine the position of the control point with the at least one acquired image being used to adjust the position of the control point. Position determining apparatus that is operative of itself, e.g. inertial navigation apparatus, may be liable to error over time. For example, position determinations may be liable to drift over time and thus register a progressively increasing level of error. Adjusting the control point by means of the at least one acquired image addresses such an error. Hence, the
display apparatus may be operative to adjust the control point in dependence on the at least one acquired image periodically.
More specifically, the image sensor may be operative to acquire first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the display apparatus may be operative to: determine the relative locations of the acquired first and second images within an image frame of the image sensor; and move the control point in dependence on the determined relative locations. During such movement of the hand-held control apparatus by hand the hand-held control apparatus may be held with respect to the display surface to allow for acquisition of images from the display surface by the image sensor. If the control point is unable to follow movement of the hand-held control apparatus on the basis of operation of the position determining apparatus, e.g. where the position
determining apparatus is subject to error, the first and second images may be spaced apart from each other within the image frame. The display apparatus may be further operative to move the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame. Alternatively or in addition, the display apparatus may be configured such that when the hand-held control apparatus is held so as to direct the image sensor towards an object, e.g. a cursor, at a predetermined location on the display surface, the image sensor may be operative thereupon to acquire an image of the object and the display apparatus may be operative to move the control point in dependence on the acquired image and the predetermined location. The control point may be moved to the location of the object. Thus, an error in operation of the position determining apparatus may be addressed, e.g. on a periodic basis and as the need arises. If, for example, the position determining apparatus is subject to an error which gives rise to a lack of coincidence between the actual location of the control point on the display surface and the desired location, the hand-held control apparatus may be moved so as to allow for acquisition of an image of the object by the image sensor and the display apparatus may be operative to correct the position of the control point on the display surface.
Further embodiments of the fifth aspect of the present invention may comprise one or more features of any previous aspect of the present invention. According to a sixth aspect of the present invention there is provided a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis, an image sensor mounted on the chassis and position determining apparatus mounted on the chassis, the position
determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the method comprising: determining, in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus after movement of the hand-held control apparatus by hand;
acquiring with the image sensor at least one image of an object on the display surface; and
determining a position of a control point on the display surface in dependence on the changed position of the operative location on the hand-held control apparatus and adjusting the position of the control point in dependence on the at least one acquired image.
Embodiments of the sixth aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
According to a seventh aspect of the present invention there is provided a computer program comprising program instructions for causing a computer, which is comprised in a display arrangement comprising a display surface, and a hand-held control apparatus according to the fifth aspect of the present invention to perform the method according to the sixth aspect of the present invention. Embodiments of the seventh aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
According to an eighth aspect of the present invention there is provided a computer system comprising: a display arrangement comprising a display surface; and hand-held control apparatus comprising a chassis and position determining apparatus and an image sensor mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions and the image sensor being operative to acquire at least one image of an object on the display surface; and program instructions for causing the computer system to perform the method according to the sixth aspect of the present invention.
Embodiments of the eighth aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
According to a further aspect of the present invention there is provided a display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface, the hand-held control apparatus comprising: a chassis; and position determining apparatus mounted on the chassis, the position
determining apparatus being configured to determine a change in position of the hand-held control apparatus in at least two dimensions, the display apparatus being configured: to determine a changed position of the hand-held control apparatus in dependence on operation of the position determining apparatus after movement by hand of the hand-held control apparatus; and to determine a position of a control point on the display surface in dependence on the determined changed position. Embodiments of the further aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
According to a yet further aspect of the present invention there is provided a method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine a change in position of the hand-held control apparatus in at least two dimensions, the method comprising: determining a changed position of the hand-held control apparatus in dependence on operation of the position determining apparatus after movement by hand of the hand-held control apparatus; and determining a position of a control point on the display surface in dependence on the determined changed position.
Embodiments of the yet further aspect of the present invention may comprise one or more features of any previous aspect of the present invention.
Brief Description of Drawings
Further features and advantages of the present invention will become apparent from the following specific description, which is given by way of example only and with reference to the accompanying drawings, in which:
Figure 1 shows display apparatus according to the present invention;
Figure 2 is a drawing in longitudinal section of the hand-held control apparatus of Figure 1;
Figure 3 is a block diagram representation of the main components of the hand-held control apparatus of Figure 2;
Figure 4 is a block diagram representation of the software components of the present invention;
Figure 5 is a block diagram representation of operation of the present invention with regards to the inertial navigation apparatus;
Figure 6 is a block diagram representation of operation of the present invention with regards to the proximity detector and image sensor;
Figure 7 is a block diagram representation of processing of acquired image data;
Figure 8 illustrates the initialisation process for the present invention;
Figure 9 represents certain control functions provided by the present invention;
Figure 10 represents certain error handling procedures; and
Figure 11 shows the activation and control regions of a display surface.
Description of Embodiments
Display apparatus 10 according to the present invention is shown in Figure 1. The display apparatus 10 comprises hand-held control apparatus 12 and a display arrangement. The display arrangement comprises a Personal Computer (PC) 14, a projector 16 and a display surface 18 formed on a wall. The PC 14, projector 16 and display surface 18 are of a form and cooperate to display images on the display surface in a fashion that is well known to the reader skilled in the art. The display apparatus 10 further comprises a communications device 20, which is received in and electrically connects with a USB port on the PC 14. The communications device 20 and the hand-held control apparatus 12 are configured to provide for wireless communication between the communications device and the hand-held control apparatus in the 2.4 GHz radio frequency band. The provision of a 2.4 GHz radio frequency channel between the hand-held control apparatus 12 and the
communications device 20 will be readily within the grasp of the reader skilled in the art without resorting to any more than ordinary design skills.
A drawing in longitudinal section of the hand-held control apparatus of Figure 1 is shown in Figure 2. The hand-held control apparatus 30 is in the form of a cylinder which tapers towards one end such that the hand-held control apparatus is similar in form and size to a conventional marker pen. The hand-held control apparatus 30 has an outer shell 32 formed of a hard plastics material or similar such material, which contains or supports the components of the hand-held control apparatus. The outer shell comprises a section 34 of transparent plastics material. The section 34 of transparent plastics material has a tapered profile and defines an aperture 36 at its distal end. The distal end of the tapered section constitutes the operative location of the hand-held apparatus. The hand-held control apparatus 30 of Figure 2 comprises a chassis 38 with a first three-axis accelerometer 40, a second three-axis
accelerometer (not shown), a gyroscope 42, a magnetometer (not shown) and a colour camera 44 mounted on the chassis. The accelerometers 40 and the gyroscope 42 are MEMS devices of a type that will be readily available to the reader skilled in the art. The magnetometer is a three-axis, digital magnetometer, namely part number MAG3110 from Freescale Semiconductor Inc., 6501 William Cannon Drive West, Austin, Texas 78735, USA. Similarly, the colour camera 44 is of a type
that will be readily available to the reader skilled in the art. The colour camera 44 is located towards the tapered end of the hand-held apparatus. The hand-held control apparatus 30 further comprises a fish-eye lens 48 located between the colour camera 44 and the aperture 36, and a laser pointer 50 and proximity detector 52, which are mounted side-by-side between the fish-eye lens 48 and the aperture 36. The laser pointer and the proximity detector obscure the field of view of the camera; attending to the effect of this obscuration is described below with reference to Figure 7. The laser pointer is of a type which is safe in use and readily available to the reader skilled in the art. The proximity detector 52 comprises an infra-red Light Emitting Diode (LED) and a photo-diode light detector and is operative when light emitted by the LED is reflected by a surface and detected by the photo-diode. The design and construction of an appropriate proximity detector 52 will be readily within the grasp of the reader skilled in the art without resorting to any more than ordinary design skill. In addition, the hand-held control apparatus 30 comprises a USB port 54, which provides for data communications with the electronic circuitry of the hand-held control apparatus 30 to thereby provide for software updates, calibration and the like, a microprocessor 56, a wireless transceiver 58, a battery 60, which provides electrical power for the hand-held control apparatus 30, and control switches and buttons. As described above, the wireless transceiver 58 is configured to provide for wireless communication in the 2.4 GHz radio frequency band. The control switches and buttons comprise an on/off switch 62, a left mouse button 64, a right mouse button 66 and an auxiliary button 68.
A block diagram representation 80 of the main components of the hand-held control apparatus and certain components of the display arrangement of Figures 1 and 2 is shown in Figure 3. In Figure 3 the hand-held control apparatus comprises an electronics core 82, which comprises the microprocessor 56 and associated static and flash memory. The hand-held control apparatus further comprises the gyroscope 42, 84, the accelerometers 40, 86, the magnetometer, a temperature sensor 88, the colour camera 44, 90 and proximity detector 52, 92 with each component being in data communication with the electronics core 82. The temperature sensor 88 measures the temperature of the hand-held control apparatus, with the measurements being used to provide for temperature
compensation. In addition, the hand-held control apparatus comprises the control switches and buttons 94, the USB port 54, 96 and the 2.4 GHz radio frequency transceiver 58, 98, which are each in communication with the electronics core 82. The block representing the control switches and buttons 94 in Figure 3 specifies four buttons instead of the three buttons shown in Figure 2 and a scroll wheel that is absent from Figure 2. This is because control mechanisms other than the control mechanism shown in Figure 2 are envisaged depending on the nature of the application intended for the hand-held control apparatus. For example, a scroll wheel is advantageous where it is intended to effect control operations which involve scrolling a control point up and down menu items shown on the display screen. Also, a fourth button may be required to, for example, switch on and off the laser pointer 50, 100, which is represented in Figure 3. The hand-held control apparatus also comprises a battery 60, 102 and power management circuitry 104, which is operative to provide for routine regulation and control functions. Figure 3 also shows the PC 14, 106 and its 2.4 GHz radio frequency transceiver 20, 108.
According to another embodiment of the present invention, the hand-held control apparatus is constituted by a smartphone comprising accelerometers, a gyroscope, a magnetometer, a camera and a WiFi communications transceiver. Hence, wireless communication is by way of WiFi instead of by way of the 2.4 GHz radio frequency band as described above. Such smartphones are known and widely available. A laser pointer is attached by way of a releasable clip to the outside of the smartphone case. Otherwise, and with the exception of the proximity detector, the smartphone is configured by software to function as described above and below with regards to the hand-held control apparatus of Figure 2.
The determination of a position of a control point on the display surface 18 in dependence on operation of the accelerometers 40 and the gyroscope 42 will now be described before further features of the display apparatus are described. During use, the hand-held control apparatus 12, 30 is moved by hand with each
accelerometer 40 providing a signal, A(t), indicating the perceived acceleration of the hand-held control apparatus 12, 30 along three axes which are aligned with the body
of the hand-held control apparatus. Simultaneously, the gyroscope 42 provides a signal G(t) which reflects the rate of rotation of the hand-held control apparatus 12, 30. These two measurements together with image data from the camera 44 are transmitted to an application running on the computer 14. There, the rate of rotation, G(t), is integrated over time to provide a matrix, [O(t)], which at time t defines the coordinate frame describing the orientation of the hand-held control apparatus.
[O(t)] = ∫G(t).dt
The acceleration value A(t) is aligned according to O(t). Hence, the orientation and acceleration are multiplied together to obtain a new acceleration vector, Atrue(t), which is aligned properly with the fundamental coordinate frame, i.e. the coordinate frame when the hand-held control apparatus is initialised.
Atrue(t) = [O(t)].A(t)
By integrating the true acceleration vector, Atrue(t), twice over time, the position of the hand-held control apparatus, P(t), is determined relative to its starting point.
P(t) = ∫∫Atrue(t).dt²
The operative location of the hand-held control apparatus is then determined by adding an additional offset, L, which relates the position of the accelerometer 40 to the position of the operative location:
T(t) = P(t) + [O(t)].L
Finally, based on calibration points (C0, C1, C2) established during the initialisation phase, a coordinate reference frame, [C], is determined. The determination of the calibration points during the initialisation phase is described below. The coordinate reference frame is then used to convert the calculated position of the operative location of the hand-held control apparatus, T(t), into coordinates that specify the position of the control point on the display surface 18, S(t). Thus:
S(t) = [C].(T(t) - C0)
Taking account of the effect of gravity on the accelerometers will be readily within the grasp of the ordinary design skills of the person skilled in the art. Furthermore, the display apparatus is configured to translate movement of the hand-held control apparatus in two orthogonal directions in relation to a control apparatus reference point into a change in position of the control point in two orthogonal directions and despite movement of the hand-held control apparatus in a third mutually orthogonal direction. Hence, unintended movement of the hand-held control apparatus, or indeed deliberate movement of the hand-held control apparatus, e.g. when the user momentarily lifts the hand-held control apparatus out of a virtual plane in which he is making some annotation, is not translated into a change in position of the control point on the display surface.
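The chain of calculations above, from gyroscope and accelerometer outputs to the operative location T(t), can be sketched in Python. The small-angle orientation update, the sample interval and the lever-arm handling are illustrative assumptions; the patent does not prescribe a particular numerical scheme.

```python
import numpy as np

def integrate_orientation(gyro_rates, dt):
    """Accumulate gyroscope rate-of-rotation samples into the matrix
    [O(t)], starting from the identity (the fundamental coordinate
    frame at initialisation). Uses a first-order small-angle update."""
    O = np.eye(3)
    orientations = []
    for omega in gyro_rates:                 # omega: (wx, wy, wz) rad/s
        wx, wy, wz = np.asarray(omega) * dt
        dO = np.array([[1.0, -wz,  wy],
                       [ wz, 1.0, -wx],
                       [-wy,  wx, 1.0]])
        O = O @ dO
        orientations.append(O.copy())
    return orientations

def track_position(accels, orientations, dt, offset_L):
    """Rotate each body-frame acceleration A(t) into the fundamental
    frame via [O(t)], integrate twice to obtain P(t), then add the
    rotated offset L to obtain the operative location T(t)."""
    v = np.zeros(3)
    p = np.zeros(3)
    path = []
    for A, O in zip(accels, orientations):
        a_true = O @ np.asarray(A)           # Atrue(t) = [O(t)].A(t)
        v = v + a_true * dt                  # first integration: velocity
        p = p + v * dt                       # second integration: position
        path.append(p + O @ offset_L)        # T(t) = P(t) + [O(t)].L
    return path
```

For a stationary orientation and constant acceleration the double integration reproduces the expected quadratic growth of displacement.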
The software structure of the present invention is represented in block diagram form in Figure 4. Components in common with Figure 3 are designated by like reference numerals and therefore the reader's attention is directed to the description provided above with reference to Figure 3 for identification and description of such common components. The primary purpose of Figure 4 is to show the basic software structure employed according to an embodiment of the present invention. As can be seen from Figure 4, application software and firmware 110 resident in the hand-held control apparatus provides for higher level functionality. The application software and firmware 110 runs on an Operating System (OS) framework 112 hosted by the electronics core 82 of the hand-held control apparatus. As described above, data is communicated during normal use between the hand-held control apparatus and the PC 106 by way of a wireless channel operative in the 2.4 GHz radio frequency band. A wired communications channel can be established between the hand-held control apparatus and the PC 106 by an appropriate cable connecting the USB ports of the hand-held control apparatus and the PC 106. Such a wired communications channel is used to provide firmware updates for the hand-held control apparatus, change calibration coefficients for operative components of the hand-held control apparatus and like maintenance and control operations. Within the PC, application software 114 provides for higher level functionality and runs via an Application Programming Interface (API) 116 and employs a USB driver 118 to effect communication with the hand-held control apparatus. The implementation of software and firmware for the
hand-held control apparatus and the PC will be readily within the grasp of the reader skilled in the art without resorting to any more than ordinary programming skill when he has read the description provided above and below of the operation of and various functions performed by embodiments of the present invention. Generic drivers for USB and wireless functionality may already be resident in the PC. Should proprietary or generic drivers be required, these can be downloaded from a vendor website in accordance with well known practice in this regard. Further software to configure the PC for operation with the hand-held control apparatus is stored in the communications device 20. Such further software includes a driver for low level processing of data from the hand-held control apparatus and application software, which is operative to carry out higher level processing tasks. The further software is automatically installed in the PC when the communications device 20 is inserted in a USB port of the PC. Operation of the inertial navigation apparatus of the hand-held control apparatus is represented in block diagram form 140 in Figure 5. As can be seen, the output from each of the first and second accelerometers 142, 144 is filtered by a filter 146, 148 to remove undesired signals. In addition, each data packet from the
accelerometers is time stamped by way of a real-time clock 150. The orientation of the hand-held control apparatus (described further below) is multiplied by the output from each of the first and second accelerometers 142, 144 before being integrated by a respective first integrator 152, 154 and then by a respective second integrator 156, 158 to thereby provide a location in three-dimensional space relative to a starting point. The two accelerometer channels are combined either by determining a difference between their data, which addresses common-mode errors, or by determining an average of their data, which increases the accuracy of measurement. The output from the gyroscope 160 is likewise filtered by a filter 162 to remove undesired signals. Thereafter each data packet from the gyroscope is time stamped by way of the real-time clock 150. Then the rate of rotation measure provided by the gyroscope 160 is integrated by a gyroscope integrator 163 to provide the orientation of the hand-held control apparatus at any one time. As can be seen from Figure 5 an output from each of the filters 146, 148 connected to the first and second accelerometers 142, 144 is received directly by the
application software 172. In addition an output from the filter 162 connected to the gyroscope 160 is received directly by the application software 172. Where a filter comprises a single first order low pass stage the filter is operative to perform an integrating function. Therefore an output from the filter can be sufficient to provide data which is usable by the application software 172 without relying on an integrated signal. Furthermore the outputs from the filters yield further data of use to the application software 172. For example the application software can determine in dependence on the filter output data at least one of: errors, such as gravitational error; and how quickly a sensed property, such as orientation, has changed. An output from the magnetometer 165 is received by the application software 172.
According to this form of the invention the magnetometer 165 provides data on the heading of the hand-held control apparatus. The application software 172 is operative to use the heading data provided by the magnetometer 165 in addition to the orientation data provided by the gyroscope 160 to determine the disposition of the hand-held control apparatus during use with the magnetometer 165 providing data of a supplementary nature to data provided by the gyroscope 160. As described above, the output from the temperature sensor 164 is used to compensate for the effects of temperature variation on the first and second accelerometers 142, 144 and the gyroscope 160. Differences are determined between: the filtered but un-integrated measurements from the first and second accelerometers 142, 144, 166; the one time integrated measurements from the first and second
accelerometers 168; and the twice integrated measurements from the first and second accelerometers 170. These differences are used by the application software 172 to reduce the effects of common mode errors, which are liable to accumulate on account of the integration process, and in the identification of unintended and intended movements of a predetermined nature. Unintended movements of a predetermined nature include movement caused by shock events, such as dropping the hand-held control apparatus, knocking the hand-held control apparatus against a hard surface, etc. Intended movements of a predetermined nature include, for example, moving the hand-held control apparatus so as to perform a double tap motion analogous to the double click action performed with a conventional mouse when effecting certain control operations.
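A minimal sketch of how the two accelerometer channels might be combined. The averaging, differencing and the shock threshold are assumptions made for illustration and are not values taken from the patent:

```python
import numpy as np

def fuse_accelerometers(a1, a2, shock_threshold=8.0):
    """Combine samples from two accelerometers mounted on the same
    chassis. The average increases the accuracy of measurement; the
    difference exposes common-mode and sensor error. A large combined
    magnitude is flagged as a possible shock event, such as the
    apparatus being dropped (threshold is illustrative only)."""
    a1, a2 = np.asarray(a1, float), np.asarray(a2, float)
    average = (a1 + a2) / 2.0        # for increased accuracy
    difference = a1 - a2             # for common-mode error reduction
    shock = bool(np.linalg.norm(average) > shock_threshold)
    return average, difference, shock
```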
Operation with regards to the colour camera and proximity detector of the hand-held control apparatus is represented in greater detail in block diagram form in Figure 6. The output from the camera 180 is filtered 182 and image data packets are time stamped 184. Thereafter the image is assembled 186 in dependence on the received image data. Lighting conditions, e.g. in respect of the brightness and colour of the ambient light, may change. Hence an image of a predetermined object displayed on the display screen is acquired by the camera 180, processed and then compared with a predetermined image of the same object stored in memory.
Depending on the outcome of the comparison, the characteristics of the filter 182 may be adjusted to compensate for a change in lighting conditions. The fish-eye lens 48 causes barrel distortion of the image acquired by the camera 180. Hence the assembled image is transformed 188 to remove the distorting effect of the fish-eye lens. When the hand-held control apparatus is held at an angle to the display screen an image of an object acquired by the camera is liable to be distorted.
Hence, the application uses the angle of orientation of the hand-held control apparatus to transform the acquired image and thereby reduce the distortion. Such and further image processing is described below with reference to Figure 7. The display apparatus is configured to automatically change a focus of the camera.
According to one approach, the focus is changed by applying a transformation to an image acquired by the camera to thereby change an apparent focal length of the camera. The formation of a transformation and the application of the transformation to an acquired image so as to change an apparent focal length will be well known to the reader of ordinary skill. According to another approach, the focus is changed by changing a separation between a focusing lens (not shown in Figure 2) and the camera. The changing of the focus is in dependence on either: a distance between the hand-held control apparatus and the display surface as determined by the accelerometers and gyroscope; or a quality of an acquired image as determined by comparing an acquired image of an object with a stored image of the same object. Image data is employed according to one of two approaches to adjust the location of the control point as determined by the inertial navigation apparatus to address errors in the inertial navigation apparatus 190. According to the first approach, the camera acquires first and second images of an object, e.g. a cursor, on the display surface at time spaced intervals and the application determines the relative locations of the
acquired first and second images within an image frame of the camera. Then the application moves the control point by an amount corresponding to an extent to which the first and second images are spaced apart from each other in the image frame. According to the second approach, when the hand-held control apparatus is held so as to direct the camera towards an object, e.g. a cursor, at a predetermined and thus known or determinable location on the display surface, the camera acquires an image of the object. Then the application moves the control point to the location of the object and establishes a fresh starting point for the inertial navigation apparatus. This form of approach is described further below with reference to Figure 7. The output from the proximity detector 192 is used by the application to activate the camera of the hand-held control apparatus when it is proximate the display surface. This feature is described further below with reference to Figure 11.
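The two correction approaches might be sketched as follows. The coordinate conventions and function names are assumptions made for illustration:

```python
def correct_control_point(control_point, obj_pos_frame1, obj_pos_frame2):
    """First approach: shift the control point by the displacement of
    the same object (e.g. a cursor) between two images acquired at
    time-spaced intervals."""
    dx = obj_pos_frame2[0] - obj_pos_frame1[0]
    dy = obj_pos_frame2[1] - obj_pos_frame1[1]
    return (control_point[0] + dx, control_point[1] + dy)

def reset_control_point(known_object_location):
    """Second approach: snap the control point to an object at a known
    display location, establishing a fresh starting point for the
    inertial navigation apparatus."""
    return known_object_location
```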
The processing of acquired images will now be described further with reference to Figure 7. As described above with reference to Figure 6, an assembled image is transformed to address barrel distortion caused by the fish eye lens and by the orientation of the hand held control apparatus at an angle to the display screen 194. In addition, the distance between the hand held control apparatus and the display screen at the time of acquisition of the image by the camera is used to resize the image 194. Furthermore, the content of a colour table 196, the formation of which is described below with reference to Figure 8, is employed to correct for improper colour balance and the like. The transformed acquired image is compared with what is displayed on the computer screen 198 (which is used to drive the projector) by way of a fast pattern matching process 200 to thereby identify the object
corresponding to the acquired image on the computer screen. For example, the acquired image may be of a cursor and the fast pattern matching process finds a matching cursor on the computer screen whereupon the coordinates of the cursor on the computer screen are determined 202. Thereafter the location of the control point is moved to the determined location to thereby address whatever accumulated error there might be in the inertial navigation apparatus. As mentioned above with reference to Figure 2, the laser pointer and the proximity detector obscure the field of view of the camera. Also, the field of view may be otherwise obscured in part, e.g. by a person's hand present between the projector and the display screen. Hence
and where there is obscuration, the success of the pattern matching and the location determination is dependent on there being sufficient useful information remaining in the acquired image. Where there is insufficient useful information remaining in the acquired image, a fresh image is acquired and the process repeated.
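A brute-force illustration of pattern matching under partial obscuration. A real implementation would use a fast matcher, and the mask representation, the sufficiency threshold and the data layout here are assumptions:

```python
import numpy as np

def match_with_obscuration(image, template, mask, min_visible=0.5):
    """Naive masked template matching: pixels where mask == 0 (e.g.
    behind the laser pointer and proximity detector, or a hand) are
    ignored. If too little of the template is visible, return None so
    that a fresh image can be acquired and the process repeated."""
    th, tw = template.shape
    if mask.sum() / mask.size < min_visible:
        return None                       # insufficient useful information
    H, W = image.shape
    best, best_err = None, None
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            patch = image[y:y + th, x:x + tw]
            err = ((patch - template) ** 2 * mask).sum()
            if best_err is None or err < best_err:
                best, best_err = (x, y), err
    return best
```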
The initialisation process is represented in block diagram form in Figure 8. As a first step, all position and colour compensation data is cleared 210 before the inertial navigation apparatus and camera are operated 220. Then the operational location on the hand-held control apparatus is held at a first target on the display surface 230, e.g. at the top left of the display surface. Operation by the user of one of the control buttons on the hand-held control apparatus 240 initiates acquisition of a first image of the target by the camera and the determination of a first position reading by the inertial navigation apparatus 250. The hand-held control apparatus is then moved such that the operational location on the hand-held control apparatus is held at a second target on the display surface 260, e.g. at the top right of the display surface. Operation by the user of one of the control buttons on the hand-held control apparatus 270 initiates acquisition of a second image by the camera and the determination of a second position reading by the inertial navigation apparatus 280. The hand-held control apparatus is then moved such that the operational location on the hand-held control apparatus is held at a third target on the display surface 290, e.g. at the centre of the bottom of the display surface. Operation by the user of one of the control buttons on the hand-held control apparatus 300 initiates acquisition of a third image by the camera and the determination of a third position reading by the inertial navigation apparatus 310. The first to third images of the target are
compared to a predetermined image of the target to thereby determine a colour table which regulates compensation for colour balance and other such properties of acquired images 320. This approach compensates for changes in ambient lighting, a discrepancy in lighting across the display surface or errors arising from operation of the display arrangement, such as the projector. In addition, the first to third position readings are stored by the application as three calibration points (C0, C1, C2) for use in control point position determination during use of the hand-held control apparatus. Thereafter, the hand-held control apparatus is brought into use 330. Where the display surface defines a non-rectangular footprint, e.g. where a projector is non-orthogonal to the display surface or where optics in the projector cause
distortion, a calibration point at each of the four corners of the display surface may be required. The number of calibration points, i.e. three or four, is selected by the user after the user has determined the extent and nature of the distortion by eye. Certain control functions are represented in block diagram form in Figure 9. A part of the application 410 regulates the effect of operation of the control buttons and scroll wheel. The first (left) button, second (right) button and scroll wheel 412 are configured to initiate the correction of the control point on the display in dependence on operation of the camera 420 and to perform control operations analogous to those performed by operation of the corresponding parts of a conventional mouse 430. The third button 432 is used to toggle the overlay mode, which is described further below, on and off 440. The fourth button 434 is used to enter and leave a remote mode 450. In the remote mode, the hand-held control apparatus is usable at a location spaced apart from the display surface. When the remote mode is turned off, the hand-held control apparatus is usable near the display surface with the camera being turned on in dependence on operation of the inertial navigation system or the proximity detector.
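One way the reference frame [C] might be derived from the three calibration points (C0, C1, C2) recorded during initialisation. The orthogonalisation below is an assumption; the patent states only that [C] is determined from the calibration points:

```python
import numpy as np

def build_reference_frame(c0, c1, c2):
    """Derive [C] from the calibration points: C0 (top left), C1 (top
    right) and C2 (bottom centre) of the display surface. Rows are
    scaled so that display coordinates run 0..1 across and down."""
    c0, c1, c2 = (np.asarray(c, float) for c in (c0, c1, c2))
    x_axis = c1 - c0                              # along the top edge
    d = c2 - c0
    # Remove the component of d along the top edge to get the down axis
    down = d - d.dot(x_axis) / x_axis.dot(x_axis) * x_axis
    return np.vstack([x_axis / x_axis.dot(x_axis),
                      down / down.dot(down)])

def to_screen(C, c0, T):
    """Map an operative location T(t) onto the display surface:
    S(t) = [C].(T(t) - C0)."""
    return C @ (np.asarray(T, float) - np.asarray(c0, float))
```

With these conventions the top-right calibration point maps to (1, 0) and the bottom centre to (0.5, 1).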
Certain error handling procedures are represented in block diagram form in Figure 10. A first form of error arises when the hand-held control apparatus is subject to a shock, such as is sustained when the hand-held control apparatus is dropped. A second form of error arises when the position management application 500 records an unacceptably large position error. Irrespective of the form of error, the
initialisation process 510 described above with reference to Figure 8 is executed thereupon to thereby restore the camera 520 and the inertial navigation apparatus 530 to normal operation.
Figure 11 shows a side view and a facing view of the display surface 610. The proximity detector described above is operative to detect the presence of a surface, such as the display surface, which is capable of reflecting signals from the proximity detector, when the hand-held control apparatus is within a predetermined distance of the display surface. Hence, the proximity detector is operative to define with data from the inertial navigation apparatus an activation region 620 that extends over and
above the display surface 610. When the operative location on the hand-held apparatus enters the activation region 620 operation of the camera is initiated. The application is operative to define a control region 630 lying within the activation region within which the hand-held control apparatus is fully operative. The control region consists of the control area, i.e. the part of the display surface within which the control point is to be moved by the hand-held control apparatus, and space above the control area within which the camera is capable of operating.
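The gating implied by the activation and control regions might be sketched as follows; the distances used are illustrative assumptions:

```python
def camera_should_activate(proximity_detected, distance_to_surface,
                           activation_distance=0.10):
    """Switch the camera on when the proximity detector sees a
    reflecting surface, or when the inertial navigation apparatus
    places the operative location within the activation region
    (10 cm threshold is illustrative only)."""
    return proximity_detected or distance_to_surface <= activation_distance

def in_control_region(point_xy, distance_to_surface, control_area,
                      camera_range=0.05):
    """The control region is the control area on the display surface
    plus the space above it within which the camera can operate."""
    xmin, ymin, xmax, ymax = control_area
    x, y = point_xy
    return (xmin <= x <= xmax and ymin <= y <= ymax
            and distance_to_surface <= camera_range)
```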
In use, the hand-held control apparatus can be used to annotate an existing image shown on the display surface or to draw or write on an effectively blank image by linking the control point to a conventional drawing application at the application level in the PC. Before such use, the operative location of the hand-held control apparatus is held at the origin of a virtual plane in which the hand-held control apparatus will be used and a switch is actuated by the user to thereby record the location of a control apparatus reference point. Changes in position of the operative location are determined in relation to the control apparatus reference point with the control apparatus reference point being related to the calibration points on the display surface. Annotations made or drawings executed with the hand-held control apparatus can be configured at the application level to modify the image data itself. Alternatively, the annotations or drawings can have the form of an overlay over an existing image, which can be shown or hidden in dependence on operation of the control buttons of the hand-held control apparatus in a predetermined fashion. The hand-held control apparatus can also be used for control operations. For example, movement of the control point by way of the hand-held control apparatus to an icon or structure shown on the display surface allows for operation of the control buttons to perform control operations, such as the opening of a document or the scrolling of an already open document. Such applications of the present invention make use of functions and application software of a conventional nature that are either already resident in the PC or can be obtained readily elsewhere. The implementation of such applications is readily within the grasp of the reader of ordinary skill without resorting to any more than ordinary programming skill.
As described above, the hand-held control apparatus may be used in proximity to the display surface such that the camera is capable of acquiring image data from the display screen for use in
conjunction with position determinations made by the inertial navigation apparatus. Alternatively, the hand-held control apparatus may be used at a location spaced apart from the display surface such that position determinations are made by the inertial navigation apparatus alone.
Claims
1. Display apparatus comprising hand-held control apparatus and a display arrangement comprising a display surface,
the hand-held control apparatus comprising: a chassis; and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself (e.g. without the assistance of further apparatus which is external to the hand-held apparatus and which is operative to provide an external reference) a change in position of the hand-held control apparatus in at least two dimensions,
the display apparatus being configured to determine:
in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface;
in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus in first and second orthogonal directions relative to the control apparatus reference point after movement of the hand-held control apparatus by hand and despite movement of the operative location in a third direction orthogonal to each of the first and second directions; and,
in dependence on the determined relative position of the operative location, a position of a control point on the display surface in relation to a display surface reference point on the display surface.
2. Display apparatus according to claim 1, in which the display arrangement is operative to display an image of no more than two dimensions on the display surface.
3. Display apparatus according to claim 1 or 2, in which the display apparatus is operative to determine a displacement of the control point in a first two-coordinate space and in dependence thereon to determine a position of the control point in a second two-coordinate space.
4. Display apparatus according to claim 2 or 3 operative to determine the position of the control point in a two dimensional display plane, the display apparatus being configured such that, in use, the display plane and a virtual plane diverge from each other, the virtual plane being defined by the first and second orthogonal directions and with the control apparatus reference point lying on the virtual plane.
5. Display apparatus according to any preceding claim, in which the position determining apparatus is configured to determine of itself a change in position of the hand-held control apparatus in three dimensions, the display apparatus being operative to transform movement of the hand-held control apparatus in three dimensions into movement of the control point on the display surface in two dimensions.
6. Display apparatus according to any preceding claim, in which the position determining apparatus comprises inertial sensing apparatus.
7. Display apparatus according to claim 6, in which the inertial sensing apparatus comprises an accelerometer and an output from the accelerometer is integrated twice with respect to time to thereby determine a relative location of the hand-held control apparatus.
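Claim 7 describes the standard inertial-navigation step of integrating an accelerometer output twice with respect to time to recover relative displacement. A minimal numerical sketch, using the trapezoidal rule over uniformly spaced samples (the function name and sampling scheme are assumptions, not from the patent):

```python
import numpy as np

def displacement_from_acceleration(accel, dt):
    """Integrate acceleration samples twice (trapezoidal rule) to obtain
    relative displacement along one axis, as claim 7 describes.

    accel: 1-D array of acceleration samples (m/s^2), uniformly spaced.
    dt:    sample interval (s).
    Returns an array of positions relative to the starting point."""
    # First integration: acceleration -> velocity.
    velocity = np.concatenate(
        ([0.0], np.cumsum((accel[1:] + accel[:-1]) / 2) * dt))
    # Second integration: velocity -> position.
    position = np.concatenate(
        ([0.0], np.cumsum((velocity[1:] + velocity[:-1]) / 2) * dt))
    return position
```

In practice accelerometer bias makes the double integral drift quadratically, which is one motivation for the image-based corrections of claims 15 to 22.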
8. Display apparatus according to any preceding claim, in which the hand-held control apparatus comprises orientation determining apparatus, which is configured to determine of itself a change in orientation of the hand-held control apparatus about each of at least two mutually orthogonal axes.
9. Display apparatus according to claim 8 and where the hand-held control apparatus comprises an accelerometer, in which an output from the accelerometer is changed in dependence on the determined orientation before the changed accelerometer output is integrated.
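Claim 9 has the accelerometer output "changed in dependence on the determined orientation" before integration — i.e. the body-frame sample is rotated into a fixed frame using the orientation from the orientation determining apparatus. A hedged sketch; the Z-Y-X Euler convention and function name are illustrative assumptions:

```python
import numpy as np

def world_frame_accel(accel_body, roll, pitch, yaw):
    """Rotate a body-frame accelerometer sample into the world frame using
    the determined orientation, before it is integrated (claim 9).
    Angles in radians; Z-Y-X Euler convention assumed for illustration."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    # Compose rotations and apply to the body-frame sample.
    return Rz @ Ry @ Rx @ accel_body
```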
10. Display apparatus according to any preceding claim and where the display apparatus comprises orientation determining apparatus, in which the operative location on the hand-held control apparatus is spaced apart from the position determining apparatus and the position of the control point is determined in dependence on operation of the position determining apparatus and the orientation determining apparatus and on a distance between the position determining apparatus and the operative location on the hand-held control apparatus.
11. Display apparatus according to any preceding claim, in which the hand-held control apparatus comprises first and second position determining apparatus, each of the first and second position determining apparatus being operative to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the display apparatus being operative to determine the changed position of the operative location on the hand-held control apparatus in dependence on at least one of: a difference between outputs from the first and second position determining apparatus; and an average of outputs from the first and second position determining apparatus.
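The two uses claim 11 names for a pair of independent position estimates — averaging them and taking their difference — can be sketched per axis as below. The consistency-check interpretation of the difference is an assumption for illustration; the claim itself does not say what the difference is used for.

```python
def fused_displacement(d1, d2, tolerance=0.05):
    """Combine two independent single-axis displacement estimates (claim 11).

    Returns (average, consistent): the averaged estimate, and whether the
    two sensors agree to within `tolerance` (an assumed threshold) —
    one plausible use of the claimed difference."""
    diff = abs(d1 - d2)
    avg = (d1 + d2) / 2
    return avg, diff <= tolerance
```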
12. Display apparatus according to any preceding claim, in which the hand-held control apparatus comprises an input device and the hand-held control apparatus is operative to determine the control apparatus reference point in dependence on a user positioning the operative location on the hand-held control apparatus at a desired location and on the input device being operative thereupon to initiate determination of the control apparatus reference point.
13. Display apparatus according to any preceding claim, in which the hand-held control apparatus comprises an input device and the hand-held control apparatus is operative to determine the display surface reference point in dependence on the hand-held control device being held such that the operative location on the hand-held control apparatus is at or proximate a desired location on the display surface with the input device being operative thereupon to determine the display surface reference point.
14. Display apparatus according to claim 12 or 13, in which the input device comprises a sensor which is operative upon reception of a stimulus to initiate at least one of: determination of the display surface reference point; and determination of the control apparatus reference point.
15. Display apparatus according to any preceding claim, in which the hand-held control apparatus comprises an image sensor mounted on the chassis, the display apparatus being operative to analyse an image acquired by the image sensor with the at least one characteristic of a further image being adjusted in dependence thereon, the analysis comprising comparing at least one characteristic of the acquired image with a predetermined characteristic.
16. Display apparatus according to claim 15, in which the analysis comprises comparing the acquired image with a stored image having predetermined
characteristics and the acquired image is a representation of a predetermined object which is displayed on the display surface.
17. Display apparatus according to claim 15 or 16, in which the image sensor is comprised in an input device, the input device being comprised in the hand-held control apparatus, the display surface reference point being determined in
dependence on an image sensed by the image sensor, the display apparatus being operative to display an object of predetermined form at a predetermined location on the display surface and the hand-held apparatus is held at or proximate the object, whereupon the object is sensed by the image sensor, the display apparatus being operative to recognise the form of the sensed object and to initiate determination of the display surface reference point in dependence thereon and without manual operation.
18. Display apparatus according to any one of claims 15 to 17, in which the image sensor is operative to acquire at least one image and the display apparatus is operative to change a location of the control point in dependence on the acquired at least one image.
19. Display apparatus according to claim 18, in which the image sensor is operative to acquire first and second images of an object on the display surface at time spaced intervals and the display apparatus is operative to: determine the relative locations of the acquired first and second images within an image frame of the image sensor; and move the control point in dependence on the determined relative locations.
20. Display apparatus according to claim 18 or 19, in which the display apparatus is configured such that when the hand-held control apparatus is held so as to direct the image sensor towards an object at a predetermined location on the display surface, the image sensor is operative thereupon to acquire an image of the object and the display apparatus is operative to move the control point in dependence on the acquired image and the predetermined location.
21. Display apparatus according to any one of claims 15 to 20, in which the hand-held control apparatus further comprises a lens arrangement disposed in relation to the image sensor so as to increase a field of view of the image sensor and the display apparatus is configured to transform an image acquired by the image sensor so as to reduce distortion caused by the lens arrangement.
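The distortion-reducing transform of claim 21 is not specified in the patent; a common choice for a wide-angle lens is a single-term radial (barrel) model. The following sketch applies an approximate inverse of that model to one normalised image point — the coefficient, the model, and the approximation are all assumptions for illustration:

```python
def undistort_point(xd, yd, k1):
    """Approximately undo single-term radial (barrel) distortion for a
    normalised image point (claim 21 sketch, not the patented method).

    Forward model assumed: xd = xu * (1 + k1 * ru^2). Dividing by the
    factor evaluated at the distorted radius gives a first-order inverse,
    adequate for mild wide-angle distortion."""
    r2 = xd * xd + yd * yd
    factor = 1 + k1 * r2
    return xd / factor, yd / factor
```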
22. Display apparatus according to any one of claims 15 to 21 , in which the display apparatus is configured to transform an image acquired by the image sensor in dependence on an orientation of the hand-held control apparatus in relation to the display surface.
23. A method of controlling a display arrangement comprising a display surface by way of hand-held control apparatus comprising a chassis and position
determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions, the method comprising:
determining, in dependence on operation of the hand-held control apparatus, a control apparatus reference point at a location spaced apart from the display surface;
determining, in dependence on operation of the position determining apparatus, a changed position of an operative location on the hand-held control apparatus in first and second orthogonal directions relative to the control apparatus reference point after movement of the hand-held control apparatus by hand and despite movement of the operative location in a third direction which is orthogonal to each of the first and second directions; and
determining, in dependence on the determined relative position of the operative location, a position of a control point on the display surface in relation to a display surface reference point on the display surface.
24. A computer program comprising program instructions for causing a computer, which is comprised in a display arrangement comprising a display surface, and a hand-held control apparatus according to any one of claims 1 to 22 to perform the method according to claim 23.
25. The computer program according to claim 24, the computer program being one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal.
26. A computer system comprising: a display arrangement comprising a display surface; and hand-held control apparatus comprising a chassis and position determining apparatus mounted on the chassis, the position determining apparatus being configured to determine of itself a change in position of the hand-held control apparatus in at least two dimensions; and program instructions for causing the computer system to perform the method according to claim 23.
27. The computer system according to claim 26, in which the program instructions are at least one of: embodied on a record medium; embodied in a read only memory; stored in a computer memory; and carried on an electrical carrier signal.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161443299P | 2011-02-16 | 2011-02-16 | |
US61/443,299 | 2011-02-16 | ||
GB201102680A GB201102680D0 (en) | 2011-02-16 | 2011-02-16 | Display apparatus and method thereof |
GB1102680.4 | 2011-02-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012110809A1 true WO2012110809A1 (en) | 2012-08-23 |
Family
ID=43859484
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2012/050337 WO2012110809A1 (en) | 2011-02-16 | 2012-02-15 | Display apparatus and method therefor |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB201102680D0 (en) |
WO (1) | WO2012110809A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2521107A (en) * | 2013-09-12 | 2015-06-17 | Cosneta Ltd | Display apparatus |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050242947A1 (en) * | 2004-04-29 | 2005-11-03 | Tracetech Incorporated | Tracking system and methods thereof |
US20050253806A1 (en) * | 2004-04-30 | 2005-11-17 | Hillcrest Communications, Inc. | Free space pointing devices and methods |
US20060033711A1 (en) * | 2004-08-10 | 2006-02-16 | Microsoft Corporation | Direct navigation of two-dimensional control using a three-dimensional pointing device |
US20100123660A1 (en) * | 2008-11-14 | 2010-05-20 | Kyu-Cheol Park | Method and device for inputting a user's instructions based on movement sensing |
WO2010129102A2 (en) | 2009-04-28 | 2010-11-11 | Luidia Inc. | Digital transcription system utilizing small aperture acoustical sensors |
2011
- 2011-02-16: GB GB201102680A patent GB201102680D0 (not_active, Ceased)
2012
- 2012-02-15: WO PCT/GB2012/050337 patent WO2012110809A1 (active Application Filing)
Also Published As
Publication number | Publication date |
---|---|
GB201102680D0 (en) | 2011-03-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12708576 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12708576 Country of ref document: EP Kind code of ref document: A1 |