WO2012063561A1 - Information notification system, information notification method, information processing apparatus, and control method and control program therefor
- Publication number
- WO2012063561A1 (PCT/JP2011/071802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- hand
- information
- information processing
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/113—Recognition of static hand signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/10—Recognition assisted with metadata
Definitions
- the present invention relates to a technique for informing advertisement information to an unspecified person.
- Patent Document 1 discloses a technique for determining the degree of attention to a display screen based on the attention time obtained from an image captured by a camera and the distance from the screen, and for notifying information corresponding to the person who is paying attention.
- However, the technique described in the above-mentioned patent document presupposes that attention has already been paid to the notification information and merely increases that degree of attention further; it cannot create an initial trigger that draws attention to the notification information.
- An object of the present invention is to provide a technique for solving the above-described problems.
- According to one aspect, the apparatus of the present invention is an apparatus for notifying information to an unspecified person, comprising: first display control means for displaying a screen including an attraction image for attracting a hand movement; recognition means for recognizing a hand movement of a photographed person; and specifying means for specifying, among the photographed persons, a person whose hand movement recognized by the recognition means corresponds to the hand movement to be attracted by the attraction image.
- According to another aspect, the method of the present invention is a method of controlling an information processing apparatus for notifying information to an unspecified person, comprising: a first display control step of displaying a screen including an attraction image for attracting a hand movement; a recognition step of recognizing a hand movement of a photographed person; and a specifying step of specifying, among the photographed persons, a person whose hand movement recognized in the recognition step corresponds to the hand movement to be attracted by the attraction image.
- According to still another aspect, the storage medium of the present invention stores a control program of an information processing apparatus for notifying information to an unspecified person, the control program causing a computer to execute: a first display control step of displaying a screen including an attraction image for attracting a hand movement; a recognition step of recognizing a hand movement of a photographed person; and a specifying step of specifying, among the photographed persons, a person whose hand movement recognized in the recognition step corresponds to the hand movement to be attracted by the attraction image.
- According to still another aspect, the system of the present invention is an information notification system for notifying information to an unspecified person, comprising: display means for displaying a screen including advertisement information; first display control means for causing the display means to display a screen including an attraction image for attracting a hand movement; recognition means for recognizing a hand movement of a photographed person; specifying means for specifying, among the photographed persons, a person whose hand movement recognized by the recognition means corresponds to the hand movement to be attracted by the attraction image; and second display control means for causing the display means to display a screen including advertisement information directed to the person specified by the specifying means.
- According to still another aspect, the method of the present invention is an information notification method for notifying information to an unspecified person, comprising: a first display control step of causing display means for displaying a screen including notification information to display a screen including an attraction image for attracting a hand movement; a recognition step of recognizing a hand movement of a photographed person; a specifying step of specifying, among the photographed persons, a person whose hand movement recognized in the recognition step corresponds to the hand movement to be attracted by the attraction image; and a second display control step of causing the display means to display a screen including advertisement information directed to the person specified in the specifying step.
- the information processing apparatus 100 is an apparatus for notifying an unspecified person 104 of information.
- the information processing apparatus 100 includes a first display control unit 101, a recognition unit 102, and a specifying unit 103.
- the first display control unit 101 displays a screen including an attraction image for attracting a hand motion.
- the recognizing unit 102 recognizes the movement of the hand by the photographed person 104.
- The specifying unit 103 specifies, among the photographed persons, a person 105 whose hand movement recognized by the recognition unit 102 corresponds to the hand movement to be attracted by the attraction image.
- According to this embodiment, an information processing apparatus can be provided that focuses on a person who responds to an attraction image for attracting a hand gesture (hereinafter also referred to as a target person), for example by focusing a camera or video camera on that person, and that can interact with the target person by hand gesture.
- FIG. 2 is a block diagram illustrating a configuration of an information notification system 200 including the information processing apparatus 210 according to the second embodiment.
- an independent information processing apparatus 210 is illustrated, but the present invention can be extended to a system in which a plurality of information processing apparatuses 210 are connected via a network.
- the database is abbreviated as DB.
- the 2 includes an information processing device 210, a stereo camera 230, a display device 240, and a speaker 250.
- the stereo camera 230 can capture an unspecified person 104 and send the captured image to the information processing apparatus 210 and can be controlled by the information processing apparatus 210 to focus on the subject.
- the display device 240 notifies advertisement information such as advertisements or advertisement messages from the information processing device 210 according to the notification program.
- Within the advertisement or advertisement message, or prior to it, a screen including an attraction image for inviting a response by hand gesture from the unspecified person 104 is displayed.
- When a person 105 responds, a screen is displayed that enables interaction with the responding person 105 by hand gesture.
- The speaker 250 outputs auxiliary sound that encourages the dialogue by hand gesture between the screen of the display device 240 and the responding person 105.
- The information processing apparatus 210 includes the following components. Note that the information processing apparatus 210 does not have to be a single apparatus, as long as the functional components of FIG. 2 are realized. Hereinafter, each functional component will be described in accordance with the operation procedure of the present embodiment.
- the input / output interface 211 serves as an interface between the stereo camera 230, the display device 240, the speaker 250, and the information processing device 210.
- a predetermined notification program or initial program is controlled by the notification program control unit 217, and image data and audio data are transmitted from the output control unit 221 to the display device 240 and the speaker 250 via the input / output interface 211.
- The output control unit 221 functions as a display control unit that causes the display device 240 to display an attraction image for attracting a hand gesture or advertisement information, and further functions as an audio output control unit that causes the speaker 250 to output sound corresponding to the attraction image or the advertisement image.
- The notification program includes content for inviting a hand gesture from the unspecified person 104 (for example, an attraction image showing an action of waving a hand, an action calling for participation in janken (rock-paper-scissors), sign language, or the like).
- The notification program control unit 217 selects a notification program from the notification program DB 216.
- the attracting hand gesture acquisition unit 218 that has obtained the initial program or the content of the notification program from the notification program control unit 217 acquires an attraction hand gesture to be requested of the unspecified person 104 from the program.
- the attraction hand gesture acquisition unit 218 includes a table 222 that stores a correspondence relationship between an attraction image included in the notification program and a hand gesture to be attracted by the attraction image.
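The correspondence held by table 222 can be pictured as a simple lookup from an attraction image to the hand gesture it is meant to invite. The following is a minimal sketch; the image identifiers and gesture names are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of table 222: each attraction image contained in a
# notification program is mapped to the hand gesture it is meant to attract.
ATTRACTION_TABLE = {
    "wave_invitation": "waving hand",
    "janken_invitation": "janken",        # rock-paper-scissors participation
    "sign_language_invitation": "sign language",
}

def attracting_gesture_for(image_id):
    """Return the hand gesture a given attraction image is meant to invite,
    or None when the image is unknown."""
    return ATTRACTION_TABLE.get(image_id)
```

The attracting hand gesture acquisition unit 218 would query such a table with the image contained in the current notification program.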
- The image of the unspecified person 104 captured by the stereo camera 230 is sent to the image recording unit 212 via the input / output interface 211, and an image history long enough for a hand gesture to be determined is recorded.
- The hand detection unit 213 detects hand images from the image of the unspecified person 104 captured by the stereo camera 230. A hand image is detected from, for example, its color, shape, and position. Note that, in cooperation with the determination by the hand gesture determination unit 214 described next, the detection criteria can be changed, for example by not selecting on color, so as to cover cases such as a gloved hand.
- The hand gesture determination unit 214 determines the hand gesture of each hand from the characteristics of the hand images detected by the hand detection unit 213 in the image of the unspecified person 104, with reference to the hand gesture DB 215 (see FIG. 4).
- In the hand gesture DB 215, hand positions, finger positions, or their time-series changes are stored in association with hand gestures (see FIG. 5).
- the attraction hand gesture obtained by the attraction hand gesture acquisition unit 218 and the hand gesture of each hand determined by the hand gesture determination unit 214 are compared by the hand gesture comparison unit 219. As a result of the comparison, if they match within a predetermined range, the hand gesture comparison unit 219 outputs a signal indicating that.
- The comparison performed by the hand gesture comparison unit 219 differs depending on the hand gesture. For example, the positions of the fingers are not considered for a waving hand, and in the case of janken, a match is assumed if the hand shows any one of rock, scissors, and paper. Further, in the case of sign language, a match is assumed if the call and reply are within a certain range.
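The per-gesture matching rules above can be sketched as follows. This is a hedged illustration of the comparison logic of the hand gesture comparison unit 219; the accepted sign-language replies are placeholder values.

```python
def gestures_correspond(attracting, detected):
    """Hypothetical sketch of the hand gesture comparison unit 219.

    The matching rule depends on the attracting gesture: for "waving hand"
    finger positions are ignored; for janken any of rock/scissors/paper
    counts as a response; for sign language the reply must fall within an
    accepted set of answers (illustrative here)."""
    if attracting == "waving hand":
        return detected == "waving hand"    # finger positions not considered
    if attracting == "janken":
        return detected in {"rock", "scissors", "paper"}
    if attracting == "sign language":
        accepted_replies = {"hello", "yes", "no"}  # assumed reply range
        return detected in accepted_replies
    return False
```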
- the signal output from the hand gesture comparison unit 219 is input to the hand gesture determination unit 214 and the camera control unit 220.
- When the hand gesture determination unit 214 receives a signal indicating that the attracting hand gesture and the photographed hand gesture correspond, it transmits information about the person (target person) having the currently determined hand to the notification program control unit 217.
- The notification program control unit 217 displays a screen including notification information indicating that the hand gesture of the target person has been received. Specifically, it responds by displaying text or an image indicating that the hand gesture has been received, or by displaying an image of the target person on the screen.
- The camera control unit 220 operates the stereo camera 230 so as to focus on the target person, which specializes the hand gesture determination to the target person and makes the interactive communication with the target person smooth in subsequent notification programs.
- The processing and operation of the hand gesture comparison unit 219 in FIG. 2 and its connections to the other functional components are examples; any configuration that responds when the attracting hand gesture and the photographed hand gesture correspond to each other can be used.
- the configuration is not limited to that shown in FIG.
- FIG. 3 is a block diagram illustrating a hardware configuration of the information processing apparatus 210 according to the second embodiment.
- data and programs used in the third embodiment are indicated by broken lines.
- a CPU 310 is a processor for calculation control, and implements each functional component of FIG. 2B by executing a program.
- The ROM 320 stores fixed data and programs, such as initial data and an initial program.
- the communication control unit 330 communicates with an external device via a network.
- The communication control unit 330 may download a notification program from various servers, and signals may be transmitted to and received from the stereo camera 230 or the display device 240 via a network. Communication may be wireless or wired.
- the input / output interface 211 interfaces with the stereo camera 230, the display device 240, and the like.
- the RAM 340 is a random access memory used by the CPU 310 as a temporary storage work area.
- the RAM 340 has an area for storing data necessary for realizing the present embodiment and an area for storing a notification program.
- Reference numeral 341 in FIG. 3 denotes display screen data displayed on the display device 240.
- Reference numeral 342 denotes image data captured by the stereo camera 230.
- Reference numeral 343 denotes hand data detected from image data captured by the stereo camera 230.
- Reference numeral 344 denotes a hand gesture determined from data of each photographed hand.
- Reference numeral 345 denotes an attraction hand gesture that is attracted by an attraction image included in the screen displayed on the display device 240.
- Reference numeral 346 denotes a target person discrimination table for discriminating the person having a hand as the target person when the photographed hand gesture and the attracting hand gesture correspond to each other (see FIG. 6).
- Reference numeral 347 denotes camera control data for performing camera control such as focusing the stereo camera 230 on the determined subject.
- Reference numeral 348 denotes a notification program selection table, used in the third embodiment, for selecting a notification program based on the attributes of the target person determined from the image.
- Reference numeral 349 denotes the notification program currently executed by the information processing apparatus 210. Note that programs stored in the storage 350 are also loaded into the RAM 340 and executed by the CPU 310 to realize the functions of the functional components shown in FIG. 2.
- the storage 350 stores the following data or programs necessary for realizing the present embodiment.
- the following database is stored.
- Reference numeral 352 denotes a person recognition DB that stores features of a person's image in association with attributes (for example, gender and age) (see FIG. 9).
- Reference numeral 216 denotes a notification program DB, used particularly in the third embodiment, that stores a plurality of notification programs selected according to the attributes of the target person or the environment such as the day of the week or the time zone (see FIG. 10).
- 354 is a main information processing program executed by the information processing apparatus 210 (see FIGS. 7 and 12).
- Reference numeral 355 denotes a target person determination module included in the information processing program 354 for performing target person determination.
- Reference numeral 356 denotes a notification program execution module that controls execution of the notification program included in the information processing program 354 (see FIG. 8).
- Reference numeral 357 denotes a notification program selection module, included in the information processing program 354 and executed in the third embodiment, that selects a notification program in accordance with the attributes of the target person.
- FIG. 3 shows only the data and programs essential to the present embodiment; general-purpose data and programs such as an OS are not shown.
- FIG. 4 is a diagram illustrating a configuration of the captured hand data 343 according to the second embodiment.
- FIG. 4 shows an example of the hand data 343 necessary for determining “waving hand” or “janken” as a hand gesture. Note that “sign language” and the like can also be determined by extracting the hand data necessary for that determination.
- Reference numeral 411 denotes a hand ID assigned to each hand of the photographed unspecified persons.
- Reference numeral 412 indicates the position of the hand, here the height.
- Reference numeral 413 denotes a movement history; in FIG. 4, “one direction”, “reciprocating motion”, “stationary (intermittent)”, and the like are extracted.
- Reference numeral 414 denotes a movement distance, and reference numeral 415 denotes a movement speed. For example, the movement distance and the movement speed are used to determine whether the gesture is “waving a hand” or “calling a person”.
- Reference numeral 416 denotes a face direction, which is used to determine whether or not attention is paid.
- Reference numeral 417 denotes a person ID for identifying the person having this hand, and reference numeral 418 denotes the person position where the person with this person ID is located.
- The focus position of the stereo camera 230 is determined based on the person position. Alternatively, in the case of three-dimensional display, the direction of the display screen toward this person position may be determined, and the content and directivity of the sound from the speaker 250 may be adjusted. Note that finger position data is not included in the data for determining the “waving hand” gesture, but finger positions may be added.
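One row of the hand data of FIG. 4 (upper section) can be sketched as a simple record. The field names and example values below are illustrative assumptions mirroring reference numerals 411 to 418; they are not data from the patent.

```python
from dataclasses import dataclass

@dataclass
class WavingHandData:
    """Hypothetical sketch of one row of hand data 343 (FIG. 4, upper
    section 410); fields mirror reference numerals 411-418."""
    hand_id: int              # 411: ID assigned to each photographed hand
    hand_height: float        # 412: hand position (height)
    movement_history: str     # 413: e.g. "reciprocating motion"
    movement_distance: float  # 414
    movement_speed: float     # 415
    face_direction: str       # 416: used to judge whether attention is paid
    person_id: int            # 417: person having this hand
    person_position: tuple    # 418: used e.g. to focus the stereo camera

# Illustrative row, roughly matching the "waving hand" example in the text.
row = WavingHandData(2, 1.5, "reciprocating motion", 0.3, 0.8,
                     "toward screen", 10, (2.0, 1.0))
```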
- Reference numeral 421 denotes a hand ID assigned to each hand of the photographed unspecified persons.
- Reference numeral 422 denotes the position of the hand, here the height.
- Reference numeral 423 indicates a three-dimensional position of the thumb.
- Reference numeral 424 denotes a three-dimensional position of the index finger.
- Reference numeral 425 indicates a three-dimensional position of the middle finger.
- Reference numeral 426 denotes a three-dimensional position of the little finger.
- Reference numeral 427 denotes a person ID for identifying the person having the hand.
- the person position indicating where the person with the person ID is located is extracted.
- Although the position of the ring finger is excluded from the example of FIG. 4, it may be included.
- More accurate determination can be made by using for the determination not only the fingers but also palm and back-of-hand data, and more specifically the joint positions of the fingers.
- FIG. 5 is a diagram illustrating a configuration of the hand gesture DB 215 according to the second embodiment.
- FIG. 5 shows, corresponding to FIG. 4, the contents of the DB for determining “waving hand” in the upper stage 510 and the contents of the DB for determining “janken” in the lower stage 520. A DB for “sign language” is also provided separately.
- 511 of the upper stage 510 stores a range of “hand height” determined as each gesture.
- a movement history is stored.
- the range of the movement distance is stored.
- a range of moving speed is stored.
- the face direction is stored.
- 516 stores “hand gesture” for data (see FIG. 4) that satisfies the conditions of 511 to 515. For example, if the condition of the first row is satisfied, it is determined as a “waving hand” gesture. If the condition of the second row is satisfied, it is determined as a “call a person” gesture. If the condition of the third row is satisfied, it is determined as a “run” gesture.
- FIG. 5 shows an example of determining a gesture that is confusing with a “waving hand” gesture.
- The type of hand data to be extracted and the structure of the hand gesture DB 215 may be added to or changed depending on what kind of data is effective for determining the “waving hand” gesture as accurately as possible.
- Column 521 of the lower stage 520 stores the range of “hand height” determined as each gesture. Since the lower stage 520 is for “janken” discrimination, the range of “height” is the same for every row, and a hand outside this height range is not regarded as “janken”.
- Columns 522, 523, 524, and 525 store the thumb, index finger, middle finger, and little finger positions, respectively. Note that the finger positions 522 to 525 are stored as relative positions of the fingers rather than absolute positions, and are compared with the finger position data of FIG. 4; specific numerical values are omitted in FIG. 5.
- The positional relationship of the fingers in the first row is determined as “rock”, that in the second row as “scissors”, and that in the third row as “paper”.
- A “sign language” DB includes a time-series history, similarly to the determination of “janken”.
- A hand whose data matches the data in the hand gesture DB 215, or falls within a certain range of it, is determined to show the corresponding hand gesture.
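The range-based lookup in the upper stage 510 can be sketched as follows: each row stores ranges for height, movement distance, and movement speed together with the required movement history, and the first row whose conditions are all satisfied yields the gesture. All numeric ranges below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the hand gesture DB 215 (upper stage 510 of FIG. 5).
GESTURE_DB = [
    {"height": (1.0, 2.0), "history": "reciprocating motion",
     "distance": (0.1, 0.5), "speed": (0.5, 2.0), "gesture": "waving hand"},
    {"height": (1.0, 2.0), "history": "reciprocating motion",
     "distance": (0.1, 0.5), "speed": (0.1, 0.5), "gesture": "call a person"},
]

def in_range(value, bounds):
    lo, hi = bounds
    return lo <= value <= hi

def determine_gesture(height, history, distance, speed):
    """Return the gesture of the first DB row whose conditions (511-515)
    are all satisfied, or None when no row matches."""
    for db_row in GESTURE_DB:
        if (in_range(height, db_row["height"])
                and history == db_row["history"]
                and in_range(distance, db_row["distance"])
                and in_range(speed, db_row["speed"])):
            return db_row["gesture"]
    return None
```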
- The hand gesture comparison unit 219 determines whether the “hand gesture” determined from the photographed hand corresponds to the “attracting hand gesture” from the display screen of the display device 240.
- FIG. 6 is a diagram showing a configuration of the subject discrimination table 346 according to the second embodiment.
- Assume that the hand ID (0002) / person ID (0010) in the second row of the upper section 410 of FIG. 4 has been determined to show a “waving hand” gesture, and the hand ID (0003) / person ID (0005) in the third row of the lower section 420 of FIG. 4 has been determined to show a “scissors” gesture.
- Reference numeral 601 in FIG. 6 is a person ID.
- Reference numeral 602 denotes a gesture determined from the hand of the photographed image.
- Reference numeral 603 denotes an “attracting gesture” from the screen displayed on the display device 240.
- Reference numeral 604 denotes the determination result: “target person” when the “attracting gesture” corresponds to the “photographed gesture”, and “non-target person” when it does not.
- When the “attracting gesture” is “waving hand”, the person ID (0010) performing the “waving hand” gesture is determined as a “target person”. However, if the “attracting gesture” is “sign language”, that person is determined as a “non-target person”. Further, when the “attracting gesture” is “janken”, the person ID (0005) performing the “scissors” gesture is determined as a “target person” responding to the display screen; again, if the “attracting gesture” is “sign language”, that person is determined as a “non-target person”. Even when the “attracting gesture” and the “photographed gesture” do not correspond, a person performing a “waving hand” gesture toward the screen may be preferentially selected as a “target person”.
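Filling one row of the discrimination table 346 can be sketched as a small function; the janken handling follows the comparison rule described earlier, and the names are illustrative.

```python
def discriminate(person_id, photographed_gesture, attracting_gesture):
    """Hypothetical sketch of producing one row of the target person
    discrimination table 346 (FIG. 6): a person is a "target person" when
    the photographed gesture corresponds to the attracting gesture."""
    janken_hands = {"rock", "scissors", "paper"}
    if attracting_gesture == "janken":
        matched = photographed_gesture in janken_hands
    else:
        matched = photographed_gesture == attracting_gesture
    return {"person_id": person_id,                     # column 601
            "photographed_gesture": photographed_gesture,  # column 602
            "attracting_gesture": attracting_gesture,      # column 603
            "result": "target person" if matched else "non-target person"}  # 604
```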
- FIG. 7 is a flowchart illustrating an operation procedure of the information processing apparatus according to the second embodiment. Such a flowchart is executed by the CPU 310 in FIG. 3 using the RAM 340 to realize the functions of the respective functional components in FIG.
- In step S701, an attraction image for attracting a hand gesture of an unspecified person is displayed on the display device 240.
- In step S703, an image is acquired from the stereo camera 230.
- In step S705, a hand is detected from the acquired image, and the “hand gesture” of that hand is determined.
- In step S707, it is determined whether the “detected gesture” corresponds to the “attracting gesture”. If not, the process proceeds to step S709, where it is determined whether the “hand gestures” of all hands in the acquired image have been detected and determined. If the detection of “hand gestures” for all hands has not been completed, the process returns to step S705 to determine the next hand. If the detection has been completed for all hands, the process returns to step S703, a new image is acquired from the stereo camera 230, and the detection of “hand gestures” is repeated.
- In step S711, the person having the hand with the “detected gesture” is determined as the “target person”. That is, the person whose hand movement corresponding to the hand gesture to be attracted by the attraction image is detected first becomes the target person.
- In step S713, the camera control unit 220 is used to focus the stereo camera 230 on the “target person”.
- Step S715 is an optional process: to make the contact with the “target person” closer, the “target person” is informed that his or her response to the attraction by the screen has been detected. For example, the notification may be made by text display or voice, or by displaying, on a part of the screen, an image of the “target person” focused and photographed by the stereo camera 230.
- In step S717, execution of the notification program is started. This process is shown in more detail in FIG. 8.
- In step S719, it is determined whether the notification process is to be ended. If not, the process returns to step S701 and the process is repeated.
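The loop of steps S701 through S713 can be sketched as follows. This is a simplified, hypothetical rendering: the camera frames are stood in for by lists of (person ID, gesture) pairs, and the comparison is reduced to simple equality.

```python
class Display:
    """Stand-in for the display device 240; only shows the attraction image."""
    def show_attraction_image(self):
        print("attraction image displayed")

def corresponds(attracting, detected):
    # Simplified stand-in for the hand gesture comparison unit 219.
    return attracting == detected

def notification_loop(frames, display, attracting_gesture):
    """Hypothetical sketch of steps S701-S711 of FIG. 7: display the
    attraction image, scan the gesture of every hand in each captured
    frame, and return the ID of the first person whose gesture corresponds
    to the attracting gesture (the "target person")."""
    display.show_attraction_image()                       # S701
    for frame in frames:                                  # S703: acquire image
        for person_id, gesture in frame:                  # S705: per-hand gesture
            if corresponds(attracting_gesture, gesture):  # S707
                return person_id                          # S711: target person
    return None                                           # no response yet

# Illustrative frames: nobody responds in the first, person 10 waves in the second.
frames = [[(3, "standing")], [(10, "waving hand"), (5, "running")]]
target = notification_loop(frames, Display(), "waving hand")
```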
- FIG. 8 is a flowchart showing the operation procedure of the notification program execution process S717. Note that, beyond attracting a hand gesture from the display screen, the notification program asks the “target person” to change the display screen or to select an option on the display screen.
- Even during the execution of the notification program, an image is acquired from the stereo camera 230 in step S801. In this case, since the “target person” has been focused in step S713 of FIG. 7, an enlarged image of the surrounding area including the “target person” is acquired.
- In step S803, the hand of the “target person” is extracted and its “hand gesture” is detected.
- In step S805, the instruction of the “target person” indicated by the detected “hand gesture” is determined, and the notification program is advanced in response to that instruction.
- In step S807, the process returns to step S801 and is repeated until the notification program is terminated.
- In the third embodiment, the attributes (for example, gender and age) of the person determined as the “target person” by the hand gesture are determined based on the image from the stereo camera 230, and a notification program corresponding to those attributes is selected and executed.
- The notification program may also be selected according to the environment, such as the day of the week or the time zone. According to the present embodiment, the “target person” can continue to be attracted to the notification program.
- A person recognition DB 352, a notification program DB 216, and a notification program selection table 348, which are indicated by broken lines in FIG. 3, are added as data.
- a notification program selection module 357 is added to a part of the information processing program 354 as a program.
- FIG. 9 is a diagram illustrating a configuration of the person recognition DB 352 according to the third embodiment.
- FIG. 10 is a diagram showing a configuration of the notification program DB 216 according to the third embodiment.
- The notification program DB 216 of FIG. 10 stores a notification program ID 1001 that identifies each notification program and serves as a read key.
- Each of the notification program A (1010) and the notification program B (1020) can be read out by its notification program ID, “001” or “002” in FIG. 10.
- the notification program A is a “cosmetic advertisement” program
- the notification program B is a “condominium advertisement” program.
- a notification program corresponding to the attribute of the “subject” recognized using the person recognition DB 352 is selected from the notification program DB 216 and executed.
- FIG. 11 is a diagram showing the configuration of the notification program selection table 348 according to the third embodiment.
- Reference numeral 1101 in FIG. 11 denotes the person ID of a person who has become the “target person” by the hand gesture.
- Reference numeral 1102 denotes the “sex” of the “subject” recognized by the person recognition DB 352.
- 1103 is the “age” of the “subject”.
- The notification program ID 1104 is determined in association with these attributes of the “target person”.
- Depending on the attributes of the “target person”, the cosmetic advertisement notification program A or the condominium advertisement notification program B in FIG. 10 is selected and executed.
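The attribute-to-program mapping of the selection table 348 can be sketched as below. The attribute values, age ranges, and which attributes map to which program are illustrative assumptions; the patent only specifies that sex and age select a program ID.

```python
# Hypothetical sketch of the notification program selection table 348
# (FIG. 11): recognized attributes of the "target person" map to a
# notification program ID in the notification program DB of FIG. 10.
SELECTION_TABLE = [
    {"sex": "female", "age_range": (15, 40), "program_id": "001"},  # program A
    {"sex": "male",   "age_range": (30, 60), "program_id": "002"},  # program B
]

def select_program(sex, age):
    """Return the program ID of the first matching table row, falling back
    to a default program when no row matches."""
    for table_row in SELECTION_TABLE:
        lo, hi = table_row["age_range"]
        if table_row["sex"] == sex and lo <= age <= hi:
            return table_row["program_id"]
    return "001"  # assumed default program
```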
- the selection of the notification program is an example and is not limited to this.
- FIG. 12 is a flowchart illustrating an operation procedure of the information processing apparatus according to the third embodiment.
- the flowchart of FIG. 12 is obtained by adding steps S1201 and S1203 to the flowchart of FIG. 7.
- The subsequent steps are the same as in FIG. 7; therefore, only these two added steps are described here.
- In step S1201, the attributes of the “target person” are recognized with reference to the person recognition DB 352 shown in FIG. 9.
- In step S1203, a notification program is selected from the notification program DB 216 in accordance with the notification program selection table 348 shown in FIG. 11.
- In the second and third embodiments described above, the processing was described as being performed by a single information processing apparatus.
- In the fourth embodiment, a configuration will be described in which a plurality of information processing apparatuses are connected to a notification information server via a network and execute a notification program downloaded from the notification information server.
- The information processing apparatus of this embodiment may have functions equivalent to those of the information processing apparatuses of the second and third embodiments, or may transfer a part of its functions to the notification information server.
- Even with such a function distribution, the processing in the fourth embodiment is basically the same as in the second and third embodiments. Therefore, the configuration of the information notification system will be described, and detailed description of the functions will be omitted.
- FIG. 13 is a block diagram showing a configuration of an information notification system 1300 according to the fourth embodiment.
- the same reference numerals as those in FIG. 2 denote components that perform the same function. The differences will be described below.
- FIG. 13 shows three information processing apparatuses 1310, although there is no limit to their number. These information processing apparatuses 1310 are connected to the notification information server 1320 via the network 1330.
- The notification information server 1320 stores a notification program 1321 for download, receives information on each site photographed by the stereo camera 230, and selects the notification program to be downloaded. For example, integrated control is possible, such as displaying coordinated hand gesture invitation images across a plurality of display devices 240.
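The server's per-site selection can be pictured as a rule over the reported camera-derived data. A minimal sketch, assuming a hypothetical report format and program names not given in the publication:

```python
# Hypothetical sketch of the notification information server 1320
# choosing which notification program each connected apparatus
# should download. Report fields and program names are invented.

def choose_download(site_report: dict) -> str:
    """Pick a program name from a simple rule over the reported
    crowd size at the site (a stand-in for camera-derived data)."""
    if site_report.get("crowd_size", 0) >= 10:
        return "group_invitation_program"
    return "individual_invitation_program"

# One report per apparatus; the server resolves each to a download.
reports = [
    {"site": "station", "crowd_size": 25},
    {"site": "mall", "crowd_size": 3},
]
downloads = {r["site"]: choose_download(r) for r in reports}
```

Centralizing this rule on the server, rather than in each apparatus, is what makes the unified advertisement management described above possible.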
- The information processing apparatus 1310 is illustrated as having characteristic components such as the hand gesture determination unit 214, the hand gesture DB 215, the notification program DB 216, the notification program control unit 217, and the camera control unit 220.
- the present invention is not limited to this, and some functions of the information processing device 1310 may be distributed to the notification information server 1320 or other devices.
- The present invention may be applied to a system composed of a plurality of devices, or may be applied to a single device. Furthermore, the present invention can also be applied to a case where a control program that realizes the functions of the embodiments is supplied directly or remotely to a system or apparatus. Therefore, in order to realize the functions of the present invention on a computer, a control program installed in the computer, a medium storing the control program, and a WWW (World Wide Web) server from which the control program is downloaded are also included in the scope of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
An information processing apparatus for notifying information to an unspecified person, characterized by comprising:
first display control means for displaying a screen including an invitation image for inviting a hand action;
recognition means for recognizing a hand action by a photographed person; and
specifying means for specifying, among the photographed persons, a person whose hand action recognized by the recognition means corresponds to the hand action to be invited by the invitation image.
A control method of an information processing apparatus for notifying information to an unspecified person, characterized by comprising:
a first display control step of displaying a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person; and
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image.
A storage medium storing a control program of an information processing apparatus for notifying information to an unspecified person, characterized by storing a control program that causes a computer to execute:
a first display control step of displaying a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person; and
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image.
An information notification system for notifying information to an unspecified person, characterized by comprising:
display means for displaying a screen including advertisement information;
first display control means for causing the display means to display a screen including an invitation image for inviting a hand action;
recognition means for recognizing a hand action by a photographed person;
specifying means for specifying, among the photographed persons, a person whose hand action recognized by the recognition means corresponds to the hand action to be invited by the invitation image; and
second display control means for causing the display means to display a screen including advertisement information directed to the person specified by the specifying means.
An information notification method for notifying information to an unspecified person, characterized by including:
a first display control step of causing display means, which displays a screen including notification information, to display a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person;
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image; and
a second display control step of causing the display means to display a screen including advertisement information directed to the person specified in the specifying step.
An information processing apparatus 100 according to the first embodiment of the present invention will be described with reference to FIG. 1. The information processing apparatus 100 is an apparatus for notifying information to an unspecified person 104.
In the second embodiment, an information processing apparatus is described that can also determine depth by means of a stereo camera, so that hand actions (hereinafter referred to as hand gestures) can be detected easily and more accurately. Also described is an information processing apparatus that focuses a camera or video camera on a person (hereinafter also referred to as a "target person") who has reacted to the invitation image, enabling dialogue with the target person through hand gestures.
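The depth determination by the stereo camera follows standard stereo geometry rather than anything specific to this publication. As a rough sketch: with focal length f (in pixels), baseline B between the two cameras, and disparity d between the two images, the depth of a point is Z = f·B/d. The numeric values below are illustrative assumptions.

```python
def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Classic pinhole stereo relation: Z = f * B / d.
    Parameter values here are illustrative, not from the publication."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A hand closer to the camera produces a larger disparity, hence a
# smaller depth value.
near = stereo_depth(700.0, 0.1, 70.0)  # 1.0 m
far = stereo_depth(700.0, 0.1, 35.0)   # 2.0 m
```

Distinguishing a near hand from a far background this way is what lets the apparatus tell an intentional gesture apart from incidental motion behind the target person.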
FIG. 2 is a block diagram showing the configuration of an information notification system 200 including the information processing apparatus 210 according to the second embodiment. Although FIG. 2 shows an independent information processing apparatus 210, the configuration can be extended to a system in which a plurality of information processing apparatuses 210 are connected via a network. Hereinafter, "database" is abbreviated as DB.
The information processing apparatus 210 includes the following components. Note that the information processing apparatus 210 need not be a single apparatus; its functions may be distributed over a plurality of apparatuses as long as the functions of FIG. 2 are realized as a whole. Each functional component is described below in accordance with the operation procedure of this embodiment.
FIG. 3 is a block diagram showing the hardware configuration of the information processing apparatus 210 according to the second embodiment. In FIG. 3, the data and programs used in the third embodiment are shown with broken lines.
The configurations of characteristic data used by the information processing apparatus 210 of the second embodiment are shown below.
FIG. 4 is a diagram showing the configuration of the captured hand data 343 according to the second embodiment.
FIG. 5 is a diagram showing the configuration of the hand gesture DB 215 according to the second embodiment. Corresponding to FIG. 4, the upper part 510 of FIG. 5 shows the DB contents for determining "waving a hand", and the lower part 520 shows the DB contents for determining "rock-paper-scissors". A separate DB is also provided for "sign language".
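How captured hand data is matched against such DB entries is not spelled out in this excerpt. A minimal sketch, under the assumption that each entry stores a template sequence of hand positions and that matching is nearest-template by mean distance; the templates and threshold are invented for illustration:

```python
# Hypothetical sketch of matching captured hand data against gesture
# templates. The hand gesture DB 215 may store quite different features.

def mean_distance(track, template):
    """Average absolute difference between two equal-length
    sequences of 1-D hand positions."""
    return sum(abs(a - b) for a, b in zip(track, template)) / len(template)

TEMPLATES = {
    "wave": [0.0, 1.0, 0.0, -1.0, 0.0, 1.0],  # side-to-side motion
    "still": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],  # no motion
}

def classify(track, threshold=0.5):
    """Return the closest template's label, or None when no
    template is close enough to count as a recognized gesture."""
    label, dist = min(
        ((name, mean_distance(track, tpl)) for name, tpl in TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return label if dist <= threshold else None
```

The rejection threshold matters in this setting: passers-by who make no deliberate gesture should produce no match at all rather than a spurious "closest" label.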
FIG. 6 is a diagram showing the configuration of the target person determination table 346 according to the second embodiment.
FIG. 7 is a flowchart showing the operation procedure of the information processing apparatus according to the second embodiment. The CPU 310 of FIG. 3 executes this flowchart while using the RAM 340, thereby realizing the functions of the functional components of FIG. 2.
FIG. 8 is a flowchart showing the operation procedure of the notification program execution process S717. In the notification program, rather than inviting hand gestures from the display screen, the "target person" is asked to change the display screen or to select from options on the display screen.
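One way to read this: during program execution, a recognized gesture selects among displayed choices rather than merely attracting attention. A hypothetical mapping, with gesture names and screen actions invented for illustration (the publication does not define this vocabulary):

```python
# Hypothetical sketch of the interaction inside process S717: mapping a
# recognized gesture from the target person to an action on the screen.
GESTURE_ACTIONS = {
    "swipe_left": "next_page",
    "swipe_right": "previous_page",
    "point": "select_highlighted_item",
}

def handle_gesture(gesture: str) -> str:
    """Translate the target person's gesture into a screen action;
    unknown gestures leave the screen unchanged."""
    return GESTURE_ACTIONS.get(gesture, "no_op")
```

Treating unknown gestures as no-ops keeps the displayed advertisement stable while the target person's hand is merely moving between deliberate gestures.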
In the third embodiment, in addition to the second embodiment, attributes (for example, gender and age) of the person determined to be the "target person" by a hand gesture are judged on the basis of the images from the stereo camera 230, and a notification program corresponding to those attributes is selected and executed. Note that not only the attributes of the "target person" but also clothing, behavioral tendencies, or whether the person belongs to a group may be judged, and a notification program may be selected accordingly. According to this embodiment, the "target person" can be kept attracted to the notification program.
In the information processing apparatus 210 of the third embodiment, the person recognition DB 352, the notification program DB 216, and the notification program selection table 348, shown by broken lines in FIG. 3, are added as data. In addition, a notification program selection module 357 is added as a program to part of the information processing program 354.
FIG. 9 is a diagram showing the configuration of the person recognition DB 352 according to the third embodiment.
FIG. 10 is a diagram showing the configuration of the notification program DB 216 according to the third embodiment.
FIG. 11 is a diagram showing the configuration of the notification program selection table 348 according to the third embodiment.
FIG. 12 is a flowchart showing the operation procedure of the information processing apparatus according to the third embodiment. The flowchart of FIG. 12 is obtained by adding steps S1201 and S1203 to the flowchart of FIG. 7; since the remaining steps are the same, only these two steps are described here.
In the second and third embodiments above, the processing was described as being performed by a single information processing apparatus. In the fourth embodiment, a configuration is described in which a plurality of information processing apparatuses are connected to a notification information server via a network and execute a notification program downloaded from the notification information server. According to this embodiment, the apparatuses can exchange information with each other, and advertisement and publicity can be managed centrally by concentrating information in the notification information server. Note that the information processing apparatus of this embodiment may have functions equivalent to those of the information processing apparatuses of the second and third embodiments, or part of those functions may be transferred to the notification information server. Furthermore, by downloading from the notification information server not only the notification program but also the operation program of the information processing apparatus according to the situation, a hand-gesture-based control method appropriate to the installation site can be realized.
FIG. 13 is a block diagram showing the configuration of an information notification system 1300 according to the fourth embodiment. In FIG. 13, the same reference numerals as in FIG. 2 denote components that perform the same functions. The differences are described below.
Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention. In addition, although the embodiments of the present invention have been described in detail, systems or apparatuses that combine the separate features included in the respective embodiments in any manner are also included in the scope of the present invention.
Claims (16)
- An information processing apparatus for notifying information to an unspecified person, characterized by comprising:
first display control means for displaying a screen including an invitation image for inviting a hand action;
recognition means for recognizing a hand action by a photographed person; and
specifying means for specifying, among the photographed persons, a person whose hand action recognized by the recognition means corresponds to the hand action to be invited by the invitation image.
- The information processing apparatus according to claim 1, wherein the specifying means includes storage means for storing the invitation image and the hand action to be invited by the invitation image in association with each other, and
the specifying means refers to the storage means to determine the correspondence between the invitation image and the recognized hand action.
- The information processing apparatus according to claim 1 or 2, further comprising second display control means for displaying a screen including advertisement information directed to the person specified by the specifying means.
- The information processing apparatus according to claim 3, wherein, after the second display control means starts displaying the screen including the advertisement information directed to the person, the recognition means recognizes hand actions with the person as a target person, and
the second display control means displays the screen including the advertisement information in response to hand actions by the target person.
- The information processing apparatus according to claim 3 or 4, wherein, before displaying the screen including the advertisement information directed to the person, the second display control means notifies the person that the person has returned a hand action corresponding to the hand action invited by the image displayed by the first display control means.
- The information processing apparatus according to any one of claims 3 to 5, wherein the second display control means displays a screen including at least one of a photographed image of the person specified by the specifying means and an image of the person's hand.
- The information processing apparatus according to any one of claims 1 to 6, wherein the hand action includes a finger action.
- The information processing apparatus according to any one of claims 1 to 7, wherein the recognition means recognizes the hand action by the photographed person based on two images photographed by a stereo camera.
- The information processing apparatus according to any one of claims 1 to 8, wherein the invitation image includes an image representing a hand-waving action or sign language.
- The information processing apparatus according to any one of claims 1 to 8, wherein the invitation image includes an image representing a hand action of playing rock-paper-scissors.
- The information processing apparatus according to any one of claims 1 to 10, wherein, when the recognition means recognizes hand actions by a plurality of persons, the specifying means specifies the person for whom a hand action corresponding to the hand action to be invited by the invitation image was detected first.
- The information processing apparatus according to any one of claims 1 to 11, further comprising audio output control means for outputting audio corresponding to the image displayed by the first display control means.
- A control method of an information processing apparatus for notifying information to an unspecified person, characterized by comprising:
a first display control step of displaying a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person; and
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image.
- A storage medium storing a control program of an information processing apparatus for notifying information to an unspecified person, the storage medium storing a control program that causes a computer to execute:
a first display control step of displaying a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person; and
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image.
- An information notification system for notifying information to an unspecified person, characterized by comprising:
display means for displaying a screen including advertisement information;
first display control means for causing the display means to display a screen including an invitation image for inviting a hand action;
recognition means for recognizing a hand action by a photographed person;
specifying means for specifying, among the photographed persons, a person whose hand action recognized by the recognition means corresponds to the hand action to be invited by the invitation image; and
second display control means for causing the display means to display a screen including advertisement information directed to the person specified by the specifying means.
- An information notification method for notifying information to an unspecified person, characterized by including:
a first display control step of causing display means, which displays a screen including notification information, to display a screen including an invitation image for inviting a hand action;
a recognition step of recognizing a hand action by a photographed person;
a specifying step of specifying, among the photographed persons, a person whose hand action recognized in the recognition step corresponds to the hand action to be invited by the invitation image; and
a second display control step of causing the display means to display a screen including advertisement information directed to the person specified in the specifying step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/823,517 US20130229342A1 (en) | 2010-11-10 | 2011-09-26 | Information providing system, information providing method, information processing apparatus, method of controlling the same, and control program |
CN2011800543464A CN103221968A (zh) | 2010-11-10 | 2011-09-26 | 信息通知系统、信息通知方法、信息处理设备及其控制方法和控制程序 |
JP2012542845A JP5605725B2 (ja) | 2010-11-10 | 2011-09-26 | 情報報知システム、情報報知方法、情報処理装置及びその制御方法と制御プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010251678 | 2010-11-10 | ||
JP2010-251678 | 2010-11-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012063561A1 true WO2012063561A1 (ja) | 2012-05-18 |
Family
ID=46050716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/071802 WO2012063561A1 (ja) | 2010-11-10 | 2011-09-26 | 情報報知システム、情報報知方法、情報処理装置及びその制御方法と制御プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130229342A1 (ja) |
JP (1) | JP5605725B2 (ja) |
CN (1) | CN103221968A (ja) |
WO (1) | WO2012063561A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014203224A (ja) * | 2013-04-04 | 2014-10-27 | カシオ計算機株式会社 | 表示装置及び表示システム |
WO2017187614A1 (ja) * | 2016-04-28 | 2017-11-02 | 富士通株式会社 | 通信制御装置、方法、及びシステム |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103455800A (zh) * | 2013-09-09 | 2013-12-18 | 苏州大学 | 基于智能识别的广告系统及相应广告推送方法 |
CN103605701A (zh) * | 2013-11-07 | 2014-02-26 | 北京智谷睿拓技术服务有限公司 | 通信对象的确定方法及确定装置 |
WO2016048688A1 (en) * | 2014-09-26 | 2016-03-31 | Thomson Licensing | Method and apparatus for providing interactive content |
KR101766902B1 (ko) * | 2015-11-26 | 2017-08-10 | 주식회사 제이엠랩 | 디지털 사이니지 및 그 제어 방법 |
CN107092430B (zh) * | 2016-02-18 | 2020-03-24 | 纬创资通(中山)有限公司 | 空间绘画计分方法、用于进行空间绘画计分的装置及系统 |
CN109214278B (zh) * | 2018-07-27 | 2023-04-18 | 平安科技(深圳)有限公司 | 用户指令匹配方法、装置、计算机设备及存储介质 |
CN113032605B (zh) * | 2019-12-25 | 2023-08-18 | 中移(成都)信息通信科技有限公司 | 一种信息展示方法、装置、设备及计算机存储介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010113313A (ja) * | 2008-11-10 | 2010-05-20 | Nec Corp | 電子広告装置、電子広告方法及びプログラム |
JP2010231355A (ja) * | 2009-03-26 | 2010-10-14 | Sanyo Electric Co Ltd | 情報表示装置 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080059994A1 (en) * | 2006-06-02 | 2008-03-06 | Thornton Jay E | Method for Measuring and Selecting Advertisements Based Preferences |
JP4267648B2 (ja) * | 2006-08-25 | 2009-05-27 | 株式会社東芝 | インターフェース装置及びその方法 |
EP2597868B1 (en) * | 2007-09-24 | 2017-09-13 | Qualcomm Incorporated | Enhanced interface for voice and video communications |
CN101925916B (zh) * | 2007-11-21 | 2013-06-19 | 高通股份有限公司 | 基于媒体偏好控制电子设备的方法和系统 |
CN101470883A (zh) * | 2007-12-26 | 2009-07-01 | 联想(北京)有限公司 | 广告播放方法和广告播放设备 |
US8464160B2 (en) * | 2008-09-29 | 2013-06-11 | Panasonic Corporation | User interface device, user interface method, and recording medium |
CN201383313Y (zh) * | 2009-02-19 | 2010-01-13 | 李赛 | 互动广告牌以及网络式互动广告系统 |
JP5343773B2 (ja) * | 2009-09-04 | 2013-11-13 | ソニー株式会社 | 情報処理装置、表示制御方法及び表示制御プログラム |
-
2011
- 2011-09-26 US US13/823,517 patent/US20130229342A1/en not_active Abandoned
- 2011-09-26 JP JP2012542845A patent/JP5605725B2/ja active Active
- 2011-09-26 WO PCT/JP2011/071802 patent/WO2012063561A1/ja active Application Filing
- 2011-09-26 CN CN2011800543464A patent/CN103221968A/zh active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010113313A (ja) * | 2008-11-10 | 2010-05-20 | Nec Corp | 電子広告装置、電子広告方法及びプログラム |
JP2010231355A (ja) * | 2009-03-26 | 2010-10-14 | Sanyo Electric Co Ltd | 情報表示装置 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014203224A (ja) * | 2013-04-04 | 2014-10-27 | カシオ計算機株式会社 | 表示装置及び表示システム |
WO2017187614A1 (ja) * | 2016-04-28 | 2017-11-02 | 富士通株式会社 | 通信制御装置、方法、及びシステム |
JPWO2017187614A1 (ja) * | 2016-04-28 | 2018-11-15 | 富士通株式会社 | 通信制御装置、方法、及びシステム |
US10885600B2 (en) | 2016-04-28 | 2021-01-05 | Fujitsu Limited | Communication control apparatus, communication control apparatus method, and system |
Also Published As
Publication number | Publication date |
---|---|
JPWO2012063561A1 (ja) | 2014-05-12 |
US20130229342A1 (en) | 2013-09-05 |
CN103221968A (zh) | 2013-07-24 |
JP5605725B2 (ja) | 2014-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5605725B2 (ja) | 情報報知システム、情報報知方法、情報処理装置及びその制御方法と制御プログラム | |
JP5527423B2 (ja) | 画像処理システム、画像処理方法、及び画像処理プログラムを記憶した記憶媒体 | |
JP6684883B2 (ja) | カメラエフェクトを提供する方法およびシステム | |
KR101796008B1 (ko) | 센서-기반 모바일 검색, 관련 방법들 및 시스템들 | |
US20200412975A1 (en) | Content capture with audio input feedback | |
US9888105B2 (en) | Intuitive computing methods and systems | |
US12063321B2 (en) | Modular camera interface with context-based display elements utilizing first and second lens | |
CN109242765B (zh) | 一种人脸图像处理方法、装置和存储介质 | |
KR102092931B1 (ko) | 시선 추적 방법 및 이를 수행하기 위한 사용자 단말 | |
KR101978299B1 (ko) | 콘텐츠 서비스 시스템에서의 콘텐츠 서비스를 위한 장치 | |
KR20130027081A (ko) | 직관적 컴퓨팅 방법들 및 시스템들 | |
CN108304762B (zh) | 一种人体姿态匹配方法及其设备、存储介质、终端 | |
US11625754B2 (en) | Method for providing text-reading based reward-type advertisement service and user terminal for executing same | |
US20210004082A1 (en) | Method for eye-tracking and terminal for executing the same | |
KR20160074315A (ko) | 사용자 단말 및 이의 햅틱 서비스 제공 방법 | |
KR20190035363A (ko) | 전자 장치 및 그 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11839039 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2012542845 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13823517 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11839039 Country of ref document: EP Kind code of ref document: A1 |