
WO2018008434A1 - Musical performance presentation device - Google Patents

Musical performance presentation device

Info

Publication number
WO2018008434A1
Authority
WO
WIPO (PCT)
Prior art keywords
performance
virtual reality
content data
data
viewer
Prior art date
Application number
PCT/JP2017/023262
Other languages
French (fr)
Japanese (ja)
Inventor
昌利 高木 (Masatoshi Takagi)
Original Assignee
株式会社エム・ティー・ケー (M.T.K. Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社エム・ティー・ケー (M.T.K. Co., Ltd.)
Publication of WO2018008434A1 publication Critical patent/WO2018008434A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems

Definitions

  • The present invention relates to a musical performance presentation device that realizes a performance in a virtual reality space.
  • The performance device of Patent Document 1 comprises viewpoint detection means for directly or indirectly detecting the user's viewpoint; display means mounted near the user's eyes; automatic performance means that performs automatically by reproducing automatic performance data; performance image generation means that generates a performance image in synchronization with the automatic performance; and control means that changes the drawing angle of the generated performance image according to the user's viewpoint detected by the viewpoint detection means and causes the display means to display the image at the changed angle.
  • The performance image shows a virtual player performing on a real or virtual musical instrument.
  • The present invention has been made in view of such circumstances. Its object is to provide a musical performance presentation device in which, in a virtual reality space, a performer, as one of the performers and viewers together with other performers and viewers, can be made to perceive that he or she is playing an instrument or the like in a real space, or a viewer can be made to perceive that he or she is watching the performance in a real space.
  • The musical performance presentation device of claim 1 comprises VR projection means for visually and aurally projecting, in front of the performer or viewer, virtual reality data that forms a virtual reality space composed of video and sound.
  • It further comprises input means for inputting performance response information, consisting of the performer's performance information or the viewer's reaction information, in addition to the virtual reality content data, i.e. the individual video and sound content data that constitute the virtual reality space to be projected by the VR projection means.
  • Control means generates, from the virtual reality content data and the performance response information, virtual reality data in which the performer can be perceived as playing with the input means in a real space, or the viewer can be perceived as watching the performance in a real space, and causes the VR projection means to project it.
  • In the device of claim 2, the virtual reality content data includes video and sound content data of other performers or other viewers input in advance.
  • In the device of claim 3, the virtual reality content data includes video and sound content data of other performers or other viewers generated in real time.
  • In the device of claim 4, the virtual reality content data includes content data for generating a virtual reality space whose atmosphere matches existing music information generated in advance as sound or live music information performed by the performer.
  • According to the present invention, in the virtual reality space, a performer can perceive that he or she is playing an instrument or the like in a real space together with other performers and viewers, or a viewer can perceive that he or she is watching the performance in a real space.
  • FIG. 1 is a block diagram showing an example of the configuration of a musical performance presentation device according to the present invention.
  • FIG. 2 is a block diagram showing an example of a player device of the device.
  • FIG. 3 is a block diagram showing an example of the VR content of the device.
  • FIG. 4 is a block diagram showing an example of a viewer device of the device.
  • The musical performance presentation device 1 realizes a performance in a virtual reality space.
  • In that space, a performer can play while feeling that he or she is really performing, and a listener can enjoy the music as if present at a concert venue.
  • For example, in a band performance, even if one performer is in a different location from the other performers, the device 1 aims to let that performer play his or her own instrument or part and feel as if performing (in ensemble) in the same place, the virtual reality space, with the other band members (other performers).
  • For a viewer, the aim is to feel as if watching and listening to the performance as a participant at the concert venue, even without actually being there.
  • As the basic configuration in FIG. 1 shows, the device 1 consists of a player device 10 and a viewer device 100 and, when necessary (details below), is connected to other player devices 10 and viewer devices 100 via the information communication network 5 or the like.
  • As shown in FIG. 1, the player device 10 (viewer device 100) consists of a performance presentation processing terminal 12, VR projection means 30 worn by the player P who performs (or by a viewer A, e.g. a member of the concert audience), and input means 40 for entering information.
  • The performance presentation processing terminal 12 comprises control means 14, which controls the player device 10 (viewer device 100) as a whole, and storage means 20, which stores data such as the virtual reality content data (VR content data) 22.
  • The performance presentation processing terminal 12 may be any device capable of the various kinds of control, such as a computer, a portable information terminal (e.g. a smartphone or tablet), or a game machine, that can output video and audio and obtain information from the input means 40.
  • The VR projection means 30 visually and aurally projects, in front of the player P or the viewer A, the virtual reality data that forms the virtual reality space composed of video and sound.
  • A typical example is an HMD (Head-Mounted Display), and in the following the VR projection means 30 is described as the HMD 32.
  • The VR projection means 30 is not limited to the HMD 32, however: a smartphone or any other device providing the same function may be used, and the means is not limited by its form.
  • Likewise, although the HMD 32 is described as including an acoustic device such as headphones, an acoustic device separate from the HMD may be used instead.
  • The virtual reality content data 22 stored in the storage means 20 is the individual video and sound content data that constitutes the virtual reality space to be projected by the VR projection means 30.
  • The input means 40 inputs performance response information, i.e. the performance information of the player P or the reaction information of the viewer A, to the control means 14 (details below). The control means 14 in turn generates, from the virtual reality content data 22 and the performance response information, the virtual reality data to be projected on the VR projection means 30.
  • The virtual reality content data 22 is broadly divided into video data 24 and audio data 26. Combined, the two let the VR projection means 30 present the virtual reality space, and they form the basic information from which the projected virtual reality data is generated.
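To make this division concrete, the content data described here could be organized along the following lines. This is a minimal sketch in Python; every class and field name is an assumption for illustration, not an identifier from this document.

```python
from dataclasses import dataclass, field

# Hypothetical layout of the VR content data 22: video data 24 and
# audio data 26, each holding the sub-categories named in the text.

@dataclass
class VideoData:
    background_stage: list = field(default_factory=list)  # 24a: venues, stage devices
    audience: list = field(default_factory=list)          # 24b: audience appearance/reactions
    band_members: list = field(default_factory=list)      # 24c: other performers
    player: list = field(default_factory=list)            # 24d: the player's own avatar

@dataclass
class AudioData:
    music: list = field(default_factory=list)        # 26a: song and performance tracks
    environment: list = field(default_factory=list)  # 26b: venue ambience, crowd noise

@dataclass
class VRContentData:
    video: VideoData = field(default_factory=VideoData)
    audio: AudioData = field(default_factory=AudioData)

content = VRContentData()
content.video.background_stage.append("virtual concert hall")
content.audio.music.append("existing song, guitar part")
```

The point of the split is simply that both halves must be combined to generate the projected data, which matches the text above.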
  • The video data 24 of the virtual reality content data 22 includes, for example, background/stage apparatus data 24a, audience data 24b, band member data 24c, and player data 24d.
  • The background/stage apparatus data 24a is information on background (venue) images that can form the virtual reality space in three dimensions, covering fictitious and real settings such as virtual or real concert venues, baseball stadiums, arenas, and scenes like a coast or a plain.
  • It also includes information on stage devices that can form the virtual reality space in three dimensions and on images of the surrounding atmosphere, such as spotlights, large screens, smoke, flames, and illuminations, whether the devices are fictitious or real.
  • The audience data 24b includes, for example, the appearance of people other than the player P and the band members who can appear three-dimensionally in the virtual reality space, information on items those people carry such as penlights and towels, and information on their reactions such as applause and cheers.
  • The audience data 24b covers every gender, may describe real or fictitious people, and may also include real and fictional animals.
  • The band member data 24c and the player data 24d hold the appearance of the player P and of the other performers who can appear three-dimensionally in the virtual reality space, together with the instrument played and the manner and expression of the performance; for vocals or chorus, the singing; and for a dancer, the movements of the dance.
  • The band member data 24c and the player data 24d may describe real people or fictitious ones, or combine the two, e.g. a photograph of a real person's face inserted into the image of a fictitious body; the combination is arbitrary. The data can also be used as audience data 24b, and how it is used is likewise arbitrary. Viewed as performers, the band member data 24c and player data 24d represent, for example, vocals, guitar, bass, drums, keyboard, chorus, dancers, DJs, or an orchestra.
  • The sound data 26 of the virtual reality content data 22 includes, for example, music data 26a and environmental sound data 26b.
  • The music data 26a is, for example, performance information of existing music: the performance sounds of instruments and singing voices.
  • In a band performance, the music data 26a contains, for example, the vocal, guitar, bass, drum, and keyboard parts, and each part can be individually turned off (e.g. turning off the player's own track) or on.
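The per-part on/off switching described here can be sketched as a simple filter over named tracks; the function and data names below are assumptions. Muting the player's own part yields a "minus-one" backing mix that the player completes live.

```python
# Sketch of per-part track switching: each instrument part in the music
# data can be turned off individually, e.g. muting the guitar track so
# the player supplies that part live.

def mix_tracks(tracks, muted_parts):
    """Return only the (part, samples) pairs that remain audible."""
    return [(part, samples) for part, samples in tracks if part not in muted_parts]

song = [("vocal", [...]), ("guitar", [...]), ("bass", [...]), ("drums", [...])]
playback = mix_tracks(song, muted_parts={"guitar"})
# playback now omits the guitar part, which the player performs live
```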
  • The music data 26a can also include performance motion data, i.e. information on the motions of the player P and the other performers for each piece of music.
  • Performance motion data may be provided in advance, or input as the player P performs live; for a piece without its own performance motion data, general-purpose performance motion data matched to, for example, the tempo of the piece can be used. Such general-purpose performance motion data can also be stored as music data 26a.
  • The environmental sound data 26b is, for example, information on the ambient sound at a concert venue; information on sounds of audience reactions that have no accompanying video data 24 is also environmental sound data 26b.
  • As noted above, the virtual reality content data 22 may include video and sound content data of other performers or other viewers input in advance, video and sound content data of other performers or other viewers generated in real time, and content data for generating a virtual reality space whose atmosphere matches existing music information generated in advance as sound or live music information performed by the performer.
  • The examples of the virtual reality content data 22 given above are merely illustrative; any video and audio information that constitutes the virtual reality space and serves the object of the present application may be used.
  • The virtual reality content data 22 is stored in the storage means 20 in advance, but it need not be stored in a fixed manner: by connecting external storage means to the performance presentation processing terminal 12, virtual reality content data 22 can be added to the storage means 20 or replaced.
  • The information communication network 5 is, for example, an electric communication line such as the Internet.
  • It may also simply be a cable directly connecting the player device 10 and the viewer device 100; any form that lets the player device 10 and the viewer device 100 connect to the outside may be used.
  • The control means 14 generates virtual reality data, projected by the VR projection means 30, in which the player P can be perceived as performing with the input means 40 in a real space, or the viewer A can be perceived as watching the performance in a real space.
  • In the virtual reality space, therefore, as one of the performers and viewers, the player P (performer) can be perceived as playing with the instrument 44 or the like in a real space, and the viewer A can be perceived as watching the performance in a real space.
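The role of the control means 14 described in these passages, combining stored content data with live performance response information into the projected virtual reality data, might be sketched as follows; the frame structure and key names are assumptions for illustration.

```python
# Sketch: merge stored VR content data with live performance response
# information into one frame handed to the VR projection means 30.

def generate_vr_frame(content_data, performance_response):
    frame = {
        "video": dict(content_data["video"]),  # venue, audience, band members
        "audio": dict(content_data["audio"]),  # backing tracks, ambience
    }
    # Overlay the live input so the projected space reacts in real time.
    frame["audio"]["live"] = performance_response.get("audio")
    frame["video"]["player_motion"] = performance_response.get("motion")
    return frame

frame = generate_vr_frame(
    {"video": {"venue": "concert hall"}, "audio": {"backing": "minus-one mix"}},
    {"audio": "guitar input", "motion": "strumming"},
)
```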
  • Using the performance presentation processing terminal 12, the player P sets, with the input device 42 (keyboard, mouse, controller, etc.), which music to play, with which band members (other performers, etc.), at which venue (concert hall, etc.), and which part to take (which may be vocals).
  • The player P then connects the instrument 44 or microphone 46 he or she will use to the performance presentation processing terminal 12 (control means 14) and configures it.
  • The interface between the instrument 44 and the performance presentation processing terminal 12 may take any electronically connectable form, wired or wireless, and the signal may be digital or analog.
  • For example, the instrument 44 and the performance presentation processing terminal 12 can be connected by USB or by MIDI, or an analog audio signal from the instrument 44 may be sent to the control means 14 of the terminal 12.
  • When the performance presentation processing terminal 12 is a game machine, performance response information serving as performance information may be input to the terminal 12 (control means 14) via a controller used as the input device 42.
  • By wearing the HMD 32, i.e. the VR projection means 30, the player P can experience the virtual reality space through video and sound.
  • The virtual reality space configured by the player P is projected onto the HMD 32, so the player P appears to be at the concert venue or other setting he or she chose.
  • Other performers and spectators appear before the player's eyes via the HMD 32, and the player P starts to play or sing the music together with the other, imaginary performers.
  • By playing the instrument 44 or singing into the microphone 46, the player P sends performance response information, i.e. performance information, through these input means 40 to the control means 14 of the performance presentation processing terminal 12.
  • The control means 14 generates virtual reality data in which the player P can be perceived as playing with the input means 40 (for example, the instrument 44) in a real space, and projects it onto the HMD 32, the VR projection means 30.
  • For example, with the guitar played by the player P as the instrument 44, the performance information is sent to the control means 14, and in the virtual reality space projected on the HMD 32 the player P can be perceived as playing his or her own part.
  • The other performers are projected as playing their own instruments and parts, so the player P can perform in the virtual reality space while feeling as if in a session with other performers in a real space.
  • The performance thus takes place with other performers at the chosen concert venue, in front of the chosen audience and while sensing its reactions, so the player can perform in the virtual reality space with the same feeling as a real performance.
  • The performances of the other performers and the reactions of the audience can be determined by the control means 14 from the prepared virtual reality content data 22 and from the performance of the player P.
  • In addition, if motion capture means capable of capturing the movements of the player P is provided as an input device 42, the player P's performance motion can be sent to the control means 14 and reflected in the virtual reality space.
  • As the motion capture means, sensor information from the HMD 32, such as its acceleration sensor, gyro sensor, proximity sensor, and gaze detection sensor, can be used. The player P can then watch his or her own performance in the virtual reality space through the eyes of another performer, and can of course also be shown scenes other than himself or herself from the gaze of another performer, the gaze of the audience, or any other viewpoint.
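How such HMD sensor readings might feed the avatar can be sketched as below; the angle limits and field names are invented for the example, not taken from the document.

```python
# Sketch: turn raw gyro angles from the HMD into a clamped avatar
# head pose, a coarse form of the motion capture described above.

def head_pose_from_gyro(yaw_deg, pitch_deg):
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    return {
        "yaw": clamp(yaw_deg, -90.0, 90.0),      # looking left/right
        "pitch": clamp(pitch_deg, -60.0, 60.0),  # looking up/down
    }

pose = head_pose_from_gyro(120.0, -10.0)  # yaw beyond the limit is clamped
```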
  • In the description above, the other performers and spectators are fictitious ones prepared in advance in the virtual reality content data 22; however, as shown in FIG. 1, it is also possible to connect in real time to the player devices 10 of other performers over the information communication network 5 and hold a session as if playing in the same place while actually in remote places. In that case, not all performers need actually perform; some may remain fictitious ones prepared in advance in the virtual reality content data 22.
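For such a remote session, each player device would need to exchange performance events over the network 5. The sketch below shows one possible message format; the schema is an assumption, and the transport layer (sockets, etc.) is omitted.

```python
import json
import time

# Sketch: serialize a performance event (e.g. one played note) so it can
# be sent to the other player devices 10 and replayed in their spaces.

def encode_event(player_id, note, velocity):
    return json.dumps({
        "player": player_id,
        "note": note,
        "velocity": velocity,
        "t": time.time(),  # timestamp for aligning remote performances
    })

def decode_event(raw):
    return json.loads(raw)

raw = encode_event("player-1", 64, 90)
event = decode_event(raw)
```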
  • Likewise, a spectator, the viewer A described later, can be connected in real time through the viewer device 100 over the information communication network 5 and made to feel as if listening to the performance in the same place while actually remote.
  • The virtual reality data forming the virtual reality space generated by the performance of the player P can be stored as-is or distributed over the information communication network 5, and can be stored and distributed in two or three dimensions so as to be reproducible on other media. Further, it is possible, for example, to score the performance of the player P by comparing it with information on an existing performance.
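One simple way such scoring could work, purely as an illustration with invented details, is to match the player's performed notes against a reference performance within a timing tolerance:

```python
# Sketch: score a performance by the fraction of reference notes that
# the player hit with the right pitch at roughly the right time.
# Representation and tolerance are assumptions for this example.

def score_performance(played, reference, tolerance=0.1):
    """played/reference: lists of (time_sec, note). Returns 0-100."""
    hits = 0
    for t_ref, n_ref in reference:
        if any(n == n_ref and abs(t - t_ref) <= tolerance for t, n in played):
            hits += 1
    return 100.0 * hits / len(reference) if reference else 0.0

ref = [(0.0, 60), (0.5, 64), (1.0, 67)]
play = [(0.02, 60), (0.48, 64), (1.3, 67)]  # last note is too late
score = score_performance(play, ref)        # two of three notes matched
```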
  • Using the performance presentation processing terminal 12, the viewer A sets, with the input device 42 (keyboard, mouse, controller, etc.), which band's performance to watch at which concert venue. As necessary, the viewer A also connects to the terminal 12 (control means 14) and configures the microphone 46, penlight 48, and other input means 40 for entering performance response information, i.e. his or her reaction information.
  • The viewer A then wears the HMD 32, the VR projection means 30, making the virtual reality space perceivable through video and sound.
  • The virtual reality space configured by the viewer A is projected onto the HMD 32, and the viewer A can watch the chosen band's performance as if present at the concert venue he or she chose.
  • When the viewer A cheers along with the band performance into the microphone 46, or operates another input device 42 (keyboard, mouse, controller, etc.) or waves the penlight 48, this reaction is sent as performance response information through these input means 40 to the control means 14 of the performance presentation processing terminal 12.
  • The control means 14 can then generate virtual reality data in which the viewer A can be perceived as reacting with the input means 40 (for example, the microphone 46) in a real space, reflecting the viewer A's reactions, and project it on the HMD 32, the VR projection means 30.
  • If motion capture means capable of capturing the movements of the viewer A is provided as an input device 42, the motion of the viewer A's reactions can be sent to the control means 14 and reflected in the virtual reality space.
  • As the motion capture means, sensor information from the HMD 32, such as its acceleration sensor, gyro sensor, proximity sensor, and gaze detection sensor, can be used. The viewer A can then see himself or herself as a spectator in the virtual reality space, and can of course also be shown scenes other than himself or herself from the gaze of a performer, the gaze of another spectator (viewer), or any other viewpoint.
  • In the description above, the performers and the other spectators are fictitious ones prepared in advance in the virtual reality content data 22; however, as shown in FIG. 1, it is also possible to connect in real time to the player devices 10 and viewer devices 100 of other performers and viewers over the information communication network 5 and feel as if the performance is happening before one's eyes while actually in remote places. In that case, not all performers and spectators (viewers) need actually perform or watch; some may remain fictitious ones prepared in advance in the virtual reality content data 22.
  • The virtual reality data forming the virtual reality space generated while the viewer A watches can be stored as-is or distributed over the information communication network 5, and can be stored and distributed two- or three-dimensionally so as to be reproducible on other media.
  • As described above, according to the present invention it is possible to provide a musical performance presentation device in which the performer can be perceived as playing with an instrument or the like in a real space together with other performers or viewers, or the viewer can be perceived as watching the performance in a real space.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Acoustics & Sound (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The musical performance presentation device is characterized by comprising a control means (14) that generates, from virtual reality content data (22) and musical performance reaction information, the virtual reality data to be projected by a VR projection means (30). The control means (14) generates the virtual reality data and causes the VR projection means (30) to project it such that a player perceives that he or she is playing music using an input means (40) in a real space, or an audience member perceives that he or she is listening to the musical performance in a real space.

Description

Musical Performance Presentation Device
 The present invention relates to a musical performance presentation device that realizes a performance in a virtual reality space.
 Performance devices have conventionally been proposed that let a performer play in a virtual reality space while feeling that he or she is really performing; one example is the device of Patent Document 1.
 The performance device of Patent Document 1 comprises viewpoint detection means for directly or indirectly detecting the user's viewpoint; display means mounted near the user's eyes; automatic performance means that performs automatically by reproducing automatic performance data; performance image generation means that generates a performance image in synchronization with the automatic performance; and control means that changes the drawing angle of the generated performance image according to the user's viewpoint detected by the viewpoint detection means and causes the display means to display the image at the changed angle. The performance image shows a virtual player performing on a real or virtual musical instrument.
JP 2007-271698 A
 However, with a conventional performance device it was not possible, for example in a band performance, to play only a specific part oneself and perform in the virtual reality space while perceiving the band as playing in a real space.
 The present invention has been made in view of such circumstances. Its object is to provide a musical performance presentation device in which, in a virtual reality space, a performer, together with other performers and viewers, can be made to perceive that he or she is playing an instrument or the like in a real space, or a viewer can be made to perceive that he or she is watching the performance in a real space.
 The musical performance presentation device of claim 1 comprises: VR projection means for visually and aurally projecting, in front of the performer or viewer, virtual reality data that forms a virtual reality space composed of video and sound; input means for inputting performance response information, consisting of the performer's performance information or the viewer's reaction information, in addition to the virtual reality content data, i.e. the individual video and sound content data that constitute the virtual reality space to be projected by the VR projection means; and control means that generates, from the virtual reality content data and the performance response information, the virtual reality data to be projected by the VR projection means. The control means generates virtual reality data in which the performer can be perceived as playing with the input means in a real space, or the viewer can be perceived as watching the performance in a real space, and causes the VR projection means to project it.
 In the device of claim 2, the virtual reality content data includes video and sound content data of other performers or other viewers input in advance.
 In the device of claim 3, the virtual reality content data includes video and sound content data of other performers or other viewers generated in real time.
 In the device of claim 4, the virtual reality content data includes content data for generating a virtual reality space whose atmosphere matches existing music information generated in advance as sound or live music information performed by the performer.
 According to the present invention, in the virtual reality space, a performer can perceive that he or she is playing an instrument or the like in a real space together with other performers and viewers, or a viewer can perceive that he or she is watching the performance in a real space.
本発明に係る演奏上演装置の構成の一例を示す構成図である。It is a block diagram which shows an example of a structure of the performance performance apparatus which concerns on this invention. 同演奏上演装置のプレーヤ装置の一例を示す構成図である。It is a block diagram which shows an example of the player apparatus of the performance performance apparatus. 同演奏上演装置のVRコンテンツの一例を示す構成図である。It is a block diagram which shows an example of the VR content of the performance apparatus. 同演奏上演装置の視聴者装置の一例を示す構成図である。It is a block diagram which shows an example of the viewer apparatus of the performance performance apparatus.
 以下、本発明の形態について図面を参照しながら具体的に説明する。図1は、本発明に係る演奏上演装置の構成の一例を示す構成図である。図2は、同演奏上演装置のプレーヤ装置の一例を示す構成図である。図3は、同演奏上演装置のVRコンテンツの一例を示す構成図である。図4は、同演奏上演装置の視聴者装置の一例を示す構成図である。 Hereinafter, embodiments of the present invention will be specifically described with reference to the drawings. FIG. 1 is a block diagram showing an example of the configuration of a performance performance device according to the present invention. FIG. 2 is a block diagram showing an example of a player device of the performance device. FIG. 3 is a block diagram showing an example of VR content of the performance device. FIG. 4 is a block diagram showing an example of a viewer device of the performance device.
 The performance presentation device 1 according to the present invention realizes a performance in a virtual reality space: in that space a performer can play while genuinely feeling that he or she is performing, and a viewer can enjoy the music as if present at a concert venue. Furthermore, the performance presentation device 1 is intended, for example in a band performance, to let one of the performers play his or her own instrument or part even while located apart from the other performers, and to feel as if performing together (in an ensemble) with the other band members (the other performers) in the same place (the virtual reality space). For a viewer, the purpose is, for example, to feel as if he or she were at the concert venue, watching and listening to the performance as a concert attendee, even without actually being there.
 As its basic configuration shown in FIG. 1, the performance presentation device 1 is composed of a player device 10 and a viewer device 100, and is connected, as needed (details are described later), to other player devices 10 and viewer devices 100 via the information communication network 5 or the like.
 As shown in FIG. 1, the player device 10 (viewer device 100) comprises a performance presentation processing terminal 12, VR projection means 30 worn by the player P who is a performer (or, for example, by a viewer A who is a member of a concert audience), and input means 40 for entering information. The performance presentation processing terminal 12 in turn comprises control means 14, which governs the overall control of the player device 10 (viewer device 100), and storage means 20, which stores data such as virtual reality content data (VR content data) 22. The performance presentation processing terminal 12 is, for example, a computer, a portable information terminal (such as a smartphone or tablet), or a game console, provided it is capable of the various kinds of control required, of outputting video and audio, and of receiving information from the input means 40.
 The VR projection means 30 visually and aurally projects, before the eyes of the player P (a performer) or the viewer A, virtual reality data for forming a virtual reality space composed of video and sound; as shown in FIG. 2 and FIG. 4, it is, for example, an HMD (Head Mounted Display) 32, that is, a head-mounted display. In the description of the present application, the VR projection means 30 is explained as the HMD 32, but it is not limited to the HMD 32: any device having the function of the VR projection means 30 may be used (for example, a smartphone may serve as the VR projection means 30), and the means itself imposes no limitation. Also, in the examples of FIG. 2 and FIG. 4 the HMD 32 is depicted as including an audio device such as headphones, but an audio device separate from the HMD may be used, and the type of audio device imposes no limitation either.
 The virtual reality content data 22 stored in the storage means 20 is the individual content data, relating to video and sound, that constitutes the virtual reality space to be projected by the VR projection means 30. The input means 40 inputs performance reaction information, consisting of performance information of the player P (a performer) or reaction information of the viewer A, to the control means 14 (details are described later). The control means 14 further has the function of generating, from the virtual reality content data 22 and the performance reaction information, the virtual reality data to be projected by the VR projection means 30.
 Specifically, the virtual reality content data 22 is broadly divided into, for example, video data 24 and audio data 26. By combining the video data 24 and the audio data 26, the virtual reality space becomes something that can be experienced through the VR projection means 30; this combination is the basic information from which the virtual reality data to be projected is generated.
 More specifically, the video data 24 of the virtual reality content data 22 includes, for example, background/stage-set data 24a, audience data 24b, band member data 24c, and player data 24d. The background/stage-set data 24a is, for example, video information on backgrounds (places and the like) from which the virtual reality space can be formed three-dimensionally: fictitious virtual concert venues, real concert venues, baseball stadiums, athletic stadiums, and the like, as well as fictitious or real scenes such as a seashore or a plain. The background/stage-set data 24a also includes, for example, video information on stage equipment and changing ambient atmosphere from which the virtual reality space can be formed three-dimensionally: fictitious or real equipment such as spotlights, large screens, smoke, flames, and illuminations.
 The audience data 24b includes, for example, information on the appearance of people other than the player P and the band members who can appear three-dimensionally in the virtual reality space, information on items those people carry such as penlights and towels, and information on their reactions such as applause and cheers. The audience data 24b contains information for both men and women; it may be information on real persons or on fictitious persons, and may also include real or fictitious animals.
 The band member data 24c and the player data 24d include information on the appearance of the player P and of the other performers who can appear three-dimensionally in the virtual reality space, on the instruments they play, and on their manner of playing and their expressions; for a vocalist or chorus, it is information on how they sing, and for a dancer, it is information on the dance (motion information). The band member data 24c and the player data 24d may be information on real persons or on fictitious persons, or any combination thereof, such as a photograph of a real person's face fitted onto a video of a fictitious body. The band member data 24c and the player data 24d can also be used as audience data 24b; how they are used is itself arbitrary. Viewed as performers, the band member data 24c and the player data 24d correspond to, for example, vocals, guitar, bass, drums, keyboard, chorus, dancers, DJ, orchestra, and so on.
 More specifically, the audio data 26 of the virtual reality content data 22 includes, for example, music data 26a and environmental sound data 26b. The music data 26a is, for example, performance information of existing musical pieces: information on instrument sounds and singing voices. The music data 26a is also, for example, information on the individual sounds that make up a band performance, such as vocals, guitar, bass, drums, and keyboard as an instrument, and each of these can be individually turned off (player track OFF and the like) or on.
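As a purely illustrative sketch of the per-part on/off capability described above (the class and field names below are assumptions for illustration, not taken from the embodiment), the music data 26a could be modeled as a set of individually switchable tracks:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """One part of the music data 26a (e.g. vocals, guitar, drums)."""
    part: str
    samples: list           # audio samples for this part
    enabled: bool = True    # individually switchable on/off ("player track OFF")

@dataclass
class MusicData:
    """Simplified stand-in for the music data 26a."""
    title: str
    tracks: list = field(default_factory=list)

    def set_track(self, part: str, enabled: bool) -> None:
        # Turn a single part on or off, leaving the other parts untouched.
        for t in self.tracks:
            if t.part == part:
                t.enabled = enabled

    def active_parts(self) -> list:
        return [t.part for t in self.tracks if t.enabled]

song = MusicData("demo", [Track("vocals", []), Track("guitar", []), Track("drums", [])])
song.set_track("guitar", False)   # the player will supply the guitar part live
print(song.active_parts())
```

This reflects the scenario in the embodiment where the piece's guitar track is turned off so that the player's live guitar can take its place.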
 The music data 26a can also carry, for each musical piece, performance motion data, which is information on the motions of the player P and of each of the other performers during the performance. For an existing musical piece, performance motion data can be prepared in advance; however, when the player P performs and inputs the piece live, or for a piece that has no accompanying performance motion data, it is also possible to analyze the tempo and other characteristics of the piece so that generic performance motion data can be used, and such generic performance motion data can also be held as music data 26a.
 The environmental sound data 26b is, for example in the case of a concert, information on the ambient sound at the concert venue; sound information from audience response that is not accompanied by video data 24 is also environmental sound data 26b.
 The virtual reality content data 22 may also include content data relating to the video and sound of other performers or other viewers input in advance, content data relating to the video and sound of other performers or other viewers generated in real time, and, further, content data for generating a virtual reality space whose atmosphere matches existing music information generated in advance as audio, or live music information performed by the performer.
 The examples of virtual reality content data 22 described above are merely examples; any video or sound information that contributes to achieving the object of the present application, that is, that constitutes the virtual reality space, may be used without limitation. Although in each figure the virtual reality content data 22 is shown as stored in the storage means 20 in advance, it need not be stored there permanently: external storage means may be connected to the performance presentation processing terminal 12 so that virtual reality content data 22 can be added to, or swapped in, the storage means 20. It is also possible to add or swap data in the storage means 20 from outside via the information communication network 5 shown in FIG. 1. The information communication network 5 is, for example, a telecommunication line such as the Internet, but it may simply be a cable connection directly interconnecting the player devices 10 and viewer devices 100; as long as the player device 10 and the viewer device 100 can connect to the outside, the form of connection imposes no limitation.
 The control means 14 then generates virtual reality data from which the player P (a performer) can perceive the performance as if playing with the input means 40 in a real space, or from which the viewer A can perceive it as if watching the performance in a real space, and causes the VR projection means 30 to project it.
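A minimal sketch of this role of the control means 14 (the function name and dictionary keys below are assumptions for illustration): per update, the stored content data and the incoming performance reaction information are combined into one frame of virtual reality data.

```python
def generate_vr_frame(vr_content, reaction):
    """Combine VR content data 22 with performance reaction information
    into one frame of virtual reality data (video + audio layers)."""
    frame = {
        "video": list(vr_content["video"]),   # backgrounds, audience, band members...
        "audio": list(vr_content["audio"]),   # backing tracks, environmental sound...
    }
    if reaction.get("notes"):                 # live input from instrument 44 / mic 46
        frame["audio"].append(("live", reaction["notes"]))
    if reaction.get("motion"):                # e.g. motion derived from HMD sensors
        frame["video"].append(("player_motion", reaction["motion"]))
    return frame

content = {"video": ["venue", "audience"], "audio": ["drums", "bass"]}
frame = generate_vr_frame(content, {"notes": [60, 64, 67], "motion": None})
print(frame["audio"])
```

The layered-dictionary representation here is only a stand-in for whatever internal format the control means 14 actually uses.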
 According to the performance presentation device 1 of the present application configured as above, in the virtual reality space, as one of the performers or viewers together with the other performers and viewers, the player P (a performer) can perceive the performance as if playing with the instrument 44 or the like in a real space, or the viewer A can perceive it as if watching the performance in a real space.
 Next, an embodiment in which the player P performs using the performance presentation device 1 will be described. In this embodiment, a mode of use is described with reference to FIG. 2 and other figures in which, in a band performance, the player P, one of the performers, plays his or her own part on his or her own instrument and feels as if performing together (in an ensemble) with the other band members (the other performers) in the same place (the virtual reality space).
 First, on the performance presentation processing terminal 12, the player P sets, with the input device 42 (a keyboard, mouse, controller, or the like), what piece is to be played, with what band members (other performers and the like), in what place (a concert venue or the like), and which part of which instrument (possibly vocals) he or she will take. The player P also connects the instrument 44 or microphone 46 he or she will use to the performance presentation processing terminal 12 (control means 14) and sets it up.
 The interface between the instrument 44 and the performance presentation processing terminal 12 (control means 14) may take any form as long as an electronic connection is possible; the connection may be wired or wireless, and the signal format may be digital or analog. For example, the instrument 44 and the performance presentation processing terminal may be connected by USB or by MIDI, or an analog audio signal from the instrument 44 may be sent to the control means 14 of the performance presentation processing terminal 12. When the performance presentation processing terminal 12 is a game console, performance reaction information, i.e. performance information, may be input to the performance presentation processing terminal 12 (control means 14) via a controller serving as the input device 42.
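As one hedged illustration of receiving digital performance input over the MIDI connection mentioned above (the byte layout follows the standard MIDI channel-voice-message wire format; the function name is an assumption), a note message from the instrument 44 could be decoded like this:

```python
def decode_midi_message(data: bytes):
    """Decode a 3-byte MIDI channel message into performance information.
    Returns (event, channel, note, velocity), or None for unsupported input."""
    if len(data) != 3:
        return None
    status, note, velocity = data
    kind = status & 0xF0      # upper nibble of the status byte: message type
    channel = status & 0x0F   # lower nibble: MIDI channel (0-15)
    if kind == 0x90 and velocity > 0:
        return ("note_on", channel, note, velocity)
    # Note-off is 0x80, but note-on with velocity 0 is also treated as note-off.
    if kind == 0x80 or (kind == 0x90 and velocity == 0):
        return ("note_off", channel, note, velocity)
    return None

# A note-on for middle C (note number 60) at velocity 100 on channel 0:
print(decode_midi_message(bytes([0x90, 60, 100])))
```

Events decoded this way would then be forwarded to the control means 14 as performance reaction information.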
 Next, the player P puts on the HMD 32, which is the VR projection means 30, entering a state in which the virtual reality space can be experienced through video and sound. When the player P then operates the performance presentation processing terminal 12 and starts the performance, the virtual reality space set by the player P is projected on the HMD 32, and the player P finds himself or herself, as it were, in the concert venue or other setting he or she has chosen. The other performers and the audience also appear before his or her eyes via the HMD 32. The player P then begins to play or sing the piece along with the other, fictitious performers. By playing the instrument 44 or singing into the microphone 46, the player P sends performance reaction information, i.e. performance information, to the control means 14 of the performance presentation processing terminal 12 via these input means 40. The control means 14 then generates virtual reality data from which the player P can perceive the performance as if playing with the input means 40 (for example, the instrument 44) in a real space, and causes the HMD 32, the VR projection means 30, to project it.
 When, for example, the virtual reality content data 22 used contains the piece with its guitar part turned off, and the player P performs on a guitar serving as the instrument 44 and sends its performance reaction information to the control means 14, the virtual reality space projected on the HMD 32 shows the player P playing his or her own part. The other performers are projected as playing their respective instruments and parts, and the player P can perform in the virtual reality space with the sensation of holding a session with the other performers in a real space.
 Also, by setting the concert venue and the audience on the performance presentation processing terminal 12, it is possible to perform in a virtual reality space with the sensation of playing together with the other performers in the chosen concert venue, and of playing before the chosen audience while feeling its reactions.
 If the piece to be performed has been prepared in advance as virtual reality content data 22, the playing of the other performers and the reactions of the audience appear in the virtual reality space on the basis of that prepared virtual reality content data 22. If, however, the playing of the other performers or the reactions of the audience have not been prepared as virtual reality content data 22, or the information is insufficient, the control means 14 can analyze the tempo and other characteristics of the piece from the way the player P plays, so that they appear in the virtual reality space. Even when a piece prepared in advance as virtual reality content data 22 is used, the band lineup, the presence or absence and appearance of the audience, and other such combinations can be set freely.
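The tempo analysis mentioned above could, as one minimal sketch (the function name and the median-interval heuristic are assumptions, not the embodiment's actual method), estimate a beats-per-minute value from the onset times of the player P's live notes and use it to drive generic performance motion data:

```python
def estimate_bpm(onset_times):
    """Estimate tempo (beats per minute) from note-onset timestamps in seconds,
    using the median inter-onset interval as a robust beat-period guess."""
    if len(onset_times) < 2:
        return None
    intervals = sorted(
        b - a for a, b in zip(onset_times, onset_times[1:]) if b > a
    )
    if not intervals:
        return None
    median_interval = intervals[len(intervals) // 2]
    return 60.0 / median_interval

# Onsets roughly every 0.5 s correspond to about 120 BPM.
print(round(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0])))
```

The median is used rather than the mean so that a single dropped or doubled note does not skew the estimate.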
 Furthermore, by using motion capture means capable of capturing the player P's movements as the input device 42, the player P's playing motions can be sent to the control means 14 and reflected in the virtual reality space. As the motion capture means, sensor information of the HMD 32 can be used, for example from an acceleration sensor, gyro sensor, proximity sensor, or gaze detection sensor provided in the HMD 32. The player P can then also watch his or her own playing in the virtual reality space from the viewpoint of another performer. Of course, the player P can equally be enabled to watch everything other than himself or herself in the virtual reality space from the viewpoint of another performer, of an audience member, or of yet another viewpoint.
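As a crude, hedged sketch of using HMD sensor information as motion capture input (the function name and simple Euler integration are assumptions for illustration; real head tracking would fuse several sensors and correct for drift), yaw-rate samples from the HMD 32's gyro sensor could be accumulated into a head-orientation angle:

```python
def integrate_head_yaw(gyro_yaw_rates, dt):
    """Accumulate yaw-rate samples (deg/s) from an HMD gyro sensor into a
    head-orientation angle (deg), one value per sample, as a simple
    motion-capture signal for the control means."""
    yaw = 0.0
    trace = []
    for rate in gyro_yaw_rates:
        yaw += rate * dt          # Euler integration over one sample period
        trace.append(yaw)
    return trace

# Four samples at 90 deg/s, 0.25 s apart: the head turns 90 degrees in total.
print(integrate_head_yaw([90.0, 90.0, 90.0, 90.0], 0.25))
```

A signal like this could both animate the player's avatar and select which viewpoint is rendered.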
 In the above description, the other performers and the audience are fictitious ones prepared in advance as virtual reality content data 22; however, as shown in FIG. 1, it is also possible to connect in real time to the player devices 10 of other performers via the information communication network 5, and to hold a session as if performing in the same place while actually being in remote places, in real time. In this case, not all the performers need be real and performing: some of the performers may be fictitious ones prepared in advance as virtual reality content data 22 as described above. Likewise, for audience members who are viewers A, described later, it is also possible to connect in real time to viewer devices 100 via the information communication network 5, so that the performance is listened to as if in the same place while the listeners are actually in remote places, in real time.
 The virtual reality data forming the virtual reality space generated by the player P's performance can also be stored as it is, or distributed over the information communication network 5, and this virtual reality data can be stored or distributed two-dimensionally or three-dimensionally so as to be reproducible on other media. Furthermore, it is also possible, for example, to score the player P's performance by comparing it with information on an existing performance.
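One way the scoring mentioned above might be sketched (the function name, the (note, onset-time) representation, and the matching rule are all assumptions for illustration, not the embodiment's actual scoring method) is to count how many notes of the existing performance the player reproduced at roughly the right time:

```python
def score_performance(played, reference, tolerance=0.1):
    """Score a performance against a reference given as (note, onset-time)
    pairs: a reference note is matched if the same pitch was played within
    `tolerance` seconds of it. Returns the percentage of matched notes."""
    if not reference:
        return 0.0
    matched = 0
    for ref_note, ref_time in reference:
        if any(
            note == ref_note and abs(time - ref_time) <= tolerance
            for note, time in played
        ):
            matched += 1
    return 100.0 * matched / len(reference)

reference = [(60, 0.0), (64, 0.5), (67, 1.0)]
played = [(60, 0.02), (64, 0.55), (65, 1.0)]   # last note is the wrong pitch
print(score_performance(played, reference))
```

Here the first two reference notes are matched and the third is not, so the score is two thirds of 100.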
 Next, an embodiment in which the viewer A watches a performance using the performance presentation device 1 will be described. In this embodiment, a mode of use in which a band performance can be experienced in the virtual reality space as if being watched at a real concert venue is described with reference to FIG. 4 and other figures.
 First, on the performance presentation processing terminal 12, the viewer A sets, with the input device 42 (a keyboard, mouse, controller, or the like), at what kind of concert venue and what kind of band performance he or she wants to watch. As needed, the viewer A also connects a microphone 46, penlight 48, or the like, serving as input means 40 for entering performance reaction information, i.e. his or her own reaction information, to the performance presentation processing terminal 12 (control means 14) and sets it up.
 Next, the viewer A puts on the HMD 32, which is the VR projection means 30, entering a state in which the virtual reality space can be experienced through video and sound. When the viewer A then operates the performance presentation processing terminal 12 and starts the performance, the virtual reality space set by the viewer A is projected on the HMD 32, and the chosen band performance can be watched as if the viewer were in the concert venue or other setting he or she has chosen. When the viewer A raises his or her voice along with the band performance and inputs it to the microphone 46, or expresses reactions by operating another input device 42 (a keyboard, mouse, controller, or the like) or by waving the penlight 48, performance reaction information, i.e. reaction information, is sent to the control means 14 of the performance presentation processing terminal 12 via these input means 40. The control means 14 can then recognize the viewer A as reacting with the input means 40 (for example, the microphone 46) as if in a real space, generate virtual reality data in accordance with the viewer A's reactions, and cause the HMD 32, the VR projection means 30, to project it.
 Also, by using motion capture means capable of capturing the viewer A's movements as the input device 42, the viewer A's reaction motions can be sent to the control means 14 and reflected in the virtual reality space. As the motion capture means, sensor information of the HMD 32 can be used, for example from an acceleration sensor, gyro sensor, proximity sensor, or gaze detection sensor provided in the HMD 32. The viewer A can then also watch himself or herself as a viewer in the virtual reality space from the viewpoint of a performer. Of course, the viewer A can equally be enabled to watch everything other than himself or herself in the virtual reality space from the viewpoint of a performer, of another audience member (viewer), or of yet another viewpoint.
 In the above description, the performers and the other audience members (viewers) are fictitious ones prepared in advance as virtual reality content data 22; however, as shown in FIG. 1, it is also possible to connect in real time to the player devices 10 of performers and to viewer devices 100 via the information communication network 5, and to watch as if the performance were taking place right before one's eyes while actually being in remote places, in real time. In this case, not all the performers and audience members (viewers) need be real and performing or watching: some of the performers and audience members (viewers) may be fictitious ones prepared in advance as virtual reality content data 22 as described above.
 The virtual reality data forming the virtual reality space generated while the viewer A is watching and listening can also be stored as it is, or distributed over the information communication network 5, and this virtual reality data can be stored or distributed two-dimensionally or three-dimensionally so as to be reproducible on other media.
 The present invention admits of various embodiments and modifications without departing from its broad spirit and scope. The embodiments described above are for explaining the present invention and do not limit its scope. That is, the scope of the present invention is indicated not by the embodiments but by the claims. Various modifications made within the scope of the claims, and within the scope of the meaning of an invention equivalent thereto, are regarded as falling within the scope of the present invention.
 This application is based on Japanese Patent Application No. 2016-133087 filed on July 5, 2016. The specification, claims, and entire drawings of Japanese Patent Application No. 2016-133087 are incorporated herein by reference.
 As described above, according to the present invention, it is possible to provide a performance presentation device with which, in a virtual reality space, as one of the performers or viewers together with the other performers and viewers, a performer can perceive the performance as if playing an instrument or the like in a real space, or a viewer can perceive it as if watching the performance in a real space.
DESCRIPTION OF SYMBOLS
1 Performance presentation device
5 Information communication network
10 Player device
12 Performance presentation processing terminal
14 Control means
20 Storage means
22 Virtual reality content data (VR content data)
24 Video data
24a Background/stage-set data
24b Audience data
24c Band member data
24d Player data
26 Audio data
26a Music data
26b Environmental sound data
30 VR projection means
32 HMD
40 Input means
42 Input device
44 Musical instrument
46 Microphone
48 Penlight
100 Viewer device

Claims (4)

  1.  A performance presentation device that realizes a performance in a virtual reality space, comprising:
    VR projection means for visually and aurally projecting, before the eyes of a performer or a viewer, virtual reality data for forming the virtual reality space composed of video and sound;
    virtual reality content data, which is individual content data relating to the video and sound constituting the virtual reality space to be projected by the VR projection means;
    input means for inputting performance reaction information consisting of performance information of the performer or reaction information of the viewer; and
    control means for generating, from the virtual reality content data and the performance reaction information, the virtual reality data to be projected by the VR projection means,
    wherein the control means generates virtual reality data from which the performer can perceive the performance as if playing with the input means in a real space, or from which the viewer can perceive it as if watching the performance in a real space, and causes the VR projection means to project it.
  2.  前記バーチャルリアリティコンテンツデータに、予め入力された他の演奏者又は他の視聴者の映像及び音響に係る前記コンテンツデータを含むことを特徴とする請求項1記載の演奏上演装置。 The performance presentation device according to claim 1, wherein the virtual reality content data includes the content data related to video and sound of other performers or other viewers input in advance.
  3.  前記バーチャルリアリティコンテンツデータに、リアルタイムで生成される他の演奏者又は他の視聴者の映像及び音響に係る前記コンテンツデータを含むことを特徴とする請求項1記載の演奏上演装置。 The performance presentation apparatus according to claim 1, wherein the virtual reality content data includes the content data relating to video and sound of other performers or other viewers generated in real time.
  4.  前記バーチャルリアリティコンテンツデータに、音響として予め生成された既存楽曲情報又は前記演者が演奏する生楽曲情報にマッチした雰囲気の前記バーチャルリアリティ空間を生成するための前記コンテンツデータを含むことを特徴とする請求項1記載の演奏上演装置。 The virtual reality content data includes the content data for generating the virtual reality space having an atmosphere that matches existing music information generated in advance as sound or live music information performed by the performer. Item 1. The performance performance device according to item 1.
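The architecture of claim 1 can be illustrated with a minimal sketch. All names below (VRContentData, ControlUnit, PerformanceEvent, and the dictionary keys) are hypothetical, chosen only to mirror the claimed elements: stored VR content data, an input means delivering performance response information, and a control means that merges the two into the virtual reality data handed to the VR projection means.

```python
# Hypothetical sketch of the claimed architecture; class and field names
# are illustrative and do not appear in the patent itself.
from dataclasses import dataclass

@dataclass
class VRContentData:
    """Individual video/audio content data making up the VR space (claim 1)."""
    video: dict   # e.g. background/stage set data, audience, band members
    audio: dict   # e.g. music data, environmental sound data

@dataclass
class PerformanceEvent:
    """Performance response information supplied by the input means."""
    source: str    # "performer" (performance info) or "viewer" (reaction info)
    payload: dict  # e.g. notes played on an instrument, or a penlight wave

class ControlUnit:
    """Control means: merges stored content data with live events."""
    def __init__(self, content: VRContentData):
        self.content = content

    def generate_vr_frame(self, events: list) -> dict:
        # Start from the pre-authored VR content data...
        frame = {"video": dict(self.content.video),
                 "audio": dict(self.content.audio)}
        # ...then overlay real-time performance/reaction information.
        for ev in events:
            if ev.source == "performer":
                frame["audio"]["live_performance"] = ev.payload
            else:
                frame["video"]["audience_reaction"] = ev.payload
        return frame  # virtual reality data for the VR projection means (HMD)
```

In use, each generated frame would be streamed to the performer's or viewer's head-mounted display; claims 2 and 3 correspond to whether the other participants' content inside `VRContentData` was recorded in advance or is produced in real time.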
PCT/JP2017/023262 2016-07-05 2017-06-23 Musical performance presentation device WO2018008434A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016133087A JP2018005019A (en) 2016-07-05 2016-07-05 Playing and staging device
JP2016-133087 2016-07-05

Publications (1)

Publication Number Publication Date
WO2018008434A1 true WO2018008434A1 (en) 2018-01-11

Family

ID=60912481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/023262 WO2018008434A1 (en) 2016-07-05 2017-06-23 Musical performance presentation device

Country Status (2)

Country Link
JP (1) JP2018005019A (en)
WO (1) WO2018008434A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112203114A (en) * 2020-09-07 2021-01-08 佛山创视嘉科技有限公司 Collaborative playing method, system, terminal device and storage medium

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6514397B1 (en) * 2018-06-29 2019-05-15 株式会社コロプラ SYSTEM, PROGRAM, METHOD, AND INFORMATION PROCESSING APPARATUS
JP7249859B2 (en) * 2019-04-25 2023-03-31 株式会社第一興商 karaoke system
WO2024202351A1 (en) * 2023-03-29 2024-10-03 ソニーグループ株式会社 Information processing device, information processing method, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000293172A (en) * 1999-04-05 2000-10-20 Casio Comput Co Ltd Wind instrument playing practice device and recording medium where wind instrument playing practice processing program is recorded
JP2007271698A (en) * 2006-03-30 2007-10-18 Yamaha Corp Player

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KEISUKE KIMURA ET AL.: "Pre-construction Overlay of Auditory Environment in live performance for Studio Practice", IEICE TECHNICAL REPORT, vol. 113, no. 403, 16 January 2014 (2014-01-16), pages 45 - 48, ISSN: 0913-5685 *

Also Published As

Publication number Publication date
JP2018005019A (en) 2018-01-11

Similar Documents

Publication Publication Date Title
WO2018008434A1 (en) Musical performance presentation device
US7725203B2 (en) Enhancing perceptions of the sensory content of audio and audio-visual media
US9646588B1 (en) Cyber reality musical instrument and device
JP5553446B2 (en) Amusement system
TW200951764A (en) Gesture-related feedback in electronic entertainment system
JP7465019B2 (en) Information processing device, information processing method, and information processing program
CN110915240B (en) Method for providing interactive music composition to user
JP6457326B2 (en) Karaoke system that supports transmission delay of singing voice
JP6803485B1 (en) Computer programs, methods and server equipment
US9601118B2 (en) Amusement system
WO2022163137A1 (en) Information processing device, information processing method, and program
KR101809617B1 (en) My-concert system
JP2022020625A (en) Sound processing system, sound processing device, sound processing method, and sound processing program
WO2016017622A1 (en) Reference display device, reference display method, and program
JP2013156543A (en) Posting reproducer and program
JP5486941B2 (en) A karaoke device that makes you feel like singing to the audience
JP7442979B2 (en) karaoke system
JP2011206267A (en) Game device, game progressing method, and game progressing program
JP2018028646A (en) Karaoke by venue
JP2021140065A (en) Processing system, sound system and program
WO2023084933A1 (en) Information processing device, information processing method, and program
WO2024202979A1 (en) Performance information generation method, performance information generation device, and program
JP2018036349A (en) "cherry picking" karaoke
JP2018022121A (en) Karaoke minus one
JP2022134182A (en) Video output method, video output device, and video output system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17824041

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17824041

Country of ref document: EP

Kind code of ref document: A1