
WO2025023764A1 - Electronic device for providing a virtual reality space, method for outputting a virtual vibration sound from an electronic device, and non-transitory storage medium

Info

Publication number: WO2025023764A1
Authority: WIPO (PCT)
Prior art keywords: electronic device, vibration, virtual, interaction, speaker
Application number: PCT/KR2024/010840
Other languages: English (en), Korean (ko)
Inventors: 김영중, 김상헌, 박미지, 박훈기, 안진완, 이상용, 이지우
Original Assignee: 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority claimed from KR1020230153831A (published as KR20250017109A)
Application filed by 삼성전자 주식회사
Publication of WO2025023764A1



Definitions

  • the present disclosure relates to an electronic device providing a virtual reality space, a method for outputting virtual vibration sounds from the electronic device, and a non-transitory storage medium.
  • With the development of digital technology, electronic devices are being provided in various forms such as smartphones, tablet personal computers, or personal digital assistants (PDAs). Electronic devices are also being developed in wearable forms to improve portability and user accessibility.
  • Wearable electronic devices such as augmented reality (AR) glasses, video see-through (VST) devices, and head-mounted display (HMD) devices are being developed to provide virtual spaces in virtual environments, and the various services and additional functions provided by wearable electronic devices are gradually increasing.
  • communication service providers or electronic device manufacturers are competitively developing electronic devices to provide various functions and differentiate themselves from other companies. Accordingly, the various functions provided through wearable electronic devices are also becoming increasingly advanced.
  • AR glasses or VST devices can provide users with a realistic experience by displaying virtual images while worn on the user's body.
  • AR glasses or VST devices can replace smartphones in various fields such as gaming and entertainment, education, and social networking services (SNS). Users can receive content similar to reality through AR glasses or VST devices, and can feel as if they are staying in a virtual world through interaction.
  • an electronic device includes a display, a speaker, a memory storing instructions, and at least one processor operatively connected to the display, the speaker, and the memory.
  • the instructions, when executed individually or collectively by the at least one processor, cause the electronic device to receive a user interaction by at least one input interface in a virtual reality space provided through the display.
  • the instructions, when executed individually or collectively by the at least one processor, cause the electronic device to identify a type of the user interaction.
  • the instructions, when executed individually or collectively by the at least one processor, cause the electronic device to obtain information related to a virtual object displayed in the virtual reality space based on the identified type.
  • the instructions, when executed individually or collectively by the at least one processor, cause the electronic device to control the speaker to output a virtual vibration sound corresponding to a vibration pattern generated based on the information related to the virtual object.
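  • As an illustrative aid only, the claimed control flow can be sketched in Kotlin as below; every type and function name (UserInteraction, identifyType, lookupVirtualObject, generatePattern, outputVibrationSound) is a hypothetical stand-in, since the patent does not disclose concrete APIs.

      // Minimal sketch of the claimed pipeline. All names are hypothetical.
      enum class InteractionType { CONTROLLER, HAND, MIXED, NO_HAPTIC_DEVICE }

      data class UserInteraction(val source: String)
      data class VirtualObjectInfo(val texture: String, val distanceM: Float)
      data class VibrationPattern(val frequencyHz: Float, val durationMs: Int, val intensity: Float)

      // Step 2: identify the type of the user interaction.
      fun identifyType(i: UserInteraction): InteractionType = when (i.source) {
          "controller" -> InteractionType.CONTROLLER
          "hand"       -> InteractionType.HAND
          "both"       -> InteractionType.MIXED
          else         -> InteractionType.NO_HAPTIC_DEVICE
      }

      // Step 3: obtain information related to the targeted virtual object,
      // based on the identified type. A real device would query the scene
      // graph here; this is a stub.
      fun lookupVirtualObject(i: UserInteraction, type: InteractionType): VirtualObjectInfo =
          VirtualObjectInfo(texture = "wood", distanceM = 1.2f)

      // Step 4: generate a vibration pattern from the object information.
      fun generatePattern(info: VirtualObjectInfo): VibrationPattern =
          VibrationPattern(frequencyHz = 180f, durationMs = 40, intensity = 0.8f)

      // Step 5: output the corresponding virtual vibration sound.
      fun outputVibrationSound(p: VibrationPattern) =
          println("speaker: ${p.frequencyHz} Hz for ${p.durationMs} ms at ${p.intensity}")

      fun onUserInteraction(i: UserInteraction) {  // Step 1: receive interaction
          val type = identifyType(i)
          val info = lookupVirtualObject(i, type)
          outputVibrationSound(generatePattern(info))
      }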
  • a method of operating an electronic device includes receiving a user interaction by at least one input interface in a virtual reality space provided through a display of the electronic device.
  • the method includes an operation of identifying a type of the user interaction.
  • the method includes an operation of obtaining information related to a virtual object displayed in the virtual reality space based on the identified type.
  • the method includes an operation of outputting, through a speaker of the electronic device, a virtual vibration sound corresponding to a vibration pattern generated based on the information related to the virtual object.
  • a non-transitory storage medium storing one or more programs, wherein the one or more programs include executable instructions that, when executed by at least one processor of an electronic device, cause the electronic device to perform an operation of receiving a user interaction by at least one input interface in a virtual reality space provided through a display of the electronic device.
  • the one or more programs include executable instructions that, when executed by at least one processor of the electronic device, cause the electronic device to perform an operation of identifying a type of the user interaction.
  • the one or more programs include executable instructions that, when executed by at least one processor of the electronic device, cause the electronic device to perform an operation of obtaining information related to a virtual object displayed in the virtual reality space based on the identified type.
  • the one or more programs include executable instructions that, when executed by at least one processor of the electronic device, cause the electronic device to perform an operation of outputting a virtual vibration sound corresponding to a vibration pattern generated based on information related to the virtual object through a speaker of the electronic device.
  • FIG. 1 is a block diagram of an electronic device within a network environment according to various embodiments.
  • FIG. 2 is a diagram showing the configuration of a wearable electronic device according to one embodiment.
  • FIGS. 3A, 3B, and 3C are drawings showing the front and back of a wearable electronic device according to one embodiment.
  • FIG. 4 is a diagram showing the configuration of an external electronic device according to one embodiment.
  • FIG. 5 is a diagram showing the configuration of an electronic device according to one embodiment.
  • FIGS. 6A, 6B, 6C, and 6D are diagrams illustrating examples of receiving user interaction by an external device or hand gesture in an electronic device according to one embodiment.
  • FIGS. 7A, 7B, and 7C are diagrams illustrating examples of user interaction types according to one embodiment.
  • FIG. 9 is a diagram showing an example configuration of a software module of an electronic device according to various embodiments.
  • FIG. 10 is a drawing showing an example of an operating method in an electronic device according to one embodiment.
  • FIG. 11 is a diagram illustrating an example for obtaining information related to a virtual object in an electronic device according to one embodiment.
  • FIGS. 12A and 12B are diagrams illustrating examples for obtaining information related to a virtual object in an electronic device according to one embodiment.
  • FIG. 13 is a drawing showing an example of an operating method in an electronic device according to one embodiment.
  • FIGS. 15A, 15B, 15C, and 15D are diagrams illustrating examples of vibration patterns generated in an electronic device according to one embodiment.
  • FIGS. 17A and 17B are diagrams illustrating an example of synchronizing a virtual vibration sound output through a speaker and a vibration output through a haptic motor in an electronic device according to one embodiment.
  • FIG. 18 is a diagram illustrating an example of operation in a low power mode in an electronic device according to one embodiment.
  • FIG. 1 is a block diagram of an electronic device (101) in a network environment (100) according to various embodiments.
  • the electronic device (101) may communicate with the electronic device (102) via a first network (198) (e.g., a short-range wireless communication network) or may communicate with at least one of the electronic device (104) or the server (108) via a second network (199) (e.g., a long-range wireless communication network).
  • the electronic device (101) may communicate with the electronic device (104) via the server (108).
  • the electronic device (101) may include a processor (120), a memory (130), an input module (150), an audio output module (155), a display module (160), an audio module (170), a sensor module (176), an interface (177), a connection terminal (178), a haptic module (179), a camera module (180), a power management module (188), a battery (189), a communication module (190), a subscriber identification module (196), or an antenna module (197).
  • the electronic device (101) may omit at least one of these components (e.g., the connection terminal (178)), or may have one or more other components added.
  • some of these components (e.g., the sensor module (176), the camera module (180), or the antenna module (197)) may be integrated into one component (e.g., the display module (160)).
  • the processor (120) may control at least one other component (e.g., a hardware or software component) of the electronic device (101) connected to the processor (120) by executing, for example, software (e.g., a program (140)), and may perform various data processing or calculations. According to one embodiment, as at least a part of the data processing or calculations, the processor (120) may store a command or data received from another component (e.g., a sensor module (176) or a communication module (190)) in the volatile memory (132), process the command or data stored in the volatile memory (132), and store result data in the nonvolatile memory (134).
  • the processor (120) may include a main processor (121) (e.g., a central processing unit or an application processor) or an auxiliary processor (123) (e.g., a graphic processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently or together therewith.
  • the auxiliary processor (123) may be configured to use less power than the main processor (121) or to be specialized for a designated function.
  • the auxiliary processor (123) may be implemented separately from the main processor (121) or as a part thereof.
  • the auxiliary processor (123) may control at least a portion of functions or states associated with at least one of the components of the electronic device (101) (e.g., the display module (160), the sensor module (176), or the communication module (190)), for example, on behalf of the main processor (121) while the main processor (121) is in an inactive (e.g., sleep) state, or together with the main processor (121) while the main processor (121) is in an active (e.g., application execution) state.
  • the auxiliary processor (123) (e.g., an image signal processor or a communication processor) may be implemented as a part of another functionally related component, and may include a hardware structure specialized for processing an artificial intelligence model.
  • the artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-networks, or a combination of two or more of the above, but is not limited to the examples described above.
  • the artificial intelligence model may additionally or alternatively include a software structure.
  • the input module (150) can receive commands or data to be used in a component of the electronic device (101) (e.g., a processor (120)) from an external source (e.g., a user) of the electronic device (101).
  • the input module (150) can include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
  • the audio output module (155) can output an audio signal to the outside of the electronic device (101).
  • the audio output module (155) can include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive an incoming call. According to one embodiment, the receiver can be implemented separately from the speaker or as a part thereof.
  • the sensor module (176) can detect an operating state (e.g., power or temperature) of the electronic device (101) or an external environmental state (e.g., user state) and generate an electric signal or data value corresponding to the detected state.
  • the sensor module (176) can include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface (177) may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device (101) with an external electronic device (e.g., the electronic device (102)).
  • the interface (177) may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the camera module (180) can capture still images and moving images.
  • the camera module (180) can include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module (188) can manage power supplied to the electronic device (101).
  • the power management module (188) can be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery (189) can power at least one component of the electronic device (101).
  • the battery (189) can include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module (190) may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device (101) and an external electronic device (e.g., the electronic device (102), the electronic device (104), or the server (108)), and performance of communication through the established communication channel.
  • the communication module (190) may operate independently from the processor (120) (e.g., the application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module (190) may include a wireless communication module (192) (e.g., a cellular communication module, a short-range wireless communication module, or a GNSS (global navigation satellite system) communication module) or a wired communication module (194) (e.g., a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module may communicate with an external electronic device (104) via a first network (198) (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network (199) (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • the wireless communication module (192) may use subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module (196) to identify or authenticate the electronic device (101) within a communication network such as the first network (198) or the second network (199).
  • the wireless communication module (192) can support a 5G network and next-generation communication technology beyond 4G, for example, new radio (NR) access technology.
  • the NR access technology can support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), terminal power minimization and connection of multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module (192) can support, for example, a high-frequency band (e.g., mmWave band) to achieve a high data transmission rate.
  • the wireless communication module (192) may support various technologies for securing performance in a high-frequency band, such as beamforming, massive multiple-input and multiple-output (MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna.
  • the wireless communication module (192) may support various requirements specified in an electronic device (101), an external electronic device (e.g., an electronic device (104)), or a network system (e.g., a second network (199)).
  • the wireless communication module (192) can support a peak data rate (e.g., 20 Gbps or more) for eMBB realization, a loss coverage (e.g., 164 dB or less) for mMTC realization, or a U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or 1 ms or less for a round trip) for URLLC realization.
  • the antenna module (197) can transmit or receive signals or power to or from the outside (e.g., an external electronic device).
  • the antenna module (197) can include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • the antenna module (197) can include a plurality of antennas (e.g., an array antenna).
  • at least one antenna suitable for a communication method used in a communication network, such as the first network (198) or the second network (199), can be selected from the plurality of antennas by, for example, the communication module (190).
  • a signal or power can be transmitted or received between the communication module (190) and the external electronic device through the selected at least one antenna.
  • according to one embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as a part of the antenna module (197).
  • the antenna module (197) may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC positioned on or adjacent a first side (e.g., a bottom side) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) positioned on or adjacent a second side (e.g., a top or side) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above components can be connected to each other and exchange signals (e.g., commands or data) with each other via a communication method between peripheral devices (e.g., a bus, GPIO (general purpose input and output), SPI (serial peripheral interface), or MIPI (mobile industry processor interface)).
  • commands or data may be transmitted or received between the electronic device (101) and an external electronic device (104) via a server (108) connected to a second network (199).
  • each of the external electronic devices (102, 104) may be a device of the same type as, or a different type from, the electronic device (101).
  • all or part of the operations executed in the electronic device (101) may be executed in one or more of the external electronic devices (102, 104, or 108). For example, when the electronic device (101) is to perform a certain function or service automatically or in response to a request from a user or another device, the electronic device (101) may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device (101).
  • the electronic device (101) may process the result as-is or additionally process it, and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device (101) may provide an ultra-low latency service by using, for example, distributed computing or mobile edge computing.
  • the external electronic device (104) may include an IoT (Internet of Things) device.
  • the server (108) may be an intelligent server using machine learning and/or a neural network.
  • the external electronic device (104) or the server (108) may be included in the second network (199).
  • the electronic device (101) can be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • the term "virtual object" in the present disclosure can be explained as a general meaning of an object representing an execution screen of one or more applications displayed in a virtual reality space, an object included in the execution screen, an object related to a menu or function, an object for at least one input interface, or various other objects displayed in addition to the object.
  • the virtual object described in the present disclosure means including at least one of an object representing an execution screen of one or more applications displayed in a virtual reality space, an object included in the execution screen, an object related to a menu or function, an object for at least one input interface, or various other objects displayed in addition to the object.
  • the object is described in the singular for the convenience of explanation, but the singular form of the noun corresponding to the object may include one or more.
  • the term "virtual reality space" in the present disclosure may mean a virtual space provided in a virtual environment using various virtual reality technologies (e.g., extended reality (XR), virtual reality (VR), or augmented reality (AR)).
  • FIG. 2 is a diagram showing the configuration of a wearable electronic device according to one embodiment.
  • a wearable electronic device (200) (e.g., the electronic device (101) of FIG. 1) according to one embodiment of the present disclosure may include at least one of a light output module (211), a display member (201), a camera module (250), and a speaker (261).
  • the light output module (211) may include a light source capable of outputting an image, and a lens that guides the image to the display member (201).
  • the light output module (211) may include at least one of a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS), an organic light emitting diode (OLED), or a micro light emitting diode (micro LED).
  • the display member (201) may include an optical waveguide (e.g., a waveguide).
  • an output image of the optical output module (211) incident on one end of the optical waveguide may be propagated inside the optical waveguide and provided to a user.
  • the optical waveguide may include at least one diffractive element (e.g., a diffractive optical element (DOE), a holographic optical element (HOE)) or at least one reflective element (e.g., a reflective mirror).
  • the optical waveguide may guide an output image of the optical output module (211) to a user's eyes by using at least one diffractive element or reflective element.
  • the camera module (250) (e.g., the camera module (180) of FIG. 1) can capture still images and/or moving images.
  • the camera module (250) is disposed within a lens frame and can be disposed around the display member (201).
  • the first camera module (251) can capture and/or recognize the trajectory of the user's eye (e.g., pupil, iris) or gaze. According to one embodiment of the present disclosure, the first camera module (251) can periodically or aperiodically transmit information related to the trajectory of the user's eye or gaze (e.g., trajectory information) to a processor (e.g., processor (120) of FIG. 1).
  • the second camera module (253) can capture an external image.
  • the third camera module (255) can be used for hand detection and tracking, and user gesture (e.g., hand movement) recognition.
  • the third camera module (255) can be used for 3 degrees of freedom (3DoF) or 6DoF head tracking, position (space, environment) recognition, and/or movement recognition.
  • the second camera module (253) can also be used for hand detection and tracking, and user gesture recognition.
  • at least one of the first camera module (251) to the third camera module (255) can be replaced with a sensor module (e.g., a LiDAR sensor).
  • the sensor module can include at least one of a VCSEL (vertical cavity surface emitting laser), an infrared sensor, and/or a photodiode.
  • FIGS. 3A, 3B, and 3C are drawings showing the front and back of a wearable electronic device according to one embodiment.
  • camera modules (311, 312, 313, 314, 315, 316) and/or depth sensors (317) may be arranged on a first surface (310) of the housing to obtain information related to the surrounding environment of a wearable electronic device (300) (e.g., electronic device (101) of FIG. 1 ).
  • the camera modules (311, 312) can acquire images related to the environment surrounding the wearable electronic device.
  • the camera modules (313, 314, 315, 316) can acquire images while the wearable electronic device is worn by a user.
  • the camera modules (313, 314, 315, 316) can be used for hand detection and tracking, and recognition of user gestures (e.g., hand movements).
  • the camera modules (313, 314, 315, 316) can be used for 3DoF, 6DoF head tracking, position (spatial, environmental) recognition, and/or movement recognition.
  • the camera modules (311, 312) can also be used for hand detection and tracking, and user gesture recognition.
  • the depth sensor (317) may be configured to transmit a signal and receive a signal reflected from a subject, and may be used for purposes such as time of flight (TOF) to determine the distance to an object.
  • the camera modules (313, 314, 315, 316) may determine the distance to an object.
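  • For reference, the relation underlying such a TOF measurement is d = c·Δt/2, since the emitted signal traverses the distance twice. A minimal Kotlin sketch (an illustration, not part of the disclosure):

      // Round-trip time-of-flight: the signal travels to the subject and
      // back, so distance = (propagation speed * elapsed time) / 2.
      const val SPEED_OF_LIGHT_M_S = 2.998e8

      fun tofDistanceMeters(roundTripSeconds: Double): Double =
          SPEED_OF_LIGHT_M_S * roundTripSeconds / 2.0

      fun main() {
          println(tofDistanceMeters(10.0e-9))  // a 10 ns round trip is about 1.5 m
      }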
  • face recognition camera modules (325, 326) and/or a display (321) (and/or a lens) may be arranged on the second side (320) of the housing.
  • the facial recognition camera modules (325, 326) adjacent to the display may be used to recognize the user's face, or may recognize and/or track the user's two eyes.
  • the display (321) (and/or lens) may be disposed on the second side (320) of the wearable electronic device (300).
  • the wearable electronic device (300) may not include camera modules (315, 316) among the plurality of camera modules (313, 314, 315, 316).
  • the wearable electronic device (300) may further include at least one of the configurations illustrated in FIG. 2 .
  • a wearable electronic device (300) may have a form factor for being worn on a user's head (e.g., HMD: Head Mounted Display).
  • the wearable electronic device (300) may further include a strap for being fixed on a body part of the user, and/or a wearing member.
  • the wearable electronic device (300) may include a volume button (331), a ventilation hole (333), a status indicator light (335), and a power button (e.g., including a fingerprint recognition sensor) (337), and such configurations may be equally included in the wearable electronic device (300) illustrated in FIGS. 3A and 3B.
  • a wearable electronic device (300) configured in the form of an HMD may include components identical or similar to the components of FIGS. 3A and 3B described above.
  • the speaker (318) (e.g., the audio output module (155) of FIG. 1 or the speaker (261) of FIG. 2) can output an audio signal (e.g., a sound and/or a virtual vibration sound).
  • the speaker (318) has been described as being disposed at a location adjacent to the ventilation hole (333) in FIGS. 3A to 3C, but is not limited thereto and may be disposed at a different location depending on the implementation of the wearable electronic device (300).
  • FIG. 4 is a diagram showing the configuration of an external electronic device according to one embodiment.
  • an electronic device (e.g., the electronic device (101) of FIG. 1, the wearable electronic device (200) of FIG. 2, and/or the wearable electronic device (300) of FIGS. 3A to 3C) providing a virtual reality space according to one embodiment of the present disclosure may be connected to at least one controller (401, 403) that transmits and receives control signals and information according to function execution through a designated communication method (e.g., BLE communication).
  • At least one controller may be an input interface (e.g., an external electronic device) including a haptic motor (e.g., a vibration motor), such as a pointing device or an input control device, and may transmit user interaction with respect to a virtual object displayed in a virtual reality space to the electronic device.
  • the input interface that transmits the user interaction to the electronic device may be replaced with at least one of a keyboard, a mouse, an electronic pen, a smart phone, or an artificial intelligence (AI) device.
  • At least one controller (401, 403) may include a sensor module (411), an input module (413), a battery (not shown), and an indicator member (415) (e.g., a status indicator light and/or an infrared LED).
  • at least one controller (401, 403) may have a housing (410) divided into a first housing area (410a) and a second housing area (410b); the sensor module (411), the input module (413), and the battery may be arranged in the first housing area (410a), and the indicator member (415) may be arranged in the second housing area (410b). The arrangement is not limited thereto, and the components may be arranged differently.
  • the sensor module (411) may include at least one sensor (e.g., a proximity sensor, an acceleration sensor, and/or a gyro sensor) for detecting a grip state by a user's hand and movement of the controller (401, 403).
  • the input module (413) may include a plurality of buttons for inputting user interactions related to driving of the at least one controller (401, 403) and to a function or object executed in a virtual reality space displayed on an electronic device (200).
  • an electronic device (501) may be worn by a user.
  • the electronic device (200) may be implemented as either augmented reality (AR) glasses or a video see-through (VST) device.
  • the electronic device (501) may include a processor (510), a display (520), a camera (530), a communication circuit (540), a speaker (550), and a memory (560).
  • the processor (510), the display (520), the camera (530), the communication circuit (540), the speaker (550), and the memory (560) of the electronic device (501) may be electrically and/or operatively connected to each other through a communication bus (not shown).
  • the processor (510) can control the communication circuit (540) to connect with at least one input interface using a short-range communication technology (e.g., Bluetooth low energy (BLE) or WiFi) or wired communication, and can receive a user interaction by the at least one input interface.
  • the at least one input interface can be at least one external electronic device including a haptic motor (e.g., the first controller (401) or the second controller (403) of FIGS. 4 and 6A), a part of the user's body (e.g., the hand (601) of FIG. 6C), or at least one external electronic device that does not include a haptic motor (e.g., the keyboard (603)).
  • the processor (510) can receive the user interaction from the at least one external electronic device including a haptic motor or the at least one external electronic device that does not include a haptic motor through the communication circuit (540).
  • the processor (510) can detect a gesture by a hand (601) or an external electronic device (e.g., the first controller (401) or the second controller (403) of FIGS. 4 and 6A) through a camera (530) or at least one sensor (e.g., a motion sensor), and identify (e.g., receive or obtain) a user interaction corresponding to the detected gesture.
  • the processor (510) can check whether a vibration mode is set.
  • the electronic device (501) can display a setting screen for setting a vibration mode in the virtual reality space; through the screen, at least one of a sound mode, a vibration (e.g., system vibration) mode, a silent mode, or a system mode can be set, the vibration intensity can be adjusted, and specific functions can be set by item (e.g., vibration, alarm, system, or media).
  • the sound mode can be a mode in which an audio source is played from a speaker (550) of the electronic device (501).
  • the electronic device (501) does not have a haptic motor and does not generate physical vibration, so that when an object is operated in a virtual environment, a physical haptic vibration can be generated in an external electronic device.
  • the vibration mode may be a mode in which an audio source is not played from the speaker (550), and when an object is operated in a virtual environment, a physical haptic vibration can be generated in the external electronic device.
  • the silent mode may be a mode in which an audio source is not played from the speaker (550), and when an object is operated in a virtual environment, a physical haptic vibration may or may not be generated in the external electronic device according to a user setting.
  • the electronic device may provide a system mode.
  • a touch feedback interaction operation may be generated from a haptic motor (e.g., a haptic actuator of the controller) of the external electronic device.
  • a dial keypad interaction operation may be generated from a haptic motor (e.g., a haptic actuator of the controller) of the external electronic device.
  • a keyboard vibration feedback interaction operation may occur in a haptic actuator (motor) of the controller.
  • a gesture feedback interaction operation (e.g., grip, pinch, click, or zoom in/out) may occur in a haptic actuator (motor) of the controller.
  • the electronic device (501) may provide a personalized vibration method according to, for example, a vibration setting condition; for example, in a low power mode of the external electronic device (e.g., battery 10% or less), the external electronic device may not provide a vibration mode.
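  • A minimal sketch, assuming a simple settings model, of how the modes and the low-power rule described above might be represented; the names, and treating the 10% figure as a hard cutoff, are illustrative assumptions:

      // Hypothetical settings model for the modes described above.
      enum class VibrationMode { SOUND, VIBRATION, SILENT, SYSTEM }

      data class VibrationSettings(
          val mode: VibrationMode,
          val intensity: Float,                // user-adjustable, 0.0..1.0
          val hapticOnSilent: Boolean = false  // per-user choice in silent mode
      )

      // Whether haptic feedback should be sent to the external controller.
      fun shouldSendHaptics(s: VibrationSettings, controllerBatteryPct: Int): Boolean {
          if (controllerBatteryPct <= 10) return false   // low power mode: no vibration
          return when (s.mode) {
              VibrationMode.SOUND     -> true            // speaker audio + external haptics
              VibrationMode.VIBRATION -> true            // no speaker audio, haptics only
              VibrationMode.SILENT    -> s.hapticOnSilent
              VibrationMode.SYSTEM    -> true            // per-interaction system feedback
          }
      }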
  • the processor (510) can identify a type (e.g., mode) of a user interaction based on reception of at least one user interaction.
  • the processor (510) can identify the user interaction as a first type (e.g., first mode) as illustrated in FIG. 7A if the user interaction is, for example, an interaction by the first controller (401) or the second controller (403).
  • the processor (510) can identify the user interaction as a second type (e.g., second mode) as illustrated in FIG. 7B if the user interaction is, for example, an interaction by the hand (601).
  • the processor (510) can identify the user interaction as a third type (e.g., third mode) as illustrated in FIG. 7C.
  • the processor (510) may identify the user interaction as an n-th type (e.g., n-th mode) if the user interaction is an interaction by an external electronic device (e.g., a keyboard, a mouse, an electronic pen, a smartphone, an AI device, or a watch) that does not include a haptic motor, for example.
  • the processor (510) may preset the first through n-th types according to the type of the input interface, and may also preset additional types (e.g., modes) for other types of input interface.
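  • A hypothetical mapping from input interface to interaction type, for illustration only; the actual type table, and the exact trigger for the third type, are not fully specified in this excerpt:

      sealed interface InputInterface
      data class HapticController(val id: Int) : InputInterface
      object Hand : InputInterface
      data class ControllerAndHand(val id: Int) : InputInterface
      data class NonHapticDevice(val name: String) : InputInterface  // keyboard, mouse, pen, ...

      fun interactionType(src: InputInterface): Int = when (src) {
          is HapticController  -> 1  // first type (FIG. 7A)
          is Hand              -> 2  // second type (FIG. 7B)
          is ControllerAndHand -> 3  // third type (FIG. 7C)
          is NonHapticDevice   -> 4  // n-th type: no haptic motor available
      }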
  • the processor (510) may identify the occurrence of a passive interaction event or an active interaction event based on the user interaction type, depending on the received user interaction.
  • the passive interaction event is an event that does not reflect the user's will, and may be, for example, an event related to a phone call, a notification, video or music playback, or vibration due to a concentration mode; for example, a physical collision may occur between a first virtual object and a second virtual object.
  • the electronic device (501) may determine or generate different vibration patterns according to the texture of the surface, and output different virtual vibration sounds for the first virtual object and the second virtual object according to the texture of the surface.
  • an active interaction event is an event that reflects a user's will, and may be an event related to vibration of a motion for a virtual object corresponding to a user interaction received from at least one external electronic device (e.g., a first controller (401) and/or a second controller (403)) or a user's hand.
  • the interaction motion may mean an interaction for a virtual object displayed in a virtual reality space (610) corresponding to a motion input (e.g., a button input) by at least one external electronic device (e.g., the first controller (401) and/or the second controller (403)) and a motion input (e.g., a gesture input) by a user's hand.
  • the event may mean information on an action received according to the interaction, and for example, when the electronic device (501) receives a user interaction for a button input according to the user's will, it may indicate that a vibration output for a virtual object corresponding to the user interaction is required.
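  • A hedged sketch of the texture-dependent pattern selection described above for collision events; the specific textures and frequency/duration values are invented for illustration:

      // Invented texture table: softer surfaces get lower-frequency,
      // longer virtual vibration sounds.
      data class TexturePattern(val frequencyHz: Float, val durationMs: Int)

      val textureTable = mapOf(
          "metal"  to TexturePattern(frequencyHz = 240f, durationMs = 30),
          "wood"   to TexturePattern(frequencyHz = 160f, durationMs = 50),
          "fabric" to TexturePattern(frequencyHz = 90f,  durationMs = 80)
      )

      fun patternForTexture(texture: String): TexturePattern =
          textureTable[texture] ?: TexturePattern(frequencyHz = 180f, durationMs = 40)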
  • the processor (510) can generate a vibration pattern based on the acquired information, and control the speaker (550) to output a virtual vibration sound corresponding to the generated vibration pattern.
  • the processor (510) can control the communication circuit (540) to transmit or not transmit feedback information including the vibration pattern to an external electronic device (e.g., the first controller (401) or the second controller (403)) including a haptic motor so as to output vibration corresponding to the generated vibration pattern through the haptic motor according to the identified type.
  • the processor (510) can check the distances of other objects displayed in a virtual reality space based on the distance between the user and the object, set the vibration intensity according to the checked distance, and generate the vibration pattern with the set vibration intensity.
  • the vibration pattern can be generated by applying elements for at least one of the frequency value (e.g., vibration intensity), the vibration time, the frequency type (e.g., resonant frequency (F(0)), low frequency (F(low)), medium frequency (F(mid)), high frequency (F(high)), or alert frequency (F(alert))), or the vibration interval.
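  • The pattern elements listed above suggest a straightforward synthesis of the virtual vibration sound as amplitude-scaled tone bursts separated by silent intervals. A minimal sketch, with assumed parameter names and an assumed burst shape:

      import kotlin.math.PI
      import kotlin.math.sin

      data class VibPattern(
          val frequencyHz: Double,  // e.g., F(0), F(low), F(mid), F(high), F(alert)
          val intensity: Double,    // 0.0..1.0 (the "frequency value" element)
          val burstMs: Int,         // vibration time per burst
          val intervalMs: Int,      // vibration interval (silence between bursts)
          val bursts: Int
      )

      fun synthesize(p: VibPattern, sampleRate: Int = 48_000): ShortArray {
          val burstSamples = sampleRate * p.burstMs / 1000
          val gapSamples = sampleRate * p.intervalMs / 1000
          val pcm = ShortArray(p.bursts * burstSamples + (p.bursts - 1) * gapSamples)
          var idx = 0
          repeat(p.bursts) { b ->
              for (i in 0 until burstSamples) {
                  val s = p.intensity * sin(2.0 * PI * p.frequencyHz * i / sampleRate)
                  pcm[idx++] = (s * Short.MAX_VALUE).toInt().toShort()
              }
              if (b < p.bursts - 1) idx += gapSamples  // leave the interval as silence
          }
          return pcm
      }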
  • the processor (510) may identify that the type of the user interaction is the first type and, based on the identification of the first type, control the communication circuit (540) to transmit, to the external electronic device, feedback information including a vibration pattern generated so that vibration (e.g., haptic vibration) (803, 805) is output via the haptic motor while a virtual vibration sound (801) is output, as illustrated in FIG. 8A.
  • the processor (510) may set an output condition (e.g., output distribution) to synchronize the virtual vibration sound and the vibration so as to output the virtual vibration sound and the vibration simultaneously.
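  • A hedged sketch of this synchronization idea: schedule the local speaker playback and the controller haptic against one shared start time, compensating for the transport delay of the link. The latency figure is an illustrative assumption, not a disclosed value:

      data class FeedbackInfo(val patternId: Int, val startAtMs: Long)

      const val ASSUMED_BLE_LATENCY_MS = 40L

      fun dispatchSynchronized(
          nowMs: Long,
          sendToController: (FeedbackInfo) -> Unit,
          playOnSpeakerAt: (Long) -> Unit
      ) {
          val startAt = nowMs + ASSUMED_BLE_LATENCY_MS  // give the link time to deliver
          sendToController(FeedbackInfo(patternId = 1, startAtMs = startAt))
          playOnSpeakerAt(startAt)                      // local playback waits as well
      }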
  • the processor (510) may control the speaker (550) to output only a virtual vibration sound (801), without transmitting feedback information to an external electronic device (e.g., the first controller (401) and/or the second controller (403)) so as not to output vibration through the haptic motor, when the type of the user interaction is the second type, as illustrated in FIG. 8B.
  • the processor (510) may output a virtual vibration sound through the speaker (550) and transmit feedback information through an external electronic device (e.g., the first controller (401) or the second controller (403)).
  • the feedback information may include a vibration pattern different from the virtual vibration sound.
  • the processor (510) may identify it as the third type.
  • the processor (510) may transmit feedback information including a first vibration pattern generated in response to a user interaction (e.g., a first interaction) to the first controller (401) upon identifying the third type, and may simultaneously or sequentially output a virtual vibration sound corresponding to a second vibration pattern generated in response to the user interaction (e.g., a second interaction) through the speaker (550).
  • the processor (510) may control the speaker (550) arranged on the right member to output a virtual vibration sound corresponding to the user's right-hand direction.
  • a virtual vibration sound corresponding to the first vibration pattern may be simultaneously output through a speaker (550), and then only a virtual vibration sound corresponding to the second vibration pattern generated in response to the second interaction may be output through the speaker (550).
  • the processor (510) may generate a vibration pattern in response to the received user interaction when the type of the user interaction is an n-th type interaction by an external electronic device (e.g., a keyboard, a mouse, or an electronic pen) that does not include a haptic motor, and output only a virtual vibration sound corresponding to the generated vibration pattern through the speaker (550).
  • the processor (510) may adjust the intensity of vibration output through the haptic motor of the first controller (401) or the second controller (403) and/or the intensity of virtual vibration sound output through the speaker (550) based on information related to a virtual object provided in the virtual reality space.
  • the processor (510) may adjust the intensity of the vibration and/or the intensity of the virtual vibration sound when outputting them with the generated vibration pattern, or may output them with a vibration pattern that does not consider a change in the position of the user or the position of the object.
  • the processor (510) may adjust the intensity of vibration and/or the intensity of virtual vibration based on a change in the position of the user or the position of the object (e.g., the distance between the user and the object or the direction of the object based on the position of the user) when outputting the intensity of vibration and/or the intensity of virtual vibration with the generated vibration pattern.
  • the processor (510) may adjust the intensity of the vibration output or the intensity of the virtual vibration sound output in proportion to the distance between the object displayed in the virtual reality space and the user. For example, when the distance decreases, the processor (510) may weaken the output intensity in proportion to the distance (e.g., adjust it below the set vibration intensity value or virtual vibration sound intensity value). When the distance increases, the processor (510) may strengthen the output intensity in proportion to the distance (e.g., adjust it above the set vibration intensity value or virtual vibration sound intensity value).
  • alternatively, the processor (510) may adjust the intensity of the vibration output or the intensity of the virtual vibration sound output in inverse proportion to the distance between the object displayed in the virtual reality space and the user. For example, when the distance decreases, the processor (510) can strengthen the output intensity in inverse proportion to the distance (e.g., adjust it above the set values), and when the distance increases, the processor (510) can weaken the output intensity in inverse proportion to the distance (e.g., adjust it below the set values).
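  • Both distance policies can be captured in one helper; the reference distance and the clamping bounds are invented for illustration:

      // inverse = false: closer -> weaker, farther -> stronger (proportional).
      // inverse = true:  closer -> stronger, farther -> weaker.
      fun scaledIntensity(
          base: Float, distanceM: Float, inverse: Boolean, referenceM: Float = 1.0f
      ): Float {
          val ratio = (distanceM / referenceM).coerceAtLeast(0.01f)
          val factor = if (inverse) 1f / ratio else ratio
          return (base * factor).coerceIn(0f, 1f)
      }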
  • the processor (510) may adjust the intensity of the output vibration or the intensity of the output virtual vibration sound based on the direction of the object displayed in the virtual reality space.
  • the processor (510) may identify the direction of the object based on the current location of the user. For example, if the speakers (550) are respectively placed on the left and right members of the electronic device (501), the processor (510) may adjust the intensity of the virtual vibration sound output through the speaker (550) placed in the direction corresponding to the direction of the identified object to be greater than a specified threshold value (e.g., the intensity of the virtual vibration sound being output or the intensity of the virtual vibration sound set in the vibration pattern) or greater than the intensity of the virtual vibration sound output through the speaker (550) placed in the opposite direction.
  • the intensity of the virtual vibration sound output through the speaker (550) placed in the opposite direction may be adjusted to be lower than the intensity of the virtual vibration sound output through the speaker (550) placed in the direction corresponding to the direction of the identified object.
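  • A hedged sketch of this direction-dependent gain: constant-power panning is one common way to make the speaker on the object's side louder than the opposite one, though the text only requires louder/quieter:

      import kotlin.math.cos
      import kotlin.math.sin

      // Object azimuth in radians: 0 = straight ahead, +pi/2 = fully right.
      // Returns (leftGain, rightGain); the side facing the object is louder.
      fun speakerGains(azimuthRad: Double): Pair<Double, Double> {
          val pan = (azimuthRad / Math.PI + 0.5).coerceIn(0.0, 1.0)  // 0 = left, 1 = right
          val left = cos(pan * Math.PI / 2)
          val right = sin(pan * Math.PI / 2)
          return left to right
      }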
  • the processor (510) can adjust the intensity of the vibration output through the first controller (401), held by the hand in the direction corresponding to the direction of the identified object, to be greater than a specified threshold value (e.g., the intensity of the vibration being output or the intensity set in the vibration pattern) or greater than the intensity of the vibration output through the second controller (403) held by the hand in the opposite direction, and can transmit feedback information including the adjusted vibration intensity value to the first controller (401).
  • the processor (510) may finely adjust the vibration intensity by adjusting complex elements (e.g., a specified vibration intensity value and an amplitude value) according to the purpose of an alert according to a notification event or an event that plays audio (e.g., outputs music, sound of video content, or sound of a game).
  • the processor (510) may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide an alert for an incoming call when a call comes in.
  • the processor (510) may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide a notification alert for functions or applications provided in a virtual reality space.
  • the processor (510) may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide an alert for a user interaction, such as touching, dragging, or touching and releasing a virtual object provided in a virtual reality space.
  • the electronic device (501) may provide a sound for the alert along with a vibration or virtual vibration sound to increase the user's awareness of the occurrence of an event.
  • the processor (510) may identify a grip interaction when the user is holding and using the first controller (401) and/or the second controller (403), set the vibration intensity to a threshold value (e.g., the vibration intensity set in the generated vibration pattern or the reference vibration intensity set in the device) or to an intensity lower than that of the virtual vibration sound, and control the communication circuit (540) to transmit feedback information including the lowered vibration intensity to the first controller (401) and/or the second controller (403).
  • when the processor (510) identifies that the first controller (401) and/or the second controller (403) is in a low power mode, the processor (510) can transmit feedback information to control the haptic motor of the first controller (401) and/or the second controller (403) not to output vibration, and can control the speaker (550) to output only a virtual vibration sound.
  • the processor (510) can determine whether the first controller (401) and/or the second controller (403) are in use. In one embodiment, the processor (510) can determine that the first controller (401) and/or the second controller (403) are not in use if it is determined that the first controller (401) and/or the second controller (403) has not moved for a specified period of time. In one embodiment, the processor (510) can determine whether the first controller (401) and/or the second controller (403) are in use based on sensing information acquired through a sensor included in the first controller (401) and/or the second controller (403). For example, the sensing information may include acceleration sensor information and/or gyro sensor information.
  • the processor (510) may determine that the first controller (401) and/or the second controller (403) are not in use if the first controller (401) and/or the second controller (403) are confirmed to be placed on the ground through the camera (530). According to one embodiment, the processor (510) may determine whether the first controller (401) and/or the second controller (403) is gripped by the user based on sensing information acquired through a grip sensor included in the first controller (401) and/or the second controller (403).
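  • A minimal sketch of the "not in use" check from accelerometer data, assuming an idle timeout and a motion threshold (both values invented for illustration):

      import kotlin.math.abs
      import kotlin.math.sqrt

      // If the accelerometer magnitude stays near gravity (no motion) for a
      // timeout, treat the controller as not in use.
      const val GRAVITY_M_S2 = 9.81f
      const val MOTION_EPS_M_S2 = 0.15f
      const val IDLE_TIMEOUT_MS = 10_000L

      class ControllerIdleDetector {
          private var lastMotionMs = 0L

          fun onAccelSample(ax: Float, ay: Float, az: Float, timestampMs: Long) {
              val magnitude = sqrt(ax * ax + ay * ay + az * az)
              if (abs(magnitude - GRAVITY_M_S2) > MOTION_EPS_M_S2) lastMotionMs = timestampMs
          }

          fun isIdle(nowMs: Long): Boolean = nowMs - lastMotionMs > IDLE_TIMEOUT_MS
      }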
  • the display (520) may display an object representing at least one input interface (e.g., the first controller (401) and the second controller (403), another input interface such as a keyboard, a mouse, an electronic pen, or a smart phone, or a user's hand) under the control of the processor (510).
  • the display (520) may display a virtual object under the control of the processor (510).
  • the virtual object may include a default menu or a universal menu provided by the electronic device (501).
  • the universal menu may include menus for executing functions such as launching an application installed in the wearable electronic device (501), displaying devices (e.g., the first controller (401) and the second controller (403)) communicatively connected to the wearable electronic device (501), and displaying the remaining battery level of the electronic device (501).
  • the universal menu may also include menus that provide various other functions.
  • FIG. 9 is a diagram showing an example configuration of a software module of an electronic device according to various embodiments.
  • an electronic device (501) may implement a software module (901) (e.g., a program (140) of FIG. 1) for outputting a virtual vibration sound.
  • a memory (130) of the electronic device (101) may store commands (e.g., instructions) to implement the software module (901) illustrated in FIG. 9.
  • At least one processor (510) may execute commands stored in a memory (e.g., the memory (130) of FIG. 1 and the memory (560) of FIG. 5) to implement the software module (901) illustrated in FIG. 9, and control hardware (e.g., the display module (160), the audio module (170), the sensor module (176), or the communication module (190) of FIG. 1) associated with a function of the software module (901).
  • a software module (901) of an electronic device (501) may be configured to include a kernel (or HAL) (910), a framework (e.g., middleware (144) of FIG. 1) (920), and an application (930) (e.g., application (146) of FIG. 1). At least a part of the software module (901) may be preloaded on the electronic device (101) or may be downloadable from a server (e.g., server (108)).
  • the kernel (910) may be configured to include, for example, a module related to a third-party device with a vibrator (3rd device with vibrator) (911) and a vibration module of the device itself (vibrator of the device) (913).
  • the kernel (910) may include, for example, a system resource manager or a device driver, and may be configured to further include other modules without limitation.
  • the system resource manager may perform at least one of control, allocation, or recovery of system resources.
  • the device driver may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WIFI driver, an audio driver, or an inter-process communication (IPC) driver.
  • the framework (920) may be configured to include, for example, modules required to output a virtual vibration sound (e.g., an XR-related module (921), an input-related module (922), a system vibrator-related module (923), and an XR control-related module (924)), but is not limited thereto and may be configured to further include other modules.
  • the framework (920) may provide functions commonly required by applications (930) or provide various functions to applications (930) through an application programming interface (API) (not shown) so that the applications (930) can efficiently use limited system resources within the electronic device (101).
  • haptic feedback may be configured to occur as an event in response to user interaction.
  • the application (930) may be configured to include related applications (e.g., modules, managers or programs) (931, 932, 933, 934) for outputting a virtual vibration sound from the electronic device (501).
  • the application (930) may further be configured to include a module (or application) (not shown) for wireless communication with an external electronic device (e.g., the first controller (401), the second controller (403) of FIG. 4).
  • the application (930) may include an application received from an external electronic device (e.g., the server (108) or the electronic device (102, 104)).
  • the application (930) may include a preloaded application or a third party application downloadable from a server.
  • the components of the software module (901) may vary depending on the type of operating system. According to one embodiment, at least a portion of the software module (901) may be implemented as software, firmware, hardware, or a combination of at least two or more of these. At least a portion of the software module (901) may be implemented (e.g., executed) by, for example, a processor (e.g., an AP). At least a portion of the software module (901) may include at least one of, for example, a module, a program, a routine, a set of instructions, or a process for performing at least one function.
  • the software module (901) of the electronic device (501) may be configured to simultaneously generate a virtual vibration sound together with the audio (or sound source) when playing audio (or sound source) (e.g., outputting music, sound of video content, or sound of a game).
  • the electronic device may transmit the audio data to the audio driver of the software module (901) and transmit the haptic data to the vibration driver (e.g., haptic driver) of the software module (901) to simultaneously output the virtual vibration sound together with the audio source.
  • haptic data reflecting the audio data can be generated by a haptic generation module (e.g., a vibration service (926)) of a software module (901), and the generated haptic data can be transmitted to a vibration driver to be output as a virtual vibration sound through a speaker (550).
  • audio data can be output as high-quality data by a vibration driver of a software module (901), and the high-quality data output by the vibration driver can be output as a virtual vibration sound through a speaker (550).
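  • as a purely illustrative sketch of this routing (in Kotlin), the example below splits one sample buffer into an audio stream and derived haptic data; AudioSink, HapticSink, and the peak-envelope derivation are assumptions standing in for the audio driver, the vibration driver, and the haptic generation module, not names from the actual software module (901):

```kotlin
import kotlin.math.abs
import kotlin.math.max

// Hypothetical stand-ins for the audio driver and the vibration (haptic)
// driver of the software module (901); the names are illustrative only.
interface AudioSink { fun play(samples: FloatArray) }
interface HapticSink { fun vibrate(envelope: FloatArray) }

class VirtualVibrationRouter(
    private val audio: AudioSink,
    private val haptic: HapticSink,
) {
    // Derive haptic data from the audio data with a decaying peak-envelope
    // follower (one assumed way a haptic generation module could work),
    // then hand both streams to their drivers together.
    fun playWithVirtualVibration(samples: FloatArray, decayWindow: Int = 256) {
        val envelope = FloatArray(samples.size)
        var peak = 0f
        for (i in samples.indices) {
            peak = max(peak * (1f - 1f / decayWindow), abs(samples[i]))
            envelope[i] = peak
        }
        audio.play(samples)      // audio data -> audio driver -> speaker
        haptic.vibrate(envelope) // haptic data -> vibration driver -> virtual vibration sound
    }
}
```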
  • the main components of the electronic device (101, 200, 300, 500) of FIGS. 1 to 3C and FIG. 5 have been described.
  • the electronic device (101, 200, 300, 500) may be implemented with more components than the illustrated components, or may be implemented with fewer components.
  • the positions of the main components of the electronic device (101, 200, 300, 500) described above with FIGS. 1 to 3C and FIG. 5 may be changed according to various embodiments.
  • an electronic device may include a display (e.g., a display module (160) of FIG. 1, a display member (201) of FIG. 2, a display (321) of FIG. 3, a display (520) of FIG. 5), a speaker (e.g., an audio output module (155) of FIG. 1, a speaker (550) of FIG. 5), a memory (e.g., a memory (130) of FIG. 1, a memory (560) of FIG. 5), and at least one processor (e.g., a processor (120) of FIG. 1, a processor (510) of FIG. 5) operatively connected to the display, the speaker, and the memory.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to receive user interaction by at least one input interface in a virtual reality space provided through the display.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to identify a type of user interaction.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to obtain information related to a virtual object displayed in the virtual reality space based on the identified type.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to control the speaker to output a virtual vibration sound corresponding to a vibration pattern generated based on information related to the virtual object.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to control the communication circuit to transmit feedback information including the vibration pattern to the at least one external electronic device so as to output vibration corresponding to the vibration pattern through the haptic motor while outputting the virtual vibration sound, based on determining that at least one external electronic device including a haptic motor among the at least one input interface is in use.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to adjust the intensity of the vibration or the intensity of the virtual vibration sound based on the information related to the object.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to control the speaker to simultaneously output a sound related to the passive interaction and the virtual vibration sound at different frequency bands based on identifying a passive interaction event that does not reflect a user's will.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to control the speaker to output the virtual vibration sound corresponding to the vibration pattern based on identifying an active interaction event reflecting a user's will.
  • the instructions, when individually or collectively executed by at least one processor, may cause the electronic device to control the speaker to output the virtual vibration sound without outputting the vibration through the haptic motor of the at least one external electronic device based on the at least one external electronic device being in a low power mode.
  • FIG. 10 is a diagram showing an example of an operating method in an electronic device according to one embodiment
  • FIG. 11 is a diagram showing an example for obtaining information related to a virtual object in an electronic device according to one embodiment
  • FIGS. 12A and 12B are diagrams showing examples for obtaining information related to a virtual object in an electronic device according to one embodiment.
  • each operation may be performed sequentially, but is not necessarily performed sequentially. For example, the order of each operation may be changed, and at least two operations may be performed in parallel.
  • an electronic device may provide a virtual reality space through a display (e.g., a display module (160) of FIG. 1, a display member (201) of FIG. 2, a display (321) of FIG. 3, a display (520) of FIG. 5) while a user is wearing the electronic device, and may display a virtual object on the virtual reality space.
  • an electronic device may receive a user interaction by at least one input interface.
  • the electronic device may be connected to the at least one input interface using a short-range communication technology (e.g., BLE (Bluetooth low energy) or WiFi) or a wired communication.
  • the at least one input interface may be at least one external electronic device including a haptic motor (e.g., the first controller (401) or the second controller (403) of FIGS. 4 and 6A), a part of a user's body (e.g., a hand (601) of FIG. 6C), or at least one external electronic device that does not include a haptic motor (e.g., a keyboard (603) of FIG.
  • an electronic device may identify a type of a user interaction.
  • the electronic device may identify the user interaction as a first type (e.g., a first mode) if the user interaction is, for example, an interaction by a first controller and/or a second controller.
  • the electronic device may identify the user interaction as a second type (e.g., a second mode) if the user interaction is, for example, an interaction by a user's hand.
  • the electronic device may identify the user interaction as a third type (e.g., a third mode) if the user interaction is, for example, an interaction by the first controller or the second controller and the user's hand (e.g., a hybrid interaction).
  • the electronic device may identify the user interaction as an n-th type (e.g., an n-th mode) if the user interaction is, for example, an interaction by an external electronic device (e.g., a keyboard, a mouse, an electronic pen, a smartphone, an AI device, or a watch) that does not include a haptic motor.
  • the electronic device may preset the first type to the nth type according to the type of the input interface, and may preset more types (e.g., modes) according to other types of the input interface.
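  • a minimal sketch of this type (mode) mapping is shown below in Kotlin; the enum values and the InputSource model are illustrative assumptions, not names from the implementation:

```kotlin
// Assumed model of the interaction types (modes) described above.
enum class InteractionType { FIRST, SECOND, THIRD, NTH }

sealed interface InputSource
data class Controller(val id: Int) : InputSource       // includes a haptic motor
object Hand : InputSource                              // user's hand gesture
data class OtherDevice(val name: String) : InputSource // keyboard, mouse, pen, ...

fun classify(sources: Set<InputSource>): InteractionType {
    val controller = sources.any { it is Controller }
    val hand = sources.contains(Hand)
    return when {
        controller && hand -> InteractionType.THIRD  // hybrid interaction
        controller         -> InteractionType.FIRST
        hand               -> InteractionType.SECOND
        else               -> InteractionType.NTH    // device without a haptic motor
    }
}
```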
  • the electronic device may identify the occurrence of a passive interaction event or an active interaction event according to the received user interaction based on the user interaction type.
  • the passive interaction event is an event that does not reflect the user's will, for example, an event related to a phone call, a notification, video or music playback, or vibration due to a concentration mode, or an event in which an interaction between a first virtual object and a second virtual object causes a physical collision.
  • the active interaction event is an event that reflects the user's will, and may be an event related to vibration of an action for a virtual object corresponding to a user interaction received from at least one external electronic device (e.g., the first controller (401) and/or the second controller (403)) or the user's hand.
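  • as noted above, a passive event can share the speaker between the interaction sound and the virtual vibration sound in different frequency bands, while an active event outputs the virtual vibration sound alone; the Kotlin sketch below illustrates one assumed way to do this (the one-pole filter and its smoothing factor are assumptions):

```kotlin
// Illustrative only: passive events mix the two signals in separate bands;
// active events pass the virtual vibration sound through unchanged.
sealed interface InteractionEvent {
    data class Passive(val interactionSound: FloatArray, val vibrationSound: FloatArray) : InteractionEvent
    data class Active(val vibrationSound: FloatArray) : InteractionEvent
}

fun onePoleLowPass(x: FloatArray, a: Float = 0.05f): FloatArray {
    val y = FloatArray(x.size)
    var s = 0f
    for (i in x.indices) { s += a * (x[i] - s); y[i] = s }
    return y
}

fun render(event: InteractionEvent): FloatArray = when (event) {
    is InteractionEvent.Active -> event.vibrationSound
    is InteractionEvent.Passive -> {
        val low = onePoleLowPass(event.vibrationSound)   // vibration sound kept in the low band
        val lp = onePoleLowPass(event.interactionSound)
        val n = minOf(low.size, lp.size)
        FloatArray(n) { i -> low[i] + (event.interactionSound[i] - lp[i]) } // sound kept above it
    }
}
```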
  • the electronic device may obtain information related to a virtual object provided in a virtual reality space based on the identified type.
  • the information related to the virtual object may include at least one of a menu (e.g., a taskbar app icon menu) provided in the virtual reality space, information about a function or content, information about a feature of an application (e.g., an app property), or the direction, distance, feature, or type of an object.
  • if the received user interaction is, for example, a button input operation (1110) of an external electronic device (401) or a gesture (1120) by the user's hand (601), the electronic device may identify a virtual object (1101) that is operated or pointed at by that input, and obtain information related to the identified virtual object (1101).
  • as illustrated in FIG. 12A, an electronic device (501) may obtain information related to a first object (e.g., a taskbar app icon menu) (1201), a second object (e.g., a slide control for volume control) (1203), and/or a third object (e.g., an app property icon running on the dash) (1205) according to a user interaction by at least one external electronic device (401, 403).
  • as illustrated in FIG. 12B, an electronic device (501) may obtain information related to a first object (e.g., a taskbar app icon menu) (1201), a second object (e.g., a slide control for volume control) (1203), and/or a third object (e.g., an app property icon running on the dash) (1205) according to a user interaction by a user's hand (601).
  • an electronic device may generate a vibration pattern based on information related to a virtual object.
  • the electronic device may check the distances of objects displayed in the virtual reality space based on the distance between the user and each object, set a vibration intensity according to the checked distance, and generate a vibration pattern with the set vibration intensity.
  • the vibration pattern may be generated by applying at least one element among a frequency value (e.g., vibration intensity), a vibration time, a frequency type (e.g., at least one of a resonant frequency (F(0)), a low frequency (F(low)), a middle frequency (F(mid)), a high frequency (F(high)), or an alert frequency (F(alert))), or a vibration interval, as sketched below.
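  • the Kotlin sketch below models these pattern elements; the 10 m normalization distance is an assumption, while the 180 Hz / 7 ms resonant-frequency values reuse the haptic-category example given later:

```kotlin
// Assumed data model for the vibration pattern elements described above.
enum class FrequencyType { RESONANT, LOW, MID, HIGH, ALERT } // F(0), F(low), F(mid), F(high), F(alert)

data class VibrationPattern(
    val type: FrequencyType,
    val frequencyHz: Float,
    val onTimesMs: List<Long>,   // vibration times
    val offTimesMs: List<Long>,  // interval times between vibrations
    val intensity: Float,        // 0.0 .. 1.0
)

fun patternForObject(distanceMeters: Float, maxDistanceMeters: Float = 10f): VibrationPattern {
    // Intensity set according to the checked distance; scaled in proportion
    // to the distance (closer -> weaker), matching the adjustment described later.
    val intensity = (distanceMeters / maxDistanceMeters).coerceIn(0.05f, 1f)
    return VibrationPattern(
        type = FrequencyType.RESONANT,
        frequencyHz = 180f,
        onTimesMs = listOf(7L),
        offTimesMs = emptyList(),
        intensity = intensity,
    )
}
```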
  • an electronic device may check whether a user is using at least one external electronic device (e.g., the first controller (401) and/or the second controller (403) of FIG. 4). If the user is using at least one external electronic device, operation 1015 may be performed, and if not, operation 1013 may be performed.
  • the electronic device may output a virtual vibration sound corresponding to a vibration pattern generated based on information related to a virtual object through a speaker of the electronic device (e.g., the speaker (550) of FIG. 5). Thereafter, the operation may be terminated.
  • the electronic device may identify that the type of the user interaction is the second type based on determining that the user does not use at least one external electronic device, and may output only the virtual vibration sound through the speaker without transmitting feedback information to the external electronic device (e.g., the first controller (401) and/or the second controller (403) of FIG. 4) so as not to output vibration through the haptic motor.
  • an electronic device may transmit feedback information including a vibration pattern to at least one external electronic device through a communication circuit of the electronic device (e.g., the communication circuit (540) of FIG. 5) to output vibration via a haptic motor while outputting a virtual vibration sound based on the user using at least one external electronic device. Thereafter, the operation may be terminated.
  • the electronic device may identify that the type of the user interaction is the first type and, based on the identification of the first type, transmit to the external electronic device feedback information including the generated vibration pattern so that vibration (e.g., haptic vibration) is output via the haptic motor while the virtual vibration sound is output.
  • the electronic device may set an output condition (e.g., output distribution) to synchronize the virtual vibration sound and the vibration so as to output the virtual vibration sound and the vibration simultaneously.
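  • one assumed shape of this output distribution is sketched below in Kotlin, reusing the VibrationPattern type from the earlier sketch; FeedbackChannel and SpeakerOut are illustrative stand-ins for the communication circuit and the speaker path:

```kotlin
// Assumed interfaces: feedback to a controller's haptic motor, and the
// speaker path that renders the virtual vibration sound.
interface FeedbackChannel { fun send(pattern: VibrationPattern) }
interface SpeakerOut { fun playVirtualVibrationSound(pattern: VibrationPattern) }

fun dispatch(
    pattern: VibrationPattern,
    controllerInUse: Boolean,
    controller: FeedbackChannel,
    speaker: SpeakerOut,
) {
    if (controllerInUse) {
        // First type: haptic vibration on the controller plus the virtual
        // vibration sound; both start from the same pattern timings so the
        // two outputs stay synchronized.
        controller.send(pattern)
        speaker.playVirtualVibrationSound(pattern)
    } else {
        // Second type: no feedback information is transmitted; the speaker
        // outputs the virtual vibration sound alone.
        speaker.playVirtualVibrationSound(pattern)
    }
}
```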
  • FIG. 13 is a diagram showing an example of an operating method in an electronic device according to one embodiment.
  • an electronic device (501) may determine whether a received user interaction is a third, hybrid type including a first interaction (e.g., a control signal) by the external electronic device (401) and a second interaction (e.g., a gesture of grabbing, by the hand, a virtual object (1311) displayed on a virtual reality space (610)) by the user's hand, and may set and perform an output condition according to the third type.
  • the electronic device (501) may transmit feedback information including a first vibration pattern generated in response to a user interaction (e.g., a first interaction) to an external electronic device (401) upon identifying the third type, and may simultaneously or sequentially output a virtual vibration sound corresponding to a second vibration pattern generated in response to the user interaction (e.g., a second interaction) through a speaker.
  • the electronic device may control the speaker arranged on the right member to output a virtual vibration sound corresponding to the user's right hand direction.
  • the electronic device may transmit feedback information including the first vibration pattern generated in response to the first interaction, simultaneously output the virtual vibration sound corresponding to the first vibration pattern through the speaker, and then sequentially output only the virtual vibration sound corresponding to the second vibration pattern generated in response to the second interaction through the speaker.
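  • a hybrid dispatch along these lines could look like the Kotlin sketch below (reusing the VibrationPattern, FeedbackChannel, and SpeakerOut types above); whether the first pattern is mirrored on the speaker is a policy choice, not something the text fixes:

```kotlin
// Hybrid (third type) sketch: the controller interaction's pattern goes to
// the controller, the hand gesture's pattern is rendered by the speaker.
fun dispatchHybrid(
    firstPattern: VibrationPattern,   // generated for the controller interaction
    secondPattern: VibrationPattern,  // generated for the hand gesture
    controller: FeedbackChannel,
    speaker: SpeakerOut,
    mirrorFirstOnSpeaker: Boolean = true,
) {
    controller.send(firstPattern)
    if (mirrorFirstOnSpeaker) speaker.playVirtualVibrationSound(firstPattern)
    speaker.playVirtualVibrationSound(secondPattern) // sequential gesture feedback
}
```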
  • the electronic device can generate a vibration pattern in response to a received user interaction when the type of the user interaction is an n-th type interaction by an external electronic device (e.g., a keyboard, a mouse, or an electronic pen) that does not include a haptic motor, and output only a virtual vibration sound corresponding to the generated vibration pattern through a speaker.
  • the electronic device can determine whether the first controller and/or the second controller is in use. According to one embodiment, if the electronic device determines that the first controller and/or the second controller has not moved for a specified period of time, the electronic device can determine that the first controller and/or the second controller is not in use. According to one embodiment, the electronic device can determine whether the first controller and/or the second controller is in use based on sensing information acquired through a sensor included in the first controller and/or the second controller. For example, the sensing information may include acceleration sensor information and/or gyro sensor information.
  • if the first controller and/or the second controller is confirmed to be placed on the ground, the electronic device can determine that the first controller and/or the second controller is not in use. According to one embodiment, the electronic device can determine whether the first controller and/or the second controller is gripped by a user based on sensing information acquired through a grip sensor included in the first controller and/or the second controller.
  • FIGS. 14A to 14C are diagrams illustrating examples for generating a vibration pattern in an electronic device according to one embodiment.
  • the electronic device may apply at least one element among a frequency value (e.g., vibration intensity), a vibration time, a frequency type (e.g., at least one of a resonant frequency (F(0)), a low frequency (F(low)), a middle frequency (F(mid)), a high frequency (F(high)), or an alert frequency (F(alert))), or a vibration interval time to generate various combinations of vibration patterns (e.g., the vibration patterns A to M of FIG. 14A).
  • the resonant frequency (F(0)) is applied to haptic feedback that requires fast and accurate interaction, such as an input device (SIP) and a dialer, and can secure the best vibration force even for a short duration.
  • the alert frequency (F(alert)) is, for example, a frequency for vibration for incoming calls/notifications, and can be derived through a sensory/performance evaluation and applied to the final tuning frequency.
  • the low frequency (F(low)) can be applied to fingerprint recognition errors and input device keypads, and can be provided as a vibration that minimizes discomfort.
  • the middle frequency (F(mid)) can be applied to low battery and smart alert functions, and can be provided as a heavy and soft vibration.
  • the high frequency (F(high)) can be applied to camera shutter, screen capture, no-input feedback, and the like, and can be provided as a crisp vibration with a distinct edge.
  • in a pattern code, the first number represents the vibration duration (e.g., 7 ms), indicating for how long the vibration will occur; the second and third numbers represent, for example, that 2000 vibrations will occur over 10000 ms; and the last number can represent a unique ID of the vibration (e.g., 1 or 0).
  • vibration patterns A, B, and F represent a vibration category (haptics) for haptic vibrations; for these, the electronic device can generate the vibration pattern to output one vibration with, for example, a resonant frequency (F(0)), a frequency value of 180 Hz, and a vibration time of 7 ms.
  • vibration patterns C, D, E, and G represent vibration categories (operations) according to operations, and the electronic device can generate the vibration patterns with, for example, one of a low frequency (F(low)), a middle frequency (F(mid)), or a high frequency (F(high)), one of 120 Hz, 150 Hz, or 200 Hz, a combination of various vibration times, and a combination of various vibration interval times.
  • pattern C can be set to output one vibration with a vibration time of 100 ms (0 ms (off) -> 100 ms (on)).
  • pattern D can be set to output one vibration with a vibration time of 100 ms (0 ms (off) -> 100 ms (on)) and then, after an interval of 50 ms, a second vibration of 30 ms (50 ms (off) -> 30 ms (on)), for two vibrations in total.
  • the E pattern can be set to output one vibration with a vibration time of 10 ms (0 ms (off) -> 10 ms (on)) and then, after an interval of 60 ms, a second vibration of 15 ms (60 ms (off) -> 15 ms (on)), for two vibrations in total.
  • the G pattern can be set to output one vibration with a vibration time of 75 ms (0 ms (off) -> 75 ms (on)), then, after an interval of 25 ms, a second vibration of 75 ms (25 ms (off) -> 75 ms (on)), and then no vibration for 300 ms (off).
  • the I pattern can be set to output one vibration with a vibration time of 150 ms (0 ms (off) -> 150 ms (on)), then, after an interval of 80 ms, a second vibration of 150 ms (80 ms (off) -> 150 ms (on)), and then no vibration for 1200 ms (off).
  • the J pattern can be set to output one vibration with a vibration time of 250 ms (0 ms (off) -> 250 ms (on)), then, after an interval of 250 ms, a second vibration of 250 ms (250 ms (off) -> 250 ms (on)), and then no vibration for 250 ms (off).
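  • the text names no platform, but as one possible realization these off/on timings map directly onto Android's VibrationEffect.createWaveform, whose timing array alternates off/on durations starting with an initial off delay (an assumption offered for illustration):

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Off/on timing arrays for patterns C, D, and E as described above.
val patternC = longArrayOf(0, 100)          // 0 ms off -> 100 ms on
val patternD = longArrayOf(0, 100, 50, 30)  // 100 ms on, 50 ms off, 30 ms on
val patternE = longArrayOf(0, 10, 60, 15)   // 10 ms on, 60 ms off, 15 ms on

fun playOnce(vibrator: Vibrator, timings: LongArray) {
    // -1 = play the waveform once, without repeating
    vibrator.vibrate(VibrationEffect.createWaveform(timings, -1))
}
```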
  • FIGS. 15A to 15D are diagrams illustrating examples of vibration patterns generated in an electronic device according to one embodiment.
  • an electronic device may generate a vibration pattern representing a simple waveform that occurs when it receives a user interaction for a gesture (e.g., grip, pinch) by a user's hand.
  • an electronic device can generate a vibration pattern representing a simple waveform that occurs when an object having a tactile property (e.g., a pine cone) is touched.
  • FIG. 16 is a diagram illustrating an example of a setting screen for setting a virtual vibration sound in an electronic device according to one embodiment.
  • an electronic device may display a setting screen (1610) for setting a virtual vibration sound on a display (e.g., the display (520) of FIG. 5), and may set an output condition of the virtual vibration sound through selection of items related to virtual vibration sound output (e.g., virtual sound (virtual_vib_sound), speaker output, system sleep (system_enforced)) (1611) included in the setting screen and selection of a volume level (1613) of the virtual vibration sound, as modeled below.
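  • a minimal data model for these setting items might look as follows (field names follow the item labels above; the value ranges are assumptions):

```kotlin
// Assumed model of the setting items on screen (1610); not an actual API.
data class VirtualVibrationSoundSettings(
    val virtualVibSound: Boolean = true,  // virtual_vib_sound on/off
    val speakerOutput: Boolean = true,    // route the virtual vibration sound to the speaker
    val systemEnforced: Boolean = false,  // system_enforced override
    val volumeLevel: Int = 7,             // 0..10, assumed scale for item (1613)
)
```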
  • FIGS. 17A and 17B are diagrams illustrating an example of synchronizing a virtual vibration sound output through a speaker and a vibration output through a haptic motor in an electronic device according to one embodiment.
  • the electronic device may identify that the type of user interaction is the first type or the third type.
  • the electronic device may transmit feedback information including a vibration pattern generated to output vibration (e.g., haptic vibration) through the haptic motor while outputting a virtual vibration sound, to the at least one external electronic device.
  • FIG. 18 is a diagram illustrating an example of operation in a low power mode in an electronic device according to one embodiment.
  • when an electronic device (501) according to one embodiment identifies that the first controller (401) and/or the second controller (403) is in a low power mode (e.g., a state where the battery is less than 10%), the electronic device may transmit feedback information to control the haptic motor of the first controller (401) and/or the second controller (403) not to output vibration, and control the speaker to output only a virtual vibration sound.
  • the electronic device may adjust the intensity of vibration output through a haptic motor of the first controller or the second controller and/or the intensity of virtual vibration sound output through a speaker based on information related to a virtual object provided in the virtual reality space.
  • the electronic device may adjust the vibration intensity and/or the virtual vibration sound intensity when outputting them with the generated vibration pattern, or when outputting them with a vibration pattern that does not take into account a change in the position of the user or the position of the object.
  • the electronic device can adjust the intensity of the vibration output or the intensity of the virtual vibration sound output in proportion to the distance between the user and an object displayed in the virtual reality space. For example, when the distance becomes shorter, the electronic device can weaken the intensity of the vibration output or of the virtual vibration sound output in proportion to the distance; when the distance becomes longer, it can strengthen them in proportion to the distance.
  • the electronic device can adjust the intensity of the output vibration or the intensity of the output virtual vibration sound based on the direction of the object displayed in the virtual reality space.
  • the electronic device can identify the direction of the object based on the current location of the user. For example, if speakers are respectively placed on the left and right members of the electronic device, the electronic device can adjust the intensity of the virtual vibration sound output through the speaker placed in the direction corresponding to the direction of the identified object to be greater than a specified threshold value (e.g., the intensity of the virtual vibration sound being output or the intensity of the virtual vibration sound set in the vibration pattern) or greater than the intensity of the virtual vibration sound output through the speaker placed in the opposite direction.
  • the intensity of the virtual vibration sound output through the speaker placed in the opposite direction can be adjusted to be lower than the intensity of the virtual vibration sound output through the speaker placed in the direction corresponding to the direction of the identified object.
  • the electronic device can adjust the intensity of the vibration output through the first controller held by a hand in a direction corresponding to the direction of the identified object to be greater than a specified threshold value (e.g., the intensity of the vibration being output or the intensity of the vibration set in the vibration pattern) or the intensity of the vibration output through the second controller held by the hand in the opposite direction, and transmit feedback information including the adjusted vibration intensity value to the first controller.
  • the intensity of the vibration output through the second controller can be adjusted to be lower than the intensity of the vibration output through the first controller.
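  • one assumed way to compute such side-dependent intensities is sketched below in Kotlin; the azimuth convention and the gain curve are illustrative choices, not values from the text:

```kotlin
import kotlin.math.sin

// Boost the speaker (or controller) on the identified object's side and
// reduce the opposite side, as described above.
data class SideGains(val left: Float, val right: Float)

// azimuthRad: object direction relative to the user's facing direction,
// negative = left, positive = right (assumed convention).
fun gainsFor(azimuthRad: Double, base: Float = 0.6f, spread: Float = 0.4f): SideGains {
    val pan = sin(azimuthRad).toFloat() // -1 (fully left) .. +1 (fully right)
    return SideGains(
        left = (base - spread * pan).coerceIn(0f, 1f),
        right = (base + spread * pan).coerceIn(0f, 1f),
    )
}
```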
  • the electronic device may finely adjust the vibration intensity by adjusting complex elements (e.g., a specified vibration intensity value and an amplitude value) according to the purpose of an alert according to a notification event or an event that plays audio (e.g., outputs music, sound of video content, or sound of a game).
  • the electronic device may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide an alert for an incoming call.
  • the electronic device may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide a notification alert for functions or applications provided in a virtual reality space.
  • the electronic device may adjust (e.g., adjust to a high level) the vibration intensity or the intensity of a virtual vibration sound to provide an alert for a user interaction, such as touching, dragging, or touching and releasing a virtual object provided in a virtual reality space.
  • an electronic device can provide a sound for the alert along with vibration or a virtual vibration sound to increase the user's awareness of the occurrence of the event.
  • the electronic device may identify a grip interaction when the user is holding and using the first controller and/or the second controller with a hand, set the intensity of the vibration or the virtual vibration sound lower than a threshold (e.g., the vibration intensity set in the generated vibration pattern or a reference vibration intensity set in the device), and control the communication circuit to transmit feedback information including the lowered vibration intensity to the first controller and/or the second controller.
  • an operating method in an electronic device may include an operation of receiving a user interaction by at least one input interface in a virtual reality space provided through a display of the electronic device (e.g., a display module (160) of FIG. 1, a display member (201) of FIG. 2, a display (321) of FIG. 3, a display (520) of FIG. 5).
  • the method may include an operation of identifying a type of the user interaction, and an operation of obtaining information related to a virtual object displayed on the virtual reality space based on the identified type.
  • the method may include an operation of outputting a virtual vibration sound corresponding to a vibration pattern generated based on information related to the virtual object through a speaker of the electronic device (e.g., an audio output module (155) of FIG. 1, a speaker (550) of FIG. 5).
  • the method may further include an operation of determining whether at least one external electronic device including a haptic motor among the at least one input interface is in use, and an operation of transmitting feedback information including the vibration pattern to the at least one external electronic device so as to output vibration corresponding to the vibration pattern via the haptic motor while outputting the virtual vibration sound when the at least one external electronic device is in use.
  • the operation of identifying the type of the user interaction may include an operation of identifying the type of the user interaction as a first type based on the user interaction being an interaction by the at least one external electronic device.
  • the operation of outputting the virtual vibration sound through a speaker of the electronic device may include an operation of synchronizing the virtual vibration sound with the vibration, and an operation of outputting the virtual vibration sound through the speaker and transmitting the feedback information to the at least one external electronic device so as to output the vibration simultaneously with the virtual vibration sound.
  • the operation of identifying the type of the user interaction may include an operation of identifying the type of the user interaction as a second type based on the user interaction being an interaction corresponding to a gesture by the hand.
  • the operation of outputting the virtual vibration sound through a speaker of the electronic device may include an operation of outputting the virtual vibration sound through the speaker without transmitting the feedback information.
  • the operation of identifying the type of the user interaction may include an operation of identifying the type of the user interaction as a third type based on the user interaction being a first interaction by the at least one external electronic device and a second interaction by the user's hand.
  • the operation of outputting the virtual vibration sound through the speaker of the electronic device may include an operation of transmitting feedback information including a first vibration pattern generated in response to the first interaction to the at least one external electronic device and an operation of outputting a virtual vibration sound corresponding to a second vibration pattern generated in response to the second interaction through the speaker.
  • the method may further include an operation of adjusting an intensity of the vibration or an intensity of the virtual vibration sound based on the information related to the object.
  • the operation of outputting the virtual vibration sound through the speaker of the electronic device may include an operation of simultaneously outputting a sound related to the passive interaction and the virtual vibration sound through the speaker in different frequency bands based on identifying a passive interaction event that does not reflect the user's will.
  • the operation of outputting the virtual vibration sound through the speaker of the electronic device may include an operation of outputting the virtual vibration sound corresponding to the vibration pattern through the speaker based on identifying an active interaction event reflecting the user's will.
  • the operation of outputting the virtual vibration sound through the speaker of the electronic device may include an operation of outputting the virtual vibration sound through the speaker without outputting the vibration through the haptic motor of the at least one external electronic device based on the at least one external electronic device being in a low power mode.
  • a non-transitory storage medium storing one or more programs, wherein the one or more programs, when executed by at least one processor (e.g., the processor (120) of FIG. 1, the processor (510) of FIG. 5) of an electronic device (e.g., the electronic device (101) of FIG. 1, the wearable electronic device (200) of FIG. 2, the wearable electronic device (300) of FIGS. 3A to 3C, the electronic device (501) of FIG. 5), may include executable commands that cause the electronic device to execute an operation of receiving a user interaction by at least one input interface in a virtual reality space provided through a display (e.g., the display module (160) of FIG. 1, the display member (201) of FIG. 2, the display (321) of FIG. 3, the display (520) of FIG. 5), an operation of identifying a type of the user interaction, an operation of obtaining information related to a virtual object displayed on the virtual reality space based on the identified type, and an operation of outputting a virtual vibration sound corresponding to a vibration pattern generated based on the information related to the virtual object through a speaker of the electronic device (e.g., the sound output module (155) of FIG. 1, the speaker (550) of FIG. 5).
  • the one or more programs may further include executable instructions that, when executed by at least one processor of the electronic device, cause the electronic device to further perform an operation of determining whether at least one external electronic device including a haptic motor among the at least one input interface is in use, and an operation of transmitting feedback information including the vibration pattern to the at least one external electronic device so as to output a vibration corresponding to the vibration pattern via the haptic motor while outputting the virtual vibration sound when the at least one external electronic device is in use.
  • an electronic device can provide a sense of immersion to a user and improve user experience by analyzing the haptic (or vibration) feedback of an object according to the type of user-interaction-based event (passive or active), and by distributing control of the virtual vibration between a speaker and a controller so that it is mapped and output accordingly.
  • various effects that are directly or indirectly recognized through the present disclosure can be provided.
  • the effects obtainable from the present disclosure are not limited to the effects mentioned above, and other effects that are not mentioned can be clearly understood by a person skilled in the art to which the present disclosure belongs from the description below.
  • the electronic devices according to various embodiments disclosed in this document may be devices of various forms.
  • the electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliance devices.
  • the electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as 'first' and 'second', or '1st' and '2nd', may be used merely to distinguish one component from another, and do not limit the components in any other respect (e.g., importance or order).
  • the term 'module' used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit, for example.
  • a module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
  • a module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (e.g., a program (140)) including one or more instructions stored in a storage medium (e.g., an internal memory (136) or an external memory (138)) readable by a machine (e.g., an electronic device (101)).
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' simply means that the storage medium is a tangible device and does not contain signals (e.g. electromagnetic waves), and the term does not distinguish between cases where data is stored semi-permanently or temporarily on the storage medium.
  • the method according to various embodiments disclosed in the present document may be provided as included in a computer program product.
  • the computer program product may be traded between a seller and a buyer as a commodity.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play Store™) or directly between two user devices (e.g., smart phones).
  • at least a part of the computer program product may be at least temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or an intermediary server.
  • each component e.g., a module or a program of the above-described components may include a single or multiple entities, and some of the multiple entities may be separately arranged in other components.
  • one or more components or operations of the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, multiple components (e.g., a module or a program) of the above-described components may be integrated into one component.
  • the integrated component may perform one or more functions of each of the multiple components identically or similarly to those performed by the corresponding component of the multiple components before the integration.
  • the operations performed by the module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order, omitted, or one or more other operations may be added.

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

This document relates to an electronic device for providing a virtual reality space, a method for outputting a virtual vibration sound from the electronic device, and a non-transitory storage medium. According to one embodiment, the electronic device may include a display, a speaker, a memory, and at least one processor operatively connected to the display, the speaker, and the memory. According to one embodiment, the at least one processor may be configured to receive a user interaction by at least one input interface in a virtual reality space provided through the display. According to one embodiment, the at least one processor may be configured to identify a type of the user interaction. According to one embodiment, the at least one processor may be configured to obtain, based on the identified type, information related to a virtual object displayed in the virtual reality space. According to one embodiment, the at least one processor may be configured to control the speaker to output a virtual vibration sound corresponding to a vibration pattern generated based on the information related to the virtual object. Other embodiments are also possible.
PCT/KR2024/010840 2023-07-26 2024-07-25 Dispositif électronique pour fournir un espace de réalité virtuelle, procédé pour produire un son de vibration virtuelle à partir d'un dispositif électronique, et support de stockage non transitoire WO2025023764A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2023-0097674 2023-07-26
KR20230097674 2023-07-26
KR1020230153831A KR20250017109A (ko) 2023-07-26 2023-11-08 가상 현실 공간을 제공하는 전자 장치 및 전자 장치에서 가상 진동 소리를 출력하기 위한 방법 및 비 일시적 저장 매체
KR10-2023-0153831 2023-11-08

Publications (1)

Publication Number Publication Date
WO2025023764A1 true WO2025023764A1 (fr) 2025-01-30

Family

ID=94374839

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2024/010840 WO2025023764A1 (fr) 2023-07-26 2024-07-25 Dispositif électronique pour fournir un espace de réalité virtuelle, procédé pour produire un son de vibration virtuelle à partir d'un dispositif électronique, et support de stockage non transitoire

Country Status (1)

Country Link
WO (1) WO2025023764A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110129093A1 (en) * 2009-05-27 2011-06-02 Maria Karam System and method for displaying sound as vibrations
KR20150110403A (ko) * 2014-03-21 2015-10-02 임머숀 코퍼레이션 햅틱 효과의 자동 튜닝
US10095311B2 (en) * 2016-06-15 2018-10-09 Immersion Corporation Systems and methods for providing haptic feedback via a case
US20190187782A1 (en) * 2016-11-02 2019-06-20 Huizhou Tcl Mobile Communication Co., Ltd Method of implementing virtual reality system, and virtual reality device
US20220365590A1 (en) * 2019-11-08 2022-11-17 Meta Platforms Technologies, Llc Synthesizing haptic and sonic feedback for textured materials in interactive virtual environments

Similar Documents

Publication Publication Date Title
WO2022108076A1 (fr) Procédé de connexion sans fil d'un environnement de réalité augmentée et dispositif électronique associé
WO2022019636A1 (fr) Procédé permettant d'effectuer une interaction utilisateur virtuelle et dispositif associé
WO2022131549A1 (fr) Dispositif électronique et procédé de fonctionnement d'un dispositif électronique
WO2022220659A1 (fr) Dispositif électronique et procédé par lequel un dispositif électronique entre des informations à l'aide d'un dispositif électronique externe
WO2022085940A1 (fr) Procédé et appareil de commande d'affichage d'une pluralité d'objets sur un dispositif électronique
WO2020159320A1 (fr) Procédé de reconnaissance d'objet au moyen d'une onde millimétrique, et dispositif électronique prenant en charge ledit procédé
WO2022086071A1 (fr) Dispositif électronique pour commander le fonctionnement d'un dispositif de type stylo électronique, procédé de fonctionnement dans un dispositif électronique, et support de stockage non transitoire
WO2025023764A1 (fr) Dispositif électronique pour fournir un espace de réalité virtuelle, procédé pour produire un son de vibration virtuelle à partir d'un dispositif électronique, et support de stockage non transitoire
WO2023106895A1 (fr) Dispositif électronique destiné à utiliser un dispositif d'entrée virtuel, et procédé de fonctionnement dans un dispositif électronique
WO2022231160A1 (fr) Dispositif électronique pour exécuter une fonction sur la base d'un geste de la main et son procédé de fonctionnement
WO2024043519A1 (fr) Procédé de commande de multiples affichages et dispositif électronique le prenant en charge
WO2025037937A1 (fr) Dispositif électronique, procédé et support de stockage non transitoire pour gérer une pluralité de fenêtres d'écran virtuel dans un espace de réalité virtuelle
WO2022102971A1 (fr) Procédé d'affichage d'interface utilisateur et dispositif électronique le prenant en charge
WO2025023550A1 (fr) Dispositif électronique porté sur soi pour changer un mode lié à une entrée d'utilisateur et son procédé de fonctionnement
WO2022220373A1 (fr) Dispositif électronique portatif pour commander l'annulation de bruit d'un dispositif électronique portatif externe, et son procédé de fonctionnement
WO2024029740A1 (fr) Procédé et dispositif de production de données de dessin en utilisant un dispositif d'entrée
WO2024135877A1 (fr) Dispositif électronique et procédé d'identification d'objet visuel parmi une pluralité d'objets visuels
WO2024172607A1 (fr) Dispositif électronique d'affichage d'icône, son procédé de fonctionnement, et support de stockage lisible par ordinateur non transitoire
WO2024080680A1 (fr) Dispositif électronique, procédé et support de stockage lisible par ordinateur non transitoire affichant une interface utilisateur pour des réglages spécifiques au contexte
WO2025014062A1 (fr) Dispositif électronique et procédé de traitement d'énoncé d'utilisateur par un dispositif électronique
WO2022124659A1 (fr) Dispositif électronique et procédé permettant de traiter une entrée d'utilisateur
WO2023149782A1 (fr) Dispositif électronique et procédé de fourniture d'une fonction haptique
WO2023033311A1 (fr) Dispositif électronique pour réaliser une fonction assortie avec une affordance graphique, et procédé de fonctionnement d'un dispositif électronique
WO2024106796A1 (fr) Procédé de commande de réglage audio et dispositif électronique portable le prenant en charge
WO2022181996A1 (fr) Dispositif de réalité augmentée et dispositif électronique interagissant avec le dispositif de réalité augmentée