WO2021189358A1 - Display device and volume adjustment method
- Publication number: WO2021189358A1 (PCT/CN2020/081417)
- Authority: WIPO (PCT)
- Prior art keywords: volume, audio, display device, output, user
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
Definitions
- This application relates to the technical field of display devices, and in particular to a display device and a volume adjustment method.
- The display device can present playback content such as audio, video, and pictures to users.
- display devices can not only provide users with live TV program content received through data broadcasting, but also provide users with various applications and service content such as online videos and online games.
- Video chat has also become a basic function of the display device, which gives rise to scenes where two audio outputs exist at the same time, for example a scene where an audio and video chat is conducted while a video program is playing, or a scene where an audio and video chat is conducted while music is playing.
- The display device can be controlled to perform the above functions based on the user's operation of physical hard keys or virtual keys on a control device such as a remote control or a mobile terminal, or based on voice input received by the microphone of the display device or of the control device. For example, while the display device is playing a program, the user adjusts the volume through the volume keys on the remote control.
- This application provides a display device and a volume adjustment method to solve the problem of how to better handle the two channels of sound.
- The present application provides a display device, which is characterized in that it includes a display and a controller, wherein the controller is configured to:
- present a volume setting interface on the display, the volume setting interface including a volume setting item for associating the output volume of a voice call with the output volume of an audio and video program;
- adjust the output volume of the audio and video program and the output volume of the voice call in association, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
- This application also provides a volume adjustment method, which includes:
- presenting a volume setting interface on a display, the volume setting interface including a volume setting item for associating the output volume of a voice call with the output volume of an audio and video program;
- adjusting the output volume of the audio and video program and the output volume of the voice call in association, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
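To make the associated adjustment concrete, the following is a minimal sketch in which the voice-call volume is kept associated with the program volume through a fixed offset, so adjusting the program volume also adjusts the call volume while keeping the two values different. The class name, field names, and offset policy are illustrative assumptions, not the implementation described in this application.

```java
// Minimal sketch of linked volume adjustment, assuming a fixed-offset policy.
// All names and the offset value are illustrative assumptions.
public class LinkedVolumeController {
    private static final int MAX_VOLUME = 100;
    private static final int CALL_OFFSET = 20;  // assumed offset keeping the call louder

    private int programVolume = 50;             // output volume of the audio/video program
    private int callVolume = 70;                // output volume of the voice call

    /** Adjusts the program volume and, in association, the voice-call volume. */
    public void setProgramVolume(int volume) {
        programVolume = clamp(volume);
        // Linked adjustment: the call volume follows the program volume but
        // stays different from it (here, offset by a fixed amount).
        callVolume = clamp(programVolume + CALL_OFFSET);
    }

    public int getProgramVolume() { return programVolume; }
    public int getCallVolume()    { return callVolume; }

    private static int clamp(int v) {
        return Math.max(0, Math.min(MAX_VOLUME, v));
    }
}
```

With this assumed policy, setting the program volume to 30 would leave the call volume at 50, so the voice call remains audible over the background program.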
- Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment
- Fig. 2 exemplarily shows a block diagram of the hardware configuration of the control device 100 according to the embodiment
- FIG. 3 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the embodiment
- FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3;
- FIG. 5 exemplarily shows a schematic diagram of the functional configuration of the display device 200 according to the embodiment
- Fig. 6a exemplarily shows a schematic diagram of the software configuration in the display device 200 according to the embodiment
- FIG. 6b exemplarily shows a schematic diagram of the configuration of the application program in the display device 200 according to the embodiment
- FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to the embodiment.
- Fig. 8 exemplarily shows a user interface in a watching and chatting scenario
- Fig. 9 exemplarily shows the processing procedure for multiple channels of audio data
- Fig. 10 exemplarily shows another user interface in a watching and chatting scenario
- Fig. 11 exemplarily shows another user interface in a watching and chatting scenario
- Fig. 12 exemplarily shows another user interface in a watching and chatting scenario
- Fig. 13 exemplarily shows another user interface in a watching and chatting scenario
- Fig. 14 exemplarily shows a flow chart of a volume control method.
- the embodiment of the present application provides a display device and a volume adjustment method.
- The display device provided in this application can be a display device with a multi-controller architecture, such as the display device with a dual-controller (dual hardware system) architecture shown in Figures 3-6 of this application, or a display device with a non-dual-controller architecture; the structure of the display device is not limited in this application.
- The volume adjustment method provided in this application can be applied to display devices such as smart TVs, and can of course also be applied to handheld devices that provide voice and data connectivity and have wireless connection functions, or to other processing devices that can be connected to a wireless modem, such as mobile phones (or "cellular" phones) and computers with mobile terminals; these may also be portable, pocket-sized, handheld, computer-built-in, or vehicle-mounted mobile devices that exchange data with a wireless access network.
- various external device interfaces are usually provided on the display device to facilitate the connection of different peripheral devices or cables to achieve corresponding functions.
- When a high-resolution camera is connected to an interface of the display device, if the hardware system of the display device does not have a hardware interface capable of receiving the source data of the high-pixel camera, the data captured by the camera cannot be presented on the screen of the display device.
- In addition, the hardware system of a traditional display device supports only one hard-decoding resource, and usually supports video decoding of at most 4K resolution. Therefore, to realize video chat while watching Internet TV without reducing the definition of the network video picture, the hard-decoding resource (usually the GPU in the hardware system) is used to decode the network video, and the video chat picture is processed by soft decoding on a general-purpose processor (such as the CPU).
- This application therefore discloses a dual hardware system architecture to support multiple channels of video chat data (at least one channel of local video).
- The term "module" used in the various embodiments of this application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of executing the function associated with the component.
- The term "remote control" used in the various embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application) that can generally control the electronic device wirelessly within a short distance.
- the component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect with electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
- For example, a handheld touch remote control uses a user interface on a touch screen to replace most of the physical built-in hard keys of a general remote control device.
- The term "gesture" used in the embodiments of the present application refers to a user behavior that expresses an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
- The term "hardware system" used in the various embodiments of this application may refer to a physical component composed of an integrated circuit (IC), a printed circuit board (PCB), and other mechanical, optical, electrical, and magnetic devices, which provides computing, control, storage, input, and output functions.
- the hardware system is usually also referred to as a motherboard or a chip.
- Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 by controlling the device 100.
- The control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods to control the display device 200 wirelessly or by other short-distance means; alternatively, the display device 200 may be controlled in a wired manner.
- the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
- For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote control to realize the functions of controlling the display device 200.
- The control device 100 can also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a notebook computer, which can communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the display device 200 through an application program corresponding to the display device 200.
- the application can provide users with various controls through an intuitive user interface (UI, User Interface) on the screen associated with the smart device.
- User interface is a medium interface for interaction and information exchange between applications or operating systems and users. It realizes the conversion between the internal form of information and the form acceptable to users.
- the commonly used form of user interface is graphical user interface (graphic user interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
- A control can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
- both the mobile terminal 100B and the display device 200 can install software applications, so that the connection and communication between the two can be realized through a network communication protocol, and the purpose of one-to-one control operation and data communication can be realized.
- For example, the mobile terminal 100B can establish a control command protocol with the display device 200, the remote control keyboard can be synchronized to the mobile terminal 100B, and the function of controlling the display device 200 can be realized by operating the user interface of the mobile terminal 100B; alternatively, the audio and video content displayed on the screen of the mobile terminal 100B can be transmitted to the display device 200 to realize a synchronous display function.
- the display device 200 can also communicate with the server 300 through multiple communication methods.
- the display device 200 may be allowed to communicate with the server 300 via a local area network, a wireless local area network, or other networks.
- the server 300 may provide various contents and interactions to the display device 200.
- the display device 200 transmits and receives information, interacts with an Electronic Program Guide (EPG, Electronic Program Guide), receives software program updates, or accesses a remotely stored digital media library.
- the server 300 may be a group or multiple groups, and may be one or more types of servers.
- the server 300 provides other network service content such as video-on-demand and advertising services.
- the display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, a projection display device, or a smart TV.
- the specific display device type, size, resolution, etc. are not limited, and those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
- the display device 200 may additionally provide a smart network TV function with a computer support function. Examples include Internet TV, Smart TV, Internet Protocol TV (IPTV), and so on.
- the display device may not have a broadcast receiving function.
- a camera may be connected or provided on the display device 200 for presenting a picture captured by the camera on the display interface of the display device or other display devices, so as to realize interactive chats between users.
- the picture captured by the camera can be displayed on the display device in full screen, half screen, or in any selectable area.
- The camera is connected to the rear shell of the display through a connecting plate and is fixedly installed in the upper middle of the rear shell of the display; as an installable method, it can be fixedly installed at any position on the rear shell of the display, as long as its image capture area is not blocked by the rear shell.
- the image capture area and the display device have the same display orientation.
- the camera can be connected to the display rear shell through a connecting plate or other conceivable connectors.
- The connector is equipped with a lifting motor; when the user wants to use the camera or an application needs to use the camera, the camera can be raised out of the display, and when it is not in use, it can be embedded behind the rear shell to protect the camera from damage.
- the camera used in this application may have 16 million pixels to achieve the purpose of ultra-high-definition display. In actual use, a camera with higher or lower than 16 million pixels can also be used.
- the content displayed in different application scenarios of the display device can be merged in a variety of different ways, so as to achieve functions that cannot be achieved by traditional display devices.
- the user can have a voice call with at least one other user (that is, at least one other terminal) while enjoying an audio and video program.
- the presentation of audio and video programs can be used as the background picture, the sound of the audio and video programs can be used as the background sound, the voice call window is displayed on the background picture, and the voice call sound can be played simultaneously with the background sound through the display device.
- the function of the display device to play the above two channels of sounds at the same time can be called “watching and chatting at the same time", and the scene where the above two channels of sounds exist at the same time is called the “watching and chatting" scene.
- the chat window may not be displayed, and only the chat voice may be output. That is, the monitor plays audio and video programs, and the program sound and chat sound are output at the same time.
- In some embodiments, when the user triggers an instruction to mute the chat voice, only the audio and video program sound is output, and the chat voice of other users is converted into text or barrage comments and presented on the display.
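The muting behavior described above can be thought of as a routing decision: when the chat voice is muted, incoming call audio is no longer mixed into the output and is instead transcribed and shown as text or barrage. The sketch below illustrates this flow only; SpeechToText and BarrageView are hypothetical interfaces invented for the example, not APIs from this application or any particular library.

```java
// Illustrative sketch of routing chat audio when the user mutes the chat voice.
// SpeechToText and BarrageView are hypothetical abstractions used for clarity.
public class ChatAudioRouter {

    public interface SpeechToText { String transcribe(short[] pcm); }
    public interface BarrageView  { void show(String text); }

    private final SpeechToText transcriber;
    private final BarrageView barrage;
    private boolean chatMuted = false;

    public ChatAudioRouter(SpeechToText transcriber, BarrageView barrage) {
        this.transcriber = transcriber;
        this.barrage = barrage;
    }

    public void setChatMuted(boolean muted) { this.chatMuted = muted; }

    /**
     * Returns the chat audio to be mixed with the program audio, or null when
     * the chat voice is muted; in that case the speech is shown as barrage text.
     */
    public short[] routeChatAudio(short[] chatPcm) {
        if (!chatMuted) {
            return chatPcm;                              // mix chat voice with program sound
        }
        barrage.show(transcriber.transcribe(chatPcm));   // present muted speech as text/barrage
        return null;                                     // only program sound is output
    }
}
```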
- At least one video chat is conducted with other terminals.
- the user can video chat with at least one other user while entering the education application for learning.
- For example, students can interact remotely with teachers while learning content in an educational application; this function can be vividly called "learning while chatting".
- a video chat is conducted with a player entering the game.
- When a player enters a game application to participate in a game, remote interaction with other players can be realized; this function can be vividly called "watch and play".
- the game scene is integrated with the video picture, and the portrait in the video picture is cut out and displayed on the game picture to improve the user experience.
- In somatosensory games such as ball games, boxing games, running games, and dancing games, human body postures and movements are acquired through the camera, with limb detection and tracking and detection of key points of human skeleton data, which are then integrated with animations in the game to realize games in scenes such as sports and dance.
- the user can interact with at least one other user in video and voice in the K song application.
- For example, when at least one user enters the application in a chat scene, multiple users can jointly complete the recording of a song.
- The user can turn on the camera locally to obtain pictures and videos; this function can be vividly called "look in the mirror".
- Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
- the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
- The control device 100 is configured to control the display device 200; it can receive operation instructions input by the user and convert the operation instructions into instructions that the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200.
- For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operations.
- control device 100 may be a smart device.
- control device 100 can install various applications for controlling the display device 200 according to user requirements.
- the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
- For example, by installing applications, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 100B or other smart electronic devices to realize the functions of the physical keys of the control device 100.
- the controller 110 includes a processor 112, RAM 113 and ROM 114, a communication interface, and a communication bus.
- the controller 110 is used to control the operation and operation of the control device 100, as well as the communication and cooperation between internal components, and external and internal data processing functions.
- the communicator 130 realizes the communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
- the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
- The user input/output interface 140, in which the input interface includes at least one of input interfaces such as a microphone 141, a touch panel 142, a sensor 143, and a button 144.
- the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
- the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, and sends it to the display device 200.
- the output interface includes an interface for sending the received user instruction to the display device 200.
- it may be an infrared interface or a radio frequency interface.
- In the case of an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and then sent to the display device 200 via the infrared sending module.
- In the case of a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency sending terminal.
- control device 100 includes at least one of a communicator 130 and an output interface.
- the control device 100 is equipped with a communicator 130, such as WIFI, Bluetooth, NFC and other modules, which can encode the user input command through the WIFI protocol, or the Bluetooth protocol, or the NFC protocol, and send it to the display device 200.
- the memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller 110.
- the memory 190 can store various control signal instructions input by the user.
- The power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller 110, and may be a battery and a related control circuit.
- FIG. 3 exemplarily shows a hardware configuration block diagram of a hardware system in the display device 200 according to an exemplary embodiment.
- the structural relationship of the hardware system can be shown in Figure 3.
- One hardware system in the dual hardware system architecture is referred to as the first hardware system or the first controller, and the other hardware system is referred to as the second hardware system or the second controller. The first controller includes the various processors and interfaces of the first controller, and the second controller includes the various processors and interfaces of the second controller.
- A relatively independent operating system may be installed in each of the first controller and the second controller, and the operating system of the first controller and the operating system of the second controller may communicate with each other through a communication protocol. For example, the framework layer of the operating system of the first controller and the framework layer of the operating system of the second controller can communicate for command and data transmission, so that there are two independent but interrelated subsystems in the display device 200.
- the first controller and the second controller can realize connection, communication and power supply through a plurality of different types of interfaces.
- the interface type of the interface between the first controller and the second controller may include a general-purpose input/output (GPIO), a USB interface, an HDMI interface, a UART interface, and the like.
- One or more of these interfaces can be used for communication or power transmission between the first controller and the second controller.
- the second controller can be powered by an external power source, and the first controller can be powered by the second controller instead of the external power source.
- the first controller may also include an interface for connecting other devices or components, such as the MIPI interface for connecting a camera (Camera) shown in FIG. 3, Bluetooth interface, etc.
- The second controller may also include a VBY interface for connecting to the TCON (timing controller) of the display screen, an I2S interface for connecting to a power amplifier (AMP) and a speaker, and an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and the like.
- It should be noted that FIG. 3 is only an exemplary illustration of the dual hardware system architecture of the present application and does not limit the present application. In practical applications, both hardware systems can contain more or less hardware or interfaces as required.
- FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3.
- the hardware system of the display device 200 may include a first controller and a second controller, and modules connected to the first controller or the second controller through various interfaces.
- the second controller may include a tuner and demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, audio Output interface 270, power supply.
- the second controller may also include more or fewer modules.
- The tuner and demodulator 220 is used to perform modulation and demodulation processing such as amplification, mixing, and resonance on broadcast television signals received by wired or wireless means, so as to demodulate, from the multiple wireless or cable broadcast television signals, the audio and video signals carried on the frequency of the TV channel selected by the user, as well as additional information (such as EPG data signals).
- The signal paths of the tuner and demodulator 220 can be of many kinds, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to different modulation types, the modulation method can be digital modulation or analog modulation; and according to the different types of television signals received, the tuner and demodulator 220 can demodulate analog signals and/or digital signals.
- The tuner and demodulator 220 is also used to respond to the TV channel frequency selected by the user and the TV signal carried on that frequency, according to the user's selection and under the control of the controller 210.
- the tuner and demodulator 220 may also be in an external device, such as an external set-top box.
- the set-top box outputs TV audio and video signals through modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
- the communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types.
- the communicator 230 may include a WIFI module 231, a Bluetooth communication protocol module 232, a wired Ethernet communication protocol module 233, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
- the display device 200 may establish a control signal and a data signal connection with an external control device or content providing device through the communicator 230.
- the communicator may receive the control signal of the remote controller 100A according to the control of the controller.
- the external device interface 250 is a component that provides data transmission between the second controller 210 and the first controller and other external devices.
- The external device interface can be connected to external devices such as set-top boxes, game devices, and notebook computers in a wired/wireless manner, and can receive data from the external devices, such as video signals (e.g., moving images), audio signals (e.g., music), and additional information (e.g., EPG).
- The external device interface 250 may include any one or more of a high-definition multimedia interface (HDMI) terminal 251, a composite video blanking synchronization (CVBS) terminal 252, an analog or digital component terminal 253, a universal serial bus (USB) terminal 254, and a red, green, and blue (RGB) terminal (not shown in the figure).
- the controller 210 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
- The controller 210 includes a random access memory (RAM) 214, a read-only memory (ROM) 213, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus.
- the RAM 214 and the ROM 213, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected by a bus.
- The graphics processor 216 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit and displays the rendering result on the display 280.
- The CPU processor 212 is configured to execute the operating system and application program instructions stored in the memory 290, and, according to the various interactive instructions received from external input, to execute various applications, data, and content, so as to finally display and play various audio and video content.
- the CPU processor 212 may include multiple processors.
- the multiple processors may include one main processor and multiple or one sub-processors.
- the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
- the communication interface may include a first interface 218-1 to an nth interface 218-n. These interfaces may be network interfaces connected to external devices via a network.
- the controller 210 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
- the object may be any one of the selectable objects, such as a hyperlink or an icon.
- Operations related to the selected object include, for example, displaying the page, document, or image linked to a hyperlink, or performing the operation corresponding to the icon.
- the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
- The memory 290 stores various software modules used to drive and control the display device 200.
- various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
- the basic module is a bottom-level software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper-level module.
- the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion, analysis and management.
- the voice recognition module includes a voice parsing module and a voice command database module.
- the display control module is a module for controlling the display 280 to display image content, and can be used to play information such as multimedia image content and UI interfaces.
- the communication module is a module used for control and data communication with external devices.
- the browser module is a module used to perform data communication between browsing servers.
- the service module is a module used to provide various services and various applications.
- the memory 290 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects.
- the user input interface is used to send a user's input signal to the controller 210, or to transmit a signal output from the controller to the user.
- the control device (such as a mobile terminal or a remote control) can send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, and then forward the user input interface to the controller;
- the control device may receive output signals such as audio, video, or data output from the user input interface processed by the controller, and display the received output signal or output the received output signal in the form of audio or vibration.
- the user may input a user command on a graphical user interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the graphical user interface (GUI).
- the user may input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
- The video processor 260-1 is used to receive video signals and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a video signal that can be displayed or played directly on the display 280.
- the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
- the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
- the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
- An image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal, generated by the graphics generator according to user input or by the system itself, with the scaled video image to produce a displayable image signal.
- The frame rate conversion module is used to convert the frame rate of the input video, for example converting input video at 24 Hz, 25 Hz, 30 Hz, or 60 Hz to a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate may be related to the source video stream and the output frame rate may be related to the refresh rate of the display. The conversion is usually implemented on the input, for example by a frame insertion method.
- The display formatting module is used to convert the signal output by the frame rate conversion module into a signal conforming to the display format of a device such as the display, for example formatting the output of the frame rate conversion module into RGB data signals.
- the display 280 is used to receive the image signal input from the video processor 260-1, to display video content and images and a menu control interface.
- the display 280 includes a display component for presenting a picture and a driving component for driving image display.
- the displayed video content can be from the video in the broadcast signal received by the tuner and demodulator 220, or from the video content input by the communicator or the interface of an external device.
- At the same time, the display 280 displays a user manipulation interface UI generated in the display device 200 and used for controlling the display device 200.
- The display 280 also includes a driving component for driving the display.
- If the display 280 is a projection display, it may also include a projection device and a projection screen.
- The audio processor 260-2 is used to receive audio signals and perform decompression and decoding according to the standard codec protocol of the input signal, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 272.
- the audio output interface 270 is used to receive the audio signal output by the audio processor 260-2 under the control of the controller 210.
- The audio output interface may include a speaker 272, or an external audio output terminal 274 that outputs to a sound-producing device of an external device, such as an external audio terminal or a headphone output terminal.
- the video processor 260-1 may include one or more chips.
- the audio processor 260-2 may also include one or more chips.
- the video processor 260-1 and the audio processor 260-2 may be separate chips, or may be integrated with the controller 210 in one or more chips.
- the power supply is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 210.
- the power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, such as a power interface that provides an external power supply in the display device 200.
- the first controller may include a controller 310, a communicator 330, a detector 340, and a memory 390. In some embodiments, it may also include a user input interface, a video processor, an audio processor, a display, and an audio output interface. In some embodiments, there may also be a power supply that independently supplies power to the first controller.
- the communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types.
- the communicator 330 may include a WIFI module 331, a Bluetooth communication protocol module 332, a wired Ethernet communication protocol module 333, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
- the communicator 330 of the first controller and the communicator 230 of the second controller also interact with each other.
- the WiFi module 231 of the second controller is used to connect to an external network and generate network communication with an external server or the like.
- the WiFi module 331 of the first controller is used to connect to the WiFi module 231 of the second controller without directly connecting to an external network or the like. Therefore, for the user, a display device as in the above embodiment can display a WiFi account to the outside.
- the detector 340 is a component used by the first controller of the display device to collect signals from the external environment or interact with the outside.
- the detector 340 may include a light receiver 342, a sensor used to collect the intensity of ambient light, which can adaptively display parameter changes by collecting ambient light, etc.; it may also include an image collector 341, such as a camera, a camera, etc., which can be used to collect external Environmental scenes, as well as gestures used to collect attributes of users or interact with users, can adaptively change display parameters, and can also recognize user gestures to achieve the function of interaction with users.
- the external device interface 350 provides components for data transmission between the controller 310 and the second controller or other external devices.
- the external device interface can be connected to external devices such as set-top boxes, game devices, notebook computers, etc., in a wired/wireless manner.
- the controller 310 controls the work of the display device 200 and responds to user operations by running various software control programs (such as installed third-party applications, etc.) stored on the memory 390 and interacting with the second controller.
- the controller 310 includes a read-only memory ROM313, a random access memory RAM314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus.
- the ROM 313 and the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected by a bus.
- The CPU processor 312 runs the system startup instructions in the ROM 313 and copies the operating system stored in the memory 390 into the RAM 314 so as to start running the operating system. After the operating system is started, the CPU processor 312 copies the various application programs in the memory 390 into the RAM 314 and then starts running the various application programs.
- The CPU processor 312 is used to execute the operating system and application instructions stored in the memory 390, to communicate with the second controller and transmit and exchange signals, data, and instructions with it, and to receive the various interactive instructions input from outside so as to execute various applications, data, and content and finally display and play various audio and video content.
- the communication interface may include the first interface 318-1 to the nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or network interfaces connected to the second controller via a network.
- The controller 310 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
- The graphics processor 316 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit and displays the rendering result on the display 280.
- Both the graphics processor 316 of the first controller and the graphics processor 216 of the second controller can generate various graphics objects. The difference is that, if application 1 is installed in the first controller and application 2 is installed in the second controller, then when the user is in the interface of application 1 and inputs instructions within application 1, the graphics processor 316 of the first controller generates the graphics objects, and when the user is in the interface of application 2 and inputs instructions within application 2, the graphics processor 216 of the second controller generates the graphics objects.
- Fig. 5 exemplarily shows a schematic diagram of a functional configuration of a display device according to an exemplary embodiment.
- The memory 390 of the first controller and the memory 290 of the second controller are respectively used to store the operating system, application programs, content, user data, and the like, and the system operation of driving the display device 200 and the responses to the various operations of the user are executed under the control of the controller 310 of the first controller and the controller 210 of the second controller.
- the memory 390 of the first controller and the memory 290 of the second controller may include volatile and/or nonvolatile memory.
- The memory 290 is specifically used to store the operating program that drives the controller 210 in the display device 200, the various application programs built into the display device 200, the various application programs downloaded by the user from external devices, the various graphical user interfaces related to the applications, the various objects related to the graphical user interfaces, user data information, and the various internal data supporting the applications.
- the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, as well as to store input video data and audio data, and other user data.
- The memory 290 is specifically used to store the driver programs and related data of components such as the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner and demodulator 220, and the input/output interfaces.
- the memory 290 may store software and/or programs.
- the software programs used to represent an operating system (OS) include, for example, kernels, middleware, application programming interfaces (APIs), and/or application programs.
- The kernel may control or manage system resources or functions implemented by other programs (such as the middleware, APIs, or application programs), and may provide interfaces to allow the middleware, APIs, or applications to access the controller, so as to control or manage system resources.
- The memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907, a communication control module 2908, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and the like.
- The controller 210 executes the various software programs in the memory 290 to realize functions such as broadcast television signal reception and demodulation, TV channel selection control, volume selection control, image control, display control, audio control, external command recognition, communication control, optical signal receiving, power control, a software control platform supporting these functions, and browser functions.
- The memory 390 stores various software modules used to drive and control the display device 200.
- various software modules stored in the memory 390 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules. Since the functions of the memory 390 and the memory 290 are relatively similar, please refer to the memory 290 for related parts, which will not be repeated here.
- The memory 390 includes an image control module 3904, an audio control module 3906, an external command recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and so on.
- The controller 310 executes the various software programs in the memory 390 to realize functions such as image control, display control, audio control, external command recognition, communication control, optical signal receiving, power control, a software control platform supporting these functions, and browser functions.
- the external command recognition module 2907 of the second controller and the external command recognition module 3907 of the first controller can recognize different commands.
- The external command recognition module 3907 of the first controller may include a graphic recognition module 3907-1; the graphic recognition module 3907-1 stores a graphic database, and when the camera receives a graphic instruction from the outside, the instruction is matched against the instructions in the graphic database so as to control the display device.
- The external command recognition module 2907 of the second controller may include a voice recognition module 2907-2; the voice recognition module 2907-2 stores a voice database, and when the voice receiving device receives an external voice instruction, the instruction is matched against the instructions in the voice database so as to control the display device.
- Similarly, the control device 100, such as a remote controller, is connected to the second controller, and a key command recognition module performs command interaction with the control device 100.
- Fig. 6a exemplarily shows a configuration block diagram of the software system in the display device 200 according to an exemplary embodiment.
- The operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks, and acts as a medium for data processing between application programs and hardware components.
- part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
- part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
- The device driver may contain code to operate video, audio, and/or other multimedia components; examples include display, camera, Flash, WiFi, and audio drivers.
- the accessibility module 2911-1 is used to modify or access the application program to realize the accessibility of the application program and the operability of its display content.
- the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
- the user interface module 2911-3 is used to provide objects that display the user interface for access by various applications, and can realize user operability.
- the control application 2911-4 is used to control process management, including runtime applications, etc.
- The event transmission system 2914 can be implemented in the operating system 2911 or in the application 2912. In some embodiments, it is implemented both in the operating system 2911 and in the application program 2912, and is used to monitor various user input events and, according to the recognition results of the various events or sub-events, carry out one or more sets of pre-defined operation procedures in response.
- the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
- The event recognition module 2914-2 is used to define the various events for the various user input interfaces, recognize the various events or sub-events, and transmit them to the processing that executes the corresponding one or more groups of handlers.
- the event or sub-event refers to the input detected by one or more sensors in the display device 200 and the input of an external control device (such as the control device 100, etc.).
- For example, the events include various sub-events of voice input, gesture input sub-events of gesture recognition, and sub-events of remote control key command input of the control device.
- Sub-events of one or more keys on the remote control take various forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the OK key, and long-pressing a key, as well as operations of non-physical buttons, such as moving, pressing, and releasing.
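As an illustration of how remote-control key sub-events might be mapped to volume operations by such an event system, the sketch below dispatches volume-up/volume-down key codes to a caller-supplied handler. It assumes Android-style key codes; the class and handler are illustrative and not part of this application.

```java
// Minimal sketch of dispatching remote-control volume key sub-events to a
// volume handler; assumes Android-style key codes, illustrative only.
import android.view.KeyEvent;
import java.util.function.IntConsumer;

public class RemoteKeyDispatcher {
    private final IntConsumer volumeStep;   // receives +1 / -1 volume steps

    public RemoteKeyDispatcher(IntConsumer volumeStep) {
        this.volumeStep = volumeStep;
    }

    /** Returns true when the key sub-event was consumed as a volume operation. */
    public boolean onKeyDown(int keyCode) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_VOLUME_UP:
                volumeStep.accept(+1);
                return true;
            case KeyEvent.KEYCODE_VOLUME_DOWN:
                volumeStep.accept(-1);
                return true;
            default:
                return false;   // other sub-events are handled elsewhere
        }
    }
}
```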
- The interface layout management module 2913 directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914 and is used to update the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, as well as the size, position, level, and other operations related to the layout of the containers of the interface.
- the application layer of the display device includes various application programs that can be executed on the display device 200.
- the application layer 2912 of the second controller may include, but is not limited to, one or more applications, such as video-on-demand applications, application centers, game applications, and so on.
- The application layer 3912 of the first controller may include, but is not limited to, one or more applications, such as a live TV application, a media center application, and so on. It should be noted that the application programs contained in the first controller and the second controller are determined by the operating system and other designs, and this application does not specifically define or divide the application programs contained in the first controller and the second controller.
- Live TV applications can provide live TV through different sources.
- a live TV application can provide TV signals using input from cable TV, over-the-air broadcasting, satellite services, or other types of live TV services.
- the live TV application can display the video of the live TV signal on the display device 200.
- Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, VOD provides video display from certain storage sources. For example, video on demand can come from the server side of cloud storage, and from the local hard disk storage that contains stored video programs.
- Media center applications can provide various multimedia content playback applications.
- the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
- Application center can provide storage of various applications.
- the application program may be a game, an application program, or some other application program that is related to a computer system or other device but can be run on a display device.
- the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
- Since independent operating systems may be installed in the first controller and the second controller, there are two independent but interrelated subsystems in the display device 200.
- For example, both the first controller and the second controller can be independently installed with Android and various APPs, so that each chip can realize a certain function, and the first controller and the second controller can also cooperate to realize a certain function.
- FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
- the user interface includes multiple view display windows, for example, a first view display window 201 and a play screen 202, where the play screen includes one or more different items laid out.
- the user interface also includes a selector indicating that the item is selected, and the position of the selector can be moved through user input to change the selection of different items.
- multiple view display windows can present display screens of different levels.
- the first view display window can present the content of the video chat item
- the second view display window can present the content of the application layer item (eg, webpage video, VOD display, application screen, etc.).
- the presentation of different view display windows may differ in priority; view display windows of different priorities are displayed with different display priorities.
- the priority of the system layer is higher than the priority of the application layer.
- when the user uses the selector and switches screens at the application layer, the screen display of the system-layer view display window is not blocked; and when the size and position of the application-layer view display window are changed according to the user's selection, the size and position of the system-layer view display window are not affected.
- display screens of the same level can also be presented; in this case, the selector can switch between the first view display window and the second view display window, and when the size and position of the first view display window change, the size and position of the second view display window may change accordingly.
- in the watching and chatting scene, the display device plays at least two channels of sound at the same time.
- FIG. 8 exemplarily shows a user interface in a watching and chatting scenario.
- the display device simultaneously plays a video program and makes a voice call with three terminal users.
- the display device plays the video program in full screen, and the voice call window is suspended on the video playback screen in the form of a small window.
- the watching and chatting scene is not limited to the exemplarily shown scene of watching a video program and making a voice call, but also includes a scene of listening to an audio program and making a video call.
- if a video program or audio program is paused during the voice call, or the display presents a static user interface instead of a dynamic video picture, since the audio output channel of the video program remains on, this type of scene is also regarded as the aforementioned scene with at least two channels of sound, that is, a watching and chatting scene.
- the audio and video programs played by the display device may be live TV programs or network programs.
- the display device can conduct multi-channel video chats with multiple other terminal devices while playing audio and video programs.
- the display device can play more than 2 sound signals at the same time.
- the controller can receive at least two kinds of audio data: one is the audio data of the audio and video program, where the audio and video program further includes live TV programs and network programs, and the other is the audio data of the voice call.
- the controller uses the audio processor 260 to decompress and decode the aforementioned at least two channels of audio data according to the standard codec protocol of the input signal, and to perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification; the processed sound signals are then superimposed and sent to the audio output interface 270 (such as the speaker 272), so that the sound of the audio and video program and the sound of the voice call are finally output through the audio output interface 270, for example, played through the speaker 272.
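- As a rough sketch of the superposition just described, the code below applies an independent gain to each decoded PCM buffer before summing the samples; the class and method names (AudioMixerSketch, mix) and the 16-bit PCM format are illustrative assumptions, not the actual firmware of the audio processor 260.

```java
// Minimal sketch of mixing two decoded PCM streams with independent gains.
// All names (mix, programGain, callGain) are illustrative assumptions,
// not the actual implementation of audio processor 260.
public final class AudioMixerSketch {

    /** Mixes two 16-bit PCM buffers into one output buffer. */
    static short[] mix(short[] programPcm, short[] callPcm,
                       double programGain, double callGain) {
        int length = Math.min(programPcm.length, callPcm.length);
        short[] out = new short[length];
        for (int i = 0; i < length; i++) {
            // Apply each channel's own gain, then superimpose the samples.
            double sample = programPcm[i] * programGain + callPcm[i] * callGain;
            // Clamp to the 16-bit range to avoid wrap-around distortion.
            sample = Math.max(Short.MIN_VALUE, Math.min(Short.MAX_VALUE, sample));
            out[i] = (short) sample;
        }
        return out; // would then be handed to the audio output interface (speaker 272)
    }

    public static void main(String[] args) {
        short[] program = {1000, -2000, 3000};
        short[] call    = {500, 500, -500};
        short[] mixed = mix(program, call, 1.0, 1.1); // call channel 10% louder
        System.out.println(java.util.Arrays.toString(mixed));
    }
}
```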
- in the above watching and chatting scene, the user can operate the display device normally by operating the control device.
- the user can adjust the output volume value of the sound signal by operating the control device, and the power amplifier manager controls the gain of the sound signal according to the output volume value set by the user.
- the output volume of the sound signal can be adjusted by operating a physical volume button (volume +, volume -) or virtual volume button on a remote control or a mobile terminal, or voice input.
- when the display shows the user interface as shown in FIG. 8, if the controller receives an instruction to adjust the volume input by the user through the control device 100, the controller responds to the instruction and displays, on the upper layer of the user interface, an interface element indicating the current output volume.
- the interface element may be a volume adjustment bar as shown in FIG. 10. The user can know the current output volume value of the sound signal according to the volume value shown in the volume adjustment bar, and the power amplifier manager controls the gain of the sound signal according to the output volume value.
- since the power amplifier manager in the audio processor adjusts the gains of the TV program sound signal and the voice call sound signal according to the same output volume value and superimposes them for playback, the output volume of the TV program and the output volume of the voice call are the same. As a result, from the user's point of view, in the "watching and chatting" scene the two channels of sound are mixed together and interfere with each other, making it hard for the user to tell them apart.
- to this end, the embodiments of the present application propose an interaction design for volume adjustment in the watching and chatting scene; FIGS. 8 and 11-13 exemplarily show schematic diagrams of the volume adjustment interaction process in this scene.
- when audio and video program playback and a voice call proceed simultaneously as shown in FIG. 8, the user can input an instruction to adjust the volume by operating the control device, and the controller responds to the instruction by presenting a volume setting interface on the display.
- the volume setting interface displays interface elements for representing the output volume of audio and video programs, and also displays volume setting items for correlating the output volume of voice calls with the output volume of audio and video programs.
- FIG. 11 exemplarily shows a volume setting interface.
- the volume setting interface is displayed as a floating view window on the upper layer of the audio and video program playback screen and the voice call window.
- the volume setting interface includes a volume adjustment bar 111 and volume setting items 112-114; the volume adjustment bar indicates the output volume value of the audio and video program, and the volume setting items 112-114 are "standard mode" 112, "loud mode" 113, and "AI silent mode" 114, respectively.
- the volume setting item is used to associate the output volume of the voice call with the output volume of the audio and video program.
- specifically, the user directly adjusts the output volume of the audio and video program by operating the control device, and the controller then adjusts the output volume of the voice call in association according to the output volume value of the audio and video program, so that the two output volumes are different.
- for example, the controller adjusts the output volume value of the voice call to a second volume according to a first volume, the first volume being the output volume value of the audio and video program.
- the first volume is related to the second volume, and the first volume ≠ the second volume.
- each volume setting item is preset with an adjustment coefficient, and different volume setting items correspond to different adjustment coefficients.
- the adjustment coefficient is used to multiply the output volume value of the audio and video program to obtain the output volume value of the voice call, so that the output volume of the audio and video program is different from the output volume of the voice call.
- of course, when the adjustment coefficient corresponding to a certain volume setting item is 1, the output volume value of the voice call does not change relative to the output volume value of the audio and video program; in that case the output volume value of the audio and video program and the output volume value of the voice call are the same.
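- The association rule can be read as "call volume = program volume × coefficient". The sketch below illustrates it under assumed conditions: a 0-100 volume scale and rounding/clamping behavior that the text does not specify.

```java
// Sketch of the associated-volume rule: callVolume = programVolume * coefficient.
// The 0-100 volume range and the rounding are assumptions for illustration.
public final class AssociatedVolume {

    static int callVolumeFor(int programVolume, double coefficient) {
        int callVolume = (int) Math.round(programVolume * coefficient);
        return Math.max(0, Math.min(100, callVolume)); // keep inside a 0-100 scale
    }

    public static void main(String[] args) {
        System.out.println(callVolumeFor(50, 1.1)); // 55 -> voice call slightly louder
        System.out.println(callVolumeFor(50, 1.0)); // 50 -> both volumes identical
        System.out.println(callVolumeFor(50, 0.0)); // 0  -> voice call muted
    }
}
```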
- the three volume setting items 112-114 shown in FIG. 11 each correspond to an adjustment coefficient.
- the adjustment coefficient corresponding to the "standard mode" 112 may be 1.1, which means that during a voice call the human voice volume value of the peer user is increased by 10% relative to the playback volume value of the audio and video program.
- the adjustment coefficient corresponding to the "loud mode" 113 may be 1.2, which means that during a video chat the human voice volume value of the peer user is increased by 20% relative to the playback volume value of the audio and video program.
- the adjustment coefficient corresponding to the "AI mute mode" 114 may be 0, which means that the human voice of the peer user is muted during the voice call.
- when the display shows the volume setting interface as shown in FIG. 11, the user can continue to operate the volume key (volume + or volume -) on the control device to input an instruction to adjust the volume; the controller responds to the instruction, obtains the pre-saved default item among the volume setting items, and then adjusts the output volume of the audio and video program and the output volume of the voice call in association according to the default item.
- for example, when the "standard mode" 112 shown in FIG. 11 is the selected default item, the controller obtains the adjustment coefficient 1.1 corresponding to the "standard mode" 112, and then adjusts the output volume of the voice call to a second volume according to the first volume, which is the output volume value of the audio and video program, where the second volume = the first volume × 1.1.
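- One plausible way to keep the per-mode coefficients and apply the default item is a simple lookup table, as sketched below; the Mode enum, the coefficient map, and the applyDefaultMode helper are illustrative assumptions mirroring items 112-114, not the device's actual data model.

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch: look up the coefficient of the default (currently selected) mode and
// derive the voice-call volume whenever the program volume changes.
// Mode names and coefficient values mirror the "standard"/"loud"/"AI mute"
// examples in the text; the class itself is an illustrative assumption.
public final class VolumeModeTable {

    enum Mode { STANDARD, LOUD, AI_MUTE }

    private static final Map<Mode, Double> COEFFICIENTS = new EnumMap<>(Mode.class);
    static {
        COEFFICIENTS.put(Mode.STANDARD, 1.1); // voice 10% above program volume
        COEFFICIENTS.put(Mode.LOUD, 1.2);     // voice 20% above program volume
        COEFFICIENTS.put(Mode.AI_MUTE, 0.0);  // voice muted, text shown as barrage
    }

    static int applyDefaultMode(Mode defaultMode, int programVolume) {
        double coefficient = COEFFICIENTS.get(defaultMode);
        return (int) Math.round(programVolume * coefficient);
    }

    public static void main(String[] args) {
        // Volume key raised the program volume to 60 while "standard mode" is the default.
        System.out.println(applyDefaultMode(Mode.STANDARD, 60)); // 66
    }
}
```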
- when the display shows the volume setting interface as shown in FIG. 11, the user can select a certain volume setting item by operating the control device, and the controller adjusts the output volume of the audio and video program and the output volume of the voice call in association according to the selected volume setting item, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
- the "standard” mode and “sounding” mode can achieve the effect of human voice enhancement, so that in the watching and chatting scene, the sound of the audio and video program is played as the background sound, and the sound of the voice call is the foreground The sound is played, so that the user can easily distinguish the source of the sound and avoid confusion.
- the adjustment coefficients of the "standard mode" 112 and the "loud mode" 113 are not limited to 1.1 or 1.2; in other embodiments, the coefficients of the "standard mode" 112 and the "loud mode" 113 may also be preset to any other values.
- the volume setting items displayed in the volume setting interface are not limited to the above three types. In other embodiments, other volume setting items may be provided for the user to choose from according to user needs.
- a voice attenuation effect can also be achieved by adjusting the adjustment coefficient corresponding to a volume setting item. For example, when the adjustment coefficient corresponding to a certain volume setting item is 0.5, the human voice volume value of the peer user is reduced by 50% relative to the playback volume value of the audio and video program, thereby achieving the effect of weakening the human voice.
- according to the embodiments provided by this application, in the watching and chatting scene, by adjusting in association the volumes of the at least two channels of sound played simultaneously by the display device, the at least two channels of sound can be played at different output volumes, so that the user can easily distinguish the source of each sound.
- the adjustment coefficient corresponding to the "AI silent mode” shown in FIG. 11 can be 0, which means that the voice volume of the video chatting person is reduced to 0.
- the "AI Mute Mode” when selected (ie turned on), the text corresponding to the call audio data can be posted on the playback screen in the form of a barrage The upper level.
- for example, when the preset default mode is the "AI silent mode" 114 shown in FIG. 11, or when the user selects the "AI silent mode" 114 by operating the control device, the controller stops playing the sound signal of the voice call and presents the text corresponding to the voice call data of the peer device on the top layer of the user interface in the form of a barrage, as shown in FIG. 12.
- the following takes the display device 200B as an example to describe the implementation of the above-mentioned "AI silent mode" 114, where the display device 200B conducts a voice call with one or more other terminal devices while playing audio and video programs.
- An exemplary description will be given below in conjunction with the voice call communication process between the display device 200B and the display device 200A.
- the text corresponding to the voice call data is converted by the data sending end according to the call data collected by it, and sent to the data receiving end for display.
- specifically, the display device 200A collects the user's call data A through the microphone; when the display device 200B has not turned on the "AI silent mode" 114, the display device 200A sends the collected call data A to the display device 200B; when the display device 200B has turned on the "AI silent mode" 114, the display device 200A synchronizes the collected call data A to the voice server, which recognizes and converts the call data A to obtain the text A corresponding to the call data A, and then sends the text A to the display device 200B.
- for the display device 200B, when the "AI silent mode" 114 is not turned on, it receives the call data A sent by the display device 200A, extracts the voice call sound signal from the call data A, processes the voice call sound signal through the power amplifier manager, and plays the processed sound signal through the speaker; when it receives the user's operation of selecting the "AI silent mode" 114, it notifies the display device 200A; when the "AI silent mode" 114 is turned on, the display device 200B receives the text A sent by the display device 200A and displays the text A on the top layer of the user interface in real time, thereby achieving the display effect shown in FIG. 11.
- it can be seen from the above example that when the display device 200B receives the user's operation of selecting the "AI silent mode" 114 on the volume setting interface shown in FIG. 11, it notifies the display device 200A, so that the display device 200A converts the call data A it collects into text A before sending it to the display device 200B.
- the data receiving end recognizes and converts the received call data to obtain the corresponding text and display it.
- specifically, the display device 200A collects the call data A of the local user through a microphone and sends the collected call data A to the display device 200B.
- for the display device 200B, when the "AI silent mode" 114 is not turned on, it receives the call data A sent by the display device 200A, extracts the voice call sound signal from the call data A, processes the voice call sound signal through the power amplifier manager, and plays it through the speaker.
- when the "AI silent mode" 114 is turned on on the display device 200B, the display device 200B receives the call data A sent by the display device 200A, synchronizes the received call data A to the voice server, recognizes and converts the call data A through the voice server to obtain the text A corresponding to the call data A, and then displays the text A on the top layer of the user interface.
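- A high-level sketch of the receiver-side branch described above is given below; SpeechServer, BarrageView, and AudioPipeline are hypothetical interfaces standing in for the voice server, the on-screen barrage layer, and the power amplifier path, not real platform APIs.

```java
// Sketch of display device 200B's handling of incoming call data A.
public final class AiMuteReceiverSketch {

    interface SpeechServer  { String transcribe(byte[] callData); }
    interface BarrageView   { void post(String text); }
    interface AudioPipeline { void playVoice(byte[] callData); }

    private final SpeechServer speechServer;
    private final BarrageView barrageView;
    private final AudioPipeline audioPipeline;
    private volatile boolean aiMuteEnabled;

    AiMuteReceiverSketch(SpeechServer s, BarrageView b, AudioPipeline a) {
        this.speechServer = s;
        this.barrageView = b;
        this.audioPipeline = a;
    }

    void setAiMuteEnabled(boolean enabled) { this.aiMuteEnabled = enabled; }

    /** Called for each chunk of call data received from the peer device. */
    void onCallData(byte[] callData) {
        if (aiMuteEnabled) {
            // Recognize the speech via the voice server and show it as a barrage;
            // the display lags by roughly the recognition delay Ta.
            String text = speechServer.transcribe(callData);
            barrageView.post(text);
        } else {
            // Normal path: extract and play the voice-call sound signal.
            audioPipeline.playVoice(callData);
        }
    }

    public static void main(String[] args) {
        AiMuteReceiverSketch receiver = new AiMuteReceiverSketch(
                data -> "hello from the peer",                      // fake transcription
                text -> System.out.println("barrage: " + text),
                data -> System.out.println("playing " + data.length + " bytes of voice"));
        receiver.onCallData(new byte[160]);   // plays the voice audio
        receiver.setAiMuteEnabled(true);
        receiver.onCallData(new byte[160]);   // shows barrage text instead
    }
}
```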
- when the display device 200B has turned on the "AI silent mode" 114, the display of the text A has a certain time delay Ta, which is at least the length of time required for the display device 200A or the display device 200B to recognize and output the text A through the voice server.
- Figure 13 exemplarily shows another volume setting interface.
- unlike the interface shown in FIG. 11, the interface shown in FIG. 13 also includes a "volume association adjustment switch" 115 for turning the above-mentioned volume association adjustment function on or off; the user can enable or disable the function by operating this item. Specifically, if the volume association adjustment function is enabled, each volume setting item in the interface is an operable item and can therefore be selected by the user; if the function is disabled, the volume setting items are inoperable items and therefore cannot be selected by user operations.
- in the scenarios shown in FIG. 11, 12 or 13, the user can cancel the display of the volume setting interface by operating buttons on the control device, such as the "return" button or the "exit" button, to return to the interface shown in FIG. 8.
- FIG. 14 is a flowchart of a volume control method exemplarily shown in some embodiments of the application. As shown in FIG. 14, the method may include:
- Step 01 When the audio and video program is played and the voice call is in progress at the same time, an instruction to adjust the volume is received from the user.
- the embodiment of the present application proposes the concept of a sound playing scene, and the sound playing scene of a display device includes a watching and chatting scene and a normal scene.
- the watching and chatting scene refers to a scene that includes two or more sound signal outputs, for example, a scene in which audio and video program playback and a voice call proceed at the same time.
- in some embodiments, when a user instruction to adjust the volume is received, it is determined whether the current sound playback scene is a watching and chatting scene; if it is a watching and chatting scene, step 02 is executed; if it is not a watching and chatting scene, that is, it is a normal scene, an interface as shown in FIG. 10 is presented, and the volume of the audio and video program played by the display device is adjusted according to the user's input.
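- The branch between the two scenes might be expressed as in the sketch below; isChatWindowRunning() and the two handler methods are illustrative placeholders rather than real platform calls.

```java
// Sketch of the scene check before step 02.
public final class SceneDispatchSketch {

    static boolean isChatWindowRunning() {
        return true; // placeholder: a real device would query its window manager
    }

    static void onVolumeInstruction(int requestedProgramVolume) {
        if (isChatWindowRunning()) {
            // Watching-and-chatting scene: present the volume setting interface (step 02).
            showVolumeSettingInterface(requestedProgramVolume);
        } else {
            // Normal scene: present the plain volume bar of FIG. 10 and adjust directly.
            showVolumeBarAndAdjust(requestedProgramVolume);
        }
    }

    static void showVolumeSettingInterface(int volume) {
        System.out.println("volume setting interface, program volume " + volume);
    }

    static void showVolumeBarAndAdjust(int volume) {
        System.out.println("plain volume bar, volume " + volume);
    }

    public static void main(String[] args) {
        onVolumeInstruction(40);
    }
}
```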
- the user can input an instruction to adjust the volume by operating the control device. For example, when the user presses a physical volume button for increasing or decreasing the volume on the remote control 100A, the controller receives an instruction to increase or decrease the volume sent by the remote control. For another example, when the user clicks a virtual volume button used to increase or decrease the volume on the mobile terminal 100B, the controller receives an instruction to increase or decrease the volume sent by the mobile terminal.
- the user can also input voice instructions for increasing or decreasing the volume through the remote control, the display device, or the microphone on the mobile terminal. The user can also input an instruction to increase or decrease the volume by pressing the local volume button on the housing of the display device for increasing or decreasing the volume.
- Step 02 In response to the instruction to adjust the volume, a volume setting interface is presented on the display.
- the volume setting interface includes an interface element for indicating the output volume of the audio and video program, and a volume setting item for associating the output volume of the voice call with the output volume of the audio and video program.
- the volume setting interface involved in step 02 may be the volume setting interface shown in FIG. 11; the interface element used to indicate the output volume of the audio and video program may be the volume adjustment bar 111 in FIG. 11, and the volume setting items may be the items 112-114 in FIG. 11.
- one volume setting item is preset with one adjustment coefficient, and different volume setting items correspond to different adjustment coefficients.
- the adjustment coefficient is used to multiply the output volume value of the audio and video program to obtain the output volume value of the voice call, so that the output volume of the audio and video program is different from the output volume of the voice call.
- of course, when the adjustment coefficient corresponding to a certain volume setting item is 1, the output volume value of the voice call does not change relative to the output volume value of the audio and video program; in that case the output volume value of the audio and video program and the output volume value of the voice call are the same.
- Step 03 In response to the user's selection operation on a volume setting item, the output volume of the audio and video program and the output volume of the voice call are adjusted in association according to the selected volume setting item, so that the adjusted output volume of the voice call is different from the adjusted output volume of the audio and video program.
- in this application, the audio stream sources of the sound signals include TV program audio based on physical-type channels (ATV, DTV, HDMI, etc.) and other audio; the stream types of the other audio are mainly network program audio based on network-type channels (STREAM_MUSIC) and call audio (STREAM_VOICE_CALL).
- TV program audio based on physical type channels (ATV, DTV, HDMI, etc.) and network program audio based on network type channels (STREAM_MUSIC) are the audio of audio and video programs
- call audio (STREAM_VOICE_CALL) is the audio of voice calls.
- for sound signals of different types/audio stream sources, the output volume value can be adjusted independently, so that the gain of each signal is controlled independently according to its corresponding output volume value.
- the output volume control mainly includes two types of branches, the main volume MainVoice and the sub-volume SubVoice.
- the sub-volume SubVoice further includes the first sub-volume MusicVoice and the second sub-volume CallVoice, where the main volume MainVoice is the volume corresponding to the TV program, the first sub-volume MusicVoice is the volume corresponding to the network program, and the second sub-volume CallVoice is the volume corresponding to the voice call.
- specifically, the audio stream source of the audio and video program is obtained, the audio stream source being a TV program based on a physical-type channel or a network program based on a network-type channel. If the audio stream source of the audio and video program is a TV program, the main volume value (MainVoice) is adjusted to the first volume, the main volume value being the output volume value of the TV program; if the audio stream source of the audio and video program is a network program, the first sub-volume value (MusicVoice) is adjusted to the first volume, the first sub-volume value being the volume value of the network program.
- then, the first volume is multiplied by the adjustment coefficient corresponding to the selected volume setting item to obtain the second volume, which is the target volume of the voice call; the second sub-volume value (CallVoice) is then adjusted to the second volume, the second sub-volume value being the volume value of the voice call.
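- The routing of the first volume to the matching branch and the derivation of CallVoice could look like the following sketch; the Source enum, the Volumes holder, and the adjust method are assumptions modeled on the MainVoice/MusicVoice/CallVoice split, not actual platform calls.

```java
// Sketch of routing the "first volume" to MainVoice or MusicVoice depending on
// the audio stream source, then deriving CallVoice from it.
public final class VolumeRoutingSketch {

    enum Source { TV_PROGRAM, NETWORK_PROGRAM }

    static final class Volumes {
        int mainVoice;   // TV program volume (physical channels: ATV/DTV/HDMI)
        int musicVoice;  // network program volume (STREAM_MUSIC-like path)
        int callVoice;   // voice call volume (STREAM_VOICE_CALL-like path)
    }

    static void adjust(Volumes volumes, Source source, int firstVolume, double coefficient) {
        // Step 1: set the program branch that matches the audio stream source.
        if (source == Source.TV_PROGRAM) {
            volumes.mainVoice = firstVolume;
        } else {
            volumes.musicVoice = firstVolume;
        }
        // Step 2: derive the voice-call target volume from the program volume.
        volumes.callVoice = (int) Math.round(firstVolume * coefficient);
    }

    public static void main(String[] args) {
        Volumes v = new Volumes();
        adjust(v, Source.NETWORK_PROGRAM, 50, 1.2); // "loud mode" while streaming
        System.out.println(v.musicVoice + " / " + v.callVoice); // 50 / 60
    }
}
```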
- in this way, the at least two channels of sound can be played at different output volumes, so that the user can easily distinguish the source of each sound.
- in some embodiments, when the display shows the volume setting interface as shown in FIG. 11, if an instruction to adjust the volume input by the user is received, the default item pre-saved among the volume setting items is obtained, and the output volume of the audio and video program and the output volume of the voice call are then adjusted in association according to the default item.
- in other embodiments, in response to the user's instruction to adjust the volume, the default item pre-saved among the volume setting items is obtained while the volume setting interface is presented, and the output volume of the audio and video program and the output volume of the voice call are then adjusted in association according to the default item.
- the default item may be the volume setting item selected by the user in the last operation, or the system default volume setting item.
- in a specific implementation, each volume setting item can be marked with a state; for example, the volume setting item selected by the user or the system default volume setting item is marked as selected, and the remaining volume setting items are marked as unselected.
- the default item can be obtained by traversing the mark state of each volume setting item.
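- The traversal of the marked states might look like the sketch below; the SettingItem record and its selected flag are assumed bookkeeping, not the device's real data structures.

```java
import java.util.List;
import java.util.Optional;

// Sketch: find the pre-saved default item by scanning each volume setting
// item's selection mark. SettingItem and its fields are illustrative.
public final class DefaultItemLookup {

    record SettingItem(String name, double coefficient, boolean selected) {}

    static Optional<SettingItem> findDefault(List<SettingItem> items) {
        return items.stream().filter(SettingItem::selected).findFirst();
    }

    public static void main(String[] args) {
        List<SettingItem> items = List.of(
                new SettingItem("standard mode", 1.1, true),   // last user choice
                new SettingItem("loud mode", 1.2, false),
                new SettingItem("AI mute mode", 0.0, false));
        findDefault(items).ifPresent(item ->
                System.out.println("default: " + item.name() + " x" + item.coefficient()));
    }
}
```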
- in other embodiments, before the default item is obtained, it is first determined whether the volume association adjustment function shown in FIG. 13 is enabled; if it is enabled, the default item is obtained; if it is not enabled, the process ends.
- in some embodiments, when a mute instruction input by the user through the control device is received in the watching and chatting scene, the output volume of the audio and video program and the output volume of the voice call are adjusted to 0 at the same time.
- in some embodiments, when an AI mute instruction input by the user is received in the watching and chatting scene, in response to the AI mute instruction, playback of the sound signal of the voice call is stopped, and the text corresponding to the call data is posted on the upper layer of the playback screen in the form of a barrage. For example, when the display shows the volume setting interface as shown in FIG. 11, the user can select the "AI mute mode" 114 by operating the control device to input an AI mute instruction.
- the text corresponding to the call data and the user information corresponding to the call data are obtained respectively, and the user information is used to characterize the user account that sends the call data, such as user nickname, user ID, user avatar, etc.
- the barrage text is generated according to the text corresponding to the call data and the user information; and then the barrage text carrying the user information mark is displayed on the upper layer of the playback screen presented on the display.
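- Composing the barrage line from the recognized text and the sender's account information could be as simple as the sketch below; the UserInfo fields and the formatBarrage helper are assumptions based on the examples given (nickname, user ID), not a real API.

```java
// Sketch: build the barrage string shown on the upper layer of the playback
// screen from the recognized text plus the sender's account information.
public final class BarrageTextSketch {

    record UserInfo(String nickname, String userId) {}

    static String formatBarrage(UserInfo user, String recognizedText) {
        // Prefix the text with a mark identifying who sent the call data.
        return "[" + user.nickname() + "] " + recognizedText;
    }

    public static void main(String[] args) {
        UserInfo peer = new UserInfo("Alice", "u-1001");
        System.out.println(formatBarrage(peer, "Did you see that goal?"));
    }
}
```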
- in a specific implementation, the present invention also provides a computer storage medium, wherein the computer storage medium can store a computer program, and when at least one controller/processor of the display device executes the computer program, the controller/processor executes some or all of the steps shown in FIG. 14.
- the storage medium may be a magnetic disk, an optical disc, a read-only memory (English: read-only memory, abbreviated as: ROM) or a random access memory (English: random access memory, abbreviated as: RAM), etc.
- the technology in the embodiments of the present invention can be implemented by means of software plus a necessary general hardware platform.
- the technical solutions in the embodiments of the present invention can be embodied in the form of software products, which can be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and include several instructions to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute the methods described in the various embodiments or some parts of the embodiments of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
本申请公开了显示设备和音量调节方法,显示设备同时进行音视频节目播放和语音通话时接收指示调节音量的指令,在显示器上呈现音量设置界面,包括用于将语音通话的输出音量与音视频节目的输出音量相关联的音量设置项目;响应于用户对音量设置项目的选中操作,对音视频节目的输出音量和所述语音通话的输出音量进行关联调节。
Description
本申请涉及显示设备技术领域,尤其涉及一种显示设备和一种音量调节方法。
显示设备可以为用户提供诸如音频、视频、图片等播放画面。如今,为了满足用户个性化需求,显示设备不仅可以为用户提供通过数据广播接收的直播电视节目内容,而且可以为用户提供诸如网络视频、网络游戏等各种应用和服务内容。另外,随着摄像头在显示设备上的应用,视频聊天也成为显示设备的基础功能,进而,同时存在两路音频输出的场景也随之产生。例如,边播放视频节目边进行音视频聊天的场景,或者,边播放音乐边进行音视频聊天的场景。
在提供上述内容及功能的同时,显示设备可以基于用户对诸如遥控器、移动终端等控制装置上的物理硬键或虚拟键的操作,而被控制实现上述功能,也可以通过自身的麦克风或控制装置上的麦克风接收的用户输入的语音,而被控制执行上述功能。例如,在显示设备播放节目时,用户通过遥控器上的音量键调节音量。
然而,对于同时存在两路声音输出的场景,如何更好地处理该两路声音,成为功能优化焦点。
发明内容
本申请提供一种显示设备和一种音量调节方法,以解决如何更好地处理该 两路声音的问题。
第一方面,本申请提供一种显示设备,其特征在于,包括:
显示器,用于呈现用户界面,所述用户界面包括至少一个视图显示窗口;
控制器用于:
当音视频节目播放和语音通话同时进行时,接收用户输入的指示调节音量的指令;
响应于所述指示调节音量的指令,在显示器上呈现音量设置界面,所述音量设置界面包括用于将所述语音通话的输出音量与所述音视频节目的输出音量相关联的音量设置项目;
响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。
第二方面,本申请还提供一种音量调节方法,所述方法包括:
当音视频节目播放和语音通话同时进行时,接收用户输入的指示调节音量的指令;
响应于所述指示调节音量的指令,在显示器上呈现音量设置界面,所述音量设置界面包括用于将所述语音通话的输出音量与所述音视频节目的输出音量相关联的音量设置项目;
响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。
为了更清楚地说明本申请的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,对于本领域普通技术人员而言,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1中示例性示出了根据实施例中显示设备与控制装置之间操作场景的示意图;
图2中示例性示出了根据实施例中控制装置100的硬件配置框图;
图3中示例性示出了根据实施例中显示设备200的硬件配置框图;
图4中示例性示出了根据图3显示设备200的硬件架构框图;
图5中示例性示出了根据实施例中显示设备200的功能配置示意图;
图6a中示例性示出了根据实施例中显示设备200中软件配置示意图;
图6b中示例性示出了根据实施例中显示设备200中应用程序的配置示意图;
图7中示例性示出了根据实施例中显示设备200中用户界面的示意图;
图8示例性示出了一种边看边聊场景下的用户界面;
图9示例性示出了对多路音频数据的处理过程;
图10示例性示出了一种边看边聊场景下的另一用户界面;
图11示例性示出了一种边看边聊场景下的另一用户界面;
图12示例性示出了一种边看边聊场景下的另一用户界面;
图13示例性示出了一种边看边聊场景下的另一用户界面;
图14示例性示出了一种音量控制方法流程图。
为了使本技术领域的人员更好地理解本申请中的技术方案,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都应当属于本申请保护的范围。
本申请实施例提供一种显示设备和音量调节方法。
本申请提供的显示设备,可以是具有多个控制器架构的显示设备,如本申请图3-6示出的具有控制器(双硬件系统)架构的显示设备,也可以是具有非双控制器架构的显示设备,本申请不予限定。
本申请提供的音量调节方法,可以应用于如智能电视的显示设备,当然也可以应用于其他能够提供语音和数据连通功能、并具有无线连接功能的手持式设备,或可以连接到无线调制解调器的其他处理设备,如移动电话(或称为“蜂窝”电话)和具有移动终端的计算机,还可以是便携式、袖珍式、手持式、计算机内置的或车载的移动装置,它们与无线接入网交换数据。
为便于用户使用,显示设备上通常会设置各种外部装置接口,以便于连接不同的外设设备或线缆以实现相应的功能。而在显示设备的接口上连接有高清晰度的摄像头时,如果显示设备的硬件系统没有接收源码的高像素摄像头的硬件接口,那么就会导致无法将摄像头接收到的数据呈现到显示设备的显示屏上。
并且,受制于硬件结构,传统显示设备的硬件系统仅支持一路硬解码资源,且通常最大仅能支持4K分辨率的视频解码,因此当要实现边观看网络电视边进行视频聊天时,为了不降低网络视频画面清晰度,就需要使用硬解码资源(通常是硬件系统中的GPU)对网络视频进行解码,而在此情况下,只能采取由硬 件系统中的通用处理器(例如CPU)对视频进行软解码的方式处理视频聊天画面。
采用软解码处理视频聊天画面,会大大增加CPU的数据处理负担,当CPU的数据处理负担过重时,可能会出现画面卡顿或者不流畅的问题。进一步的,受制于CPU的数据处理能力,当采用CPU软解码处理视频聊天画面时,通常无法实现多路视频通话,当用户想要再同一聊天场景同时与多个其他用户进行视频聊天时,会出现接入受阻的情况。
基于上述各方面的考虑,为克服上述缺陷,本申请公开了一种双硬件系统架构,以实现多路视频聊天数据(至少一路本地视频)。
下面首先结合附图对本申请所涉及的概念进行说明。在此需要指出的是,以下对各个概念的说明,仅为了使本申请的内容更加容易理解,并不表示对本申请保护范围的限定。
本申请各实施例中使用的术语“模块”,可以是指任何已知或后来开发的硬件、软件、固件、人工智能、模糊逻辑或硬件或/和软件代码的组合,能够执行与该元件相关的功能。
本申请各实施例中使用的术语“遥控器”,是指电子设备(如本申请中公开的显示设备)的一个组件,该组件通常可在较短的距离范围内无线控制电子设备。该组件一般可以使用红外线和/或射频(RF)信号和/或蓝牙与电子设备连接,也可以包括WiFi、无线USB、蓝牙、动作传感器等功能模块。例如:手持式触摸遥控器,是以触摸屏中用户界面取代一般遥控装置中的大部分物理内置硬键。
本申请各实施例中使用的术语“手势”,是指用户通过一种手型的变化或手部运动等动作,用于表达预期想法、动作、目的/或结果的用户行为。
本申请各实施例中使用的术语“硬件系统”,可以是指由集成电路(Integrated Circuit,IC)、印刷电路板(Printed circuit board,PCB)等机械、光、电、磁器件构成的具有计算、控制、存储、输入和输出功能的实体部件。在本申请各个实施例中,硬件系统通常也会被称为主板(motherboard)或芯片。
图1中示例性示出了根据实施例中显示设备与控制装置之间操作场景的示意图。如图1所示,用户可通过控制装置100来操作显示设备200。
其中,控制装置100可以是遥控器100A,其可与显示设备200之间通过红外协议通信、蓝牙协议通信、紫蜂(ZigBee)协议通信或其他短距离通信方式进行通信,用于通过无线或其他有线方式来控制显示设备200。用户可以通过遥控器上按键、语音输入、控制面板输入等输入用户指令,来控制显示设备200。如:用户可以通过遥控器上音量加减键、频道控制键、上/下/左/右的移动按键、语音输入按键、菜单键、开关机按键等输入相应控制指令,来实现控制显示设备200的功能。
控制装置100也可以是智能设备,如移动终端100B、平板电脑、计算机、笔记本电脑等,其可以通过本地网(LAN,Local Area Network)、广域网(WAN,Wide Area Network)、无线局域网((WLAN,Wireless Local Area Network)或其他网络与显示设备200之间通信,并通过与显示设备200相应的应用程序实现对显示设备200的控制。例如,使用在智能设备上运行的应用程序控制显示设备200。该应用程序可以在与智能设备关联的屏幕上通过直观的用户界面(UI,User Interface)为用户提供各种控制。
“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常 用的表现形式是图形用户界面(graphicuserinterface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
示例的,移动终端100B与显示设备200均可安装软件应用,从而可通过网络通信协议实现二者之间的连接通信,进而实现一对一控制操作的和数据通信的目的。如:可以使移动终端100B与显示设备200建立控制指令协议,将遥控控制键盘同步到移动终端100B上,通过控制移动终端100B上用户界面,实现控制显示设备200的功能;也可以将移动终端100B上显示的音视频内容传输到显示设备200上,实现同步显示功能。
如图1所示,显示设备200还可与服务器300通过多种通信方式进行数据通信。在本申请各个实施例中,可允许显示设备200通过局域网、无线局域网或其他网络与服务器300进行通信连接。服务器300可以向显示设备200提供各种内容和互动。
示例的,显示设备200通过发送和接收信息,以及电子节目指南(EPG,Electronic Program Guide)互动,接收软件程序更新,或访问远程储存的数字媒体库。服务器300可以是一组,也可以是多组,可以是一类或多类服务器。通过服务器300提供视频点播和广告服务等其他网络服务内容。
显示设备200,可以是液晶显示器、OLED(Organic Light Emitting Diode)显示器、投影显示设备、智能电视。具体显示设备类型,尺寸大小和分辨率等不作限定,本领技术人员可以理解的是,显示设备200可以根据需要做性能和配置上的一些改变。
显示设备200除了提供广播接收电视功能之外,还可以附加提供计算机支持功能的智能网络电视功能。示例的包括,网络电视、智能电视、互联网协议电视(IPTV)等。
在一些实施例中,显示设备可以不具备广播接收电视功能。
如图1所示,显示设备200上可以连接或设置有摄像头,用于将摄像头拍摄到的画面呈现在本显示设备或其他显示设备的显示界面上,以实现用户之间的交互聊天。具体的,摄像头拍摄到的画面可在显示设备上全屏显示、半屏显示、或者显示任意可选区域。
作为一种可选的连接方式,摄像头通过连接板与显示器后壳连接,固定安装在显示器后壳的上侧中部,作为可安装的方式,可以固定安装在显示器后壳的任意位置,能保证其图像采集区域不被后壳遮挡即可,例如,图像采集区域与显示设备的显示朝向相同。
作为另一种可选的连接方式,摄像头通过连接板或者其他可想到的连接器可升降的与显示后壳连接,连接器上安装有升降马达,当用户要使用摄像头或者有应用程序要使用摄像头时,再升出显示器之上,当不需要使用摄像头时,其可内嵌到后壳之后,以达到保护摄像头免受损坏。
作为一种实施例,本申请所采用的摄像头可以为1600万像素,以达到超高清显示目的。在实际使用中,也可采用比1600万像素更高或更低的摄像头。
当显示设备上安装有摄像头以后,显示设备不同应用场景所显示的内容可得到多种不同方式的融合,从而达到传统显示设备无法实现的功能。
示例性的,用户可以在边欣赏音视频节目的同时,与至少一位其他用户(也即至少一台其他终端)进行语音通话。音视频节目的呈现可作为背景画面,音 视频节目的声音可作为背景声音,语音通话的窗口显示在背景画面之上,语音通话的声音可通过显示设备与背景声音同时播放。
形象的,可以将显示设备同时播放上述两路声音的功能称为同时播放的“边看边聊”,将同时存在上述两路声音的场景称为“边看边聊”场景。
在另一些示例性的实施例中,“边看边聊”场景下,可以不展示聊天窗口,而仅仅输出聊天语音。也即显示器播放音视频节目,节目声音和聊天声音同时输出。
在另一些示例性是实施例中,“边看边聊”场景下,当用户触发聊天声音静音的指令时,只输出音视频节目声音,而其他用户的聊天语音转换为文字或弹幕的形式,呈现在显示器上。
在另一些示例性的实施例中,在“边看边聊”的场景中,在观看直播视频或网络视频的同时,与其他终端进行至少一路的视频聊天。
另一示例中,用户可以在边进入教育应用学习的同时,与至少一位其他用户进行视频聊天。例如,学生在学习教育应用程序中内容的同时,可实现与老师的远程互动。形象的,可以称该功能为“边学边聊”。
另一示例中,用户在玩纸牌游戏时,与进入游戏的玩家进行视频聊天。例如,玩家在进入游戏应用参与游戏时,可实现与其他玩家的远程互动。形象的,可以称该功能为“边看边玩”。
可选的,游戏场景与视频画面进行融合,将视频画面中人像进行抠图,显示在游戏画面中,提升用户体验。
可选的,在体感类游戏中(如打球类、拳击类、跑步类、跳舞类等),通过摄像头获取人体姿势和动作,肢体检测和追踪、人体骨骼关键点数据的检测, 再与游戏中动画进行融合,实现如体育、舞蹈等场景的游戏。
另一示例中,用户可以在K歌应用中,与至少一位其他用户进行视频和语音的交互。形象的,可以称该功能为“边看边唱”。优选的,当至少一位用户在聊天场景进入该应用时,可多个用户共同完成一首歌的录制。
另一个示例中,用户可在本地打开摄像头获取图片和视频,形象的,可以称该功能为“照镜子”。
在另一些示例中,还可以再增加更多功能或减少上述功能。本申请对该显示设备的功能不作具体限定。
图2中示例性示出了根据示例性实施例中控制装置100的配置框图。如图2所示,控制装置100包括控制器110、通信器130、用户输入/输出接口140、存储器190、供电电源180。
控制装置100被配置为可控制所述显示设备200,以及可接收用户的输入操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起到用户与显示设备200之间交互中介作用。如:用户通过操作控制装置100上频道加减键,显示设备200响应频道加减的操作。
在一些实施例中,控制装置100可是一种智能设备。如:控制装置100可根据用户需求安装控制显示设备200的各种应用。
在一些实施例中,如图1所示,移动终端100B或其他智能电子设备,可在安装操控显示设备200的应用之后,起到控制装置100类似功能。如:用户可以通过安装应用,在移动终端100B或其他智能电子设备上可提供的图形用户界面的各种功能键或虚拟按钮,以实现控制装置100实体按键的功能。
控制器110包括处理器112、RAM113和ROM114、通信接口以及通信总 线。控制器110用于控制控制装置100的运行和操作,以及内部各部件之间通信协作以及外部和内部的数据处理功能。
通信器130在控制器110的控制下,实现与显示设备200之间控制信号和数据信号的通信。如:将接收到的用户输入信号发送至显示设备200上。通信器130可包括WIFI模块131、蓝牙模块132、NFC模块133等通信模块中至少一种。
用户输入/输出接口140,其中,输入接口包括麦克风141、触摸板142、传感器143、按键144等输入接口中至少一者。如:用户可以通过语音、触摸、手势、按压等动作实现用户指令输入功能,输入接口通过将接收的模拟信号转换为数字信号,以及数字信号转换为相应指令信号,发送至显示设备200。
输出接口包括将接收的用户指令发送至显示设备200的接口。在一些实施例中,可以是红外接口,也可以是射频接口。如:红外信号接口时,需要将用户输入指令按照红外控制协议转化为红外控制信号,经红外发送模块进行发送至显示设备200。再如:射频信号接口时,需将用户输入指令转化为数字信号,然后按照射频控制信号调制协议进行调制后,由射频发送端子发送至显示设备200。
在一些实施例中,控制装置100包括通信器130和输出接口中至少一者。控制装置100中配置通信器130,如:WIFI、蓝牙、NFC等模块,可将用户输入指令通过WIFI协议、或蓝牙协议、或NFC协议编码,发送至显示设备200。
存储器190,用于在控制器110的控制下存储驱动和控制控制装置100的各种运行程序、数据和应用。存储器190,可以存储用户输入的各类控制信号指令。
供电电源180,用于在控制器110的控制下为控制装置100各元件提供运行电力支持。可以电池及相关控制电路。
图3中示例性示出了根据示例性实施例中显示设备200中硬件系统的硬件配置框图。
在采用双硬件系统架构时,硬件系统的结构关系可以如图3所示。为便于表述,以下将双硬件系统架构中的一个硬件系统称为第一硬件系统或第一控制器,并将另一个硬件系统称为第二硬件系统或第二控制器。第一控制器包含第一控制器的各类处理器、各类接口,第二控制器则包含第二控制器的各类处理器、各类接口。第一控制器及第二控制器中可以各自安装有相对独立的操作系统,第一控制器的操作系统和第二控制器的操作系统可以通过通信协议相互通信,示例性的:第一控制器的操作系统的framework层和第二控制器的操作系统的framework层可以进行通信进行命令和数据的传输,从而使显示设备200中存在两个在独立但又存在相互关联的子系统。
如图3所示,第一控制器与第二控制器之间可以通过多个不同类型的接口实现连接、通信及供电。第一控制器与第二控制器之间接口的接口类型可以包括通用输入输出接口(General-purpose input/output,GPIO)、USB接口、HDMI接口、UART接口等。第一控制器与第二控制器之间可以使用这些接口中的一个或多个进行通信或电力传输。例如图3所示,在双硬件系统架构下,可以由外接的电源(power)为第二控制器供电,而第一控制器则可以不由外接电源,而由第二控制器供电。
除用于与第二控制器进行连接的接口之外,第一控制器还可以包含用于连接其他设备或组件的接口,例如图3中所示的用于连接摄像头(Camera)的 MIPI接口,蓝牙接口等。
类似的,除用于与第二控制器进行连接的接口之外,第二控制器还可以包含用于连接显示屏TCON(Timer Control Register)的VBY接口,用于连接功率放大器(Amplifier,AMP)及扬声器(Speaker)的i2S接口;以及IR/Key接口,USB接口,Wifi接口,蓝牙接口,HDMI接口,Tuner接口等。
下面结合图4对本申请双硬件系统架构进行进一步的说明。需要说明的是图4仅仅是对本申请双硬件系统架构的一个示例性说明,并不表示对本申请的限定。在实际应用中,两个硬件系统均可根据需要包含更多或更少的硬件或接口。
图4中示例性示出了根据图3显示设备200的硬件架构框图。如图4所示,显示设备200的硬件系统可以包括第一控制器和第二控制器,以及通过各类接口与第一控制器或第二控制器相连接的模块。
第二控制器可以包括调谐解调器220、通信器230、外部装置接口250、控制器210、存储器290、用户输入接口、视频处理器260-1、音频处理器260-2、显示器280、音频输出接口270、供电电源。在其他实施例中第二控制器也可以包括更多或更少的模块。
其中,调谐解调器220,用于对通过有线或无线方式接收广播电视信号,进行放大、混频和谐振等调制解调处理,从而从多个无线或有线广播电视信号中解调出用户所选择电视频道的频率中所携带的音视频信号,以及附加信息(例如EPG数据信号)。根据电视信号广播制式不同,调谐解调器220的信号途径可以有很多种,诸如:地面广播、有线广播、卫星广播或互联网广播等;以及根据调制类型不同,所述信号的调整方式可以数字调制方式,也可以模拟调制 方式;以及根据接收电视信号种类不同,调谐解调器220可以解调模拟信号和/或数字信号。
调谐解调器220,还用于根据用户选择,以及由控制器210控制,响应用户选择的电视频道频率以及该频率所携带的电视信号。
在其他一些示例性实施例中,调谐解调器220也可在外置设备中,如外置机顶盒等。这样,机顶盒通过调制解调后输出电视音视频信号,经过外部装置接口250输入至显示设备200中。
通信器230是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如:通信器230可以包括WIFI模块231,蓝牙通信协议模块232,有线以太网通信协议模块233,及红外通信协议模块等其他网络通信协议模块或近场通信协议模块。
显示设备200可以通过通信器230与外部控制设备或内容提供设备之间建立控制信号和数据信号的连接。例如,通信器可根据控制器的控制接收遥控器100A的控制信号。
外部装置接口250,是提供第二控制器210和第一控制器及外部其他设备间数据传输的组件。外部装置接口可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑等的外部设备连接,可接收外部设备的诸如视频信号(例如运动图像)、音频信号(例如音乐)、附加信息(例如EPG)等数据。
其中,外部装置接口250可以包括:高清多媒体接口(HDMI)端子251、复合视频消隐同步(CVBS)端子252、模拟或数字分量端子253、通用串行总线(USB)端子254、红绿蓝(RGB)端子(图中未示出)等任一个或多个。本申请不对外部装置接口的数量和类型进行限制。
控制器210,通过运行存储在存储器290上的各种软件控制程序(如操作系统和/或各种应用程序),来控制显示设备200的工作和响应用户的操作。
如图4所示,控制器210包括只读存储器RAM214、随机存取存储器ROM213、图形处理器216、CPU处理器212、通信接口218、以及通信总线。其中,RAM214和ROM213以及图形处理器216、CPU处理器212、通信接口218通过总线相连接。
ROM213,用于存储各种系统启动的指令。如在收到开机信号时,显示设备200电源开始启动,CPU处理器212运行ROM中系统启动指令,将存储在存储器290的操作系统拷贝至RAM214中,以开始运行启动操作系统。当操作系统启动完成后,CPU处理器212再将存储器290中各种应用程序拷贝至RAM214中,然后,开始运行启动各种应用程序。
图形处理器216,用于产生各种图形对象,如:图标、操作菜单、以及用户输入指令显示图形等。包括运算器,通过接收用户输入各种交互指令进行运算,根据显示属性显示各种对象。以及包括渲染器,产生基于运算器得到的各种对象,进行渲染的结果显示在显示器280上。
CPU处理器212,用于执行存储在存储器290中操作系统和应用程序指令。以及根据接收外部输入的各种交互指令,来执行各种应用程序、数据和内容,以便最终显示和播放各种音视频内容。
在一些示例性实施例中,CPU处理器212,可以包括多个处理器。所述多个处理器中可包括一个主处理器以及多个或一个子处理器。主处理器,用于在预加电模式中执行显示设备200一些操作,和/或在正常模式下显示画面的操作。多个或一个子处理器,用于执行在待机模式等状态下的一种操作。
通信接口,可包括第一接口218-1到第n接口218-n。这些接口可以是经由网络被连接到外部设备的网络接口。
控制器210可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器280上显示UI对象的用户命令,控制器210便可以执行与由用户命令选择的对象有关的操作。
其中,所述对象可以是可选对象中的任何一个,例如超链接或图标。与所选择的对象有关操作,例如:显示连接到超链接页面、文档、图像等操作,或者执行与图标相对应程序的操作。用于选择UI对象用户命令,可以是通过连接到显示设备200的各种输入装置(例如,鼠标、键盘、触摸板等)输入命令或者与由用户说出语音相对应的语音命令。
存储器290,包括存储用于驱动和控制显示设备200的各种软件模块。如:存储器290中存储的各种软件模块,包括:基础模块、检测模块、通信模块、显示控制模块、浏览器模块、和各种服务模块等。
其中,基础模块是用于显示设备200中各个硬件之间信号通信、并向上层模块发送处理和控制信号的底层软件模块。检测模块是用于从各种传感器或用户输入接口中收集各种信息,并进行数模转换以及分析管理的管理模块。
例如:语音识别模块中包括语音解析模块和语音指令数据库模块。显示控制模块是用于控制显示器280进行显示图像内容的模块,可以用于播放多媒体图像内容和UI界面等信息。通信模块,是用于与外部设备之间进行控制和数据通信的模块。浏览器模块,是用于执行浏览服务器之间数据通信的模块。服务模块,是用于提供各种服务以及各类应用程序在内的模块。
同时,存储器290还用于存储接收外部数据和用户数据、各种用户界面中 各个项目的图像以及焦点对象的视觉效果图等。
用户输入接口,用于将用户的输入信号发送给控制器210,或者,将从控制器输出的信号传送给用户。示例性的,控制装置(例如移动终端或遥控器)可将用户输入的诸如电源开关信号、频道选择信号、音量调节信号等输入信号发送至用户输入接口,再由用户输入接口转送至控制器;或者,控制装置可接收经控制器处理从用户输入接口输出的音频、视频或数据等输出信号,并且显示接收的输出信号或将接收的输出信号输出为音频或振动形式。
在一些实施例中,用户可在显示器280上显示的图形用户界面(GUI)输入用户命令,则用户输入接口通过图形用户界面(GUI)接收用户输入命令。或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户输入接口通过传感器识别出声音或手势,来接收用户输入命令。
视频处理器260-1,用于接收视频信号,根据输入信号的标准编解码协议,进行解压缩、解码、缩放、降噪、帧率转换、分辨率转换、图像合成等视频数据处理,可得到直接在显示器280上显示或播放的视频信号。
示例的,视频处理器260-1,包括解复用模块、视频解码模块、图像合成模块、帧率转换模块、显示格式化模块等。
其中,解复用模块,用于对输入音视频数据流进行解复用处理,如输入MPEG-2,则解复用模块进行解复用成视频信号和音频信号等。
视频解码模块,用于对解复用后的视频信号进行处理,包括解码和缩放处理等。
图像合成模块,如图像合成器,其用于将图形生成器根据用户输入或自身生成的GUI信号,与缩放处理后视频图像进行叠加混合处理,以生成可供显示 的图像信号。
帧率转换模块,用于对输入视频的帧率进行转换,如将输入的24Hz、25Hz、30Hz、60Hz视频的帧率转换为60Hz、120Hz或240Hz的帧率,其中,输入帧率可以与源视频流有关,输出帧率可以与显示器的更新率有关。输入有通常的格式采用如插帧方式实现。
显示格式化模块,用于将帧率转换模块输出的信号,改变为符合诸如显示器显示格式的信号,如将帧率转换模块输出的信号进行格式转换以输出RGB数据信号。
显示器280,用于接收源自视频处理器260-1输入的图像信号,进行显示视频内容和图像以及菜单操控界面。显示器280包括用于呈现画面的显示器组件以及驱动图像显示的驱动组件。显示视频内容,可以来自调谐解调器220接收的广播信号中的视频,也可以来自通信器或外部设备接口输入的视频内容。显示器220,同时显示显示设备200中产生且用于控制显示设备200的用户操控界面UI。
以及,根据显示器280类型不同,还包括用于驱动显示的驱动组件。或者,倘若显示器280为一种投影显示器,还可以包括一种投影装置和投影屏幕。
音频处理器260-2,用于接收音频信号,根据输入信号的标准编解码协议,进行解压缩和解码,以及降噪、数模转换、和放大处理等音频数据处理,得到可以在扬声器272中播放的音频信号。
音频输出接口270,用于在控制器210的控制下接收音频处理器260-2输出的音频信号,音频输出接口可包括扬声器272,或输出至外接设备的发生装置的外接音响输出端子274,如:外接音响端子或耳机输出端子等。
在其他一些示例性实施例中,视频处理器260-1可以包括一个或多个芯片组成。音频处理器260-2,也可以包括一个或多个芯片组成。
以及,在其他一些示例性实施例中,视频处理器260-1和音频处理器260-2,可以为单独的芯片,也可以与控制器210一起集成在一个或多个芯片中。
供电电源,用于在控制器210控制下,将外部电源输入的电力为显示设备200提供电源供电支持。供电电源可以包括安装显示设备200内部的内置电源电路,也可以是安装在显示设备200外部的电源,如在显示设备200中提供外接电源的电源接口。
与第二控制器相类似,如图4所示,第一控制器可以包括控制器310、通信器330、检测器340、存储器390。在某些实施例中还可以包括用户输入接口、视频处理器、音频处理器、显示器、音频输出接口。在某些实施例中,也可以存在独立为第一控制器供电的供电电源。
通信器330是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如:通信器330可以包括WIFI模块331,蓝牙通信协议模块332,有线以太网通信协议模块333,及红外通信协议模块等其他网络通信协议模块或近场通信协议模块。
第一控制器的通信器330和第二控制器的通信器230也有相互交互。例如,第二控制器的WiFi模块231用于连接外部网络,与外部服务器等产生网络通信。第一控制器的WiFi模块331用于连接至第二控制器的WiFi模块231,而不与外界网络等产生直接连接。因此,对于用户而言,一个如上述实施例中的显示设备至对外显示一个WiFi账号。
检测器340,是显示设备第一控制器用于采集外部环境或与外部交互的信 号的组件。检测器340可以包括光接收器342,用于采集环境光线强度的传感器,可以通过采集环境光来自适应显示参数变化等;还可以包括图像采集器341,如相机、摄像头等,可以用于采集外部环境场景,以及用于采集用户的属性或与用户交互手势,可以自适应变化显示参数,也可以识别用户手势,以实现与用户之间互动的功能。
外部装置接口350,提供控制器310与第二控制器或外部其他设备间数据传输的组件。外部装置接口可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑等的外部设备连接。
控制器310,通过运行存储在存储器390上的各种软件控制程序(如用安装的第三方应用等),以及与第二控制器的交互,来控制显示设备200的工作和响应用户的操作。
如图4所示,控制器310包括只读存储器ROM313、随机存取存储器RAM314、图形处理器316、CPU处理器312、通信接口318、以及通信总线。其中,ROM313和RAM314以及图形处理器316、CPU处理器312、通信接口318通过总线相连接。
ROM313,用于存储各种系统启动的指令。CPU处理器312运行ROM中系统启动指令,将存储在存储器390的操作系统拷贝至RAM314中,以开始运行启动操作系统。当操作系统启动完成后,CPU处理器312再将存储器390中各种应用程序拷贝至RAM314中,然后,开始运行启动各种应用程序。
CPU处理器312,用于执行存储在存储器390中操作系统和应用程序指令,和与第二控制器进行通信、信号、数据、指令等传输与交互,以及根据接收外部输入的各种交互指令,来执行各种应用程序、数据和内容,以便最终显示和 播放各种音视频内容。
通信接口,可包括第一接口318-1到第n接口318-n。这些接口可以是经由网络被连接到外部设备的网络接口,也可以是经由网络被连接到第二控制器的网络接口。
控制器310可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器280上显示UI对象的用户命令,控制器210便可以执行与由用户命令选择的对象有关的操作。
图形处理器316,用于产生各种图形对象,如:图标、操作菜单、以及用户输入指令显示图形等。包括运算器,通过接收用户输入各种交互指令进行运算,根据显示属性显示各种对象。以及包括渲染器,产生基于运算器得到的各种对象,进行渲染的结果显示在显示器280上。
第一控制器的图形处理器316与第二控制器的图形处理器216均能产生各种图形对象。区别性的,若应用1安装于第一控制器,应用2安装在第二控制器,当用户在应用1的界面,且在应用1内进行用户输入的指令时,由第一控制器图形处理器316产生图形对象。当用户在应用2的界面,且在应用2内进行用户输入的指令时,由第二控制器的图形处理器216产生图形对象。
图5中示例性示出了根据示例性实施例中显示设备的功能配置示意图。
如图5所示,第一控制器的存储器390和第二控制器的存储器290分别用于存储操作系统、应用程序、内容和用户数据等,在第一控制器的控制器310和第二控制器的控制器210的控制下执行驱动显示设备200的系统运行以及响应用户的各种操作。第一控制器的存储器390和第二控制器的存储器290可以包括易失性和/或非易失性存储器。
对于第二控制器,存储器290,具体用于存储驱动显示设备200中控制器210的运行程序,以及存储显示设备200内置各种应用程序,以及用户从外部设备下载的各种应用程序、以及与应用程序相关的各种图形用户界面,以及与图形用户界面相关的各种对象,用户数据信息,以及各种支持应用程序的内部数据。存储器290用于存储操作系统(OS)内核、中间件和应用等系统软件,以及存储输入的视频数据和音频数据、及其他用户数据。
存储器290,具体用于存储视频处理器260-1和音频处理器260-2、显示器280、通信接口230、调谐解调器220、输入/输出接口等驱动程序和相关数据。
在一些实施例中,存储器290可以存储软件和/或程序,用于表示操作系统(OS)的软件程序包括,例如:内核、中间件、应用编程接口(API)和/或应用程序。示例性的,内核可控制或管理系统资源,或其它程序所实施的功能(如所述中间件、API或应用程序),以及内核可以提供接口,以允许中间件和API,或应用访问控制器,以实现控制或管理系统资源。
示例的,存储器290,包括广播接收模块2901、频道控制模块2902、音量控制模块2903、图像控制模块2904、显示控制模块2905、音频控制模块2906、外部指令识别模块2907、通信控制模块2908、电力控制模块2910、操作系统2911、以及其他应用程序2912、浏览器模块等等。控制器210通过运行存储器290中各种软件程序,来执行诸如:广播电视信号接收解调功能、电视频道选择控制功能、音量选择控制功能、图像控制功能、显示控制功能、音频控制功能、外部指令识别功能、通信控制功能、光信号接收功能、电力控制功能、支持各种功能的软件操控平台、以及浏览器功能等各类功能。
存储器390,包括存储用于驱动和控制显示设备200的各种软件模块。如: 存储器390中存储的各种软件模块,包括:基础模块、检测模块、通信模块、显示控制模块、浏览器模块、和各种服务模块等。由于存储器390与存储器290的功能比较相似,相关之处参见存储器290即可,在此就不再赘述。
示例的,存储器390,包括图像控制模块3904、音频控制模块2906、外部指令识别模块3907、通信控制模块3908、光接收模块3909、操作系统3911、以及其他应用程序3912、浏览器模块等等。控制器210通过运行存储器290中各种软件程序,来执行诸如:图像控制功能、显示控制功能、音频控制功能、外部指令识别功能、通信控制功能、光信号接收功能、电力控制功能、支持各种功能的软件操控平台、以及浏览器功能等各类功能。
区别性的,第二控制器的外部指令识别模块2907和第一控制器的外部指令识别模块3907可识别不同的指令。
示例性的,由于摄像头等图像接收设备与第一控制器连接,因此,第一控制器的外部指令识别模块3907可包括图形识别模块3907-1,图形识别模块3907-1内存储有图形数据库,摄像头接收到外界的图形指令时,与图形数据库中的指令进行对应关系,以对显示设备作出指令控制。而由于语音接收设备以及遥控器与第二控制器连接,因此,第二控制器的外部指令识别模块2907可包括语音识别模块2907-2,语音识别模块2907-2内存储有语音数据库,语音接收设备等接收到外界的语音指令或时,与语音数据库中的指令进行对应关系,以对显示设备作出指令控制。同样的,遥控器等控制装置100与第二控制器连接,由按键指令识别模块与控制装置100进行指令交互。
图6a中示例性示出了根据示例性实施例中显示设备200中软件系统的配置框图。
对第二控制器,如图6a中所示,操作系统2911,包括用于处理各种基础系统服务和用于实施硬件相关任务的执行操作软件,充当应用程序和硬件组件之间完成数据处理的媒介。
一些实施例中,部分操作系统内核可以包含一系列软件,用以管理显示设备硬件资源,并为其他程序或软件代码提供服务。
其他一些实施例中,部分操作系统内核可包含一个或多个设备驱动器,设备驱动器可以是操作系统中的一组软件代码,帮助操作或控制显示设备关联的设备或硬件。驱动器可以包含操作视频、音频和/或其他多媒体组件的代码。示例的,包括显示器、摄像头、Flash、WiFi和音频驱动器。
其中,可访问性模块2911-1,用于修改或访问应用程序,以实现应用程序的可访问性和对其显示内容的可操作性。
通信模块2911-2,用于经由相关通信接口和通信网络与其他外设的连接。
用户界面模块2911-3,用于提供显示用户界面的对象,以供各应用程序访问,可实现用户可操作性。
控制应用程序2911-4,用于控制进程管理,包括运行时间应用程序等。
事件传输系统2914,可在操作系统2911内或应用程序2912中实现。一些实施例中,一方面在在操作系统2911内实现,同时在应用程序2912中实现,用于监听各种用户输入事件,将根据各种事件指代响应各类事件或子事件的识别结果,而实施一组或多组预定义的操作的处理程序。
其中,事件监听模块2914-1,用于监听用户输入接口输入事件或子事件。
事件识别模块2914-2,用于对各种用户输入接口输入各类事件的定义,识别出各种事件或子事件,且将其传输给处理用以执行其相应一组或多组的处理 程序。
其中,事件或子事件,是指显示设备200中一个或多个传感器检测的输入,以及外界控制设备(如控制装置100等)的输入。如:语音输入各种子事件,手势识别的手势输入子事件,以及控制装置的遥控按键指令输入的子事件等。示例的,遥控器中一个或多个子事件包括多种形式,包括但不限于按键按上/下/左右/、确定键、按键按住等中一个或组合。以及非实体按键的操作,如移动、按住、释放等操作。
界面布局管理模块2913,直接或间接接收来自于事件传输系统2914监听到各用户输入事件或子事件,用于更新用户界面的布局,包括但不限于界面中各控件或子控件的位置,以及容器的大小或位置、层级等与界面布局相关各种执行操作。
由于第一控制器的操作系统3911与第二控制器的操作系统2911的功能比较相似,相关之处参见操作系统2911即可,在此就不再赘述。
如图6b中所示,显示设备的应用程序层包含可在显示设备200执行的各种应用程序。
第二控制器的应用程序层2912可包含但不限于一个或多个应用程序,如:视频点播应用程序、应用程序中心、游戏应用等。第一控制器的应用程序层3912可包含但不限于一个或多个应用程序,如:直播电视应用程序、媒体中心应用程序等。需要说明的是,第一控制器和第二控制器上分别包含什么应用程序是根据操作系统和其他设计确定的,本发明无需对第一控制器和第二控制器上所包含的应用程序做具体的限定和划分。
直播电视应用程序,可以通过不同的信号源提供直播电视。例如,直播电 视应用程可以使用来自有线电视、无线广播、卫星服务或其他类型的直播电视服务的输入提供电视信号。以及,直播电视应用程序可在显示设备200上显示直播电视信号的视频。
视频点播应用程序,可以提供来自不同存储源的视频。不同于直播电视应用程序,视频点播提供来自某些存储源的视频显示。例如,视频点播可以来自云存储的服务器端、来自包含已存视频节目的本地硬盘储存器。
媒体中心应用程序,可以提供各种多媒体内容播放的应用程序。例如,媒体中心,可以为不同于直播电视或视频点播,用户可通过媒体中心应用程序访问各种图像或音频所提供服务。
应用程序中心,可以提供储存各种应用程序。应用程序可以是一种游戏、应用程序,或某些和计算机系统或其他设备相关但可以在显示设备中运行的其他应用程序。应用程序中心可从不同来源获得这些应用程序,将它们储存在本地储存器中,然后在显示设备200上可运行。
由于第一控制器及第二控制器中可能分别安装有独立的操作系统,从而使显示设备200中存在两个在独立但又存在相互关联的子系统。例如,第一控制器和N均可以独立安装有安卓(Android)及各类APP,使得每个芯片均可以实现一定的功能,并且使第一控制器和第二控制器协同实现某项功能。
图7中示例性示出了根据示例性实施例中显示设备200中用户界面的示意图。如图7所示,用户界面包括多个视图显示窗口,示例的,第一视图显示窗口201和播放画面202,其中,播放画面包括布局一个或多个不同项目。以及用户界面中还包括指示项目被选择的选择器,可通过用户输入而移动选择器的位置,以改变选择不同的项目。
需要说明的是,多个视图显示窗口可以呈现不同层级的显示画面。如,第一视图显示窗口可呈现视频聊天项目内容,第二视图显示窗口可呈现应用层项目内容(如,网页视频、VOD展示、应用程序画面等)。
可选的,不同视图显示窗口的呈现存在优先级区别,优先级不同的视图显示窗口之间,视图显示窗口的显示优先级不同。如,系统层的优先级高于应用层的优先级,当用户在应用层使用获取选择器和画面切换时,不遮挡系统层的视图显示窗口的画面展示;以及,根据用户的选择使应用层的视图显示窗口的大小和位置发生变化时,系统层的视图显示窗口的大小和位置不受影响。
也可以呈现相同层级的显示画面,此时,选择器可以在第一视图显示窗口和第二视图显示窗口之间做切换,以及当第一视图显示窗口的大小和位置发生变化时,第二视图显示窗口的大小和位置可随及发生改变。
在一些实施例中,在边看边聊场景中,显示设备同时播放至少两路声音。
图8示例性示出了一种边看边聊场景下的用户界面,在该边看边聊场景中,显示设备同时进行视频节目播放和与三个终端用户进行语音通话。如图8所示,显示设备全屏播放视频节目,语音通话窗口以小窗口的形式悬浮在视频播放画面上。
需要说明的是,边看边聊场景不仅限于上述示例性示出的边看视频节目边进行语音通话的场景,还包括边听音频节目边进行视频通话的场景。另外,如果在进行语音通话时,视频节目或者音频节目被暂停播放或者显示器呈现静态的用户界面,而非动态视频画面,由于视频节目的音频输出通道一直处于开启状态,因此该类场景也视为前述存在至少两路声音的场景,即边看边聊场景。
还需说明的是,显示设备所播放的音视频节目可以电视直播节目或者网络节目。
还需说明的是,在边看边聊场景中,显示设备在播放音视频节目的同时,可以与多个其他终端设备进行多路视频聊天。也就是说,在边看边聊场景中,显示设备可以同时播放2路以上的声音信号。
参阅图9,在边看边聊场景中,控制器可以接收到至少两种音频数据,其一为音视频节目的音频数据,该音视频节目进一步包括电视直播节目和网络节目,其二为语音通话的音频数据。控制器通过音频处理器260,根据输入信号的标准编解码协议,对前述至少两路音频数据分别进行解压缩和解码,以及降噪、数模转换、和放大处理等音频数据处理,并将处理后的至少声音信号叠加后发送给音频输出接口270(如扬声器272),最后通过音频输出接口270输出音视频节目的声音和语音通话的声音,例如通过扬声器272播放音视频节目的声音和语音通话的声音。
在上述边看边聊场景中,用户可以通过操作控制装置正常操作显示设备。
在一些实施例中,用户可以通过操作控制装置对声音信号的输出音量值进行调整,功放管理器则根据用户设置的输出音量值控制声音信号的增益。例如,可以通过操作诸如遥控器或者移动终端上的物理音量按键(音量+、音量-)或者虚拟音量按键,或者语音输入,调节声音信号的输出音量。
示例性的,当显示器显示如图8所示的用户界面时,如果控制器接收到用户通过操作控制装置100输入的指示调整音量的指令,控制器则响应于该指令,在用户界面的上层显示指示当前输出音量的界面元素,该界面元素可以为如图10所示的音量调节条。用户根据该音量调节条示出的音量值,可以获知声音信 号当前的输出音量值,而功放管理器则是根据该输出音量值控制声音信号的增益。
在上述示例中,由于音频处理器中的功放管理器是根据同一输出音量值同时对电视节目声音信号和语音通话声音信号的增益进行调整并叠加播放,因此电视节目的输出音量与语音通话的输出音量相同。进而在用户角度,在“边看边聊”场景中,两路声音混杂在一起,彼此之间干预影响,导致用户无法辨识。
为此,本申请实施例针对上述边看边聊场景提出一种用于音量调节的交互设计,图8、图11-13示例性示出了在边看边聊场景下的音量调节交互过程示意图。
在一些实施例中,在如图8所示的同时进行音视频节目播放和语音通话时,用户可以通过操作控制装置输入指示调节音量的指令,控制器响应于该指令,在显示器上呈现音量设置界面,该音量设置界面显示有用于表示音视频节目的输出音量的界面元素,还显示有用于将语音通话的输出音量与音视频节目的输出音量相关联的音量设置项目。
图11示例性示出了一种音量设置界面,如图11所示,该音量设置界面以视图窗口的形式在音视频节目的播放画面和语音通话窗口的上层悬浮显示,该音量设置界面包括音量调节条111和音量设置项目112-114,该音量调节条用于指示音视频节目的输出音量值,音量设置项目112-114分别为“标准模式”112、“洪亮模式”113和“AI静音模式”114。
在本实施例中,音量设置项目用于将语音通话的输出音量与音视频节目的输出音量相关联,具体来说,用户通过操作控制装置对音视频节目的输出音量进行直接调节,而控制器则根据音视频节目的输出音量值对语音通话的输出音 量进行关联调节,以使二者的输出音量不同。例如,控制器根据音视频节目的输出音量值第一音量,将语音通话的输出音量值调节为第二音量,第一音量与第二音量相关,第一音量≠第二音量。
在一些可能的实现方式中,一个音量设置项目预设有一个的调节系数,不同音量设置项目对应的调节系数不同。调节系数用于与音视频节目的输出音量值相乘以得到语音通话的输出音量值,以使音视频节目的输出音量与语音通话的输出音量不同。当然,当某音量设置项目对应的调节系数为1时,语音通话的输出音量值相对于音视频节目的输出音量值并未发生变化,可见,音视频节目的输出音量值与语音通话的输出音量值相同。
示例的,图11示出的三种音量设置项目112-114分别对应一个调节系数,具体的,“标准模式”112对应的调节系数可以为1.1,意味着在语音通话时,对端用户的人声音量值可以相对于音视频节目的播放音量值提高10%;“洪亮模式”113对应的调节系数可以为1.2,意味着在视频聊天时,对端用户的人声音量值可以相对于音视频节目的播放音量值提高20%;“AI静音模式”114对应的调节系数可以为0,意味着在语音通话时,对端用户的人声音量被静音。
在一些实施例中,当显示器显示如图11所示的音量设置界面时,用户可以通过继续操作控制装置上的音量键(音量+或者音量-)输入调节音量的指令,控制器响应于该指令,获取音量设置项目中预先保存的默认项目,然后根据默认项目对音视频节目的输出音量和语音通话的输出音量进行关联调节。例如图11中处于选中状态的“标准模式”112,控制器获取“标准模式”112对应的调节系数1.1,然后根据音视频节目的输出音量值第一音量,将语音通话的输出音量调节至第二音量,第二音量=第一音量×1.1。
在一些实施例中,当显示器显示如图11所示音量设置界面时,用户可以通过操作控制装置选中某个音量设置项目,控制器根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。例如,用户操作控制装置选中“洪亮模式”113,控制器响应于该用户操作,获取“洪亮模式”113对应的调节系数1.2,然后根据音视频节目的输出音量值第一音量,将语音通话的输出音量调节至第二音量,第二音量=第一音量×1.2。
从上述示例可以看出,“标准”模式和“洪亮”模式可以达到人声增强的效果,使得在边看边聊场景中,音视频节目的声音作为背景声音播放,语音通话的声音则作为前景声音播放,进而使得用户可以分别轻松分辨出声音的来源,避免造成混淆。
需要说明的是,该“标准模式”112和“洪亮模式”113的调节系数不局限于是1.1或者1.2,在其他实施例中,“标准模式”112和“洪亮模式”113也可以是被预先设置的任意值。
在一些实施例中,音量设置界面中显示的音量设置项目不仅限于上述三种,在其他实施例中,可以根据用户需求为用户提供其他的音量设置项目,以供用户选择。
在一些实施例中,通过调整音量设置项目对应的调节系数,实现人声减弱的效果,例如,当某音量设置项目对应的调节系数为0.5时,意味着在视频聊天时,对端用户的人声音量值可以相对于音视频节目的播放音量值降低50%,进而达到人声减弱的效果。
根据本申请提供的实施例可知,在边看边聊场景中,通过对显示设备同时 播放的至少两路声音的音量进行关联调节,可以使该至少两路声音以不同的输出音量进行播放,使得用户可以分别轻松分辨出声音的来源。
值得注意的是,图11所示的“AI静音模式”对应的调节系数可以为0,意味着将视频聊天的人声音量降低至0。为了在将语音通话声音静音的情况下,仍不影响聊天功能的实现,当“AI静音模式”被选中(即开启)时,可以将通话音频数据对应的文本以弹幕的形式发布在播放画面的上层。
示例的,在如图11所示的边看边聊场景中,当预先设置的默认模式为图11中所示的“AI静音模式”114时,或者当用户通过操作控制装置选中图11中示出的“AI静音模式”114时,控制器停止播放语音通话的声音信号,并将对端设备对应的语音通话数据对应的文本以弹幕的形式呈现在用户界面的顶层,例如图12所示。
以下以显示设备200B为例,对上述“AI静音模式”114的实现方式予以说明,其中,显示设备200B在进行音视频节目播放的同时,与一个或者多个其他终端设备进行语音通话。以下结合显示设备200B与显示设备200A的语音通话通信过程进行示例性说明。
在一些可能的实现方式中,语音通话数据对应的文本由数据发送端根据其采集的通话数据转换得到,并发送给数据接收端进行显示。具体的,显示设备200A通过麦克风采集用户的通话数据A;在显示设备200B未开启“AI静音模式”114的情况下,显示设备200A将采集的通话数据A发送给显示设备200B;在显示设备200B开启“AI静音模式”114的情况下,显示设备200A将采集的通话数据A同步到语音服务器,通过语音服务器对通话数据A进行识别转换,得到通话数据A对应的文本A;再将文本A发送给显示设备200B。对于显示设 备200B,在其未开启“AI静音模式”114的情况下,接收显示设备200A发送的通话数据A,从通话数据A提取出语音通话声音信号,通过功放管理器对该语音通话声音信号进行处理,并通过扬声器播放处理后的声音信号;在其接收到用户输入的对“AI静音模式”114选中操作时,通知显示设备200A;在显示设备200B开启“AI静音模式”114的情况下,显示设备200B接收显示设备200A发送的文本A,并将文本A实时显示在用户界面的顶层,进而达到如图11所示的显示效果。
从上述示例可以看出,显示设备200B在接收到用户在如图11所示的音量设置界面输入选中“AI静音模式”114的操作时,通知显示设备200A,以使显示设备200A将其采集的通话数据A转换成文本A后,再发送给显示设备200B。
在另一些可能的实现方式中,由数据接收端对其接收的通话数据进行识别转换得到对应的文本并显示。具体的,显示设备200A通过麦克风采集本端用户的通话数据A,将采集的通话数据A发送给显示设备200B;对于显示设备200B,在其未开启“AI静音模式”114的情况下,接收显示设备200A发送的通话数据A,从通话数据A提取出语音通话声音信号,通过功放管理器对该语音通话声音信进行处理,并通过扬声器进行播放;在显示设备200B开启“AI静音模式”114的情况下,显示设备200B接收显示设备200A发送的通话数据A,并将接收到的通话数据A同步到语音服务器,通过语音服务器对通话数据A进行识别转换,得到通话数据A对应的文本A;再将文本A显示在用户界面的顶层。
需要说明的是,在显示设备200B开启“AI静音模式”114的情况下,其对文本A的显示具有一定的时延Ta,该时延Ta至少为显示设备200A或者显示设备200B通过语音服务器识别出文本A所需的时长。
图13示例性示出了另一种音量设置界面,与图11所示界面不同的是,在图13所示界面中,还包括用于开启或者关闭上述音量关联调节功能的项目“音量关联调节开关”115,用户可以通过操作该项目开启或者关闭音量关联调节功能。具体的,如果音量关联调节功能处于开启状态,界面中各个音量设置项目为可被操作的项目,进而可被用户操作选中;如果音量关联调节功能处于关闭状态,音量设置项目为不可操作的项目,进而不可被用户操作选中。
在图11、12或13所示场景下,用户通过操作控制装置上诸如“返回”按键、“退出”等按键,撤销对音量设置界面的显示,以返回至图8所示的界面。
本申请还提供一种音量控制方法,该方法可以应用于如图1-13所示的显示设备,方法的执行主体为显示设备控制器,但不仅限于此。图14为本申请一些实施例示例性示出的音量控制方法流程图,如图14所示,该方法可以包括:
步骤01、当音视频节目播放和语音通话同时进行时,接收用户输入的指示调节音量的指令。
为方便说明,本申请实施例提出声音播放场景的概念,显示设备的声音播放场景包括边看边聊场景和普通场景。其中,边看边聊场景是指包含两种即以上声音信号输出的场景,例如,同时进行音视频节目播放和语音通话的场景。
在一些实施例中,当接收到用户输入的指示调节音量的指令时,判断当前的声音播放场景是否为边看边聊场景,如果是边看边聊场景,则执行步骤02,如果不是边看边聊场景,即普通场景,则呈现如图10所示的界面,并根据用户输入对显示设备播放视音频节目的音量进行调节。
在一些可能的实现方式中,可以通过检测前台应用的运行情况来判断当前 的声音播放场景是否为边看边聊场景,即是否音视频节目播放和语音通话同时进行。例如,如果检测到音视频聊天窗口对象正在运行,说明音视频聊天正在进行,故而当前的声音播放场景是边看边聊场景,如果检测到音视频聊天窗口对象未运行,说明当前未开启音视频聊天,故而当前的声音播放场景不是边看边聊场景,即普通场景。
在一些实施例中,用户可以通过操作控制装置输入指示调节音量的指令。例如,当用户按下遥控器100A上用于提高或者降低音量的物理音量按键时,控制器接收到遥控器发送的提高或者降低音量的指令。再如,当用户点击移动终端100B上用于提高或者降低音量的虚拟音量按键时,控制器接收到移动终端发送的提高或者降低音量的指令。此外,用户还可以通过遥控器、显示设备或者移动终端上的麦克风输入用于提高或者降低音量的语音指令。用户还可以通过按下显示设备外壳上用于提高或者降低音量的本机音量按键,输入提高或者降低音量的指令。
步骤02、响应于所述指示调节音量的指令,在显示器上呈现音量设置界面,所述音量设置界面包括用于表示所述音视频节目的输出音量的界面元素和用于将所述语音通话的输出音量与所述音视频节目的输出音量相关联的音量设置项目。
步骤02涉及的音量设置界面可以为如图11所示的音量设置界面,用于表示所述音视频节目的输出音量的界面元素可以为图11中的音量调节条111,音量设置项目可以为图11中的项目112-114。
在一些可能的实现方式中,一个音量设置项目预设有一个的调节系数,不同音量设置项目对应的调节系数不同。调节系数用于与音视频节目的输出音量 值相乘以得到语音通话的输出音量值,以使音视频节目的输出音量与语音通话的输出音量不同。当然,当某音量设置项目对应的调节系数为1时,语音通话的输出音量值相对于音视频节目的输出音量值并未发生变化,可见,音视频节目的输出音量值与语音通话的输出音量值相同。
步骤03、响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。
本申请中,声音信号的音频流来源包括基于物理类型通道(ATV、DTV和HDMI等)的电视节目音频和其他音频,该其他音频的音频流类型主要为基于网络类型通道的网络节目音频(STREAM_MUSIC)和通话音频(STREAM_VOICE_CALL)。其中,基于物理类型通道(ATV、DTV和HDMI等)的电视节目音频和基于网络类型通道的网络节目音频(STREAM_MUSIC)即为音视频节目的音频,通话音频(STREAM_VOICE_CALL)即语音通话的音频。
对于不同类型/音频流来源声音信号,可以独立调整其输出音量值,以根据对应的输出音量值独立控制其增益。
如图9所示,对于输出音量控制,主要包括两种分支类型,分别为主音量MainVoice和子音量SubVoice,子音量SubVoice进一步包括第一子音量MusicVoice和第二子音量CallVoice,其中,主音量MainVoice为电视节目对应的的音量,第一子音量MusicVoice为网络节目对应的音量,第二子音量Call Voice为语音通话对应的音量。
基于此,当接收用户输入的指示调节音量的指令时,根据音视频节目的音频流来源调节音视频节目的输出音量值,再根据音视频节目的输出音量值对语音通话的输出音量值进行关联调节。
具体的,获取所述音视频节目的音频流来源,所述音频流来源为基于物理类型通道的电视节目或者基于网络类型通道的网络节目。如果音视频节目的音频流来源为电视节目,则调节主音量值(MainVoice)至第一音量,该主音量值即为电视节目的输出音量值;如果音视频节目的音频流来源为网络节目,则调节第一子音量值(MusicVoice)至第一音量,该第一子音量值即为网络节目的音量值。
然后,将第一音量与被选中音量设置项目对应的调节系数相乘得到第二音量,该第二音量即为语音通话的目标音量,进而,将第二子音量值(CallVoice)调节至第二音量,该第二子音量值即为语音通话的音量值。
从上述示例可以看出,在边看边聊场景中,通过对显示设备同时播放的至少两路声音的音量进行关联调节,可以使该至少两路声音以不同的输出音量进行播放,使得用户可以分别轻松分辨出声音的来源。
在一些实施例中,当显示器显示如图11所示的音量设置界面时,如果接收到用户输入的用于调节音量的指令,则获取所述音量设置项目中预先保存的默认项目;然后根据默认项目对音视频节目的输出音量和语音通话的输出音量进行关联调节。
在一些实施例中,响应于用户输入的指示调节音量的指令,在呈现音量设置界面的同时,则获取所述音量设置项目中预先保存的默认项目;然后根据默认项目对音视频节目的输出音量和语音通话的输出音量进行关联调节。
在上述示例中,默认项目可以是用户上一次操作选中的音量设置项目,也可以是系统默认的音量设置项目。
具体实现时,可以对每一种音量设置项目进行状态标记,比如将用户选中的音量设置项目或者系统默认的音量设置项目标记为选中状态,将其余音量设置项目标记为未选中状态。进而,可以通过遍历每种音量设置项目的标记状态获取到默认项目。
在另一些实施例中,在获取默认项目之前,先行判断如图13所示的音量关联调节功能是否处于开启状态,如果处于开启状态,则获取默认项目,如果处于未开启状态,则结束流程。
在一些实施例中,当在边看边聊场景下接收到用户通过操作控制装置输入的静音指令时,将音视频节目的输出音量和语音通话的输出音量同时调节至0。
在一些实施例中,当在边看边聊场景下接收到用户输入的AI静音指令时,响应于该AI静音指令,停止播放语音通话的声音信号,并将通话数据对应的文本以弹幕的形式发布在播放画面的上层。例如,当显示器显示如图11所示的音量设置界面时,用户可以通过操作控制装置选中“AI静音模式”114以输入AI静音指令。
具体实现时,分别获取通话数据对应的文本和通话数据对应的用户信息,该用户信息用于表征发送该通话数据的用户账号,例如用户昵称、用户ID、用户头像等。然后根据通话数据对应的文本和用户信息生成弹幕文字;再在显示器呈现的播放画面的上层,呈现携带有用户信息标记的弹幕文字。
具体实现中,本发明还提供一种计算机存储介质,其中,该计算机存储介质可存储有计算程序,当显示设备的至少一个控制器/处理器执行所述计算机程 序时,控制器/处理器执行图14所示的部分或者全部步骤。所述的存储介质可为磁碟、光盘、只读存储记忆体(英文:read-only memory,简称:ROM)或随机存储记忆体(英文:random access memory,简称:RAM)等。
本领域的技术人员可以清楚地了解到本发明实施例中的技术可借助软件加必需的通用硬件平台的方式来实现。基于这样的理解,本发明实施例中的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例或者实施例的某些部分所述的方法。
本说明书中各个实施例之间相同相似的部分互相参见即可。尤其,对于方法实施例而言,由于其基本相似于显示设备实施例,所以描述的比较简单,相关之处参见显示设备实施例中的说明即可。
以上所述的本发明实施方式并不构成对本发明保护范围的限定。
Claims (10)
- 一种显示设备,其特征在于,包括:显示器,用于呈现用户界面,所述用户界面包括至少一个视图显示窗口;控制器用于:当音视频节目播放和语音通话同时进行时,接收用户输入的指示调节音量的指令;响应于所述指示调节音量的指令,在显示器上呈现音量设置界面,所述音量设置界面包括用于将所述语音通话的输出音量与所述音视频节目的输出音量相关联的音量设置项目;响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。
- 根据权利要求1所述的显示设备,其特征在于,所述音量设置界面还包括用于表示所述音视频节目的输出音量的界面元素。
- 根据权利要求1所述的显示设备,其特征在于,所述响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节之前,所述控制器还用于:获取所述音量设置项目中预先保存的默认项目;根据所述默认项目对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节。
- 根据权利要求1-3任一项所述的显示设备,其特征在于,所述根据被选中的音量设置项目或者所述默认项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,包括:根据所述被选中的音量设置项目或者所述默认项目,和预先存储的音量设置项目与调节系数的对应关系,确定调节系数;根据确定的调节系数对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节。
- 根据权利要求4所述的显示设备,其特征在于,所述根据确定的调节系数对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,包括:根据接收到的指示调节音量的指令确定第一音量,以及将所述调节系数与所述第一音量相乘得到第二音量,所述第一音量为所述音视频节目的目标音量,所述第二音量为所述语音通话的目标音量;将所述音视频节目的输出音量调节至所述第一音量,将所述语音通话的输出音量调节至所述第二音量。
- 根据权利要求1所述的显示设备,其特征在于,所述语音通话调节后的输出音量高于所述音视频节目调节后的输出音量。
- 根据权利要求1所述的显示设备,其特征在于,所述控制器还用于:响应于用户输入的指示退出所述音量设置界面的指令,撤销所述音量设置界面的显示。
- 根据权利要求5所述的显示设备,其特征在于,所述将音视频节目的输出音量调节至所述第一音量,包括:获取所述音视频节目的音频流来源,所述音频流来源为基于物理类型通道的电视节目或者基于网络类型通道的网络节目;如果所述音视频节目的音频流来源为电视节目,则调节主音量值至所述第一音量,所述主音量值为所述电视节目的音量值;如果所述音视频节目的音频流来源为网络节目,则调节第一子音量值至所述第一音量,所述第一子音量值为所述网络节目的音量值。
- 根据权利要求8所述的显示设备,其特征在于,所述将语音通话的输出音量调节至所述第二音量,包括:调节第二子音量值至所述第二音量,所述第二子音量值为通话音频的音量值。
- 一种音量控制方法,其特征在于,所述方法包括:当音视频节目播放和语音通话同时进行时,接收用户输入的指示调节音量的指令;响应于所述指示调节音量的指令,在显示器上呈现音量设置界面,所述音量设置界面包括用于将所述语音通话的输出音量与所述音视频节目的输出音量相关联的音量设置项目;响应于用户对所述音量设置项目的选中操作,根据被选中的音量设置项目,对所述音视频节目的输出音量和所述语音通话的输出音量进行关联调节,以使所述语音通话调节后的输出音量与所述音视频节目调节后的输出音量不同。
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/081417 WO2021189358A1 (zh) | 2020-03-26 | 2020-03-26 | 显示设备和音量调节方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/081417 WO2021189358A1 (zh) | 2020-03-26 | 2020-03-26 | 显示设备和音量调节方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021189358A1 true WO2021189358A1 (zh) | 2021-09-30 |
Family
ID=77890138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/081417 WO2021189358A1 (zh) | 2020-03-26 | 2020-03-26 | 显示设备和音量调节方法 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021189358A1 (zh) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114598967A (zh) * | 2022-03-03 | 2022-06-07 | 合众新能源汽车有限公司 | 音频策略管理系统、方法、装置及计算机可读介质 |
CN115002553A (zh) * | 2022-04-29 | 2022-09-02 | 当趣网络科技(杭州)有限公司 | 基于同一影视视频边看边聊的方法和系统 |
CN116737104A (zh) * | 2022-09-16 | 2023-09-12 | 荣耀终端有限公司 | 音量调节方法和相关装置 |
CN116743905A (zh) * | 2022-09-30 | 2023-09-12 | 荣耀终端有限公司 | 通话音量控制方法及电子设备 |
WO2023245976A1 (zh) * | 2022-06-20 | 2023-12-28 | 由我(万安)科技有限公司 | 一种音频调节方法、蓝牙发射器及可读存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103677703A (zh) * | 2012-09-18 | 2014-03-26 | 联想(北京)有限公司 | 电子设备及其音量调节方法 |
CN103686015A (zh) * | 2013-12-20 | 2014-03-26 | 乐视致新电子科技(天津)有限公司 | 音量调节方法及系统 |
CN106803918A (zh) * | 2017-03-02 | 2017-06-06 | 无锡纽微特科技有限公司 | 一种视频通话系统及实现方法 |
CN109683847A (zh) * | 2018-12-20 | 2019-04-26 | 维沃移动通信有限公司 | 一种音量调节方法和终端 |
-
2020
- 2020-03-26 WO PCT/CN2020/081417 patent/WO2021189358A1/zh active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103677703A (zh) * | 2012-09-18 | 2014-03-26 | 联想(北京)有限公司 | 电子设备及其音量调节方法 |
CN103686015A (zh) * | 2013-12-20 | 2014-03-26 | 乐视致新电子科技(天津)有限公司 | 音量调节方法及系统 |
CN106803918A (zh) * | 2017-03-02 | 2017-06-06 | 无锡纽微特科技有限公司 | 一种视频通话系统及实现方法 |
CN109683847A (zh) * | 2018-12-20 | 2019-04-26 | 维沃移动通信有限公司 | 一种音量调节方法和终端 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114598967A (zh) * | 2022-03-03 | 2022-06-07 | 合众新能源汽车有限公司 | 音频策略管理系统、方法、装置及计算机可读介质 |
CN115002553A (zh) * | 2022-04-29 | 2022-09-02 | 当趣网络科技(杭州)有限公司 | 基于同一影视视频边看边聊的方法和系统 |
WO2023245976A1 (zh) * | 2022-06-20 | 2023-12-28 | 由我(万安)科技有限公司 | 一种音频调节方法、蓝牙发射器及可读存储介质 |
CN116737104A (zh) * | 2022-09-16 | 2023-09-12 | 荣耀终端有限公司 | 音量调节方法和相关装置 |
CN116743905A (zh) * | 2022-09-30 | 2023-09-12 | 荣耀终端有限公司 | 通话音量控制方法及电子设备 |
CN116743905B (zh) * | 2022-09-30 | 2024-04-26 | 荣耀终端有限公司 | 通话音量控制方法及电子设备 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112073797B (zh) | 一种音量调节方法及显示设备 | |
WO2021189358A1 (zh) | 显示设备和音量调节方法 | |
WO2021031629A1 (zh) | 显示设备和控制装置按键复用方法 | |
CN110708581B (zh) | 显示设备及呈现多媒体屏保信息的方法 | |
WO2020248680A1 (zh) | 视频数据处理方法、装置及显示设备 | |
WO2020248681A1 (zh) | 显示设备及蓝牙开关状态的显示方法 | |
WO2021031598A1 (zh) | 视频聊天窗口位置的自适应调整方法及显示设备 | |
WO2020248697A1 (zh) | 显示设备及视频通讯数据处理方法 | |
CN111464840B (zh) | 显示设备及显示设备屏幕亮度的调节方法 | |
CN112463267B (zh) | 在显示设备屏幕上呈现屏保信息的方法及显示设备 | |
WO2021031620A1 (zh) | 显示设备和背光亮度调节方法 | |
WO2021031589A1 (zh) | 一种显示设备及色域空间动态调整方法 | |
CN113448529B (zh) | 显示设备和音量调节方法 | |
WO2020248699A1 (zh) | 一种声音处理法及显示设备 | |
WO2020248654A1 (zh) | 显示设备及应用共同显示的方法 | |
CN112073777B (zh) | 一种语音交互方法及显示设备 | |
CN112073666B (zh) | 一种显示设备的电源控制方法及显示设备 | |
CN112073776B (zh) | 语音控制方法及显示设备 | |
CN112073812B (zh) | 一种智能电视上的应用管理方法及显示设备 | |
WO2021169125A1 (zh) | 显示设备和控制方法 | |
CN112073808B (zh) | 一种色彩空间切换方法及显示装置 | |
CN112073803B (zh) | 一种声音再现方法及显示设备 | |
CN112073759B (zh) | 双系统之间通信方式的选取及调度方法、装置及显示设备 | |
CN113448530A (zh) | 显示设备和音量控制方法 | |
CN112073773A (zh) | 一种屏幕互动方法、装置及显示设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20926959 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20926959 Country of ref document: EP Kind code of ref document: A1 |