Nothing Special   »   [go: up one dir, main page]

TW201248607A - Display apparatus and operation method thereof - Google Patents

Display apparatus and operation method thereof Download PDF

Info

Publication number
TW201248607A
TW201248607A TW100118288A TW100118288A TW201248607A TW 201248607 A TW201248607 A TW 201248607A TW 100118288 A TW100118288 A TW 100118288A TW 100118288 A TW100118288 A TW 100118288A TW 201248607 A TW201248607 A TW 201248607A
Authority
TW
Taiwan
Prior art keywords
display device
sound
sound source
image
sensor
Prior art date
Application number
TW100118288A
Other languages
Chinese (zh)
Other versions
TWI442381B (en
Inventor
Kim-Yeung Sip
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to TW100118288A priority Critical patent/TWI442381B/en
Publication of TW201248607A publication Critical patent/TW201248607A/en
Application granted granted Critical
Publication of TWI442381B publication Critical patent/TWI442381B/en

Links

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A display apparatus and an operation method of the display apparatus are provided. A plurality of sound sensors are disposed at different position of the display apparatus. The display apparatus detect a sound source by using the sound sensors to obtain a plurality of detecting results. The display apparatus judge a sound direction of the sound source in accordance with the detecting results. The display apparatus automatically change a direction of image displayed by the display apparatus in accordance with the sound direction.

Description

201248607 六、發明說明: 【發明所屬之技術領域】 本發明是有關於一種顯示裝置,且特別是有關於依照 音源位置而自動改變影像方向的一種顯示裝置及其操作方 法。 【先前技術】 大部份顯示裝置的影像方向是固定的。對於個人電腦 而3,使用者可以透過作業系統(〇perating System,〇s)的 頁示η又疋以人工方式來選擇將顯示器的影像方向設定為 〇或90,也就疋允許使用者橫向放置顯示器或縱向放置 ”、、員示益。在部分智慧型手機(smart ph〇ne)產品中配置了加 速度感測器(G-sensor)。加速度感測器可以感測到使用者是 乂縱向握持智慧型手機,還是以橫向握持智慧型手機。透 ,加速度感測器判斷該智慧型手機與環境空間的相對狀 ^冬慧型手機可以自動將其顯示幕上的影像方向從〇。旋 90也就疋允許使用者橫向觀看顯示幕上的影, 或縱向觀看顯示幕上的影像。 上的當智慧型手機_示幕被水平放置時,顯示幕201248607 VI. Description of the Invention: [Technical Field] The present invention relates to a display device, and more particularly to a display device and an operation method thereof for automatically changing an image direction in accordance with a sound source position. [Prior Art] The image orientation of most display devices is fixed. For a personal computer, 3, the user can manually select the image orientation of the display to be 〇 or 90 through the page η of the operating system (〇 ating System System ) , , , , , , , , , , , , , , , , , , , 90 90 90 90 90 90 90 The display or the vertical position, the staff benefits. In some smart phones (smart ph〇ne) products are equipped with an acceleration sensor (G-sensor). The acceleration sensor can sense the user is a longitudinal grip Holding a smart phone, it is still holding the smart phone horizontally. Through the acceleration sensor, the relative shape of the smart phone and the environment space is judged. The winter mobile phone can automatically display the image direction on the display screen. 90 also allows the user to view the shadow on the display screen horizontally, or view the image on the display screen vertically. When the smart phone _ the screen is placed horizontally, the display screen

Cf皮固定而不再改變。在多人共同觀看被水 ,置的顯示幕之應㈣合下,這些使用者要以人 ‘看:台:慧型手機的水平角度,才能以正確的影像方: 規耆顯不幕上的影像。 201248607 【發明内容】 本發明提供一種顯示裝置及其操作方法。此操作方法 可以依照音源位置而自動改變顯示裝置的影像方向。 本發明實施例提出一種顯示裝置的操作方法。此操作 方法包括:藉由多個聲音感測器偵測一音源,而獲得多個 感測結果,其中該些聲音感測器各自配置於該顯示裝置的 不同位置;依據該些感測結果判斷該音源相對於該顯示裝 置的音源方向;以及依照該音源方向改變該顯示裝置的影 像方向。 本發明實施例提出一種顯示裝置,包括一顯示模組、 多個聲音感測器以及一處理單元。顯示模組顯示一影像。 所述多個聲音感測器偵測一音源,而獲得多個感測結果。 這些聲音感測器各自配置於顯示裝置的不同位置。顯示模 組以及聲音感測器各自耦接至處理單元。處理單元依據所 述多個感測結果判斷該音源相對於該顯示裝置的音源方 向,以及依照該音源方向藉由控制該顯示模組改變該影像 的影像方向。 在本發明之一實施例中,上述之顯示裝置為一平板電 腦、一智慧型手機、一會議桌或一行動上網裝置。 在本發明之一實施例中,所述依照該音源方向改變該 影像方向之步驟包括:判斷該音源所發出的一語音指令是 否為一轉向指令;以及若該語音指令為該轉向指令,則依 照該音源方向改變該影像方向。 4 201248607 基於上述,本發明實施例利用多個聲音感測器來偵測 音源的方向’然後依據音源方向來改變顯示裝置的影像方 向。在多人從不同方向共同觀看顯示裝置的影像之應用場 合下’本發明實施例所揭露的顯示裝置可以依照音源位置 (例如使用者位置)而自動改變/旋轉影像的顯示方向。因 此’多個使用者可以更方便、更友善地在不同方向共同觀 看顯示裝置的影像。 為讓本發明之上述特徵和優點能更明顯易懂,下文特 舉實施例’並配合所附圖式作詳細說明如下。 【實施方式】 圖1是依照本發明實施例說明顯示裝置10〇的應用情 境示意圖。顯示裝置100可以是平板電腦(tablet computer)、智慧型手機(smart ph〇ne)、行動上網裝置 (Mobile Internet Device ’ MID)、或其他可顯示的電子裝 置。顯示裝置100被水平放置於桌面上,如圖丨所示。在 多人從不同方向共同觀看顯示裝置100的影像之應用場合 下,本實施例所揭露的顯示裝置i 0 0可以依照音源(例如發 出語音的使用者)的位置而自動改變/旋轉影像的顯示方 向。因此,多個使用者可以更方便、更友善地在不同方向 共同觀看顯示裝置100的影像。以下將詳述顯示裝置1〇〇 的實施細節。 圖2是依照本發明實施例說明圖1所示顯示裝置1〇〇 的功能模塊示意圖。顯示裝置1〇〇包括顯示模組110、處 201248607 理單元120以及η個聲音感測器13〇1至i3〇 可以顯示影像。這些聲音感測器13〇 :二配 置於顯示裝置動的不同位置。所述多個聲 各跡 至130·η可以偵測音源,而莽 /旧 罄立咸、、㈣咖吨传多個感測結果。所述多個 #曰感心13G.1至13G_n可以是_ (rophone)、指向性麥克風、超音波感測 感測元件暨電路。顯示模組㈣以及聲音感^^^ 130-n各自搞接至處理單元12〇,如圖2所示。 。圖3是依照本發明實施例說明圖2所示顯示裝置剛 的㈣=法流程示意圖。請參照圖2與圖3,所述多個聲 音感測器130-1至13〇·η進行步驟削以仙音源,而獲 得多個感測結果’並將這些感·果傳送給處理單元 120。前述音源可以是任何聲音來源,例如發出語音的使用 者、發出超音波的電子裝置等等。 處理單元120進行步驟S32〇,以依據聲音感測器 130-1 至130-n的感測結果判斷該音源相對於顯示裝置的音 源方向。本實施例可以採用任何演算法來定位音源方向。 例如’本實施例可以採用論文「An 〇perat〇r Interface f〇r⑽The Cf skin is fixed and does not change. When multiple people watched the water together and placed the display screen (4), these users should look at the horizontal angle of the person: the type of the smart phone, in order to use the correct image side: image. 201248607 SUMMARY OF THE INVENTION The present invention provides a display device and an operation method thereof. This method of operation automatically changes the image orientation of the display unit according to the location of the source. Embodiments of the present invention provide a method of operating a display device. The operation method includes: detecting a sound source by using a plurality of sound sensors to obtain a plurality of sensing results, wherein the sound sensors are respectively disposed at different positions of the display device; determining according to the sensing results The sound source is opposite to the sound source direction of the display device; and the image direction of the display device is changed according to the sound source direction. Embodiments of the present invention provide a display device including a display module, a plurality of sound sensors, and a processing unit. The display module displays an image. The plurality of sound sensors detect a sound source to obtain a plurality of sensing results. These sound sensors are each disposed at different locations of the display device. The display module and the sound sensor are each coupled to the processing unit. The processing unit determines a sound source direction of the sound source relative to the display device according to the plurality of sensing results, and changes an image direction of the image according to the sound source direction by controlling the display module. In an embodiment of the invention, the display device is a tablet computer, a smart phone, a conference table or a mobile internet device. In an embodiment of the present invention, the step of changing the image direction according to the direction of the sound source comprises: determining whether a voice command sent by the sound source is a steering command; and if the voice command is the steering command, The direction of the source changes the direction of the image. 4 201248607 Based on the above, the embodiment of the present invention utilizes a plurality of sound sensors to detect the direction of the sound source' and then changes the image direction of the display device according to the direction of the sound source. The display device disclosed in the embodiment of the present invention can automatically change/rotate the display direction of the image in accordance with the sound source position (e.g., user position) in the application field in which a plurality of people view images of the display device in different directions. Therefore, multiple users can view the image of the display device in different directions more conveniently and friendlyly. The above described features and advantages of the present invention will become more apparent from the description of the appended claims. [Embodiment] FIG. 1 is a schematic diagram showing an application scenario of a display device 10A according to an embodiment of the present invention. The display device 100 may be a tablet computer, a smart phone, a mobile Internet device (MID), or other displayable electronic device. The display device 100 is placed horizontally on the desktop as shown in FIG. In an application where a plurality of people view images of the display device 100 in different directions, the display device i 0 0 disclosed in the embodiment can automatically change/rotate the display of the image according to the position of the sound source (for example, a user who issues a voice). direction. Therefore, a plurality of users can view the image of the display device 100 in a different direction more conveniently and friendly. Details of the implementation of the display device 1A will be described in detail below. FIG. 2 is a schematic diagram showing functional blocks of the display device 1A of FIG. 1 according to an embodiment of the invention. The display device 1A includes a display module 110, a 201248607 control unit 120, and n sound sensors 13〇1 to i3〇 to display an image. These sound sensors 13 〇 : two are placed at different positions of the display device. The plurality of sound tracks to 130·η can detect the sound source, and the 莽/old 罄立咸, (4) coffee ton transmits a plurality of sensing results. The plurality of #曰心心s 13G.1 to 13G_n may be _ (rophone), directional microphone, ultrasonic sensing sensing component and circuit. The display module (4) and the sound sense ^^^ 130-n are each connected to the processing unit 12A, as shown in FIG. . FIG. 3 is a flow chart showing the (four)= method of the display device of FIG. 2 according to an embodiment of the invention. Referring to FIG. 2 and FIG. 3, the plurality of sound sensors 130-1 to 13〇·η perform step-cutting to the source, and obtain a plurality of sensing results′ and transmit the senses to the processing unit 120. . The aforementioned sound source may be any sound source such as a user who issues a voice, an electronic device that emits ultrasonic waves, and the like. The processing unit 120 proceeds to step S32 to determine the direction of the sound source relative to the sound source of the display device based on the sensing results of the sound sensors 130-1 to 130-n. This embodiment can use any algorithm to locate the sound source direction. For example, 'this embodiment can use the paper "An 〇perat〇r Interface f〇r (10)

Autonomous Mobile System」(由 NILESH GOEL、ADITYA AGARWAL、SUBHAYAN BANERJEE 與 CHANDRA VEER SINGH等人所發表,其詳細内容請參照 http://www.nileshg〇el.com/papers-publishing/>> 所揭露的演 算法來判斷音源方向。此論文之揭露内容是以引用之方式 併入本文中。 6 201248607 照該=====,處理單元120便依 影像的影像方向(步驟330)。完;改= 步驟测。例如圖i所示,在其卜個使用者發 ,曰:4,顯示裝置100會對應地自動改變/旋轉影像方 二^發出語音的使用者方便觀賞顯示模組u〇所顯示 的衫像。因此’多個使用者可以更方便、更友善地在不同 方向共同觀看顯示裝置100所顯示的影像,而不需要以人 工方式旋轉整台顯示裝置100的水平角度。 圖4是依照本發明另一實施例說明圖2所示顯示裝置 100的操作方法流程示意圖。圖4所示實施例可以參照圖3 的相關說明。其中,圖4所示步驟33〇包含子步驟S332 與S334。請參照圖2與圖4,當處理單元12〇在步驟S32〇 獲得音源方向後,處理單元120接著判斷該音源(使用者) 所發出的聲音是否為語音指令’以及該語音指令是否為「轉 向指令」。若音源所發出的語音指令不是預設的「轉向指 令」’則顯示裝置100回到步驟S310而不改變影像方向。 反之’若音源所發出的語音指令為「轉向指令」,則處理 單元120便依照該音源方向控制顯示模組110,使顯示模 組110改變該影像方向(步驟334)。完成步驟334後,顯示 裝置100回到步驟S310。 上述「轉向指令」以視產品設計需求及/或使用者的 應用需求而設定之。例如,本實施例將轉向指令設定為「請 轉向」或「turnpleasej。例如圖1所示’在多個使用者相 201248607 互,論的吵雜%<境中’顯示裝置動會判斷使用者所發出 的聲音疋否為#向指令」,以避免顯示裝置⑽受。桑音 干擾而,地改變/旋轉影像方向。在這些使用者的討論過 程中’若有,中-位使用者想要觀看顯示裝置丨。。所顯示 的影像,則這位使用者可以對顯示裝置1〇〇說出「請轉向」 或「turn please」(即轉向指令卜顯示裝置1〇〇判斷使用者 所發出的聲音為「轉向指令」後,處理單元12〇便會進行 步驟334 ’以依照發出「轉向指令」的音源方向控制顯示 模組110 ’使顯示模組11〇對應地自動改變/旋轉該影像方 向’以利發出「轉向指令」的使用者方便觀賞顯示模組u〇 所顯示的影像。 圖5A與圖5B是依照本發明實施例說明顯示裝置1〇〇 的應用情境示意圖。此範例將假設顯示裝置1〇〇配置兩個 聲音感測器130-1與130-2(即n=2)。聲音感測器130-1與 130-2分別被配置在顯示裝置1〇〇的二個對向侧。請參照 圖5A ’當使用者U1 (音源)發出聲音至顯示裝置1〇〇時, 聲音感測器130-1與130-2都可以偵測到使用者ui的聲 音。由於聲音感測器130-1比聲音感測器130-2更靠近使 用者U1 ’使得聲音感測器130-1 貞測到的音量會大於聲音 感測盗130-2偵測到的音量。因此,處理單元120在步驟 S320可以藉由比較聲音感測器13〇-1與130-2所情測到的 音量來判斷音源方向。由於聲音感測器130-1偵測到的音 量會大於聲音感測器130-2偵測到的音量,因此顯示裝置 100判斷音源方向為圖5A的左方。當處理單元12〇獲得音 201248607Autonomous Mobile System" (published by NILESH GOEL, ADITYA AGARWAL, SUBHAYAN BANERJEE and CHANDRA VEER SINGH, etc. For details, please refer to http://www.nileshg〇el.com/papers-publishing/>> The algorithm is used to determine the direction of the sound source. The disclosure of this paper is incorporated herein by reference. 6 201248607 According to the =====, the processing unit 120 depends on the image direction of the image (step 330). = Step measurement. For example, as shown in Figure i, in the user's voice, 曰: 4, the display device 100 will automatically change/rotate the image correspondingly. The user who makes the voice can conveniently view the display module u〇. Therefore, a plurality of users can view the image displayed by the display device 100 in different directions more conveniently and friendlyly without manually rotating the horizontal angle of the entire display device 100. FIG. 4 is in accordance with FIG. Another embodiment of the present invention is a schematic flow chart of the operation method of the display device 100 shown in Figure 2. The embodiment shown in Figure 4 can refer to the related description of Figure 3. The step 33 of Figure 4 includes sub-steps. S332 and S334. Referring to FIG. 2 and FIG. 4, after the processing unit 12 obtains the sound source direction in step S32, the processing unit 120 then determines whether the sound emitted by the sound source (user) is a voice command' and the voice command. Whether it is a "steering command". If the voice command issued by the sound source is not the preset "steering command", the display device 100 returns to step S310 without changing the image direction. Otherwise, if the voice command issued by the sound source is "steering command" The processing unit 120 controls the display module 110 according to the direction of the sound source to cause the display module 110 to change the image direction (step 334). After completing step 334, the display device 100 returns to step S310. The "steering command" is It is set according to the product design requirements and/or the application requirements of the user. For example, in this embodiment, the steering command is set to "please turn" or "turnpleasej. For example, as shown in FIG. 1 in multiple user phases 201248607 mutual The noisy %<in the environment' display device will determine whether the user's voice is a #direction command to avoid the display device (10). However, the direction of the image is changed/rotated. During the discussion of these users, if there is, the user of the middle position wants to view the display device. The displayed image, the user can view the display device. 〇 Speaking "Please turn" or "turn please" (ie, after turning to the command display device 1 to determine that the user's voice is a "steering command", the processing unit 12 will proceed to step 334 ' to issue " The sound source direction control display module 110' of the steering command "automatically changes/rotates the image direction" correspondingly to the display module 11's to facilitate the user who has issued the "steering command" to conveniently view the image displayed by the display module u. 5A and 5B are schematic diagrams showing an application scenario of a display device 1A according to an embodiment of the invention. This example will assume that the display device 1 is configured with two sound sensors 130-1 and 130-2 (i.e., n = 2). The sound sensors 130-1 and 130-2 are respectively disposed on the opposite sides of the display device 1A. Referring to FIG. 5A', when the user U1 (sound source) emits a sound to the display device 1, the sound sensors 130-1 and 130-2 can detect the sound of the user ui. Since the sound sensor 130-1 is closer to the user U1' than the sound sensor 130-2, the volume detected by the sound sensor 130-1 is greater than the volume detected by the sound sensor 130-2. Therefore, the processing unit 120 can determine the direction of the sound source by comparing the volume measured by the sound sensors 13〇-1 and 130-2 in step S320. Since the volume detected by the sound sensor 130-1 is greater than the volume detected by the sound sensor 130-2, the display device 100 determines that the sound source direction is to the left of Fig. 5A. When the processing unit 12 receives the sound 201248607

源方向後,處理單元120便依照該音源方向(使用者的 位置)控制顯示模組11〇,使顯示模組n〇改變/旋轉兮岑像 111的景>像方向(步驟330),如圖5A所示。完成步驟33〇 後,顯示裝置100回到步驟S310,以便繼續透過聲立 器130-1與130-2偵側聲音。 9 J 接下來,請參照圖5B,當使用者U2 (音源)發出聲音 至顯示裝置100時,由於聲音感測器13〇_2比聲音感測^ 130-1更靠近使用者U2,使得聲音感測器13〇_2偵測到的 音量會大於聲音感測器130-1偵測到的音量。因此^顯示 裝置100判斷音源方向為圖5B的右方。當處理單元12〇 獲得音源方向後’處理單元120便依照該音源方向(使用者 U2的位置)控制顯示模組110,使顯示模組11〇改變/旋轉 該影像111的影像方向(步驟330),如圖5B所示。 因此’如圖5A與圖5B所示,在使用者m與m其 中一個發出語音後,顯示裝置1〇〇會對應地自動改變/旋轉 影像方向’以利發出語音的使用者方便觀賞顯示模組ιι〇 所顯示的影像111。因此,使用者U1與1;2可以更方便、 更友善地在不同方向共同觀看顯示|置刚所顯示的影像 111,而不需要以人工方式旋轉整台顯示裝置1〇〇的角度。 上述實施例說明影像方向被改變/旋轉至〇。與18〇。, 然而本發明的實現方式不限於此。例如,下述實施例可以 將影像方向改變/旋轉至0。、90。、18〇。與27〇。。圖6A至 圖6D是依照本發明另一實施例說明顯示裝置二的應用 情境示意圖。此範例將假設顯示裝置1〇〇配置三個聲音感 201248607 測器130-1 ' 130-2與130-3 (即n=3)。聲音感測器13(M、 130-2與130-3分別被配置在顯示裝置1〇〇的不同側邊。請 參照圖6A,當使用者m (音源)發出聲音至顯示裝置謂 時,聲音感測器130-1、130-2與130-3都可以偵測到使用 者ui的聲音。由於聲音感測器υο」比聲音感測器13〇_2 與130-3更靠近使用者,使得聲音感測器^化丨偵測到 的音量會大於聲音感測器130_2與13〇_3偵測到的音量。 因此,顯示裝置100判斷音源方向為圖6A的左方。當處 理單元120獲得音源方向後,處理單元12〇便依照該音源 方向(使用者U1的位置)控制顯示模組丨10,使顯示模組11〇 改變/旋轉該影像111的影像方向(步驟33〇),如圖6A所 示。完成步驟330後,顯示裝置1〇〇回到步驟S31〇,以便 繼續透過聲音感測器130-1、130-2與130-3偵側聲音。 接下來,請參照圖6B,當使用者U2 (音源)發出聲音 至顯示裝置100時,由於聲音感測器Π〇-2與130-3比聲 音感測器130_1更靠近使用者U2,使得聲音感測器130_2 與130_3偵測到的音量會大於聲音感測器η"偵測到的 音量。再者’由於聲音感測器130-2至使用者U2的距離 相似於聲音感測器130-3至使用者U2的距離,使得聲音 感測器130-2與130-3 <貞測到的音量會約略相同。因此, 顯不裝置100判辦音源方向為圖6B的右方。當處理單元 120獍得音源方向後’處理單元12〇便依照該音源方向(使 用者U2的位置)控制顯示模組11 〇,使顯示模組11 〇改變/ 旋轉該影像111的影像方向(步驟330),如圖6B所示。 201248607 當使用者U3 (音源)發出聲音至顯示裝置1 〇〇時,請參 照圖6C,由於聲音感測器130_3比聲音感測器^04與 130-2更靠近使用者U3,使得聲音感測器13〇-3偵測到的 曰里會大於聲音感測器130-1與130-2偵測到的音量。因 此,顯示裝置100判斷音源方向為圖6C的下方。當處理 單元120獲得音源方向後,處理單元12〇便依照該音源方 向(使用者U3的位置)控制顯示模組11〇,使顯示模組11〇 改變/旋轉該影像111的影像方向(步驟33〇),如圖6C所示。 當使用者U4(音源)發出聲音至顯示裝置1〇〇時,請參 照圖6D’由於聲音制器13G_2比聲音制^⑽·!與 i3〇:3更靠近使用者U4 ’使得聲音感測器13〇_2制到的 音量會大於聲音感測器13(M與13〇_3偵測到的音量。因 此,顯不裝i 100判斷音源方向為圖6D的上方。當處理 用得音源方向後,處理單元12G便依照該音源方 向(使用者U4的位置)控制顯示模組11〇,使顯示模 改變^該^像U1的影像方向(步驟夠,如圖仍所示。 诳U4 t 至圖6〇所示,在使用者U1、仍、U3 與U4其中-個發出語音後,顯置 改變/旋轉影像方向,以利發 二二曰對應地自動 模組m所顯示的影像u;因。:的使用者方便觀賞顯示 方便、更左盖J 此,使用者U1〜U4可以更 方便更友善地在不同方向共更 的影像11卜而不需要以人τ 裝置所顯示 的角度。 人方式旋轉整台顯示裝置100 11 201248607 圖7是依照本發明又一實施例說明影像方向的改變/ 力疋轉角度示意圖。本實施例可以將影像方向改變/旋轉至 0。、45。、90。、135。、180。、225。、270。與 315。。聲音感測 器130-1、130-2與130-3被配置在顯示裝置1〇〇,且分位 於顯示裝置100的180。、315。與45。的位置。本實施例可 以參照圖1〜4、圖5A〜5B與圖6A〜6D的相關說明,在此 不再贊述。 圖8是依照本發明另一實施例說明圖1所示顯示裝置 川〇的功能模塊示意圖。顯示裝置100包括顯示模組U0、 處理單元120、η個聲音感測器130-1至130-n以及至少一 個體態感測器140。本實施例可以參照圖1〜4、圖5a〜5B、 ,6A〜6D與圖7的相關說明。體態感測器14〇耦接至處理 單元120。體態感測器140可以判斷顯示裝置1〇〇與環境 空間的相對狀態。例如,體態感測器14〇可以判斷顯示裝 置100被使用者垂直握持或是橫向握持,還是顯示裝置 被水平擺放在桌面上。體態感測器14〇可以是任何物理感 測器,例如滾珠開關、加速度感測器(G_sens〇r)、陀螺儀感 測器^丫1'0 361180〇、運動感測器(111(^〇11此1^〇1_)及/或羅盤感 測器 (compass sensor)等等。 顯不裝置100可以藉由體態感測器14〇判斷顯示裝置 1〇〇是否被水平放置。若顯示裝置100被水平放置(如圖i 所不)’則處理單7C 120進行步驟S32〇與S33〇,也就是判 2音源方向並錢該音源柏控纖示模组UG,使顯示 板組110對應地改變影像方向。 12 201248607 若該顯示裝置100未被水平放置,則該處理單元12〇 依照所述至少一體態感測器14〇的感測結果改變影像方 向。例如’體態感測器140可以感測到使用者是以縱向振 持智慧型手機’還是以橫向握持智慧型手機。依據體態感 測器140的感測結果’顯示裝置1〇〇可以自動將其顯示模 組110上的影像方向旋轉至縱向或是橫向。 圖9是依照本發明另一實施例說明顯示裝置1〇〇的應 用情境示意圖。圖9所示實施例可以參照圖1〜圖4、圖5A〜 圖5B、圖6A〜圖6D與圖7〜圖8的相關說明。不同於前述 諸實施例之處’在於本實施例中顯示裝置1〇〇是具有顯示 模組110的桌子(例如會議桌等)。顯示模組u〇被配置在 桌面,如圖1所示。多個使用者可以從不同方向共同觀看 顯示模組110的影像。此範例將假設顯示裝置1〇〇配置三 個聲音感測器130-1、130-2與130-3 (即n=3)。聲音咸測 器130-1、130-2與130-3各自備配置在桌面的不同側邊, 以便感測使用者的聲音。依照使用者的語音指令,顯示裝 置100可以依照音源(例如發出語音指令的使用者)的位置 而自動改變/旋轉影像的顯示方向。 綜上所述,上述諸實施例利用多個聲音感測器 130-1〜130-n來偵測音源的方向,然後依據音源方向來改變 顯示裝置100的影像方向。不論是單人觀看或多人從不同 方向共同觀看顯示裝置1〇〇的影像U1,顯示裝置1〇〇均 可以依照音源位置(例如使用者位置)而自動改變/旋轉影像 13 201248607 的顯示方向。因此,顯示裝置100提供更方便、更友善的 顯不介面。 义雖然本發明已以實施例揭露如上,然其並非用以限定 本發明,任何所屬技術領域中具有通常知識者,在不脫離 本發明之精神和範_ ’當可作料之更動與麟,故本 發明之保護範圍當視後附之中請專利範圍所界定者為準。 【圖式簡單說明】 圖1是依照本發明實施例說明顯示裝置的應用 意圖。 圖2是依照本發明實施例說明圖1所示顯示裝置的功 能模塊示意圖。 圖3是依照本發明實施例說明圖2所示顯示裴置的操 作方法流程示意圖。 圖4是依照本發明另一實施例說明圖2所示顯示裝置 的操作方法流程示意圖。 ^ 圖5A與圖5B是依照本發明實施例說明顯示骏置的應 用情境示意圖。 ~ 圖6A至圖6D是依照本發明另一實施例說明顯示 的應用情境示意圖。 ' 的改變/ 圖7是依照本發明又一實施例說明影像方向 旋轉角度示意圖。 圖8是依照本發明另一實施例說明圖1所示顯示带置 的功能模塊示意圖。 201248607 圖9是依照本發明另一實施例說明顯示裝置的應用情 境示意圖。 【主要元件符號說明】 100 顯示裝置 110 顯示模組 111 影像 120 處理單元 130-1〜130-3、130-n :聲音感測器 140 :體態感測器 S310〜S334 :步驟 U1〜U4 ··使用者 15After the source direction, the processing unit 120 controls the display module 11 according to the direction of the sound source (the position of the user), so that the display module n〇 changes/rotates the scene of the image 111 and the image direction (step 330), such as Figure 5A shows. Upon completion of step 33, the display device 100 returns to step S310 to continue detecting sounds through the sound holders 130-1 and 130-2. 9 J Next, referring to FIG. 5B, when the user U2 (sound source) emits a sound to the display device 100, since the sound sensor 13〇_2 is closer to the user U2 than the sound sensing ^130-1, the sound is made The volume detected by the sensor 13〇_2 will be greater than the volume detected by the sound sensor 130-1. Therefore, the display device 100 determines that the sound source direction is the right side of Fig. 5B. After the processing unit 12 obtains the sound source direction, the processing unit 120 controls the display module 110 according to the sound source direction (the position of the user U2), so that the display module 11 〇 changes/rotates the image direction of the image 111 (step 330). , as shown in Figure 5B. Therefore, as shown in FIG. 5A and FIG. 5B, after one of the users m and m emits a voice, the display device 1 自动 automatically changes/rotates the image direction correspondingly to facilitate the user who plays the voice to conveniently view the display module. Ιι〇 The displayed image 111. Therefore, the users U1 and 1; 2 can more conveniently and more conveniently view the image 111 displayed in the different directions in different directions without manually rotating the angle of the entire display device 1〇〇. The above embodiment illustrates that the image direction is changed/rotated to 〇. With 18 baht. However, the implementation of the present invention is not limited thereto. For example, the following embodiment can change/rotate the image direction to zero. 90. 18〇. With 27 baht. . 6A-6D are schematic diagrams showing an application scenario of a display device 2 according to another embodiment of the present invention. This example will assume that the display device 1 is configured with three sound senses 201248607 detectors 130-1 '130-2 and 130-3 (i.e., n=3). The sound sensors 13 (M, 130-2, and 130-3 are respectively disposed on different sides of the display device 1). Referring to FIG. 6A, when the user m (sound source) emits a sound to the display device, the sound is The sensors 130-1, 130-2, and 130-3 can detect the sound of the user ui. Since the sound sensor υο" is closer to the user than the sound sensors 13〇_2 and 130-3, The volume detected by the sound sensor is greater than the volume detected by the sound sensors 130_2 and 13〇_3. Therefore, the display device 100 determines that the sound source direction is to the left of Fig. 6A. When the processing unit 120 After obtaining the direction of the sound source, the processing unit 12 controls the display module 10 according to the direction of the sound source (the position of the user U1), so that the display module 11 〇 changes/rotates the image direction of the image 111 (step 33〇), such as 6A, after the completion of step 330, the display device 1 returns to step S31, in order to continue to detect the sound through the sound sensors 130-1, 130-2 and 130-3. Next, please refer to FIG. 6B. When the user U2 (sound source) emits a sound to the display device 100, the sound sensor Π〇-2 and 130-3 have a sense of sound. The device 130_1 is closer to the user U2, so that the volume detected by the sound sensors 130_2 and 130_3 is greater than the detected volume of the sound sensor η". Further, 'because the sound sensor 130-2 to the user U2 The distance is similar to the distance between the sound sensor 130-3 and the user U2, so that the sounds measured by the sound sensors 130-2 and 130-3 will be approximately the same. Therefore, the display device 100 determines the sound source. The direction is the right side of FIG. 6B. After the processing unit 120 obtains the sound source direction, the processing unit 12 controls the display module 11 依照 according to the sound source direction (the position of the user U2), so that the display module 11 〇 changes/rotates. The image direction of the image 111 (step 330) is as shown in Fig. 6B. 201248607 When the user U3 (sound source) emits a sound to the display device 1 ,, please refer to FIG. 6C, because the sound sensor 130_3 is more sensitive than the sound. The devices ^04 and 130-2 are closer to the user U3, so that the sound detected by the sound sensor 13〇-3 is larger than the volume detected by the sound sensors 130-1 and 130-2. Therefore, the display The device 100 determines that the sound source direction is below the image of Figure 6C. When the processing unit 120 obtains the sound source direction The processing unit 12 controls the display module 11A according to the direction of the sound source (the position of the user U3) to cause the display module 11 to change/rotate the image direction of the image 111 (step 33A), as shown in FIG. 6C. When the user U4 (sound source) emits a sound to the display device 1〇〇, please refer to FIG. 6D' because the sound controller 13G_2 is closer to the user U4' than the sound system (10)·! and i3〇:3, so that the sound is sensed. The volume produced by the device 13〇_2 will be greater than the volume detected by the sound sensor 13 (M and 13〇_3). Therefore, it is apparent that the i 100 is judged to be the sound source direction as the upper side of Fig. 6D. After processing the direction of the sound source, the processing unit 12G controls the display module 11 according to the direction of the sound source (the position of the user U4), so that the display mode changes the image direction of the image U1 (the steps are sufficient, as shown in the figure)诳U4 t to Fig. 6〇, after the user U1, still, U3 and U4 emit a voice, the direction of the change/rotation image is displayed to facilitate the automatic module m corresponding to the second and second The displayed image u; because: the user is convenient to view and display, and the left cover J. The user U1~U4 can more conveniently and friendlyly share the image in different directions without the need for the human τ device. The angle of the display. The human mode rotates the entire display device 100 11 201248607 FIG. 7 is a schematic diagram illustrating the change of the image direction/force rotation angle according to still another embodiment of the present invention. This embodiment can change/rotate the image direction to 0. 45, 90, 135, 180, 225, 270, and 315. The sound sensors 130-1, 130-2, and 130-3 are disposed on the display device 1 and are located in the display device The position of 180, 315, and 45 of 100. This embodiment can be used as reference 1 to 4, FIGS. 5A to 5B and FIGS. 6A to 6D are not described herein. FIG. 8 is a schematic diagram showing the function of the display device shown in FIG. 1 according to another embodiment of the present invention. The device 100 includes a display module U0, a processing unit 120, n sound sensors 130-1 to 130-n, and at least one body sensor 140. This embodiment can refer to FIGS. 1 to 4 and 5a to 5B. 6A~6D are related to the description of Fig. 7. The body state sensor 14 is coupled to the processing unit 120. The body state sensor 140 can determine the relative state of the display device 1 and the environmental space. For example, the body state sensor 14 It can be judged whether the display device 100 is vertically or laterally held by the user, or whether the display device is horizontally placed on the table. The body sensor 14 can be any physical sensor such as a ball switch or an acceleration sensor. (G_sens〇r), gyroscope sensor ^丫1'0 361180〇, motion sensor (111 (^〇11111_) and/or compass sensor, etc.) The device 100 can determine whether the display device 1 is horizontally placed by the body sensor 14 If the display device 100 is placed horizontally (as shown in FIG. 1), the processing unit 7C 120 performs steps S32 and S33, that is, the direction of the sound source is 2, and the sound source is controlled by the module UG to make the display panel. 110 correspondingly changes the image direction. 12 201248607 If the display device 100 is not placed horizontally, the processing unit 12 changes the image direction according to the sensing result of the at least one state sensor 14A. For example, the body sensor The 140 can sense whether the user is holding the smart phone in the vertical direction or holding the smart phone horizontally. According to the sensing result of the body sensor 140, the display device 1 can automatically rotate the image direction on the display module 110 to the portrait or landscape direction. Figure 9 is a diagram showing an application scenario of a display device 1 in accordance with another embodiment of the present invention. The embodiment shown in FIG. 9 can be referred to the related description of FIGS. 1 to 4, 5A to 5B, 6A to 6D, and 7 to 8. The difference from the foregoing embodiments is that the display device 1A in this embodiment is a table (e.g., conference table, etc.) having the display module 110. The display module u〇 is configured on the desktop, as shown in Figure 1. Multiple users can view the image of display module 110 in common from different directions. This example will assume that the display device 1 is configured with three sound sensors 130-1, 130-2 and 130-3 (i.e., n = 3). The sound scenters 130-1, 130-2, and 130-3 are each disposed on different sides of the tabletop to sense the user's voice. In accordance with the user's voice command, the display device 100 can automatically change/rotate the display direction of the image in accordance with the position of the sound source (e.g., the user who issued the voice command). In summary, the above embodiments use a plurality of sound sensors 130-1~130-n to detect the direction of the sound source, and then change the image direction of the display device 100 according to the direction of the sound source. The display device 1 can automatically change/rotate the display direction of the image 13 201248607 in accordance with the sound source position (e.g., user position), whether it is a single person viewing or a plurality of people viewing the image U1 of the display device 1 from different directions. Therefore, the display device 100 provides a more convenient and friendly display interface. Although the present invention has been disclosed in the above embodiments, it is not intended to limit the present invention, and any one of ordinary skill in the art may, without departing from the spirit and scope of the present invention, The scope of protection of the invention is subject to the definition of the scope of the patent. BRIEF DESCRIPTION OF THE DRAWINGS Fig. 1 is a view showing an application intention of a display device in accordance with an embodiment of the present invention. FIG. 2 is a schematic diagram showing the functional modules of the display device shown in FIG. 1 according to an embodiment of the invention. FIG. 3 is a flow chart showing an operation method of the display device shown in FIG. 2 according to an embodiment of the present invention. FIG. 4 is a flow chart showing the operation method of the display device shown in FIG. 2 according to another embodiment of the present invention. Figure 5A and Figure 5B are schematic diagrams showing the application scenario of the display device in accordance with an embodiment of the present invention. FIG. 6A to FIG. 6D are schematic diagrams showing an application scenario of a display according to another embodiment of the present invention. 'Changes' FIG. 7 is a schematic view showing an image direction rotation angle according to still another embodiment of the present invention. FIG. 8 is a schematic diagram showing the functional blocks of the display tape set of FIG. 1 according to another embodiment of the present invention. 201248607 FIG. 9 is a schematic diagram showing an application scenario of a display device according to another embodiment of the present invention. [Description of main component symbols] 100 Display device 110 Display module 111 Image 120 Processing units 130-1 to 130-3, 130-n: Sound sensor 140: Body sensor S310 to S334: Steps U1 to U4 ·· User 15

Claims (1)

201248607 七、申請專利範圍·· 一種顯示裝置的操作方法,包括. =多個聲音感測器偵測—音源,而 二:其㈣音感測器各自配置於該顯示裝置的= 音:該=測結果該音源相對於該顯示裝置的一 依照該音源方向改變該顯示裝置的—影像方向。 法,二=圍第1項所述顯示裝置的操作方 議桌或-行動上網裝置。 《慧型手機、-會 、土甘3士如申5月專利範圍第1項所述顯示裝置的操作方 > \中所賴照該音源方向改變該影像方向、 以及判斷該音源所發出的—語音指令衫為-轉向指令 該影音指令為該轉向指令,則依照該音源方向改變 冰,申明專利辜巳圍第1項所述顯示裳置的操作方 /¾ εί><〇 * .藉由至少-體態感測器判斷該顯示襄置是否被水平放 置; 若該顯示裝置被水平放置,則進行所述依照該音源方 向改變戎影像方向之步驟;以及 16 201248607 感測水平放置’則依輕 =如申請專利顧第4項所述顯 二其中;f態感測器包括-加速度感測器、一= I、—運動感測器及/或一羅盤感測器。 … 6·—種顯示裝置,包括·· 一顯示模組,用以顯示一影像,· 於果多—器’、用以_—音源’而獲得多個感測 位置;^及雜聲音感測器各自配置於該顯示裝置的i同 器,兮接至該顯示模組以及該些聲音感測 向’以及依照該音源方向藉由控制該顯 丁模、、且改變该影像的一影像方向。 =如中請專利範圍第6項所述之顯 腦、一智慧型手機、一會議桌或一㈣貞 理ηαδ·如申料利範圍第6項所述之顯示裝置,其中該處 人:7L經由該些聲音感測器判斷該音源所發出的一語音指 =為—轉向指令;以及若該語音指令為該轉向曰 岑=处理單元依照該音源方向藉由控制該顯示模組改變該 衫像方向。 人 9·如申請專利範圍第6項所述之顯示裝置,更包括: 17 201248607 至少一體態感測器,耦接至該處理單元,用以判斷該 顯示裝置與環境空間的相對狀態; 其中若該顯示裝置被水平放置,則該處理單元依照該 音源方向藉由控制該顯示模組改變該影像方向;以及若該 顯示裝置未被水平放置,則該處理單元依照所述至少一體 態感測器改變該影像方向。 10.如申請專利範圍第9項所述之顯示裝置,其中該 體態感測器包括一加速度感測器、一陀螺儀感測器、一運 動感測器及/或一羅盤感測器。 18201248607 VII. Patent application scope ·· A method for operating a display device, including: = multiple sound sensor detection - sound source, and two: its (four) sound sensor is respectively configured on the display device = sound: the = test As a result, the sound source changes the image direction of the display device according to the direction of the sound source with respect to the display device. Method, two = the operating table of the display device described in item 1 or the mobile internet device. "Hua-type mobile phone, - Hui, Tu Gan 3, such as the operator of the display device described in the first paragraph of the patent scope of May 5", the direction of the image is changed according to the direction of the sound source, and the sound source is judged. - the voice command shirt is - the steering command is the steering command, the ice is changed according to the direction of the sound source, and the operation side of the display of the first item described in the first paragraph of the patent is /3⁄4 εί><〇*. Determining, by the at least one body sensor, whether the display device is horizontally placed; if the display device is horizontally placed, performing the step of changing the direction of the image according to the direction of the sound source; and 16 201248607 sensing the horizontal placement' Light = as described in the patent application item 4; the f-state sensor includes an acceleration sensor, a = I, a motion sensor and/or a compass sensor. ... 6---display device, including a display module for displaying an image, and obtaining a plurality of sensing positions for the fruit-to-device's _-sound source; Each of the devices is disposed on the display device, coupled to the display module and the sound sensing directions, and by controlling the display mode according to the sound source direction, and changing an image direction of the image. = display device as described in item 6 of the patent scope, a smart phone, a conference table or a (four) care ηαδ·, as described in claim 6, wherein the person: 7L Determining, by the sound sensors, a voice finger issued by the sound source is a steering command; and if the voice command is the steering switch, the processing unit changes the shirt image by controlling the display module according to the sound source direction. direction. The display device of the invention of claim 6, further comprising: 17 201248607 at least an integrated state sensor coupled to the processing unit for determining a relative state of the display device and the environmental space; The processing unit is horizontally placed, and the processing unit changes the image direction by controlling the display module according to the sound source direction; and if the display device is not horizontally placed, the processing unit is configured according to the at least one integrated state sensor Change the orientation of the image. 10. The display device of claim 9, wherein the body sensor comprises an acceleration sensor, a gyroscope sensor, a motion sensor, and/or a compass sensor. 18
TW100118288A 2011-05-25 2011-05-25 Display apparatus and operation method thereof TWI442381B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100118288A TWI442381B (en) 2011-05-25 2011-05-25 Display apparatus and operation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW100118288A TWI442381B (en) 2011-05-25 2011-05-25 Display apparatus and operation method thereof

Publications (2)

Publication Number Publication Date
TW201248607A true TW201248607A (en) 2012-12-01
TWI442381B TWI442381B (en) 2014-06-21

Family

ID=48138786

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100118288A TWI442381B (en) 2011-05-25 2011-05-25 Display apparatus and operation method thereof

Country Status (1)

Country Link
TW (1) TWI442381B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106201178A (en) * 2016-06-29 2016-12-07 深圳市金立通信设备有限公司 A kind of adjustment screen display direction control method and terminal
CN109557951A (en) * 2018-12-10 2019-04-02 深圳Tcl数字技术有限公司 Adjusting method, television set and the computer readable storage medium of angle of television

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106201178A (en) * 2016-06-29 2016-12-07 深圳市金立通信设备有限公司 A kind of adjustment screen display direction control method and terminal
CN109557951A (en) * 2018-12-10 2019-04-02 深圳Tcl数字技术有限公司 Adjusting method, television set and the computer readable storage medium of angle of television
CN109557951B (en) * 2018-12-10 2022-07-29 深圳Tcl数字技术有限公司 Television angle adjusting method, television and computer readable storage medium

Also Published As

Publication number Publication date
TWI442381B (en) 2014-06-21

Similar Documents

Publication Publication Date Title
US9910505B2 (en) Motion control for managing content
US10317947B2 (en) Electronic device and method for processing gesture thereof
KR102169952B1 (en) Wearable device and method of controlling thereof
EP3246903B1 (en) Multi-purpose conference terminal
US20140101578A1 (en) Multi display device and control method thereof
CN108499105A (en) The method, apparatus and storage medium of visual angle adjustment are carried out in virtual environment
US20170075479A1 (en) Portable electronic device, control method, and computer program
TWI502478B (en) Touch screen electronic device and control method thereof
CN111665983A (en) Electronic device and display method thereof
KR20130142824A (en) Remote controller and control method thereof
CN110427110A (en) A kind of live broadcasting method, device and direct broadcast server
CN109166150B (en) Pose acquisition method and device storage medium
KR20150129386A (en) Foldable display device and method for controlling the same
US10019140B1 (en) One-handed zoom
US20160188021A1 (en) Method of receiving user input by detecting movement of user and apparatus therefor
CN109725683A (en) A kind of program display control method and Folding screen terminal
JP2010086192A (en) Mobile device, computer program, and recording medium
TW201426406A (en) Electronic device and method for rotating display image of the electronic device
WO2016054945A1 (en) Display method and apparatus for terminal display interface
TW201506776A (en) Method for adjusting screen displaying mode and electronic device
JP6202698B1 (en) Housing and system
CN102820022A (en) Display device and operation method thereof
JPWO2014006757A1 (en) Display device and control method of display device
TWM543398U (en) Virtual input device for use with mobile phones
US9767735B2 (en) Terminal device and illumination control method