EP0670537A1 - Hands-free user-supported portable computer - Google Patents
Hands-free user-supported portable computer
- Publication number
- EP0670537A1 (application EP94300856A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- voice
- electrical signals
- information
- recognition module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01R—MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
- G01R1/00—Details of instruments or arrangements of the types included in groups G01R5/00 - G01R13/00 and G01R31/00
- G01R1/02—General constructional details
- G01R1/025—General constructional details concerning dedicated user interfaces, e.g. GUI, or dedicated keyboards
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0143—Head-up displays characterised by optical features the two eyes not being equipped with identical nor symmetrical optical devices
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/30—Polarising elements
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
- G10L15/10—Speech classification or search using distance or distortion measures between unknown speech and reference templates
Definitions
- This invention relates to computers, and more particularly to a user-supported portable computer.
- ETM: electronic technical manuals
- IETM: interactive electronic technical manuals
- ETMs and IETMs are essentially electronic databases which are typically housed in conventional computers having a keyboard for user input and a full-sized monitor for information display. An operator may use the computer in order to access and display data stored in the ETMs and IETMs for a variety of uses including troubleshooting and repair/replacement of a system, subsystem or component thereof.
- ETMs and IETMs are particularly useful in service and repair industries wherein technicians often require detailed information from technical manuals to repair and service malfunctioning devices.
- ETMs and IETMs are useful in automobile repair centers wherein service personnel find it necessary to access information in automobile technical manuals in order to service malfunctioning automobiles.
- ETMs and IETMs are useful in military service centers wherein military technicians often require access to information in military technical manuals to service and repair malfunctioning weapon systems. In such scenarios, it is more efficient to access information from ETMs and IETMs rather than printed publications since the printed publications may be voluminous.
- ETMs and IETMs are traditionally stored in and accessed from conventional computers having keyboards for operator input and full-sized video monitors for displaying data. Such computers are often located in service areas adjacent to the devices being repaired. In operation, maintenance personnel move back and forth between the computers and the devices being repaired in order to retrieve data required to service those devices. Such movement consumes a considerable amount of time and effort solely to retrieve data from the ETMs and IETMs. Conventional computers are therefore inefficient vehicles for delivering the information in ETMs and IETMs to operators.
- the present invention is accordingly directed to a compact, self-contained portable computing apparatus which is completely supported by a user for hands-free retrieval and display of information for the user.
- the computing apparatus includes a housing having securing means for removably securing the housing to a user for support by the user.
- the housing further includes storage means for storing previously entered information, and processor means, communicating with the storage means, for receiving, retrieving and processing information and user commands in accordance with a stored program.
- the computing apparatus also includes audio transducer and converter means, in communication with the processor means, for receiving audio commands from the user, for converting the received audio commands into electrical signals, for recognizing the converted electrical signals, and for sending the recognized electrical signals to the processor means, the audio transducer and converter means also being supported by the user.
- the computing apparatus further includes display means in communication with the processor means for receiving information from the processor means and for displaying the received information for the user, the display means being supported by the user whereby the user may operate the computing apparatus to display information in a hands-free manner utilizing only audio commands.
- Fig. 1 is a front schematic view of an operator (also called a user) wearing a compact, portable computer 102 in accordance with the present invention.
- Fig. 2 is a side view of the operator wearing the computer 102.
- the location of the components of the computer 102 on the operator as shown in Figs. 1 and 2 is representational only and may vary depending on operator convenience and comfort.
- the computer 102 includes a housing such as system unit 106 having a securing means, in the present embodiment a strap or belt 104, which is adapted to be worn around the operator's waist for securing the housing or system unit to the user for support by the user.
- the computer 102 further includes display means for receiving information from the system unit 106 and for displaying the received information for the user or operator.
- the display means in the present embodiment, includes a headband 108, a display screen 110, and an adjustable arm 112 connecting the display screen 110 to the headband 108.
- the headband 108 is adapted to be worn by the user in any convenient location, but preferably upon the user's forehead, as shown.
- the position of the display screen 110 may be adjusted via the adjustable arm 112 so that the operator can comfortably view information displayed on the display screen 110.
- the display screen 110 is electrically connected to the system unit 106, in the present embodiment, via a cable 114, although other connection means may alternatively be employed.
- the computer 102 further includes audio transducer and converter means in communication with the system unit 106 for receiving audio commands from the user, for converting the received audio commands into electrical signals, for recognizing the converted electrical signals and for sending the recognized electrical signals to a processor within the system unit 106.
- the audio transducer and converter means includes a microphone 122 for receiving verbal commands from the operator.
- the microphone 122 which, in the present embodiment, is electrically connected to the system unit 106 via a cable 124 is preferably an ear-supported microphone, although those with ordinary skill in the art will appreciate that any audio-input or transducer device could be used and that the audio-input or transducer could be supported by the user at some other location such as proximate the mouth or throat of the user.
- the computer 102 further includes measurement means in communication with the system unit 106 for performing electrical measurements on devices being evaluated by the computer 102 (such evaluation including, but not limited to, testing, calibrating, troubleshooting, diagnosing and servicing).
- the measurement means includes an instrument pack 116 which is attachable to the belt 104 and is electrically connectable via cables 118 to a device 120 which is to be tested, analyzed, repaired or the like.
- the instrument pack 116 is also electrically connected to system unit 106, in the present embodiment, via a cable 126.
- the computer system 102 is adapted to be completely supported by a user or operator.
- the display screen 110 is placed to permit the user to accomplish other tasks, i.e., servicing a device, while glancing at the screen 110 for information concerning the task being performed.
- the microphone 122 permits the user to verbally control the computer system 102 to display desired information while maintaining the user in a hands-free mode for unimpaired performance of the task.
- the instrument pack 116 permits the computer 102 to obtain information from a device, such as a device being serviced, while maintaining the user in a hands-free mode with respect to the computer system 102 (note that the user may be handling the device being serviced, test equipment or other equipment while using the computer system 102).
- the user is able to perform a task in a more efficient manner, since the user can access data from and input data to the computer system 102 in a hands-free mode with, at most, minimal diversion from the task being performed.
- Fig. 3 is an exterior perspective view of the system unit 106. Because the system unit 106 is intended to be supported by an operator, the system unit 106 is lightweight and small sized, in the present embodiment, preferably about five inches by six inches by three inches with a weight of approximately three pounds.
- the system unit 106 includes a top panel 302, a bottom panel 310, a front panel 312, a back panel 308, a first side panel 304, and a second side panel 306. Connected to the back panel 308 is a clip 328 for attaching the system unit 106 to the belt 104.
- Located on the top panel 302 is a microphone jack 314 which is of a type well known in the art.
- the cable 124 connected to the microphone 122 is removably connected to the system unit 106 via the microphone jack 314 using a suitable connector (not shown). Also located on the top panel 302 is a voice input indicator 318, preferably a light-emitting diode (LED), which illuminates to visually confirm when the ear microphone 122 receives verbal input from the operator.
- a voice output indicator 316 and a volume control 320 are also located on the top panel 302.
- the top panel 302 also includes a speaker 332.
- the voice output indicator 316, preferably a single LED, illuminates to visually confirm when the computer 102 is outputting synthesized or digitized speech through the speaker 332 for the purpose of providing the user with information, queries, instructions, messages or other feedback.
- the volume control 320, preferably a rotatable knob or depressible button, controls the volume level of the audio output of the speaker 332.
- also located on the top panel 302 are a reset system button 322, a Power On/Off button 324 and a Power-On indicator 326.
- the Power-On indicator 326, preferably a single LED, illuminates to visually confirm when the system unit 106 is turned on via the ON/OFF button 324.
- the system unit 106 initializes when the reset system button 322 is pressed.
- also located on the top panel 302 is an input means in the form of a pointing device 330, such as a sensitive touch pad, a joy stick, a roller ball or a gyroscopic mouse.
- the specific locations of the above-described elements on the top panel 302 as shown in Fig. 3 are representational only and may vary among particular implementations for ergonomic or other reasons. It should also be appreciated that the particular implementation of such elements (i.e., LEDs for indicators) may also vary.
- in an alternate embodiment, the microphone 122 (shown in Fig. 1) is a microphone/speaker assembly having a first transducer for receiving audio signals and a second transducer for outputting audio signals, the second transducer essentially being an audio speaker.
- the microphone/speaker assembly 122 may be, for example, a well-known bone-conduction device.
- in this alternate embodiment, the speaker 332 shown in Fig. 3 is not necessary, and thus is not located on the top panel 302 and, in fact, is not part of the computer system 102.
- Fig. 4 is an exterior plan view of the bottom panel 310 of the system unit 106.
- the bottom panel 310 includes a monitor or display port 402, two serial ports 404a and 404b, a parallel port 406, a keyboard port 410, a mouse port 412 and an external power supply port 408.
- the serial ports 404a and 404b are RS-232 compatible and the parallel port 406 is Centronics compatible.
- the system unit 106 may be modified to include additional and/or different ports and connectors without diverging from the spirit and intent of the present invention.
- Fig. 5 is a schematic block diagram of the primary structural features of the computer 102 in accordance with the present embodiment.
- the computer 102 includes a bus 502, which preferably has a data width of at least sixteen bits.
- the bus 502 is contained in the system unit 106.
- the computer 102 also includes processor means such as central processing unit (CPU) 504, which is connected to the bus 502 and is also preferably contained in the system unit 106.
- the CPU 504 is an 80286 or 80386SX microprocessor available from Intel. It will be appreciated by those of ordinary skill in the art that while an 80286 or 80386SX microprocessor is preferred, any other central processor or microprocessor, either available presently or in the future, could be used.
- the computer 102 also includes a memory 506 having, for example, one Mbyte to twenty Mbytes of random access memory (RAM).
- the memory 506, which is also connected to the bus 502 and is preferably contained in the system unit 106, stores an application program 508 while the computer 102 is operating.
- the application program 508 may have been loaded into the memory 506 from a magnetic storage device 519 (described below) pursuant to operator instructions.
- the computer 102 also includes an input/output interface 510 which controls all data transfers between the CPU 504 and certain other components (herein called peripherals) which communicate with the CPU 504 but which are not connected directly to the bus 502.
- the input/output interface 510 includes a video interface, a controller for at least two RS-232 compatible serial ports, a controller for the Centronics-compatible parallel port, keyboard and mouse controllers, a floppy disk controller, and a hard drive interface.
- the input/output interface 510 could include additional and/or different interfaces and controllers for use with other types of peripherals, such as Ethernet®, ARCNET® or Token Ring interfaces.
- the input/output interface 510 is connected to the bus 502 and preferably is located in the system unit 106.
- the computer 102 also includes input/output connectors 518 which collectively represent the above-described physical peripheral ports and accompanying electrical circuitry.
- the input/output connectors 518 include the monitor port 402, serial ports 404a and 404b, parallel port 406, keyboard port 410, and mouse port 412 shown in Fig. 4.
- the input/output connectors 518 could include additional and/or different types of physical ports.
- the computer 102 also includes a power converter 536 which is connected to an internal battery 539, an external battery 540 and/or an AC power source such as a conventional electrical outlet (not shown in Fig. 5).
- the power converter 536 and the internal battery 539 are preferably located in the system unit 106 while the external battery 540 is located outside of the system unit 106, preferably attached to the belt 104. (The external battery 540 is not shown in Figs. 1 and 2).
- the external battery 540 is connected to the power converter 536 via the external power supply port 408 shown in Fig. 4.
- the power converter 536 may be connected to the AC power source for supplying regulated DC power to the computer 102.
- the power converter 536 is usually connected to the internal battery 539 and/or the external battery 540 for supplying regulated DC power to the computer 102.
- the internal battery 539 supplies power to the power converter 536 (and ultimately the computer 102) only when the power converter 536 is not connected to either the external battery 540 or the AC power source.
- the computer 102 further includes a separate battery charger 534 for periodically charging the internal battery 539 and the external battery 540 when not in use.
- the computer 102 may include a battery power indicator, attached to the top panel 302 of the system unit 106, for indicating when the power levels of the external battery 540 and/or internal battery 539 are low.
- the bus 502, CPU 504, memory 506, input/output interface 510, input/output connectors 518, and power converter 536 described above are implemented using a backplane circuit card, processor circuit card, memory circuit card, input/output circuit card, and input/output connection circuit card in a manner well known to those skilled in the art.
- the processor circuit card, memory circuit card, input/output circuit card, and input/output connection circuit card are plugged into the backplane circuit card.
- IBM PC/AT compatible and/or 80386 compatible circuit cards available from Dover Electronics Manufacturing of Longmont, CO and Ampro Computers of Sunnyvale, CA are used.
- circuit cards from Dover Electronics Manufacturing occupy a space of approximately two inches by five inches by two inches, while each of the circuit cards from Ampro is approximately 3.8 inches by 3.6 inches.
- any functionally compatible circuit cards which conform to the relatively small size of the system unit 106 could be used in place of the circuit cards available from Dover Electronics Manufacturing.
- the computer 102 also includes display means which, in the present embodiment as noted above with reference to Figs. 1 and 2, includes a headband 108, a display screen 110, and an adjustable arm 112 connecting the display screen 110 to the headband 108.
- the display means further includes a display screen driver module 514 which preferably is located in the system unit 106, but which alternatively could be located outside of the system unit 106 adjacent to the display screen 110.
- the display screen driver module 514 converts display information (that is, information which is to be displayed for an operator) received from the CPU 504 (via the input/output interface 510, bus 502 and input/output connectors 518) into video signals which are sent to and compatible with the display screen 110.
- the display screen driver module 514 is of a standard design well known to those skilled in the art.
- the display screen 110 is a miniature monitor called an "eye piece monitor" which provides a display equivalent to conventional twelve-inch monitors (that is, approximately twenty-five lines by eighty characters per line), but which has a viewing screen with a diagonal length of approximately one inch. Since the display screen 110 is located close to the operator's eye and is supported by the operator's head so it follows the operator's head movement, the operator is able to view information on the display screen 110 without having to move away from his work bench (where, for example, a device is being repaired) by merely glancing from the device being repaired to the display screen 110. Therefore, the display screen 110, as described above, facilitates the retrieval of information contained in an electronic database since such information can be viewed without significantly diverting an operator's attention away from his work.
- the display screen 110 and display screen driver module 514 can be implemented using any video technology either available presently or in the future, such as color graphics adaptor (CGA), enhanced graphics adaptor (EGA), video graphics array (VGA), and super VGA. According to a present embodiment, however, the display screen 110 and display screen driver module 514 are implemented using well-known color graphics adaptor (CGA) technology. CGA eye piece monitors are available from many vendors, including Reflection Technology, Inc., of Waltham, MA, which produces and sells the Private Eye™ monitor. Alternatively, the display screen 110 and display screen driver module 514 are implemented using well-known (monochrome or color) video graphics adaptor (VGA) technology.
- VGA eye piece monitors which operate according to well-known color shutter wheel technology are currently available from sources such as the Nucolor™ Shutters produced by Tektronix, Inc., of Beaverton, Oregon.
- the display means may alternatively be a flat panel display screen attached to the system unit 106.
- Fig. 11 is a functional block diagram of a conventional display screen 110 connected to a conventional display screen driver module 514, both of which operate according to color shutter wheel technology.
- the display screen 110 has a monochrome cathode ray tube (CRT) 1106 and a filter 1108 having a color polarizer 1110 for polarizing and separating light from the CRT 1106 into cyan (that is, blue and green) and red components.
- the filter 1108 also includes a pi-cell 1112, which is preferably a relatively fast liquid-crystal switch, for rotating the polarized light from the color polarizer 1110 by either zero or ninety degrees.
- the filter 1108 also includes a second color polarizer 1114 for polarizing and separating light from the pi-cell 1112 into yellow (that is, red and green) and blue, which effectively divides the blue component from the green.
- the filter 1108 also includes a second pi-cell 1116, for rotating the polarized light from the color polarizer 1114 by either zero or ninety degrees.
- the display screen driver module 514 includes a video interface 1102, which receives signals from the bus 502 representing data to be displayed on the display screen 110, and converts the signals into video signals generally having image and color information.
- the video signals are received by a shutter controller 1104, which causes the CRT 1106 to display an image in accordance with the image information, and which also controls the filter 1108 in accordance with the color information to convert the image displayed by the CRT 1106 to a color image.
- the shutter controller 1104 causes the filter 1108 to transmit a red, green, or blue color image by selecting an appropriate combination of states of the pi-cells 1112 and 1116.
- to transmit green, the shutter controller 1104 sets both pi-cells 1112 and 1116 to rotate by zero degrees, such that the blue and green components from the first color polarizer 1110 pass through the first pi-cell 1112 unaffected.
- the vertically oriented yellow color polarizer 1114 then absorbs the blue component and leaves only the green component (note that only the vertical components are considered here since the light transmitted through the filter 1108 is always vertically polarized).
- to transmit red, the shutter controller 1104 sets the first pi-cell 1112 to rotate by ninety degrees and the second pi-cell 1116 to rotate by zero degrees. Such alignment places the red component into the vertical position so that it passes through the yellow color polarizer 1114 unaffected.
- to transmit blue, the shutter controller 1104 sets both pi-cells 1112 and 1116 to rotate by ninety degrees, thereby rotating the initially vertically polarized blue component so that it passes through the horizontally oriented blue color polarizer 1114.
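- taken together, the three cases above form a simple state table: each transmitted color corresponds to one combination of pi-cell rotations. The following sketch (in Python, with a hypothetical shutter-controller interface that is not part of the patent) summarizes the mapping:

```python
# Illustrative sketch only: the pi-cell rotation states described above,
# keyed by the color to be transmitted. The shutter-controller object and
# its set_rotation method are assumptions for illustration.
PI_CELL_STATES = {
    "green": (0, 0),    # both cells at zero; yellow polarizer 1114 leaves green
    "red":   (90, 0),   # first cell rotates red into the vertical position
    "blue":  (90, 90),  # both cells rotate; blue passes the blue polarizer
}

def set_color(shutter_controller, color):
    """Drive pi-cells 1112 and 1116 to transmit the requested color field."""
    rotation_1112, rotation_1116 = PI_CELL_STATES[color]
    shutter_controller.set_rotation(1112, rotation_1112)
    shutter_controller.set_rotation(1116, rotation_1116)
```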
- Color shutter wheel technology is further described in "Reinventing the Color Wheel" by Thomas J. Haven (Information Display, Vol. 7, No. 1, January 1991, pages 11-15).
- referring again to Fig. 5, the computer 102 also includes various peripherals such as an internal pointing device 330, storing means such as magnetic storage device 519, measurement means such as instrument pack 116, a microphone 122 and a voice-recognition module 522, which are all connected to the system unit 106 (and, specifically, to the input/output interface 510 which controls data traffic between the peripherals and the CPU 504) via the input/output connectors 518.
- the internal pointing device 330 is a well-known pointing device such as a mouse, sensitive touch pad, or gyroscopic mouse which is connected to the top panel 302 of the system unit 106 (as shown in Fig. 3).
- the present invention also supports other non-audio input devices such as bar code readers, touch memory readers and proximity scanners, which may be used when the operator does not wish to or cannot interact with the computer 102 via voice.
- the computer 102 also includes a packet switching component 542 and an antenna 544 which, in the present embodiment, are preferably contained in the system unit 106 and which enable the computer 102 to send and receive information from remote locations via well-known telecommunication means, such as via telephone or satellite.
- the packet-switching component 542 and the antenna 544 are particularly useful for updating a database 520 stored in a magnetic storage device 519 (described below) in real-time with information received from a remote computer or other data source.
- the computer 102 may include a global positioning system component (not shown in Fig. 5) for receiving and processing (via the antenna 544) positioning information from navigation systems, such as the Global Positioning System.
- the magnetic storage device 519 which is preferably contained in the system unit 106, is a static, read/write memory having a relatively large memory capacity such as a removable or non-removable hard disk drive.
- the system unit 106 contains an external slot for allowing the operator to insert and remove removable storage disks.
- alternatively, the storage device 519 could be a read-only memory such as a CD-ROM.
- the magnetic storage device 519 includes 80 Mbytes to one gigabyte of memory. Magnetic storage devices which are suitable for use as the magnetic storage device 519 and which have a size compatible with the size of the system unit 106 are produced and sold by various manufacturers such as Integral, Connor, Seagate and Syquest.
- as shown in Fig. 5, the magnetic storage device 519 stores a database 520 (which may be an ETM or IETM) which may have been previously loaded into the magnetic storage device 519 from a floppy drive (not shown in Fig. 5) which could be connected to the computer 102 via a port on the input/output connectors 518, or from a remote computer via a telecommunication link connected to the computer via the packet switching component 542 and the antenna 544 or by direct cabling.
- the instrument pack 116 includes electrical measurement equipment such as a multimeter 524 and a counter/timer 526, and ports for connecting the electrical measurement equipment to devices being evaluated or serviced such as an IEEE-488 connector 528, an IEEE-1553 connector 530 and an IEEE-1708 connector 532.
- the instrument pack 116 could contain other types of electrical measurement equipment and ports.
- the electrical measurement equipment in the instrument pack 116 is connected to a device being evaluated or serviced, hereinafter called a device under test (DUT) 120, via a cable 118 which is connected to any one of the connectors 528, 530, and 532 (depending on the particular type of interface on the DUT 120).
- Data from the DUT 120 is measured by the multimeter 524, counter/timer 526 and/or any other measurement equipment contained in the instrument pack 116 (as appropriate for the test being performed).
- the results of such tests are sent from the instrument pack 116 to the CPU 504 via the input/output connectors 518 and the bus 502. Alternatively, they can be stored on battery-backed memory chips.
- the computer 102 may also include an external monitor 516 which is not supported by an operator (but rather rests on a desk, for example) and which connects to the system unit 106 via the monitor port 402 (shown in Fig. 4).
- the external monitor 516 receives from the CPU 504 the same display information as the display screen 110 (via the display screen driver module 514).
- the computer 102 may also include an external keyboard and mouse (not shown), which are connectable to the system unit 106 via the keyboard port 410 and mouse port 412, respectively.
- the external keyboard and mouse represent conventional means for an operator to interact with the computer 102.
- the external monitor 516, external keyboard and external mouse are connected to the system unit 106 when the computer 102 is operating in a non-portable mode (e.g., as a desk-top computer).
- the voice-recognition module 522 is preferably contained in the system unit 106 and is connected to the microphone 122 (which is preferably an ear microphone located outside of the system unit 106). Alternatively, the voice-recognition module 522 may be located outside of the system unit 106 and, for example, may be incorporated with the microphone 122 as a single unit. Alternatively, the analog-to-digital converter component of the voice-recognition module 522 is located outside of the system unit 106 while the remaining components of the voice-recognition module 522 are located inside the system unit 106, the external analog-to-digital converter preferably communicating with the system unit 106 via a serial communication stream.
- the microphone 122 receives audio input (also called verbal utterances) from an operator, converts the audio input to electrical signals and digitizes the electrical signals.
- the voice-recognition module 522 recognizes the verbal utterances (which are in the form of digitized electrical signals) and transfers the recognized verbal utterances to the CPU 504 for processing according to the application program 508.
- the voice-recognition module 522 interprets (or recognizes) as characters and words the digitized electrical signals which result from an operator speaking near or into the microphone 122. Consequently, like conventional input devices such as keyboards and pointing devices, the voice-recognition module 522 in combination with the microphone 122 provides a means for operators to interact with and control the operation of the computer 102.
- the voice-recognition module 522 operates according to well-known dependent voice recognition algorithms and is implemented in hardware of a type well known in the art.
- the voice-recognition module 522 is a dependent voice recognition circuit card available from Voice Connection of Irvine, California.
- any dependent voice recognition circuit card having a size compatible with the size of the system unit 106 could be used.
- the voice-recognition module 522 operates according to well-known independent voice recognition algorithms, the independent voice recognition algorithms representing an improvement over dependent voice recognition algorithms.
- an independent voice-recognition module is able to recognize the voices of multiple speakers and includes a "good listener" learning feature for real-time modification of a trained vocabulary model.
- a dependent voice-recognition module can recognize only a single speaker's voice.
- an independent voice-recognition module can be integrated with application programs.
- an application program could interact with an independent voice-recognition module in order to cause the independent voice-recognition module to recognize a verbal utterance against a subset of an entire trained vocabulary model.
- the vocabulary subset could include, for example, the words corresponding to menu selections in the current context of the application program.
- dependent voice-recognition modules cannot be integrated with application programs. Therefore, dependent voice-recognition modules are generally less reliable than independent voice-recognition modules since dependent voice-recognition modules usually attempt to recognize verbal utterances from the entire vocabulary.
- Fig. 6 is a partial block diagram of the computer 102 wherein the voice-recognition module 522 is implemented in software, rather than in hardware, as shown in Fig. 5. More particularly, Fig. 6 is a partial block diagram illustrating the structural differences (as compared to Fig. 5) in the computer 102 which are required to implement the alternate embodiment wherein the voice-recognition module 522 is implemented in software. As shown in Fig. 6, the software implementation of the voice-recognition module is indicated as 522', rather than 522, in order to underscore its software nature, and is stored in the memory 506 while the computer 102 is operating.
- the voice-recognition module 522' operates either according to dependent or independent voice recognition algorithms, although preferably the voice-recognition module 522' operates according to independent voice recognition algorithms and is implemented as independent voice recognition software produced by Scott Instruments of Denton, Texas.
- the application program 508 and the voice-recognition module 522' may be linked into a single computer program which is loaded into the memory 506 from the magnetic storage means 519 pursuant to operator instructions.
- the computer 102 includes an analog/digital converter 608 which has a buffer 610 and which is preferably contained in the system unit 106.
- the analog/digital converter 608 is connected to the microphone 122.
- the microphone 122 converts audio input spoken by an operator to electrical signals.
- the analog/digital converter 608 digitizes the electrical signals and stores the digitized electrical signals in the buffer 610 for later retrieval by the CPU 504.
- Figs. 7 and 8 represent the operation of the computer 102 wherein a dependent voice-recognition module (implemented in either hardware or software) is used and Figs. 9 and 10 represent the operation of the computer 102 wherein an independent voice-recognition module (implemented in either hardware or software) is used.
- referring now to Fig. 7, there is shown an operational flow chart for the computer system 102 according to the embodiment wherein a dependent voice-recognition module is used. While performing the steps shown in Fig. 7, the computer system 102 is operating according to the programming contained in the application program 508.
- Fig. 7 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 5, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 6 operates in substantially the same manner with regard to the flow chart in Fig. 7.
- in step 704, the CPU 504 waits for user input from a keyboard buffer (not shown).
- the CPU 504 may perform step 704 by either polling the keyboard buffer or by receiving an interrupt when the keyboard buffer is full. In either case, the CPU 504 receives user input from the keyboard buffer when the keyboard buffer is flushed by a keyboard driver or other operating system tool which is associated with the keyboard buffer.
- the user input in the keyboard buffer may originate from a conventional input device such as a keyboard or pointing device, or from the voice-recognition module 522.
- the CPU 504 processes the user input according to the programming contained in the application program 508.
- the user input could represent a command from the user to retrieve particular data from the database 520 stored in the magnetic storage device 519.
- the CPU 504 processes the user command by accessing and retrieving the requested data from the database 520 and transferring the retrieved data to the display screen 110 for display to the user.
- the CPU 504 transfers the retrieved data to the input/output interface 510 along with a request to display the data on the display screen 110.
- the input/output interface 510 converts the retrieved data to generic video signals generally appropriate for display on monitors and sends the generic video signals to the display screen driver module 514 via the input/output connectors 518.
- the display screen driver module 514 translates the generic video signals to video signals compatible with the particular display screen 110.
- the user input could be information which the user is providing to the computer 102 (for storage in the magnetic storage device 519, for example).
- in step 708, the CPU 504 determines if the user command represented an exit request. If the user command did not represent an exit request, then the CPU 504 loops back to step 704 to await further user input.
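- the loop of steps 704 and 708 can be summarized compactly. The sketch below is a minimal illustration in Python; the keyboard_buffer object, the process_input callback and the "EXIT" sentinel are illustrative assumptions, not elements of the patent:

```python
# Minimal sketch of the Fig. 7 main loop. The keyboard buffer is filled
# either by conventional input devices or by the voice-recognition module.
def application_main_loop(keyboard_buffer, process_input):
    while True:
        user_input = keyboard_buffer.read()    # step 704: wait for buffered input
        result = process_input(user_input)     # process per the application program
        if result == "EXIT":                   # step 708: exit request ends the loop
            return
```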
- Fig. 8 is an operational flow chart of the voice-recognition module 522 (shown in Fig. 5) or the voice-recognition module 522' in combination with the analog/digital converter 608 (shown in Fig. 6) wherein the voice-recognition module 522 or 522' operates according to dependent voice recognition algorithms. Note that before the steps in Fig. 8 are performed, the voice-recognition module 522 or 522' is trained using well-known methods for a particular operator to produce a vocabulary model containing words and phrases which the voice-recognition module 522 or 522' recognizes.
- since it is well known, the vocabulary model is not discussed here except to note that, for each word or phrase which the voice-recognition module 522 or 522' recognizes, the vocabulary model contains a finite number (such as five) of digitized waveforms corresponding to different ways in which the particular operator (for whom the voice-recognition module 522 or 522' is trained) pronounces the word or phrase.
- Fig. 12 is an example of one such waveform of a two-word phrase (such as "Next Menu" or "End Program") wherein the first word is represented by points between times four and thirteen and the second word is represented by points between times sixteen and twenty-two.
- the words are delimited by points between times zero and three, fourteen and fifteen, and twenty-three and twenty-six, which fall below an energy threshold determined for the particular operator.
- the vocabulary model also contains one or more operator-selected keystrokes associated with each word and phrase which the voice-recognition module 522 or 522' recognizes.
- the phrase "End Program” may be associated with the keystrokes "Ctrl", "X” if the keystrokes "Ctrl", "X" are recognized by the application program 508 as the "End program” command.
- Fig. 8 shall now be described.
- Fig. 8 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 5, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 6 operates in substantially the same manner with regard to the flow chart in Fig. 8.
- in step 806, the microphone 122 receives a verbal utterance spoken by an operator, converts the verbal utterance into electrical signals, digitizes the electrical signals and sends the digitized electrical signals to the voice-recognition module 522.
- the voice-recognition module 522 identifies the boundaries of a word or phrase contained in the digitized electrical signals by locating points in the digitized electrical signals which fall below the energy threshold (as shown in Fig. 12, for example) for the particular operator, the located points representing the boundaries of words.
- the voice-recognition module 522 determines that the digitized electrical signals contain a phrase (that is, multiple words), rather than a single word, if words in the digitized electrical signals are separated by less than a predefined number (which is set according to the speaking characteristics of the particular operator) of points which fall below the energy threshold.
- the points in the digitized electrical signals corresponding to a word or phrase represent a digitized verbal utterance waveform which was received from the operator (that is, which was spoken by the operator). For example, if Fig. 12 represents the digitized electrical signals and if the predefined number is five, then the voice-recognition module 522 determines (in step 808) that the digitized electrical signals contain a phrase having two words because the two words are separated by only four points (between times thirteen and fifteen) which fall below the energy threshold. The points in the digitized electrical signals corresponding to the two-word phrase represent the received digitized verbal utterance waveform.
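- the segmentation just described can be expressed compactly. The sketch below (illustrative Python; the function and parameter names are assumptions) finds words as runs of points at or above the energy threshold and merges words into a phrase when the silence between them is shorter than the predefined number of sub-threshold points:

```python
# Illustrative sketch of word/phrase boundary detection. `samples` is the
# digitized utterance, `threshold` the operator-specific energy threshold,
# and `max_gap` the predefined number of sub-threshold points (e.g. five).
def segment_utterance(samples, threshold, max_gap=5):
    words, start = [], None
    for i, amplitude in enumerate(samples):
        if amplitude >= threshold and start is None:
            start = i                          # a word begins
        elif amplitude < threshold and start is not None:
            words.append((start, i - 1))       # a word ends
            start = None
    if start is not None:
        words.append((start, len(samples) - 1))

    # Words separated by fewer than max_gap sub-threshold points form a phrase.
    phrases = []
    for word in words:
        if phrases and (word[0] - phrases[-1][-1][1] - 1) < max_gap:
            phrases[-1].append(word)
        else:
            phrases.append([word])
    return phrases                             # list of phrases, each a list of words
```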
- the voice-recognition module 522 recognizes the received digitized verbal utterance waveform by matching it against all the digitized waveforms (corresponding to different ways in which the particular operator pronounces different words) contained in the vocabulary model. Specifically, in step 810 the voice-recognition module 522 selects the next word from the vocabulary model to process. Recall that the vocabulary model contains a finite number of digitized waveforms for the selected word corresponding to different ways in which the particular operator pronounces the selected word, and that the vocabulary model contains one or more operator-selected keystrokes associated with the selected word.
- the voice-recognition module 522 performs a point-by-point comparison of the received digitized verbal utterance waveform and the digitized waveforms for the selected word and determines whether there is a match between the received digitized verbal utterance waveform and any of the digitized waveforms for the selected word.
- the voice-recognition module 522 determines that a match occurs between a point in the received digitized verbal utterance waveform and a corresponding-in-time point contained in one of the digitized waveforms if the difference between the two points is less than an operator predefined limit, which may be zero for some implementations.
- in step 814, the voice-recognition module 522 determines whether a match was found in step 812 and proceeds to step 816 if a match was not found.
- in step 816, the voice-recognition module 522 determines whether there are any more words in the vocabulary model left to process. If, in step 816, the voice-recognition module 522 determines that there are more words left to process, then the voice-recognition module 522 flows back to step 810 to select the next word from the vocabulary model to process. Otherwise, the voice-recognition module 522 determines that the received digitized verbal utterance waveform cannot be recognized based on the current training of the vocabulary model and returns to step 806 to await further operator input. Note that, when operating according to dependent voice-recognition algorithms, the voice-recognition module 522 does not interact with the operator to update the training of the vocabulary model to recognize the received digitized verbal utterance waveform.
- if, in step 814, the voice-recognition module 522 determines that a match was found in step 812, then the voice-recognition module 522 performs step 818.
- in step 818, the voice-recognition module 522 translates the received digitized verbal utterance waveform to the keystrokes contained in the vocabulary model and associated with the selected word (which the received digitized verbal utterance waveform matched).
- in step 820, the voice-recognition module 522 stores the keystrokes in the keyboard buffer. As described above, the keystrokes in the keyboard buffer are processed by the CPU 504 when the keyboard buffer is flushed (see the above text describing step 704 in Fig. 7).
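- steps 810 through 820 amount to a template search followed by a keystroke translation. The following Python sketch is illustrative only; the vocabulary layout (each word mapping to its trained waveforms and operator-selected keystrokes) and all names are assumptions based on the description above:

```python
# Illustrative sketch of dependent recognition (steps 810-820). A waveform
# matches a template when every corresponding-in-time point differs by less
# than the operator-predefined limit; equal-length waveforms are assumed.
def recognize_dependent(utterance, vocabulary, limit, keyboard_buffer):
    for word, entry in vocabulary.items():            # step 810: next word
        for template in entry["waveforms"]:           # trained pronunciations
            if len(template) == len(utterance) and all(
                abs(u - t) < limit                    # step 812: point-by-point
                for u, t in zip(utterance, template)
            ):
                keyboard_buffer.write(entry["keystrokes"])  # steps 818-820
                return word
    return None    # vocabulary exhausted (step 816): utterance not recognized
```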
- Fig. 9 is an operational flow chart for the computer system 102 according to the embodiment of the present invention wherein an independent voice-recognition module is used. While performing the steps shown in Fig. 9, the computer system 102 is operating according to the programming contained in the application program 508.
- Fig. 9 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 6, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 5 operates in substantially the same manner with regard to the flow chart in Fig. 9.
- the voice-recognition module 522' is trained for multiple operators using well-known methods to produce a vocabulary model containing words and phrases which the voice-recognition module 522' recognizes. Since it is well known, the vocabulary model is not discussed here except to note that, for each word or phrase which the voice-recognition module 522' recognizes, the vocabulary model contains a single digitized waveform corresponding to a base line pronunciation of the word or phrase. Fig. 12, which was described above, is an example of one such digitized waveform containing a two-word phrase.
- the vocabulary model also contains operator-selected strings (each having one or more characters) associated with each word or phrase which the voice-recognition module 522' recognizes.
- the phrase "End Program" may be associated with the string "{Ctrl}X" if the string "{Ctrl}X" is recognized by the application program 508 as the "End Program" command.
- the vocabulary model also contains permitted variances about each of the points in the digitized base line waveforms, the permitted variances representing different ways of pronouncing the words and phrases by the operators for whom the voice-recognition module 522' is trained.
- for example, if the digitized waveform in Fig. 12 represents a digitized base line waveform contained in the vocabulary model, if the permitted variance for the point at time ten is two, and if the point at time ten has a digital amplitude of fifteen, then any digitized verbal utterance waveform having an amplitude at time ten in the range from thirteen to seventeen will match the digitized base line waveform in Fig. 12, at least with regard to the point at time ten.
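- the per-point test in the worked example above is simple to state in code. This sketch follows the example (amplitude fifteen, permitted variance two, accepted range thirteen to seventeen); the function name is illustrative:

```python
# Illustrative per-point variance test: a baseline point of amplitude 15
# with permitted variance 2 accepts utterance amplitudes from 13 to 17.
def point_matches(utterance_amplitude, baseline_amplitude, variance):
    return abs(utterance_amplitude - baseline_amplitude) <= variance

assert point_matches(13, 15, 2)       # lower end of the 13..17 range
assert point_matches(17, 15, 2)       # upper end of the range
assert not point_matches(18, 15, 2)   # outside the permitted variance
```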
- in step 904, the CPU 504 determines whether a verbal utterance, spoken by an operator into the microphone 122, exists at the analog/digital converter 608.
- the microphone 122 converts audio signals from the operator (that is, verbal utterances from the operator) into electrical signals and sends the electrical signals to the analog/digital converter 608.
- the analog/digital converter 608 digitizes the received electrical signals and places the digitized electrical signals into the buffer 610.
- the CPU 504 then performs step 904 by polling the analog/digital converter 608 to determine when the buffer 610 is full. When the buffer 610 is full, the CPU 504 instructs the analog/digital converter 608 to send the digitized electrical signals, which represent a digitized verbal utterance, from the buffer 610 to the CPU 504 via the bus 502.
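- the handshake of steps 904 and 906 is a simple poll-then-transfer exchange. The sketch below is illustrative Python; the converter object and its methods are assumptions:

```python
# Illustrative sketch of steps 904 and 906: poll the analog/digital
# converter until its buffer 610 fills, then transfer the digitized
# verbal utterance over the bus.
def read_utterance(adc):
    while not adc.buffer_full():      # step 904: poll the converter
        pass                          # (a real system would yield or sleep)
    return adc.transfer_buffer()      # step 906: receive the digitized utterance
```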
- in step 906, the CPU 504 receives the digitized verbal utterance from the analog/digital converter 608 via the bus 502.
- in step 908, the CPU 504 causes the voice-recognition module 522' to recognize the digitized verbal utterance against a subset of the words and phrases contained in the vocabulary model. That is, the CPU 504 passes a number of words and phrases to the voice-recognition module 522' and instructs the voice-recognition module 522' to determine whether the digitized verbal utterance matches any of the stored digitized base line waveforms associated with the passed words and phrases, taking into consideration the permitted variances for each of the digitized base line waveforms. As a result of step 908, the voice-recognition module 522' passes the string stored in the vocabulary model and associated with the matched word or phrase to the CPU 504 for processing.
- the words which are passed by the CPU 504 to the voice-recognition module 522' may represent menu choices from a current context of the application program 508.
- the application program 508 might have asked the operator to select from a menu of choices.
- the voice-recognition module 522' would determine whether the operator's verbal response matched any of the menu choices.
- the CPU 504 may interact with the voice-recognition module 522' by invoking software routines stored in the voice-recognition module 522' pursuant to programming in the application program 508.
- the CPU 504 processes the recognized word or phrase (that is, the string that was returned in step 908) in accordance with the particular programming of the application program 508.
- the recognized word or phrase may represent a command from the operator to access particular information from the database 520.
- the CPU 504 would access and retrieve the requested data from the database 520 located in the magnetic storage device 519. The CPU 504 would then send the retrieved data to the display screen 110 for display to the operator.
- in step 918, the CPU 504 determines whether the recognized word or phrase represented an exit command. If the recognized word or phrase did not represent an exit command, then the CPU 504 loops back to step 904 to await further operator instructions. Otherwise, the CPU 504 terminates the application program in step 920.
- Fig. 10 is an operational flow chart of the voice-recognition module 522' wherein the voice-recognition module 522' operates according to independent voice-recognition algorithms. Note that before the steps in Fig. 10 are performed, the voice-recognition module 522' is trained for multiple operators using well-known methods as described above with reference to Fig. 9.
- the voice-recognition module 522' identifies the boundaries of a word or phrase contained in the digitized electrical signals (which represent the digitized verbal utterance) by locating points in the digitized electrical signals which fall below the energy threshold (as shown in Fig. 12, for example), the located points representing the boundaries of words.
- the voice-recognition module 522' determines that the digitized electrical signals contain a phrase (that is, multiple words), rather than a single word, if words in the digitized electrical signals are separated by less than a predefined number (which is set according to the speaking characteristics of the particular operator) of points which fall below the energy threshold.
- the points in the digitized electrical signals corresponding to a word or phrase represent a digitized verbal utterance waveform which was received from the operator (that is, which was spoken by the operator). For example, if Fig. 12 represents the digitized electrical signals and if the predefined number is five, then the voice-recognition module 522' determines (in step 1004) that the digitized electrical signals contain a phrase having two words because the two words are separated by only four points (between times thirteen and fifteen) which fall below the energy threshold. The points in the digitized electrical signals corresponding to the two-word phrase represent the received digitized verbal utterance waveform.
- step 1006 the voice-recognition module 522' selects the next word or phrase from the subset of words and phrases which the CPU 504 passed to the voice-recognition module 522'.
- the voice-recognition module 522' performs a point-by-point comparison of the received digitized verbal utterance waveform and the digitized base line waveform associated with the selected word or phrase and determines whether there is a match between the received digitized verbal utterance waveform and the digitized base line waveform, in light of the permitted variances associated with the digitized base line waveform.
- the voice-recognition module 522' determines that a match occurs between a point in the received digitized verbal utterance waveform and a corresponding-in-time point contained in the digitized base line waveform if the difference between the two points is less than the permitted variance for that particular point.
- the voice-recognition module 522' In processing step 1008, the voice-recognition module 522' generates a recognition score for the selected word or phrase which essentially represents a cumulative sum of the differences between the received digitized verbal utterance waveform and the digitized base line waveform at each point. Therefore, a lower recognition score indicates a closer match between the received digitized verbal utterance waveform and the digitized base line waveform.
- the voice-recognition module 522' determines whether the recognition score for the selected word or phrase is less than a maximum permitted score which is software adjustable by the operator. If the recognition score for the selected word or phrase is greater than the maximum permitted score, then the voice-recognition module 522' determines that the selected word or phrase and the received digitized verbal utterance waveform did not match and processes step 1014. If the recognition score for the selected word or phrase is less than the maximum permitted score, then the voice-recognition module 522' determines that the selected word or phrase and the received digitized verbal utterance waveform matched (to a certain confidence level as described below) and processes step 1012.
- step 1012 the voice-recognition module 522' adds the selected word or phrase to an acceptable words list.
- Words and phrases in the acceptable words list are ordered in ascending order by their respective recognition scores, such that the first entry in the acceptable words list has the lowest recognition score.
- step 1014 the voice-recognition module 522' determines whether all the words and phrases which the CPU 504 passed to the voice-recognition module 522' have been processed. If there are more words and/or phrases to process, then the CPU 504 loops back to step 1006. Otherwise, the CPU 504 processes step 1016.
- the voice-recognition module 522' interacts with the operator to determine which of the words or phrases in the acceptable words list is the word or phrase which was spoken by the operator. Specifically, in step 1016 the voice-recognition module 522' selects the first entry in the acceptable words list. Recall that the first entry has the lowest voice-recognition score of all the entries in the acceptable words list.
- step 1018 the voice-recognition module 522' generates a confidence value for the first entry by subtracting the recognition score of the second entry from the recognition score of the first entry.
- the confidence value of the first entry quantifies how confident the voice-recognition module 522' is that the first entry is the word or phrase which was spoken by the operator.
- step 1020 the voice-recognition module 522' determines whether the confidence value of the first entry is high or low by comparing the confidence value to an operator-selected minimum confidence value. If the confidence value is greater than the minimum confidence value, then the confidence value is high and the voice-recognition module 522' processes step 1030. In step 1030, since the confidence of the first entry is high, the voice-recognition module 522' returns to the CPU 504 the string associated with the first entry to the CPU 504 for processing (in step 916 of Fig. 9).
- step 1020 the confidence value is less than the minimum confidence value, then the confidence value is low and the voice-recognition module 522' processes step 1022.
- the voice-recognition module 522' asks the operator whether the first entry is the correct word or phrase (that is, the word or phrase spoken by the operator).
- the voice-recognition module 522' may query the operator by displaying a message to the operator on the display screen 110 or by sending the message in audio form to the operator via the speaker 332 (or, alternatively, to the microphone/speaker 122 if the microphone/speaker 122 is an ear microphone or to an independent speaker if the microphone 122 is a headset microphone and does not have a transducer for outputting audio signals).
- the operator may respond to the query from the voice-recognition module 522' in a number of ways, such as speaking into the microphone 122 or by manipulating the pointing device 330. If the operator responds that the first entry is correct, then the voice-recognition module 522' processes step 1028 (described below). Otherwise, the voice-recognition module 522' processes step 1024.
- step 1024 since the operator has indicated that the first entry is not correct, the voice-recognition module 522' determines whether there are any more entries in the acceptable words list to process. If there are more entries to process, then the voice-recognition module 522' processes step 1026, wherein the voice-recognition module 522' selects the next entry in the acceptable words list and then loops back to step 1022 to query the operator whether the newly selected entry is correct. Otherwise, if there are no former entries to process (see step 1024), the voice-recognition module 522' processes step 1028.
- step 1028 is reached when either (1) in step 1022, the operator indicates that the selected entry is correct, or (2) in step 1024, the operator effectively indicates that none of the entries in the acceptable words list is correct.
- the voice-recognition module 522' interacts with the operator to update the training of the vocabulary model according to well-known methods such that the voice-recognition module 522' will be able to more confidently recognize the operator's verbal utterance in the future. Since the voice-recognition module 522' utilizes well-known methods in step 1028, step 1028 will not be discussed further.
- the voice-recognition module 522' After updating the training of the vocabulary model, the voice-recognition module 522' processes step 1330, wherein the voice-recognition module 522' returns to the CPU 504 for processing (in step 916 of Fig. 9) the string associated with the word or phrase which matches the verbal utterance as indicated by the operator (in steps 1022 and 1028).
- the computer system 102 of the present invention may be used in many applications, including (but not limited to) for use with the handicapped, for education purposes, for testing and service purposes and for inventory purposes.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Acoustics & Sound (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
- Digital Computer Display Output (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
- This invention relates to computers, and more particularly to a user-supported portable computer.
- Today, many printed publications, particularly technical manuals, are being replaced by electronic technical manuals (ETM) and interactive electronic technical manuals (IETM). Such ETMs and IETMs are essentially electronic databases which are typically housed in conventional computers having a keyboard for user input and a full-sized monitor for information display. An operator may use the computer in order to access and display data stored in the ETMs and IETMs for a variety of uses including troubleshooting and repair/replacement of a system, subsystem or component thereof.
- ETMs and IETMs are particularly useful in service and repair industries wherein technicians often require detailed information from technical manuals to repair and service malfunctioning devices. For example, ETMs and IETMs are useful in automobile repair centers wherein service personnel find it necessary to access information in automobile technical manuals in order to service malfunctioning automobiles. Additionally, ETMs and IETMs are useful in military service centers wherein military technicians often require access to information in military technical manuals to service and repair malfunctioning weapon systems. In such scenarios, it is more efficient to access information from ETMs and IETMs rather than printed publications, since the printed publications may be voluminous.
- As noted above, ETMs and IETMs are traditionally stored in and accessed from conventional computers having keyboards for operator input and full-sized video monitors for displaying data. Such computers are often located in service areas adjacent to the devices being repaired. In operation, maintenance personnel move back and forth between the computers and the devices being repaired in order to retrieve data required to service those devices. Such movement between the computers and the devices being repaired is a disadvantage because it consumes a considerable amount of time and effort merely to retrieve data from the ETMs and IETMs. Therefore, conventional computers are not efficient hosts for ETMs and IETMs, since they deliver the information in the ETMs and IETMs to operators inefficiently.
- It is an object of the present invention to overcome this disadvantage.
- The present invention is accordingly directed to a compact, self-contained portable computing apparatus which is completely supported by a user for hands-free retrieval and display of information for the user. The computing apparatus includes a housing having securing means for removably securing the housing to a user for support by the user. The housing further includes storage means for storing previously entered information, and processor means, communicating with the storage means, for receiving, retrieving and processing information and user commands in accordance with a stored program. The computing apparatus also includes audio transducer and converter means, in communication with the processor means, for receiving audio commands from the user, for converting the received audio commands into electrical signals, for recognizing the converted electrical signals, and for sending the recognized electrical signals to the processor means, the audio transducer and converter means also being supported by the user. The computing apparatus further includes display means in communication with the processor means for receiving information from the processor means and for displaying the received information for the user, the display means being supported by the user whereby the user may operate the computing apparatus to display information in a hands-free manner utilizing only audio commands.
- A computing apparatus in accordance with the invention will now be described, by way of example, with reference to the accompanying drawings, in which:-
- Fig. 1 is a front view of a schematic block diagram of an operator wearing a computer in accordance with the present invention;
- Fig. 2 is a side view of a schematic block diagram of an operator wearing the computer of Fig. 1;
- Fig. 3 is a perspective view of a system unit which forms part of the computer of Fig. 1;
- Fig. 4 is a bottom plan view of the system unit of Fig. 3;
- Fig. 5 is a schematic block diagram of the computer of Fig. 1;
- Fig. 6 is a partial schematic block diagram of an alternate embodiment of the computer of Fig. 1;
- Fig. 7 is an operational flow chart of the computer of Fig. 1 wherein the computer includes a dependent voice-recognition module;
- Fig. 8 is an additional operational flow chart of a dependent voice-recognition module;
- Fig. 9 is an operational flow chart of the computer of Fig. 1 wherein the computer includes an independent voice-recognition module;
- Fig. 10 is an additional operational flow chart of an independent voice-recognition module;
- Fig. 11 is a block diagram of a display screen operating according to shutter-wheel technology; and
- Fig. 12 is a digitized waveform of a verbal utterance.
- Referring to the drawings, in which like numerals are used for identical elements throughout, Fig. 1 is a front schematic view of an operator (also called a user) wearing a compact, portable computer 102 in accordance with the present invention and Fig. 2 is a side view of the operator wearing the computer 102. The location of the components of the computer 102 on the operator as shown in Figs. 1 and 2 is representational only and may vary depending on operator convenience and comfort. The computer 102 includes a housing such as system unit 106 having a securing means, in the present embodiment a strap or belt 104, which is adapted to be worn around the operator's waist for securing the housing or system unit to the user for support by the user.
- The computer 102 further includes display means for receiving information from the system unit 106 and for displaying the received information for the user or operator. The display means, in the present embodiment, includes a headband 108, a display screen 110, and an adjustable arm 112 connecting the display screen 110 to the headband 108. The headband 108 is adapted to be worn by the user in any convenient location, but preferably upon the user's forehead, as shown. The position of the display screen 110 may be adjusted via the adjustable arm 112 so that the operator can comfortably view information displayed on the display screen 110. The display screen 110 is electrically connected to the system unit 106, in the present embodiment, via a cable 114, although other connection means may alternatively be employed.
- The computer 102 further includes audio transducer and converter means in communication with the system unit 106 for receiving audio commands from the user, for converting the received audio commands into electrical signals, for recognizing the converted electrical signals and for sending the recognized electrical signals to a processor within the system unit 106. In the present embodiment, the audio transducer and converter means includes a microphone 122 for receiving verbal commands from the operator. The microphone 122, which, in the present embodiment, is electrically connected to the system unit 106 via a cable 124, is preferably an ear-supported microphone, although those with ordinary skill in the art will appreciate that any audio-input or transducer device could be used and that it could be supported by the user at some other location, such as proximate the mouth or throat of the user.
- The computer 102, in the present embodiment, further includes measurement means in communication with the system unit 106 for performing electrical measurements on devices being evaluated by the computer 102 (such evaluation including, but not limited to, testing, calibrating, troubleshooting, diagnosing and servicing). In the present embodiment, the measurement means includes an instrument pack 116 which is attachable to the belt 104 and is electrically connectable via cables 118 to a device 120 which is to be tested, analyzed, repaired or the like. The instrument pack 116 is also electrically connected to the system unit 106, in the present embodiment, via a cable 126.
- From the foregoing description and Figs. 1 and 2, it can be seen that the computer system 102 is adapted to be completely supported by a user or operator. The display screen 110 is placed to permit the user to accomplish other tasks, i.e., servicing a device, while glancing at the screen 110 for information concerning the task being performed. The microphone 122 permits the user to verbally control the computer system 102 to display desired information while maintaining the user in a hands-free mode for unimpaired performance of the task. Finally, the instrument pack 116 permits the computer 102 to obtain information from a device, such as a device being serviced, while maintaining the user in a hands-free mode with respect to the computer system 102 (note that the user may be handling the device being serviced, test equipment or other equipment while using the computer system 102). By utilizing the computer 102, the user is able to perform a task in a more efficient manner, since the user can access data from and input data to the computer system 102 in a hands-free mode with, at most, minimal diversion from the task being performed.
- Fig. 3 is an exterior perspective view of the system unit 106. Because the system unit 106 is intended to be supported by an operator, the system unit 106 is lightweight and small sized, in the present embodiment preferably about five inches by six inches by three inches, with a weight of approximately three pounds. The system unit 106 includes a top panel 302, a bottom panel 310, a front panel 312, a back panel 308, a first side panel 304, and a second side panel 306. Connected to the back panel 308 is a clip 328 for attaching the system unit 106 to the belt 104. Located on the top panel 302 is a microphone jack 314 which is of a type well known in the art. The cable 124 connected to the microphone 122 is removably connected to the system unit 106 via the microphone jack 314 using a suitable connector (not shown). Also located on the top panel 302 is a voice input indicator 318, preferably a light-emitting diode (LED), which illuminates to visually confirm when the ear microphone 122 receives verbal input from the operator.
- Also located on the top panel 302 are a voice output indicator 316 and a volume control 320. In some embodiments, the top panel 302 also includes a speaker 332. The voice output indicator 316, preferably a single LED, illuminates to visually confirm when the computer 102 is outputting synthesized or digitized speech through the speaker 332 for the purpose of providing the user with information, queries, instructions, messages or other feedback. The volume control 320, preferably a rotatable knob or depressible button, controls the volume level of the audio output of the speaker 332. Also located on the top panel 302 are a reset system button 322, a Power On/Off button 324 and a power-on indicator 326. The power-on indicator 326, preferably a single LED, illuminates to visually confirm when the system unit 106 is turned on via the Power On/Off button 324. The system unit 106 initializes when the reset system button 322 is pressed. Also located on the top panel 302 is an input means in the form of a pointing device 330, such as a sensitive touch pad, a joy stick, roller ball or a gyroscopic mouse. The specific locations of the above-described elements on the top panel 302 as shown in Fig. 3 are representational only and may vary among particular implementations for ergonomic or other reasons. It should also be appreciated that the particular implementation of such elements (i.e., LEDs for indicators) may also vary.
- In an alternate embodiment of the present invention, the microphone 122 (shown in Fig. 1) is a microphone/speaker assembly having a first transducer for receiving audio signals and a second transducer for outputting audio signals, the second transducer essentially being an audio speaker. The microphone/speaker assembly 122 may be, for example, a well-known bone-conduction device. According to the alternate embodiment, the speaker 332 described above is not necessary, and thus is not located on the top panel 302 and, in fact, is not part of the computer system 102.
- Fig. 4 is an exterior plan view of the bottom panel 310 of the system unit 106. The bottom panel 310 includes a monitor or display port 402, two serial ports, a parallel port 406, a keyboard port 410, a mouse port 412 and an external power supply port 408. According to the preferred embodiment, the serial ports are RS-232 compatible and the parallel port 406 is centronics compatible. The system unit 106 may be modified to include additional and/or different ports and connectors without diverging from the spirit and intent of the present invention.
- Fig. 5 is a schematic block diagram of the primary structural features of the computer 102 in accordance with the present embodiment. The computer 102 includes a bus 502, which preferably has a data width of at least sixteen bits. According to the present embodiment, the bus 502 is contained in the system unit 106. The computer 102 also includes processor means such as central processing unit (CPU) 504, which is connected to the bus 502 and is also preferably contained in the system unit 106. Preferably, the CPU 504 is an 80286 or 80386SX microprocessor available from Intel. It will be appreciated by those of ordinary skill in the art that while an 80286 or 80386SX microprocessor is preferred, any other central processor or microprocessor, either available presently or in the future, could be used.
- The computer 102 also includes a memory 506 having, for example, one Mbyte to twenty Mbytes of random access memory (RAM). The memory 506, which is also connected to the bus 502 and is preferably contained in the system unit 106, stores an application program 508 while the computer 102 is operating. The application program 508 may have been loaded into the memory 506 from a magnetic storage device 519 (described below) pursuant to operator instructions.
- The computer 102 also includes an input/output interface 510 which controls all data transfers between the CPU 504 and certain other components (herein called peripherals) which communicate with the CPU 504 but which are not connected directly to the bus 502. Preferably, the input/output interface 510 includes a video interface, a controller for at least two RS-232 compatible serial ports, a controller for the centronics-compatible parallel port, keyboard and mouse controllers, a floppy disk controller, and a hard drive interface. However, it will be appreciated by those of ordinary skill in the art that the input/output interface 510 could include additional and/or different interfaces and controllers for use with other types of peripherals, such as Ethernet®, Arcnet® or token ring interfaces. The input/output interface 510 is connected to the bus 502 and preferably is located in the system unit 106.
- The computer 102 also includes input/output connectors 518 which collectively represent the above-described physical peripheral ports and accompanying electrical circuitry. Preferably, the input/output connectors 518 include the monitor port 402, serial ports, parallel port 406, keyboard port 410, and mouse port 412 shown in Fig. 4. However, those of ordinary skill in the art will appreciate that the input/output connectors 518 could include additional and/or different types of physical ports.
- The computer 102 also includes a power converter 536 which is connected to an internal battery 539, an external battery 540 and/or an AC power source such as a conventional electrical outlet (not shown in Fig. 5). The power converter 536 and the internal battery 539 are preferably located in the system unit 106 while the external battery 540 is located outside of the system unit 106, preferably attached to the belt 104. (The external battery 540 is not shown in Figs. 1 and 2.) The external battery 540 is connected to the power converter 536 via the external power supply port 408 shown in Fig. 4. When the computer 102 is used in a "desk-top" mode (e.g., non-portable mode), the power converter 536 may be connected to the AC power source for supplying regulated DC power to the computer 102. When the computer 102 is used in a portable mode, the power converter 536 is usually connected to the internal battery 539 and/or the external battery 540 for supplying regulated DC power to the computer 102. Preferably, the internal battery 539 supplies power to the power converter 536 (and ultimately the computer 102) only when the power converter 536 is not connected to either the external battery 540 or the AC power source. The computer 102 further includes a separate battery charger 534 for periodically charging the internal battery 539 and the external battery 540 when not in use. The computer 102 may include a battery-power indicator, attached to the top panel 302 of the system unit 106, for indicating when the power levels of the external battery 540 and/or internal battery 539 are low.
- Preferably, the bus 502, CPU 504, memory 506, input/output interface 510, input/output connectors 518, and power converter 536 described above are implemented using a backplane circuit card, processor circuit card, memory circuit card, input/output circuit card, and input/output connection circuit card in a manner well known to those skilled in the art. The processor circuit card, memory circuit card, input/output circuit card, and input/output connection circuit card are plugged into the backplane circuit card. Preferably, IBM PC/AT compatible and/or 80386 compatible circuit cards available from Dover Electronics Manufacturing of Longmont, CO and Ampro Computers of Sunnyvale, CA are used. The circuit cards from Dover Electronics Manufacturing occupy a cubic space of approximately two inches by five inches by two inches, while each of the circuit cards from Ampro is approximately 3.8 inches by 3.6 inches. However, those having ordinary skill in the art will appreciate that any functionally compatible circuit cards which conform to the relatively small size of the system unit 106 could be used in place of the circuit cards available from Dover Electronics Manufacturing.
- The computer 102 also includes display means which, in the present embodiment, as noted above with reference to Figs. 1 and 2, includes a headband 108, a display screen 110, and an adjustable arm 112 connecting the display screen 110 to the headband 108. As shown in Fig. 5, the display means further includes a display screen driver module 514 which preferably is located in the system unit 106, but which alternatively could be located outside of the system unit 106 adjacent to the display screen 110. The display screen driver module 514 converts display information (that is, information which is to be displayed for an operator) received from the CPU 504 (via the input/output interface 510, bus 502 and input/output connectors 518) into video signals which are sent to and compatible with the display screen 110. The display screen driver module 514 is of a standard design well known to those skilled in the art.
- Preferably, the display screen 110 is a miniature monitor called an "eye piece monitor" which provides a display equivalent to conventional twelve-inch monitors (that is, approximately twenty-five lines by eighty characters per line), but which has a viewing screen with a diagonal length of approximately one inch. Since the display screen 110 is located close to the operator's eye and is supported by the operator's head so that it follows the operator's head movement, the operator is able to view information on the display screen 110 without having to move away from his work bench (where, for example, a device is being repaired) by merely glancing from the device being repaired to the display screen 110. Therefore, the display screen 110, as described above, facilitates the retrieval of information contained in an electronic database, since such information can be viewed without significantly diverting an operator's attention away from his work.
- Those having ordinary skill in the art will appreciate that the display screen 110 and display screen driver module 514 can be implemented using any video technology either available presently or in the future, such as color graphics adaptor (CGA), enhanced graphics adaptor (EGA), video graphics array (VGA), and super VGA. According to a present embodiment, however, the display screen 110 and display screen driver module 514 are implemented using well-known color graphics adaptor (CGA) technology. CGA eye piece monitors are available from many vendors, including Reflection Technology, Inc., of Waltham, MA, which produces and sells the Private Eye™ monitor. Alternatively, the display screen 110 and display screen driver module 514 are implemented using well-known (monochrome or color) video graphics array (VGA) technology. VGA eye piece monitors which operate according to well-known color shutter wheel technology are currently available from sources such as the Nucolor™ Shutters produced by Tektronix, Inc., of Beaverton, Oregon. The display means may alternatively be a flat panel display screen attached to the system unit 106.
- Fig. 11 is a functional block diagram of a conventional display screen 110 connected to a conventional display screen driver module 514, both of which operate according to color shutter wheel technology. The display screen 110 has a monochrome cathode ray tube (CRT) 1106 and a filter 1108 having a color polarizer 1110 for polarizing and separating light from the CRT 1106 into cyan (that is, blue and green) and red components. The filter 1108 also includes a pi-cell 1112, which is preferably a relatively fast liquid-crystal switch, for rotating the polarized light from the color polarizer 1110 by either zero or ninety degrees. The filter 1108 also includes a second color polarizer 1114 for polarizing and separating light from the pi-cell 1112 into yellow (that is, red and green) and blue components, which effectively divides the blue component from the green. The filter 1108 also includes a second pi-cell 1116 for rotating the polarized light from the color polarizer 1114 by either zero or ninety degrees.
- The display screen driver module 514 includes a video interface 1102, which receives signals from the bus 502 representing data to be displayed on the display screen 110 and converts the signals into video signals generally having image and color information. The video signals are received by a shutter controller 1104, which causes the CRT 1106 to display an image in accordance with the image information, and which also controls the filter 1108 in accordance with the color information to convert the image displayed by the CRT 1106 to a color image.
- The shutter controller 1104 causes the filter 1108 to transmit a red, green, or blue color image by selecting an appropriate combination of states of the pi-cells 1112 and 1116. In order to transmit green, the shutter controller 1104 sets both pi-cells 1112 and 1116 to rotate by zero degrees, so that the vertically polarized cyan light from the first color polarizer 1110 passes through the first pi-cell 1112 unaffected. The vertically oriented yellow color polarizer 1114 then absorbs the blue components and leaves only the green component (note that only the vertical components are considered here since the light transmitted through the filter 1108 is always vertically polarized).
- In order to transmit red, the shutter controller 1104 sets the first pi-cell 1112 to rotate by ninety degrees and the second pi-cell 1116 to rotate by zero degrees. Such alignment places the red component into the vertical position so that it passes through the yellow color polarizer 1114 unaffected. In order to transmit blue, the shutter controller 1104 sets both pi-cells 1112 and 1116 to rotate by ninety degrees, so that the blue component separated out by the first color polarizer 1110 leaves the filter 1108 vertically polarized. Color shutter wheel technology is further described in "Reinventing the Color Wheel" by Thomas J. Haven (Information Display, Vol. 7, No. 1, January 1991, pages 11-15).
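The switching sequence above reduces to a small lookup from the desired color field to the two pi-cell rotations. The following is a minimal sketch of that logic (Python; all names are illustrative, and the blue-field setting follows the reconstruction given above rather than an explicit statement in the original text):

```python
# Sketch of the pi-cell switching logic described above (illustrative only;
# the green and red cases follow the text, the blue case is reconstructed
# from the filter's geometry).

# Rotations, in degrees, applied by pi-cells 1112 and 1116 respectively.
PI_CELL_STATES = {
    "green": (0, 0),    # cyan stays vertical; yellow polarizer strips blue
    "red":   (90, 0),   # red is rotated into the vertical position
    "blue":  (90, 90),  # blue is re-rotated to vertical by the second cell
}

def set_filter_color(color: str) -> tuple[int, int]:
    """Return the (pi-cell 1112, pi-cell 1116) rotations for a color field."""
    try:
        return PI_CELL_STATES[color]
    except KeyError:
        raise ValueError(f"filter transmits red, green or blue, not {color!r}")

# A full color frame is produced field-sequentially: the CRT draws the red,
# green and blue components in turn while the filter is switched to match.
for color in ("red", "green", "blue"):
    cell_1112, cell_1116 = set_filter_color(color)
    # drive_pi_cells(cell_1112, cell_1116)  # hypothetical hardware call
```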
- Referring again to Fig. 5, the computer 102 also includes various peripherals such as an internal pointing device 330, storing means such as magnetic storage device 519, measurement means such as instrument pack 116, a microphone 122 and a voice-recognition module 522, which are all connected to the system unit 106 (and, specifically, to the input/output interface 510 which controls data traffic between the peripherals and the CPU 504) via the input/output connectors 518. The internal pointing device 330 is a well-known pointing device such as a mouse, sensitive touch pad, or gyroscopic mouse which is connected to the top panel 302 of the system unit 106 (as shown in Fig. 3) and which represents an alternate means for an operator to interact with the computer 102 (that is, to provide commands and data to the CPU 504) in situations where the operator does not wish to or cannot interact via voice (that is, via the microphone 122 and voice-recognition module 522) as described below. For example, the operator may not be able to interact via voice in work environments where the background noise is excessive even with external hearing protection. The present invention also supports other non-audio input devices, such as bar code readers, touch memory readers and proximity scanners, which may be used when the operator does not wish to or cannot interact with the computer 102 via voice.
- The computer 102 also includes a packet switching component 542 and an antenna 544 which, in the present embodiment, are preferably contained in the system unit 106 and which enable the computer 102 to send and receive information from remote locations via well-known telecommunication means, such as via telephone or satellite. The packet switching component 542 and the antenna 544 are particularly useful for updating a database 520 stored in a magnetic storage device 519 (described below) in real time with information received from a remote computer or other data source. The computer 102 may include a global positioning system component (not shown in Fig. 5) for receiving and processing (via the antenna 544) positioning information from navigation systems, such as the Global Positioning System.
- The magnetic storage device 519, which is preferably contained in the system unit 106, is a static, read/write memory having a relatively large memory capacity, such as a removable or non-removable hard disk drive. In embodiments where the storage device 519 is removable, the system unit 106 contains an external slot for allowing the operator to insert and remove removable storage disks. Optionally, the storage device 519 could be a read-only memory such as a CD-ROM. Preferably, the magnetic storage device 519 includes 80 Mbytes to one gigabyte of memory. Magnetic storage devices which are suitable for use as the magnetic storage device 519 and which have a size compatible with the size of the system unit 106 are produced and sold by various manufacturers such as Integral, Connor, Seagate and Syquest. As shown in Fig. 5, the magnetic storage device 519 stores a database 520 (which may be an ETM or IETM) which may have been previously loaded into the magnetic storage device 519 from a floppy drive (not shown in Fig. 5) which could be connected to the computer 102 via a port on the input/output connectors 518, or from a remote computer via a telecommunication link connected to the computer via the packet switching component 542 and the antenna 544 or by direct cabling.
- The instrument pack 116 includes electrical measurement equipment such as a multimeter 524 and a counter/timer 526, and ports for connecting the electrical measurement equipment to devices being evaluated or serviced, such as an IEEE-488 connector 528, an IEEE-1553 connector 530 and an IEEE-1708 connector 532. Those with ordinary skill in the art will realize that the instrument pack 116 could contain other types of electrical measurement equipment and ports. Referring again to Fig. 1, in operation the electrical measurement equipment in the instrument pack 116 is connected to a device being evaluated or serviced, hereinafter called a device under test (DUT) 120, via a cable 118 which is connected to any one of the connectors 528, 530, 532. The electrical operation of the DUT 120 is measured by the multimeter 524, counter/timer 526 and/or any other measurement equipment contained in the instrument pack 116 (as appropriate for the test being performed). The results of such tests are sent from the instrument pack 116 to the CPU 504 via the input/output connectors 518 and the bus 502. Alternatively, they can be stored on battery-backed memory chips.
- Referring again to Fig. 5, the computer 102 may also include an external monitor 516 which is not supported by an operator (but rather rests on a desk, for example) and which connects to the system unit 106 via the monitor port 402 (shown in Fig. 4). The external monitor 516 receives from the CPU 504 the same display information as the display screen 110 (via the display screen driver module 514). The computer 102 may also include an external keyboard and mouse (not shown), which are connectable to the system unit 106 via the keyboard port 410 and mouse port 412, respectively. The external keyboard and mouse represent conventional means for an operator to interact with the computer 102. Preferably, the external monitor 516, external keyboard and external mouse are connected to the system unit 106 when the computer 102 is operating in a non-portable mode (e.g., as a desk-top computer).
- The voice-recognition module 522 is preferably contained in the system unit 106 and is connected to the microphone 122 (which is preferably an ear microphone located outside of the system unit 106). Alternatively, the voice-recognition module 522 may be located outside of the system unit 106 and, for example, may be incorporated with the microphone 122 as a single unit. Alternatively, the analog-to-digital converter component of the voice-recognition module 522 is located outside of the system unit 106 while the remaining components of the voice-recognition module 522 are located inside the system unit 106, the external analog-to-digital converter preferably communicating with the system unit 106 via a serial communication stream. The microphone 122 receives audio input (also called verbal utterances) from an operator, converts the audio input to electrical signals and digitizes the electrical signals. The voice-recognition module 522 recognizes the verbal utterances (which are in the form of digitized electrical signals) and transfers the recognized verbal utterances to the CPU 504 for processing according to the application program 508. Thus, just as a conventional keyboard driver interprets as characters and words the electrical signals which result from an operator typing on a conventional keyboard, the voice-recognition module 522 interprets (or recognizes) as characters and words the digitized electrical signals which result from an operator speaking near or into the microphone 122. Consequently, like conventional input devices such as keyboards and pointing devices, the voice-recognition module 522 in combination with the microphone 122 provides a means for operators to interact with and control the operation of the computer 102.
- Preferably, the voice-recognition module 522 operates according to well-known dependent voice recognition algorithms and is implemented in hardware of a type well known in the art. According to a preferred embodiment, the voice-recognition module 522 is a dependent voice recognition circuit card available from Voice Connection of Irvine, California. However, those with ordinary skill in the art will appreciate that any dependent voice recognition circuit card having a size compatible with the size of the system unit 106 could be used.
- Alternatively, the voice-recognition module 522 operates according to well-known independent voice recognition algorithms, the independent voice recognition algorithms representing an improvement over dependent voice recognition algorithms. Specifically, an independent voice-recognition module is able to recognize the voices of multiple speakers and includes a "good listener" learning feature for real-time modification of a trained vocabulary model. In contrast, a dependent voice-recognition module can recognize only a single speaker's voice.
- Fig. 6 is a partial block diagram of the
- Fig. 6 is a partial block diagram of the computer 102 wherein the voice-recognition module 522 is implemented in software, rather than in hardware as shown in Fig. 5. More particularly, Fig. 6 is a partial block diagram illustrating the structural differences (as compared to Fig. 5) in the computer 102 which are required to implement the alternate embodiment wherein the voice-recognition module 522 is implemented in software. As shown in Fig. 6, the software implementation of the voice-recognition module is indicated as 522', rather than 522, in order to underscore its software nature, and is stored in the memory 506 while the computer 102 is operating. The voice-recognition module 522' operates either according to dependent or independent voice recognition algorithms, although preferably the voice-recognition module 522' operates according to independent voice recognition algorithms and is implemented as independent voice recognition software produced by Scott Instruments of Denton, Texas. In practice, the application program 508 and the voice-recognition module 522' may be linked into a single computer program which is loaded into the memory 506 from the magnetic storage means 519 pursuant to operator instructions.
- According to the alternate embodiment wherein the voice-recognition module 522' is implemented in software, the computer 102 includes an analog/digital converter 608 which has a buffer 610 and which is preferably contained in the system unit 106. The analog/digital converter 608 is connected to the microphone 122. In operation, the microphone 122 converts audio input spoken by an operator to electrical signals. The analog/digital converter 608 digitizes the electrical signals and stores the digitized electrical signals in the buffer 610 for later retrieval by the CPU 504.
- The operation of the computer system 102 shall now be described with reference to Figs. 7, 8, 9 and 10, wherein Figs. 7 and 8 represent the operation of the computer 102 wherein a dependent voice-recognition module (implemented in either hardware or software) is used, and Figs. 9 and 10 represent the operation of the computer 102 wherein an independent voice-recognition module (implemented in either hardware or software) is used.
- Referring first to Fig. 7, there is shown an operational flow chart for the computer system 102 according to the embodiment wherein a dependent voice-recognition module is used. While performing the steps shown in Fig. 7, the computer system 102 is operating according to the programming contained in the application program 508. For simplicity purposes, Fig. 7 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 5, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 6 operates in substantially the same manner with regard to the flow chart in Fig. 7.
- In step 704, the CPU 504 waits for user input from a keyboard buffer (not shown). In practice, the CPU 504 may perform step 704 by either polling the keyboard buffer or by receiving an interrupt when the keyboard buffer is full. In either case, the CPU 504 receives user input from the keyboard buffer when the keyboard buffer is flushed by a keyboard driver or other operating system tool which is associated with the keyboard buffer. Note that the user input in the keyboard buffer may originate from a conventional input device such as a keyboard or pointing device, or from the voice-recognition module 522.
- In step 706, the CPU 504 processes the user input according to the programming contained in the application program 508. For example, the user input could represent a command from the user to retrieve particular data from the database 520 stored in the magnetic storage device 519. The CPU 504 processes the user command by accessing and retrieving the requested data from the database 520 and transferring the retrieved data to the display screen 110 for display to the user. Specifically, the CPU 504 transfers the retrieved data to the input/output interface 510 along with a request to display the data on the display screen 110. The input/output interface 510 converts the retrieved data to generic video signals generally appropriate for display on monitors and sends the generic video signals to the display screen driver module 514 via the input/output connectors 518. The display screen driver module 514 translates the generic video signals to video signals compatible with the particular display screen 110. Alternatively, the user input could be information which the user is providing to the computer 102 (for storage in the database 520, for example).
- In step 708, the CPU 504 determines if the user command represented an exit request. If the user command did not represent an exit request, then the CPU 504 loops back to step 704 to await further user input.
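Steps 704 through 708 amount to a simple blocking event loop. A minimal sketch follows (Python); the helper functions and the "EXIT" token are hypothetical stand-ins, since the patent leaves the keyboard driver and the application program's command set unspecified:

```python
# Minimal sketch of the Fig. 7 event loop (steps 704-708). The helpers are
# hypothetical stand-ins for the keyboard driver and application program 508.

def read_keyboard_buffer() -> str:
    """Block until the keyboard buffer is flushed, then return its contents.
    The input may come from a real keyboard, a pointing device, or from the
    voice-recognition module, which stuffs recognized keystrokes into the
    same buffer."""
    raise NotImplementedError  # platform-specific

def process_command(command: str) -> None:
    """Application-specific handling, e.g. retrieve data from the database
    and send it to the display screen driver."""
    raise NotImplementedError

def application_main_loop() -> None:
    while True:
        user_input = read_keyboard_buffer()  # step 704: wait for input
        process_command(user_input)          # step 706: act on it
        if user_input == "EXIT":             # step 708: exit request?
            break
```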
- Fig. 8 is an operational flow chart of the voice-recognition module 522 (shown in Fig. 5), or of the voice-recognition module 522' in combination with the analog/digital converter 608 (shown in Fig. 6), wherein the voice-recognition module 522 or 522' operates according to dependent voice recognition algorithms. Note that before the steps in Fig. 8 are performed, the voice-recognition module 522 or 522' is trained using well-known methods for a particular operator to produce a vocabulary model containing words and phrases which the voice-recognition module 522 or 522' recognizes. Since it is well known, the vocabulary model is not discussed here except to note that, for each word or phrase which the voice-recognition module 522 or 522' recognizes, the vocabulary model contains a finite number (such as five) of digitized waveforms corresponding to different ways in which the particular operator (for whom the voice-recognition module 522 or 522' is trained) pronounces the word or phrase. Fig. 12 is an example of one such waveform of a two-word phrase (such as "Next Menu" or "End Program") wherein the first word is represented by points between times four and thirteen and the second word is represented by points between times sixteen and twenty-two. The words are delimited by the points between times zero and three, fourteen and fifteen, and twenty-three and twenty-six, which fall below an energy threshold that is determined for the particular operator. The vocabulary model also contains one or more operator-selected keystrokes associated with each word and phrase which the voice-recognition module 522 or 522' recognizes. For example, the phrase "End Program" may be associated with the keystrokes "Ctrl", "X" if the keystrokes "Ctrl", "X" are recognized by the application program 508 as the "End Program" command.
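The vocabulary model just described can be pictured as a table keyed by word or phrase, holding the operator's training waveforms and the associated keystrokes. A sketch of such a structure follows (Python; the field names and illustrative constants are assumptions, not part of the original disclosure):

```python
# Sketch of the dependent vocabulary model described above. The patent
# specifies only the content: several digitized waveforms per word or phrase
# (one per observed pronunciation) plus operator-selected keystrokes.

from dataclasses import dataclass, field

@dataclass
class VocabularyEntry:
    phrase: str                                      # e.g. "End Program"
    waveforms: list[list[int]] = field(default_factory=list)  # ~5 per phrase
    keystrokes: list[str] = field(default_factory=list)       # e.g. ["Ctrl", "X"]

vocabulary_model = [
    VocabularyEntry(phrase="End Program", keystrokes=["Ctrl", "X"]),
    VocabularyEntry(phrase="Next Menu", keystrokes=["Enter"]),  # illustrative
]

# Per-operator training also determines the energy threshold used for word
# boundary detection (see Fig. 12) and the gap length that joins two words
# into one phrase. Both values below are illustrative.
ENERGY_THRESHOLD = 10
MAX_GAP_POINTS = 5
```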
- Fig. 8 shall now be described. For simplicity purposes, Fig. 8 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 5, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 6 operates in substantially the same manner with regard to the flow chart in Fig. 8.
- Referring now to Fig. 8, in step 806 the voice-recognition module 522 receives from the microphone 122 electrical signals representing a verbal utterance spoken by an operator and digitizes the received electrical signals.
- In step 808, the voice-recognition module 522 identifies the boundaries of a word or phrase contained in the digitized electrical signals by locating points in the digitized electrical signals which fall below the energy threshold (as shown in Fig. 12, for example) for the particular operator, the located points representing the boundaries of words. The voice-recognition module 522 determines that the digitized electrical signals contain a phrase (that is, multiple words), rather than a single word, if words in the digitized electrical signals are separated by less than a predefined number (which is set according to the speaking characteristics of the particular operator) of points which fall below the energy threshold. The points in the digitized electrical signals corresponding to a word or phrase represent a digitized verbal utterance waveform which was received from the operator (that is, which was spoken by the operator). For example, if Fig. 12 represents the digitized electrical signals and if the predefined number is five, then the voice-recognition module 522 determines (in step 808) that the digitized electrical signals contain a phrase having two words because the two words are separated by only four points (between times thirteen and fifteen) which fall below the energy threshold. The points in the digitized electrical signals corresponding to the two-word phrase represent the received digitized verbal utterance waveform.
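The boundary detection of step 808 can be sketched as follows (Python). The function assumes amplitudes at or above the energy threshold belong to a word and merges words separated by fewer than the predefined number of sub-threshold points, as in the Fig. 12 example; the function name and parameters are illustrative:

```python
# Sketch of step 808: points below the operator's energy threshold delimit
# words, and words separated by a short sub-threshold gap form one phrase.

def find_words(samples: list[int], threshold: int, max_gap: int) -> list[tuple[int, int]]:
    """Return (start, end) index pairs, merging words split by short gaps."""
    words: list[tuple[int, int]] = []
    start = None
    for i, amplitude in enumerate(samples + [0]):   # sentinel flushes last word
        if amplitude >= threshold and start is None:
            start = i                                # a word begins
        elif amplitude < threshold and start is not None:
            words.append((start, i - 1))             # the word ends
            start = None
    # Merge words whose separating gap is shorter than max_gap: they form a
    # single phrase, as with the short gap in the Fig. 12 example.
    merged = [words[0]] if words else []
    for s, e in words[1:]:
        prev_s, prev_e = merged[-1]
        if s - prev_e - 1 < max_gap:
            merged[-1] = (prev_s, e)
        else:
            merged.append((s, e))
    return merged
```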
- In steps 810 through 816, the voice-recognition module 522 recognizes the received digitized verbal utterance waveform by matching it against all the digitized waveforms (corresponding to different ways in which the particular operator pronounces different words) contained in the vocabulary model. Specifically, in step 810 the voice-recognition module 522 selects the next word from the vocabulary model to process. Recall that the vocabulary model contains a finite number of digitized waveforms for the selected word, corresponding to different ways in which the particular operator pronounces the selected word, and that the vocabulary model contains one or more operator-selected keystrokes associated with the selected word.
- In step 812, the voice-recognition module 522 performs a point-by-point comparison of the received digitized verbal utterance waveform and the digitized waveforms for the selected word and determines whether there is a match between the received digitized verbal utterance waveform and any of the digitized waveforms for the selected word. The voice-recognition module 522 determines that a match occurs between a point in the received digitized verbal utterance waveform and a corresponding-in-time point contained in one of the digitized waveforms if the difference between the two points is less than an operator-predefined limit, which may be zero for some implementations.
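The comparison of step 812 reduces to a per-point tolerance test. A minimal sketch follows (Python), assuming the two waveforms are compared only when they have equal length, a detail the text does not specify:

```python
# Sketch of the point-by-point comparison of step 812: two waveforms match
# when every pair of corresponding-in-time points differs by no more than
# the operator-defined limit (zero demands an exact match).

def waveforms_match(utterance: list[int], template: list[int], limit: int) -> bool:
    if len(utterance) != len(template):
        return False   # assumption: only equal-length waveforms are compared
    return all(abs(u - t) <= limit for u, t in zip(utterance, template))
```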
- In step 814, the voice-recognition module 522 determines whether a match was found in step 812 and proceeds to step 816 if a match was not found. In step 816, the voice-recognition module 522 determines whether there are any more words in the vocabulary model left to process. If, in step 816, the voice-recognition module 522 determines that there are more words left to process, then the voice-recognition module 522 loops back to step 810 to select the next word from the vocabulary model to process. Otherwise, the voice-recognition module 522 determines that the received digitized verbal utterance waveform cannot be recognized based on the current training of the vocabulary model and returns to step 806 to await further operator input. Note that, when operating according to dependent voice-recognition algorithms, the voice-recognition module 522 does not interact with the operator to update the training of the vocabulary model to recognize the received digitized verbal utterance waveform.
- If, in step 814, the voice-recognition module 522 determines that a match was found in step 812, then the voice-recognition module 522 performs step 818. In step 818, the voice-recognition module 522 translates the received digitized verbal utterance waveform to the keystrokes contained in the vocabulary model and associated with the selected word (which the received digitized verbal utterance waveform matched). In step 820, the voice-recognition module 522 stores the keystrokes in the keyboard buffer. As described above, the keystrokes in the keyboard buffer are processed by the CPU 504 when the keyboard buffer is flushed (see the above text describing step 704 in Fig. 7).
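Steps 818 and 820 can be sketched as a lookup-and-append on the keyboard buffer (Python; the module-level list standing in for the operating system's keyboard buffer is an assumption):

```python
# Sketch of steps 818-820: the keystrokes associated with the matched word
# are appended to the keyboard buffer, where the CPU later consumes them
# exactly as if the operator had typed them (step 704 of Fig. 7).

keyboard_buffer: list[str] = []   # stand-in for the OS keyboard buffer

def emit_keystrokes(keystrokes: list[str]) -> None:
    """Append a matched word's keystrokes, e.g. ["Ctrl", "X"]."""
    keyboard_buffer.extend(keystrokes)
```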
- Fig. 9 is an operational flow chart for the computer system 102 according to the embodiment of the present invention wherein an independent voice-recognition module is used. While performing the steps shown in Fig. 9, the computer system 102 is operating according to the programming contained in the application program 508.
- For simplicity purposes, Fig. 9 is described below with reference to the structural embodiment of the computer 102 shown in Fig. 6, although it should be understood that the alternate structural embodiment of the computer 102 shown in Fig. 5 operates in substantially the same manner with regard to the flow chart in Fig. 9.
- Note that before the steps in Fig. 9 are performed, the voice-recognition module 522' is trained for multiple operators using well-known methods to produce a vocabulary model containing words and phrases which the voice-recognition module 522' recognizes. Since it is well known, the vocabulary model is not discussed here except to note that, for each word or phrase which the voice-recognition module 522' recognizes, the vocabulary model contains a single digitized waveform corresponding to a base line pronunciation of the word or phrase. Fig. 12, which was described above, is an example of one such digitized waveform containing a two-word phrase. The vocabulary model also contains operator-selected strings (each having one or more characters) associated with each word or phrase which the voice-recognition module 522' recognizes. For example, the phrase "End Program" may be associated with the string "〈Ctrl〉X" if the string "〈Ctrl〉X" is recognized by the application program 508 as the "End Program" command.
- Referring now to Fig. 9, in
- Referring now to Fig. 9, in step 904 the CPU 504 determines whether a verbal utterance, spoken by an operator into the microphone 122, exists at the analog/digital converter 608. In practice, the microphone 122 converts audio signals from the operator (that is, verbal utterances from the operator) into electrical signals and sends the electrical signals to the analog/digital converter 608. The analog/digital converter 608 digitizes the received electrical signals and places the digitized electrical signals into the buffer 610. The CPU 504 then performs step 904 by polling the analog/digital converter 608 to determine when the buffer 610 is full. When the buffer 610 is full, the CPU 504 instructs the analog/digital converter 608 to send the digitized electrical signals, which represent a digitized verbal utterance, from the buffer 610 to the CPU 504 via the bus 502.
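The polling behavior of step 904 can be sketched as a busy-wait on the converter's buffer (Python; the converter object, its methods and the polling interval are hypothetical stand-ins for hardware access the patent does not detail):

```python
# Sketch of step 904: the CPU polls the analog/digital converter until its
# buffer is full, then pulls the digitized utterance across the bus.

import time

def await_utterance(converter) -> list[int]:
    while not converter.buffer_full():   # poll rather than take an interrupt
        time.sleep(0.01)                 # illustrative polling interval
    return converter.read_buffer()       # the digitized verbal utterance
```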
- In step 906, the CPU 504 receives the digitized verbal utterance from the analog/digital converter 608 via the bus 502.
- In step 908, the CPU 504 causes the voice-recognition module 522' to recognize the digitized verbal utterance against a subset of the words and phrases contained in the vocabulary model. That is, the CPU 504 passes a number of words and phrases to the voice-recognition module 522' and instructs the voice-recognition module 522' to determine whether the digitized verbal utterance matches any of the stored digitized base line waveforms associated with the passed words and phrases, taking into consideration the permitted variances for each of the digitized base line waveforms. As a result of step 908, the voice-recognition module 522' passes the string stored in the vocabulary model and associated with the matched word or phrase to the CPU 504 for processing.
CPU 504 to the voice-recognition module 522' may represent menu choices from a current context of theapplication program 508. For example, theapplication program 508 might have asked the operator to select from a menu of choices. Inprocessing step 908, the voice-recognition module 522' would determine whether the operator's verbal response matched any of the menu choices. In practice, in processing the steps of Fig. 9 theCPU 504 may interact with the voice-recognition module 522' by invoking software routines stored in the voice-recognition module 522' pursuant to programming in the application program 522'. - In
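A context-dependent subset might thus be assembled along the following lines; this is a sketch only, reusing the hypothetical VocabEntry above, with the menu choices invented for the example.

```python
def context_subset(vocabulary: list[VocabEntry],
                   menu_choices: set[str]) -> list[VocabEntry]:
    # Step 908 matches only against entries for the current context, e.g.
    # the menu the application program 508 has just offered the operator.
    return [entry for entry in vocabulary if entry.phrase in menu_choices]

# e.g. subset = context_subset(vocabulary, {"Next Page", "Previous Page", "End Program"})
```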
- In step 916, the CPU 504 processes the recognized word or phrase (that is, the string that was returned in step 908) in accordance with the particular programming of the application program 508. For example, the recognized word or phrase may represent a command from the operator to access particular information from the database 520. In this case, the CPU 504 would access and retrieve the requested data from the database 520 located in the magnetic storage device 519. The CPU 504 would then send the retrieved data to the display screen 110 for display to the operator.
- In step 918, the CPU 504 determines whether the recognized word or phrase represented an exit command. If the recognized word or phrase did not represent an exit command, then the CPU 504 loops back to step 904 to await further operator instructions. Otherwise, the CPU 504 terminates the application program in step 920. Taken together, steps 904 through 920 form the command loop sketched below.
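The Fig. 9 loop can be condensed into a short sketch like the following; it reuses wait_for_utterance() from the earlier sketch, recognize() stands for the Fig. 10 procedure sketched piecewise in the remainder of this section, and the exit string is an assumption.

```python
def command_loop(adc, vocabulary: list, process_command) -> None:
    EXIT_STRING = "<Ctrl>X"  # hypothetical string for the exit command
    while True:
        utterance = wait_for_utterance(adc)            # steps 904-906
        keystrokes = recognize(utterance, vocabulary)  # step 908 (Fig. 10)
        process_command(keystrokes)                    # step 916, e.g. a database 520 lookup
        if keystrokes == EXIT_STRING:                  # step 918
            break                                      # step 920
```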
- The manner in which the voice-recognition module 522' performs step 908 (shown in Fig. 9) is illustrated in Fig. 10, which is an operational flow chart of the voice-recognition module 522' wherein the voice-recognition module 522' operates according to independent voice-recognition algorithms. Note that before the steps in Fig. 10 are performed, the voice-recognition module 522' is trained for multiple operators using well-known methods, as described above with reference to Fig. 9.
- In step 1004, the voice-recognition module 522' identifies the boundaries of a word or phrase contained in the digitized electrical signals (which represent the digitized verbal utterance) by locating points in the digitized electrical signals which fall below the energy threshold (as shown in Fig. 12, for example), the located points representing the boundaries of words. The voice-recognition module 522' determines that the digitized electrical signals contain a phrase (that is, multiple words), rather than a single word, if words in the digitized electrical signals are separated by fewer than a predefined number of points which fall below the energy threshold, the predefined number being set according to the speaking characteristics of the particular operator. The points in the digitized electrical signals corresponding to a word or phrase represent the digitized verbal utterance waveform which was received from the operator (that is, which was spoken by the operator). For example, if Fig. 12 represents the digitized electrical signals and if the predefined number is five, then the voice-recognition module 522' determines (in step 1004) that the digitized electrical signals contain a phrase having two words, because the two words are separated by only four points (between times thirteen and fifteen) which fall below the energy threshold. The points in the digitized electrical signals corresponding to the two-word phrase represent the received digitized verbal utterance waveform.
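Under the stated assumptions (integer amplitudes, a fixed energy threshold, and a per-operator gap limit), the boundary-finding of step 1004 might be sketched as:

```python
def find_words(signal: list[int], threshold: int,
               max_gap: int) -> list[tuple[int, int]]:
    # Runs of points at or above the energy threshold are words; the
    # below-threshold points between them mark the word boundaries.
    runs: list[tuple[int, int]] = []
    start = None
    for i, amplitude in enumerate(signal):
        if amplitude >= threshold:
            if start is None:
                start = i
        elif start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(signal) - 1))
    # Words separated by fewer than max_gap below-threshold points form one
    # phrase: a four-point gap with max_gap five yields a two-word phrase,
    # as in the Fig. 12 example.
    phrases = runs[:1]
    for s, e in runs[1:]:
        if s - phrases[-1][1] - 1 < max_gap:
            phrases[-1] = (phrases[-1][0], e)
        else:
            phrases.append((s, e))
    return phrases
```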
- In step 1006, the voice-recognition module 522' selects the next word or phrase from the subset of words and phrases which the CPU 504 passed to the voice-recognition module 522'.
- In step 1008, the voice-recognition module 522' performs a point-by-point comparison of the received digitized verbal utterance waveform and the digitized base line waveform associated with the selected word or phrase, and determines whether there is a match between the two waveforms in light of the permitted variances associated with the digitized base line waveform. The voice-recognition module 522' determines that a match occurs between a point in the received digitized verbal utterance waveform and the corresponding-in-time point in the digitized base line waveform if the difference between the two points is less than the permitted variance for that particular point. In processing step 1008, the voice-recognition module 522' generates a recognition score for the selected word or phrase which essentially represents a cumulative sum of the differences between the received digitized verbal utterance waveform and the digitized base line waveform at each point. A lower recognition score therefore indicates a closer match between the received digitized verbal utterance waveform and the digitized base line waveform.
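The score itself reduces to a short function; this sketch keeps only the cumulative sum described above and leaves the variance handling to the thresholding of step 1010.

```python
def recognition_score(utterance: list[int], entry: VocabEntry) -> int:
    # Cumulative sum of the point-by-point differences between the received
    # utterance waveform and the base line waveform; lower is a closer match.
    return sum(abs(u - b) for u, b in zip(utterance, entry.baseline))
```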
- In step 1010, the voice-recognition module 522' determines whether the recognition score for the selected word or phrase is less than a maximum permitted score, which is software-adjustable by the operator. If the recognition score for the selected word or phrase is greater than the maximum permitted score, then the voice-recognition module 522' determines that the selected word or phrase and the received digitized verbal utterance waveform did not match, and processes step 1014. If the recognition score for the selected word or phrase is less than the maximum permitted score, then the voice-recognition module 522' determines that the selected word or phrase and the received digitized verbal utterance waveform matched (to a certain confidence level, as described below), and processes step 1012.
- In step 1012, the voice-recognition module 522' adds the selected word or phrase to an acceptable words list. Words and phrases in the acceptable words list are ordered in ascending order of their respective recognition scores, such that the first entry in the acceptable words list has the lowest recognition score.
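Steps 1010 through 1014 can then be sketched as a filter-and-sort over the passed subset; acceptable_words() and max_score are assumed names for this sketch.

```python
def acceptable_words(utterance: list[int], subset: list[VocabEntry],
                     max_score: int) -> list[tuple[int, VocabEntry]]:
    # Keep only entries scoring below the operator-adjustable maximum
    # (step 1010), ordered ascending so the first entry has the lowest
    # recognition score (step 1012).
    scored = [(recognition_score(utterance, e), e) for e in subset]
    kept = [(s, e) for (s, e) in scored if s < max_score]
    return sorted(kept, key=lambda pair: pair[0])
```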
- In step 1014, the voice-recognition module 522' determines whether all the words and phrases which the CPU 504 passed to the voice-recognition module 522' have been processed. If there are more words or phrases to process, then the voice-recognition module 522' loops back to step 1006. Otherwise, the voice-recognition module 522' processes step 1016.
- In the remaining steps in Fig. 10, the voice-recognition module 522' interacts with the operator to determine which of the words or phrases in the acceptable words list is the word or phrase which was spoken by the operator. Specifically, in step 1016 the voice-recognition module 522' selects the first entry in the acceptable words list. Recall that the first entry has the lowest recognition score of all the entries in the acceptable words list.
- In step 1018, the voice-recognition module 522' generates a confidence value for the first entry by subtracting the recognition score of the first entry from the recognition score of the second entry. The confidence value of the first entry quantifies how confident the voice-recognition module 522' is that the first entry is the word or phrase which was spoken by the operator. In step 1020, the voice-recognition module 522' determines whether the confidence value of the first entry is high or low by comparing the confidence value to an operator-selected minimum confidence value. If the confidence value is greater than the minimum confidence value, then the confidence value is high and the voice-recognition module 522' processes step 1030. In step 1030, since the confidence in the first entry is high, the voice-recognition module 522' returns the string associated with the first entry to the CPU 504 for processing (in step 916 of Fig. 9).
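As a sketch, with the single-entry case handled by an assumption the patent does not address:

```python
def confidence_value(acceptable: list[tuple[int, VocabEntry]]) -> float:
    # Step 1018: the gap between the two lowest recognition scores; a large
    # gap means the first entry is a clearly better match than the runner-up.
    if len(acceptable) < 2:
        return float("inf")  # assumption: a lone candidate is fully confident
    return acceptable[1][0] - acceptable[0][0]
```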
- If, in step 1020, the confidence value is less than the minimum confidence value, then the confidence value is low and the voice-recognition module 522' processes step 1022. In step 1022, the voice-recognition module 522' asks the operator whether the first entry is the correct word or phrase (that is, the word or phrase spoken by the operator). The voice-recognition module 522' may query the operator by displaying a message to the operator on the display screen 110, or by sending the message in audio form to the operator via the speaker 332 (or, alternatively, via the microphone/speaker 122 if the microphone/speaker 122 is an ear microphone, or via an independent speaker if the microphone 122 is a headset microphone and does not have a transducer for outputting audio signals). The operator may respond to the query from the voice-recognition module 522' in a number of ways, such as by speaking into the microphone 122 or by manipulating the pointing device 330. If the operator responds that the first entry is correct, then the voice-recognition module 522' processes step 1028 (described below). Otherwise, the voice-recognition module 522' processes step 1024.
- In step 1024, since the operator has indicated that the first entry is not correct, the voice-recognition module 522' determines whether there are any more entries in the acceptable words list to process. If there are more entries to process, then the voice-recognition module 522' processes step 1026, wherein the voice-recognition module 522' selects the next entry in the acceptable words list and then loops back to step 1022 to ask the operator whether the newly selected entry is correct. Otherwise, if there are no further entries to process (see step 1024), the voice-recognition module 522' processes step 1028.
- As described above, step 1028 is reached when either (1) in step 1022, the operator indicates that the selected entry is correct, or (2) in step 1024, the operator effectively indicates that none of the entries in the acceptable words list is correct. In step 1028, the voice-recognition module 522' interacts with the operator to update the training of the vocabulary model according to well-known methods, such that the voice-recognition module 522' will be able to recognize the operator's verbal utterance more confidently in the future. Since the voice-recognition module 522' utilizes well-known methods in step 1028, step 1028 will not be discussed further.
- After updating the training of the vocabulary model, the voice-recognition module 522' processes step 1030, wherein the voice-recognition module 522' returns to the CPU 504, for processing (in step 916 of Fig. 9), the string associated with the word or phrase which matches the verbal utterance as indicated by the operator (in steps 1022 and 1028).
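The interaction of steps 1016 through 1030 condenses to a sketch like the following, where confirm is a hypothetical callback standing in for the operator query of step 1022, and the retraining of step 1028 is represented only by a comment. Chaining acceptable_words() and resolve() yields the recognize() procedure assumed in the Fig. 9 loop sketch above.

```python
from typing import Callable, Optional

def resolve(acceptable: list[tuple[int, VocabEntry]], min_confidence: float,
            confirm: Callable[[str], bool]) -> Optional[str]:
    # Steps 1020/1030: return the best entry at once when confidence is high.
    if acceptable and confidence_value(acceptable) > min_confidence:
        return acceptable[0][1].keystrokes
    # Steps 1022-1026: otherwise ask the operator about each entry in turn.
    for _, entry in acceptable:
        if confirm(entry.phrase):
            return entry.keystrokes  # confirmed entry, returned via step 1028
    return None  # nothing confirmed; step 1028 updates the vocabulary model
```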
- While preferred embodiments of the invention have been described and certain modifications thereto suggested, it will be recognized by those skilled in the art that other changes may be made to the above-described embodiments within the scope of the appended claims. For instance, the computer system 102 of the present invention may be used in many applications, including (but not limited to) applications for the handicapped and for education, testing, service and inventory purposes.
Claims (10)
- A compact, self-contained portable computing apparatus for being completely supported by a user for hands-free retrieval and display of information for the user comprising:
a housing (106) having securing means for removably mounting the housing on a user for being carried without hands by the user;
storage means (519) mounted in the housing for storing previously entered information;
processor means (504) mounted in the housing and communicating with the storage means for receiving, retrieving and processing information and user commands in accordance with a stored program (508);
audio transducer and converter means (122) in communication with the processor means for receiving audio commands from the user, for converting the received audio commands into electrical signals, and for sending the converted electrical signals to the processor means, the audio transducer and converter means being supported hands-free by the user;
said processor means including means (522) for recognizing a command in the converted electrical signals and responding to the recognized command by outputting information derived by operation of the stored program;
computer display means (110) in communication with the processor means for receiving the outputted information from the processor means and for displaying the received information for the user; and
means (108) for mounting the computer display means on the user such that the computer display means is carried hands-free in a portion of the view of the user whereby the computing apparatus is capable of being operated and of displaying the received information in a hands-free manner utilizing only the audio commands. - The computing apparatus of claim 1, wherein the recognizing means (522) comprises means for matching the converted electrical signals against signals representative of words and phrases in a preprogrammed vocabulary model in accordance with dependent voice-recognition algorithms.
- The computing apparatus of claim 1, wherein the recognizing means (522) comprises means for defining a subset of a preprogrammed vocabulary model according to a current context of the stored program and for matching the converted electrical signals against the defined subset in accordance with independent voice-recognition algorithms.
- The computing apparatus of any one of claims 1 to 3, wherein the display means comprises:
visual display means (110); and
driver means (514) in communication with the visual display means for converting the information received from the processor means (504) into video signals compatible with the visual display means and for displaying the compatible video signals on the visual display means. - The computing apparatus of claim 4, wherein the visual display means comprises a monochrome display screen (1106) optically coupled to a filter (1108) having a first color polarizer (1110) for polarizing and separating light from the screen into cyan and red components, a first liquid-crystal switch (1112) for rotating polarized light from the first color polarizer by zero or ninety degrees, a second color polarizer (1114) for polarizing and separating light from the first liquid-crystal switch into yellow and blue components, and a second liquid-crystal switch (1116) for rotating polarized light from the second color polarizer by zero or ninety degrees.
- The computing apparatus of claim 4 or claim 5, wherein the visual display means (1106) comprises a color graphics adaptor (CGA) eye piece monitor, an enhanced graphics adaptor (EGA) eye piece monitor, a video graphics array (VGA) eye piece monitor, or a super VGA eye piece monitor.
- The computing apparatus of any one of claims 1 to 6 including measurement means (116) in communication with the processor means (504) for performing electrical measurements on devices being evaluated, for receiving electrical measurement signals from the devices being evaluated as a result of performing the electrical measurements, and for sending the received electrical measurement signals to the processor means, the measurement means being supported by the user.
- A method for hands-free retrieval and display of information by a compact, self-contained portable computing apparatus (102) comprising:
receiving a verbal utterance from the user of the apparatus by means of an audio transducer (122);
converting the received verbal utterance into electrical signals;
transferring the converted electrical signals from the audio transducer to a processing means of the computing apparatus;
recognizing (522) the converted electrical signals by the processing means to identify a user command matching the received verbal utterance;
processing the user command by the processing means in accordance with a stored program, the processing step including: determining whether the user command represents an information retrieval request; if the user command represents an information retrieval request, then retrieving the requested information from a database (520) previously entered into a storage means (519); and displaying the retrieved information on a computer display device (110) supported by the user, whereby the computing apparatus may be operated and caused to display the retrieved information in a hands-free manner utilizing only audio commands. - The method of claim 8, wherein the recognizing step comprises:
selecting a word or phrase from a pre-trained vocabulary model;
determining whether the converted electrical signals match one of a finite number of electrical waveforms associated with the word or phrase; and
if the converted electrical signals match one of the finite number of electrical waveforms associated with the selected word or phrase, then translating the converted electrical signals to one or more particular keystrokes associated with the selected word or phrase. - The method of claim 8, wherein the recognizing step comprises:
defining a subset of a pre-trained vocabulary model according to a current context of the stored program;
selecting a word or phrase from the subset;
determining whether deviations between the converted electrical signals and a base line waveform are within permitted variances, the base line waveform and permitted variances being associated with the selected word or phrase;
if deviations between the converted electrical signals and the base line waveform are within the permitted variances, then calculating a confidence value for the selected word or phrase;
if the confidence value is greater than a predetermined confidence value, then translating the converted electrical signals to a string associated with the selected word or phrase; and
if the confidence value is less than the predetermined confidence value, then interacting with the user to modify the pre-trained vocabulary model for the converted electrical signals.
Priority Applications (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07863619 US5305244B2 (en) | 1992-04-06 | 1992-04-06 | Hands-free user-supported portable computer |
CA002114336A CA2114336C (en) | 1992-04-06 | 1994-01-27 | Hands-free user-supported portable computer |
EP94300856A EP0670537B1 (en) | 1992-04-06 | 1994-02-04 | Hands-free user-supported portable computer |
CNB941027708A CN1154946C (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported portable computer |
ES94300856T ES2164688T3 (en) | 1992-04-06 | 1994-02-04 | PORTABLE COMPUTER, SUPPORTED BY THE USER, USING HANDS-FREE. |
DE69428264T DE69428264T2 (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported portable calculator |
AU54891/94A AU661223B1 (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported portable computer |
DK94300856T DK0670537T3 (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported laptop |
HK98111547A HK1010759A1 (en) | 1992-04-06 | 1998-10-26 | Hands-free user-supported portable computer |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07863619 US5305244B2 (en) | 1992-04-06 | 1992-04-06 | Hands-free user-supported portable computer |
CA002114336A CA2114336C (en) | 1992-04-06 | 1994-01-27 | Hands-free user-supported portable computer |
EP94300856A EP0670537B1 (en) | 1992-04-06 | 1994-02-04 | Hands-free user-supported portable computer |
CNB941027708A CN1154946C (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported portable computer |
AU54891/94A AU661223B1 (en) | 1992-04-06 | 1994-02-04 | Hands-free, user-supported portable computer |
Publications (2)
Publication Number | Publication Date |
---|---|
EP0670537A1 (en) | 1995-09-06 |
EP0670537B1 EP0670537B1 (en) | 2001-09-12 |
Family ID=27507052
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP94300856A Expired - Lifetime EP0670537B1 (en) | 1992-04-06 | 1994-02-04 | Hands-free user-supported portable computer |
Country Status (9)
Country | Link |
---|---|
US (1) | US5305244B2 (en) |
EP (1) | EP0670537B1 (en) |
CN (1) | CN1154946C (en) |
AU (1) | AU661223B1 (en) |
CA (1) | CA2114336C (en) |
DE (1) | DE69428264T2 (en) |
DK (1) | DK0670537T3 (en) |
ES (1) | ES2164688T3 (en) |
HK (1) | HK1010759A1 (en) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0790584A2 (en) * | 1996-02-13 | 1997-08-20 | Sanyo Electric Co. Ltd | A system and method for equipment design and production |
EP1013157A1 (en) * | 1997-09-11 | 2000-06-28 | Comsonics, Inc. | Hands-free signal level meter |
EP1025946A1 (en) * | 1999-02-04 | 2000-08-09 | La Soudure Autogene Francaise | Protection mask/ power supply for electric welding or cutting |
WO2000062222A1 (en) * | 1999-04-14 | 2000-10-19 | Syvox Corporation | Interactive voice unit for giving instruction to a worker |
EP1208920A1 (en) * | 2000-11-22 | 2002-05-29 | Heinrich Kuper Gmbh & Co Kg | Method and device for sorting carpets |
DE10100425A1 (en) * | 2000-12-13 | 2002-06-20 | Imelauer Heinz | Portable data transmission device, directly transmitting recorded, optic and/or acoustic information into Internet, and/or comparable network |
GB2380044A (en) * | 2001-09-25 | 2003-03-26 | Draeger Safety Ag & Co Kgaa | Helmet or mask comprising a voice controlled display showing measurement data |
EP1315113A1 (en) * | 1995-10-02 | 2003-05-28 | Xybernaut Corporation | Hands-free, portable computer and system |
GB2346720B (en) * | 1999-02-12 | 2003-12-31 | Fisher Rosemount Systems Inc | A wearable computer in a process control environment |
FR2845850A1 (en) * | 1996-08-02 | 2004-04-16 | Symbol Technologies Inc | MOBILE TERMINAL, SYSTEM COMPRISING SAME, AND METHOD FOR CREATING A MOBILE SITE |
US6806847B2 (en) | 1999-02-12 | 2004-10-19 | Fisher-Rosemount Systems Inc. | Portable computer in a process control environment |
US6961700B2 (en) | 1996-09-24 | 2005-11-01 | Allvoice Computing Plc | Method and apparatus for processing the output of a speech recognition engine |
EP1603115A1 (en) | 2004-06-03 | 2005-12-07 | Nintendo Co., Limited | Speech command processing apparatus |
EP2116966A1 (en) * | 2008-05-05 | 2009-11-11 | Rheinmetall Waffe Munition GmbH | System for voice-controlled, interactive support for maintenance work or similar |
US7640007B2 (en) | 1999-02-12 | 2009-12-29 | Fisher-Rosemount Systems, Inc. | Wireless handheld communicator in a process control environment |
US11432412B2 (en) | 2017-07-12 | 2022-08-30 | Hewlett-Packard Development Company, L.P. | VR/AR sleeves |
Families Citing this family (227)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5450596A (en) * | 1991-07-18 | 1995-09-12 | Redwear Interactive Inc. | CD-ROM data retrieval system using a hands-free command controller and headwear monitor |
US5303085A (en) | 1992-02-07 | 1994-04-12 | Rallison Richard D | Optically corrected helmet mounted display |
US6097543A (en) | 1992-02-07 | 2000-08-01 | I-O Display Systems Llc | Personal visual display |
US5864326A (en) | 1992-02-07 | 1999-01-26 | I-O Display Systems Llc | Depixelated visual display |
US5305244B2 (en) * | 1992-04-06 | 1997-09-23 | Computer Products & Services I | Hands-free user-supported portable computer |
US5491651A (en) * | 1992-05-15 | 1996-02-13 | Key, Idea Development | Flexible wearable computer |
US5526022A (en) | 1993-01-06 | 1996-06-11 | Virtual I/O, Inc. | Sourceless orientation sensor |
US6853293B2 (en) | 1993-05-28 | 2005-02-08 | Symbol Technologies, Inc. | Wearable communication system |
US6811088B2 (en) * | 1993-05-28 | 2004-11-02 | Symbol Technologies, Inc. | Portable data collection system |
US6826532B1 (en) * | 1993-10-05 | 2004-11-30 | Snap-On Incorporated | Hands free automotive service system |
JP4001643B2 (en) * | 1993-10-05 | 2007-10-31 | スナップ−オン・テクノロジイズ・インク | Two-hand open type car maintenance equipment |
US7310072B2 (en) * | 1993-10-22 | 2007-12-18 | Kopin Corporation | Portable communication display device |
US5566272A (en) * | 1993-10-27 | 1996-10-15 | Lucent Technologies Inc. | Automatic speech recognition (ASR) processing using confidence measures |
US5991087A (en) | 1993-11-12 | 1999-11-23 | I-O Display System Llc | Non-orthogonal plate in a virtual reality or heads up display |
US5454063A (en) * | 1993-11-29 | 1995-09-26 | Rossides; Michael T. | Voice input system for data retrieval |
US5572401A (en) * | 1993-12-13 | 1996-11-05 | Key Idea Development L.L.C. | Wearable personal computer system having flexible battery forming casing of the system |
US5555490A (en) * | 1993-12-13 | 1996-09-10 | Key Idea Development, L.L.C. | Wearable personal computer system |
US6160666A (en) | 1994-02-07 | 2000-12-12 | I-O Display Systems Llc | Personal visual display system |
US5603065A (en) * | 1994-02-28 | 1997-02-11 | Baneth; Robin C. | Hands-free input device for operating a computer having mouthpiece with plurality of cells and a transducer for converting sound into electrical control signals |
US5903395A (en) | 1994-08-31 | 1999-05-11 | I-O Display Systems Llc | Personal visual display system |
US6463361B1 (en) * | 1994-09-22 | 2002-10-08 | Computer Motion, Inc. | Speech interface for an automated endoscopic system |
US7053752B2 (en) * | 1996-08-06 | 2006-05-30 | Intuitive Surgical | General purpose distributed operating room control system |
JPH08137428A (en) * | 1994-11-11 | 1996-05-31 | Nintendo Co Ltd | Image display device, image display system and program cartridge used for the same |
US5758322A (en) * | 1994-12-09 | 1998-05-26 | International Voice Register, Inc. | Method and apparatus for conducting point-of-sale transactions using voice recognition |
US5677834A (en) * | 1995-01-26 | 1997-10-14 | Mooneyham; Martin | Method and apparatus for computer assisted sorting of parcels |
IL112513A (en) * | 1995-02-01 | 1999-05-09 | Ald Advanced Logistics Dev Ltd | System and method for failure reporting and collection |
US5959611A (en) * | 1995-03-06 | 1999-09-28 | Carnegie Mellon University | Portable computer system with ergonomic input device |
US6567079B1 (en) | 1995-03-06 | 2003-05-20 | Carnegie Mellon University | Portable computer system with ergonomic input device |
US5860810A (en) * | 1995-04-14 | 1999-01-19 | Mcdonnell Douglas Helicopter Company | Automated instructional system for performing mechanical procedures |
US5991085A (en) | 1995-04-21 | 1999-11-23 | I-O Display Systems Llc | Head-mounted personal visual display apparatus with image generator and holder |
US5873070A (en) * | 1995-06-07 | 1999-02-16 | Norand Corporation | Data collection system |
USD383455S (en) * | 1995-08-31 | 1997-09-09 | Virtual I/O, Inc. | Head mounted display with headtracker |
USD381346S (en) * | 1995-09-13 | 1997-07-22 | Kopin Corporation | Head-mountable matrix display |
US20020118284A1 (en) * | 1995-10-02 | 2002-08-29 | Newman Edward G. | Video camera system |
US6486855B1 (en) | 1996-01-16 | 2002-11-26 | Iv Phoenix Group Inc. | Mounted display system |
TW395121B (en) * | 1996-02-26 | 2000-06-21 | Seiko Epson Corp | Personal wearing information display device and the display method using such device |
US6047301A (en) * | 1996-05-24 | 2000-04-04 | International Business Machines Corporation | Wearable computer |
US6911916B1 (en) * | 1996-06-24 | 2005-06-28 | The Cleveland Clinic Foundation | Method and apparatus for accessing medical data over a network |
US6642836B1 (en) * | 1996-08-06 | 2003-11-04 | Computer Motion, Inc. | General purpose distributed operating room control system |
AU720452B2 (en) * | 1996-08-15 | 2000-06-01 | Xybernaut Corporation | Mobile computer |
US5719743A (en) * | 1996-08-15 | 1998-02-17 | Xybernaut Corporation | Torso worn computer which can stand alone |
US5948047A (en) * | 1996-08-29 | 1999-09-07 | Xybernaut Corporation | Detachable computer structure |
US6029183A (en) * | 1996-08-29 | 2000-02-22 | Xybernaut Corporation | Transferable core computer |
US5999952A (en) * | 1997-08-15 | 1999-12-07 | Xybernaut Corporation | Core computer unit |
US5738547A (en) * | 1996-09-20 | 1998-04-14 | Russo; Ernest | Toy conversion structure |
US7321354B1 (en) | 1996-10-31 | 2008-01-22 | Kopin Corporation | Microdisplay for portable communication systems |
US7372447B1 (en) | 1996-10-31 | 2008-05-13 | Kopin Corporation | Microdisplay for portable communication systems |
US6486862B1 (en) * | 1996-10-31 | 2002-11-26 | Kopin Corporation | Card reader display system |
US6677936B2 (en) * | 1996-10-31 | 2004-01-13 | Kopin Corporation | Color display system for a camera |
CA2640647C (en) * | 1996-11-01 | 2010-12-14 | Embedded Technologies, Llc | Flexible wearable computer system |
US6018710A (en) * | 1996-12-13 | 2000-01-25 | Siemens Corporate Research, Inc. | Web-based interactive radio environment: WIRE |
WO1998029775A1 (en) * | 1997-01-02 | 1998-07-09 | Giora Kutz | A personal head mounted display device |
US5924069A (en) * | 1997-01-30 | 1999-07-13 | Lucent Technologies Inc. | Voice-control integrated field support data communications system for maintenance, repair and emergency services |
US5897618A (en) * | 1997-03-10 | 1999-04-27 | International Business Machines Corporation | Data processing system and method for switching between programs having a same title using a voice command |
US5893063A (en) * | 1997-03-10 | 1999-04-06 | International Business Machines Corporation | Data processing system and method for dynamically accessing an application using a voice command |
AU6869998A (en) | 1997-03-26 | 1998-10-20 | Via, Inc. | Wearable computer packaging configurations |
CA2218812A1 (en) * | 1997-04-14 | 1998-10-14 | Michael D. Jenkins | Mobile computer and system |
CA2286250C (en) * | 1997-04-15 | 2007-06-05 | Michael T. Perkins | A supportive belt system integrating computers, interfaces, and other devices |
AU684943B1 (en) * | 1997-05-20 | 1998-01-08 | Xybernaut Corporation | Hands-free, portable computer and system |
US6244015B1 (en) * | 1997-08-11 | 2001-06-12 | Kabushiki Kaisha Toshiba | Method of assembling plant |
US6353313B1 (en) * | 1997-09-11 | 2002-03-05 | Comsonics, Inc. | Remote, wireless electrical signal measurement device |
US5903396A (en) | 1997-10-17 | 1999-05-11 | I/O Display Systems, Llc | Intensified visual display |
US6476784B2 (en) | 1997-10-31 | 2002-11-05 | Kopin Corporation | Portable display system with memory card reader |
US6552704B2 (en) | 1997-10-31 | 2003-04-22 | Kopin Corporation | Color display with thin gap liquid crystal |
US6909419B2 (en) * | 1997-10-31 | 2005-06-21 | Kopin Corporation | Portable microdisplay system |
AU708668B2 (en) * | 1997-11-21 | 1999-08-12 | Xybernaut Corporation | A computer structure for accommodating a PC card |
USD414928S (en) * | 1998-02-17 | 1999-10-12 | Via, Inc. | Wearable computer |
US20040090423A1 (en) * | 1998-02-27 | 2004-05-13 | Logitech Europe S.A. | Remote controlled video display GUI using 2-directional pointing |
US6911969B1 (en) | 1998-05-01 | 2005-06-28 | Honeywell International Inc. | Handheld computer apparatus |
US6262889B1 (en) * | 1998-06-05 | 2001-07-17 | Xybernaut Corporation | Insulated mobile computer |
US6243076B1 (en) | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data |
CA2261900A1 (en) * | 1998-09-11 | 2000-03-11 | Xybernaut Corporation | Convertible wearable computer |
US6301593B1 (en) * | 1998-09-25 | 2001-10-09 | Xybernaut Corp. | Mobile computer with audio interrupt system |
US20020040377A1 (en) * | 1998-09-25 | 2002-04-04 | Newman Edward G. | Computer with audio interrupt system |
US6532482B1 (en) * | 1998-09-25 | 2003-03-11 | Xybernaut Corporation | Mobile computer with audio interrupt system |
US6650305B1 (en) | 1998-10-02 | 2003-11-18 | Honeywell Inc. | Wireless electronic display |
US6597346B1 (en) | 1998-10-02 | 2003-07-22 | Honeywell Inc. | Hand held computer with see-through display |
JP2000194726A (en) * | 1998-10-19 | 2000-07-14 | Sony Corp | Device, method and system for processing information and providing medium |
US8275617B1 (en) | 1998-12-17 | 2012-09-25 | Nuance Communications, Inc. | Speech command input recognition system for interactive computer display with interpretation of ancillary relevant speech query terms into commands |
US7206747B1 (en) | 1998-12-16 | 2007-04-17 | International Business Machines Corporation | Speech command input recognition system for interactive computer display with means for concurrent and modeless distinguishing between speech commands and speech queries for locating commands |
US6937984B1 (en) | 1998-12-17 | 2005-08-30 | International Business Machines Corporation | Speech command input recognition system for interactive computer display with speech controlled display of recognized commands |
US6192343B1 (en) | 1998-12-17 | 2001-02-20 | International Business Machines Corporation | Speech command input recognition system for interactive computer display with term weighting means used in interpreting potential commands from relevant speech terms |
US6233560B1 (en) | 1998-12-16 | 2001-05-15 | International Business Machines Corporation | Method and apparatus for presenting proximal feedback in voice command systems |
US7035897B1 (en) * | 1999-01-15 | 2006-04-25 | California Institute Of Technology | Wireless augmented reality communication system |
US6424357B1 (en) * | 1999-03-05 | 2002-07-23 | Touch Controls, Inc. | Voice input system and method of using same |
US20030124200A1 (en) * | 1999-06-22 | 2003-07-03 | Stone Kevin R. | Cartilage enhancing food supplements with sucralose and methods of preparing the same |
US6539098B1 (en) | 1999-09-24 | 2003-03-25 | Mailcode Inc. | Mail processing systems and methods |
US6527711B1 (en) | 1999-10-18 | 2003-03-04 | Bodymedia, Inc. | Wearable human physiological data sensors and reporting system therefor |
US6928329B1 (en) * | 2000-02-29 | 2005-08-09 | Microsoft Corporation | Enabling separate chat and selective enablement of microphone |
US7240093B1 (en) * | 2000-02-29 | 2007-07-03 | Microsoft Corporation | Use of online messaging to facilitate selection of participants in game play |
US6634551B2 (en) * | 2000-05-11 | 2003-10-21 | United Parcel Service Of America, Inc. | Delivery notice and method of using same |
US6994253B2 (en) * | 2000-05-11 | 2006-02-07 | United Parcel Service Of America | Systems and methods of item delivery utilizing a delivery notice |
US20040211834A1 (en) * | 2000-05-11 | 2004-10-28 | United Parcel Service Of America, Inc. | Systems and methods of modifying item delivery utilizing linking |
US7261690B2 (en) | 2000-06-16 | 2007-08-28 | Bodymedia, Inc. | Apparatus for monitoring health, wellness and fitness |
US7689437B1 (en) | 2000-06-16 | 2010-03-30 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
US20060122474A1 (en) * | 2000-06-16 | 2006-06-08 | Bodymedia, Inc. | Apparatus for monitoring health, wellness and fitness |
WO2005029242A2 (en) | 2000-06-16 | 2005-03-31 | Bodymedia, Inc. | System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability |
US6605038B1 (en) | 2000-06-16 | 2003-08-12 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
EP1292218B1 (en) * | 2000-06-23 | 2006-04-26 | Bodymedia, Inc. | System for monitoring health, wellness and fitness |
AU762858B2 (en) * | 2000-06-30 | 2003-07-10 | Xybernaut Corporation | Multimedia I/O interface device for use at entertainment events |
US6925307B1 (en) | 2000-07-13 | 2005-08-02 | Gtech Global Services Corporation | Mixed-mode interaction |
JP2002032147A (en) * | 2000-07-14 | 2002-01-31 | Toshiba Corp | Computer system |
JP2002032212A (en) * | 2000-07-14 | 2002-01-31 | Toshiba Corp | Computer system and headset type display device |
JP3567864B2 (en) * | 2000-07-21 | 2004-09-22 | 株式会社デンソー | Voice recognition device and recording medium |
US8165867B1 (en) | 2000-09-15 | 2012-04-24 | Fish Robert D | Methods for translating a device command |
US6443347B1 (en) | 2000-10-19 | 2002-09-03 | International Business Machines Corporation | Streamlined personal harness for supporting a wearable computer and associated equipment on the body of a user |
US6522531B1 (en) | 2000-10-25 | 2003-02-18 | W. Vincent Quintana | Apparatus and method for using a wearable personal computer |
US6956614B1 (en) * | 2000-11-22 | 2005-10-18 | Bath Iron Works | Apparatus and method for using a wearable computer in collaborative applications |
US20020068604A1 (en) * | 2000-12-04 | 2002-06-06 | Prabhakar Samuel Muthiah | Wearable data device for use in a wearable data network |
US6885991B2 (en) * | 2000-12-07 | 2005-04-26 | United Parcel Service Of America, Inc. | Telephony-based speech recognition for providing information for sorting mail and packages |
US20030101059A1 (en) * | 2000-12-08 | 2003-05-29 | Heyman Martin D. | System and method of accessing and recording messages at coordinate way points |
US6962277B2 (en) * | 2000-12-18 | 2005-11-08 | Bath Iron Works Corporation | Apparatus and method for using a wearable computer in testing and diagnostic applications |
US6561845B2 (en) * | 2000-12-27 | 2003-05-13 | International Business Machines Corporation | Distributed connector system for wearable computers |
US20020087220A1 (en) * | 2000-12-29 | 2002-07-04 | Tveit Tor Andreas | System and method to provide maintenance for an electrical power generation, transmission and distribution system |
US6798391B2 (en) * | 2001-01-02 | 2004-09-28 | Xybernaut Corporation | Wearable computer system |
US20040201695A1 (en) * | 2001-02-15 | 2004-10-14 | Rei Inasaka | System for delivering news |
US20020143554A1 (en) * | 2001-03-12 | 2002-10-03 | Shaw-Yuan Hou | Voice-activated control device for intelligent instruments |
US6595929B2 (en) | 2001-03-30 | 2003-07-22 | Bodymedia, Inc. | System for monitoring health, wellness and fitness having a method and apparatus for improved measurement of heat flow |
US6507486B2 (en) | 2001-04-10 | 2003-01-14 | Xybernaut Corporation | Wearable computer and garment system |
US20020165005A1 (en) * | 2001-05-03 | 2002-11-07 | Interactive Imaging Systems, Inc. | Portable computing device |
US6552899B2 (en) | 2001-05-08 | 2003-04-22 | Xybernaut Corp. | Mobile computer |
US6958905B2 (en) * | 2001-06-12 | 2005-10-25 | Xybernaut Corporation | Mobile body-supported computer with battery |
US6583982B2 (en) | 2001-06-19 | 2003-06-24 | Xybernaut Corporation | Intrinsically safe enclosure and method |
KR100471057B1 (en) * | 2001-07-10 | 2005-03-08 | 삼성전자주식회사 | portable computer and method for reproducing video signal on screen thereof |
US6529372B1 (en) | 2001-08-17 | 2003-03-04 | Xybernaut Corp. | Wearable computer-battery system |
US20030083789A1 (en) * | 2001-10-25 | 2003-05-01 | Kalley Terrence D. | Product training and demonstration software application |
US20030090437A1 (en) * | 2001-11-12 | 2003-05-15 | Adams Michael Dewayne | Display system |
DE10161570A1 (en) * | 2001-12-14 | 2003-07-03 | Fette Wilhelm Gmbh | Method for instructing an operator in maintenance and repair work on a tablet press |
US6757156B2 (en) | 2002-03-06 | 2004-06-29 | Xybernaut Corporation | Ergonomic hand held display |
US6992566B2 (en) * | 2002-04-18 | 2006-01-31 | International Business Machines Corporation | Modular school computer system and method |
US7052799B2 (en) * | 2002-06-27 | 2006-05-30 | Vocollect, Inc. | Wearable terminal with a battery latch mechanism |
US6910911B2 (en) * | 2002-06-27 | 2005-06-28 | Vocollect, Inc. | Break-away electrical connector |
US7805114B1 (en) | 2002-07-17 | 2010-09-28 | Bath Iron Works Corporation | In situ re-configurable wireless communications system (IRCWCS) |
US7020508B2 (en) | 2002-08-22 | 2006-03-28 | Bodymedia, Inc. | Apparatus for detecting human physiological and contextual information |
US8663106B2 (en) | 2002-08-22 | 2014-03-04 | Bodymedia, Inc. | Non-invasive temperature monitoring device |
JP4975249B2 (en) * | 2002-10-09 | 2012-07-11 | ボディーメディア インコーポレイテッド | Device for measuring an individual's state parameters using physiological information and / or context parameters |
US20090177068A1 (en) * | 2002-10-09 | 2009-07-09 | Stivoric John M | Method and apparatus for providing derived glucose information utilizing physiological and/or contextual parameters |
US7890336B2 (en) | 2003-01-13 | 2011-02-15 | Northwestern University | Interactive task-sensitive assistant |
US7951409B2 (en) | 2003-01-15 | 2011-05-31 | Newmarket Impressions, Llc | Method and apparatus for marking an egg with an advertisement, a freshness date and a traceability code |
US7099749B2 (en) * | 2003-02-20 | 2006-08-29 | Hunter Engineering Company | Voice controlled vehicle wheel alignment system |
US7090134B2 (en) * | 2003-03-04 | 2006-08-15 | United Parcel Service Of America, Inc. | System for projecting a handling instruction onto a moving item or parcel |
US7063256B2 (en) * | 2003-03-04 | 2006-06-20 | United Parcel Service Of America | Item tracking and processing systems and methods |
US7182738B2 (en) | 2003-04-23 | 2007-02-27 | Marctec, Llc | Patient monitoring apparatus and method for orthosis and other devices |
US7742928B2 (en) * | 2003-05-09 | 2010-06-22 | United Parcel Service Of America, Inc. | System for resolving distressed shipments |
DE10341305A1 (en) * | 2003-09-05 | 2005-03-31 | Daimlerchrysler Ag | Intelligent user adaptation in dialog systems |
EP1667579A4 (en) * | 2003-09-12 | 2008-06-11 | Bodymedia Inc | Method and apparatus for measuring heart related parameters |
US7681046B1 (en) | 2003-09-26 | 2010-03-16 | Andrew Morgan | System with secure cryptographic capabilities using a hardware specific digital secret |
CN100464374C (en) * | 2003-10-13 | 2009-02-25 | 深圳国际技术创新研究院 | Wearing type DVD |
US7694151B1 (en) * | 2003-11-20 | 2010-04-06 | Johnson Richard C | Architecture, system, and method for operating on encrypted and/or hidden information |
US20050184954A1 (en) * | 2004-02-23 | 2005-08-25 | Adams Michael D. | Portable communication system |
US7561717B2 (en) * | 2004-07-09 | 2009-07-14 | United Parcel Service Of America, Inc. | System and method for displaying item information |
US7710395B2 (en) * | 2004-07-14 | 2010-05-04 | Alken, Inc. | Head-mounted pointing and control device |
US20060161392A1 (en) * | 2004-10-15 | 2006-07-20 | Food Security Systems, Inc. | Food product contamination event management system and method |
US7933554B2 (en) * | 2004-11-04 | 2011-04-26 | The United States Of America As Represented By The Secretary Of The Army | Systems and methods for short range wireless communication |
KR100679044B1 (en) * | 2005-03-07 | 2007-02-06 | 삼성전자주식회사 | User adaptive speech recognition method and apparatus |
US20060255795A1 (en) * | 2005-05-13 | 2006-11-16 | Higgins Robert F | Six-degree-of-freedom, integrated-coil AC magnetic tracker |
US20060282317A1 (en) * | 2005-06-10 | 2006-12-14 | Outland Research | Methods and apparatus for conversational advertising |
US20070015999A1 (en) * | 2005-07-15 | 2007-01-18 | Heldreth Mark A | System and method for providing orthopaedic surgical information to a surgeon |
JP4560463B2 (en) * | 2005-09-13 | 2010-10-13 | キヤノン株式会社 | Data processing apparatus, data processing method, and computer program |
US8635073B2 (en) * | 2005-09-14 | 2014-01-21 | At&T Intellectual Property I, L.P. | Wireless multimodal voice browser for wireline-based IPTV services |
US20070078678A1 (en) * | 2005-09-30 | 2007-04-05 | Disilvestro Mark R | System and method for performing a computer assisted orthopaedic surgical procedure |
US20070080930A1 (en) * | 2005-10-11 | 2007-04-12 | Logan James R | Terminal device for voice-directed work and information exchange |
US7395962B2 (en) * | 2005-10-28 | 2008-07-08 | United Parcel Service Of America, Inc. | Pick up notice and method of using same |
FR2894754A1 (en) * | 2005-12-14 | 2007-06-15 | St Microelectronics Sa | Telecommunication system for facilitating videophonic communication, has portable telephone connected to hands-free device and vocally controlled using preprogrammed voice labels, where device has screen to display choice of commands |
US8417185B2 (en) | 2005-12-16 | 2013-04-09 | Vocollect, Inc. | Wireless headset and method for robust voice data communication |
US7773767B2 (en) * | 2006-02-06 | 2010-08-10 | Vocollect, Inc. | Headset terminal with rear stability strap |
US7885419B2 (en) | 2006-02-06 | 2011-02-08 | Vocollect, Inc. | Headset terminal with speech functionality |
US20070182662A1 (en) * | 2006-02-07 | 2007-08-09 | Rostislav Alpin | Portable electronic advertising system and method of use |
US20070243457A1 (en) * | 2006-04-12 | 2007-10-18 | Andres Viduya | Electronic device with multiple battery contacts |
US8635082B2 (en) | 2006-05-25 | 2014-01-21 | DePuy Synthes Products, LLC | Method and system for managing inventories of orthopaedic implants |
US9514746B2 (en) * | 2006-09-26 | 2016-12-06 | Storz Endoskop Produktions Gmbh | System and method for hazard mitigation in voice-driven control applications |
WO2008101248A2 (en) * | 2007-02-16 | 2008-08-21 | Bodymedia, Inc. | Systems and methods for understanding and applying the physiological and contextual life patterns of an individual or set of individuals |
US8265949B2 (en) | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
US8594395B2 (en) | 2007-09-30 | 2013-11-26 | DePuy Synthes Products, LLC | System and method for fabricating a customized patient-specific surgical instrument |
USD626949S1 (en) | 2008-02-20 | 2010-11-09 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
WO2009132237A2 (en) * | 2008-04-25 | 2009-10-29 | Btsafety Llc | System and method of providing product quality and safety |
USD605629S1 (en) | 2008-09-29 | 2009-12-08 | Vocollect, Inc. | Headset |
US8386261B2 (en) | 2008-11-14 | 2013-02-26 | Vocollect Healthcare Systems, Inc. | Training/coaching system for a voice-enabled work environment |
US8160287B2 (en) | 2009-05-22 | 2012-04-17 | Vocollect, Inc. | Headset with adjustable headband |
US20100315329A1 (en) * | 2009-06-12 | 2010-12-16 | Southwest Research Institute | Wearable workspace |
US20110040660A1 (en) * | 2009-08-10 | 2011-02-17 | Allison Damon R | Monitoring And Management Of Lost Product |
US8438659B2 (en) | 2009-11-05 | 2013-05-07 | Vocollect, Inc. | Portable computing device and headset interface |
US9134534B2 (en) | 2010-02-28 | 2015-09-15 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including a modular image source |
US9097891B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment |
US9285589B2 (en) | 2010-02-28 | 2016-03-15 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered control of AR eyepiece applications |
US20120194550A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Sensor-based command and control of external devices with feedback from the external device to the ar glasses |
US10180572B2 (en) | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
US20120249797A1 (en) | 2010-02-28 | 2012-10-04 | Osterhout Group, Inc. | Head-worn adaptive display |
US9229227B2 (en) | 2010-02-28 | 2016-01-05 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a light transmissive wedge shaped illumination system |
US9091851B2 (en) | 2010-02-28 | 2015-07-28 | Microsoft Technology Licensing, Llc | Light control in head mounted displays |
US9223134B2 (en) | 2010-02-28 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US9128281B2 (en) | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
WO2011106797A1 (en) | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Projection triggering through an external marker in an augmented reality eyepiece |
US20150309316A1 (en) | 2011-04-06 | 2015-10-29 | Microsoft Technology Licensing, Llc | Ar glasses with predictive control of external device based on event input |
US9341843B2 (en) | 2010-02-28 | 2016-05-17 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a small scale image source |
US9759917B2 (en) | 2010-02-28 | 2017-09-12 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered AR eyepiece interface to external devices |
US9182596B2 (en) | 2010-02-28 | 2015-11-10 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
US9366862B2 (en) | 2010-02-28 | 2016-06-14 | Microsoft Technology Licensing, Llc | System and method for delivering content to a group of see-through near eye display eyepieces |
CN102232635B (en) * | 2010-04-22 | 2013-09-11 | 中国人民解放军总后勤部军需装备研究所 | Informationized battle dress adopting layered structure |
US10462651B1 (en) * | 2010-05-18 | 2019-10-29 | Electric Mirror, Llc | Apparatuses and methods for streaming audio and video |
US9686673B2 (en) * | 2010-05-18 | 2017-06-20 | Electric Mirror, Llc | Apparatuses and methods for streaming audio and video |
US8659397B2 (en) | 2010-07-22 | 2014-02-25 | Vocollect, Inc. | Method and system for correctly identifying specific RFID tags |
USD643400S1 (en) | 2010-08-19 | 2011-08-16 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
USD643013S1 (en) | 2010-08-20 | 2011-08-09 | Vocollect Healthcare Systems, Inc. | Body-worn mobile device |
TW201220055A (en) * | 2010-11-15 | 2012-05-16 | Wistron Corp | Method and system of power control |
CN102467216A (en) * | 2010-11-19 | 2012-05-23 | 纬创资通股份有限公司 | Power control method and power control system |
US10291660B2 (en) | 2010-12-31 | 2019-05-14 | Skype | Communication system and method |
US8963982B2 (en) * | 2010-12-31 | 2015-02-24 | Skype | Communication system and method |
US9717090B2 (en) | 2010-12-31 | 2017-07-25 | Microsoft Technology Licensing, Llc | Providing notifications of call-related services |
US10404762B2 (en) | 2010-12-31 | 2019-09-03 | Skype | Communication system and method |
JP6036334B2 (en) * | 2013-01-24 | 2016-11-30 | 株式会社島津製作所 | Head-mounted display device |
CN103309642A (en) * | 2013-06-09 | 2013-09-18 | 张家港市鸿嘉数字科技有限公司 | Method for operating tablet PC by sound recognition |
CN104678987A (en) * | 2013-12-31 | 2015-06-03 | 中国航空工业集团公司沈阳飞机设计研究所 | Voice interaction-based fault diagnosis method |
USD868797S1 (en) * | 2015-04-24 | 2019-12-03 | Amazon Technologies, Inc. | Display screen having a graphical user interface for product substitution |
JP6744025B2 (en) * | 2016-06-21 | 2020-08-19 | 日本電気株式会社 | Work support system, management server, mobile terminal, work support method and program |
US10198029B2 (en) | 2016-11-03 | 2019-02-05 | Smolding Bv | Wearable computer case and wearable computer |
US10471478B2 (en) | 2017-04-28 | 2019-11-12 | United Parcel Service Of America, Inc. | Conveyor belt assembly for identifying an asset sort location and methods of utilizing the same |
US20200154862A1 (en) * | 2017-07-20 | 2020-05-21 | Hewlett-Packard Development Company, L.P. | Retaining apparatuses comprising connectors |
US10061352B1 (en) * | 2017-08-14 | 2018-08-28 | Oculus Vr, Llc | Distributed augmented reality system |
US10048724B1 (en) | 2017-08-14 | 2018-08-14 | Tsai-Hsien YANG | Discrete type wearable computer |
US10528080B2 (en) * | 2017-12-19 | 2020-01-07 | Datalogic IP Tech, S.r.l. | Systems and methods for providing displays via a smart hand-strap accessory |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
USD940684S1 (en) * | 2019-03-24 | 2022-01-11 | Buddy Snow | Earphones |
KR20190100105A (en) * | 2019-08-09 | 2019-08-28 | 엘지전자 주식회사 | Electronic device |
JP2023050191A (en) * | 2021-09-29 | 2023-04-10 | 国立大学法人 東京大学 | Information processing device and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5163111A (en) * | 1989-08-18 | 1992-11-10 | Hitachi, Ltd. | Customized personal terminal device |
WO1993023801A1 (en) * | 1992-05-15 | 1993-11-25 | Mobila Technology Inc. | Flexible wearable computer |
US5305244A (en) * | 1992-04-06 | 1994-04-19 | Computer Products & Services, Inc. | Hands-free, user-supported portable computer |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4672667A (en) * | 1983-06-02 | 1987-06-09 | Scott Instruments Company | Method for signal processing |
US4722065A (en) * | 1984-03-30 | 1988-01-26 | Casio Computer Co., Ltd. | Electronically programmable calculator with memory package |
US5012407A (en) * | 1984-12-11 | 1991-04-30 | Finn Charles A | Computer system which accesses operating system information and command handlers from optical storage via an auxiliary processor and cache memory |
NL8502959A (en) * | 1985-08-26 | 1987-03-16 | Lely Nv C Van Der | ELECTRONIC DEVICE RESPONDING TO SOUND. |
US4969193A (en) * | 1985-08-29 | 1990-11-06 | Scott Instruments Corporation | Method and apparatus for generating a signal transformation and the use thereof in signal processing |
US4776016A (en) * | 1985-11-21 | 1988-10-04 | Position Orientation Systems, Inc. | Voice control system |
US4949274A (en) * | 1987-05-22 | 1990-08-14 | Omega Engineering, Inc. | Test meters |
US5003300A (en) * | 1987-07-27 | 1991-03-26 | Reflection Technology, Inc. | Head mounted display for miniature video display system |
US4916441A (en) * | 1988-09-19 | 1990-04-10 | Clinicom Incorporated | Portable handheld terminal |
US5025471A (en) * | 1989-08-04 | 1991-06-18 | Scott Instruments Corporation | Method and apparatus for extracting information-bearing portions of a signal for recognizing varying instances of similar patterns |
- 1992
- 1992-04-06 US US07863619 patent/US5305244B2/en not_active Expired - Lifetime
- 1994
- 1994-01-27 CA CA002114336A patent/CA2114336C/en not_active Expired - Fee Related
- 1994-02-04 ES ES94300856T patent/ES2164688T3/en not_active Expired - Lifetime
- 1994-02-04 AU AU54891/94A patent/AU661223B1/en not_active Ceased
- 1994-02-04 CN CNB941027708A patent/CN1154946C/en not_active Expired - Lifetime
- 1994-02-04 EP EP94300856A patent/EP0670537B1/en not_active Expired - Lifetime
- 1994-02-04 DK DK94300856T patent/DK0670537T3/en active
- 1994-02-04 DE DE69428264T patent/DE69428264T2/en not_active Expired - Fee Related
- 1998
- 1998-10-26 HK HK98111547A patent/HK1010759A1/en not_active IP Right Cessation
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5163111A (en) * | 1989-08-18 | 1992-11-10 | Hitachi, Ltd. | Customized personal terminal device |
US5305244A (en) * | 1992-04-06 | 1994-04-19 | Computer Products & Services, Inc. | Hands-free, user-supported portable computer |
US5305244B1 (en) * | 1992-04-06 | 1996-07-02 | Computer Products & Services I | Hands-free, user-supported portable computer |
US5305244B2 (en) * | 1992-04-06 | 1997-09-23 | Computer Products & Services I | Hands-free user-supported portable computer |
WO1993023801A1 (en) * | 1992-05-15 | 1993-11-25 | Mobila Technology Inc. | Flexible wearable computer |
Non-Patent Citations (4)
Title |
---|
"Die Hände werden frei", NACHRICHTEN ELEKTRONIK UND TELEMATIK, vol. 44, no. 4, April 1990 (1990-04-01), HEIDELBERG DE, pages 154, XP000115702 * |
"Prêt-à-porter", EOS MAGAZINE, no. 7/8, July 1992 (1992-07-01), GENT, BE, pages 102 - 107, XP000255867 * |
MEISEL: "Talk to your computer", BYTE, vol. 18, no. 11, October 1993 (1993-10-01), ST PETERBOROUGH US, pages 113 - 120, XP000396640 * |
SMAILAGIC ET AL: "The VuMan 2 Wearable Computer", IEEE DESIGN & TEST OF COMPUTERS, vol. 10, no. 3, September 1993 (1993-09-01), LOS ALAMITOS US, pages 56 - 67, XP000397411 * |
Cited By (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1315113A1 (en) * | 1995-10-02 | 2003-05-28 | Xybernaut Corporation | Hands-free, portable computer and system |
EP0790584A3 (en) * | 1996-02-13 | 1999-07-28 | Sanyo Electric Co. Ltd | A system and method for equipment design and production |
EP0790584A2 (en) * | 1996-02-13 | 1997-08-20 | Sanyo Electric Co. Ltd | A system and method for equipment design and production |
FR2845850A1 (en) * | 1996-08-02 | 2004-04-16 | Symbol Technologies Inc | MOBILE TERMINAL, SYSTEM COMPRISING SAME, AND METHOD FOR CREATING A MOBILE SITE |
US6961700B2 (en) | 1996-09-24 | 2005-11-01 | Allvoice Computing Plc | Method and apparatus for processing the output of a speech recognition engine |
EP1013157A1 (en) * | 1997-09-11 | 2000-06-28 | Comsonics, Inc. | Hands-free signal level meter |
EP1013157A4 (en) * | 1997-09-11 | 2003-08-13 | Comsonics Inc | Hands-free signal level meter |
EP1025946A1 (en) * | 1999-02-04 | 2000-08-09 | La Soudure Autogene Francaise | Protection mask/power supply for electric welding or cutting |
FR2789300A1 (en) * | 1999-02-04 | 2000-08-11 | Soudure Autogene Francaise | PROTECTIVE MASK / CURRENT GENERATOR ASSEMBLY FOR ELECTRIC ARC WELDING OR CUTTING |
DE10066477B3 (en) * | 1999-02-12 | 2016-06-30 | Fisher-Rosemount Systems, Inc. | Portable computer in a process control environment |
US7640007B2 (en) | 1999-02-12 | 2009-12-29 | Fisher-Rosemount Systems, Inc. | Wireless handheld communicator in a process control environment |
GB2346720B (en) * | 1999-02-12 | 2003-12-31 | Fisher Rosemount Systems Inc | A wearable computer in a process control environment |
US8125405B2 (en) | 1999-02-12 | 2012-02-28 | Fisher-Rosemount Systems, Inc. | Wearable computer in a process control environment |
DE10066478B3 (en) * | 1999-02-12 | 2016-06-30 | Fisher-Rosemount Systems, Inc. | Portable computer in a process control environment |
US6806847B2 (en) | 1999-02-12 | 2004-10-19 | Fisher-Rosemount Systems Inc. | Portable computer in a process control environment |
DE10006126B4 (en) * | 1999-02-12 | 2011-11-17 | Fisher-Rosemount Systems, Inc. | Portable computer in a process control environment |
US7230582B1 (en) | 1999-02-12 | 2007-06-12 | Fisher-Rosemount Systems, Inc. | Wearable computer in a process control environment |
US7245271B2 (en) | 1999-02-12 | 2007-07-17 | Fisher-Rosemount Systems, Inc. | Portable computer in a process control environment |
WO2000062222A1 (en) * | 1999-04-14 | 2000-10-19 | Syvox Corporation | Interactive voice unit for giving instructions to a worker |
EP1208920A1 (en) * | 2000-11-22 | 2002-05-29 | Heinrich Kuper Gmbh & Co Kg | Method and device for sorting carpets |
DE10100425A1 (en) * | 2000-12-13 | 2002-06-20 | Imelauer Heinz | Portable data transmission device that transmits recorded optical and/or acoustic information directly to the Internet or a comparable network |
GB2380044B (en) * | 2001-09-25 | 2004-01-07 | Draeger Safety Ag & Co Kgaa | A data communications system for wearers of masks/helmets |
GB2380044A (en) * | 2001-09-25 | 2003-03-26 | Draeger Safety Ag & Co Kgaa | Helmet or mask comprising a voice controlled display showing measurement data |
EP1603115A1 (en) | 2004-06-03 | 2005-12-07 | Nintendo Co., Limited | Speech command processing apparatus |
US8447605B2 (en) | 2004-06-03 | 2013-05-21 | Nintendo Co., Ltd. | Input voice command recognition processing apparatus |
EP2116966A1 (en) * | 2008-05-05 | 2009-11-11 | Rheinmetall Waffe Munition GmbH | System for voice-controlled, interactive support for maintenance work or the like |
US11432412B2 (en) | 2017-07-12 | 2022-08-30 | Hewlett-Packard Development Company, L.P. | VR/AR sleeves |
Also Published As
Publication number | Publication date |
---|---|
CA2114336A1 (en) | 1995-07-28 |
ES2164688T3 (en) | 2002-03-01 |
US5305244B2 (en) | 1997-09-23 |
CN1154946C (en) | 2004-06-23 |
US5305244A (en) | 1994-04-19 |
US5305244B1 (en) | 1996-07-02 |
DE69428264D1 (en) | 2001-10-18 |
HK1010759A1 (en) | 1999-06-25 |
AU661223B1 (en) | 1995-07-13 |
DK0670537T3 (en) | 2002-01-14 |
EP0670537B1 (en) | 2001-09-12 |
DE69428264T2 (en) | 2002-06-27 |
CN1106552A (en) | 1995-08-09 |
CA2114336C (en) | 1999-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0670537B1 (en) | Hands-free user-supported portable computer | |
CA2182239C (en) | Hands-free, portable computer and system | |
US5751260A (en) | Sensory integrated data interface | |
US6570588B1 (en) | Editing support system including an interactive interface | |
US6556971B1 (en) | Computer-implemented speech recognition system training | |
US5867817A (en) | Speech recognition manager | |
US20030182113A1 (en) | Distributed speech recognition for mobile communication devices | |
US5970448A (en) | Historical database storing relationships of successively spoken words | |
US5924069A (en) | Voice-control integrated field support data communications system for maintenance, repair and emergency services | |
US6457024B1 (en) | Wearable hypermedium system | |
JP2873268B2 (en) | A portable computer supported by the user without using the hands, and its method of operation |
CA2261905C (en) | Mobile computer with audio interrupt system | |
EP0460867A2 (en) | Multimedia interface and method for computer system | |
KR19990022423A (en) | Method for performing improved voice communication, and voice transmission system | |
US20030055535A1 (en) | Voice interface for vehicle wheel alignment system | |
US6532482B1 (en) | Mobile computer with audio interrupt system | |
KR100283931B1 (en) | Portable computer device and information retrieval and display method using the same | |
TW392114B (en) | Hands-free, user-supported portable computer |
CN114545759B (en) | Intelligent watch testing equipment | |
KR100301123B1 (en) | Hands-free Portable Computers and Systems |
WO2001039177A2 (en) | Distributed speech recognition for mobile communication devices | |
Baker | State-of-the-art speech recognition: US research and business update. |
AU736030B2 (en) | Mobile computer with audio interrupt system | |
MXPA97009845A (en) | Hands-free, portable computer and system |
Launey | The motor-handicapped support system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): BE CH DE DK ES FR GB IT LI NL SE |
|
17P | Request for examination filed |
Effective date: 19951212 |
|
17Q | First examination report despatched |
Effective date: 19990325 |
|
GRAG | Despatch of communication of intention to grant |
Free format text: ORIGINAL CODE: EPIDOS AGRA |
|
GRAH | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOS IGRA |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): BE CH DE DK ES FR GB IT LI NL SE |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REF | Corresponds to: |
Ref document number: 69428264 Country of ref document: DE Date of ref document: 20011018 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PUE Owner name: NEWMAN, EDWARD GEORGE; CHRISTIAN, GIL STEVEN; JENKINS, MICHAEL DAVID |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: IF02 |
|
REG | Reference to a national code |
Ref country code: DK Ref legal event code: T3 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: NV Representative's name: AMMANN PATENTANWAELTE AG BERN |
|
ET | Fr: translation filed |
NLS | Nl: assignments of ep-patents |
Owner name: XYBERNAUT CORPORATION; COMPUTER PRODUCTS & SERVICES, INC. |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: FG2A Ref document number: 2164688 Country of ref document: ES Kind code of ref document: T3 |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: PC2A |
|
RAP2 | Party data changed (patent owner data changed or rights of a patent transferred) |
Owner name: XYBERNAUT CORPORATION |
|
RIN2 | Information on inventor provided after grant (corrected) |
Free format text: NEWMAN, EDWARD GEORGE * CHRISTIAN, GIL STEVEN * JENKINS, MICHAEL DAVID |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
NLT2 | Nl: modifications (of names), taken from the European Patent Bulletin |
Owner name: XYBERNAUT CORPORATION |
|
26N | No opposition filed |
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: SE Payment date: 20021206 Year of fee payment: 10 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20040205 |
|
EUG | Se: european patent has lapsed |
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: ES Payment date: 20080215 Year of fee payment: 15 Ref country code: CH Payment date: 20080226 Year of fee payment: 15 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20080129 Year of fee payment: 15 Ref country code: GB Payment date: 20080327 Year of fee payment: 15 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20080122 Year of fee payment: 15 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: BE Payment date: 20080225 Year of fee payment: 15 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DK Payment date: 20090205 Year of fee payment: 16 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20090226 Year of fee payment: 16 |
|
BERE | Be: lapsed |
Owner name: *JENKINS MICHAEL DAVID Effective date: 20090228 Owner name: *CHRISTIAN GIL STEVEN Effective date: 20090228 Owner name: *NEWMAN EDWARD GEORGE Effective date: 20090228 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20090326 Year of fee payment: 16 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20090204 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090228 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090228 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: ST Effective date: 20091030 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090228 |
|
REG | Reference to a national code |
Ref country code: ES Ref legal event code: FD2A Effective date: 20090205 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090204 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090302 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090205 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: V1 Effective date: 20100901 |
|
REG | Reference to a national code |
Ref country code: DK Ref legal event code: EBP |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100901 Ref country code: DK Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100901 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20090204 |