US20150079562A1 - Presenting audio based on biometrics parameters - Google Patents
- Publication number
- US20150079562A1 (Application No. US 14/037,271)
- Authority
- US
- United States
- Prior art keywords
- user
- music
- exerciser
- processor
- threshold
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
- 230000001755 vocal effect Effects 0.000 claims description 37
- 230000037081 physical activity Effects 0.000 claims description 20
- 238000000034 method Methods 0.000 claims description 11
- 230000003247 decreasing effect Effects 0.000 claims description 6
- 230000000737 periodic effect Effects 0.000 claims description 2
- 229910003460 diamond Inorganic materials 0.000 description 20
- 239000010432 diamond Substances 0.000 description 20
- 238000004891 communication Methods 0.000 description 10
- 230000006855 networking Effects 0.000 description 10
- 230000008859 change Effects 0.000 description 7
- 230000000694 effects Effects 0.000 description 6
- 230000033001 locomotion Effects 0.000 description 5
- 238000012544 monitoring process Methods 0.000 description 5
- 230000036541 health Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 3
- 210000003128 head Anatomy 0.000 description 3
- 230000036651 mood Effects 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 230000003213 activating effect Effects 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- QVGXLLKOCUKJST-UHFFFAOYSA-N atomic oxygen Chemical compound [O] QVGXLLKOCUKJST-UHFFFAOYSA-N 0.000 description 2
- 239000008280 blood Substances 0.000 description 2
- 210000004369 blood Anatomy 0.000 description 2
- 230000036772 blood pressure Effects 0.000 description 2
- 238000013500 data storage Methods 0.000 description 2
- 230000007423 decrease Effects 0.000 description 2
- 230000001627 detrimental effect Effects 0.000 description 2
- 230000001815 facial effect Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 229910052760 oxygen Inorganic materials 0.000 description 2
- 239000001301 oxygen Substances 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 230000003252 repetitive effect Effects 0.000 description 2
- 239000007787 solid Substances 0.000 description 2
- 230000009182 swimming Effects 0.000 description 2
- 230000007704 transition Effects 0.000 description 2
- WQZGKKKJIJFFOK-GASJEMHNSA-N Glucose Natural products OC[C@H]1OC(O)[C@H](O)[C@@H](O)[C@@H]1O WQZGKKKJIJFFOK-GASJEMHNSA-N 0.000 description 1
- 238000012952 Resampling Methods 0.000 description 1
- 230000006978 adaptation Effects 0.000 description 1
- 230000003044 adaptive effect Effects 0.000 description 1
- 230000004075 alteration Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000004173 biogeochemical cycle Methods 0.000 description 1
- 230000036760 body temperature Effects 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 230000007177 brain activity Effects 0.000 description 1
- 230000001914 calming effect Effects 0.000 description 1
- 238000001816 cooling Methods 0.000 description 1
- 230000036757 core body temperature Effects 0.000 description 1
- 230000001351 cycling effect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 230000002526 effect on cardiovascular system Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 230000008921 facial expression Effects 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 239000008103 glucose Substances 0.000 description 1
- 230000003862 health status Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 230000009191 jumping Effects 0.000 description 1
- 210000004072 lung Anatomy 0.000 description 1
- 239000003550 marker Substances 0.000 description 1
- 230000037323 metabolic rate Effects 0.000 description 1
- 235000015097 nutrients Nutrition 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000029058 respiratory gaseous exchange Effects 0.000 description 1
- 210000001525 retina Anatomy 0.000 description 1
- 230000001020 rhythmical effect Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 230000002123 temporal effect Effects 0.000 description 1
- 238000001931 thermography Methods 0.000 description 1
- 210000003813 thumb Anatomy 0.000 description 1
- 238000012549 training Methods 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
- 238000010792 warming Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/22—Social work or social welfare, e.g. community support activities or counselling services
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G09B19/0038—Sports
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02438—Measuring pulse rate or heart rate with portable devices, e.g. worn by the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/0245—Measuring pulse rate or heart rate by using sensing means generating electric signals, i.e. ECG signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1123—Discriminating type of movement, e.g. walking or running
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
- A61B5/4815—Sleep quality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
- A61B5/7415—Sound rendering of measured values, e.g. by pitch or volume variation
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/14—Receivers specially adapted for specific applications
- G01S19/19—Sporting applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/60—Information retrieval; Database structures therefor; File system structures therefor of audio data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Z—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
- G16Z99/00—Subject matter not provided for in other main groups of this subclass
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B5/00—Near-field transmission systems, e.g. inductive or capacitive transmission systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B5/00—Near-field transmission systems, e.g. inductive or capacitive transmission systems
- H04B5/70—Near-field transmission systems, e.g. inductive or capacitive transmission systems specially adapted for specific purposes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/12—Arrangements for remote connection or disconnection of substations or of equipment thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0853—Network architectures or network communication protocols for network security for authentication of entities using an additional device, e.g. smartcard, SIM or a different communication terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1172—Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/14532—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue for measuring glucose, e.g. by tissue impedance measurement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/14542—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue for measuring blood gases
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/02—Details of telephonic subscriber devices including a Bluetooth interface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/04—Details of telephonic subscriber devices including near field communication means, e.g. RFID
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
Definitions
- the present application relates generally to digital ecosystems that are configured for use when engaging in physical activity and/or fitness exercises.
- Portable aids can be provided to improve exercise performance, provide inspiration, enable the sharing of exercise performance for social reasons, help fulfill a person's exercise goals, analyze and track exercise results, and provide virtual coaching to exercise participants in an easy, intuitive manner.
- In a first aspect, a device includes at least one computer readable storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the computer readable storage medium to execute the instructions.
- the instructions configure the processor for receiving signals from at least one biometric sensor of an exerciser, and based at least in part on the signals, outputting an audio cue on a speaker indicating to the exerciser to speed up or slow down.
- the biometric sensor may be a heart rate sensor.
- the processor when executing the instructions may be configured for determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor exceeds a threshold. Responsive to a determination that the heart rate exceeds the threshold, the processor may output an audio cue on the speaker to slow down, and responsive to a determination that the heart rate does not exceed the threshold, the processor may not output an audio cue on the speaker to slow down.
- the processor when executing the instructions may be configured for determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor is below a threshold. Responsive to a determination that the heart rate is below the threshold, the processor may output an audio cue on the speaker to speed up, and responsive to a determination that the heart rate exceeds the threshold, the processor may not output an audio cue on the speaker to speed up.
- the audio cue may be verbal.
- the audio cue may include music having a tempo that is increased or decreased, respectively, to indicate to the exerciser to speed up or slow down.
- the audio cue may include changing from playing a first music piece to playing a second music piece.
- the biometric sensor may include an exerciser breath sensor and/or an exerciser stride sensor.
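- The heart-rate threshold logic of the first aspect above might be sketched as follows. This is an illustrative outline only; the function names and the 90-165 bpm zone are hypothetical assumptions, not values taken from the patent.

```python
# Illustrative sketch of the heart-rate threshold cue logic described above.
# Thresholds and names are hypothetical; the patent does not prescribe an implementation.
from typing import Optional

def select_cue(heart_rate_bpm: float,
               upper_threshold_bpm: float,
               lower_threshold_bpm: float) -> Optional[str]:
    """Return 'slow down', 'speed up', or None when no cue is needed."""
    if heart_rate_bpm > upper_threshold_bpm:
        return "slow down"      # heart rate exceeds threshold -> cue to slow down
    if heart_rate_bpm < lower_threshold_bpm:
        return "speed up"       # heart rate below threshold -> cue to speed up
    return None                 # within range -> no cue is output

# A 172 bpm reading against an example 90-165 bpm target zone:
print(select_cue(172.0, upper_threshold_bpm=165.0, lower_threshold_bpm=90.0))  # -> "slow down"
```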
- In another aspect, a method includes receiving signals from at least one biometric sensor representing a biometric parameter of an exerciser, and automatically transmitting signals to a speaker to present audible cues to the exerciser based on the signals from the biometric sensor.
- In still another aspect, a computer readable storage medium that is not a carrier wave bears instructions which, when executed by a processor, configure the processor to execute logic including accessing planned physical activity information for a person associated with a CE device including the processor, receiving at least one signal from at least one biometric sensor representing at least one biometric parameter of the person, and determining whether the biometric parameter conforms to at least a portion of the planned physical activity information.
- the instructions then configure the processor for, responsive to a determination that the biometric parameter does not conform to at least a portion of the planned physical activity information, automatically presenting on the CE device an indication that the person is not in conformance with the planned physical activity information.
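- A minimal sketch of the conformance check in this aspect is shown below, assuming a hypothetical `PlannedSegment` structure for the planned physical activity information; the patent does not prescribe any particular representation, and the numeric ranges are examples only.

```python
# Hypothetical sketch of checking a biometric reading against planned
# physical activity information, per the aspect described above.
from dataclasses import dataclass

@dataclass
class PlannedSegment:
    name: str
    min_hr: float   # lowest acceptable heart rate for this segment (bpm)
    max_hr: float   # highest acceptable heart rate for this segment (bpm)

def conforms(segment: PlannedSegment, heart_rate: float) -> bool:
    """True when the measured parameter is within the planned range."""
    return segment.min_hr <= heart_rate <= segment.max_hr

plan = PlannedSegment("warm-up", min_hr=100, max_hr=130)
if not conforms(plan, heart_rate=142):
    # here the CE device would present an indication of non-conformance
    print("Not in conformance with the planned warm-up segment")
```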
- FIG. 1 is a block diagram of an example system including an example CE device in accordance with present principles
- FIGS. 2-4 are example flowcharts of logic to be executed by a CE device for providing information and/or music to a user during physical activity in accordance with present principles
- FIG. 5 is an example flowchart of logic to be executed by a server for providing music and/or information to a CE device in accordance with present principles
- FIGS. 6-9 are example user interfaces (UIs) presentable on a CE device in accordance with present principles.
- FIGS. 10 and 11 are exemplary illustrations that demonstrate present principles.
- a system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components.
- the client components may include one or more computing devices including portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below.
- These client devices may employ, as non-limiting examples, operating systems from Apple, Google, or Microsoft.
- a Unix operating system may be used.
- These operating systems can execute one or more browsers such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers over a network such as the Internet, a local intranet, or a virtual private network.
- instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
- a processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers. Moreover, any logical blocks, modules, and circuits described herein can be implemented or performed, in addition to a general purpose processor, in or by a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
- a processor can be implemented by a controller or state machine or a combination of computing devices.
- Any software modules described by way of flow charts and/or user interfaces herein can include various sub-routines, procedures, etc. It is to be understood that logic divulged as being executed by a module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
- Logic when implemented in software can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc.
- a connection may establish a computer-readable medium.
- Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires.
- Such connections may include wireless communication connections including infrared and radio.
- a processor can access information over its input lines from data storage, such as the computer readable storage medium, and/or the processor accesses information wirelessly from an Internet server by activating a wireless transceiver to send and receive data.
- Data typically is converted from analog signals to digital and then to binary by circuitry between the antenna and the registers of the processor when being received and from binary to digital to analog when being transmitted.
- the processor then processes the data through its shift registers to output calculated data on output lines, for presentation of the calculated data on the CE device.
- a system having at least one of A, B, and C includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
- a computer ecosystem may be an adaptive and distributed socio-technical system that is characterized by its sustainability, self-organization, and scalability.
- Analogous to environmental ecosystems, which consist of biotic and abiotic components that interact through nutrient cycles and energy flows, complete computer ecosystems consist of hardware, software, and services that in some cases may be provided by one company, such as Sony Electronics.
- The goal of each computer ecosystem is to provide consumers with everything that may be desired, at least in part as services and/or software that may be exchanged via the Internet.
- The interconnectedness and sharing among elements of an ecosystem, such as applications within a computing cloud, provides consumers with increased capability to organize and access data and is expected to characterize efficient, integrative ecosystems going forward.
- these ecosystems may be used while engaged in physical activity to e.g. provide inspiration, goal fulfillment and/or achievement, automated coaching/training, health and exercise analysis, convenient access to data, group sharing (e.g. of fitness data), and increased accuracy of health monitoring, all while doing so in a stylish and entertaining manner.
- the devices disclosed herein are understood to be capable of making diagnostic determinations based on data from various sensors (such as those described below in reference to FIG. 1 ) for use while exercising, for exercise monitoring (e.g. in real time), and/or for sharing of data with friends (e.g. using a social networking service) even when not all people have the same types and combinations of sensors on their respective CE devices.
- CE devices described herein may allow for easy and simplified user interaction with the device so as to not be unduly bothersome or encumbering e.g. before, during, and after an exercise.
- the CE device processors described herein can access information over its input lines from data storage, such as the computer readable storage medium, and/or the processor(s) accesses information wirelessly from an Internet server by activating a wireless transceiver to send and receive data.
- Data typically is converted from analog signals to digital and then to binary by circuitry between the antenna and the registers of the processor when being received and from binary to digital to analog when being transmitted.
- the processor then processes the data through its shift registers according to algorithms such as those described herein to output calculated data on output lines, for presentation of the calculated data on the CE device.
- the first of the example devices included in the system 10 is an example consumer electronics (CE) device 12 that may be waterproof (e.g., for use while swimming).
- CE device 12 may be, e.g., a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, a wearable computerized device such as e.g.
- the CE device 12 is configured to undertake present principles (e.g. communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein).
- the CE device 12 can include some or all of the components shown in FIG. 1 .
- the CE device 12 can include one or more touch-enabled displays 14 , one or more speakers 16 for outputting audio in accordance with present principles, and at least one additional input device 18 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the CE device 12 to control the CE device 12 .
- The example CE device 12 may also include one or more network interfaces 20 for communication over at least one network 22 such as the Internet, a WAN, a LAN, etc. under control of one or more processors 24.
- the processor 24 controls the CE device 12 to undertake present principles, including the other elements of the CE device 12 described herein such as e.g. controlling the display 14 to present images thereon and receiving input therefrom.
- the network interface 20 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, WiFi transceiver, etc.
- the CE device 12 may also include one or more input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the CE device 12 for presentation of audio from the CE device 12 to a user through the headphones.
- the CE device 12 may further include one or more tangible computer readable storage medium 28 such as disk-based or solid state storage, it being understood that the computer readable storage medium 28 may not be a carrier wave.
- the CE device 12 can include a position or location receiver such as but not limited to a GPS receiver and/or altimeter 30 that is configured to e.g.
- the CE device 12 may include one or more cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the CE device 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles (e.g. to share aspects of a physical activity such as hiking with social networking friends).
- The CE device 12 may also include a Bluetooth transceiver 34 and an NFC (Near Field Communication) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively.
- An example NFC element can be a radio frequency identification (RFID) element.
- the CE device 12 may include one or more motion sensors 37 (e.g., an accelerometer, gyroscope, cyclometer, magnetic sensor, infrared (IR) motion sensors such as passive IR sensors, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture command), etc.) providing input to the processor 24 .
- the CE device 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 (e.g.
- the CE device 12 may also include a kinetic energy harvester 42 to e.g. charge a battery (not shown) powering the CE device 12 .
- the system 10 may include one or more other CE device types such as, but not limited to, a computerized Internet-enabled bracelet 44 , computerized Internet-enabled headphones and/or ear buds 46 , computerized Internet-enabled clothing 48 , a computerized Internet-enabled exercise machine 50 (e.g. a treadmill, exercise bike, elliptical machine, etc.), etc. Also shown is a computerized Internet-enabled gymnasium entry kiosk 52 permitting authorized entry to a gymnasium housing the exercise machine 50 .
- CE devices included in the system 10 may respectively include some or all of the various components described above in reference to the CE device 12, such as but not limited to the biometric sensors and motion sensors described above, as well as the position receivers, cameras, input devices, and speakers also described above.
- For example, the headphones/ear buds 46 may include a heart rate sensor configured to sense a person's heart rate when the person is wearing the headphones.
- the clothing 48 may include sensors such as perspiration sensors, climate sensors, and heart sensors for measuring the intensity of a person's workout
- the exercise machine 50 may include a camera mounted on a portion thereof for gathering facial images of a user so that the machine 50 may thereby determine whether a particular facial expression is indicative of a user struggling to keep the pace set by the exercise machine 50 and/or an NFC element to e.g.
- The kiosk 52 may include an NFC element permitting entry to a person authenticated as being authorized for entry based on input received from a complementary NFC element (such as, e.g., the NFC element 36 on the device 12).
- All of the devices described in reference to FIG. 1, including a server 54 to be described shortly, may communicate with each other over the network 22 using a respective network interface included thereon, and may each also include a computer readable storage medium that may not be a carrier wave for storing logic and/or software code in accordance with present principles.
- At least one server 54 includes at least one processor 56 , at least one tangible computer readable storage medium 58 that may not be a carrier wave such as disk-based or solid state storage, and at least one network interface 60 that, under control of the processor 56 , allows for communication with the other CE devices of FIG. 1 over the network 22 , and indeed may facilitate communication therebetween in accordance with present principles.
- the network interface 60 may be, e.g., a wired or wireless modem or router, WiFi transceiver, or other appropriate interface such as, e.g., a wireless telephony transceiver.
- The server 54 may be an Internet server, may facilitate fitness coordination and/or data exchange between CE devices in accordance with present principles, and may include and perform "cloud" functions such that the CE devices of the system 10 may access a "cloud" environment via the server 54 in example embodiments to e.g. stream music to listen to while exercising and/or pair two or more devices (e.g. to "throw" music from one device to another).
- the logic begins at block 70 where the logic receives (e.g. planned) exercise information, planned physical activity information, planned exercise route information, etc. in accordance with present principles and as discussed herein (e.g. a user inputs the information using one of the user interfaces referenced herein).
- the logic may receive information pertaining to a planned exercise route (e.g. a jog) through the user's neighborhood (e.g.
- The logic at block 70 may receive information indicating that the user wishes to ride a bike for ten minutes at a moderately fast pace, then ten minutes at a very fast pace, then ten minutes of cooling-down time, and indeed may even specify, for each segment, the desired miles per hour at which the user wishes to bicycle.
- a user's personal trainer may set a workout routine at the trainer's CE device and then transmit the routine to the user's CE device for presentation thereon.
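- As an illustration only, the planned exercise information received at block 70 could be represented as a list of segments, such as the three-part bicycle routine described above; the segment labels and speed targets below are hypothetical examples, not values from the patent.

```python
# One possible (hypothetical) representation of the planned exercise
# information received at block 70.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExerciseSegment:
    label: str
    duration_min: float
    target_mph: Optional[float] = None   # desired miles per hour, if the user set one

# Hypothetical encoding of the three-segment bicycle routine described above
bike_routine = [
    ExerciseSegment("moderately fast", duration_min=10, target_mph=14.0),
    ExerciseSegment("very fast",       duration_min=10, target_mph=18.0),
    ExerciseSegment("cool down",       duration_min=10, target_mph=8.0),
]

total_minutes = sum(seg.duration_min for seg in bike_routine)   # 30 minutes
```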
- the logic proceeds to block 72 where the logic determines music (e.g. one or more music files stored on and/or accessible to the CE device) to match at least the (e.g. estimated or user-indicated/desired) tempo and/or cadence of at least the first segment of the user's exercise routine/information (e.g. using the example above, at least selects music matching a tempo for the user to bicycle at a moderately fast pace to begin the routine).
- the tempo to music matching may be e.g. initially based on an estimate by the CE device of a tempo/cadence the user should maintain to comport with the exercise information (e.g., a certain tempo for pedaling the exercise bicycle to maintain the desired speed).
- the tempo to music matching may be estimated at first and then later adjusted to match the actual cadence of the user after the beginning of the workout.
- the first song before a user takes his or her first step on a jog may contain a tempo that is estimated to be the pace the user will set and/or should maintain, and thereafter the next song's tempo may be matched to the actual pace of the user.
- a piece of music may be presented that includes one hundred fifty beats per minute for the user to thereby set his or her pace by moving one stride for every musical beat.
- tempo of the music itself may be determined by accessing metadata associated with the respective music file that contains tempo information (e.g., in beats per minute).
- the CE device may parse or otherwise access the music file to identify a tempo (e.g. identify a beat based on a repeated snare drum sound, inflections in a singer's voice, the changing of guitar chords, etc.), and then use the identified music tempo if it matches the user's pace/cadence (e.g. as close as possible, e.g. accounting for minor variances in the user's cadence as may naturally occur from step to step on a jog, or revolution to revolution on an exercise bicycle).
- the CE device may access all music files that are accessible to it (or e.g. a subset of files based on genre, artist, song length, etc.) to determine the beats per minute of each one, and then create a data table and/or metadata for later access by the CE device for efficiently identifying music with a tempo that matches the user's cadence at a given moment during an exercise routine without e.g. having to at that time parse the user's entire music library for music matching the user's cadence.
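- A minimal sketch of this tempo-matching idea follows, assuming a pre-built table of beats-per-minute metadata such as the one described above; the track names and BPM values are hypothetical.

```python
# Illustrative sketch of matching library music to a target cadence using
# pre-computed beats-per-minute metadata, as described above.
def build_bpm_table(library):
    """library: iterable of (track_name, beats_per_minute) pairs."""
    return dict(library)

def closest_track(bpm_table, target_cadence_spm):
    """Return the track whose tempo is closest to the target steps per minute."""
    return min(bpm_table, key=lambda track: abs(bpm_table[track] - target_cadence_spm))

table = build_bpm_table([("Song A", 128), ("Song B", 150), ("Song C", 170)])
print(closest_track(table, target_cadence_spm=150))   # -> "Song B"
```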
- the logic proceeds to block 74 where the logic receives an instruction to begin monitoring the user's exercise and thus to begin presenting music in accordance with e.g. the cadence of the user.
- the logic then proceeds to block 76 where the logic determines whether a turn is upcoming, e.g. a left or right turn the user should make to continue traveling on a pre-planned exercise route.
- A route may include, e.g., a fork in the road, a slight left turn, a U-turn, or, in the case of parkour, jumping to an upper tier of a structure.
- A non-verbal audio cue may also be associated with an instruction for the user to, e.g., continue going straight, such as at a road intersection.
- If the logic determines that a turn is not upcoming (e.g. not within a predefined threshold distance for turns set by a user prior to embarking on the exercise), the logic proceeds to block 77, where the logic continues monitoring the user's exercise and continues presenting music matched to the user's cadence in accordance with present principles. If, however, the logic determines that a turn is upcoming, the logic instead proceeds to block 78, where the logic notifies and/or cues the user of how to proceed using at least one non-verbal audio cue.
- a single beeping sound may be associated with a left turn (e.g. the user has preset the single beep to be associated with a left turn) while a double beeping sound may be associated with a right turn (e.g., the user having preset the double beep as well).
- the non-verbal cue may be presented in the left ear piece (only, or more prominently/loudly) to indicate a left turn should be made, and the right ear piece (only, or more prominently/loudly) to indicate a right turn should be made.
- Other non-verbal cues that may be presented to a user, e.g. in ear pieces in accordance with present principles, are haptic cues and/or vibrations, such that a non-verbal vibration cue (e.g. the ear piece vibrates based on a vibrator located in each respective ear piece that is in communication with the CE device's processor) may be presented in the left ear piece (only, or more prominently) to indicate that a left turn should be made, and in the right ear piece (only, or more prominently) to indicate that a right turn should be made.
- the non-verbal audio cue may be accompanied (e.g. immediately before or after the non-verbal audio cue) by a verbal cue such as an instruction to “turn left at the next street.”
- Note that the non-verbal audio cue need not be a single or double beep; other non-verbal audio cues may be used that themselves convey detailed information, such as an audible representation of Morse code providing turn information to the user.
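- The turn cues described above (single beep for a left turn, double beep for a right turn, optionally presented more prominently in the corresponding ear piece) might be mapped as in the following hypothetical sketch; the gain values are illustrative only.

```python
# Hypothetical mapping of upcoming-turn directions to non-verbal cues:
# a single beep biased to the left channel for a left turn, a double beep
# biased to the right channel for a right turn, as described above.
def turn_cue(direction):
    """Map a turn direction to (number_of_beeps, (left_gain, right_gain))."""
    if direction == "left":
        return 1, (1.0, 0.2)   # single beep, more prominent in the left ear piece
    if direction == "right":
        return 2, (0.2, 1.0)   # double beep, more prominent in the right ear piece
    return 1, (1.0, 1.0)       # e.g. continue straight: equal volume in both ears

beeps, (left_gain, right_gain) = turn_cue("right")   # -> 2 beeps, right ear louder
```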
- the logic proceeds to block 80 where the logic determines that another segment of the planned exercise/route has begun, and accordingly presents music matching the tempo/cadence of the user as he or she embarks on the next segment (e.g. actual cadence, or desired cadence based on exercise information determined by the user prior to embarking on the run).
- the logic may determine at block 80 that the user has transitioned from running on flat ground to running up a hill, and accordingly presents music with a slower tempo relative to the music presented while the user was on flat ground (e.g. and also based upon segment settings set by a user where the user indicated that a slower pace up the hill was desired relative to the user's pace on flat ground).
- music may be presented with a faster tempo than that presented when the user was on flat ground, thereby assisting the user with matching a running cadence to the music tempo to thus proceed up the hill at a pace desired by the user (e.g. also based on predefined settings by the user).
- The logic then proceeds to decision diamond 82, at which the logic determines whether a virtual opponent (if the user manipulated the CE device to present a representation of one while proceeding on the exercise) is approaching or moving away from the user.
- the user may set settings for a virtual opponent that represents the user's minimum preferred average pace or speed at which to exercise, and thus can determine based on the virtual opponent representation whether the user's actual pace has slowed below the minimum average pace based on a non-verbal audio cue including an up Doppler effect (e.g. sound frequency increasing) thereby indicating that the virtual opponent is approaching.
- The user can also determine that the virtual opponent is receding (e.g. the "virtual" distance separating the user and the virtual opponent is becoming larger) based on a non-verbal audio cue including a down Doppler effect (e.g. sound frequency decreasing).
- the Doppler effect sound may move from one earpiece of a headphone set to another (e.g. be presented more prominently in one ear piece, then fade in that ear piece and be increasingly more prominently presented in the other ear piece) to further signify the position of the virtual opponent.
- non-verbal Doppler cues need not be presented constantly during the exercise to indicate to the user where the virtual opponent is relative to the user, and may e.g. only be presented to the user responsive to a determination that the virtual opponent is within a threshold distance of the user (e.g. as set by the user prior to embarking on the exercise routine).
- the logic may revert back to decision diamond 76 and continue from there. If, however, the logic determines that a virtual opponent is approaching or moving away from the user in accordance with present principles, the logic moves to block 84 where at least one non-verbal audio cue that the virtual opponent is approaching or moving away from the user is presented on the CE device. Thereafter, the logic may revert from block 84 to decision diamond 76 and proceed from there.
- the non-verbal audio cue indicating the position of the virtual opponent may be accompanied by (e.g. presented concurrently with, before, and/or after) a verbal audio cue indicating the position of the virtual opponent.
- the non-verbal Doppler effect sounds may be accompanied by a verbal indication that “the virtual opponent is approaching.”
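- One way to realize the up/down Doppler cue described above is to shift the pitch of a reference tone using the standard Doppler formula for a moving source. The sketch below is an assumption about how this could be done, not the patent's specified method; the tone frequency and closing speeds are illustrative.

```python
# Sketch of the up/down Doppler cue for a virtual opponent: the pitch of a
# reference tone rises when the opponent is closing and falls when it recedes.
SPEED_OF_SOUND_MPS = 343.0   # approximate speed of sound in air

def doppler_cue_hz(base_tone_hz, closing_speed_mps):
    """closing_speed_mps > 0: opponent approaching (up Doppler);
       closing_speed_mps < 0: opponent receding (down Doppler)."""
    return base_tone_hz * SPEED_OF_SOUND_MPS / (SPEED_OF_SOUND_MPS - closing_speed_mps)

print(round(doppler_cue_hz(440.0, +1.5), 1))   # ~441.9 Hz -> opponent approaching
print(round(doppler_cue_hz(440.0, -1.5), 1))   # ~438.1 Hz -> opponent receding
```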
- planned exercise information that is received by the logic may include an (e.g. predefined) exercise segment time period (e.g. ten minutes), and the non-verbal cue may thus be and/or include a music segment (e.g. a music file or portion thereof) having a time period of substantially the exercise segment time period to e.g. inform the user of the time remaining for that particular segment.
- the music segment may begin at substantially the start of the exercise segment time period and end at substantially the end of the exercise segment time period.
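- Selecting a music segment whose duration approximately matches the exercise segment time period could be done as in this hypothetical sketch; the track library and tolerance are illustrative assumptions.

```python
# Illustrative selection of a music segment whose length roughly matches a
# planned exercise segment, so the music ending signals the end of the segment.
def pick_track_for_segment(tracks, segment_seconds, tolerance_seconds=30):
    """tracks: iterable of (name, duration_seconds); return the closest fit or None."""
    candidates = [(abs(duration - segment_seconds), name)
                  for name, duration in tracks
                  if abs(duration - segment_seconds) <= tolerance_seconds]
    return min(candidates)[1] if candidates else None

library = [("Track 1", 612), ("Track 2", 300), ("Track 3", 585)]
print(pick_track_for_segment(library, segment_seconds=600))   # -> "Track 1" (612 s)
```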
- Turning to FIG. 3, another example flowchart of logic to be executed by a CE device such as the CE device 12 in accordance with present principles is shown, this time for creating a playlist of music matching a user's cadence.
- the logic of FIG. 3 (and/or FIG. 4 ) may be combined with FIG. 2 in some implementations, and/or executed concurrently therewith.
- the logic of FIG. 3 begins at block 90 where the logic receives exercise information in accordance with present principles.
- the logic then proceeds to block 92 where the logic receives one or more biometric signals from one or more biometric sensors in communication with the CE device as set forth herein.
- The logic then proceeds to block 94, where the logic accesses music metadata indicating a music tempo for each of one or more music files, for matching the user's cadence with at least one music file having an at least substantially similar tempo in accordance with present principles. Thereafter, the logic proceeds to block 96, where the logic establishes a playlist including one or more music files having a tempo matching a desired cadence, actual cadence, etc. of the user. Also at block 96, the logic begins presenting the music of the playlist.
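- As an illustrative assumption of how the biometric signals received at block 92 could yield a cadence for blocks 94-96, steps per minute can be derived from timestamped step events reported by a stride or motion sensor; the sensor interface and sample values below are hypothetical.

```python
# Minimal sketch of deriving a cadence (steps per minute) from timestamped
# step events, for use when building a tempo-matched playlist.
def cadence_spm(step_timestamps_s):
    """Estimate steps per minute from an ascending list of step times (seconds)."""
    if len(step_timestamps_s) < 2:
        return None
    elapsed = step_timestamps_s[-1] - step_timestamps_s[0]
    steps = len(step_timestamps_s) - 1
    return 60.0 * steps / elapsed if elapsed > 0 else None

print(cadence_spm([0.0, 0.4, 0.8, 1.2]))   # -> 150.0 steps per minute
```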
- the logic proceeds to decision diamond 98 where the logic determines whether the user's cadence has changed (e.g. actual cadence, and/or estimated based on the transition from one exercise segment to another based on time and/or location such as beginning to proceed up a hill). If the logic determines at diamond 98 that the user's cadence has not changed, the logic proceeds to block 100 where the logic continues presenting music from the playlist of music of the same tempo or substantially similar tempo.
- the logic determines whether a biometric parameter of a user has exceeded a threshold, or is below a threshold, depending on the particular parameter, acceptable health ranges, user settings, etc. For instance, if the user's heart rate exceeds a heart rate threshold, that could be detrimental to the user's heart and the user may thus wish to be provided with a notification in such a case. As another example where a notification may be appropriate, if the user's core body temperature exceeds a temperature threshold (e.g. the user is too hot) or even falls beneath a threshold (e.g. the user is too cold), that could be detrimental to the user's brain and thus a notification of the user's temperature would be beneficial.
- If the logic determines that at least one biometric parameter does not exceed a threshold or is not below another threshold (e.g. the biometric parameter is within an acceptable range, healthy range, and/or user-desired range as input to the CE device by the user), the logic proceeds to block 100 and may subsequently proceed from there. If, however, the logic determines that a threshold has been breached, the logic instead moves to block 104 where the logic instructs the user to speed up the user's cadence/pace and/or slow down as may be appropriate depending on the biometric parameter to be brought within an acceptable range. Also note that at block 104, should the biometric parameter be dangerous to the user's health (e.g. based on a data table correlating as much), the logic may instead instruct the user to stop exercising completely and/or automatically, without user input, provide a notification to an emergency service along with location coordinates from a GPS receiver on the CE device.
- the logic proceeds to block 106 where the logic changes or alters the playlist (and may even entirely replace the previous playlist) to include music with a tempo selected to bring the user's biometric parameter within an acceptable range. For example, if the logic determines that a biometric parameter exceeds a threshold, and thus that the user needs to slow down, the playlist may be altered to present (e.g., from that point on) music with a slower tempo than was previously presented. Then, after block 106, the logic may revert back to decision diamond 98 and proceed again from there.
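- A compact sketch of this branch, assuming hypothetical heart-rate thresholds and a 10% tempo step that are not specified in the disclosure, might look like the following.

```python
def adjust_for_heart_rate(heart_rate, current_bpm,
                          hr_max=165, hr_min=110, tempo_step=0.10):
    """Sketch of the FIG. 3 branch: if a biometric threshold is breached, tell the
    exerciser to slow down or speed up and shift the playlist tempo accordingly.
    Threshold values and the 10% tempo step are illustrative, not from the patent."""
    if heart_rate > hr_max:
        # Too high: cue the user to slow down, and re-seed the playlist
        # with slower music than was previously presented.
        return ("slow down", current_bpm * (1.0 - tempo_step))
    if heart_rate < hr_min:
        return ("speed up", current_bpm * (1.0 + tempo_step))
    return (None, current_bpm)   # within the acceptable range: keep the playlist

print(adjust_for_heart_rate(heart_rate=172, current_bpm=160))  # ('slow down', 144.0)
```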
- Turning to FIG. 4, another example flowchart of logic to be executed by a CE device such as the CE device 12 in accordance with present principles is shown, again for presenting music with a tempo to match a user's cadence but this time based on a change in time and thus, e.g., a transition from one exercise segment to another.
- the logic of FIG. 4 begins at block 110 where the logic receives exercise information in accordance with present principles.
- the logic then proceeds to block 112 where the logic begins presenting music with a first tempo (e.g. first beat speed) for a first time to match a user's actual and/or desired cadence in accordance with present principles (e.g. after a user begins an exercise routine).
- the logic then proceeds to decision diamond 114 where the logic determines if the first (e.g. preset) time has expired at which the user was to exercise at the first tempo.
- the first time may be predefined by a user as input to the CE device prior to beginning the exercise routine. For instance, the user may provide input to the CE device to provide music of a certain tempo for ten minutes so that a user can match his or her cadence thereto, then present music of a relatively faster tempo for twenty minutes thereafter so that a user can increase his or her pace after ten minutes of warming up at a slower pace.
- If the logic determines at diamond 114 that the first time has not expired, the logic proceeds to block 116 where the logic continues presenting music at the same tempo as prior to the determination. If, however, the logic determines at diamond 114 that the first time has expired, the logic instead proceeds to block 118 where the logic presents music with a second tempo (e.g. second beat speed different than the first) for a second time to match a user's actual and/or desired cadence for the second time in accordance with present principles. The logic then proceeds to decision diamond 120 where the logic determines if the second time has expired at which the user was to exercise at the second tempo. If the logic determines at diamond 120 that the second time has not expired, the logic may proceed to block 116.
- If the logic determines at diamond 120 that the second time has expired, the logic instead proceeds to block 122 where the logic presents music with a third tempo (e.g. a third beat speed different than the first and second beat speeds, or just different than the second beat speed) for a third time to match a user's actual and/or desired cadence for the third time in accordance with present principles.
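- The timed-segment behavior of FIG. 4 can be sketched as a simple lookup over a schedule of (duration, tempo) pairs; the particular durations and tempos below are illustrative assumptions only.

```python
# Hypothetical schedule in the spirit of FIG. 4: each (minutes, bpm) pair is one
# exercise segment; none of these numbers come from the patent itself.
TEMPO_SCHEDULE = [(10, 120),   # warm-up at a slower tempo
                  (20, 160),   # main segment at a faster tempo
                  (5, 100)]    # cool-down

def tempo_for_elapsed(elapsed_min: float, schedule=TEMPO_SCHEDULE) -> int:
    """Return the tempo the music should have after `elapsed_min` minutes,
    i.e. the diamond 114/120 checks collapsed into a lookup."""
    boundary = 0.0
    for minutes, bpm in schedule:
        boundary += minutes
        if elapsed_min < boundary:
            return bpm
    return schedule[-1][1]     # past the last segment: stay at the final tempo

print(tempo_for_elapsed(4))    # 120 - still warming up
print(tempo_for_elapsed(12))   # 160 - first time expired, second tempo presented
```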
- FIG. 5 shows an example flowchart of logic to be executed by a server for providing music to a CE device with a tempo to match a user's cadence in accordance with present principles.
- the server logic of FIG. 5 begins at block 130 where the logic receives a request to access a user's account (e.g. such as a cloud storage account stored on the server). Assuming successful authentication of the CE device with the cloud account, access to the account is also provided at block 130 by the server. The logic then proceeds to block 132 where the logic receives tempo and/or cadence information (e.g. from the CE device).
- the logic locates and/or otherwise determines music files stored on the server that comport with the received tempo information.
- the music files that match the received tempo data may be determined as set forth herein (e.g. using music file metadata), and may be selected from locations including the user's cloud storage on the server but also, or in lieu of that, music in the public domain and/or music provided over e.g. a general publicly available music piece library and/or an Internet radio service.
- These music sources may or may not be used depending on e.g. settings set by a user at the CE device by manipulating a user interface in accordance with present principles.
- the logic proceeds to block 136 where the logic provides (e.g., streams) the music to the CE device, along with providing any corresponding purchase information for music files being provided e.g. that the user does not already own and/or is not in the user's cloud storage (e.g. based on determinations that the user does not own the music e.g. by searching the user's storage areas for the piece of music), such as music provided using an Internet radio service.
- the logic then proceeds to decision diamond 138 where the logic determines whether input has been received that was input at the CE device and transmitted to the server that indicates one or more music files have been designated (e.g., “bookmarked” by manipulating a user interface on the CE device and/or providing an audible command thereto) for purchase by the user. For instance, the user may want to designate a song for later purchasing so the user does not forget the details of the song he or she wished to purchase and hence cannot locate it later, but at the same time does not wish to complete all necessary purchase steps while still exercising such as e.g. providing credit card information.
- If the logic determines at decision diamond 138 that no input has been received to designate one or more music files for later purchasing, the logic proceeds to block 140 where the logic stores data indicating the music files provided to the CE device so that the same music files may be presented again at a later time should the user elect to do so by manipulating the user's CE device. Also at block 140 the logic may store any and/or all biometric information it has received from the CE device (e.g. for access by the user's physician to determine the user's health status or simply to maintain biometric records in the user's cloud storage).
- If, however, the logic determines thereat that input has been received to designate one or more music files for later purchasing, the logic moves to block 142 where it stores data indicating as much for later access by the user to use for purchasing the music (e.g. creates a “bookmark” file indicating the music files designated for purchase). Concluding the description of FIG. 5, note that after block 142 the logic may proceed to block 140.
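- A condensed, hypothetical sketch of the server-side flow of FIG. 5 (authenticate the account, match catalog tempos to the received cadence, and attach purchase information for music the user does not own) is shown below; the account data, catalog entries, and matching tolerance are stand-ins rather than details from the disclosure.

```python
# Hypothetical account store and catalog; real credential handling is omitted.
USER_CLOUD = {"alice": {"owned": {"Track A"}, "password": "secret"}}
CATALOG = [{"title": "Track A", "bpm": 160, "price": 0.0},
           {"title": "Track B", "bpm": 161, "price": 1.29}]

def serve_music(user, password, cadence_spm, tolerance=0.05):
    """Authenticate, find catalog tracks whose tempo comports with the received
    cadence, and attach purchase info for anything the user does not already own."""
    account = USER_CLOUD.get(user)
    if account is None or account["password"] != password:
        raise PermissionError("account access denied")   # block 130 access fails
    owned = account["owned"]
    results = []
    for track in CATALOG:
        # Locate music comporting with the received tempo information.
        if abs(track["bpm"] - cadence_spm) <= tolerance * cadence_spm:
            # Block 136: provide the track along with purchase info if not owned.
            results.append({"title": track["title"],
                            "purchase_price": None if track["title"] in owned
                            else track["price"]})
    return results

print(serve_music("alice", "secret", 162))
```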
- an exemplary user interface (UI) 150 configured for receiving input (e.g. touch input to a touch-enabled display presenting the UI 150 ) from a user to configure settings of a CE device in accordance with present principles is shown.
- the UI 150 includes a first setting 152 for configuring the CE device to match song lengths with workout segments (e.g. a set of crunches) and/or exercise route segments, and thus includes yes and no selector elements 154 for providing input on whether or not, respectively, the CE device is to match songs with segments.
- A second setting 156 is also shown for whether the CE device should provide virtual coaching instructions in accordance with present principles, and includes yes or no selector elements 158 for providing input on whether or not, respectively, the CE device should provide virtual coaching.
- the UI 150 may include a non-verbal cue section 160 .
- the section 160 may include left and right turn settings 162 , 164 , with respective input fields 166 , 168 for inputting a user-specified number of beeps (e.g. relatively high-pitched sounds separated by periods of no sound) that are to be provided to the user while proceeding on an exercise route to instruct the user where to turn in accordance with present principles.
- The settings 162, 164 include respective selector elements 170, 172 that are selectable to cause another UI and/or a window overlay to be presented for selecting from available sounds other than the “beeps” that may be used to indicate turns, and indeed it is to be understood that different sounds may be used to indicate turns in addition to or in lieu of differing sound sequences.
- the UI 150 also includes a setting 174 for a user to provide input using the yes or no selectors 176 regarding whether e.g. non-verbal turn cues should be presented in only the ear piece corresponding to the direction of the turn. For instance, a right turn non-verbal cue would only be presented in the right earpiece, whereas a left turn non-verbal cue would only be presented in the left earpiece of headphones.
- a race virtual opponent setting 178 may also be included in the UI 150 and includes yes and no selector elements 180 for a user to provide input on whether the user wishes to have virtual opponent data (e.g. indications of the location of the virtual opponent represented as non-verbal audio Doppler cues) presented on the CE device in accordance with present principles.
- a submit selector 182 may be presented for selection by a user for causing the CE device to be configured according to the user's selections as input using the UI 150 .
- the UI 190 includes a faster beat setting 192 , which includes gesture command selections 194 and voice command selections 196 each for different gesture and voice command options to provide input to the CE device to present a song with a faster beat than one being currently presented. Note that one or more of the selections for each of the gesture and voice commands may be selected, if desired, though e.g. the CE device may prevent selection of the same specific command for requesting both a faster beat and a slower beat (e.g. the same hand gesture could not be used for requesting a song with a faster beat and a slower beat).
- the UI 190 also includes a slower beat setting 198 , which includes gesture command selections 200 and voice command selections 202 each for different gesture and voice command options to provide input to the CE device to present a song with a slower beat than the one currently being presented.
- the UI 190 may also include an exercise machine configuration setting 204 for providing input to the CE device for whether the CE device is to change exercise machine configurations for an exercise machine (e.g. increasing or decreasing resistance, speed, incline or decline, etc.) being used by the user and in communication with the CE device (e.g., using NFC, Bluetooth, a wireless network, etc.) based on the user's biometrics and even e.g. user-defined settings for targeted and/or desired biometrics for particular exercises and/or user-defined settings for safe ranges of biometrics.
- the CE device may configure the exercise machine to increase or decrease its e.g. speed or resistance to bring the user's actual heart rate into conformance with the desired heart rate input by the user to the CE device.
- the setting 204 includes yes and no selector elements 206 for providing input to the CE device to command the CE device to change exercise machine configurations accordingly or not, respectively.
- the UI 190 also includes a select machine selector element 208 for selecting an exercise machine to be communicatively connected to and configured by the CE device, as well as an NFC selector element 210 that is selectable to configure the CE device to communicate with the exercise machine automatically upon close juxtaposition of the two (e.g. juxtaposition of respective NFC elements) to exchange information for the CE device to command and/or configure the exercise machine in accordance with present principles.
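- As one hedged illustration of the setting 204 behavior, the machine adjustment could be a simple step controller that nudges resistance toward a target heart rate; the deadband, step size, and resistance range below are assumptions, and the NFC/Bluetooth exchange with the machine is omitted.

```python
def next_resistance(current_resistance, heart_rate, target_hr,
                    deadband=5, step=1, lo=1, hi=20):
    """Sketch of the setting-204 idea: nudge the machine's resistance level so the
    measured heart rate drifts toward the user's target. The deadband, step size,
    and 1-20 resistance range are assumptions, and the actual communication with
    the machine (e.g. over NFC or Bluetooth) is not shown."""
    if heart_rate > target_hr + deadband:
        return max(lo, current_resistance - step)   # working too hard: ease off
    if heart_rate < target_hr - deadband:
        return min(hi, current_resistance + step)   # below target: add resistance
    return current_resistance                        # close enough: leave it alone

print(next_resistance(current_resistance=8, heart_rate=151, target_hr=140))  # 7
```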
- Moving from FIG. 7 to FIG. 8, an exemplary tempo matching settings UI 220 is shown, including plural settings for matching a user's cadence and/or heart rate with music of at least substantially the same tempo in accordance with present principles.
- the UI 220 includes at least a first setting 222 for matching tempo based on one or more biometric parameters, and accordingly includes a selection box 224 for a user to select one or more particular biometric parameters for such purposes.
- a second setting 226 is also shown for selecting one or more genres of music from which music will be selected by the CE device for presentation when being matched to a biometric parameter, and accordingly includes a selection box 228 for a user to select one or more music genres for such purposes.
- a third setting 230 is also shown for selecting one or more moods of the user which the CE device is to (e.g. intelligently) match with music of a corresponding mood, the music also including a matching tempo in accordance with present principles, and accordingly setting 230 includes a selection box 232 for a user to select one or more moods that the user is feeling for such purposes.
- a fourth setting 234 is included on the UI 220 as well, the setting 234 being for selecting one or more musical artists associated with music pieces to be selected by the CE device for presentation when being matched to a biometric parameter, and accordingly includes a selection box 236 for a user to select one or more artists for such purposes.
- a fifth setting 238 may be presented for selecting one or more previous exercise routine and/or workout music playlists that were previously presented in accordance with present principles from which music may be selected for the current exercise routine (e.g., if the CE device determines that the music from the previous playlist has a beat matching one or more current biometric parameters), and accordingly includes a selection box 240 for selecting one or more previous exercise routine playlists for such purposes.
- A setting 242 for matching music using the likes and/or preferences of social networking friends may also be configured using the UI 220, and accordingly includes respective yes and no selector elements 244 for providing input to the CE device for whether to match music to be presented with one or more biometric parameters based on likes from the user's social networking friends.
- To do so, the CE device may be configured to access one or more of the user's social networking services (e.g. based on username and password information provided by the user), to parse data in the social networking service, and to make correlations between social networking posts and e.g. track names.
- Still another setting 246 may be presented for matching music in accordance with present principles by using music that is currently popular based on e.g. Billboard ratings, total sales on an online music providing service, currently trending even if on a social networking site of which the user is not a member, etc., and accordingly includes yes and no selectors 248 for providing input to the CE device for whether to match music in accordance with present principles using currently popular music.
- the UI 220 may also include a cloud storage setting 250 with a cloud selector element 252 and a local storage selector element 254 that are both selectable by the user to provide input to the CE device for different storage locations from which the CE device may gather and/or stream music to be presented in accordance with present principles.
- selecting the selector element 252 configures the CE device to gather music from the user's cloud storage account
- selecting the selector element 254 configures the CE device to gather music from the CE device's local storage area, and indeed either or both of the selector elements 252 , 254 may be selected.
- the UI 220 may include still another setting 256 with yes and no selectors 258 for providing input to the CE device on whether to instruct a server to insert recommended music into a playlist and/or sequence of music to be presented during the exercise routine, including e.g. Internet radio music, sponsored music, music determined by the processor as being potentially likeable by the user (e.g. based on genre indications input by the user, similar music already owned by the user, etc.), music not owned by the user but nonetheless comporting with one or more other settings of the UI 220 (such as being from a genre from which the user desires music to be presented), etc.
- the UI 220 may include a bookmark music setting 260 for configuring the CE device to receive commands to designate one or more pieces of music that are presented during a workout routine for purchasing at a later time after the workout concludes.
- A gesture selector element 262 is selectable to configure the CE device to receive a (e.g. predefined) gesture command to designate music accordingly, as well as an audible command selector element 264 selectable to configure the CE device to receive an (e.g. predefined) audible command to designate music for purchasing, and even an entire playlist selector element 266 that is selectable to configure the CE device to, at a time after conclusion of the workout, present a listing of the entire playlist that was presented during the workout.
- selection of the selector elements 262 , 264 may automatically without further user input configure the CE device to present another UI and/or an overlaid UI for a user to specify one or more particular gestures and/or audible commands that are to be associated by the CE device as being a command(s) to designate/bookmark a particular piece of music when that particular piece of the music is presented in accordance with present principles.
- the CE device upon receiving the command may set a flag and/or data marker for the music to be identified at a later time and presented to the user as being previously bookmarked, and that in such instances the CE device need not present e.g. an audible or visual indication of bookmarking upon receiving the command that the piece of music is to be bookmarked (although in some implementations e.g. brief audible feedback such as a chime sound may be presented to indicate to the user that the CE device received the bookmark command and did indeed “bookmark” the piece of music for later purchasing).
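- A minimal sketch of such command handling, with hypothetical gesture and voice command names and an optional acknowledgement chime, follows.

```python
bookmarks = set()   # track identifiers flagged for purchase after the workout

def handle_command(command: str, current_track: str,
                   bookmark_commands=("double_nod", "bookmark that"),
                   chime=False):
    """Sketch of the setting-260 behavior: when a predefined gesture or audible
    command arrives while a track plays, set a data marker for later purchase.
    The command names and the optional chime are illustrative assumptions."""
    if command in bookmark_commands:
        bookmarks.add(current_track)
        if chime:
            print("\a", end="")        # brief audible acknowledgement, if enabled
        return True
    return False

handle_command("bookmark that", "Track B")
print(bookmarks)                        # {'Track B'}
```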
- a skipping music setting 268 is shown for skipping a piece of music the user does not like (e.g. if recommended to the user during an exercise routine).
- a gesture selector element 270 and an audible selector element 272 are both selectable for configuring the CE device to skip a piece of music being presented responsive to receiving a (e.g. predefined) gesture or audible command, respectively, indicating as much.
- each of the selector elements 270, 272 may be selectable to configure the CE device to present another UI and/or an overlaid UI for a user to specify one or more particular gestures and/or audible commands that are to be associated by the CE device as being a command(s) to skip a piece of music in accordance with present principles.
- the UI 220 also includes a share selector element 274 selectable to configure the CE device to automatically post, publish, and/or share, etc., over one or more social networking services the piece(s) of music and/or music playlist presented to the user while exercising upon completion of the exercise routine, it being understood that the CE device may also be configured to present on a display of the CE device the playlist e.g. after the workout routine has been completed, including presentation of music metadata and music tempos.
- Also shown is a submit selector element 276 for submitting the user's selections of settings in accordance with present principles.
- the UI 280 is shown for presenting current biometric information, music information, etc. while engaged in an exercise routine. It is to be understood that the UI 280 may thus be presented on the display of an exercise machine for viewing by the user while using the machine, and/or on the user's personal CE device that is in communication with the exercise machine.
- the UI 280 includes a music information section 282 including various pieces of information about a piece of music currently being presented that was matched by the CE device with one or more of the user's biometric parameters in accordance with present principles.
- the music information may include e.g.
- the CE device may access music e.g.
- the exercise machine may itself access a storage area storing music and then e.g. stream the music from the exercise machine to the user's headphones (e.g. using NFC pairing).
- the UI 280 also includes a biometric parameter section 284 for presenting one or more pieces of information related to the user's biometric parameters as detected by one or more biometric sensors such as those described above in reference to FIG. 1 .
- information that may be presented includes heart rate information, cadence information, and/or breathing information.
- the UI 280 may include a prompt 286 for a user to provide input using yes and no selectors 288 while a piece of music is being currently presented during the exercise routine to easily bookmark the piece of music for later purchasing (e.g., one touch bookmarking).
- the UI 280 includes a second prompt 290 for a user to provide input using yes and no selectors 292 while a piece of music is being currently presented during the exercise routine to automatically without further user input store the particular piece of music in the user's cloud storage once purchased or if purchasing is not necessary.
- an option 294 is presented on the UI 280 for whether to change exercise machine configurations manually using yes and no selectors 296, and thus e.g. selection of the yes selector from the selectors 296 may cause another UI to be presented and/or overlaid that includes exercise machine settings configurable by a user to configure the exercise machine. This may be desirable when e.g. the CE device automatically configures the exercise machine according to one or more biometric parameters in accordance with present principles but the user nonetheless wishes to manually override the automatic configuration.
- an exemplary illustration 300 that illustrates present principles is shown.
- a user and a CE device in accordance with present principles are audibly exchanging information and indeed the CE device is audibly providing a “virtual coach” to provide (e.g. intelligently determined) encouragement to the person shown in the illustration 300 and even encouragement based on e.g. biometric data.
- Another illustration 302 is shown in FIG. 11, which includes a graph 304 indicating the various segments of a user's workout routine represented in terms of heartbeats per minute over time, and which also shows thumbnails 306 sequentially arranged from first music presented to last music presented, where each one is respectively associated with a piece of music presented during the exercise routine and matched to the user's one or more biometric parameters, and/or an album from which the piece of music was selected.
- a caption 308 is also shown that indicates an example of audio feedback that may be presented by the CE device during a “cool down” exercise stage, identifies the song, and/or provides instruction on how to bookmark the music (e.g. for later purchasing and/or listening).
- the CE device may bookmark the music (e.g. by storing bookmark information locally on the CE device's storage medium).
- For instance, a single tap input by the person to a particular area of a touch-enabled display of the CE device (or any touch-enabled area) may be provided to bookmark the music, and furthermore a double tap input to a particular area of a touch-enabled display of the CE device (or any touch-enabled area) may be provided by the user to skip the song being presented and cause the CE device to automatically, without further user input, provide another song matching the user's biometric parameter(s) and/or cool down phase of the exercise routine.
- the CE devices disclosed herein may be configured in still other ways to match music with one or more biometric parameters. For instance, when determining whether a biometric parameter conforms to at least a portion of planned physical activity information, such determining may be executed e.g. periodically at a predefined periodic interval, where responsive to the determination that the biometric parameter does not conform to at least a portion of planned physical activity information, the CE device may automatically present an audio indication in accordance with present principles by altering the time scale of a music file being presented on the CE device.
- the CE device may digitally stretch or compress the currently presented music file to thereby adjust the beats per minute as presented to the user in real time.
- time stretching of the music file may be undertaken by the CE device, as may resampling of the music file to change the duration and hence beats per minute.
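- To make the resampling alternative concrete, the following sketch shows the arithmetic only (naive linear-interpolation resampling of a toy sample list); unlike true time stretching, this approach also shifts pitch, and real audio input/output is omitted.

```python
def resample_to_tempo(samples, source_bpm, target_bpm):
    """Naive linear-interpolation resampling: playing the result at the original
    sample rate changes the effective beats per minute from source_bpm to
    target_bpm (and, unlike true time stretching, also shifts the pitch).
    `samples` is a plain list of floats; real audio I/O is omitted."""
    ratio = source_bpm / target_bpm          # >1 stretches, <1 compresses
    out_len = int(len(samples) * ratio)
    out = []
    for i in range(out_len):
        pos = i / ratio                      # fractional index into the source
        j = int(pos)
        frac = pos - j
        nxt = samples[min(j + 1, len(samples) - 1)]
        out.append(samples[j] * (1 - frac) + nxt * frac)
    return out

beat = [0.0, 1.0, 0.0, -1.0] * 100                     # toy waveform
slower = resample_to_tempo(beat, source_bpm=160, target_bpm=150)
print(len(beat), len(slower))                           # 400 426
```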
- the CE device may present such information when the user configures settings for it to do so (e.g. using a UI such as the ones described above).
- Virtual coaching may include notifying a user when the user is transitioning from one exercise segment to another (e.g. based on GPS data accessible to the CE device while on an exercise route).
- the virtual coach may indicate, “You are starting to proceed up a hill, which is segment three of your exercise.”
- Other instructions that may be provided by a virtual coach include, e.g., at the beginning of an exercise routine, “Starting your ride now,” and “At the fork in the road ahead, turn right.”
- the CE device may provide an audio prompt at the beginning of the exercise routine asking whether the user wishes to race a virtual opponent (e.g., “Would you like to race against a virtual opponent?”), to which e.g. the user may audibly respond to in the affirmative as recognized by the CE device processor using natural language voice recognition principles.
- the CE device may indicate after conclusion of an exercise routine how much time, distance, and/or speed by which the user beat the virtual opponent. Also after conclusion of the routine, the CE device may e.g. audibly (and/or visually) provide statistics to the user such as the user's biometric readings, the total time to completion of the exercise routine, the distance traveled, etc. Even further, the CE device may just before conclusion of the exercise routine provide an audible indication that the routine is almost at conclusion by indicating a temporal countdown until finish such as, “Four, three, two, one . . . finished!”
- Regarding gestures in free space that are recognizable by the CE device as commands in accordance with present principles, note that not only may a user e.g. skip a song or request a song with a faster or slower pace based on gestures in free space detected by a motion/gesture detector communicating with the CE device, but the user may also e.g. pause a song if the user temporarily stops an exercise. For instance, if while proceeding on an exercise route the user happens upon a friend also walking therealong, the user may provide a gesture in free space predefined at the CE device as being a command to stop presenting music (and/or tracking biometric data) until another gesture command is received to resume presentation of the music.
- present principles recognize that although much of the present specification has been directed specifically to music-related files, present principles may apply equally to any type of audio file and even e.g. audio video files as well (e.g., presenting just the audio from an audio video file or presenting both audio and video).
- the metadata for music files described herein may include not only beats per minute and music genre but still other information as well such as e.g., the lyrics to the song.
- Present principles also recognize that although much of the specification has been directed specifically to exercise routines, present principles may apply not only to exercising but also e.g. sitting down at a desk, where the CE device can detect e.g. using a brain activity monitor and blood pressure monitor that a user is stressed and thus suggests and/or automatically presents calming music to the user.
- Feedback such as “Different song to get going?” may be presented responsive to a determination that the user is not keeping up a pace input by the user as being the desired pace.
- Such CE device feedback may also be provided e.g. for the user to gradually increase their tempo/cadence as a workout progresses from a lower intensity segment to a higher intensity segment.
- more than one CE device may provide e.g. non-verbal audio cues to set a pace/cadence for respective users exercising together. For example, two or more people may wish to exercise together but do not wish to listen to the same music.
- In such a case, the users' CE devices may communicate with each other so that, e.g. based on predefined cadence/tempo metadata exchanged therebetween (e.g. based on a desired cadence indicated by a user prior to the workout routine), different songs with the same beats per minute matching the predefined cadence may be presented on each respective CE device and the users may establish the same pace albeit with different music.
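- A toy sketch of that arrangement, assuming each device holds a different library and an exact shared target tempo is exchanged beforehand, is shown below.

```python
# Each runner's device picks its own track at the shared tempo; library contents
# and the +/-2 BPM match rule are illustrative assumptions.
def pick_track(library, target_bpm, tolerance=2):
    for track in library:
        if abs(track["bpm"] - target_bpm) <= tolerance:
            return track["title"]
    return None

runner_a = [{"title": "Rock song", "bpm": 161}]
runner_b = [{"title": "Dance song", "bpm": 160}]
shared_cadence = 160                                   # exchanged between devices
print(pick_track(runner_a, shared_cadence), pick_track(runner_b, shared_cadence))
```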
- the user may not only share the user's exercise routine over a social networking service but may also e.g. provide the exercise data to a personal trainer's CE device (e.g. using a commonly-used fitness application) so that the personal trainer may evaluate the user and view exercise results, biometric information, etc.
- If the user breaks cadence, the CE device, although detecting as much, may not automatically change songs to match the new cadence but in some implementations may e.g. wait for the expiration of a threshold time during which the user runs at the new cadence, thereby not changing songs every time the user accidentally breaks pace and instead changing songs once the user has intentionally established a new pace.
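- One way to implement such a hold-off, assuming a 20-second persistence window and a ±3 steps-per-minute tolerance that are not taken from the disclosure, is sketched below.

```python
import time

class CadenceDebouncer:
    """Only report a cadence change after it has persisted for `hold_s` seconds,
    so an accidental break in pace does not trigger a song change. The 20-second
    hold time and 3 spm tolerance are assumptions, not values from the patent."""
    def __init__(self, hold_s=20, tolerance=3):
        self.hold_s, self.tolerance = hold_s, tolerance
        self.current = None                   # cadence the music currently matches
        self.candidate, self.since = None, None

    def update(self, cadence_spm, now=None):
        now = time.monotonic() if now is None else now
        if self.current is None:
            self.current = cadence_spm
            return None
        if abs(cadence_spm - self.current) <= self.tolerance:
            self.candidate = None             # back at the old pace: do nothing
            return None
        if self.candidate is None or abs(cadence_spm - self.candidate) > self.tolerance:
            self.candidate, self.since = cadence_spm, now
            return None
        if now - self.since >= self.hold_s:   # new pace held long enough
            self.current = cadence_spm
            return self.current               # caller switches songs now
        return None

d = CadenceDebouncer()
d.update(160, now=0); d.update(150, now=5)
print(d.update(150, now=30))   # 150 - the new cadence stuck, so change the music
```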
- the CE devices described herein may be configured to dynamically without user input change from providing verbal cues to only providing non-verbal cues in some instances when e.g., after a threshold number of times making the same turn or otherwise exercising on the same route, the CE device determines that only non-verbal cues should be presented. This may be advantageous to a user who is already familiar with a neighborhood in which the user is exercising and hence does not necessarily need verbal cues but may nonetheless wish to have non-verbal ones presented that do not audibly interfere with the user's music as much as the verbal cues. Such determinations may be made e.g. at least in part by storing GPS data as the user proceeds along the route each time it is traveled which at a later time may be analyzed to determine whether the threshold number of times has been met.
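- A minimal sketch of that familiarity check, assuming a traversal count keyed by a route identifier derived from stored GPS traces and an arbitrary threshold of five traversals, follows.

```python
from collections import defaultdict

route_counts = defaultdict(int)     # times each route has been completed
FAMILIARITY_THRESHOLD = 5           # assumed; the disclosure leaves the number open

def record_completion(route_id: str) -> None:
    """Call once per completed traversal (route_id would in practice be derived
    from the stored GPS trace of the run)."""
    route_counts[route_id] += 1

def cue_style(route_id: str) -> str:
    """Verbal cues on unfamiliar routes; non-verbal only once the route has been
    traveled at least the threshold number of times."""
    return "non-verbal" if route_counts[route_id] >= FAMILIARITY_THRESHOLD else "verbal"

for _ in range(5):
    record_completion("neighborhood-loop")
print(cue_style("neighborhood-loop"))   # non-verbal
```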
- the headphones described herein may be configured to e.g. undertake active noise reduction on ambient noise present while exercising, while still allowing “transient” sounds like the sound generated by passing cars or someone talking to the exerciser to be heard by the exerciser.
- This headphone configuration thus promotes safety but still allows for clearly listening to music without unwanted ambient noises interfering with the user's listening enjoyment.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Theoretical Computer Science (AREA)
- Biophysics (AREA)
- Cardiology (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Physiology (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Human Resources & Organizations (AREA)
- General Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Multimedia (AREA)
- Entrepreneurship & Innovation (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Economics (AREA)
- Strategic Management (AREA)
- Physical Education & Sports Medicine (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Computer Security & Cryptography (AREA)
Abstract
A device includes at least one computer readable storage medium bearing instructions executable by a processor and at least one processor configured for accessing the computer readable storage medium to execute the instructions. The instructions configure the processor for receiving signals from at least one biometric sensor of an exerciser, and based at least in part on the signals, outputting an audio cue on a speaker indicating to the exerciser to speed up or slow down.
Description
- This application claims priority to U.S. provisional patent application Ser. No. 61/878,835, filed Sep. 17, 2013.
- The present application relates generally to digital ecosystems that are configured for use when engaging in physical activity and/or fitness exercises.
- Society is becoming increasingly health-conscious. A wide variety of exercise and workouts are now offered to encourage people to stay fit through exercise. As understood herein, while stationary exercise equipment often comes equipped with data displays for the information of the exerciser, the information is not tailored to the individual and is frequently repetitive and monotonous. As further understood herein, people enjoy listening to music as workout aids but the music typically is whatever is broadcast within a gymnasium or provided on a recording device the user may wear, again being potentially monotonous and unchanging in pattern and beat in a way that is uncoupled from the actual exercise being engaged in.
- Thus, while present principles recognize that consumer electronics (CE) devices may be used while engaged in physical activity to enhance the activity, most audio and/or visual aids are static in terms of not being tied to the actual exercise.
- Present principles recognize that portable aids can be provided to improve exercise performance, provide inspiration, enable the sharing of exercise performance for social reasons, help fulfill a person's exercise goals, analyze and track exercise results, and provide virtual coaching to exercise participants in an easy, intuitive manner.
- Accordingly, in a first aspect a device includes at least one computer readable storage medium bearing instructions executable by a processor and at least one processor configured for accessing the computer readable storage medium to execute the instructions. The instructions configure the processor for receiving signals from at least one biometric sensor of an exerciser, and based at least in part on the signals, outputting an audio cue on a speaker indicating to the exerciser to speed up or slow down.
- If desired, the biometric sensor may be a heart rate sensor. In such embodiments, the processor when executing the instructions may be configured for determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor exceeds a threshold. Responsive to a determination that the heart rate exceeds the threshold, the processor may output an audio cue on the speaker to slow down, and responsive to a determination that the heart rate does not exceed the threshold, the processor may not output an audio cue on the speaker to slow down.
- Also in some embodiments, the processor when executing the instructions may be configured for determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor is below a threshold. Responsive to a determination that the heart rate is below the threshold, the processor may output an audio cue on the speaker to speed up, and responsive to a determination that the heart rate exceeds the threshold, the processor may not output an audio cue on the speaker to speed up.
- Furthermore, if desired the audio cue may be verbal. Also in some embodiments, the audio cue may include music having a tempo that is increased or decreased, respectively, to indicate to the exerciser to speed up or slow down. Even further, in some embodiments the audio cue may include changing from playing a first music piece to playing a second music piece. Also if desired, in some embodiments the biometric sensor may include an exerciser breath sensor and/or an exerciser stride sensor.
- In another aspect, a method includes receiving signals from at least one biometric sensor representing a biometric parameter of an exerciser, and automatically transmitting signals to a speaker to present audible cues to the exerciser based on the signals from the biometric sensor.
- In still another aspect, a computer readable storage medium that is not a carrier wave bears instructions which when executed by a processor configure the processor to execute logic including accessing planned physical activity information for a person associated with a CE device including the processor, receiving at least one signal from at least one biometric sensor representing at least one biometric parameter of the person, and determining whether the biometric parameter conforms to at least a portion of the planned physical activity information. The instructions then configure the processor for, responsive to a determination that the biometric parameter does not conform to at least a portion of the planned physical activity information, automatically presenting on the CE device an indication that the person is not in conformance with the planned physical activity information.
- The details of the present invention, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
- FIG. 1 is a block diagram of an example system including an example CE device in accordance with present principles;
- FIGS. 2-4 are example flowcharts of logic to be executed by a CE device for providing information and/or music to a user during physical activity in accordance with present principles;
- FIG. 5 is an example flowchart of logic to be executed by a server for providing music and/or information to a CE device in accordance with present principles;
- FIGS. 6-9 are example user interfaces (UIs) presentable on a CE device in accordance with present principles; and
- FIGS. 10 and 11 are exemplary illustrations that demonstrate present principles.
- This disclosure relates generally to consumer electronics (CE) device based user information. With respect to any computer systems discussed herein, a system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may employ, as non-limiting examples, operating systems from Apple, Google, or Microsoft. A Unix operating system may be used. These operating systems can execute one or more browsers such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers over a network such as the Internet, a local intranet, or a virtual private network.
- As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
- A processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers. Moreover, any logical blocks, modules, and circuits described herein can be implemented or performed, in addition to a general purpose processor, in or by a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.
- Any software modules described by way of flow charts and/or user interfaces herein can include various sub-routines, procedures, etc. It is to be understood that logic divulged as being executed by a module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
- Logic, when implemented in software, can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
- In an example, a processor can access information over its input lines from data storage, such as the computer readable storage medium, and/or the processor accesses information wirelessly from an Internet server by activating a wireless transceiver to send and receive data. Data typically is converted from analog signals to digital and then to binary by circuitry between the antenna and the registers of the processor when being received and from binary to digital to analog when being transmitted. The processor then processes the data through its shift registers to output calculated data on output lines, for presentation of the calculated data on the CE device.
- Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
- “A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
- Before describing FIG. 1, it is to be understood that the CE devices and software described herein are understood to be usable in the context of a digital ecosystem. Thus, as understood herein, a computer ecosystem, or digital ecosystem, may be an adaptive and distributed socio-technical system that is characterized by its sustainability, self-organization, and scalability. Inspired by environmental ecosystems, which consist of biotic and abiotic components that interact through nutrient cycles and energy flows, complete computer ecosystems consist of hardware, software, and services that in some cases may be provided by one company, such as Sony Electronics. The goal of each computer ecosystem is to provide consumers with everything that may be desired, at least in part services and/or software that may be exchanged via the Internet. Moreover, interconnectedness and sharing among elements of an ecosystem, such as applications within a computing cloud, provides consumers with increased capability to organize and access data and presents itself as the future characteristic of efficient integrative ecosystems.
- Two general types of computer ecosystems exist: vertical and horizontal computer ecosystems. In the vertical approach, virtually all aspects of the ecosystem are associated with the same company (e.g. produced by the same manufacturer), and are specifically designed to seamlessly interact with one another. Horizontal ecosystems, on the other hand, integrate aspects such as hardware and software that are created by differing entities into one unified ecosystem. The horizontal approach allows for a greater variety of input from consumers and manufacturers, increasing the capacity for novel innovations and adaptations to changing demands. Regardless, it is to be understood that some digital ecosystems, including those referenced herein, may embody characteristics of both the horizontal and vertical ecosystems described above.
- Accordingly, it is to be further understood that these ecosystems may be used while engaged in physical activity to e.g. provide inspiration, goal fulfillment and/or achievement, automated coaching/training, health and exercise analysis, convenient access to data, group sharing (e.g. of fitness data), and increased accuracy of health monitoring, all while doing so in a stylish and entertaining manner. Further still, the devices disclosed herein are understood to be capable of making diagnostic determinations based on data from various sensors (such as those described below in reference to FIG. 1) for use while exercising, for exercise monitoring (e.g. in real time), and/or for sharing of data with friends (e.g. using a social networking service) even when not all people have the same types and combinations of sensors on their respective CE devices.
- Thus, it is to be understood that the CE devices described herein may allow for easy and simplified user interaction with the device so as to not be unduly bothersome or encumbering e.g. before, during, and after an exercise.
- It is to also be understood that the CE device processors described herein can access information over its input lines from data storage, such as the computer readable storage medium, and/or the processor(s) accesses information wirelessly from an Internet server by activating a wireless transceiver to send and receive data. Data typically is converted from analog signals to digital and then to binary by circuitry between the antenna and the registers of the processor when being received and from binary to digital to analog when being transmitted. The processor then processes the data through its shift registers according to algorithms such as those described herein to output calculated data on output lines, for presentation of the calculated data on the CE device.
- Now specifically referring to
FIG. 1 , anexample system 10 is shown, which may include one or more of the example devices mentioned above and described further below to enhance fitness experiences in accordance with present principles. The first of the example devices included in thesystem 10 is an example consumer electronics (CE)device 12 that may be waterproof (e.g., for use while swimming). TheCE device 12 may be, e.g., a computerized Internet enabled (“smart”) telephone, a tablet computer, a notebook computer, a wearable computerized device such as e.g. computerized Internet-enabled watch, a computerized Internet-enabled bracelet, other computerized Internet-enabled fitness devices, a computerized Internet-enabled music player, computerized Internet-enabled head phones, a computerized Internet-enabled implantable device such as an implantable skin device, etc., and even e.g. a computerized Internet-enabled television (TV). Regardless, it is to be understood that theCE device 12 is configured to undertake present principles (e.g. communicate with other CE devices to undertake present principles, execute the logic described herein, and perform any other functions and/or operations described herein). - Accordingly, to undertake such principles the
CE device 12 can include some or all of the components shown inFIG. 1 . For example, theCE device 12 can include one or more touch-enableddisplays 14, one ormore speakers 16 for outputting audio in accordance with present principles, and at least oneadditional input device 18 such as e.g. an audio receiver/microphone for e.g. entering audible commands to theCE device 12 to control theCE device 12. Theexample CE device 12 may also include one or more network interfaces 20 for communication over at least onenetwork 22 such as the Internet, an WAN, an LAN, etc. under control of one ormore processors 24. It is to be understood that theprocessor 24 controls theCE device 12 to undertake present principles, including the other elements of theCE device 12 described herein such as e.g. controlling thedisplay 14 to present images thereon and receiving input therefrom. Furthermore, note thenetwork interface 20 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, WiFi transceiver, etc. - In addition to the foregoing, the
CE device 12 may also include one ormore input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to theCE device 12 for presentation of audio from theCE device 12 to a user through the headphones. TheCE device 12 may further include one or more tangible computerreadable storage medium 28 such as disk-based or solid state storage, it being understood that the computerreadable storage medium 28 may not be a carrier wave. Also in some embodiments, theCE device 12 can include a position or location receiver such as but not limited to a GPS receiver and/oraltimeter 30 that is configured to e.g. receive geographic position information from at least one satellite and provide the information to theprocessor 24 and/or determine an altitude at which theCE device 12 is disposed in conjunction with theprocessor 24. However, it is to be understood that that another suitable position receiver other than a GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of theCE device 12 in e.g. all three dimensions. - Continuing the description of the
CE device 12, in some embodiments theCE device 12 may include one ormore cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into theCE device 12 and controllable by theprocessor 24 to gather pictures/images and/or video in accordance with present principles (e.g. to share aspects of a physical activity such as hiking with social networking friends). Also included on theCE device 12 may be aBluetooth transceiver 34 and other Near Field Communication (NFC)element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element. - Further still, the
CE device 12 may include one or more motion sensors 37 (e.g., an accelerometer, gyroscope, cyclometer, magnetic sensor, infrared (IR) motion sensors such as passive IR sensors, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture commands), etc.) providing input to the processor 24. The CE device 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 (e.g. heart rate sensors and/or heart monitors, calorie counters, blood pressure sensors, perspiration sensors, odor and/or scent detectors, fingerprint sensors, facial recognition sensors, iris and/or retina detectors, DNA sensors, oxygen sensors (e.g. blood oxygen sensors and/or VO2 max sensors), glucose and/or blood sugar sensors, sleep sensors (e.g. a sleep tracker), pedometers and/or speed sensors, body temperature sensors, nutrient and metabolic rate sensors, voice sensors, lung input/output and other cardiovascular sensors, etc.) also providing input to the processor 24. In addition to the foregoing, it is noted that in some embodiments the CE device 12 may also include a kinetic energy harvester 42 to e.g. charge a battery (not shown) powering the CE device 12. - Still referring to
FIG. 1, in addition to the CE device 12, the system 10 may include one or more other CE device types such as, but not limited to, a computerized Internet-enabled bracelet 44, computerized Internet-enabled headphones and/or ear buds 46, computerized Internet-enabled clothing 48, a computerized Internet-enabled exercise machine 50 (e.g. a treadmill, exercise bike, elliptical machine, etc.), etc. Also shown is a computerized Internet-enabled gymnasium entry kiosk 52 permitting authorized entry to a gymnasium housing the exercise machine 50. It is to be understood that the other CE devices included in the system 10, including those described in this paragraph, may respectively include some or all of the various components described above in reference to the CE device 12, such as but not limited to e.g. the biometric sensors and motion sensors described above, as well as the position receivers, cameras, input devices, and speakers also described above. - Thus, for instance, the headphones/
ear buds 46 may include a heart rate sensor configured to sense a person's heart rate when the person is wearing the headphones, and the clothing 48 may include sensors such as perspiration sensors, climate sensors, and heart sensors for measuring the intensity of a person's workout. The exercise machine 50 may include a camera mounted on a portion thereof for gathering facial images of a user, so that the machine 50 may thereby determine whether a particular facial expression is indicative of a user struggling to keep the pace set by the exercise machine 50, and/or an NFC element to e.g. pair the machine 50 with the CE device 12 and hence access a database of preset workout routines. The kiosk 52 may include an NFC element permitting entry to a person authenticated as being authorized for entry based on input received from a complementary NFC element (such as e.g. the NFC element 36 on the device 12). Also note that all of the devices described in reference to FIG. 1, including a server 54 to be described shortly, may communicate with each other over the network 22 using a respective network interface included thereon, and may each also include a computer readable storage medium, which may not be a carrier wave, for storing logic and/or software code in accordance with present principles. - Now in reference to the afore-mentioned at least one
server 54, it includes at least one processor 56, at least one tangible computer readable storage medium 58 (such as disk-based or solid state storage) that may not be a carrier wave, and at least one network interface 60 that, under control of the processor 56, allows for communication with the other CE devices of FIG. 1 over the network 22, and indeed may facilitate communication therebetween in accordance with present principles. Note that the network interface 60 may be, e.g., a wired or wireless modem or router, a WiFi transceiver, or another appropriate interface such as, e.g., a wireless telephony transceiver. - Accordingly, in some embodiments the
server 54 may be an Internet server, may facilitate fitness coordination and/or data exchange between CE devices in accordance with present principles, and may include and perform “cloud” functions such that the CE devices of the system 10 may access a “cloud” environment via the server 54 in example embodiments to e.g. stream music to listen to while exercising and/or pair two or more devices (e.g. to “throw” music from one device to another). - Turning now to
FIG. 2, an example flowchart of logic to be executed by a CE device such as the CE device 12 in accordance with present principles for presenting non-verbal audio cues is shown. The logic begins at block 70 where the logic receives (e.g. planned) exercise information, planned physical activity information, planned exercise route information, etc. in accordance with present principles and as discussed herein (e.g. a user inputs the information using one of the user interfaces referenced herein). For instance, at block 70 the logic may receive information pertaining to a planned exercise route (e.g. a jog) through the user's neighborhood (e.g. and may even use the user's previous average pace on past jogs), such as the user's desired pace, maximum time to completion of the route, etc. As another example, the logic at block 70 may receive information indicating that the user wishes to ride a bike for ten minutes at a moderately fast pace, then ten minutes at a very fast pace, then ten minutes of cooling down time, and indeed the information may even specify the desired miles per hour at which the user wishes to bicycle for each segment. As but one more example, a user's personal trainer may set a workout routine at the trainer's CE device and then transmit the routine to the user's CE device for presentation thereon. - In any case, after
block 70 the logic proceeds to block 72 where the logic determines music (e.g. one or more music files stored on and/or accessible to the CE device) to match at least the (e.g. estimated or user-indicated/desired) tempo and/or cadence of at least the first segment of the user's exercise routine/information (e.g. using the example above, at least selects music matching a tempo for the user to bicycle at a moderately fast pace to begin the routine). Note that the tempo to music matching may be e.g. initially based on an estimate by the CE device of a tempo/cadence the user should maintain to comport with the exercise information (e.g., a certain tempo for pedaling the exercise bicycle to maintain the desired speed). As another example, the tempo to music matching may be estimated at first and then later adjusted to match the actual cadence of the user after the beginning of the workout. As such, e.g. the first song before a user takes his or her first step on a jog may contain a tempo that is estimated to be the pace the user will set and/or should maintain, and thereafter the next song's tempo may be matched to the actual pace of the user. For instance and in terms of matching music to a user's actual pace, if the user is exercising at one hundred fifty strides per minute, a piece of music may be presented that includes one hundred fifty beats per minute for the user to thereby set his or her pace by moving one stride for every musical beat. - In addition, note that tempo of the music itself may be determined by accessing metadata associated with the respective music file that contains tempo information (e.g., in beats per minute). As another example, the CE device may parse or otherwise access the music file to identify a tempo (e.g. identify a beat based on a repeated snare drum sound, inflections in a singer's voice, the changing of guitar chords, etc.), and then use the identified music tempo if it matches the user's pace/cadence (e.g. as close as possible, e.g. accounting for minor variances in the user's cadence as may naturally occur from step to step on a jog, or revolution to revolution on an exercise bicycle). Thus, it may be appreciated that e.g. at a time prior to receiving exercise information at
block 70, the CE device may access all music files that are accessible to it (or e.g. a subset of files based on genre, artist, song length, etc.) to determine the beats per minute of each one, and then create a data table and/or metadata for later access by the CE device for efficiently identifying music with a tempo that matches the user's cadence at a given moment during an exercise routine without e.g. having to at that time parse the user's entire music library for music matching the user's cadence. - Still in reference to
FIG. 2, after block 72 the logic proceeds to block 74 where the logic receives an instruction to begin monitoring the user's exercise and thus to begin presenting music in accordance with e.g. the cadence of the user. The logic then proceeds to block 76 where the logic determines whether a turn is upcoming, e.g. a left or right turn the user should make to continue traveling on a pre-planned exercise route. Note that although the present example will be discussed in terms of making a turn, present principles apply equally to any alteration to a user's direction in order to continue following a route (e.g. a fork in the road, a slight left turn, a u-turn, jumping to an upper tier of a structure in the case of parkour, etc.). As an aside, also note that in some implementations a non-verbal audio cue may also be associated with an instruction for the user to e.g. continue going straight, such as at a road intersection. - Regardless, if the logic determines that a turn is not upcoming (e.g. not within a predefined threshold distance for turns set by a user prior to embarking on the exercise), the logic proceeds to block 77 where the logic continues monitoring the user's exercise and continues presenting music matched to the user's cadence in accordance with present principles. If, however, the logic determines that a turn is upcoming, the logic instead proceeds to block 78 where the logic notifies and/or cues the user of how to proceed using at least one non-verbal audio cue.
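- As a purely illustrative, non-limiting sketch (not part of the original disclosure), the determination at block 76 of whether a turn is upcoming within a user-set threshold distance might be approximated as follows; the function names, the haversine distance estimate, and the route data structure are assumptions for illustration only.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def upcoming_turn(position, route_turns, threshold_m=50.0):
    """Return the next un-passed turn if it is within the threshold distance, else None.

    position is a (lat, lon) tuple; route_turns is an ordered list of dicts such as
    {"lat": ..., "lon": ..., "direction": "left", "passed": False}.
    """
    for turn in route_turns:
        if turn["passed"]:
            continue
        d = distance_m(position[0], position[1], turn["lat"], turn["lon"])
        # Within the threshold: cue the user (block 78); otherwise keep monitoring (block 77).
        return turn if d <= threshold_m else None
    return None
```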
- For instance, a single beeping sound may be associated with a left turn (e.g. the user has preset the single beep to be associated with a left turn) while a double beeping sound may be associated with a right turn (e.g., the user having preset the double beep as well). In addition to or in lieu of the foregoing, should the user be wearing head phones such as the ones described above, the non-verbal cue may be presented in the left ear piece (only, or more prominently/loudly) to indicate a left turn should be made, and the right ear piece (only, or more prominently/loudly) to indicate a right turn should be made. In addition to or in lieu of the foregoing, other non-verbal cues that may be presented to a user e.g. in ear pieces in accordance with present principles are haptic non-verbal cues and/or vibrations such that e.g. a non-verbal vibration cue (e.g. the ear piece(s) vibrates based on a vibrator located in each respective ear piece that is in communication with the CE device's processor) may be presented on the left ear piece (only, or more prominently) to indicate a left turn should be made, and the right ear piece (only, or more prominently) to indicate a right turn should be made.
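- Continuing the same non-limiting illustration (again, not taken from the original disclosure), the mapping just described, in which a single or double beep and the more prominent ear piece encode the turn direction, might be expressed as a cue descriptor such as the following; the field names are hypothetical and would be consumed by whatever audio/haptic output path the CE device actually provides.

```python
def build_turn_cue(direction, use_haptics=False):
    """Build a non-verbal cue descriptor for a left or right turn.

    One beep denotes a left turn and two beeps a right turn (per the user's presets),
    and the cue is weighted toward the ear piece on the side of the turn.
    """
    if direction not in ("left", "right"):
        raise ValueError("direction must be 'left' or 'right'")
    return {
        "beeps": 1 if direction == "left" else 2,
        "left_gain": 1.0 if direction == "left" else 0.2,    # only/more prominently left
        "right_gain": 1.0 if direction == "right" else 0.2,  # only/more prominently right
        "vibrate_left": use_haptics and direction == "left",
        "vibrate_right": use_haptics and direction == "right",
    }

# A right turn rendered as a double beep, mostly in the right ear piece, with vibration.
print(build_turn_cue("right", use_haptics=True))
```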
- Also in addition to or in lieu of the foregoing, if desired the non-verbal audio cue may be accompanied (e.g. immediately before or after the non-verbal audio cue) by a verbal cue such as an instruction to “turn left at the next street.” Also note that the non-verbal audio cue need not be a single or double beep and that other non-verbal audio cues may be used that themselves indicate detailed information such as e.g. using an audible representation of Morse code to provide turn information to a user.
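- As one more hedged sketch of a non-verbal cue that itself carries detailed information, a Morse-coded beep pattern could be scheduled as below; the reduced code table, the symbols chosen for each maneuver, and the timing unit are illustrative assumptions only.

```python
# Minimal Morse table sufficient for short maneuver codes: L(eft), R(ight), S(traight), U(-turn).
MORSE = {"L": ".-..", "R": ".-.", "S": "...", "U": "..-"}

def morse_beep_schedule(symbol, unit_ms=80):
    """Translate one symbol into (duration_ms, tone_on) pairs: dot = 1 unit, dash = 3, gap = 1."""
    code, schedule = MORSE[symbol], []
    for i, mark in enumerate(code):
        schedule.append((unit_ms if mark == "." else 3 * unit_ms, True))
        if i < len(code) - 1:
            schedule.append((unit_ms, False))  # intra-character gap
    return schedule

print(morse_beep_schedule("R"))  # beep timings announcing a right turn
```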
- After
block 78, the logic proceeds to block 80 where the logic determines that another segment of the planned exercise/route has begun, and accordingly presents music matching the tempo/cadence of the user as he or she embarks on the next segment (e.g. actual cadence, or desired cadence based on exercise information determined by the user prior to embarking on the run). As an example, the logic may determine atblock 80 that the user has transitioned from running on flat ground to running up a hill, and accordingly presents music with a slower tempo relative to the music presented while the user was on flat ground (e.g. and also based upon segment settings set by a user where the user indicated that a slower pace up the hill was desired relative to the user's pace on flat ground). Conversely, if the user wished to “push it” up the hill, music may be presented with a faster tempo than that presented when the user was on flat ground, thereby assisting the user with matching a running cadence to the music tempo to thus proceed up the hill at a pace desired by the user (e.g. also based on predefined settings by the user). - In any case, after
block 80 the logic proceeds to decision diamond 82, at which the logic determines whether a virtual opponent (if the user manipulated the CE device to present a representation of one while proceeding on the exercise) is approaching or moving away from the user. For instance, the user may set settings for a virtual opponent that represents the user's minimum preferred average pace or speed at which to exercise, and thus can determine, based on the virtual opponent representation, that the user's actual pace has slowed below the minimum average pace based on a non-verbal audio cue including an up Doppler effect (e.g. sound frequency increasing), thereby indicating that the virtual opponent is approaching. Accordingly, the user can also determine that the virtual opponent is receding (e.g. that the “virtual” distance separating the user and the virtual opponent is becoming larger) based on a non-verbal audio cue including a down Doppler effect (e.g. sound frequency decreasing). Furthermore, in transitioning from increasing to decreasing and vice versa, the Doppler effect sound may move from one earpiece of a headphone set to another (e.g. be presented more prominently in one ear piece, then fade in that ear piece while being increasingly prominently presented in the other ear piece) to further signify the position of the virtual opponent. Also note that present principles recognize that such non-verbal Doppler cues need not be presented constantly during the exercise to indicate to the user where the virtual opponent is relative to the user, and may e.g. only be presented responsive to a determination that the virtual opponent is within a threshold distance of the user (e.g. as set by the user prior to embarking on the exercise routine). - Still in reference to
decision diamond 82, if the logic determines that a virtual opponent is not approaching or moving away from the user (e.g., the pace of the user and the “virtual” pace of the virtual opponent are identical or nearly identical, and/or the virtual opponent is not within a threshold distance for presenting any indication to the user of the location of the virtual opponent), the logic may revert to decision diamond 76 and continue from there. If, however, the logic determines that a virtual opponent is approaching or moving away from the user in accordance with present principles, the logic moves to block 84 where at least one non-verbal audio cue that the virtual opponent is approaching or moving away from the user is presented on the CE device. Thereafter, the logic may revert from block 84 to decision diamond 76 and proceed from there. - Before moving on to
FIG. 3 , note that the non-verbal audio cue indicating the position of the virtual opponent may be accompanied by (e.g. presented concurrently with, before, and/or after) a verbal audio cue indicating the position of the virtual opponent. For example, the non-verbal Doppler effect sounds may be accompanied by a verbal indication that “the virtual opponent is approaching.” - Also before moving on to
FIG. 3 , it is to be understood that e.g. planned exercise information that is received by the logic may include an (e.g. predefined) exercise segment time period (e.g. ten minutes), and the non-verbal cue may thus be and/or include a music segment (e.g. a music file or portion thereof) having a time period of substantially the exercise segment time period to e.g. inform the user of the time remaining for that particular segment. Thus, in some implementations the music segment may begin at substantially the start of the exercise segment time period and end at substantially the end of the exercise segment time period. - Continuing the detailed description in reference to
FIG. 3, another example flowchart of logic to be executed by a CE device such as the CE device 12 in accordance with present principles is shown, this time for creating a playlist of music matching a user's cadence. It is to be understood that the logic of FIG. 3 (and/or FIG. 4) may be combined with the logic of FIG. 2 in some implementations, and/or executed concurrently therewith. Regardless, the logic of FIG. 3 begins at block 90 where the logic receives exercise information in accordance with present principles. The logic then proceeds to block 92 where the logic receives one or more biometric signals from one or more biometric sensors in communication with the CE device as set forth herein. The logic then proceeds to block 94 where the logic accesses music metadata indicating a music tempo for each of one or more music files, for matching the user's cadence with at least one piece of music having an at least substantially similar tempo in accordance with present principles. Thereafter, the logic proceeds to block 96 where the logic establishes a playlist including one or more music files of music having a tempo matching a desired cadence, actual cadence, etc. of the user. Also at block 96, the logic begins presenting the music of the playlist. - After
block 96, the logic proceeds to decision diamond 98 where the logic determines whether the user's cadence has changed (e.g. the actual cadence, and/or an estimated cadence based on the transition from one exercise segment to another based on time and/or location, such as beginning to proceed up a hill). If the logic determines at diamond 98 that the user's cadence has not changed, the logic proceeds to block 100 where the logic continues presenting music from the playlist of music of the same or substantially similar tempo. If, however, the logic determines at diamond 98 that the user's cadence has changed, the logic instead proceeds to decision diamond 102 where the logic determines whether a biometric parameter of the user has exceeded a threshold, or is below a threshold, depending on the particular parameter, acceptable health ranges, user settings, etc. For instance, if the user's heart rate exceeds a heart rate threshold, that could be detrimental to the user's heart, and the user may thus wish to be provided with a notification in such a case. As another example where a notification may be appropriate, if the user's core body temperature exceeds a temperature threshold (e.g. the user is too hot) or even falls beneath a threshold (e.g. the user is too cold), that could be detrimental to the user's brain, and thus a notification of the user's temperature would be beneficial. - In any case, should the logic determine that no biometric parameter exceeds its upper threshold or falls below its lower threshold (e.g. each biometric parameter is within an acceptable range, healthy range, and/or user-desired range as input to the CE device by the user), the logic proceeds to block 100 and may subsequently proceed from there. If, however, the logic determines that a threshold has been breached, the logic instead moves to block 104 where the logic instructs the user to speed up and/or slow down the user's cadence/pace, as may be appropriate for the biometric parameter to be brought within an acceptable range. Also note that at
block 104 should the biometric parameter be dangerous to the user's health (e.g. based on a data table correlating as much), the logic may instead instruct the user to stop exercising completely and/or automatically without user input provide a notification to an emergency service along with location coordinates from a GPS receiver on the CE device. - Regardless, after
block 104 the logic proceeds to block 106 where the logic changes or alters the playlist (and even entirely replaces the previous playlist) to include music with a tempo matchable to bring the user's biometric parameter within an acceptable range. For example, if the logic determines that a biometric parameter exceeds a threshold, and thus that a user needs to slow down, the playlist may be altered to present (e.g., from that point on) music with a slower tempo than was previously presented. Then afterblock 106 the logic may revert back todecision diamond 98 and proceed again from there. For completeness before moving on toFIG. 4 , also note that based on a positive determination atdecision diamond 98, in other exemplary instances the logic may proceed directly to block 106 where the playlist is changed to match the user's current cadence, which has changed according to a positive determination made atdiamond 98. - Now in reference to
FIG. 4 , another example flowchart of logic to be executed by a CE device such as theCE device 12 in accordance with present principles is shown, again for presenting music with a tempo to match a user's cadence but this time based on a change in time and e.g. thus transition from one exercise segment to another. The logic ofFIG. 4 begins atblock 110 where the logic receives exercise information in accordance with present principles. The logic then proceeds to block 112 where the logic begins presenting music with a first tempo (e.g. first beat speed) for a first time to match a user's actual and/or desired cadence in accordance with present principles (e.g. after a user begins an exercise routine). The logic then proceeds todecision diamond 114 where the logic determines if the first (e.g. preset) time has expired at which the user was to exercise at the first tempo. Thus, it is to be understood that the first time, and indeed subsequent times, may be predefined by a user as input to the CE device prior to beginning the exercise routine. For instance, the user may provide input to the CE device to provide music of a certain tempo for ten minutes so that a user can match his or her cadence thereto, then present music of a relatively faster tempo for twenty minutes thereafter so that a user can increase his or her pace after ten minutes of warming up at a slower pace. - In any case, if the logic determines at
diamond 114 that the first time has not expired, the logic proceeds to block 116 where the logic continues presenting music at the same tempo as prior to the determination. If, however, the logic determines atdiamond 114 that the first time has expired, the logic instead proceeds to block 118 where the logic presents music with a second tempo (e.g. second beat speed different than the first) for a second time to match a user's actual and/or desired cadence for the second time in accordance with present principles. The logic then proceeds todecision diamond 120 where the logic determines if the second time has expired at which the user was to exercise at the second tempo. If the logic determines atdiamond 120 that the second time has not expired, the logic may proceed to block 116. If, however, the logic determines atdiamond 120 that the second time has expired, the logic instead proceeds to block 122 where the logic presents music with a third tempo (e.g. a third beat speed different than the first and second beat speeds, or just different than the second beat speed) for a third time to match a user's actual and/or desired cadence for the third time in accordance with present principles. - Continuing the detailed description in reference to
FIG. 5, it shows an example flowchart of logic to be executed by a server for providing music to a CE device with a tempo to match a user's cadence in accordance with present principles. The server logic of FIG. 5 begins at block 130 where the logic receives a request to access a user's account (e.g. a cloud storage account stored on the server). Assuming successful authentication of the CE device with the cloud account, access to the account is also provided at block 130 by the server. The logic then proceeds to block 132 where the logic receives tempo and/or cadence information (e.g. based on input from a biometric sensor on the CE device providing the information) to which music with a corresponding, at least substantially similar, tempo is to be matched. The logic then proceeds to block 134 where the logic locates and/or otherwise determines music files stored on the server that comport with the received tempo information. Note that at block 134 the music files that match the received tempo data may be determined as set forth herein (e.g. using music file metadata), and may be selected from locations including the user's cloud storage on the server and, also or in lieu of that, from music in the public domain and/or music provided over e.g. a generally publicly available music library and/or an Internet radio service. These music sources may or may not be used depending on e.g. settings set by a user at the CE device by manipulating a user interface in accordance with present principles. - In any case, after
block 134 the logic proceeds to block 136 where the logic provides (e.g., streams) the music to the CE device, along with providing any corresponding purchase information for music files being provided e.g. that the user does not already own and/or is not in the user's cloud storage (e.g. based on determinations that the user does not own the music e.g. by searching the user's storage areas for the piece of music), such as music provided using an Internet radio service. The logic then proceeds todecision diamond 138 where the logic determines whether input has been received that was input at the CE device and transmitted to the server that indicates one or more music files have been designated (e.g., “bookmarked” by manipulating a user interface on the CE device and/or providing an audible command thereto) for purchase by the user. For instance, the user may want to designate a song for later purchasing so the user does not forget the details of the song he or she wished to purchase and hence cannot locate it later, but at the same time does not wish to complete all necessary purchase steps while still exercising such as e.g. providing credit card information. - If the logic determines at
decision diamond 138 that no input has been received to designate one or more music files for later purchasing, the logic proceeds to block 140 where the logic stores data indicating the music files provided to the CE device so that the same music files may be presented again at a later time should the user elect to do so by manipulating the user's CE device. Also atblock 140 the logic may store any and/or all biometric information it has received from the CE device (e.g. for access by the user's physician to determine the user's health status or simply to maintain biometric records in the user's cloud storage). Referencingdecision diamond 138 again, if the logic determines thereat that input has been received to designate one or more music files for later purchasing, the logic moves to block 142 where it stores data indicating as much for later access by the user to use for purchasing the music (e.g. creates a “bookmark” file indicating the music files designated for purchase). Concluding the description ofFIG. 5 , note that afterblock 142 the logic may proceed to block 140. - Continuing the detailed description in reference to
FIG. 6 , an exemplary user interface (UI) 150 configured for receiving input (e.g. touch input to a touch-enabled display presenting the UI 150) from a user to configure settings of a CE device in accordance with present principles is shown. TheUI 150 includes afirst setting 152 for configuring the CE device to match song lengths with workout segments (e.g. a set of crunches) and/or exercise route segments, and thus includes yes and noselector elements 154 for providing input on whether or not, respectively, the CE device is to match songs with segments. Also shown on theUI 150 is asecond setting 156 for whether the CE device should provide virtual coaching instructions in accordance with present principles, and includes yes or noselector elements 158 for providing input on whether or not, respectively, the CE device should provide virtual coaching. - In addition to the foregoing, the
UI 150 may include a non-verbal cue section 160. The section 160 may include left and right turn settings and respective selector elements for configuring the non-verbal cues to be respectively associated with left and right turns. - The
UI 150 also includes a setting 174 for a user to provide input using the yes or noselectors 176 regarding whether e.g. non-verbal turn cues should be presented in only the ear piece corresponding to the direction of the turn. For instance, a right turn non-verbal cue would only be presented in the right earpiece, whereas a left turn non-verbal cue would only be presented in the left earpiece of headphones. A race virtual opponent setting 178 may also be included in theUI 150 and includes yes and noselector elements 180 for a user to provide input on whether the user wishes to have virtual opponent data (e.g. indications of the location of the virtual opponent represented as non-verbal audio Doppler cues) presented on the CE device in accordance with present principles. Last, note that a submitselector 182 may be presented for selection by a user for causing the CE device to be configured according to the user's selections as input using theUI 150. - Turning now to
FIG. 7 , anexemplary UI 190 for configuring gesture and/or voice control settings in accordance with present principles is shown. TheUI 190 includes a faster beat setting 192, which includesgesture command selections 194 andvoice command selections 196 each for different gesture and voice command options to provide input to the CE device to present a song with a faster beat than one being currently presented. Note that one or more of the selections for each of the gesture and voice commands may be selected, if desired, though e.g. the CE device may prevent selection of the same specific command for requesting both a faster beat and a slower beat (e.g. the same hand gesture could not be used for requesting a song with a faster beat and a slower beat). In any case, theUI 190 also includes a slower beat setting 198, which includesgesture command selections 200 andvoice command selections 202 each for different gesture and voice command options to provide input to the CE device to present a song with a slower beat than the one currently being presented. - In addition to the foregoing, the
UI 190 may also include an exercise machine configuration setting 204 for providing input to the CE device for whether the CE device is to change exercise machine configurations for an exercise machine (e.g. increasing or decreasing resistance, speed, incline or decline, etc.) being used by the user and in communication with the CE device (e.g., using NFC, Bluetooth, a wireless network, etc.) based on the user's biometrics and even e.g. user-defined settings for targeted and/or desired biometrics for particular exercises and/or user-defined settings for safe ranges of biometrics. For example, if the user indicated that he or she wished their heart rate to average a particular beats per minute, the CE device may configure the exercise machine to increase or decrease its e.g. speed or resistance to bring the user's actual heart rate into conformance with the desired heart rate input by the user to the CE device. Thus, the setting 204 includes yes and noselector elements 206 for providing input to the CE device to command the CE device to change exercise machine configurations accordingly or not, respectively. Also note that theUI 190 also includes a selectmachine selector element 208 for selecting an exercise machine to be communicatively connected to and configured by the CE device (e.g. by presenting another UI or overlay window for machine selection) and also a pair usingNFC selector element 210 that is selectable to configure the CE device to communicate with the exercise machine automatically upon close juxtaposition of the two (e.g. juxtaposition of respective NFC elements) to exchange information for the CE device to command and/or configure the exercise machine in accordance with present principles. - Moving from
FIG. 7 toFIG. 8 , it shows an exemplary tempomatching settings UI 220 including plural settings for matching a user's cadence and/or heart rate with music of at least substantially the same tempo in accordance with present principles. TheUI 220 includes at least afirst setting 222 for matching tempo based on one or more biometric parameters, and accordingly includes aselection box 224 for a user to select one or more particular biometric parameters for such purposes. Asecond setting 226 is also shown for selecting one or more genres of music from which music will be selected by the CE device for presentation when being matched to a biometric parameter, and accordingly includes aselection box 228 for a user to select one or more music genres for such purposes. Athird setting 230 is also shown for selecting one or more moods of the user which the CE device is to (e.g. intelligently) match with music of a corresponding mood, the music also including a matching tempo in accordance with present principles, and accordingly setting 230 includes a selection box 232 for a user to select one or more moods that the user is feeling for such purposes. Afourth setting 234 is included on theUI 220 as well, the setting 234 being for selecting one or more musical artists associated with music pieces to be selected by the CE device for presentation when being matched to a biometric parameter, and accordingly includes aselection box 236 for a user to select one or more artists for such purposes. Yet afifth setting 238 may be presented for selecting one or more previous exercise routine and/or workout music playlists that were previously presented in accordance with present principles from which music may be selected for the current exercise routine (e.g., if the CE device determines that the music from the previous playlist has a beat matching one or more current biometric parameters), and accordingly includes aselection box 240 for selecting one or more previous exercise routine playlists for such purposes. - It is to be understood that still other settings may be configured using the
UI 220, such as a setting 242 for matching music using the likes and/or preferences of social networking friends, and accordingly includes respective yes and noselector elements 244 for providing input to the CE device for whether to match music to be presented with one or more biometric parameters based on likes from the user's social networking friends. Note that e.g. the CE device may be configured to access one or more of the user's social networking services (e.g. based on username and password information provided by the user), to parse data in the social networking service, and make correlations between social networking posts and e.g. track names (e.g. from a database of track names) for musical tracks to thereby identify music that is “trending” or otherwise “liked” by the user's friends. Still another setting 246 may be presented for matching music in accordance with present principles by using music that is currently popular based on e.g. Billboard ratings, total sales on an online music providing service, currently trending even if on a social networking site of which the user is not a member, etc., and accordingly includes yes and noselectors 248 for providing input to the CE device for whether to match music in accordance with present principles using currently popular music. TheUI 220 may also include a cloud storage setting 250 with acloud selector element 252 and a localstorage selector element 254 that are both selectable by the user to provide input to the CE device for different storage locations from which the CE device may gather and/or stream music to be presented in accordance with present principles. Thus, selecting theselector element 252 configures the CE device to gather music from the user's cloud storage account, and selecting theselector element 254 configures the CE device to gather music from the CE device's local storage area, and indeed either or both of theselector elements UI 220 may include still another setting 256 with yes and noselectors 258 for providing input to the CE device on whether to instruct a server to insert recommended music into a playlist and/or sequence of music to be presented during the exercise routine, including e.g. Internet radio music, sponsored music, music determined by the processor as being potentially likeable by the user (e.g. based on genre indications input by the user, similar music already owned by the user, etc.), music not owned by the user but nonetheless comporting with one or more other settings of the UI 220 (such as being from a genre from which the user desires music to be presented), etc. - Still in reference to
FIG. 8 , in addition to the foregoing, theUI 220 may include a bookmark music setting 260 for configuring the CE device to receive commands to designate one or more pieces of music that are presented during a workout routine for purchasing at a later time after the workout concludes. Thus, a gesture selector element 262 is selectable to configure the CE device to receive a (e.g. predefined) gesture command to designate music accordingly, as well as an audiblecommand selector element 264 selectable to configure the CE device to receive an (e.g. predefined) audible command to designate music for purchasing, and even an entireplaylist selector element 266 that is selectable to configure the CE device to at a time after conclusion of the workout present a listing (e.g. playlist) of all music pieces that were presented to the user during the workout routine and from which a user may select one or more music pieces for purchasing. Note that in some embodiments, selection of theselector elements 262, 264 may automatically without further user input configure the CE device to present another UI and/or an overlaid UI for a user to specify one or more particular gestures and/or audible commands that are to be associated by the CE device as being a command(s) to designate/bookmark a particular piece of music when that particular piece of the music is presented in accordance with present principles. Thus, for instance, should a particular gesture be designated as a command when detected by the CE device to bookmark the music piece, the CE device upon receiving the command may set a flag and/or data marker for the music to be identified at a later time and presented to the user as being previously bookmarked, and that in such instances the CE device need not present e.g. an audible or visual indication of bookmarking upon receiving the command that the piece of music is to be bookmarked (although in some implementations e.g. brief audible feedback such as a chime sound may be presented to indicate to the user that the CE device received the bookmark command and did indeed “bookmark” the piece of music for later purchasing). - Still in reference to the
UI 220, a skipping music setting 268 is shown for skipping a piece of music the user does not like (e.g. if recommended to the user during an exercise routine). Thus, a gesture selector element 270 and an audible selector element 272 are both selectable for configuring the CE device to skip a piece of music being presented responsive to receiving a (e.g. predefined) gesture or audible command, respectively, indicating as much. Note further that each of the selector elements - Concluding the description of
FIG. 8 , theUI 220 also includes ashare selector element 274 selectable to configure the CE device to automatically post, publish, and/or share, etc., over one or more social networking services the piece(s) of music and/or music playlist presented to the user while exercising upon completion of the exercise routine, it being understood that the CE device may also be configured to present on a display of the CE device the playlist e.g. after the workout routine has been completed, including presentation of music metadata and music tempos. A submitselector element 276 for submitting the user's selections of settings in accordance with present principles. - Now in reference to
FIG. 9 , anotherUI 280 is shown for presenting current biometric information, music information, etc. while engaged in an exercise routine. It is to be understood that theUI 280 may thus be presented on the display of an exercise machine for viewing by the user while using the machine, and/or on the user's personal CE device that is in communication with the exercise machine. In any case, theUI 280 includes amusic information section 282 including various pieces of information about a piece of music currently being presented that was matched by the CE device with one or more of the user's biometric parameters in accordance with present principles. As may be appreciated from thesection 282, the music information may include e.g. artist name, track title of the song, album of the song, duration of the song, who owns the song (e.g. the user and stored locally on the CE device, and/or a third party music provider streaming the music to the CE device), an indication of the popularity of the music and even a particular demographic with which the music is popular (e.g. in the present instance the song is popular based on “like” indications by five kilometer runners input at their respective CE devices, and in other instances popular and/or recommended music from a user's personal trainer monitoring exercise plans and observing biometric information collected by the CE device in accordance with present principles), and an indication of the beats per minute of the song. Note that although the CE device may access music e.g. using its own network interface to access a cloud storage area of the user, in addition to or in lieu of that the exercise machine may itself access a storage area storing music and then e.g. stream the music from the exercise machine to the user's headphones (e.g. using NFC pairing). - In addition to the foregoing, the
UI 280 also includes abiometric parameter section 284 for presenting one or more pieces of information related to the user's biometric parameters as detected by one or more biometric sensors such as those described above in reference toFIG. 1 . For instance, information that may be presented includes heart rate information, cadence information, and/or breathing information. - Furthermore, the
UI 280 may include a prompt 286 for a user to provide input using yes and noselectors 288 while a piece of music is being currently presented during the exercise routine to easily bookmark the piece of music for later purchasing (e.g., one touch bookmarking). TheUI 280 includes asecond prompt 290 for a user to provide input using yes and noselectors 292 while a piece of music is being currently presented during the exercise routine to automatically without further user input store the particular piece of music in the user's cloud storage once purchased or if purchasing is not necessary. Last, anoption 294 is presented on theUI 280 for whether to change exercise machine configurations manually using yes and noselectors 296, and thus e.g. selection of the yes selector from theselectors 296 may cause another UI to be presented and/or overlaid that includes exercise machine settings configurable by a user to configure the exercise machine. This may be desirable when e.g. the CE device automatically configures the exercise machine according to one or more biometric parameters in accordance with present principles but the user nonetheless wishes to manually override the automatic configuration. - Moving on in the detailed description with reference to
FIG. 10 , anexemplary illustration 300 that illustrates present principles is shown. As may be appreciated from the caption boxes ofFIG. 10 , a user and a CE device in accordance with present principles are audibly exchanging information and indeed the CE device is audibly providing a “virtual coach” to provide (e.g. intelligently determined) encouragement to the person shown in theillustration 300 and even encouragement based on e.g. biometric data. Another illustration 302 is shown inFIG. 11 including agraph 304 indicating the various segments of a user's workout routine represented in terms of heartbeats per minute over time, and also showsthumbnails 306 sequentially arranged from first music presented to last music presented, where each one is respectively associated with a piece of music presented during the exercise routine and matched to the user's one or more biometric parameters, and/or an album from which the piece of music was selected. Note that acaption 308 is also shown that indicates an example of audio feedback that may be presented by the CE device during a “cool down” exercise stage, identifies the song, and/or provides instruction on how to bookmark the music (e.g. for later purchasing and/or listening). In the present exemplary instance the CE device may bookmark the music (e.g. and may also store bookmark information locally on the CE device's storage medium) responsive to a single tap input by the person to a particular area of a touch-enabled display of the CE device or any touch-enabled area, and furthermore a double tap input to a particular area of a touch-enabled display of the CE device or any touch-enabled area may be provided by the user to skip the song being presented and cause the CE device to automatically without further user input provide another song matching the user's biometric parameter(s) and/or cool down phase of the exercise routine. - With no particular reference to any figure, it is to be understood that in accordance with present principles, the CE devices disclosed herein may be configured in still other ways to match music with one or more biometric parameters. For instance, when determining whether a biometric parameter conforms to at least a portion of planned physical activity information, such determining may be executed e.g. periodically at a predefined periodic interval, where responsive to the determination that the biometric parameter does not conform to at least a portion of planned physical activity information, the CE device may automatically present an audio indication in accordance with present principles by altering the time scale of a music file being presented on the CE device. E.g., rather than presenting an entirely different piece of music to the user, the CE device may digitally stretch or compress the currently presented music file to thereby adjust the beats per minute as presented to the user in real time. Thus, time stretching of the music file may be undertaken by the CE device, as may resampling of the music file to change the duration and hence beats per minute.
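- Purely as a non-limiting illustration of the time-scale alteration just described (this sketch is not part of the original disclosure), the playback-rate factor that pulls a track's presented beats per minute toward the exerciser's cadence might be computed as follows; the clamp value is an assumption, and the resulting rate would be handed to whatever time-stretching or resampling routine the player actually provides.

```python
def stretch_rate(track_bpm, target_cadence_spm, max_adjust=0.15):
    """Playback-rate multiplier nudging the track's tempo toward the target cadence.

    A rate of 1.10 means "present the track about 10% faster". The adjustment is
    clamped (15% assumed here) so the music is not stretched beyond recognition.
    """
    if track_bpm <= 0 or target_cadence_spm <= 0:
        raise ValueError("tempos must be positive")
    rate = target_cadence_spm / track_bpm
    return max(1.0 - max_adjust, min(1.0 + max_adjust, rate))

# A 160 BPM track presented to a runner currently striding at 150 steps per minute
# would be slowed to roughly 0.94 of its recorded speed.
print(round(stretch_rate(160, 150), 3))
```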
- In reference to the automated and/or virtual coaching discussed herein, it is to be understood that the CE device may present such information when the user configures settings for it to do so (e.g. using a UI such as the ones described above). Virtual coaching may include notifying a user when the user is transitioning from one exercise segment to another (e.g. based on GPS data accessible to the CE device while on an exercise route). For instance, the virtual coach may indicate, “You are starting to proceed up a hill, which is segment three of your exercise.” Other instructions that may be provided by a virtual coach include, e.g., at the beginning of an exercise routine, “Starting your ride now,” and “At the fork in the road ahead, turn right.” Also at the beginning of the workout, and assuming the user has not already provided input to the CE device instructing the CE device to present a virtual opponent in accordance with present principles, the CE device may provide an audio prompt asking whether the user wishes to race a virtual opponent (e.g., “Would you like to race against a virtual opponent?”), to which the user may e.g. audibly respond in the affirmative as recognized by the CE device processor using natural language voice recognition principles.
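- Should the user accept such a prompt, the virtual opponent race described earlier could, as a hedged and purely illustrative sketch (not the disclosed implementation), be driven by logic along the following lines; the threshold distance, the field names, and the simple constant-pace opponent model are assumptions.

```python
def opponent_gap_m(user_distance_m, elapsed_s, opponent_pace_mps):
    """Separation in meters; positive means the virtual opponent is behind the user."""
    return user_distance_m - opponent_pace_mps * elapsed_s

def opponent_cue(prev_gap_m, gap_m, audible_within_m=25.0):
    """Return a Doppler-style cue, or None while the opponent is outside the threshold.

    A shrinking separation is cued with a rising pitch ("up Doppler"), a growing one
    with a falling pitch; which ear piece the cue favours could then be derived from
    whether the opponent is ahead of or behind the user.
    """
    if abs(gap_m) > audible_within_m:
        return None
    return {
        "pitch_trend": "up" if abs(gap_m) < abs(prev_gap_m) else "down",
        "opponent_is": "behind" if gap_m > 0 else "ahead",
    }

# Opponent preset to a 3.0 m/s pace; the user has covered 880 m after 300 s,
# so the opponent is 20 m ahead and pulling away relative to the previous check.
gap = opponent_gap_m(880.0, 300.0, 3.0)
print(opponent_cue(prev_gap_m=-15.0, gap_m=gap))  # falling pitch, opponent ahead
```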
- As other examples of indications that may be made by a “virtual coach” using the CE device, the CE device may indicate after conclusion of an exercise routine how much time, distance, and/or speed by which the user beat the virtual opponent. Also after conclusion of the routine, the CE device may e.g. audibly (and/or visually) provide statistics to the user such as the user's biometric readings, the total time to completion of the exercise routine, the distance traveled, etc. Even further, the CE device may just before conclusion of the exercise routine provide an audible indication that the routine is almost at conclusion by indicating a temporal countdown until finish such as, “Four, three, two, one . . . finished!”
- Referring specifically to gestures in free space that are recognizable by the CE device as commands to the CE device in accordance with present principles, note that not only may a user e.g. skip a song or request a song with a faster or slower pace based on gestures in free space detected by a motion/gesture detector communicating with the CE device, but may also e.g., pause a song if the user temporarily stops an exercise. For instance, if while proceeding on an exercise route the user happens upon a friend also walking therealong, the user may provide a gesture in free space predefined at the CE device as being a command to stop presenting music (and/or tracking biometric data) until another gesture command is received to resume presentation of the music.
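- By way of a non-limiting sketch only (the gesture vocabulary below is assumed, not taken from the original disclosure), such free-space gesture commands might be dispatched to playback actions as follows.

```python
# Hypothetical mapping of recognized free-space gestures to player actions.
GESTURE_ACTIONS = {
    "palm_out_hold": "toggle_pause",  # e.g. stopping to talk to a friend on the route
    "swipe_left": "skip_track",
    "raise_hand": "faster_beat",
    "lower_hand": "slower_beat",
}

class PlayerState:
    """Tiny state holder showing how gesture commands could adjust playback."""
    def __init__(self):
        self.paused = False
        self.tempo_offset_bpm = 0

    def handle_gesture(self, gesture):
        action = GESTURE_ACTIONS.get(gesture)
        if action == "toggle_pause":
            self.paused = not self.paused   # optionally also pauses biometric tracking
        elif action == "faster_beat":
            self.tempo_offset_bpm += 5
        elif action == "slower_beat":
            self.tempo_offset_bpm -= 5
        return action  # "skip_track" (and unrecognized gestures) left to the caller

player = PlayerState()
player.handle_gesture("palm_out_hold")
print(player.paused)  # True: music presentation is paused until the gesture repeats
```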
- Now in reference to the music, music files, songs, etc. described herein, present principles recognize that although much of the present specification has been directed specifically to music-related files, present principles may apply equally to any type of audio file and even e.g. audio video files as well (e.g., presenting just the audio from an audio video file or presenting both audio and video). Furthermore and in the context of a music file, the metadata for music files described herein may include not only beats per minute and music genre but still other information as well such as e.g., the lyrics to the song.
- Present principles also recognize that although much of the specification has been directed specifically to exercise routines, present principles may apply not only to exercising but also e.g. sitting down at a desk, where the CE device can detect e.g. using a brain activity monitor and blood pressure monitor that a user is stressed and thus suggests and/or automatically presents calming music to the user.
- Notwithstanding, present principles as applied to exercising recognize that the following are exemplary audible and/or visual outputs by the CE device in accordance with present principles:
- “Different song to get going?”, which may be presented responsive to a determination that the user is not keeping up a pace input by the user as being the desired pace.
- “You are slowing down, want a different song?”, which may be presented responsive to a determination that the user is beginning to slow down his or her pace (e.g. gradually but falling outside the predefined desired pace).
- “Run until end of song,” which may be presented responsive to a determination that the user is about to come to the end of an exercise segment or the exercise routine in totality, and hence the end of the current song signifies the end of the segment and/or routine.
- “Increase activity for next minute,” which may be presented responsive to a determination that the user needs to exercise faster for the next minute to comport with e.g. a predefined exercise goal. Such CE device feedback may also be provided e.g. for the user to gradually increase their tempo/cadence as a workout progresses from a lower intensity segment to a higher intensity segment.
- “Your heart rate is one hundred two beats per minute,” which may be presented responsive to a determination that a user has input a command during an exercise routine requesting biometric information for heart rate.
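- The listed outputs could be selected by straightforward comparisons of monitored values against the user's plan; the following is only a hedged sketch of such a dispatcher (the tolerance, the parameter names, and the priority order are assumptions, and the "Increase activity" case is omitted for brevity).

```python
def coaching_output(current_pace_spm, desired_pace_spm, heart_rate_bpm=None,
                    heart_rate_requested=False, pace_tolerance=0.05):
    """Return one of the example outputs above, or None if no cue is warranted."""
    if heart_rate_requested and heart_rate_bpm is not None:
        return f"Your heart rate is {heart_rate_bpm} beats per minute."
    if current_pace_spm < desired_pace_spm * (1.0 - pace_tolerance):
        return "You are slowing down, want a different song?"
    if current_pace_spm < desired_pace_spm:
        return "Different song to get going?"
    return None

print(coaching_output(current_pace_spm=140, desired_pace_spm=155))
# -> "You are slowing down, want a different song?"
```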
- Present principles also recognize that more than one CE device may provide e.g. non-verbal audio cues to set a pace/cadence for respective users exercising together. For example, two or more people may wish to exercise together but may not wish to listen to the same music. The users' CE devices may communicate with each other, and e.g. based on predefined cadence/tempo metadata that is exchanged therebetween (e.g. based on a desired cadence indicated by a user prior to the workout routine), different songs with the same beats per minute matching the predefined cadence may be presented on each respective CE device, so that the users may establish the same pace albeit with different music.
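- A hedged, non-limiting sketch of that shared-cadence behavior follows; it also illustrates the pre-computed beats-per-minute table discussed earlier in connection with matching music to cadence. The library structure and the tolerance are assumptions.

```python
def pick_track(library, target_bpm, tolerance_bpm=3):
    """Pick the track from a pre-scanned library whose BPM best matches the shared cadence."""
    candidates = [t for t in library if abs(t["bpm"] - target_bpm) <= tolerance_bpm]
    return min(candidates, key=lambda t: abs(t["bpm"] - target_bpm)) if candidates else None

# Two exercisers agree (via their CE devices) on a shared cadence of 170 steps per minute,
# but each device selects from its own library, so each user hears different music.
shared_cadence = 170
library_a = [{"title": "Track A1", "bpm": 168}, {"title": "Track A2", "bpm": 128}]
library_b = [{"title": "Track B1", "bpm": 171}, {"title": "Track B2", "bpm": 90}]
print(pick_track(library_a, shared_cadence)["title"])  # Track A1
print(pick_track(library_b, shared_cadence)["title"])  # Track B1
```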
- Moving on, it is to be understood that e.g. after conclusion of an exercise routine, the user may not only share the user's exercise routine over a social networking service but may also e.g. provide the exercise data to a personal trainer's CE device (e.g. using a commonly-used fitness application) so that the personal trainer may evaluate the user and view exercise results, biometric information, etc.
- Describing changes in cadence/tempo of a user, it is to be understood that should the user break stride, the CE device although detecting as much may not automatically change songs to match the new cadence but in some implementations may e.g. wait for the expiration of a threshold time at which the user runs at the new cadence, thereby not changing songs every time the user accidentally breaks pace and instead changing songs once the user has intentionally established a new pace.
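- That "wait for the new pace to stick" behavior amounts to debouncing the cadence signal; a minimal sketch, with an assumed hold time and cadence band, might look like this.

```python
class CadenceDebouncer:
    """Report a cadence change only after it has persisted for a threshold time.

    Mirrors the idea above of not swapping songs every time the exerciser briefly
    breaks stride; the 20 s hold time and 5 steps-per-minute band are assumptions.
    """
    def __init__(self, hold_seconds=20.0, band_spm=5.0):
        self.hold_seconds, self.band_spm = hold_seconds, band_spm
        self.established = None           # cadence the current playlist matches
        self.candidate, self.candidate_since = None, None

    def update(self, cadence_spm, now_s):
        if self.established is None:
            self.established = cadence_spm
            return None
        if abs(cadence_spm - self.established) <= self.band_spm:
            self.candidate, self.candidate_since = None, None     # stride recovered
            return None
        if self.candidate is None or abs(cadence_spm - self.candidate) > self.band_spm:
            self.candidate, self.candidate_since = cadence_spm, now_s
            return None
        if now_s - self.candidate_since >= self.hold_seconds:
            self.established, self.candidate = cadence_spm, None  # new pace is intentional
            return self.established       # caller re-matches music to this cadence
        return None

deb = CadenceDebouncer()
deb.update(150, now_s=0)          # establishes 150 steps per minute
deb.update(165, now_s=5)          # possible new pace noticed
print(deb.update(166, now_s=30))  # held long enough: 166 is reported for re-matching
```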
- Describing the non-verbal cues with more specificity, note that e.g. the CE devices described herein may be configured to dynamically without user input change from providing verbal cues to only providing non-verbal cues in some instances when e.g., after a threshold number of times making the same turn or otherwise exercising on the same route, the CE device determines that only non-verbal cues should be presented. This may be advantageous to a user who is already familiar with a neighborhood in which the user is exercising and hence does not necessarily need verbal cues but may nonetheless wish to have non-verbal ones presented that do not audibly interfere with the user's music as much as the verbal cues. Such determinations may be made e.g. at least in part by storing GPS data as the user proceeds along the route each time it is traveled which at a later time may be analyzed to determine whether the threshold number of times has been met.
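- A minimal sketch of that verbal-to-non-verbal transition, assuming the device keeps a per-route count of completed traversals derived from stored GPS traces, could be as simple as the following; the identifiers and the threshold of five traversals are assumptions.

```python
def cue_mode(route_id, traversals_completed, familiarity_threshold=5):
    """Choose the cue style for a route based on how many times it has been completed."""
    count = traversals_completed.get(route_id, 0)
    return "non_verbal_only" if count >= familiarity_threshold else "verbal_plus_non_verbal"

traversals = {"neighborhood_5k": 7, "riverside_loop": 2}
print(cue_mode("neighborhood_5k", traversals))  # non_verbal_only (familiar route)
print(cue_mode("riverside_loop", traversals))   # verbal_plus_non_verbal (still new)
```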
- Present principles further recognize that although some of the specification describes CE device features in reference to e.g. running or cycling, present principles may apply equally to other instances as well such as e.g. swimming or any other exercises establishing repetitive/rhythmic exercise motions.
- Last, note that the headphones described herein may be configured to e.g. undertake active noise reduction on ambient noise present while exercising, while still allowing “transient” sounds like the sound generated by passing cars or someone talking to the exerciser to be heard by the exerciser. This headphone configuration thus promotes safety but still allows for clearly listening to music without unwanted ambient noises interfering with the user's listening enjoyment.
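- As a final hedged sketch (this is deliberately much simpler than a real active-noise-reduction design, which would involve adaptive filtering in the headphone hardware), the "pass transient sounds, suppress steady ambient noise" idea could be approximated per audio frame as follows; all constants are assumptions.

```python
def transient_preserving_gains(frame_rms, attack_ratio=2.0, smoothing=0.95, floor_gain=0.2):
    """Per-frame gains: attenuate steady background noise, pass sudden louder sounds.

    frame_rms holds RMS levels of successive short microphone frames. Frames near the
    slowly tracked background level are attenuated to floor_gain, while frames that
    jump well above it (a passing car, someone speaking) are passed at full gain.
    """
    gains, background = [], None
    for level in frame_rms:
        background = level if background is None else (
            smoothing * background + (1.0 - smoothing) * level)
        gains.append(1.0 if level > attack_ratio * background else floor_gain)
    return gains

# Steady ambient noise around 0.1 RMS with one brief louder event (e.g. speech) at 0.5.
print(transient_preserving_gains([0.1, 0.1, 0.1, 0.5, 0.1]))
# -> [0.2, 0.2, 0.2, 1.0, 0.2]
```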
- While the particular PRESENTING AUDIO BASED ON BIOMETRIC PARAMETERS is herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.
Claims (20)
1. A device comprising:
at least one computer readable storage medium bearing instructions executable by a processor;
at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for:
receiving signals from at least one biometric sensor of an exerciser; and
based at least in part on the signals, outputting an audio cue on a speaker indicating to the exerciser to speed up or slow down.
2. The device of claim 1, wherein the biometric sensor is a heart rate sensor.
3. The device of claim 2, wherein the processor when executing the instructions is configured for:
determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor exceeds a threshold;
responsive to a determination that the heart rate exceeds the threshold, outputting an audio cue on the speaker to slow down; and
responsive to a determination that the heart rate does not exceed the threshold, not outputting an audio cue on the speaker to slow down.
4. The device of claim 2, wherein the processor when executing the instructions is configured for:
determining whether a heart rate of the exerciser as indicated by signals from the heart rate sensor is below a threshold;
responsive to a determination that the heart rate is below the threshold, outputting an audio cue on the speaker to speed up; and
responsive to a determination that the heart rate exceeds the threshold, not outputting an audio cue on the speaker to speed up.
5. The device of claim 1, wherein the audio cue is verbal.
6. The device of claim 1, wherein the audio cue includes music having a tempo that is increased or decreased, respectively, to indicate to the exerciser to speed up or slow down.
7. The device of claim 1, wherein the audio cue includes changing from playing a first music piece to playing a second music piece.
8. The device of claim 1, wherein the biometric sensor includes an exerciser breath sensor.
9. The device of claim 1, wherein the biometric sensor includes an exerciser stride sensor.
10. A method comprising:
receiving signals from at least one biometric sensor representing a biometric parameter of an exerciser; and
automatically transmitting signals to a speaker to present audible cues to the exerciser based on the signals from the biometric sensor.
11. The method of claim 10, comprising outputting an audio cue on a speaker indicating to the exerciser to speed up or slow down.
12. The method of claim 10, comprising:
determining whether an exercise parameter of the exerciser as indicated by signals from the sensor exceeds a threshold;
responsive to a determination that the parameter exceeds the threshold, outputting an audio cue on the speaker to slow down; and
responsive to a determination that the parameter does not exceed the threshold, not outputting an audio cue on the speaker to slow down.
13. The method of claim 10, comprising:
determining whether an exercise parameter of the exerciser as indicated by signals from the sensor is below a threshold;
responsive to a determination that the parameter is below the threshold, outputting an audio cue on the speaker to speed up; and
responsive to a determination that the parameter exceeds the threshold, not outputting an audio cue on the speaker to speed up.
14. The method of claim 10, wherein the audio cue is verbal.
15. The method of claim 10, wherein the audio cue includes music having a tempo that is increased or decreased, respectively, to indicate to the exerciser to speed up or slow down.
16. The method of claim 10, wherein the audio cue includes changing from playing a first music piece to playing a second music piece.
17. A computer readable storage medium that is not a carrier wave, the computer readable storage medium bearing instructions which when executed by a processor configure the processor to execute logic comprising:
accessing planned physical activity information for a person associated with a CE device including the processor;
receiving at least one signal from at least one biometric sensor representing at least one biometric parameter of the person;
determining whether the biometric parameter conforms to at least a portion of the planned physical activity information; and
responsive to a determination that the biometric parameter does not conform to at least a portion of the planned physical activity information, automatically presenting on the CE device an indication that the person is not in conformance with the planned physical activity information.
18. The computer readable storage medium of claim 17, wherein the indication is an audio indication presented on at least one speaker of the CE device.
19. The computer readable storage medium of claim 18, wherein the audio indication includes a computerized voice indicating that the person is not in conformance with the planned physical activity information.
20. The computer readable storage medium of claim 18, wherein the determining is done periodically at a predefined periodic interval, wherein responsive to the determination that the biometric parameter does not conform to at least a portion of the planned physical activity information, the instructions configure the processor for automatically presenting on the CE device the audio indication by altering the time scale of a music file being presented on the CE device, and wherein responsive to a determination that the biometric parameter does conform to the planned physical activity information, the instructions configure the processor for automatically presenting on the CE device an audible indication that the person is comporting with the planned physical activity information.
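For illustration only, and without limiting or adding to the claims, the threshold comparison recited in claims 3 and 4 (and echoed for a generic exercise parameter in claims 12 and 13) can be sketched roughly as follows; the threshold values, function name, and cue strings are assumptions.

```python
# Illustrative sketch (not part of the claims): compare a sensed heart rate against
# upper/lower thresholds and emit a slow-down / speed-up cue accordingly.
from typing import Optional

def heart_rate_cue(heart_rate_bpm: float,
                   upper_threshold_bpm: float = 165.0,
                   lower_threshold_bpm: float = 120.0) -> Optional[str]:
    """Return 'slow down', 'speed up', or None when the heart rate is within range."""
    if heart_rate_bpm > upper_threshold_bpm:
        return "slow down"   # cf. claim 3: heart rate exceeds the threshold
    if heart_rate_bpm < lower_threshold_bpm:
        return "speed up"    # cf. claim 4: heart rate is below the threshold
    return None              # within range: no pacing cue is output

assert heart_rate_cue(180.0) == "slow down"
assert heart_rate_cue(100.0) == "speed up"
assert heart_rate_cue(140.0) is None
```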
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/037,271 US20150079562A1 (en) | 2013-09-17 | 2013-09-25 | Presenting audio based on biometrics parameters |
KR20140116916A KR20150032170A (en) | 2013-09-17 | 2014-09-03 | Presenting audio based on biometric parameters |
CN201410468325.2A CN104460982A (en) | 2013-09-17 | 2014-09-15 | Presenting audio based on biometrics parameters |
JP2014188913A JP2015058363A (en) | 2013-09-17 | 2014-09-17 | Audio representation device based on biomedical measurement parameter |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361878835P | 2013-09-17 | 2013-09-17 | |
US14/037,271 US20150079562A1 (en) | 2013-09-17 | 2013-09-25 | Presenting audio based on biometrics parameters |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150079562A1 true US20150079562A1 (en) | 2015-03-19 |
Family
ID=51228977
Family Applications (10)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/037,271 Abandoned US20150079562A1 (en) | 2013-09-17 | 2013-09-25 | Presenting audio based on biometrics parameters |
US14/037,278 Abandoned US20150079563A1 (en) | 2013-09-17 | 2013-09-25 | Nonverbal audio cues during physical activity |
US14/037,263 Abandoned US20150082408A1 (en) | 2013-09-17 | 2013-09-25 | Quick login to user profile on exercise machine |
US14/037,286 Abandoned US20150081210A1 (en) | 2013-09-17 | 2013-09-25 | Altering exercise routes based on device determined information |
US14/037,228 Abandoned US20150082167A1 (en) | 2013-09-17 | 2013-09-25 | Intelligent device mode shifting based on activity |
US14/037,224 Active US8795138B1 (en) | 2013-09-17 | 2013-09-25 | Combining data sources to provide accurate effort monitoring |
US14/037,267 Abandoned US20150081067A1 (en) | 2013-09-17 | 2013-09-25 | Synchronized exercise buddy headphones |
US14/037,276 Active US9142141B2 (en) | 2013-09-17 | 2013-09-25 | Determining exercise routes based on device determined information |
US14/037,252 Abandoned US20150081066A1 (en) | 2013-09-17 | 2013-09-25 | Presenting audio based on biometrics parameters |
US14/255,663 Active US9224311B2 (en) | 2013-09-17 | 2014-04-17 | Combining data sources to provide accurate effort monitoring |
Family Applications After (9)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/037,278 Abandoned US20150079563A1 (en) | 2013-09-17 | 2013-09-25 | Nonverbal audio cues during physical activity |
US14/037,263 Abandoned US20150082408A1 (en) | 2013-09-17 | 2013-09-25 | Quick login to user profile on exercise machine |
US14/037,286 Abandoned US20150081210A1 (en) | 2013-09-17 | 2013-09-25 | Altering exercise routes based on device determined information |
US14/037,228 Abandoned US20150082167A1 (en) | 2013-09-17 | 2013-09-25 | Intelligent device mode shifting based on activity |
US14/037,224 Active US8795138B1 (en) | 2013-09-17 | 2013-09-25 | Combining data sources to provide accurate effort monitoring |
US14/037,267 Abandoned US20150081067A1 (en) | 2013-09-17 | 2013-09-25 | Synchronized exercise buddy headphones |
US14/037,276 Active US9142141B2 (en) | 2013-09-17 | 2013-09-25 | Determining exercise routes based on device determined information |
US14/037,252 Abandoned US20150081066A1 (en) | 2013-09-17 | 2013-09-25 | Presenting audio based on biometrics parameters |
US14/255,663 Active US9224311B2 (en) | 2013-09-17 | 2014-04-17 | Combining data sources to provide accurate effort monitoring |
Country Status (7)
Country | Link |
---|---|
US (10) | US20150079562A1 (en) |
EP (1) | EP3020253A4 (en) |
JP (6) | JP2016533237A (en) |
KR (7) | KR20150032170A (en) |
CN (7) | CN108428473B (en) |
CA (1) | CA2917927A1 (en) |
WO (2) | WO2015041970A1 (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106598537A (en) * | 2016-11-16 | 2017-04-26 | 上海斐讯数据通信技术有限公司 | Mobile terminal music play control method and system and mobile terminal |
TWI601155B (en) * | 2016-06-08 | 2017-10-01 | 群聯電子股份有限公司 | Memory interface, memory control circuit unit, memory storage device and clock generation method |
US10188890B2 (en) | 2013-12-26 | 2019-01-29 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US10220259B2 (en) | 2012-01-05 | 2019-03-05 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US10226396B2 (en) | 2014-06-20 | 2019-03-12 | Icon Health & Fitness, Inc. | Post workout massage device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10279212B2 (en) | 2013-03-14 | 2019-05-07 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US10426989B2 (en) | 2014-06-09 | 2019-10-01 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10587941B2 (en) | 2017-08-29 | 2020-03-10 | Kabushiki Kaisha Toshiba | Microphone cooperation device |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
US11145313B2 (en) * | 2018-07-06 | 2021-10-12 | Michael Bond | System and method for assisting communication through predictive speech |
US11185254B2 (en) | 2017-08-21 | 2021-11-30 | Muvik Labs, Llc | Entrainment sonification techniques |
US11205408B2 (en) | 2017-08-21 | 2021-12-21 | Muvik Labs, Llc | Method and system for musical communication |
US11211098B2 (en) * | 2015-05-19 | 2021-12-28 | Spotify Ab | Repetitive-motion activity enhancement based upon media content selection |
US11256471B2 (en) | 2015-05-19 | 2022-02-22 | Spotify Ab | Media content selection based on physiological attributes |
US20240054159A1 (en) * | 2022-08-12 | 2024-02-15 | Lmdp Co. | Method and apparatus for a user-adaptive audiovisual experience |
Families Citing this family (235)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002255568B8 (en) * | 2001-02-20 | 2014-01-09 | Adidas Ag | Modular personal network systems and methods |
US10314533B2 (en) * | 2009-08-28 | 2019-06-11 | Samsung Electronics Co., Ltd | Method and apparatus for recommending a route |
US9886871B1 (en) | 2011-12-27 | 2018-02-06 | PEAR Sports LLC | Fitness and wellness system with dynamically adjusting guidance |
US9123317B2 (en) * | 2012-04-06 | 2015-09-01 | Icon Health & Fitness, Inc. | Using music to motivate a user during exercise |
US9582035B2 (en) | 2014-02-25 | 2017-02-28 | Medibotics Llc | Wearable computing devices and methods for the wrist and/or forearm |
US10314492B2 (en) | 2013-05-23 | 2019-06-11 | Medibotics Llc | Wearable spectroscopic sensor to measure food consumption based on interaction between light and the human body |
US10124255B2 (en) | 2012-08-31 | 2018-11-13 | Blue Goji Llc. | Multiple electronic control and tracking devices for mixed-reality interaction |
US8864587B2 (en) | 2012-10-03 | 2014-10-21 | Sony Corporation | User device position indication for security and distributed race challenges |
US11083344B2 (en) | 2012-10-11 | 2021-08-10 | Roman Tsibulevskiy | Partition technologies |
KR101968621B1 (en) * | 2012-11-16 | 2019-04-12 | 삼성전자주식회사 | 1rm presume device and method |
WO2014143776A2 (en) | 2013-03-15 | 2014-09-18 | Bodhi Technology Ventures Llc | Providing remote interactions with host device using a wireless device |
US9424348B1 (en) | 2013-05-08 | 2016-08-23 | Rock My World, Inc. | Sensor-driven audio playback modification |
US10643483B2 (en) | 2013-07-19 | 2020-05-05 | PEAR Sports LLC | Physical activity coaching platform with dynamically changing workout content |
US9514620B2 (en) | 2013-09-06 | 2016-12-06 | Immersion Corporation | Spatialized haptic feedback based on dynamically scaled values |
US20150079562A1 (en) | 2013-09-17 | 2015-03-19 | Sony Corporation | Presenting audio based on biometrics parameters |
GB201317033D0 (en) * | 2013-09-25 | 2013-11-06 | Naylor David | Selecting routes |
AU2014246686A1 (en) * | 2013-10-14 | 2015-04-30 | Extronics Pty Ltd | An interactive system for monitoring and assisting the physical activity of a user within a gym environment |
US9535505B2 (en) | 2013-11-08 | 2017-01-03 | Polar Electro Oy | User interface control in portable system |
US12080421B2 (en) | 2013-12-04 | 2024-09-03 | Apple Inc. | Wellness aggregator |
KR102061913B1 (en) | 2013-12-12 | 2020-01-02 | 삼성전자주식회사 | Method and apparatus for controlling operations of electronic device |
US9269119B2 (en) | 2014-01-22 | 2016-02-23 | Sony Corporation | Devices and methods for health tracking and providing information for improving health |
JP6586274B2 (en) * | 2014-01-24 | 2019-10-02 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Cooking apparatus, cooking method, cooking control program, and cooking information providing method |
US10429888B2 (en) | 2014-02-25 | 2019-10-01 | Medibotics Llc | Wearable computer display devices for the forearm, wrist, and/or hand |
US9996442B2 (en) * | 2014-03-25 | 2018-06-12 | Krystallize Technologies, Inc. | Cloud computing benchmarking |
CN106416063A (en) * | 2014-04-10 | 2017-02-15 | 弗劳恩霍夫应用研究促进协会 | Audio system and method for adaptive sound playback during physical activity |
DE202015010002U1 (en) * | 2014-05-21 | 2022-12-12 | Abbott Diabetes Care, Inc. | Management of multiple devices within an analyte monitoring environment |
US9672482B2 (en) * | 2014-06-11 | 2017-06-06 | Palo Alto Research Center Incorporated | System and method for automatic objective reporting via wearable sensors |
US9612862B2 (en) * | 2014-06-24 | 2017-04-04 | Google Inc. | Performing an operation during inferred periods of non-use of a wearable device |
EP3166698B1 (en) * | 2014-07-07 | 2019-10-30 | Leila Benedicte Habiche | Device for practising sport activities |
US9838858B2 (en) | 2014-07-08 | 2017-12-05 | Rapidsos, Inc. | System and method for call management |
US10229192B2 (en) * | 2014-07-14 | 2019-03-12 | Under Armour, Inc. | Hierarchical de-duplication techniques for tracking fitness metrics |
US9852264B1 (en) * | 2014-07-21 | 2017-12-26 | Padmanabaiah Srirama | Authentic and verifiable electronic wellness record |
US10135905B2 (en) | 2014-07-21 | 2018-11-20 | Apple Inc. | Remote user interface |
WO2016033375A1 (en) * | 2014-08-29 | 2016-03-03 | Icon Health & Fitness, Inc. | A sensor incorporated into an exercise garment |
US9547419B2 (en) | 2014-09-02 | 2017-01-17 | Apple Inc. | Reduced size configuration interface |
CN115623117A (en) | 2014-09-02 | 2023-01-17 | 苹果公司 | Telephone user interface |
EP3195563B1 (en) | 2014-09-19 | 2021-12-15 | Rapidsos Inc. | Method for emergency call management |
US20170165525A1 (en) * | 2014-10-16 | 2017-06-15 | Manuel Eduardo Tellez | Adrenaline Junkie |
JP6405893B2 (en) * | 2014-10-30 | 2018-10-17 | オムロンヘルスケア株式会社 | Exercise information measuring device, exercise support method, exercise support program |
WO2016080558A1 (en) * | 2014-11-17 | 2016-05-26 | 엘지전자 주식회사 | Iot management device for carrying out condition modification mode and method for controlling same |
JPWO2016080183A1 (en) * | 2014-11-18 | 2017-08-31 | ソニー株式会社 | Information processing apparatus, information processing system, information processing method, and program |
SE538331C2 (en) * | 2014-11-21 | 2016-05-17 | Melaud Ab | Earphones with sensor controlled audio output |
US9691023B2 (en) * | 2014-11-30 | 2017-06-27 | WiseWear Corporation | Exercise behavior prediction |
US11182870B2 (en) * | 2014-12-24 | 2021-11-23 | Mcafee, Llc | System and method for collective and collaborative navigation by a group of individuals |
US20160189039A1 (en) * | 2014-12-31 | 2016-06-30 | Nokia Corporation | Clothing selection |
WO2016128862A1 (en) * | 2015-02-09 | 2016-08-18 | Koninklijke Philips N.V. | Sequence of contexts wearable |
US11351420B2 (en) | 2015-02-23 | 2022-06-07 | Smartweights, Inc. | Method and system for virtual fitness training and tracking devices |
US10881907B2 (en) * | 2015-02-23 | 2021-01-05 | Smartweights, Inc. | Method and system for virtual fitness training and tracking service |
WO2016138331A1 (en) | 2015-02-27 | 2016-09-01 | Kimberly-Clark Worldwide, Inc. | Absorbent article leakage assessment system |
CN104837083B (en) * | 2015-04-11 | 2019-01-11 | 黄银桃 | Multifunctional intellectual neck ring |
US9570059B2 (en) | 2015-05-19 | 2017-02-14 | Spotify Ab | Cadence-based selection, playback, and transition between song versions |
US10055413B2 (en) | 2015-05-19 | 2018-08-21 | Spotify Ab | Identifying media content |
US10372757B2 (en) | 2015-05-19 | 2019-08-06 | Spotify Ab | Search media content based upon tempo |
US9536560B2 (en) * | 2015-05-19 | 2017-01-03 | Spotify Ab | Cadence determination and media content selection |
US9563700B2 (en) * | 2015-05-19 | 2017-02-07 | Spotify Ab | Cadence-based playlists management system |
WO2016196217A1 (en) * | 2015-05-29 | 2016-12-08 | Nike Innovate C.V. | Enhancing exercise through augmented reality |
EP3101612A1 (en) | 2015-06-03 | 2016-12-07 | Skullcandy, Inc. | Audio devices and related methods for acquiring audio device use information |
US9529500B1 (en) * | 2015-06-05 | 2016-12-27 | Apple Inc. | Application recommendation based on detected triggering events |
KR102335011B1 (en) * | 2015-06-26 | 2021-12-06 | 삼성전자주식회사 | Method and Apparatus for Providing Workout Guide Information |
US9737759B2 (en) * | 2015-07-17 | 2017-08-22 | Genesant Technologies, Inc. | Automatic application-based exercise tracking system and method |
WO2017027420A1 (en) * | 2015-08-07 | 2017-02-16 | Crume Ryan K | Functional garments and methods thereof |
CN105892628A (en) * | 2015-08-11 | 2016-08-24 | 乐视体育文化产业发展(北京)有限公司 | Music recommendation method and device as well as bicycle |
EP4268713A3 (en) * | 2015-08-18 | 2024-01-10 | University of Miami | Method and system for adjusting audio signals based on motion deviation |
US9966056B2 (en) * | 2015-08-24 | 2018-05-08 | Plantronics, Inc. | Biometrics-based dynamic sound masking |
US9605970B1 (en) * | 2015-09-03 | 2017-03-28 | Harman International Industries, Incorporated | Methods and systems for driver assistance |
KR20170027999A (en) * | 2015-09-03 | 2017-03-13 | 삼성전자주식회사 | User terminal apparatus, system and the controlling method thereof |
CN105182729A (en) * | 2015-09-22 | 2015-12-23 | 电子科技大学中山学院 | Wearable night running safety metronome |
CN105160941A (en) * | 2015-09-30 | 2015-12-16 | 宇龙计算机通信科技(深圳)有限公司 | Information prompting method, information prompting device and mobile terminal |
US9828060B2 (en) * | 2015-10-13 | 2017-11-28 | GM Global Technology Operations LLC | Automated e-assist adjustment to prevent user perspiration |
US9659484B1 (en) | 2015-11-02 | 2017-05-23 | Rapidsos, Inc. | Method and system for situational awareness for emergency response |
US9936385B2 (en) * | 2015-12-04 | 2018-04-03 | Lenovo (Singapore) Pte. Ltd. | Initial access to network that is permitted from within a threshold distance |
CN105549740B (en) * | 2015-12-10 | 2019-05-07 | 广州酷狗计算机科技有限公司 | A kind of method and apparatus of playing audio-fequency data |
US10200380B2 (en) | 2015-12-16 | 2019-02-05 | At&T Intellectual Property I, L.P. | System for providing layered security |
WO2017106775A1 (en) | 2015-12-17 | 2017-06-22 | Rapidsos, Inc. | Devices and methods for efficient emergency calling |
US10171971B2 (en) * | 2015-12-21 | 2019-01-01 | Skullcandy, Inc. | Electrical systems and related methods for providing smart mobile electronic device features to a user of a wearable device |
US9998507B2 (en) | 2015-12-22 | 2018-06-12 | Rapidsos, Inc. | Systems and methods for robust and persistent emergency communications |
US20180358021A1 (en) * | 2015-12-23 | 2018-12-13 | Intel Corporation | Biometric information for dialog system |
KR20170076281A (en) | 2015-12-24 | 2017-07-04 | 삼성전자주식회사 | Electronic device and method for providing of personalized workout guide therof |
TWI588745B (en) * | 2015-12-31 | 2017-06-21 | Fuelstation Inc | E-commerce system that can automatically record and update the information in the embedded electronic device by the cloud |
US20170200085A1 (en) * | 2016-01-11 | 2017-07-13 | Anuthep Benja-Athon | Creation of Abiotic-Biotic Civilization |
KR102511518B1 (en) * | 2016-01-12 | 2023-03-20 | 삼성전자주식회사 | Display apparatus and control method of the same |
CN105657127B (en) * | 2016-01-18 | 2019-06-11 | 宇龙计算机通信科技(深圳)有限公司 | A kind of method, earphone and terminal that sound is shared |
WO2017136151A1 (en) * | 2016-02-02 | 2017-08-10 | Gaming Grids Wearables, Llc | Esports fitness and training system |
KR102564468B1 (en) | 2016-02-11 | 2023-08-08 | 삼성전자주식회사 | Electronic device and method for providing route information |
US20170235460A1 (en) * | 2016-02-11 | 2017-08-17 | Symbol Technologies, Llc | Methods and systems for implementing an always-on-top data-acquisition button |
US9986404B2 (en) | 2016-02-26 | 2018-05-29 | Rapidsos, Inc. | Systems and methods for emergency communications amongst groups of devices based on shared data |
CN105701249B (en) * | 2016-03-04 | 2020-12-25 | 上海救要救信息科技有限公司 | Method and device for determining position information of emergency supplies |
US20170262589A1 (en) * | 2016-03-14 | 2017-09-14 | Mesa Digital, Llc | Systems and methods for physically supporting users during exercise while enhanced oxygen treatment |
CN105744420A (en) * | 2016-03-23 | 2016-07-06 | 惠州Tcl移动通信有限公司 | Smart sports headphones and smart sports system |
JP6728863B2 (en) * | 2016-03-25 | 2020-07-22 | 富士ゼロックス株式会社 | Information processing system |
US10114607B1 (en) * | 2016-03-31 | 2018-10-30 | Rock My World, Inc. | Physiological state-driven playback tempo modification |
CN105744421A (en) * | 2016-03-31 | 2016-07-06 | 惠州Tcl移动通信有限公司 | System and method for intelligent recommendation of music through Bluetooth headset |
CN105868382A (en) * | 2016-04-08 | 2016-08-17 | 惠州Tcl移动通信有限公司 | Music recommendation method and system |
WO2017189610A2 (en) | 2016-04-26 | 2017-11-02 | Rapidsos, Inc. | Systems and methods for emergency communications |
US20170325056A1 (en) | 2016-05-09 | 2017-11-09 | Rapidsos, Inc. | Systems and methods for emergency communications |
AU2017263835B2 (en) * | 2016-05-13 | 2021-06-10 | WellDoc, Inc. | Database management and graphical user interfaces for managing blood glucose levels |
US9668290B1 (en) | 2016-05-24 | 2017-05-30 | Ronald Snagg | Wireless communication headset system |
CN107450940A (en) * | 2016-06-01 | 2017-12-08 | 北京小米移动软件有限公司 | Intelligent terminal opens the method and device of application program |
CN107450880A (en) * | 2016-06-01 | 2017-12-08 | 北京小米移动软件有限公司 | Update the method and device of display content |
CN106066780B (en) * | 2016-06-06 | 2020-01-21 | 杭州网易云音乐科技有限公司 | Running data processing method and device |
CN106067308A (en) * | 2016-06-07 | 2016-11-02 | 四川长虹网络科技有限责任公司 | The regulation equipment of music speed, system and method is changed by human heart rate |
US10984035B2 (en) | 2016-06-09 | 2021-04-20 | Spotify Ab | Identifying media content |
US11113346B2 (en) | 2016-06-09 | 2021-09-07 | Spotify Ab | Search media content based upon tempo |
JP2018015187A (en) * | 2016-07-27 | 2018-02-01 | セイコーエプソン株式会社 | Swimming information processing system, information processing apparatus, swimming information processing method, and program |
WO2018039142A1 (en) | 2016-08-22 | 2018-03-01 | Rapidsos, Inc. | Predictive analytics for emergency detection and response management |
US10628663B2 (en) | 2016-08-26 | 2020-04-21 | International Business Machines Corporation | Adapting physical activities and exercises based on physiological parameter analysis |
US11003762B2 (en) * | 2016-09-15 | 2021-05-11 | William John McIntosh | Password hidden characters |
US10736543B2 (en) | 2016-09-22 | 2020-08-11 | Apple Inc. | Workout monitor interface |
US10492519B2 (en) | 2016-09-28 | 2019-12-03 | Icon Health & Fitness, Inc. | Customizing nutritional supplement shake recommendations |
US9794752B1 (en) | 2016-09-29 | 2017-10-17 | International Business Machines Corporation | Dynamically creating fitness groups |
US20180101777A1 (en) * | 2016-10-12 | 2018-04-12 | Anuthep Benja-Athon | EM Oracle |
EP3533385B1 (en) * | 2016-10-26 | 2024-03-27 | JSR Corporation | Exercise assistance device, exercise assistance system, exercise assistance method, and non-transitive substantive recording medium |
US10878947B2 (en) | 2016-11-18 | 2020-12-29 | International Business Machines Corporation | Triggered sensor data capture in a mobile device environment |
US10559297B2 (en) | 2016-11-28 | 2020-02-11 | Microsoft Technology Licensing, Llc | Audio landmarking for aural user interface |
TWI644650B (en) * | 2016-12-01 | 2018-12-21 | 國立臺灣大學 | Methods and devices for detecting irregular heartbeat |
JP2018093979A (en) * | 2016-12-09 | 2018-06-21 | セイコーエプソン株式会社 | Exercise diagnostic device, exercise diagnosis system, program, recording medium and exercise diagnosis method |
US11211157B2 (en) * | 2016-12-30 | 2021-12-28 | Intel Corporation | Using emotional maps and dialog display technology to improve workout experiences |
EP3573422B1 (en) | 2017-01-17 | 2024-02-21 | Sony Group Corporation | Pairing of wireless nodes |
JP6864824B2 (en) * | 2017-01-31 | 2021-04-28 | 株式会社Jvcケンウッド | Music playback program, music playback device, music playback method |
US10402417B2 (en) | 2017-02-09 | 2019-09-03 | Microsoft Technology Licensing, Llc | Synthesizing data sources |
US20180260850A1 (en) * | 2017-03-13 | 2018-09-13 | Brunswick Corporation | Systems and Methods for Improving Advertising in Fitness Centers Based on Traffic |
WO2018171196A1 (en) * | 2017-03-21 | 2018-09-27 | 华为技术有限公司 | Control method, terminal and system |
US20180293980A1 (en) * | 2017-04-05 | 2018-10-11 | Kumar Narasimhan Dwarakanath | Visually impaired augmented reality |
US11013641B2 (en) | 2017-04-05 | 2021-05-25 | Kimberly-Clark Worldwide, Inc. | Garment for detecting absorbent article leakage and methods of detecting absorbent article leakage utilizing the same |
US10648821B2 (en) | 2017-04-17 | 2020-05-12 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methodologies for route planning |
US10942520B1 (en) | 2017-04-20 | 2021-03-09 | Wells Fargo Bank, N.A. | Creating trip routes for autonomous vehicles |
WO2018200418A1 (en) | 2017-04-24 | 2018-11-01 | Rapidsos, Inc. | Modular emergency communication flow management system |
JP2018191159A (en) * | 2017-05-08 | 2018-11-29 | 山▲崎▼ 薫 | Moving image distribution method |
US10928980B2 (en) | 2017-05-12 | 2021-02-23 | Apple Inc. | User interfaces for playing and managing audio items |
US10845955B2 (en) | 2017-05-15 | 2020-11-24 | Apple Inc. | Displaying a scrollable list of affordances associated with physical activities |
US20220279063A1 (en) | 2017-05-16 | 2022-09-01 | Apple Inc. | Methods and interfaces for home media control |
CN111343060B (en) | 2017-05-16 | 2022-02-11 | 苹果公司 | Method and interface for home media control |
US10591972B2 (en) * | 2017-05-22 | 2020-03-17 | Gao Shu An Fu (Hangzhou) Technology | Adaptively controlling a tradeoff between computational accuracy and power consumption of a mobile device that operates to select a condition of a subject or device |
US10650918B2 (en) * | 2017-06-01 | 2020-05-12 | International Business Machines Corporation | Crowdsourcing health improvements routes |
CN107277233A (en) * | 2017-06-01 | 2017-10-20 | 深圳天珑无线科技有限公司 | Method, equipment and the readable storage medium storing program for executing of intelligent Matching ring of alarm clock |
US10987006B2 (en) | 2017-06-02 | 2021-04-27 | Apple Inc. | Wearable computer with fitness machine connectivity for improved activity monitoring using caloric expenditure models |
US10814167B2 (en) * | 2017-06-02 | 2020-10-27 | Apple Inc. | Wearable computer with fitness machine connectivity for improved activity monitoring |
US10874313B2 (en) * | 2017-06-04 | 2020-12-29 | Apple Inc. | Heartrate tracking techniques |
US10746556B2 (en) * | 2017-06-16 | 2020-08-18 | Walkspan, Inc. | Recommendation system and method to evaluate the quality of sidewalks and other pedestrian flow zones as a means to operationalize walkability |
CN107270887B (en) * | 2017-07-13 | 2020-10-27 | 青岛海通胜行智能科技有限公司 | Positioning method based on combination of wireless and magnetic field induction technologies |
US11561109B2 (en) | 2017-07-17 | 2023-01-24 | International Business Machines Corporation | Route accessibility for users of mobility assistive technology |
CN107249083A (en) * | 2017-07-27 | 2017-10-13 | 广东小天才科技有限公司 | Method, device, equipment and storage medium for dynamically switching ring |
US10515637B1 (en) * | 2017-09-19 | 2019-12-24 | Amazon Technologies, Inc. | Dynamic speech processing |
FI20175862A1 (en) * | 2017-09-28 | 2019-03-29 | Kipuwex Oy | System for determining sound source |
CN107564550A (en) * | 2017-10-27 | 2018-01-09 | 安徽硕威智能科技有限公司 | A kind of card machine people that music is played according to detection children's heartbeat |
CN109728831A (en) * | 2017-10-27 | 2019-05-07 | 北京金锐德路科技有限公司 | Face-to-face listening device for neck-worn voice interactive headset |
US20190138095A1 (en) * | 2017-11-03 | 2019-05-09 | Qualcomm Incorporated | Descriptive text-based input based on non-audible sensor data |
CN109817334A (en) * | 2017-11-21 | 2019-05-28 | 中国平安人寿保险股份有限公司 | Detection method, device, equipment and the readable storage medium storing program for executing in workout data source |
WO2019113129A1 (en) | 2017-12-05 | 2019-06-13 | Rapidsos, Inc. | Social media content for emergency management |
US10348878B2 (en) | 2017-12-12 | 2019-07-09 | Ronald Snagg | Wireless communication headset system |
US10691789B2 (en) * | 2017-12-19 | 2020-06-23 | International Business Machines Corporation | Authentication/security using user activity mining based live question-answering |
CN108199936B (en) * | 2018-01-25 | 2021-02-19 | 平果县科力屋智能科技有限责任公司 | Smart home |
US20190237168A1 (en) * | 2018-01-29 | 2019-08-01 | Anuthep Benja-Athon | Abiotic Intelligence-Rendered Pay |
US10820181B2 (en) | 2018-02-09 | 2020-10-27 | Rapidsos, Inc. | Emergency location analysis system |
DK180241B1 (en) | 2018-03-12 | 2020-09-08 | Apple Inc | User interfaces for health monitoring |
US20190320310A1 (en) | 2018-04-16 | 2019-10-17 | Rapidsos, Inc. | Emergency data management and access system |
CN110411460B (en) * | 2018-04-27 | 2021-08-27 | 高树安弗(杭州)科技有限公司 | Method and system for adaptively controlling tracking device |
US10958466B2 (en) | 2018-05-03 | 2021-03-23 | Plantronics, Inc. | Environmental control systems utilizing user monitoring |
US11317833B2 (en) | 2018-05-07 | 2022-05-03 | Apple Inc. | Displaying user interfaces associated with physical activities |
DK179992B1 (en) | 2018-05-07 | 2020-01-14 | Apple Inc. | DISPLAY OF USER INTERFACES ASSOCIATED WITH PHYSICAL ACTIVITIES |
US10688867B2 (en) * | 2018-05-22 | 2020-06-23 | International Business Machines Corporation | Vehicular medical assistant |
US11204251B2 (en) * | 2018-05-25 | 2021-12-21 | The University Of Chicago | Routing in navigation applications based on restorative score |
AU2019277220B2 (en) | 2018-05-29 | 2021-05-27 | Curiouser Products Inc. | A reflective video display apparatus for interactive training and demonstration and methods of using same |
EP3803774A4 (en) | 2018-06-11 | 2022-03-09 | Rapidsos, Inc. | Systems and user interfaces for emergency data integration |
US11740630B2 (en) * | 2018-06-12 | 2023-08-29 | Skydio, Inc. | Fitness and sports applications for an autonomous unmanned aerial vehicle |
JP7044646B2 (en) * | 2018-06-22 | 2022-03-30 | 東芝テック株式会社 | Information processing equipment and programs |
US11567632B2 (en) | 2018-07-03 | 2023-01-31 | Apple Inc. | Systems and methods for exploring a geographic region |
IT201800007296A1 (en) * | 2018-07-19 | 2020-01-19 | GARMENT WITH GRAPHENE CIRCUITS AND SENSORS EQUIPPED WITH AN ELECTRONIC PERSONAL SAFETY AND ENVIRONMENTAL MONITORING SYSTEM | |
US11917514B2 (en) | 2018-08-14 | 2024-02-27 | Rapidsos, Inc. | Systems and methods for intelligently managing multimedia for emergency response |
US10956115B2 (en) * | 2018-08-22 | 2021-03-23 | International Business Machines Corporation | Intelligent exercise music synchronization |
US11034360B2 (en) * | 2018-10-11 | 2021-06-15 | GM Global Technology Operations LLC | Method and apparatus that address motion sickness |
DK3867896T3 (en) * | 2018-10-17 | 2023-08-21 | Sphery Ag | Træningsmodul |
US10977927B2 (en) | 2018-10-24 | 2021-04-13 | Rapidsos, Inc. | Emergency communication flow management and notification system |
CN109599007A (en) * | 2018-10-26 | 2019-04-09 | 深圳点猫科技有限公司 | A kind of implementation method and intellectual education paintbrush of intellectual education paintbrush |
US11619747B2 (en) | 2018-11-04 | 2023-04-04 | Chenyu Wang | Location monitoring apparatuses configured for low-power operation |
JP7351078B2 (en) * | 2018-11-14 | 2023-09-27 | オムロン株式会社 | Habit improvement devices, methods and programs |
US20220277254A1 (en) * | 2018-12-27 | 2022-09-01 | Aptima, Inc. | Contextualized sensor systems |
CN109718530B (en) * | 2018-12-29 | 2021-08-24 | 咪咕互动娱乐有限公司 | Method and device for acquiring motion route, and storage medium |
US11382510B2 (en) * | 2019-02-13 | 2022-07-12 | Sports Data Labs, Inc. | Biological data tracking system and method |
US20210225505A1 (en) * | 2019-02-13 | 2021-07-22 | Sports Data Labs, Inc. | Biological data tracking system and method |
WO2020172612A1 (en) | 2019-02-22 | 2020-08-27 | Rapidsos, Inc. | Systems & methods for automated emergency response |
US11146680B2 (en) | 2019-03-29 | 2021-10-12 | Rapidsos, Inc. | Systems and methods for emergency data integration |
WO2020205033A1 (en) | 2019-03-29 | 2020-10-08 | Rapidsos, Inc. | Systems and methods for emergency data integration |
US11842729B1 (en) * | 2019-05-08 | 2023-12-12 | Apple Inc. | Method and device for presenting a CGR environment based on audio data and lyric data |
KR102371399B1 (en) * | 2019-05-13 | 2022-03-08 | 주식회사 바디프랜드 | A water purification device with a function of outputting a brain sound for user's mental health and brain stimulation and the method thereof |
US11344786B2 (en) * | 2019-05-15 | 2022-05-31 | Peloton Interactive, Inc. | User interface with interactive mapping and segmented timeline |
WO2020243691A1 (en) | 2019-05-31 | 2020-12-03 | Apple Inc. | User interfaces for audio media control |
DK201970533A1 (en) | 2019-05-31 | 2021-02-15 | Apple Inc | Methods and user interfaces for sharing audio |
WO2020242589A1 (en) * | 2019-05-31 | 2020-12-03 | Apple Inc. | Methods and user interfaces for sharing audio |
US10996917B2 (en) | 2019-05-31 | 2021-05-04 | Apple Inc. | User interfaces for audio media control |
DK201970534A1 (en) | 2019-06-01 | 2021-02-16 | Apple Inc | User interfaces for monitoring noise exposure levels |
US11209957B2 (en) | 2019-06-01 | 2021-12-28 | Apple Inc. | User interfaces for cycle tracking |
US11234077B2 (en) | 2019-06-01 | 2022-01-25 | Apple Inc. | User interfaces for managing audio exposure |
US11228835B2 (en) | 2019-06-01 | 2022-01-18 | Apple Inc. | User interfaces for managing audio exposure |
US11152100B2 (en) | 2019-06-01 | 2021-10-19 | Apple Inc. | Health application user interfaces |
US11228891B2 (en) | 2019-07-03 | 2022-01-18 | Rapidsos, Inc. | Systems and methods for emergency medical communications |
US12002588B2 (en) | 2019-07-17 | 2024-06-04 | Apple Inc. | Health event logging and coaching user interfaces |
CN114706505B (en) | 2019-09-09 | 2025-01-28 | 苹果公司 | Research User Interface |
US20220344041A1 (en) * | 2019-09-13 | 2022-10-27 | Sony Group Corporation | Information processing device, information processing method, and program |
CN110660411B (en) * | 2019-09-17 | 2021-11-02 | 北京声智科技有限公司 | Body-building safety prompting method, device, equipment and medium based on voice recognition |
US11013050B2 (en) | 2019-10-01 | 2021-05-18 | Ronald Snagg | Wireless communication headset system |
US11195152B2 (en) * | 2019-10-21 | 2021-12-07 | International Business Machines Corporation | Calendar aware activity planner |
CN112972746A (en) * | 2019-12-18 | 2021-06-18 | 深圳富泰宏精密工业有限公司 | Wearable device and fragrance release control method thereof |
TW202143063A (en) * | 2019-12-31 | 2021-11-16 | 芬蘭商亞瑪芬體育數字服務公司 | Apparatus and method for presenting thematic maps |
CN111121814A (en) * | 2020-01-08 | 2020-05-08 | 百度在线网络技术(北京)有限公司 | Navigation method, navigation device, electronic equipment and computer readable storage medium |
CN111202509B (en) * | 2020-01-17 | 2023-07-04 | 山东中医药大学 | Target heart rate monitoring method and device based on auditory performance strategy |
RU2763923C2 (en) * | 2020-03-23 | 2022-01-11 | Наталия Яковлевна Карасик | Training class of radio communication and telegraph alphabet |
TWI766259B (en) * | 2020-03-27 | 2022-06-01 | 莊龍飛 | Scoring method and system for exercise course and computer program product |
JP6811349B1 (en) * | 2020-03-31 | 2021-01-13 | 株式会社三菱ケミカルホールディングス | Information processing equipment, methods, programs |
CA3176608A1 (en) | 2020-04-30 | 2021-11-04 | Curiouser Products Inc. | Reflective video display apparatus for interactive training and demonstration and methods of using same |
US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
US11796334B2 (en) * | 2020-05-15 | 2023-10-24 | Apple Inc. | User interfaces for providing navigation directions |
DK181037B1 (en) | 2020-06-02 | 2022-10-10 | Apple Inc | User interfaces for health applications |
JP7460801B2 (en) | 2020-06-02 | 2024-04-02 | アップル インコーポレイテッド | User interface for tracking physical activity events |
US11846515B2 (en) | 2020-06-11 | 2023-12-19 | Apple Inc. | User interfaces for customized navigation routes |
KR20220003197A (en) * | 2020-07-01 | 2022-01-10 | 삼성전자주식회사 | Electronic apparatus and method for controlling thereof |
US11698710B2 (en) | 2020-08-31 | 2023-07-11 | Apple Inc. | User interfaces for logging user activities |
US11167172B1 (en) | 2020-09-04 | 2021-11-09 | Curiouser Products Inc. | Video rebroadcasting with multiplexed communications and display via smart mirrors |
US11392291B2 (en) | 2020-09-25 | 2022-07-19 | Apple Inc. | Methods and interfaces for media control with dynamic feedback |
CA3195169A1 (en) * | 2020-10-07 | 2022-04-14 | Pave Routes, Llc | Computerized route building system and method |
US11330664B1 (en) | 2020-12-31 | 2022-05-10 | Rapidsos, Inc. | Apparatus and method for obtaining emergency data and providing a map view |
WO2022158943A1 (en) | 2021-01-25 | 2022-07-28 | 삼성전자 주식회사 | Apparatus and method for processing multichannel audio signal |
US11847378B2 (en) | 2021-06-06 | 2023-12-19 | Apple Inc. | User interfaces for audio routing |
US20220390248A1 (en) | 2021-06-07 | 2022-12-08 | Apple Inc. | User interfaces for maps and navigation |
CN119223321A (en) | 2021-06-07 | 2024-12-31 | 苹果公司 | User interface for maps and navigation |
WO2023001710A1 (en) * | 2021-07-19 | 2023-01-26 | Intelligent Training Group ApS | Music based exercise program |
WO2023112081A1 (en) * | 2021-12-13 | 2023-06-22 | 日本電信電話株式会社 | Learning device, recommendation device, methods therefor, and program |
CN114288632A (en) * | 2021-12-31 | 2022-04-08 | 深圳市大数据研究院 | Interactive running path covering display method |
US20230392949A1 (en) * | 2022-06-03 | 2023-12-07 | Apple Inc. | Route identification and clustering for real-time mapping |
US12194366B2 (en) | 2022-06-05 | 2025-01-14 | Apple Inc. | User interfaces for physical activity information |
US11977729B2 (en) * | 2022-06-05 | 2024-05-07 | Apple Inc. | Physical activity information user interfaces |
KR102486726B1 (en) * | 2022-06-27 | 2023-01-11 | 아주대학교산학협력단 | Method for providing recommended exercise route information, server and system using the same |
JP7368578B1 (en) * | 2022-10-14 | 2023-10-24 | 医療法人社団M-Forest | Respiratory and circulatory system measurement data management system and respiratory and circulatory system measurement data management program |
KR102632475B1 (en) * | 2022-11-15 | 2024-01-31 | 광운대학교 산학협력단 | Application and methods of Sound play according to exercise speed |
US20240408447A1 (en) * | 2023-06-07 | 2024-12-12 | Technogym S.P.A. | System and method for improving the training experience of a user |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060111621A1 (en) * | 2004-11-03 | 2006-05-25 | Andreas Coppi | Musical personal trainer |
US20070219059A1 (en) * | 2006-03-17 | 2007-09-20 | Schwartz Mark H | Method and system for continuous monitoring and training of exercise |
Family Cites Families (325)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4278095A (en) | 1977-09-12 | 1981-07-14 | Lapeyre Pierre A | Exercise monitor system and method |
US4566461A (en) | 1983-02-15 | 1986-01-28 | Michael Lubell | Health fitness monitor |
US4625962A (en) | 1984-10-22 | 1986-12-02 | The Cleveland Clinic Foundation | Upper body exercise apparatus |
US5111818A (en) | 1985-10-08 | 1992-05-12 | Capintec, Inc. | Ambulatory physiological evaluation system including cardiac monitoring |
US4920969A (en) | 1985-10-08 | 1990-05-01 | Capintec, Inc. | Ambulatory physiological evaluation system including cardiac monitoring |
GB2184361B (en) | 1985-12-20 | 1989-10-11 | Ind Tech Res Inst | Automatic treadmill |
US4728100A (en) | 1986-03-13 | 1988-03-01 | Smith Robert S | Exercise pacer |
US5277197A (en) | 1986-12-08 | 1994-01-11 | Physical Health Device, Inc. | Microprocessor controlled system for unsupervised EMG feedback and exercise training |
US5474083A (en) | 1986-12-08 | 1995-12-12 | Empi, Inc. | Lifting monitoring and exercise training system |
US4869497A (en) | 1987-01-20 | 1989-09-26 | Universal Gym Equipment, Inc. | Computer controlled exercise machine |
US5072458A (en) | 1987-05-07 | 1991-12-17 | Capintec, Inc. | Vest for use in an ambulatory physiological evaluation system including cardiac monitoring |
US4916628A (en) | 1988-07-08 | 1990-04-10 | Commonwealth Edison Company | Microprocessor-based control/status monitoring arrangement |
EP0404932A4 (en) | 1989-01-13 | 1993-01-27 | The Scott Fetzer Company | Apparatus and method for controlling and monitoring the exercise session for remotely located patients |
US5410472A (en) | 1989-03-06 | 1995-04-25 | Ergometrx Corporation | Method for conditioning or rehabilitating using a prescribed exercise program |
US5052375A (en) | 1990-02-21 | 1991-10-01 | John G. Stark | Instrumented orthopedic restraining device and method of use |
US5207621A (en) | 1991-02-07 | 1993-05-04 | Integral Products | Stair climbing exercise machine |
IL97526A0 (en) | 1991-03-12 | 1992-06-21 | Tius Elcon Ltd | Exercise monitor |
JPH05220120A (en) | 1992-02-18 | 1993-08-31 | Casio Comput Co Ltd | Kinetic intensity display device |
DE69329680D1 (en) | 1992-07-21 | 2000-12-21 | Hayle Brainpower Pty Ltd | INTERACTIVE EXERCISE MONITORING SYSTEM AND METHOD |
US20010011224A1 (en) | 1995-06-07 | 2001-08-02 | Stephen James Brown | Modular microprocessor-based health monitoring system |
US5400794A (en) * | 1993-03-19 | 1995-03-28 | Gorman; Peter G. | Biomedical response monitor and technique using error correction |
US5454770A (en) | 1993-11-15 | 1995-10-03 | Stevens; Clive G. | Stepper with sensor system |
US5433683A (en) | 1993-11-15 | 1995-07-18 | Stevens; Clive G. | Ski exerciser with sensor system |
US5516334A (en) | 1994-01-28 | 1996-05-14 | Easton; Gregory D. | Interactive exercise monitor |
US5524637A (en) | 1994-06-29 | 1996-06-11 | Erickson; Jon W. | Interactive system for measuring physiological exertion |
US6515593B1 (en) | 1995-02-15 | 2003-02-04 | Izex Technologies, Inc. | Communication system for an instrumented orthopedic restraining device and method therefor |
AUPN127195A0 (en) | 1995-02-21 | 1995-03-16 | Hayle Brainpower Pty Ltd | Adaptive interactive exercise system |
JPH08241496A (en) | 1995-03-06 | 1996-09-17 | Toyota Motor Corp | Schedule setting processing system for vehicle |
US6171218B1 (en) | 1995-06-22 | 2001-01-09 | Michael J. Shea | Exercise apparatus |
US7678023B1 (en) | 1995-06-22 | 2010-03-16 | Shea Michael J | Method for providing mental activity for an exerciser |
US6231527B1 (en) * | 1995-09-29 | 2001-05-15 | Nicholas Sol | Method and apparatus for biomechanical correction of gait and posture |
US6749537B1 (en) | 1995-12-14 | 2004-06-15 | Hickman Paul L | Method and apparatus for remote interactive exercise and health equipment |
US6220865B1 (en) | 1996-01-22 | 2001-04-24 | Vincent J. Macri | Instruction for groups of users interactively controlling groups of images to make idiosyncratic, simulated, physical movements |
US5706822A (en) | 1996-03-29 | 1998-01-13 | Kozz Incorporated | Method and computer program for creating individualized exercise protocols |
US6106297A (en) | 1996-11-12 | 2000-08-22 | Lockheed Martin Corporation | Distributed interactive simulation exercise manager system and method |
US5704067A (en) | 1997-01-31 | 1998-01-06 | Brady; Philip | Exercise organizer sweatband |
US6540707B1 (en) | 1997-03-24 | 2003-04-01 | Izex Technologies, Inc. | Orthoses |
JPH10281790A (en) | 1997-04-08 | 1998-10-23 | Aisin Aw Co Ltd | Route search device, navigation apparatus and medium on which computer program for navigation processing is stored |
US6050924A (en) | 1997-04-28 | 2000-04-18 | Shea; Michael J. | Exercise system |
US7056265B1 (en) | 1997-04-28 | 2006-06-06 | Shea Michael J | Exercise system |
US5857939A (en) | 1997-06-05 | 1999-01-12 | Talking Counter, Inc. | Exercise device with audible electronic monitor |
US6251048B1 (en) | 1997-06-05 | 2001-06-26 | Epm Develoment Systems Corporation | Electronic exercise monitor |
US6582342B2 (en) | 1999-01-12 | 2003-06-24 | Epm Development Systems Corporation | Audible electronic exercise monitor |
US20030171189A1 (en) | 1997-06-05 | 2003-09-11 | Kaufman Arthur H. | Audible electronic exercise monitor |
US7438670B2 (en) | 1997-10-17 | 2008-10-21 | True Fitness Technology, Inc. | Exercise device for side-to-side stepping motion |
IL122597A0 (en) | 1997-12-14 | 1998-06-15 | Pylon Inc | System and method for monitoring activity |
US6013007A (en) * | 1998-03-26 | 2000-01-11 | Liquid Spark, Llc | Athlete's GPS-based performance monitor |
US6032108A (en) | 1998-07-08 | 2000-02-29 | Seiple; Ronald | Sports performance computer system and method |
US6280363B1 (en) | 1999-08-11 | 2001-08-28 | Osborn Medical Corporation | Reciprocating therapeutic exerciser |
US6198431B1 (en) | 1998-08-27 | 2001-03-06 | Maptrek Llc | Compact GPS tracker and customized mapping system |
NZ512532A (en) | 1998-12-23 | 2003-12-19 | G | Combinations for treating cardiovascular diseases like hypercholesterolemia and atherosclerosis |
US6244988B1 (en) | 1999-06-28 | 2001-06-12 | David H. Delman | Interactive exercise system and attachment module for same |
US6447424B1 (en) * | 2000-02-02 | 2002-09-10 | Icon Health & Fitness Inc | System and method for selective adjustment of exercise apparatus |
US6811516B1 (en) | 1999-10-29 | 2004-11-02 | Brian M. Dugan | Methods and apparatus for monitoring and encouraging health and fitness |
US6602191B2 (en) | 1999-12-17 | 2003-08-05 | Q-Tec Systems Llp | Method and apparatus for health and disease management combining patient data monitoring with wireless internet connectivity |
US6601016B1 (en) | 2000-04-28 | 2003-07-29 | International Business Machines Corporation | Monitoring fitness activity across diverse exercise machines utilizing a universally accessible server system |
US6702719B1 (en) | 2000-04-28 | 2004-03-09 | International Business Machines Corporation | Exercise machine |
JP3489817B2 (en) * | 2000-04-28 | 2004-01-26 | 牟田 文夫 | Headphone transceiver |
US6746371B1 (en) | 2000-04-28 | 2004-06-08 | International Business Machines Corporation | Managing fitness activity across diverse exercise machines utilizing a portable computer system |
WO2001084507A1 (en) * | 2000-05-04 | 2001-11-08 | Marco Iori | User recognition system for automatically controlling accesses, apparatuses and the like equipment |
EP1283689A4 (en) * | 2000-05-25 | 2005-03-09 | Healthetech Inc | Weight control method using physical activity based parameters |
US7024369B1 (en) | 2000-05-31 | 2006-04-04 | International Business Machines Corporation | Balancing the comprehensive health of a user |
US6447425B1 (en) | 2000-06-14 | 2002-09-10 | Paracomp, Inc. | Range of motion device |
US6659946B1 (en) | 2000-06-30 | 2003-12-09 | Intel Corporation | Training system |
US20020072932A1 (en) | 2000-12-11 | 2002-06-13 | Bala Swamy | Health personal digital assistant |
US7223215B2 (en) | 2000-12-14 | 2007-05-29 | Bastyr Charles A | Exercise device with true pivot point |
US6561951B2 (en) * | 2000-12-21 | 2003-05-13 | Agere Systems, Inc. | Networked biometrically secured fitness device scheduler |
US8462994B2 (en) * | 2001-01-10 | 2013-06-11 | Random Biometrics, Llc | Methods and systems for providing enhanced security over, while also facilitating access through, secured points of entry |
AU2002255568B8 (en) * | 2001-02-20 | 2014-01-09 | Adidas Ag | Modular personal network systems and methods |
US20020156392A1 (en) | 2001-03-06 | 2002-10-24 | Mitsubishi Chemical Corporation | Method and apparatus for inspecting biological rhythms |
US6542814B2 (en) * | 2001-03-07 | 2003-04-01 | Horizon Navigation, Inc. | Methods and apparatus for dynamic point of interest display |
JP2002263213A (en) * | 2001-03-08 | 2002-09-17 | Combi Corp | Training device operation system and method |
US6672991B2 (en) * | 2001-03-28 | 2004-01-06 | O'malley Sean M. | Guided instructional cardiovascular exercise with accompaniment |
US7699754B2 (en) | 2001-05-24 | 2010-04-20 | Kenneth George Schneider | Complete body fitness machine |
US6605044B2 (en) | 2001-06-28 | 2003-08-12 | Polar Electro Oy | Caloric exercise monitor |
US6882883B2 (en) | 2001-08-31 | 2005-04-19 | Medtronic, Inc. | Implantable medical device (IMD) system configurable to subject a patient to a stress test and to detect myocardial ischemia within the patient |
JP2003102868A (en) | 2001-09-28 | 2003-04-08 | Konami Co Ltd | Exercising support method and apparatus therefor |
US6921351B1 (en) | 2001-10-19 | 2005-07-26 | Cybergym, Inc. | Method and apparatus for remote interactive exercise and health equipment |
JP2003131785A (en) | 2001-10-22 | 2003-05-09 | Toshiba Corp | Interface device, operation control method and program product |
US20030088196A1 (en) * | 2001-11-02 | 2003-05-08 | Epm Development Systems Corporation | Customized physiological monitor |
JP2003163959A (en) | 2001-11-28 | 2003-06-06 | Nec Corp | Mobile radio communication equipment and mobile radio communication system |
US6901330B1 (en) * | 2001-12-21 | 2005-05-31 | Garmin Ltd. | Navigation system, method and device with voice guidance |
US6997882B1 (en) | 2001-12-21 | 2006-02-14 | Barron Associates, Inc. | 6-DOF subject-monitoring device and method |
US6793607B2 (en) | 2002-01-22 | 2004-09-21 | Kinetic Sports Interactive | Workout assistant |
US6873905B2 (en) * | 2002-03-19 | 2005-03-29 | Opnext Japan, Inc. | Communications type navigation device |
US20030211916A1 (en) * | 2002-04-23 | 2003-11-13 | Capuano Patrick J. | Exercise parameters monitoring, recording and reporting system for free weight, weight stack, and sport-simulation exercise machines |
US20040006425A1 (en) * | 2002-07-03 | 2004-01-08 | Terragraphix, Inc. | System for communicating and associating information with a geographic location |
FI20025038A0 (en) | 2002-08-16 | 2002-08-16 | Joni Kettunen | Method for analyzing a physiological signal |
US7480512B2 (en) * | 2004-01-16 | 2009-01-20 | Bones In Motion, Inc. | Wireless device, program products and methods of using a wireless device to deliver services |
US7521623B2 (en) * | 2004-11-24 | 2009-04-21 | Apple Inc. | Music synchronization arrangement |
US7507183B2 (en) | 2003-04-07 | 2009-03-24 | Brent Anderson | Health club exercise records system |
US7699752B1 (en) | 2003-04-07 | 2010-04-20 | Brent Anderson | Exercise activity recording system |
US20130090565A1 (en) | 2003-04-18 | 2013-04-11 | Q-Tec Systems Llc | Method and apparatus for monitoring exercise with wireless internet connectivity |
FI118745B (en) | 2003-07-09 | 2008-02-29 | Newtest Oy | Automatic exercise detection method and exercise detector |
US8128532B2 (en) * | 2003-07-10 | 2012-03-06 | International Business Machines Corporation | Workout processing system |
JP2007507256A (en) | 2003-09-29 | 2007-03-29 | アクレス,ジョーン,エフ. | System and exercise network to coordinate exercise |
JP4503262B2 (en) * | 2003-10-10 | 2010-07-14 | 株式会社デンソー | Physical condition management device |
US20060252602A1 (en) * | 2003-10-14 | 2006-11-09 | Brown Michael W | Program and system for managing fitness activity across diverse exercise machines utilizing a portable computer system |
US7505756B2 (en) * | 2003-10-15 | 2009-03-17 | Microsoft Corporation | Dynamic online subscription for wireless wide-area networks |
US20060111944A1 (en) | 2003-10-31 | 2006-05-25 | Sirmans James R Jr | System and method for encouraging performance of health-promoting measures |
WO2005044090A2 (en) | 2003-11-04 | 2005-05-19 | General Hospital Corporation | Respiration motion detection and health state assessment system |
US7664292B2 (en) | 2003-12-03 | 2010-02-16 | Safehouse International, Inc. | Monitoring an output from a camera |
US8712510B2 (en) | 2004-02-06 | 2014-04-29 | Q-Tec Systems Llc | Method and apparatus for exercise monitoring combining exercise monitoring and visual data with wireless internet connectivity |
FI119718B (en) | 2003-12-22 | 2009-02-27 | Suunto Oy | A method of measuring exercise performance |
JP4126500B2 (en) * | 2004-10-08 | 2008-07-30 | カシオ計算機株式会社 | Ear-mounted electronic devices |
JP2005224318A (en) | 2004-02-10 | 2005-08-25 | Rikogaku Shinkokai | Pacemaker |
US7914381B2 (en) | 2004-03-16 | 2011-03-29 | Xfire, Inc. | System and method for facilitating multiplayer online gaming |
EP1737543B1 (en) | 2004-04-09 | 2009-12-09 | O'Brien, Conor | Exercise monitor |
US7057551B1 (en) | 2004-04-27 | 2006-06-06 | Garmin Ltd. | Electronic exercise monitor and method using a location determining component and a pedometer |
US20050272561A1 (en) | 2004-06-07 | 2005-12-08 | Cammerata Gregory T | Electronic data gathering and processing for fitness machines |
US20060020216A1 (en) | 2004-07-20 | 2006-01-26 | Sharp Kabushiki Kaisha | Medical information detection apparatus and health management system using the medical information detection apparatus |
US8109858B2 (en) | 2004-07-28 | 2012-02-07 | William G Redmann | Device and method for exercise prescription, detection of successful performance, and provision of reward therefore |
US20060058156A1 (en) | 2004-09-15 | 2006-03-16 | International Business Machines Corporation | Systems, methods, and computer readable media for determining a circuit training path in a smart gym |
US7227468B1 (en) | 2004-09-30 | 2007-06-05 | Erik David Florio | Object information retrieval system |
CN101057291B (en) * | 2004-11-12 | 2012-05-09 | 皇家飞利浦电子股份有限公司 | Apparatus and method for sharing content via headphone device |
US20060113381A1 (en) * | 2004-11-29 | 2006-06-01 | John Hochstein | Batteryless contact fingerprint-enabled smartcard that enables contactless capability |
US7370763B1 (en) | 2004-12-08 | 2008-05-13 | Pascucci Cheryl L | Health management kit |
US7254516B2 (en) * | 2004-12-17 | 2007-08-07 | Nike, Inc. | Multi-sensor monitoring of athletic performance |
WO2006072961A2 (en) * | 2005-01-10 | 2006-07-13 | Eyepoint Ltd. | Musical pacemaker for physical workout |
US20070266065A1 (en) | 2006-05-12 | 2007-11-15 | Outland Research, Llc | System, Method and Computer Program Product for Intelligent Groupwise Media Selection |
US7638252B2 (en) | 2005-01-28 | 2009-12-29 | Hewlett-Packard Development Company, L.P. | Electrophotographic printing of electronic devices |
US9165280B2 (en) * | 2005-02-22 | 2015-10-20 | International Business Machines Corporation | Predictive user modeling in user interface design |
US7627423B2 (en) * | 2005-03-10 | 2009-12-01 | Wright Ventures, Llc | Route based on distance |
JP2006297069A (en) * | 2005-03-24 | 2006-11-02 | Jun Kawahara | Walking guidance device, walking guidance method, program and recording medium |
SE530842C2 (en) | 2005-04-05 | 2008-09-23 | Yoyo Technology Ab | Procedure for muscle training and implements for this |
US20060240959A1 (en) | 2005-04-22 | 2006-10-26 | Hsien-Ting Huang | Dumbbell that can respond to exercise status and play music |
US20070042868A1 (en) | 2005-05-11 | 2007-02-22 | John Fisher | Cardio-fitness station with virtual-reality capability |
US8244179B2 (en) * | 2005-05-12 | 2012-08-14 | Robin Dua | Wireless inter-device data processing configured through inter-device transmitted data |
TWI262782B (en) | 2005-06-07 | 2006-10-01 | Nat Applied Res Lab Nat Ce | Method for exercise tolerance measurement |
US20060288846A1 (en) | 2005-06-27 | 2006-12-28 | Logan Beth T | Music-based exercise motivation aid |
US8740751B2 (en) * | 2005-07-25 | 2014-06-03 | Nike, Inc. | Interfaces and systems for displaying athletic performance information on electronic devices |
WO2007017739A2 (en) | 2005-08-08 | 2007-02-15 | Dayton Technologies Limited | Performance monitoring apparatus |
US20070032345A1 (en) | 2005-08-08 | 2007-02-08 | Ramanath Padmanabhan | Methods and apparatus for monitoring quality of service for an exercise machine communication network |
JP2007075172A (en) * | 2005-09-12 | 2007-03-29 | Sony Corp | Sound output control device, method and program |
US7633076B2 (en) | 2005-09-30 | 2009-12-15 | Apple Inc. | Automated response to and sensing of user activity in portable devices |
US20130228063A1 (en) * | 2005-10-06 | 2013-09-05 | William D. Turner | System and method for pacing repetitive motion activities |
US20070083095A1 (en) | 2005-10-07 | 2007-04-12 | Rippo Anthony J | External exercise monitor |
US20070083092A1 (en) | 2005-10-07 | 2007-04-12 | Rippo Anthony J | External exercise monitor |
US8360935B2 (en) | 2005-10-12 | 2013-01-29 | Sensyact Ab | Method, a computer program, and device for controlling a movable resistance element in a training device |
US7351187B2 (en) | 2005-10-22 | 2008-04-01 | Joseph Seliber | Resistance and power monitoring device and system for exercise equipment |
US7728214B2 (en) | 2005-11-23 | 2010-06-01 | Microsoft Corporation | Using music to influence a person's exercise performance |
US7683252B2 (en) | 2005-11-23 | 2010-03-23 | Microsoft Corporation | Algorithm for providing music to influence a user's exercise performance |
US8390456B2 (en) * | 2008-12-03 | 2013-03-05 | Tego Inc. | RFID tag facility with access to external devices |
US8333874B2 (en) | 2005-12-09 | 2012-12-18 | Flexible Medical Systems, Llc | Flexible apparatus and method for monitoring and delivery |
US20070146116A1 (en) * | 2005-12-22 | 2007-06-28 | Sony Ericsson Mobile Communications Ab | Wireless communications device with integrated user activity module |
GB2434461A (en) | 2006-01-24 | 2007-07-25 | Hawkgrove Ltd | System for monitoring the performance of the components of a software system by detecting the messages between the components and decoding them |
US8055469B2 (en) * | 2006-03-03 | 2011-11-08 | Garmin Switzerland Gmbh | Method and apparatus for determining the attachment position of a motion sensing apparatus |
KR20080105160A (en) * | 2006-03-15 | 2008-12-03 | Qualcomm Incorporated | Method and apparatus for determining relevant point of interest information based on a user's route |
JP2007250053A (en) * | 2006-03-15 | 2007-09-27 | Sony Corp | Contents reproducing device and contents reproducing method |
JP2007267818A (en) * | 2006-03-30 | 2007-10-18 | Duskin Healthcare:Kk | Aerobics exercise maintenance apparatus |
KR100807736B1 (en) | 2006-04-21 | 2008-02-28 | 삼성전자주식회사 | Exercise assist device for instructing exercise pace in association with music and method |
US20070249468A1 (en) | 2006-04-24 | 2007-10-25 | Min-Chang Chen | System for monitoring exercise performance |
US8684922B2 (en) | 2006-05-12 | 2014-04-01 | Bao Tran | Health monitoring system |
JP4231876B2 (en) | 2006-05-18 | 2009-03-04 | Konami Sports & Life Co., Ltd. | Training system, operation terminal, and computer-readable recording medium recording training support program |
US7643895B2 (en) * | 2006-05-22 | 2010-01-05 | Apple Inc. | Portable media device with workout support |
JP2007322172A (en) * | 2006-05-30 | 2007-12-13 | Nissan Motor Co Ltd | Bypass proposal system and method |
US20070300185A1 (en) | 2006-06-27 | 2007-12-27 | Microsoft Corporation | Activity-centric adaptive user interface |
WO2008003830A1 (en) * | 2006-07-04 | 2008-01-10 | Firstbeat Technologies Oy | Method and system for guiding a person in physical exercise |
EP2037798B1 (en) | 2006-07-10 | 2012-10-31 | Accenture Global Services Limited | Mobile personal services platform for providing feedback |
JP4749273B2 (en) * | 2006-08-10 | 2011-08-17 | 三洋電機株式会社 | Electric bicycle |
JP4305671B2 (en) | 2006-08-22 | 2009-07-29 | ソニー株式会社 | HEALTH EXERCISE SUPPORT SYSTEM, PORTABLE MUSIC REPRODUCTION DEVICE, SERVICE INFORMATION PROVIDING DEVICE, INFORMATION PROCESSING DEVICE, HEALTH EXERCISE SUPPORT METHOD |
CN1912862A (en) * | 2006-08-25 | 2007-02-14 | 中山大学 | Device and method for dynamic playing music and video according to body physiological state |
US8360785B2 (en) | 2006-09-29 | 2013-01-29 | Electronics And Telecommunications Research Institute | System for managing physical training and method thereof |
TW200820225A (en) | 2006-10-25 | 2008-05-01 | Taiwan Chest Disease Ass | Home-based exercise training method and system guided by automatic assessment and selection of music |
US20080103022A1 (en) | 2006-10-31 | 2008-05-01 | Motorola, Inc. | Method and system for dynamic music tempo tracking based on exercise equipment pace |
US20080147502A1 (en) | 2006-11-06 | 2008-06-19 | Baker Steve G | Exercise incenting methods and devices |
US20080110115A1 (en) * | 2006-11-13 | 2008-05-15 | French Barry J | Exercise facility and method |
US7586418B2 (en) | 2006-11-17 | 2009-09-08 | General Electric Company | Multifunctional personal emergency response system |
WO2008069966A2 (en) * | 2006-12-01 | 2008-06-12 | Fitistics, Llc | System and method for processing information |
US20080176713A1 (en) | 2006-12-05 | 2008-07-24 | Pablo Olivera Brizzio | Method and apparatus for selecting a condition of a fitness machine in relation to a user |
US8652040B2 (en) | 2006-12-19 | 2014-02-18 | Valencell, Inc. | Telemetric apparatus for health and environmental monitoring |
US8157730B2 (en) | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
US20080162186A1 (en) | 2006-12-28 | 2008-07-03 | Michael Jones | System and method for diet and exercise |
US7840031B2 (en) | 2007-01-12 | 2010-11-23 | International Business Machines Corporation | Tracking a range of body movement based on 3D captured image streams of a user |
US7841966B2 (en) | 2007-01-29 | 2010-11-30 | At&T Intellectual Property I, L.P. | Methods, systems, and products for monitoring athletic performance |
CN101244310B (en) * | 2007-02-15 | 2010-12-01 | 李隆 | Music interference electric therapeutic equipment |
US20080204225A1 (en) | 2007-02-22 | 2008-08-28 | David Kitchen | System for measuring and analyzing human movement |
EP2135109A1 (en) * | 2007-03-01 | 2009-12-23 | Telefonaktiebolaget LM Ericsson (publ) | Mobile service for keeping track of competitors during a race |
US7931563B2 (en) | 2007-03-08 | 2011-04-26 | Health Hero Network, Inc. | Virtual trainer system and method |
JP4941037B2 (en) * | 2007-03-22 | 2012-05-30 | ヤマハ株式会社 | Training support apparatus, training support method, and program for training support apparatus |
JP4697165B2 (en) | 2007-03-27 | 2011-06-08 | ヤマハ株式会社 | Music playback control device |
JP4311467B2 (en) * | 2007-03-28 | 2009-08-12 | ヤマハ株式会社 | Performance apparatus and program for realizing the control method |
US7987046B1 (en) * | 2007-04-04 | 2011-07-26 | Garmin Switzerland Gmbh | Navigation device with improved user interface and mounting features |
US20080262918A1 (en) | 2007-04-19 | 2008-10-23 | Jay Wiener | Exercise recommendation engine and internet business model |
US8709709B2 (en) | 2007-05-18 | 2014-04-29 | Luoxis Diagnostics, Inc. | Measurement and uses of oxidative status |
US7970532B2 (en) * | 2007-05-24 | 2011-06-28 | Honeywell International Inc. | Flight path planning to reduce detection of an unmanned aerial vehicle |
US8199014B1 (en) | 2007-06-29 | 2012-06-12 | Sony Ericsson Mobile Communications Ab | System, device and method for keeping track of portable items by means of a mobile electronic device |
JP5221074B2 (en) * | 2007-08-07 | 2013-06-26 | 株式会社ゼンリンデータコム | GUIDANCE INFORMATION GENERATION DEVICE, GUIDE INFORMATION GENERATION METHOD, AND COMPUTER PROGRAM |
EP2175941B1 (en) | 2007-08-08 | 2012-05-30 | Koninklijke Philips Electronics N.V. | Process and system for monitoring exercise motions of a person |
US20090044687A1 (en) * | 2007-08-13 | 2009-02-19 | Kevin Sorber | System for integrating music with an exercise regimen |
US8221290B2 (en) * | 2007-08-17 | 2012-07-17 | Adidas International Marketing B.V. | Sports electronic training system with electronic gaming features, and applications thereof |
JP2009050368A (en) * | 2007-08-24 | 2009-03-12 | Masashi Hosoya | Swimming distance measuring apparatus |
US7996080B1 (en) | 2007-10-16 | 2011-08-09 | Customkynetics, Inc. | Recumbent stepping exercise device with stimulation and related methods |
ITBO20070701A1 (en) | 2007-10-19 | 2009-04-20 | Technogym Spa | DEVICE FOR ANALYSIS AND MONITORING OF THE PHYSICAL ACTIVITY OF A USER. |
KR101512814B1 (en) * | 2007-11-09 | 2015-04-16 | 구글 인코포레이티드 | Activating applications based on accelerometer data |
US7979136B2 (en) | 2007-12-07 | 2011-07-12 | Roche Diagnostics Operations, Inc. | Method and system for multi-device communication |
US8103241B2 (en) | 2007-12-07 | 2012-01-24 | Roche Diagnostics Operations, Inc. | Method and system for wireless device communication |
JP2009142333A (en) | 2007-12-11 | 2009-07-02 | Sharp Corp | Exercise supporting device, exercise supporting method, exercise supporting system, exercise supporting control program and recording medium |
WO2009075493A2 (en) * | 2007-12-12 | 2009-06-18 | Kyong Am An | Upper/lower body exercise machine usable in state of being held by user's hands or feet |
US8123660B2 (en) * | 2007-12-28 | 2012-02-28 | Immersion Corporation | Method and apparatus for providing communications with haptic cues |
US8125314B2 (en) | 2008-02-05 | 2012-02-28 | International Business Machines Corporation | Distinguishing between user physical exertion biometric feedback and user emotional interest in a media stream |
JP4769827B2 (en) | 2008-02-15 | 2011-09-07 | 富士通株式会社 | RFID tag |
JP2009206600A (en) * | 2008-02-26 | 2009-09-10 | Sharp Corp | Portable equipment, processing determination method and program |
CN104208860B (en) | 2008-02-27 | 2017-09-05 | 耐克创新有限合伙公司 | Movable information system and method |
US8047966B2 (en) | 2008-02-29 | 2011-11-01 | Apple Inc. | Interfacing portable media devices and sports equipment |
US7951046B1 (en) | 2008-03-17 | 2011-05-31 | Barber Jr Ulysses | Device, method and computer program product for tracking and monitoring an exercise regimen |
US8182424B2 (en) | 2008-03-19 | 2012-05-22 | Microsoft Corporation | Diary-free calorimeter |
WO2009120604A2 (en) * | 2008-03-26 | 2009-10-01 | Frumer John D | System and method for configuring fitness equipment |
US20090247368A1 (en) | 2008-03-31 | 2009-10-01 | Boson Technology Co., Ltd. | Sports health care apparatus with physiological monitoring function |
FI20085398A0 (en) | 2008-04-30 | 2008-04-30 | Polar Electro Oy | Method and apparatus in conjunction with training |
US8254829B1 (en) * | 2008-05-09 | 2012-08-28 | Sprint Communications Company L.P. | Network media service with track delivery adapted to a user cadence |
US20090287103A1 (en) | 2008-05-14 | 2009-11-19 | Pacesetter, Inc. | Systems and methods for monitoring patient activity and/or exercise and displaying information about the same |
US20090292178A1 (en) | 2008-05-21 | 2009-11-26 | Qualcomm Incorporated | System and method of monitoring users during an interactive activity |
US8401468B2 (en) | 2008-05-28 | 2013-03-19 | Sharp Laboratories Of America, Inc. | Method and system for facilitating scheduling using a mobile device |
JP5332313B2 (en) * | 2008-05-29 | 2013-11-06 | 富士通株式会社 | Mobile terminal and stride calculation method |
WO2009147279A1 (en) | 2008-06-02 | 2009-12-10 | Polar Electro Oy | Method and apparatus in connection with exercise |
US7617615B1 (en) | 2008-06-03 | 2009-11-17 | Jonathan Martorell | Belt or band-like exercise result measurement article with selectable display aspect |
CN101428163B (en) * | 2008-06-17 | 2010-10-20 | 李隆 | Adjusting apparatus for states of examination, athletics and sports |
US8996332B2 (en) * | 2008-06-24 | 2015-03-31 | Dp Technologies, Inc. | Program setting adjustments based on activity identification |
KR101054287B1 (en) * | 2008-07-03 | 2011-08-08 | 삼성전자주식회사 | Method for providing location information based service in mobile terminal and mobile terminal implementing same |
US8021270B2 (en) | 2008-07-03 | 2011-09-20 | D Eredita Michael | Online sporting system |
US20100035726A1 (en) | 2008-08-07 | 2010-02-11 | John Fisher | Cardio-fitness station with virtual-reality capability |
US20100190607A1 (en) | 2008-08-22 | 2010-07-29 | Thinkfit, Llc | Exercise device integrally incorporating digital capabilities for music, light, video and still imagery, heart rate measurement and caloric consumption |
US20110195819A1 (en) | 2008-08-22 | 2011-08-11 | James Shaw | Adaptive exercise equipment apparatus and method of use thereof |
US20110165996A1 (en) | 2008-08-22 | 2011-07-07 | David Paulus | Computer controlled exercise equipment apparatus and method of use thereof |
US9409052B2 (en) * | 2008-10-03 | 2016-08-09 | Adidas Ag | Program products, methods, and systems for providing location-aware fitness monitoring services |
JP2010088886A (en) | 2008-10-03 | 2010-04-22 | Adidas Ag | Program products, methods, and systems for providing location-aware fitness monitoring services |
US20100167876A1 (en) | 2008-12-29 | 2010-07-01 | Tzu Chi University | Radio frequency identification based exercise behavior management system |
EP2526503B1 (en) | 2009-01-20 | 2019-05-08 | GemCar Inc. | Personal portable secured network access system |
US8467860B2 (en) | 2009-01-20 | 2013-06-18 | Alexandria Salazar | Portable system and method for monitoring of a heart and other body functions |
US20100191454A1 (en) * | 2009-01-29 | 2010-07-29 | Sony Corporation | Location based personal organizer |
JP2010192012A (en) * | 2009-02-16 | 2010-09-02 | Fujifilm Corp | Portable music reproducing device |
US8062182B2 (en) | 2009-02-24 | 2011-11-22 | Tuffstuff Fitness Equipment, Inc. | Exercise monitoring system |
WO2010098912A2 (en) | 2009-02-25 | 2010-09-02 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
GB0903601D0 (en) | 2009-03-03 | 2009-04-08 | Bigger Than The Wheel Ltd | Automated weightlifting spotting machine |
JP5310194B2 (en) * | 2009-03-31 | 2013-10-09 | サクサ株式会社 | Terminal location system and method |
FI20095386A0 (en) | 2009-04-08 | 2009-04-08 | Polar Electro Oy | Portable device |
CN102449560B (en) * | 2009-04-26 | 2016-12-21 | 耐克创新有限合伙公司 | Sports watch |
CA2760285C (en) * | 2009-04-27 | 2017-08-22 | Nike International Ltd. | Training program and music playlist generation for athletic training |
US8033959B2 (en) * | 2009-05-18 | 2011-10-11 | Adidas Ag | Portable fitness monitoring systems, and applications thereof |
US8438256B2 (en) * | 2009-06-26 | 2013-05-07 | Vmware, Inc. | Migrating functionality in virtualized mobile devices |
DE102009027365A1 (en) * | 2009-07-01 | 2011-01-05 | Robert Bosch Gmbh | Motion sensor and system for detecting a movement profile |
CN101954171A (en) * | 2009-07-16 | 2011-01-26 | 英业达股份有限公司 | Real-time adjustment body-building program system and method thereof |
US8622873B2 (en) | 2009-07-27 | 2014-01-07 | Rhoderick Euan MCGOWN | Exercise equipment usage monitoring method and apparatus |
CA2773206C (en) | 2009-09-04 | 2014-07-22 | Nike International Ltd. | Monitoring and tracking athletic activity |
US20110066042A1 (en) | 2009-09-15 | 2011-03-17 | Texas Instruments Incorporated | Estimation of blood flow and hemodynamic parameters from a single chest-worn sensor, and other circuits, devices and processes |
US11232671B1 (en) | 2009-09-30 | 2022-01-25 | Zynga Inc. | Socially-based dynamic rewards in multiuser online games |
WO2011041678A1 (en) | 2009-10-02 | 2011-04-07 | Hinds Robert S | Exercise devices with force sensors |
US8902050B2 (en) * | 2009-10-29 | 2014-12-02 | Immersion Corporation | Systems and methods for haptic augmentation of voice-to-text conversion |
FI20096168A0 (en) * | 2009-11-10 | 2009-11-10 | Polar Electro Oy | Many user profiles on a portable device |
FI122770B (en) | 2009-11-11 | 2012-06-29 | Adfore Technologies Oy | A mobile device controlled by context awareness |
EP2504786A2 (en) | 2009-11-25 | 2012-10-03 | The Board of Governors for Higher Education, State of Rhode Island and Providence Plantations | Systems and methods for providing an activity monitor and analyzer with voice direction for exercise |
US8406085B2 (en) | 2009-12-21 | 2013-03-26 | Masami Sakita | Swim device |
US20110152696A1 (en) * | 2009-12-22 | 2011-06-23 | Hall Ryan Laboratories, Inc. | Audible biofeedback heart rate monitor with virtual coach |
US20110165998A1 (en) | 2010-01-07 | 2011-07-07 | Perception Digital Limited | Method For Monitoring Exercise, And Apparatus And System Thereof |
US20110179068A1 (en) | 2010-01-21 | 2011-07-21 | O'brien John Patrick | Computer implemented process for creating an overall health wellness database for a plurality of patients |
CA2731025C (en) | 2010-02-05 | 2014-10-07 | Fletcher Lu | Mobile social fitness networked game |
US20110275042A1 (en) | 2010-02-22 | 2011-11-10 | Warman David J | Human-motion-training system |
US8670709B2 (en) * | 2010-02-26 | 2014-03-11 | Blackberry Limited | Near-field communication (NFC) system providing mobile wireless communications device operations based upon timing and sequence of NFC sensor communication and related methods |
US8521316B2 (en) * | 2010-03-31 | 2013-08-27 | Apple Inc. | Coordinated group musical experience |
US8893022B2 (en) | 2010-04-01 | 2014-11-18 | Microsoft Corporation | Interactive and shared viewing experience |
US20110261079A1 (en) | 2010-04-21 | 2011-10-27 | Apple Inc. | Automatic adjustment of a user interface composition |
US20110288381A1 (en) | 2010-05-24 | 2011-11-24 | Jesse Bartholomew | System And Apparatus For Correlating Heart Rate To Exercise Parameters |
JP2012019811A (en) * | 2010-07-12 | 2012-02-02 | Rohm Co Ltd | Biological data measuring device |
FI20105796A0 (en) | 2010-07-12 | 2010-07-12 | Polar Electro Oy | Analysis of a physiological condition for a cardio exercise |
US9392941B2 (en) * | 2010-07-14 | 2016-07-19 | Adidas Ag | Fitness monitoring methods, systems, and program products, and applications thereof |
US9532734B2 (en) | 2010-08-09 | 2017-01-03 | Nike, Inc. | Monitoring fitness using a mobile device |
US8983785B2 (en) * | 2010-08-18 | 2015-03-17 | Snap-On Incorporated | System and method for simultaneous display of waveforms generated from input signals received at a data acquisition device |
CN101934111A (en) * | 2010-09-10 | 2011-01-05 | 李隆 | Computer-based physical and mental health system using music and chromatic-light physical factors |
US9167991B2 (en) * | 2010-09-30 | 2015-10-27 | Fitbit, Inc. | Portable monitoring devices and methods of operating same |
US9089733B2 (en) | 2010-10-21 | 2015-07-28 | Benaaron, Llc | Systems and methods for exercise in an interactive virtual environment |
US9223936B2 (en) * | 2010-11-24 | 2015-12-29 | Nike, Inc. | Fatigue indices and uses thereof |
JP2012108801A (en) * | 2010-11-18 | 2012-06-07 | Toshiba Tec Corp | Portable information terminal device and control program |
US9541411B2 (en) * | 2010-11-20 | 2017-01-10 | Telenav, Inc. | Navigation system with destination travel category extraction measurement capture mechanism and method of operation thereof |
WO2012070019A2 (en) | 2010-11-23 | 2012-05-31 | Michael Franklin M | System and method for authentication, usage, monitoring and management within a health facility |
US20120142429A1 (en) | 2010-12-03 | 2012-06-07 | Muller Marcus S | Collaborative electronic game play employing player classification and aggregation |
WO2012083177A1 (en) * | 2010-12-16 | 2012-06-21 | Nike International Ltd. | Methods and systems for encouraging athletic activity |
US8655345B2 (en) | 2011-01-08 | 2014-02-18 | Steven K. Gold | Proximity-enabled remote control |
US20120184871A1 (en) | 2011-01-14 | 2012-07-19 | Seungjin Jang | Exercise monitor and method for monitoring exercise |
US20120190502A1 (en) | 2011-01-21 | 2012-07-26 | David Paulus | Adaptive exercise profile apparatus and method of use thereof |
JP5786361B2 (en) * | 2011-02-22 | 2015-09-30 | ヤマハ株式会社 | Notification signal control device |
CN102654911A (en) * | 2011-03-04 | 2012-09-05 | 北京网秦天下科技有限公司 | Method and system for schedule management |
JP2012189415A (en) | 2011-03-10 | 2012-10-04 | Clarion Co Ltd | Portable navigation device |
US20130090213A1 (en) | 2011-03-25 | 2013-04-11 | Regents Of The University Of California | Exercise-Based Entertainment And Game Controller To Improve Health And Manage Obesity |
JP5724602B2 (en) * | 2011-05-10 | 2015-05-27 | オンキヨー株式会社 | Receiver |
CN102198301B (en) * | 2011-05-20 | 2012-12-12 | 哈尔滨工业大学 | Music playing system based on body feature monitoring |
US8730930B2 (en) * | 2011-05-31 | 2014-05-20 | Broadcom Corporation | Polling using B-ACK for occasional back-channel traffic in VoWIFI applications |
US20120308192A1 (en) | 2011-05-31 | 2012-12-06 | United Video Properties, Inc. | Systems and methods for selecting videos for display to a player based on a duration of using exercise equipment |
US20140089672A1 (en) * | 2012-09-25 | 2014-03-27 | Aliphcom | Wearable device and method to generate biometric identifier for authentication using near-field communications |
AU2012274931A1 (en) | 2011-06-20 | 2014-01-30 | Healthwatch Ltd. | Independent non-interfering wearable health monitoring and alert system |
US8821351B2 (en) * | 2011-08-02 | 2014-09-02 | International Business Machines Corporation | Routine-based management of exercise equipment access |
US9367860B2 (en) | 2011-08-05 | 2016-06-14 | Sean McKirdy | Barcode generation and implementation method and system for processing information |
US8816814B2 (en) | 2011-08-16 | 2014-08-26 | Elwha Llc | Systematic distillation of status data responsive to whether or not a wireless signal has been received and relating to regimen compliance |
US9819710B2 (en) * | 2011-08-25 | 2017-11-14 | Logitech Europe S.A. | Easy sharing of wireless audio signals |
JP5892305B2 (en) * | 2011-08-26 | 2016-03-23 | セイコーエプソン株式会社 | Activity amount measuring device, activity amount measuring system, program and recording medium |
CN202342650U (en) * | 2011-09-15 | 2012-07-25 | 王彤 | Synchronous-synthesis device of heart-rate audio and music |
US20130080120A1 (en) * | 2011-09-23 | 2013-03-28 | Honeywell International Inc. | Method for Optimal and Efficient Guard Tour Configuration Utilizing Building Information Model and Adjacency Information |
CN103918344A (en) | 2011-10-13 | 2014-07-09 | 英特尔公司 | Detection of user activities by portable device |
JP5418567B2 (en) * | 2011-10-14 | 2014-02-19 | オンキヨー株式会社 | Receiver |
EP2608090B1 (en) | 2011-11-01 | 2019-03-13 | Polar Electro Oy | Performance intensity zones |
US20130155251A1 (en) | 2011-12-16 | 2013-06-20 | Oren Moravchik | Monitoring system accommodating multiple imagers |
US20130178960A1 (en) | 2012-01-10 | 2013-07-11 | University Of Washington Through Its Center For Commercialization | Systems and methods for remote monitoring of exercise performance metrics |
US20130196821A1 (en) * | 2012-01-31 | 2013-08-01 | Icon Health & Fitness, Inc. | Systems and Methods to Generate a Customized Workout Routine |
US20130218309A1 (en) | 2012-02-03 | 2013-08-22 | Frank Napolitano | Apparatus, system and method for improving user fitness by tracking activity time |
JP2013169611A (en) * | 2012-02-20 | 2013-09-02 | Vstone Kk | Robot system, and robot |
US8947239B1 (en) * | 2012-03-05 | 2015-02-03 | Fitbit, Inc. | Near field communication system, and method of operating same |
US9305141B2 (en) * | 2012-03-13 | 2016-04-05 | Technogym S.P.A. | Method, system and program product for identifying a user on an exercise equipment |
US20130268101A1 (en) * | 2012-04-09 | 2013-10-10 | Icon Health & Fitness, Inc. | Exercise Device Audio Cue System |
WO2013163090A1 (en) * | 2012-04-23 | 2013-10-31 | Sackett Solutions & Innovations, LLC | Cognitive biometric systems to monitor emotions and stress |
US8655591B2 (en) * | 2012-05-09 | 2014-02-18 | Mitac International Corp. | Method of creating varied exercise routes for a user |
JP2013238709A (en) | 2012-05-15 | 2013-11-28 | Sony Corp | Optical laminated body, optical element, and projection device |
US9183822B2 (en) * | 2012-05-23 | 2015-11-10 | Google Inc. | Music selection and adaptation for exercising |
US9222787B2 (en) * | 2012-06-05 | 2015-12-29 | Apple Inc. | System and method for acquiring map portions based on expected signal strength of route segments |
US9005129B2 (en) * | 2012-06-22 | 2015-04-14 | Fitbit, Inc. | Wearable heart rate monitor |
US8854207B2 (en) | 2012-07-02 | 2014-10-07 | Donald S. Williams | Mobile lock with retractable cable |
US9909875B2 (en) * | 2012-09-11 | 2018-03-06 | Nokia Technologies Oy | Method and apparatus for providing alternate route recommendations |
US8744605B2 (en) * | 2012-09-14 | 2014-06-03 | Cycling & Health Tech Industry R & D Center | Handheld device workout coach system |
WO2014055939A1 (en) * | 2012-10-04 | 2014-04-10 | Huawei Technologies Co., Ltd. | User behavior modeling for intelligent mobile companions |
US10866100B2 (en) * | 2012-10-15 | 2020-12-15 | Kamino Labs, Inc. | Method of providing urban hiking trails |
CN202802459U (en) * | 2012-10-25 | 2013-03-20 | 黑龙江工程学院 | Musical device used for psychological regulation |
US9381399B2 (en) | 2013-03-04 | 2016-07-05 | Cellco Partnership | Exercise recordation method and system |
US9510193B2 (en) * | 2013-03-15 | 2016-11-29 | Qualcomm Incorporated | Wireless networking-enabled personal identification system |
US20140316701A1 (en) * | 2013-04-18 | 2014-10-23 | International Business Machines Corporation | Control system for indicating if people can reach locations that satisfy a predetermined set of conditions and requirements |
US9454913B2 (en) * | 2013-06-17 | 2016-09-27 | Polar Electro Oy | Simulator tool for physical exercise device |
US20150079562A1 (en) | 2013-09-17 | 2015-03-19 | Sony Corporation | Presenting audio based on biometrics parameters |
2013
- 2013-09-25 US US14/037,271 patent/US20150079562A1/en not_active Abandoned
- 2013-09-25 US US14/037,278 patent/US20150079563A1/en not_active Abandoned
- 2013-09-25 US US14/037,263 patent/US20150082408A1/en not_active Abandoned
- 2013-09-25 US US14/037,286 patent/US20150081210A1/en not_active Abandoned
- 2013-09-25 US US14/037,228 patent/US20150082167A1/en not_active Abandoned
- 2013-09-25 US US14/037,224 patent/US8795138B1/en active Active
- 2013-09-25 US US14/037,267 patent/US20150081067A1/en not_active Abandoned
- 2013-09-25 US US14/037,276 patent/US9142141B2/en active Active
- 2013-09-25 US US14/037,252 patent/US20150081066A1/en not_active Abandoned
2014
- 2014-04-17 US US14/255,663 patent/US9224311B2/en active Active
- 2014-09-03 KR KR20140116916A patent/KR20150032170A/en not_active Application Discontinuation
- 2014-09-03 KR KR20140116913A patent/KR20150032169A/en not_active Application Discontinuation
- 2014-09-11 CN CN201810232144.8A patent/CN108428473B/en not_active Expired - Fee Related
- 2014-09-11 CN CN201410460678.8A patent/CN104460980B/en not_active Expired - Fee Related
- 2014-09-12 KR KR1020140120862A patent/KR101640667B1/en active IP Right Grant
- 2014-09-12 KR KR20140120858A patent/KR20150032183A/en not_active Application Discontinuation
- 2014-09-12 KR KR20140120857A patent/KR20150032182A/en active Application Filing
- 2014-09-15 WO PCT/US2014/055579 patent/WO2015041970A1/en active Application Filing
- 2014-09-15 CA CA2917927A patent/CA2917927A1/en not_active Abandoned
- 2014-09-15 KR KR1020167001728A patent/KR101788485B1/en active IP Right Grant
- 2014-09-15 CN CN201480039881.6A patent/CN105393637A/en active Pending
- 2014-09-15 JP JP2016530107A patent/JP2016533237A/en active Pending
- 2014-09-15 CN CN201410468325.2A patent/CN104460982A/en active Pending
- 2014-09-15 WO PCT/US2014/055585 patent/WO2015041971A1/en active Application Filing
- 2014-09-15 EP EP14845088.5A patent/EP3020253A4/en not_active Withdrawn
- 2014-09-15 CN CN201410466936.3A patent/CN104460981A/en active Pending
- 2014-09-16 CN CN201410471221.7A patent/CN104469585A/en active Pending
- 2014-09-16 CN CN201410470194.1A patent/CN104436615B/en not_active Expired - Fee Related
- 2014-09-17 JP JP2014188914A patent/JP2015058364A/en active Pending
- 2014-09-17 JP JP2014188915A patent/JP2015061318A/en active Pending
- 2014-09-17 JP JP2014188913A patent/JP2015058363A/en active Pending
- 2014-09-17 JP JP2014188912A patent/JP2015058362A/en active Pending
- 2014-09-17 JP JP2014188911A patent/JP5896344B2/en active Active
2016
- 2016-08-26 KR KR1020160109114A patent/KR20160105373A/en active Search and Examination
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060111621A1 (en) * | 2004-11-03 | 2006-05-25 | Andreas Coppi | Musical personal trainer |
US20070219059A1 (en) * | 2006-03-17 | 2007-09-20 | Schwartz Mark H | Method and system for continuous monitoring and training of exercise |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10220259B2 (en) | 2012-01-05 | 2019-03-05 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US10279212B2 (en) | 2013-03-14 | 2019-05-07 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US10188890B2 (en) | 2013-12-26 | 2019-01-29 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US10426989B2 (en) | 2014-06-09 | 2019-10-01 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
US10226396B2 (en) | 2014-06-20 | 2019-03-12 | Icon Health & Fitness, Inc. | Post workout massage device |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US11256471B2 (en) | 2015-05-19 | 2022-02-22 | Spotify Ab | Media content selection based on physiological attributes |
US11211098B2 (en) * | 2015-05-19 | 2021-12-28 | Spotify Ab | Repetitive-motion activity enhancement based upon media content selection |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
TWI601155B (en) * | 2016-06-08 | 2017-10-01 | 群聯電子股份有限公司 | Memory interface, memory control circuit unit, memory storage device and clock generation method |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
CN106598537A (en) * | 2016-11-16 | 2017-04-26 | 上海斐讯数据通信技术有限公司 | Mobile terminal music play control method and system and mobile terminal |
US11185254B2 (en) | 2017-08-21 | 2021-11-30 | Muvik Labs, Llc | Entrainment sonification techniques |
US11205408B2 (en) | 2017-08-21 | 2021-12-21 | Muvik Labs, Llc | Method and system for musical communication |
US11690530B2 (en) | 2017-08-21 | 2023-07-04 | Muvik Labs, Llc | Entrainment sonification techniques |
US10587941B2 (en) | 2017-08-29 | 2020-03-10 | Kabushiki Kaisha Toshiba | Microphone cooperation device |
US11145313B2 (en) * | 2018-07-06 | 2021-10-12 | Michael Bond | System and method for assisting communication through predictive speech |
US20220028388A1 (en) * | 2018-07-06 | 2022-01-27 | Michael Bond | System and method for assisting communication through predictive speech |
US11551698B2 (en) * | 2018-07-06 | 2023-01-10 | Spoken Inc. | System and method for assisting communication through predictive speech |
US20240054159A1 (en) * | 2022-08-12 | 2024-02-15 | Lmdp Co. | Method and apparatus for a user-adaptive audiovisual experience |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150079562A1 (en) | | Presenting audio based on biometrics parameters |
KR101662234B1 (en) | | Athletic monitoring system having automatic pausing of media content |
US11256471B2 (en) | | Media content selection based on physiological attributes |
JP6212025B2 (en) | | Training program for exercise training and music playlist generation method |
JP6144710B2 (en) | | Fitness monitoring method, apparatus, computer readable medium, and system using mobile devices |
US10311462B2 (en) | | Music streaming for athletic activities |
KR20210007906A (en) | | Exercise machine controls |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: SONY CORPORATION, JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YEH, SABRINA TAI-CHEN;YOUNG, DAVID ANDREW;HIRONAKA, TAKASHI;AND OTHERS;SIGNING DATES FROM 20130925 TO 20130926;REEL/FRAME:031528/0767 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |