Milner et al., 2013 - Google Patents
Music retrieval by singing and humming using information fusion
- Document ID
- 5712430740503779797
- Authors
- Milner J
- Hsu D
- Publication year
- 2013
- Publication venue
- 2013 IEEE 12th International Conference on Cognitive Informatics and Cognitive Computing
Snippet
We present that combinatorial fusion analysis (CFA) can improve results in a music information retrieval (MIR) task, specifically querying a database of recorded music by singing, humming, or whistling. Our experiment considers 10 scoring systems, 55 queries …
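The snippet describes fusing several independent scoring systems over the same candidate set. A minimal sketch of one common combinatorial-fusion scheme (average rank combination) is below; the function and variable names (`fuse_by_rank`, `SCORES`) and the toy data are illustrative assumptions, not the paper's actual ten systems or 55 queries.

```python
# Sketch of rank-based fusion for a query-by-humming task, assuming each
# "scoring system" maps candidate songs to a relevance score (higher = better).

def ranks(scores):
    """Map each candidate to its rank (1 = best) under one scoring system."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {song: i + 1 for i, song in enumerate(ordered)}

def fuse_by_rank(systems):
    """Average the rank each system assigns; lower fused rank is better."""
    candidates = set().union(*(s.keys() for s in systems))
    all_ranks = [ranks(s) for s in systems]
    fused = {c: sum(r[c] for r in all_ranks) / len(all_ranks) for c in candidates}
    # Secondary sort key makes tie order deterministic.
    return sorted(candidates, key=lambda c: (fused[c], c))

# Two hypothetical scoring systems that disagree on the best match:
SCORES = [
    {"song_a": 0.9, "song_b": 0.8, "song_c": 0.1},
    {"song_a": 0.5, "song_b": 0.7, "song_c": 0.6},
]
print(fuse_by_rank(SCORES))  # → ['song_b', 'song_a', 'song_c']
```

Here fusion promotes `song_b`, which neither system ranks worse than second, over `song_a`, which one system ranks last: the point of combining diverse scorers is exactly this kind of correction.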
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/3074—Audio data retrieval
- G06F17/30755—Query formulation specially adapted for audio data retrieval
- G06F17/30758—Query by example, e.g. query by humming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/3074—Audio data retrieval
- G06F17/30749—Audio data retrieval using information manually generated or using information not derived from the audio data, e.g. title and artist information, time and location information, usage information, user ratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/3074—Audio data retrieval
- G06F17/30769—Presentation of query results
- G06F17/30772—Presentation of query results making use of playlists
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/3074—Audio data retrieval
- G06F17/30743—Audio data retrieval using features automatically derived from the audio content, e.g. descriptors, fingerprints, signatures, MEP-cepstral coefficients, musical score, tempo
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/30017—Multimedia data retrieval; Retrieval of more than one type of audiovisual media
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/075—Musical metadata derived from musical analysis or for use in electrophonic musical instruments
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0033—Recording/reproducing or transmission of music for electrophonic musical instruments
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L17/00—Speaker identification or verification
Similar Documents
Publication | Title
---|---
Casey et al. | Content-based music information retrieval: Current directions and future challenges
Panda et al. | Multi-modal music emotion recognition: A new dataset, methodology and comparative analysis
EP2659485B1 (en) | Semantic audio track mixer
De Man et al. | A knowledge-engineered autonomous mixing system
Hargreaves et al. | Structural segmentation of multitrack audio
GB0320578D0 (en) | Processing an audio signal
Knees et al. | Introduction to music similarity and retrieval
Humphrey et al. | A brief review of creative MIR
Grekow | Audio features dedicated to the detection and tracking of arousal and valence in musical compositions
Singhi et al. | On Cultural, Textual and Experiential Aspects of Music Mood.
Lefford et al. | Context aware intelligent mixing systems
Bogdanov et al. | From low-level to high-level: Comparative study of music similarity measures
Plewa et al. | A study on correlation between tempo and mood of music
Tzanetakis et al. | Stereo panning information for music information retrieval tasks
Van De Laar | Emotion detection in music, a survey
Dressler | Automatic transcription of the melody from polyphonic music
Milner et al. | Music retrieval by singing and humming using information fusion
Wilmering et al. | Grateful live: Mixing multiple recordings of a Dead performance into an immersive experience
Zhu et al. | Perceptual visualization of a music collection
Tsai et al. | Automatic Singing Performance Evaluation Using Accompanied Vocals as Reference Bases.
Musil et al. | Perceptual dimensions of short audio clips and corresponding timbre features
Moffat et al. | Semantic music production: A meta-study
Müller et al. | Data-driven sound track generation
Werthen-Brabants | Ground truth extraction & transition analysis of DJ mixes
Sofianos et al. | H-Semantics: A hybrid approach to singing voice separation