Feature-Based Synchronization of Video and Background Music

  • Conference paper
Advances in Machine Vision, Image Processing, and Pattern Analysis (IWICPAS 2006)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4153)

Abstract

We synchronize background music with a video by changing the timing of the music, an approach that minimizes the damage to the music data. Starting from a MIDI file and video data, feature points are extracted from both sources, paired, and then synchronized using dynamic programming to time-scale the music. We also introduce the music graph, a directed graph that encapsulates connections between many short music sequences. By traversing a music graph, we can generate large amounts of new background music, in which we expect to find a sequence that matches the video features better than the original music.
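
To make the approach concrete, the sketch below first pairs video feature times with music feature times using a simple dynamic-programming alignment, then generates new candidate music by a random walk over a toy music graph. It is a minimal illustration assuming absolute timing difference as the pairing cost; the function names, example feature times, and graph are hypothetical and not taken from the paper.

    # Illustrative sketch only, not the authors' implementation.
    import random

    def align_features(video_times, music_times):
        """Monotone pairing of two sorted lists of feature-point times
        (in seconds) that minimizes total timing discrepancy, DTW-style."""
        n, m = len(video_times), len(music_times)
        INF = float("inf")
        # cost[i][j]: best cost of aligning the first i video and j music features
        cost = [[INF] * (m + 1) for _ in range(n + 1)]
        cost[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(video_times[i - 1] - music_times[j - 1])
                cost[i][j] = d + min(cost[i - 1][j - 1],  # pair i with j
                                     cost[i - 1][j],      # skip a video feature
                                     cost[i][j - 1])      # skip a music feature
        # Backtrack to recover the matched pairs.
        pairs, i, j = [], n, m
        while i > 0 and j > 0:
            d = abs(video_times[i - 1] - music_times[j - 1])
            if cost[i][j] == d + cost[i - 1][j - 1]:
                pairs.append((video_times[i - 1], music_times[j - 1]))
                i, j = i - 1, j - 1
            elif cost[i][j] == d + cost[i - 1][j]:
                i -= 1
            else:
                j -= 1
        return list(reversed(pairs))

    def generate_music(graph, start, length, rng=None):
        """Random walk over a 'music graph': nodes are short music segments,
        edges connect segments whose boundaries join smoothly."""
        rng = rng or random.Random(0)
        walk = [start]
        while len(walk) < length:
            successors = graph.get(walk[-1])
            if not successors:
                break  # dead end: no smooth continuation
            walk.append(rng.choice(successors))
        return walk

    if __name__ == "__main__":
        video = [0.0, 1.2, 2.5, 4.1]       # hypothetical shot changes / motion peaks
        music = [0.0, 0.9, 2.1, 3.0, 4.0]  # hypothetical beat or accent times
        print(align_features(video, music))

        # Toy music graph: segment ids with smooth continuations.
        graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
        print(generate_music(graph, "A", 6))

Each matched (video time, music time) pair defines a local time-scaling factor for the MIDI segment between consecutive pairs; the music-graph walk supplies alternative sequences that can be aligned in the same way when the original music fits the video poorly.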

References

  1. Burt, G.: The Art of Film Music. Northeastern University Press (1996)

    Google Scholar 

  2. Lee, H.C., Lee, I.K.: Automatic synchronization of background music and motion in computer animation. In: Proceedings of the EUROGRAPHICS 2005, pp. 353–362 (2005)

    Google Scholar 

  3. Kovar, L., Gleicher, M., Pighin, F.: Motion graphs. In: Proceedings of ACM SIGGRAPH, pp. 473–482 (2002)

    Google Scholar 

  4. Arikan, O., Forsyth, D.: Interactive motion generation from examples. In: Proceedings of ACM SIGGRAPH, pp. 483–490 (2002)

    Google Scholar 

  5. Lee, J., Chai, J., Reitsma, P., Hodgins, J., Pollard, N.: Interactive control of avatars animated with human motion data. In: Proceedings of ACM SIGGRAPH, pp. 491–500 (2002)

    Google Scholar 

  6. Foote, J., Cooper, M., Girgensohn, A.: Creating music videos using automatic media analysis. In: Proceedings of ACM Multimedia 2002, pp. 553–560 (2002)

    Google Scholar 

  7. Hua, X.S., Lu, L., Zhang, H.J.: Ave - automated home video editing. In: Proceedings of ACM Multimedia 2003, pp. 490–497 (2003)

    Google Scholar 

  8. Mulhem, P., Kankanhalli, M.S., Hassan, H., Yi, J.: Pivot vector space approach for audio-video mixing. In: Proceedings of IEEE Multimedia 2003, pp. 28–40 (2003)

    Google Scholar 

  9. Jehan, T., Lew, M., Vaucelle, C.: Cati dance: self-edited, self-synchronized music video. In: Proceedings of SIGGRAPH Conference Abstracts and Applications, pp. 27–31 (2003)

    Google Scholar 

  10. Yoo, M.-J., Lee, I.-K., Choi, J.-J.: Background music generation using music texture synthesis. In: Rauterberg, M. (ed.) ICEC 2004. LNCS, vol. 3166, pp. 565–570. Springer, Heidelberg (2004)

    Chapter  Google Scholar 

  11. Ma, Y.F., Lu, L., Zhang, H.J., Li, M.J.: A user attention model for video summarization. In: Proceedings of ACM Multimedia 2002, pp. 533–542 (2002)

    Google Scholar 

  12. Lan, D.J., Ma, Y.F., Zhang, H.J.: A novel motion-based representation for video mining. In: Proceedings of IEEE International Conference on Multimedia and Expo., pp. 469–472 (2003)

    Google Scholar 

  13. Bradski, G.R.: Computer vision face tracking as a component of a perceptual user interface. In: Proceedings of Workshop on Applications of Computer Vision, pp. 214–219 (1998)

    Google Scholar 

  14. Rowe, R.: Machine Musicianship. MIT Press, Cambridge (2004)

    Google Scholar 

  15. Hoscheck, J., Lasser, D.: Fundametals of Computer Aided Geometric Design. AK Peters (1993)

    Google Scholar 

  16. Trivedi, K.: Probability & Statistics with Reliability, Queuing, and Computer Science Applications. Prentice-Hall, Englewood Cliffs (1982)

    MATH  Google Scholar 

  17. Cambouropoulos, E.: Markov chains as an aid to computer assisted composition. Musical Praxis 1, 41–52 (1994)

    Google Scholar 

  18. Trivino-Rodriguez, J.L., Morales-Bueno, R.: Using multiattribute prediction surffix graphs to predict and generate music. Computer Music Journal 25, 62–79 (2001)

    Article  Google Scholar 

  19. Bresin, R., Friberg, A.: Emotional coloring of computer-controlled music performances. Computer Music Journal 24, 44–63 (2000)

    Article  Google Scholar 

Download references

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yoon, JC., Lee, IK., Lee, HC. (2006). Feature-Based Synchronization of Video and Background Music. In: Zheng, N., Jiang, X., Lan, X. (eds) Advances in Machine Vision, Image Processing, and Pattern Analysis. IWICPAS 2006. Lecture Notes in Computer Science, vol 4153. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11821045_22

  • DOI: https://doi.org/10.1007/11821045_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-37597-5

  • Online ISBN: 978-3-540-37598-2
