Abstract
Multimedia technology has developed rapidly since the invention of the computer, and more and more people now use media players to watch movies and TV shows on their computers. However, viewers are often too busy to spend nearly two hours watching a movie from beginning to end; they would rather find and watch the most interesting scenes as quickly as possible. The media players in common use typically provide only a slider bar for locating content, which is rather inefficient. In view of this, an affective annotation method for video clips is proposed in this paper, and the method is implemented in an affectively annotated media player. In addition, two experiments were set up to evaluate the proposed method. The experimental results indicate that affective annotation helps subjects locate target events quickly and understand the scenarios better.
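The abstract describes color-bar based affective annotation only at a high level. The sketch below illustrates one plausible way such an annotated seek bar could be realized: time segments of a clip carry discrete emotion labels, each label maps to a color, and the bar doubles as a navigation aid by mapping a click position back to a playback time. The segment format, emotion labels, and emotion-to-color palette here are illustrative assumptions, not the scheme actually used in the paper.

```python
# Minimal sketch of a color-bar affective annotation for a media player seek bar.
# Assumptions (not from the paper): each annotated segment carries a discrete
# emotion label, and each label is mapped to an RGB color; the bar is rendered
# as a list of per-pixel colors that a player UI could draw above the slider.

from dataclasses import dataclass

# Illustrative emotion-to-color mapping; the paper's actual palette may differ.
EMOTION_COLORS = {
    "excitement": (255, 0, 0),     # red
    "tension":    (255, 165, 0),   # orange
    "sadness":    (0, 0, 255),     # blue
    "calm":       (0, 200, 0),     # green
    "neutral":    (200, 200, 200), # gray (unannotated regions)
}

@dataclass
class Segment:
    start: float   # seconds
    end: float     # seconds
    emotion: str   # one of the EMOTION_COLORS keys

def render_color_bar(segments, duration, width=600):
    """Return a list of `width` RGB tuples, one per horizontal pixel of the bar."""
    bar = [EMOTION_COLORS["neutral"]] * width
    for seg in segments:
        left = int(seg.start / duration * width)
        right = int(seg.end / duration * width)
        color = EMOTION_COLORS.get(seg.emotion, EMOTION_COLORS["neutral"])
        for x in range(max(0, left), min(width, right)):
            bar[x] = color
    return bar

def pixel_to_time(x, duration, width=600):
    """Map a click on the bar back to a playback position in seconds."""
    return x / width * duration

# Example: a 90-minute movie with two annotated highlights.
annotations = [
    Segment(600, 780, "excitement"),
    Segment(3000, 3300, "tension"),
]
bar = render_color_bar(annotations, duration=5400)
print(pixel_to_time(200, duration=5400))  # jump target for a click at pixel 200
```

A click handler in the player would call `pixel_to_time` and seek to the returned position, so a viewer can jump directly to a colored (annotated) region instead of scrubbing blindly with the plain slider.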
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Xu, C., Chen, L., Chen, G. (2006). A Color Bar Based Affective Annotation Method for Media Player. In: Zhou, X., Li, J., Shen, H.T., Kitsuregawa, M., Zhang, Y. (eds) Frontiers of WWW Research and Development - APWeb 2006. APWeb 2006. Lecture Notes in Computer Science, vol 3841. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11610113_70
DOI: https://doi.org/10.1007/11610113_70
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-31142-3
Online ISBN: 978-3-540-32437-9