Abstract
Visual stimuli are the stimuli that most readily affect human sentiment. Many studies have attempted to relate visual elements in images to sentimental elements using statistical approaches. In many cases, the range of sentiment evoked in humans varies with the image category, such as landscape, portrait, sports, or still life. Therefore, to improve the performance of sentiment prediction, an individual prediction model should be established for each image category. However, collecting sufficient ground-truth sentiment data is one of the obstacles faced by studies in this field. In this paper, we propose an approach that acquires a training data set for both category classification and sentiment prediction from images. Using this approach, we collect a training data set and build a sentiment predictor for images. Given an image, we first estimate its category and then predict its sentiment as coordinates in the arousal–valence space using the predictor for the estimated category. We show that the performance of our approach approximates that obtained using ground-truth data. Based on our experiments, we argue that our approach, which utilizes big data on the web as the training set for predicting content sentiment, is useful for practical purposes.
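The following is a minimal sketch (not the authors' implementation) of the two-stage idea described in the abstract: a classifier first estimates the image category, and a per-category regressor then maps the image features to arousal–valence coordinates. The feature vectors, category names, and model choices (an SVM classifier and per-category linear regressors) are assumptions made only for illustration.

```python
# Two-stage sentiment prediction sketch: category classification followed by
# a per-category regression onto the arousal-valence space.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
categories = ["landscape", "portrait", "sports", "still_life"]  # assumed labels

# Synthetic stand-in for extracted image features: 200 images x 32-D features,
# a category label, and (valence, arousal) coordinates in [-1, 1].
X = rng.normal(size=(200, 32))
y_cat = rng.integers(len(categories), size=200)
y_va = rng.uniform(-1.0, 1.0, size=(200, 2))

# Stage 1: a single category classifier trained on all images.
clf = SVC().fit(X, y_cat)

# Stage 2: one arousal-valence regressor per category, trained only on that
# category's images (LinearRegression handles the 2-D target directly).
regressors = {c: LinearRegression().fit(X[y_cat == c], y_va[y_cat == c])
              for c in range(len(categories))}

def predict_sentiment(features):
    """Predict (valence, arousal) for one feature vector via its category."""
    c = clf.predict(features.reshape(1, -1))[0]   # estimated category
    valence, arousal = regressors[c].predict(features.reshape(1, -1))[0]
    return categories[c], valence, arousal

cat, v, a = predict_sentiment(rng.normal(size=32))
print(f"category={cat}, valence={v:.2f}, arousal={a:.2f}")
```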



Acknowledgments
This work was supported by the Ministry of Culture, Sports and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) under the Culture Technology (CT) Research and Development Program 2015.
Cite this article
Seo, S., Kang, D. Study on predicting sentiment from images using categorical and sentimental keyword-based image retrieval. J Supercomput 72, 3478–3488 (2016). https://doi.org/10.1007/s11227-015-1510-0