Shasha Lv


As the popularity of photo-taking devices leads everyone to store more and more pictures, methods of describing images, particularly aesthetic quality evaluation, have attracted great attention over the past decade. Research on image aesthetic quality aims to enable computers to simulate human thinking and aesthetic judgment in order to assess the aesthetic value of a picture and output a score or a textual description. At the artistic level, image aesthetics conveys natural beauty through the integration of photographic skills such as color matching, depth-of-field processing, and composition design. The research draws on computer vision, psychology, text description, cognitive science, and other fields; this interdisciplinary character gives it significant research value. Accordingly, we systematically analyze and summarize the current state of research on image aesthetic quality and offer suggestions for its future development.


Aesthetic quality evaluation; Image aesthetic quality; Computer vision







This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2019 International Educational Applied Scientific Research Journal