Don't bother with the cognitive and neuroscience papers; they take too much effort. There are already plenty of ANN examples that use attention to address the binding problem:
Look and Think Twice: Capturing Top-Down Visual Attention With Feedback Convolutional Neural Networks
http://www.ics.uci.edu/~yyang8/research/feedback/feedback-iccv2015.pdf
ABC-CNN: An Attention Based Convolutional Neural Network for Visual Question Answering
https://arxiv.org/pdf/1511.05960v2.pdf
For sentence generation in particular, I think attention is pretty much essential; otherwise the model simply cannot capture the spatial relations between the individual objects (see the sketch after the list).
Multiple Object Recognition with Visual Attention
https://arxiv.org/pdf/1412.7755v2.pdf
Attention to Scale: Scale-aware Semantic Image Segmentation
http://www.ics.uci.edu/~yyang8/research/attention-scale/attention-scale-cvpr2016.pdf
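To make the "attention" idea concrete, here is a minimal NumPy sketch of soft (additive) attention over CNN feature-map locations conditioned on a decoder state, roughly in the spirit of attention-based captioning/VQA. It is not the method of any paper linked above; the function name, parameter names (W_f, W_h, v), and the 7x7 feature-map size are illustrative assumptions.

```python
# Minimal soft-attention sketch (assumed example, not from the papers above):
# the decoder state decides which spatial locations of a CNN feature map to
# attend to at each generation step, which is what lets the model express
# spatial relations between objects.
import numpy as np

def soft_attention(features, hidden, W_f, W_h, v):
    """features: (L, D) -- L spatial locations, each a D-dim CNN feature.
    hidden:   (H,)   -- current decoder hidden state.
    W_f: (D, A), W_h: (H, A), v: (A,) -- learned projections (names are illustrative)."""
    # Additive (Bahdanau-style) score of each location against the state.
    scores = np.tanh(features @ W_f + hidden @ W_h) @ v   # (L,)
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                                   # attention weights, sum to 1
    context = alpha @ features                             # (D,) weighted feature summary
    return context, alpha

# Toy usage: 49 locations (7x7 map), 512-dim features, 256-dim decoder state.
rng = np.random.default_rng(0)
L, D, H, A = 49, 512, 256, 128
context, alpha = soft_attention(
    rng.standard_normal((L, D)), rng.standard_normal(H),
    rng.standard_normal((D, A)), rng.standard_normal((H, A)),
    rng.standard_normal(A))
print(alpha.shape, context.shape)  # (49,) (512,)
```

The context vector is recomputed at every word, so each generated token can look at a different part of the image.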