There's been a lot of interest lately in the attention mechanism in neural nets. It has a colloquial name (who isn't familiar with the idea of "attention"?), but it's really a technical trick that has been pivotal to some recent advances in computer vision and especially natural language processing. It's an interesting example of trying out human-cognitive-ish ideas (like focusing consideration more on some inputs than others) in neural nets, and one of the more high-profile recent successes in playing around with neural net architectures for fun and profit.
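
To make "focusing consideration more on some inputs than others" a bit more concrete, here's a minimal sketch (hypothetical names, plain NumPy, not any particular paper's formulation): a query vector is scored against each input, the scores are normalized with a softmax into weights, and the output is the weighted average of the inputs.

```python
import numpy as np

def attention(query, inputs):
    """Toy attention: weight each input by how well it matches the query.

    query:  vector of shape (d,)
    inputs: matrix of shape (n, d), one row per input
    Returns the attention weights (n,) and the weighted sum (d,).
    """
    scores = inputs @ query                  # one relevance score per input
    weights = np.exp(scores - scores.max())  # softmax: turn scores...
    weights /= weights.sum()                 # ...into a probability distribution
    context = weights @ inputs               # weighted average of the inputs
    return weights, context

# Example: three 4-dimensional inputs and one query (random, for illustration).
rng = np.random.default_rng(0)
inputs = rng.normal(size=(3, 4))
query = rng.normal(size=4)
weights, context = attention(query, inputs)
print(weights)   # sums to 1; a larger weight means more "attention" on that input
print(context)
```

The weights sum to one, so the net isn't deciding *whether* to use the inputs, only *how much* of each to use, and because everything is differentiable, the weighting itself can be learned.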