The successes of deep learning for text analytics, also covered in a recent post about sentiment analysis published here, are undeniable. Many other NLP tasks have likewise benefited from the superiority of deep learning methods over more traditional approaches. Such extraordinary results have been possible thanks to neural networks' ability to learn meaningful character and word embeddings: representation spaces in which semantically similar objects are mapped to nearby vectors. All of this is closely related to a field one might initially find disconnected or off-topic: biology.
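
To make the idea of "semantically similar objects mapped to nearby vectors" concrete, here is a minimal sketch using cosine similarity. The tiny 4-dimensional vectors below are made up for illustration; real models (word2vec, BERT, and the like) learn embeddings with hundreds of dimensions from large corpora.

```python
import numpy as np

# Hypothetical embeddings, invented for this example; a trained model
# would produce much higher-dimensional vectors.
embeddings = {
    "cat": np.array([0.9, 0.1, 0.8, 0.2]),
    "dog": np.array([0.8, 0.2, 0.9, 0.1]),
    "car": np.array([0.1, 0.9, 0.1, 0.8]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: close to 1.0 means
    # the vectors point in nearly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# In a good embedding space, related words sit closer together:
print(cosine_similarity(embeddings["cat"], embeddings["dog"]))  # high
print(cosine_similarity(embeddings["cat"], embeddings["car"]))  # lower
```

The same geometric intuition carries over to biological sequences: an embedding trained on proteins places functionally similar sequences near one another, which is exactly what makes the connection to biology more than a metaphor.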