19/04/2023.
The paper ‘Attention Is All You Need’ introduces the Transformer, a sequence-to-sequence architecture built entirely on attention rather than recurrence. It is a neural net that transforms a sequence of elements (for example, the words of a sentence) into another sequence.
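As a rough sketch of the paper's core operation, scaled dot-product attention — softmax(QKᵀ/√d_k)V — can be written in a few lines of NumPy. This is a minimal single-head version, not the full multi-head mechanism, and the toy input `X` below is made up for illustration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V, as in "Attention Is All You Need"
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy "sentence" of 3 tokens, each a 4-dimensional vector (random numbers)
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
# Self-attention: queries, keys, and values all come from the same sequence
out, w = scaled_dot_product_attention(X, X, X)
```

Each output row is a weighted mix of all input vectors, with the weights in each row summing to 1 — this is how every element of the output sequence gets to "look at" the whole input sequence at once.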
Keep reading at ‘Attention Is All You Need’.