To deal with the complexities of human language, new text-based generative AI technologies use a relatively new type of neural network called a transformer. Transformers have a distinctive feature known as attention, which allows them to prioritize certain words over others and more carefully analyze the relationships between important words, punctuation, and sentence structure. This attention feature means that newer text-based generative AI technologies can better understand the context of the input and the overall conversation, and respond appropriately.
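To make the idea of attention concrete, the sketch below implements scaled dot-product attention, the core operation inside a transformer, using NumPy. This is a simplified illustration, not production transformer code: the function name and the toy word vectors are illustrative, and real models learn separate query, key, and value projections rather than reusing the same matrix for all three.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Score each query (word) against every key (every other word);
    # dividing by sqrt(d_k) keeps the scores numerically stable.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax converts scores into attention weights that sum to 1,
    # so each word distributes its "focus" across all the words.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted average of the value vectors:
    # words with higher weights contribute more to the result.
    return weights @ V, weights

# Toy example: a 3-word input, each word represented by a 4-dim vector.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(x, x, x)
print(weights.shape)  # (3, 3): one weight for every word pair
```

The `weights` matrix is what "prioritizing certain words over others" means in practice: row *i* shows how strongly word *i* attends to each word in the input.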
'Artificial intelligence utilizing an attention mechanism to transform information', Microsoft Designer