A 20B-parameter seq2seq language model by Amazon's Alexa AI team, demonstrating strong few-shot learning performance.
Technical Specifications
Parameters
20B
Context Length
1K tokens
Architecture
Multilingual sequence-to-sequence (seq2seq) transformer with 20 billion parameters
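To make the few-shot usage concrete, here is a minimal sketch of few-shot prompting against a seq2seq transformer using the Hugging Face Transformers library. The model ID below is a stand-in assumption (a small public seq2seq model), not AlexaTM 20B itself, which was distributed through separate channels; the prompting pattern is what the sketch illustrates.

```python
# Minimal sketch: few-shot prompting with a seq2seq (encoder-decoder) model.
# NOTE: "google/flan-t5-small" is a stand-in seq2seq model used so the
# example runs anywhere; it is NOT AlexaTM 20B.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/flan-t5-small"  # hypothetical stand-in, not AlexaTM 20B
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Few-shot prompt: in-context examples followed by the actual query.
prompt = (
    "Translate English to French.\n"
    "English: Hello. French: Bonjour.\n"
    "English: Thank you. French: Merci.\n"
    "English: Good night. French:"
)

# The encoder consumes the full prompt; the decoder generates the answer.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Unlike decoder-only models, a seq2seq model places the entire few-shot prompt in the encoder and produces the completion from the decoder, which is the setup this model family uses for in-context learning.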