Successor to GPT-1, with 1.5 billion parameters trained on 40 GB of text.