Talkie: a 13B LLM trained only on pre-1931 text; Claude Sonnet was used to help test the model and judge its output
Researchers Alec Radford (GPT, CLIP, Whisper), Nick Levine, and David Duvenaud just released Talkie: a 13-billion-parameter language model trained exclusively on text published before 1931. No internet. No Wikipedia. No World War II. Its worldview is frozen at December 31, 1930. Why does this matter? Every major LLM today (GPT, Claude, Gemini, Llama) ultimately shares a common ancestor: the modern web. That makes it nearly impossible to tell what these models genuinely reason about versus what they s…