Tired of generic AI wrappers? Meet Talkie 13B, an LLM fine-tuned exclusively on pre-1930s data. Here is why Hacker News is obsessed with this useless masterpiece.

If you're anything like me, your feed is probably choked to death with "revolutionary" AI wrappers promising to 10x your productivity. It’s exhausting. But just as I was about to rage-quit tech news for the day, I stumbled across a post blowing up on Hacker News with over 700 upvotes. It wasn't a ChatGPT killer. It was an AI... stuck in the Great Depression.
Yeah, you heard that right.
Some absolute madlad decided to build a 13-billion-parameter model (likely taking LLaMA and heavily lobotomizing its modern memory). Instead of feeding it the entirety of Reddit and modern Wikipedia, they fine-tuned it exclusively on datasets from 1930 and earlier.
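For the curious: the post doesn't share the training code, but the recipe is probably something like "filter out everything published after 1930, then run a bog-standard causal-LM fine-tune." Here's a back-of-the-napkin sketch using the Hugging Face stack. To be clear, this is my guess, not the author's actual pipeline: the base checkpoint, the corpus name, and the `year` field are all stand-ins I made up.

```python
# Back-of-the-napkin sketch of a "vintage" fine-tune, assuming a
# Hugging Face stack. Corpus name, "year" field, and base checkpoint
# are hypothetical -- the HN post doesn't confirm any of them.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "meta-llama/Llama-2-13b-hf"  # assumed 13B base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token  # LLaMA ships without a pad token

# Hypothetical corpus with a publication year per document.
corpus = load_dataset("your_vintage_corpus", split="train")
# The whole trick: drop every document minted after 1930.
corpus = corpus.filter(lambda doc: doc["year"] <= 1930)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = corpus.map(tokenize, batched=True, remove_columns=corpus.column_names)

model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="talkie-13b",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=16,
        num_train_epochs=1,
        learning_rate=2e-5,
        bf16=True,
    ),
    train_dataset=tokenized,
    # mlm=False gives plain next-token (causal) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The entire gag lives in that one `filter` line: the model never sees a single token from after 1930, so flappers and breadlines are current events as far as it's concerned.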
The result? Talkie. An LLM that speaks like a monocle-wearing aristocrat who doesn't know what the hell the internet is.
Try asking this thing about cryptocurrency. It will look at you like you're speaking Martian. It has zero concept of WWII, the moon landing, or JavaScript frameworks (lucky bastard). It exists in a perpetual state of 1930s bliss and ignorance.
You drop a quirky, completely non-monetizable project on HN, and the community will eat it up. The comments were a goldmine of pure dev chaos.
Looking at Talkie 13B, I realized we've lost the plot. The tech industry has become so obsessed with grinding, hustling, and optimizing everything that we've forgotten how to just play with technology.
The survival lesson here? Sometimes, building a completely useless, batshit crazy project is exactly what you need to cure your burnout. It might not pay the bills, but it will definitely remind you why you started coding in the first place.
Go build something stupid today, my friends.
Sauce: Talkie: a 13B vintage language model from 1930 (Hacker News)