Should you use AI to clone yourself or just write a book instead?
Had a strange conversation the other day.
(You’d be surprised how often it happens in LinkedIn DMs).
It went:
“I’m not gonna write a book, as much as I think I could write something entertaining, but lacking focus.”
“It kinda seems to me that with the way the world is working, I should just train ‘my AI’ off of my thoughts instead. Then people can just ask me questions instead of needing to read a whole book.”
I was curious how he’d actually make this happen.
So I said:
“Do you have experience training AI to write like you?”
The response:
“I’m not saying I’d train GPT to write like me. I’m saying I’d train my own LLM to think like me. Tell stories. Be my thoughts.”
Quick aside:
An “LLM” is a “Large Language Model.” Basically, it’s the how behind ChadGPT & Friends answering your questions. They have a set of “knowledge,” and then use the LLM to give you the answer you’re looking for.
At least, that’s what they do when they aren’t hallucinating.
(To the nerds: I’m sure that’s not exactly what an LLM is.
But close enough)
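(Ok, one more for the nerds. If you want to see “predict the next word” in its tiniest possible form, here’s a toy sketch in Python. It’s just a word-pair counter, nowhere near a real LLM, and every bit of it is made up for illustration:

```python
from collections import Counter, defaultdict

# Toy "language model": count which word tends to follow which.
# A real LLM is vastly more sophisticated, but the core job is
# the same: given the words so far, predict the next one.
corpus = "write a book write a newsletter write a book".split()

following = defaultdict(Counter)
for word, next_word in zip(corpus, corpus[1:]):
    following[word][next_word] += 1

def predict_next(word):
    # Return the word most often seen after `word`.
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("write"))  # -> a
print(predict_next("a"))      # -> book ("book" follows "a" twice, "newsletter" once)
```

That’s the party trick. Real LLMs do it with billions of knobs instead of a counter.)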
Anyway, back to the conversation. I was curious if dude could ACTUALLY make this happen. Like…
Say he chooses not to write a book. Could he use that same time & energy building an AI lookalike? A version of himself that could produce infinite books that sounded JUST like him? And knew EXACTLY what he knows?
So I pushed a little further:
“Is there currently LLM tech that works like this, by the way? Can you currently train one to think like you?”
Ok.
Before you see his response.
This next part is THICC with jargon.
You’ve been warned.
“Absolutely. That’s how LLMs work.
You take a set of data, which in this case would be writings or transcripts of videos, and then you push that into a vector database, and then train an LLM on that dataset.
Then in the future when you ask it questions, it would be able to predict what response you want to give based on your personal ethos.
It’s too expensive right now to do on a lark, but in the future, as LLMs become more powerful and efficient, it would be possible to have AdamAI.”
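Quick translation, minus the jargon: you turn Adam’s writing into numbers a computer can search, then fetch the most relevant bits whenever someone asks “Adam” a question. Here’s a deliberately tiny sketch of that in Python. Real setups use an actual embedding model and a proper vector database; everything here, from the hash-based “embeddings” to Adam’s three sample sentences, is a stand-in for illustration:

```python
import hashlib
import math
import string

DIM = 512  # size of the fake embedding vectors below

def embed(text):
    # Fake "embedding": hash each word into a bucket and count it.
    # A real pipeline would call an actual embedding model here.
    vec = [0.0] * DIM
    for word in text.lower().split():
        word = word.strip(string.punctuation)
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % DIM
        vec[bucket] += 1.0
    return vec

def cosine(a, b):
    # How similar two vectors are (1.0 = pointing the same direction).
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# The "vector database": Adam's (hypothetical) writings as (text, vector) pairs.
adams_writings = [
    "Books are how you scale your thinking beyond one conversation.",
    "Most people never document their own ideas in any organized way.",
    "An AI clone is only as good as the writing you feed it.",
]
database = [(text, embed(text)) for text in adams_writings]

def ask_adam_bot(question, top_k=1):
    # Fetch the snippets of Adam's writing closest to the question.
    # A real system would then hand these to an LLM to phrase an answer.
    q_vec = embed(question)
    ranked = sorted(database, key=lambda item: cosine(q_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

print(ask_adam_bot("How should I document my ideas?"))
```

A real build would pass whatever got retrieved to an LLM to answer in Adam’s voice. But that’s the shape of it. Notice the load-bearing part: Adam’s writings.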
And now I’m getting suspicious.
Because when tech people use phrases like “in the future,” they often mislead.
Is “in the future” tomorrow?
Or 50 years from now?
Just for fun, I prodded him again:
“Any guesses when it would be possible for the average person to do this?”
And this was his answer:
“Average person? Eh.. 3-5 years, but I doubt that many people are going to want to document their own thoughts enough to actually train a model to speak like them.”
.
.
.
So. To clear things up.
Adam would rather create an Adam-bot than write a book.
But in order to create an Adam-bot…
He needs a big old pile of information.
Something that would show how he thinks.
Like maybe…
A large document, somewhere between 30,000 and 50,000 words, presenting Adam’s big beliefs about the world.
Something that argues his ethos in a clear way.
Wait.
What am I thinking of that does that?
Oh yeah.
IT’S A BOOK.
Honestly, it sounds like writing a book is a PREREQUISITE for building your own AI clone.
You don’t have to choose.
You NEED to document your thoughts to have a chance at mimicking yourself via AI.
And wait.
What’s this?
Oh! It’s an easy transition!
Let me read this off…
Want to get a kick-start on building your own You-bot?
Or, in the event that You-bots don’t become possible until some vague time “in the future,” write something that attracts readers, customers, and/or clients in the meantime?
WRITE A BOOK.
End of rant.
For more wild stories, AI predictions, and writing advice, sign up for my email list here.