Most of the data we produce, if you were to read it — you could lock me in jail for thirty days and ask me to solve one of these physics problems, and I couldn't. I can barely read the type of data we create, because it's really for experts with PhDs.
Why this clip
A vivid metaphor that makes the complexity of AI training data accessible. The "lock me in jail for thirty days" imagery is memorable, and the self-deprecating admission creates relatability while underscoring how specialized the data really is.
What they said next
If the AI crawlers go out, crawl the web, and get all the content, then when you go to Perplexity or ChatGPT, do you go back and look at the original source? No. You just read it right there. The original source isn't getting compensated.
30:05 - 35s · Consequences