Reading Time: 9 minutes

Hey Prompt Lover,

I need to talk about the thing everyone is arguing about right now.

Because my inbox has been full of it. Comments on the last few issues. Messages from people who read the Claude series and came back with the same question underneath everything.

"This is all great but... am I teaching AI to replace me?"

It's a fair question.

And I'm not going to give you the comfortable answer. I'm going to give you the honest one.

Here's what's actually happening right now.

LLM traffic converts 3× better than Google search

58% of buyers now start their research in ChatGPT or Gemini, not Google. Most startups aren't showing up there yet.

The ones that are showing up get cited by the AI tools their buyers, investors, and future hires already use. And that traffic converts at 3×.

Download the free AEO Playbook for Startups from HubSpot and get the exact steps to start showing up. Five minutes to read.

A new survey from Epoch AI and Ipsos found that half of American adults used AI in the past week. And 20 percent of full-time workers said AI has already replaced parts of their job.

One in five.

That's not a prediction. That's not a think piece.

That's people, right now, saying parts of what they used to do every day are being done by a machine.

And before you say "well not my job" — Microsoft's AI chief Mustafa Suleyman said in a recent interview that most tasks in white-collar jobs will be fully automated by AI within the next 12 to 18 months.

"White-collar work, where you're sitting down at a computer, either being a lawyer or an accountant or a project manager or a marketing person — most of those tasks will be fully automated."

That's not some random person on Twitter.

That's the head of AI at the company that makes the tools millions of people use for work every day.

So yeah. This is real.

But here's where it gets complicated.

Because the same survey that found AI replaced tasks for 20 percent of workers also found something else nobody is talking about.

15 percent of full-time workers said they had started doing new tasks at work that they wouldn't have done without AI services.

So AI is taking tasks. And creating tasks. At the same time. From the same group of people.

And Goldman Sachs found that AI substitution reduced monthly payroll growth by about 25,000 jobs — while AI augmentation, meaning humans using AI tools, added 9,000 jobs back each month.

Net negative? Yes. But nowhere near the apocalypse the loudest voices are selling.

Here's the thing I keep coming back to.

Companies are laying off workers because of AI's potential — not its performance.

Read that again.

Companies are making decisions based on what they think AI will be able to do, not what it actually does well right now. They're betting on a future that hasn't fully arrived yet.

And some of them are going to look very silly in eighteen months when they realize they gutted their teams over a tool that still gets things wrong constantly and needs a human checking everything it produces.

Current AI is "jagged" — good at some things but not others. And tasks aren't jobs. Even if AI can do some part of a person's job, it doesn't mean it can do all of that person's job.

That's Fortune magazine. Not a tech optimist. Not someone trying to sell you an AI course.

So who's actually at risk right now?

Goldman Sachs found the negative effects fall largely on less experienced workers.

Entry level. Junior roles.

The people who used to spend their days doing the repetitive foundational tasks — drafting first versions, doing basic research, formatting documents, writing first-pass emails.

Those tasks are going. That's true.

And if your entire job description is a list of those tasks, that's a real problem that's worth taking seriously right now, not later.

But here's what's also true.

62 percent of executives said AI can't create the new products and services their customers will want in the future. 53 percent said their customers prefer to work with humans. And 49 percent said they're worried about security and privacy when using AI.

The people running companies are not as all-in as the headlines suggest. They want AI. They also want humans. And right now, the humans who are winning are the ones who figured out how to use AI better than the people sitting next to them.

That's the actual gap opening up in 2026.

Not humans versus AI.

Humans who use AI well versus humans who don't.

There's a phrase going around right now: "AI won't take your job, but somebody who knows AI will take your job." Today's employers want to hire people who have developed the skills and ability to work with AI — people who are comfortable with the tools and understand how to use them for certain tasks, but not others.

That's it. That's the whole game right now.

The person who can produce in two hours what used to take a full day is not getting replaced. They're getting more work, more responsibility, and in a lot of cases, more money. Because they're the one the manager calls when something needs to get done fast and done right.

The person who refuses to touch any of this, or who is waiting to see how it plays out, is the one who wakes up in eighteen months and wonders how they fell behind.

I've been writing about Claude for months because I use it every single day and I've watched what it actually does to how work gets done.

It doesn't replace thinking. It doesn't replace judgment. It doesn't replace the relationship you have with a client or a colleague or a reader.

What it does is get rid of the two hours of mechanical work that used to sit between the thinking and the finished thing. The formatting. The first draft that you were just going to rewrite anyway. The research that took all morning. The email you kept putting off because you didn't know how to start it.

That's what's going away. And honestly? Most people I know didn't want to spend those two hours doing that stuff anyway.

The question is whether you're going to be the person who reclaims those two hours and does something valuable with them — or the person who watches someone else do it.

My honest take?

The fear is real but the timeline people are selling is wrong. This is not happening overnight.

The jobs disappearing right now are specific, narrow, task-based roles that were already fragile.

The jobs that last are the ones where a human being brings something AI genuinely can't — relationships, judgment, creativity that comes from actually living a life, accountability when something goes wrong.

But "my job is safe because AI can't do all of it" is not a strategy. It's just a delay.

The strategy is learning to use these tools well enough that you become the person in the room who knows how to make AI do the heavy lifting while you do the parts that actually matter.

That's what this newsletter has been about from the start.

Reply and tell me where you sit on this.

Are you worried? Are you already ahead of it? Are you somewhere in the middle still figuring it out?

I read every reply and this is one I genuinely want to hear your take on.

— Prompt Guy