Why did this shift happen?

Let’s be honest: doing research through a classic search engine has become a chore. You type a question, get 10 million results, click on three, close two, and scroll endlessly through the third. Every topic has 15 conflicting opinions. Given enough time, you can find any answer that matches what you already wanted to believe.

Now people want something faster: immediate, filtered, sorted.

They want the heavy lifting done for them. The AI reads everything, summarizes it, gives a conclusion — no ads, no SEO spam, no mental load.

And it works.

Mostly.

Google added Gemini to its search results more than a year ago, a first glimpse of this new way of answering questions.

What we lose when we outsource judgment

There’s a catch though.

By giving that responsibility to AI, we also hand over our own critical filter — our ability to decide what’s relevant, nuanced, or worth trusting.

It’s like asking someone to do research for you. Maybe they’re efficient, but what if they missed the key detail you would’ve noticed? What if they quietly skipped over the uncomfortable bits? What if they’re subtly shaping what you think is “true”?

Not saying there’s some conspiracy here — but there’s a structural risk.

AI doesn’t “see” truth; it curates it. And that’s a lot of power in the hands of the systems deciding what’s relevant.

The next SEO problem

That brings us to a new question: how does AI pick what it uses?

When you ask ChatGPT something, the answer doesn’t come out of thin air — it’s built on existing content.

So which content makes it in? And how do we, as creators, stay visible when “search results” start disappearing?

For years, the game was clear:

  • Write clearly.

  • Optimize metadata.

  • Add keywords.

  • Get backlinks.

Classic SEO. You wrote for Google’s crawlers.

Now, you’ll write for AI selectors.

The same instincts apply, but the audience is different.

AI cares about structure, reliability, and internal logic more than backlinks or keyword stuffing. It needs clean, contextual, and verified information to pull from.

If you’re writing a blog today, imagine your reader isn’t a person scrolling but an AI skimming.

Your content should be easy to extract, summarize, and reference. That’s how you stay relevant in a post-search era.
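
One concrete way to be “easy to extract” is structured data. Here’s a minimal TypeScript sketch that emits schema.org Article markup as JSON-LD for a post. The schema.org type and field names are real, but the Post shape, the helper, and the sample values are illustrative, and whether AI selectors weigh JSON-LD the way search crawlers do is an assumption on my part.

```typescript
// A minimal sketch: emit schema.org "Article" JSON-LD so a machine reader
// gets title, author, date, and summary without parsing the page layout.
// The Post interface, helper name, and sample values are illustrative.

interface Post {
  title: string;
  author: string;
  published: string; // ISO 8601 date
  summary: string;
  url: string;
}

function toJsonLd(post: Post): string {
  return JSON.stringify(
    {
      "@context": "https://schema.org",
      "@type": "Article",
      headline: post.title,
      author: { "@type": "Person", name: post.author },
      datePublished: post.published,
      description: post.summary,
      url: post.url,
    },
    null,
    2,
  );
}

// Embed the output in a <script type="application/ld+json"> tag in the page head.
console.log(
  toJsonLd({
    title: "The post-search web",
    author: "Jane Doe",
    published: "2025-01-15",
    summary: "How AI answers change SEO, analytics, and web design.",
    url: "https://example.com/post-search-web",
  }),
);
```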

What happens to your Google Analytics stats?

If fewer people actually visit your site, does “traffic” still mean anything?

Will we start seeing analytics like:

“Used 3,294 times as a reference by ChatGPT in the last 28 days.”


That’s probably where we’re heading.

Your content’s value might soon be measured by how often AI systems cite it — not how many humans click it.
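
If that happens, the unit of measurement shifts from the pageview to the citation. Purely as a thought experiment, here’s what such a record and its rollup might look like in TypeScript; no analytics product exposes anything like this today, and every name below is invented.

```typescript
// Hypothetical: an "AI citation" analytics record. Every field name here
// is invented; no current analytics product reports this.

interface AiCitationEvent {
  url: string;       // the page that was cited
  assistant: string; // e.g. "ChatGPT", "Gemini"
  citedAt: Date;     // when an answer referenced the page
}

// Roll citations up per page over a trailing window, the way
// pageviews are rolled up today.
function citationsPerPage(
  events: AiCitationEvent[],
  windowDays: number,
): Map<string, number> {
  const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
  const counts = new Map<string, number>();
  for (const e of events) {
    if (e.citedAt.getTime() >= cutoff) {
      counts.set(e.url, (counts.get(e.url) ?? 0) + 1);
    }
  }
  return counts;
}
```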

Website design will split in two directions

I see two paths forming.

  1. The efficient web:

    Sites built purely to be read and understood by machines. Clean code, solid semantics, accessible data. Design will be minimal, maybe even generic. It’s not for humans — it’s for bots.

  2. The experiential web:

    The opposite extreme.

    Where design still matters — maybe more than ever. Luxury brands, tech giants, art projects, events… anyone selling emotion or immersion will keep investing heavily in custom design. Because a chat interface can’t replace that feeling.

We’ll see more cinematic sites, 3D environments, interactive storytelling. Not to rank, but to resonate.

My prediction

The web is splitting between information and experience.

One will be optimized for AI, the other for humans.

Everything in between will slowly fade into noise.

About me


Hey, I’m François Savard from END Agency.

I design clear, functional products that cut friction and remove unnecessary decisions. END Space is my newsletter where I share ideas, trends, and what I’m working on.

🌉 Background: Creative Director at a digital marketing agency in Norway for 5 years, then moved back to France to create END Agency.

🏄🏼‍♂️ Current focus: keeping my brain sane in the age of AI
