AI All the Way Down: The Meta-Reality of Product Design



I had a moment of profound realization this week. I was sitting down with one of my Product Managers to brainstorm the user experience for a suite of new AI features we are bringing to our product.

As we talked, the irony of the situation washed over me. I was preparing to design an AI-driven user experience, and to do it, I was going to rely almost entirely on my own AI-driven workflow. I am, quite literally, using AI to design AI. It is AI all the way down.

Sometimes, when you are deep in the weeds of component libraries and stakeholder meetings, you forget how fast the ground is shifting beneath us. But moments like this make you realize that we are standing on the brink of a massive paradigm shift in how digital products are built.

Here is a transparent look at my actual, day-to-day UX loop in this new era:


1. The Human Baseline (Figma & Transcripts)

Everything still starts with human intent. My PM and I discuss the core user need and the business logic. I record our meeting and keep the transcript. Then, I retreat to Figma. Despite having AI that can build front-end UI in code, Figma remains my sanctuary for raw visual thinking. I sketch out the initial flows, establishing the baseline architecture of the feature.

2. The AI Co-Pilot (Ideation & Expansion)

This is where the workflow diverges from the past decade of UX. I take our meeting transcript and my initial Figma sketches and feed them to my AI assistant. I don't ask it to design the feature for me; I ask it to spar with me. It analyzes the context and suggests edge cases I might have missed, or alternative layout structures based on the data requirements. I create a few distinct variations based on this session.

3. The Synthetic Peer Review

Before I ever put these designs in front of a stakeholder, I run a synthetic peer review. I prompt my AI to adopt the persona of a highly critical, Staff-level UX Designer. I ask it to tear my concepts apart—looking for accessibility flaws, cognitive overload, or inconsistent interaction patterns. I absorb that critique, refine the designs, and prepare the final presentation.
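To make the synthetic review concrete, here is a rough sketch of how that kind of critique prompt could be assembled programmatically. The persona wording, the criteria, and all the names here are illustrative assumptions, not my actual prompts or tooling.

```typescript
// Sketch of a synthetic peer review prompt builder. All names and wording
// are hypothetical examples, not a real product's prompts.

interface ReviewRequest {
  persona: string;      // the role the model should adopt
  criteria: string[];   // what the critique should focus on
  designNotes: string;  // a description of the design under review
}

function buildCritiquePrompt(req: ReviewRequest): string {
  // Number the criteria so the critique can reference them individually.
  const criteriaList = req.criteria
    .map((c, i) => `${i + 1}. ${c}`)
    .join("\n");

  return [
    `You are ${req.persona}. Be direct and specific; do not soften your critique.`,
    "Review the following design concept against these criteria:",
    criteriaList,
    `Design concept:\n${req.designNotes}`,
  ].join("\n\n");
}

// Example usage, with criteria drawn from the review described above.
const prompt = buildCritiquePrompt({
  persona: "a highly critical, Staff-level UX Designer",
  criteria: [
    "Accessibility flaws (contrast, focus order, screen-reader labels)",
    "Cognitive overload (too many choices or states on one screen)",
    "Inconsistent interaction patterns across the flow",
  ],
  designNotes: "A settings panel that surfaces AI suggestions inline.",
});
```

The point of structuring it this way is repeatability: the same persona and criteria can be run against every concept variation, so the synthetic critique stays consistent across a design exploration.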

4. The Human Alignment

Technology hasn't replaced the need for human alignment. I take the refined prototype back to my PM. We review it, make our final strategic adjustments, and present the functioning prototype to the wider team for our human critique and approval.

5. The Executable Handoff

Once the feature is green-lit, the AI steps back in for the heavy lifting. Instead of handing my developers a static Figma file full of redlines, I use my AI coding tools to generate a functioning UI sandbox. I wire the front-end components to mock data, ensuring the developers receive a living, breathing starting point rather than a static picture.
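To illustrate what "wired to mock data" can look like in practice, here is a minimal sketch. The types and function names (`Suggestion`, `fetchSuggestions`) are hypothetical stand-ins, not the actual product's API.

```typescript
// Hypothetical sketch: a sandbox component's data layer wired to canned
// mock data, so developers inherit a living starting point.

interface Suggestion {
  id: string;
  title: string;
  confidence: number; // 0..1, how confident the AI feature is
}

// Stand-in for the real API. It resolves with canned data after a short
// delay so loading and empty states can be designed against realistically.
function fetchSuggestions(): Promise<Suggestion[]> {
  const mock: Suggestion[] = [
    { id: "s1", title: "Summarize this thread", confidence: 0.92 },
    { id: "s2", title: "Draft a reply", confidence: 0.74 },
  ];
  return new Promise((resolve) => setTimeout(() => resolve(mock), 300));
}

// Render helper the sandbox UI calls to display suggestions, ranked by
// confidence, highest first.
function renderSuggestions(items: Suggestion[]): string {
  return [...items]
    .sort((a, b) => b.confidence - a.confidence)
    .map((s) => `${s.title} (${Math.round(s.confidence * 100)}%)`)
    .join("\n");
}
```

When a developer swaps `fetchSuggestions` for the real endpoint, the component contract (the `Suggestion` shape) is already agreed upon, which is exactly the kind of ambiguity a static redline never resolves.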

The Shift from Creator to Orchestrator

I am still wrapping my head around the implications of this workflow, but the most profound realization is how it has shifted my professional identity.

For the first decade of my career, my value was tied to my ability to manually generate artifacts—drawing the boxes, picking the hex codes, writing the redlines. Now, the friction of generation is gone.

I have essentially become an orchestrator. I am managing a continuous, meta-level loop. I take human strategy from my PM, direct an AI to brainstorm, direct an AI to critique, blend that with vital human critique from my team, and finally direct an AI to output the executable code for my human developers.

The challenge of UX design is no longer just arranging the UI on the screen. The new challenge is curating the logic, balancing synthetic peer reviews with essential human feedback, and acting as the definitive human bridge between a product's intent and its coded reality. We are using machine intelligence to simulate human critique, so we can build better machine intelligence interfaces for humans to use.

It is a dizzying, incredible loop to be caught in. And while I don't know exactly what the next five years hold for the UX industry, I know I wouldn't trade this new seat at the orchestrator's table for anything.

#UXDesign #DesignEngineer #ArtificialIntelligence #ProductManagement #Frontend #DesignProcess #ProductDesign