A Visual Essay

The Compression Problem

Why software delivery breaks at every handoff, and the architecture that fixes it.

March 2026 · 10 min read

AI coding tools made writing code dramatically faster. But faster coding on the wrong thing is just faster waste.

The bottleneck was never typing speed. It was understanding. And understanding gets destroyed at every stage of the software delivery lifecycle — silently, systematically, by compression.

Step 01
See the Compression

Your delivery pipeline is a lossy codec

A business user has a problem. It's rich, emotional, full of context — she knows her customers, her constraints, the politics of her organization, the reason this matters now.

That intent passes through four stages. At each stage, it gets compressed. The developer at the end is reading three lines in a ticket. The richness is gone.

Business Intent (100% fidelity) → PRD (~60% fidelity) → Design (~35% fidelity) → Code (~15% fidelity)
"The developer implementing the feature doesn't know — and can't ask — the 'why' behind the requirement. It's all mandates all over the place."

This is not a theoretical problem. Every escalation, every misbuilt feature, every "that's not what I asked for" traces back to the same root cause: the person building didn't have the context of the person requesting.

01. Discovery takes a month

Medium-complexity apps: 4+ weeks of calls, UX iterations, and requirement docs before a line of code. AI made coding faster, but didn't touch this.

02. The handover destroys intent

Sales knows the business context. Delivery gets a "do" mandate. The finance head wanted cost savings — the team built automation and forgot to measure savings.

03. Eight levels of approval

One account required eight levels of sign-off on requirements. Nobody mapped this upfront. Discovered in week three. Months of delay.

04. Teams don't know the domain

Delivery teams don't research the customer's business, industry, or stakeholder personas before the first call. They build what's asked, not what's needed.

05. Expectations drift

On projects longer than two months, stakeholders change their minds. AI awareness shifts expectations. They forget what they originally said.

06. Outcomes go unmeasured

CXOs care about business ROI. Users care about task efficiency. Delivery teams conflate or ignore this distinction — and never measure either.

Every one of these pain points traces to the same mechanism: compression. Richness in, bullet points out. Repeat at every handoff until the original intent is unrecognizable.

Step 02
Preserve the Context

What if nothing was lost?

Compression is inevitable — that's the law of context engineering. We have to summarize. We have to create artifacts. The question is whether the rich, uncompressed source is preserved and queryable.

"We have to compress — that's the law of context engineering. But if we start saving the uncompressed data and have loops that always go back to it, only AI can do that at scale."

The concept is zoomable context. Every compressed artifact links back to its source. A line in the PRD is traceable to the exact moment in a stakeholder conversation. A design decision traces to the business constraint that drove it.

Each layer below zooms one level deeper into the same requirement:

Depth 0 — Surface: PRD

"System shall support 3-way invoice matching with configurable tolerance."

This is what the developer sees. A compressed requirement. Three lines. No context on why, for whom, or what success looks like.

Depth 1 — Business Context

The finance head is paying $10,200/month for a manual AP team doing data entry in Faridabad. She asked for automation as the natural next step from BPO. The requirement exists because she wants to eliminate this cost. The tolerance parameter (±5%) comes from PO variance policies in their procurement handbook.

Depth 2 — Business Conversation

"I'm paying $10K for the manual team. Show me savings with automation — that's the whole point."

Said by the VP of Finance during a discovery call. She was emphatic. The word "savings" came up four times in the same conversation. This is the emotional core of the requirement — and it never made it into the PRD.

Depth 3 — Success Metric

Success = ROI dashboard showing $10.2K manual cost vs. automated cost, updated monthly.

The feature isn't "3-way matching." The feature is a visible cost comparison that proves the automation is saving money. The matching engine is a means to this end. Without measuring the outcome, the project has technically succeeded but actually failed.

The developer who only reads "3-way invoice matching" builds the matching engine and ships. The developer who can zoom to the original conversation builds the matching engine and the ROI dashboard — because they understand the intent.

"At every stage there's compression happening. You lose so much fidelity because the person actually implementing doesn't have access to the richness of this information. All he's reading is three lines."
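A minimal sketch of what zoomable context could look like as a data structure (the `ContextNode` type, its fields, and the linking direction are illustrative assumptions, not from the essay): each compressed layer keeps a pointer to the richer layer it was compressed from, so any requirement can be walked back to the original conversation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContextNode:
    """One layer of context: a compressed artifact plus a link to its source."""
    depth: int
    label: str
    content: str
    source: Optional["ContextNode"] = None  # the richer layer this was compressed from

    def zoom_out(self):
        """Walk from the compressed surface back toward the uncompressed source."""
        node = self
        while node is not None:
            yield node
            node = node.source

# The four layers from the invoice-matching example, linked deepest-first.
metric = ContextNode(3, "Success Metric",
    "ROI dashboard: $10.2K manual cost vs. automated cost, updated monthly")
conversation = ContextNode(2, "Business Conversation",
    "'Show me savings with automation -- that's the whole point.'", source=metric)
context = ContextNode(1, "Business Context",
    "Finance head pays $10,200/month for a manual AP team", source=conversation)
prd = ContextNode(0, "PRD",
    "System shall support 3-way invoice matching with configurable tolerance",
    source=context)

for layer in prd.zoom_out():
    print(f"Depth {layer.depth} [{layer.label}]: {layer.content}")
```

The design choice that matters is the `source` pointer: the PRD line is not a dead end but the entry point of a chain that terminates in the success metric.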

The context pipeline

Zoomable context needs architecture. It's not a feature — it's a substrate. Every artifact, every decision, every line of code traces back to the uncompressed source through a persistent context pipeline.

AI agents don't consume compressed artifacts. They query the full, uncompressed context before making decisions. "What was the business intent behind this feature?" is a question any agent can answer, at any stage.
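One way to picture that query path (the store layout, field names, and `business_intent` helper are hypothetical, invented for illustration): the pipeline keeps the uncompressed transcript alongside each compressed artifact, and an agent answers "why does this feature exist?" from the transcript, not from the requirement text.

```python
# Hypothetical context store: every artifact keeps a pointer to its
# uncompressed source (a transcript span) and the agreed success metric.
CONTEXT_STORE = {
    "feature/3-way-matching": {
        "artifact": "System shall support 3-way invoice matching",
        "source_transcript": (
            "VP Finance, discovery call: 'I'm paying $10K for the manual "
            "team. Show me savings with automation -- that's the whole point.'"
        ),
        "success_metric": "monthly dashboard: manual AP cost vs. automated cost",
    }
}

def business_intent(feature_id: str) -> str:
    """Answer 'what was the business intent?' from the uncompressed source,
    not from the compressed requirement text."""
    record = CONTEXT_STORE[feature_id]
    return f"{record['source_transcript']} (measure: {record['success_metric']})"

print(business_intent("feature/3-way-matching"))
```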

Step 03
Close the Loop

From intent to measured outcome

Solving compression isn't enough if the loop doesn't close. A business user requests software. AI and humans build it. But who checks whether the business outcome was achieved?

The architecture is three phases — Plan, Build, Test — running as a continuous loop. The context pipeline connects all three.

"The differentiation will be in this end-to-end — not just coding. The business user being able to explain, and then the whole observability and outcome. That loop — nobody's yet thinking about it."
The Old Way

In the loop

The finance head asks for AP automation to cut costs. The requirement becomes "3-way invoice matching with configurable tolerance." A team builds the matching engine and ships it. Nobody measures whether she actually saved the $10,200/month. The project is marked complete. The outcome is unknown.

The New Way

On the loop

She describes what she needs. AI captures her exact words, the cost she wants to cut, the success metric. Agents build the matching engine and the ROI dashboard. Every month, the system reports: manual AP cost vs. automated cost. She sees the savings. The loop closes.

Three capabilities make this loop possible

Discovery agents

AI that captures context from conversations, structures requirements, detects gaps, and fills domain knowledge. Calls stakeholders, asks pointed questions, returns structured answers.

Cloud sandboxes

Isolated development environments that spin up on demand. The agent gets a full VM with codebase, test suite, databases. Finished code exits as a PR — not a local diff.

Outcome measurement

Validation against the original business intent, not just code correctness. The system asks: "Did the finance head save $10K/month?" — not just "Did the tests pass?"
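A sketch of what that outcome check might reduce to (the `outcome_achieved` helper, the savings threshold, and the automated-cost figure are all assumptions for illustration): the validation gate compares the measured costs from the essay's example, not the test results.

```python
def outcome_achieved(manual_cost: float, automated_cost: float,
                     min_savings_ratio: float = 0.5) -> bool:
    """Validate the business outcome, not code correctness: did automation
    cut the manual AP cost by at least the agreed fraction?"""
    savings = manual_cost - automated_cost
    return savings / manual_cost >= min_savings_ratio

# The finance head's number from the essay: $10,200/month manual cost.
# The automated-cost figure below is an assumed placeholder.
print(outcome_achieved(10_200, 3_100))
```

The point is what the function takes as input: dollar amounts from production, not a test report. "Did the tests pass?" never appears.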

The compressed thought
The gap between what's asked and what's built isn't a people problem. It's a missing architecture.