---
title: "Context engineering: where it goes next"
url: https://mdfy.app/AXAIfrBG
updated: 2026-05-14T18:15:49.480Z
source: "mdfy.app"
---
# Context engineering: where it goes next

> Trend note, written 2026-05-12. I'll mark this as out of date if it doesn't hold up.

## The bet

The phrase "prompt engineering" is collapsing into "context engineering" over the next 12 months. Both will exist; one will become the load-bearing skill.

## What I'm seeing

Every AI tool that mattered in 2024 was a chat box. Every AI tool that matters in 2026 has at least one persistent context layer:

- Cursor — project context, sidebar history, AGENTS.md
- Claude Projects — project files, project instructions, scoped memory
- ChatGPT — GPTs, memory, project workspaces
- Codex CLI — AGENTS.md, conversation transcripts
- Claude Code — CLAUDE.md, hooks, MCP servers

They all converged on the same primitive: shaped, persistent context that lives outside the live conversation. The thing that varies between them is *how* you shape it, *where* it lives, and *which AI can read it.*
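To make "shaped, persistent context" concrete: across these tools the primitive tends to look like a short markdown file the assistant loads before every conversation. This is a hypothetical sketch, not any tool's required schema — the section names and contents are invented for illustration:

```markdown
# Project context (loaded before every session)

## What this project is
A CLI for syncing markdown notes to a static site.

## Conventions the AI should follow
- TypeScript, strict mode, no default exports.
- Tests live next to source files as *.test.ts.

## Decisions already made (don't relitigate)
- SQLite for local state, not JSON files.
```

The file format barely matters; what the tools converged on is the lifecycle — authored once, persisted outside the chat, injected into every session.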

## Why this matters for mdfy

The convergence is the wedge. If shaped context is the new primitive, then a **portable** shape — one URL that every tool reads — is structurally valuable. The AI companies can't build it because their incentive is the wall. A third party can.

## The risk

The big AI companies could decide to interoperate. Anthropic and OpenAI could agree on a common context format. If that happens, mdfy's "we make context portable" pitch is weaker.

## Why I don't think that's the world we're in

- **The wall is the business model.** OpenAI's revenue comes from people staying inside ChatGPT. Anthropic's comes from people staying inside Claude. Neither has a structural reason to make context portable across providers.
- **MCP is the closest they've come.** And MCP is a tool-call protocol, not a memory protocol. It doesn't make context portable; it makes individual tools invokable from multiple hosts.
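The distinction is visible on the wire. An MCP exchange is JSON-RPC, and the core request is `tools/call` — a single function invocation, not a read or write of shared memory. The framing below follows the MCP specification; the tool name and arguments are made up for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "context portability" }
  }
}
```

Nothing in that shape carries durable state between hosts; each host gets the tool's return value for that one call, which is exactly the "callable, not portable" point above.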

## What I'd do differently if I'm wrong

If the AI companies do interoperate, mdfy still has a wedge: **authored** context, where the user writes what they want preserved. The portable layer is the easier win, but the authoring layer doesn't depend on portability surviving.
