If you’ve ever jumped between a TypeScript codebase and a Python one, you know the feeling. TypeScript gives you an almost magical type system that lets you slice, dice, and reshape types at compile time. Python, on the other hand, has a type system that’s great for the basics but starts to fall apart the moment you try to do something clever, like modeling what happens when a decorator adds a keyword argument, or when a framework derives a bunch of model variants from a single class definition.

Vercel, best known as a deployment platform, apparently felt this frustration deeply enough to spend a year doing something about it. On March 2, 2026, Yury Selivanov (Director of Engineering at Vercel) and software engineer Michael J. Sullivan published PEP 827: Type Manipulation — a proposal targeting Python 3.15 that aims to give Python’s type system a programmable core inspired by TypeScript’s conditional and mapped types.

This is a big deal. Let’s dig into what it actually means.


The Problem: Python’s Type System Can’t Keep Up With Python’s Runtime

Python is a genuinely weird language, in the best possible way. You can generate entire classes at runtime, decorate functions to completely change their behavior, define APIs that produce different output types based on the values of their arguments, and do all of this with a handful of elegant lines. Metaprogramming isn’t some niche power-user trick — it’s baked into how the language works.

The type system, however, hasn’t kept up.

Every time a library wants the type checker to understand its runtime magic, the options are pretty grim: write a custom mypy plugin (which may or may not work with other type checkers), reach for a special-case decorator like @dataclass_transform (which only covers one narrow pattern), or accept that your users will get no type checking help at all.

According to Meta’s 2025 Typed Python Survey, the most-requested improvement from the Python typing community was TypeScript-inspired features: mapped types, conditional types, utility types like Pick and Omit, and better structural typing. The community has been asking for this for years. Vercel decided to actually build it.


What PEP 827 Proposes

At its core, PEP 827 introduces type-level introspection and construction facilities — essentially a small programming language that operates on types rather than values. The proposal adds:

  • Conditional types — types that resolve to one thing or another depending on a subtype check
  • Unpacked comprehension types — the ability to iterate over type members at the type level, like a list comprehension but for types
  • Type member access — dot notation to access properties of type descriptors (e.g., .name, .type)
  • A new Member and Param system — structured representations of class attributes and function parameters at the type level
  • A suite of type operators — things like Members[T], Attrs[T], GetArg[T, Base, Idx], NewProtocol[...], IsAssignable[T, S], and more

The whole thing is deliberately designed to work with Python’s runtime model, not just satisfy static type checkers. That last point matters because frameworks like FastAPI and Pydantic don’t just use types at check time — they actually evaluate them at runtime to drive validation, serialization, and code generation.


The Concrete Examples Are Where It Gets Real

Let me walk through the examples from the PEP, because they illustrate exactly what problem this is solving.

TypeScript-Style Utility Types, Finally

TypeScript developers have Pick, Omit, Partial, and a dozen other utility types that let you reshape an existing type without rewriting it. Python has… nothing equivalent. Right now if you want to derive a version of a class with some fields removed, you write it by hand.

With PEP 827, Pick would look like this:

# Pick<T, Keys> — constructs a type by picking only the specified properties from T
type Pick[T, Keys] = typing.NewProtocol[
    *[
        p
        for p in typing.Iter[typing.Members[T]]
        if typing.IsAssignable[p.name, Keys]
    ]
]

# Omit<T, Keys> — like Pick, but removes the specified properties instead
type Omit[T, Keys] = typing.NewProtocol[
    *[
        p
        for p in typing.Iter[typing.Members[T]]
        if not typing.IsAssignable[p.name, Keys]
    ]
]

# Partial<T> — makes every property optional (T | None)
type Partial[T] = typing.NewProtocol[
    *[
        typing.Member[p.name, p.type | None, p.quals]
        for p in typing.Iter[typing.Attrs[T]]
    ]
]

That’s Python syntax. Familiar comprehension style, just operating at the type level. The TypeScript side uses a completely different syntax (mapped types with keyof), but here you’re using the same mental model you already have for list comprehensions.
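The comprehension mental model even has a crude runtime analogue you can run today. The sketch below is purely illustrative (it is not part of PEP 827, and the pick helper is a name I'm inventing here): it builds a new dataclass containing only the chosen fields, mirroring the shape of the Pick comprehension above.

```python
# Illustrative runtime analogue of the Pick comprehension (NOT part of PEP 827):
# build a new dataclass containing only the chosen fields.
import dataclasses
from dataclasses import dataclass, make_dataclass

@dataclass
class Hero:
    id: int
    name: str
    secret_name: str

def pick(cls: type, keys: set[str]) -> type:
    # Mirrors: [p for p in Iter[Members[T]] if IsAssignable[p.name, Keys]]
    return make_dataclass(
        f"Picked{cls.__name__}",
        [(f.name, f.type) for f in dataclasses.fields(cls) if f.name in keys],
    )

HeroSummary = pick(Hero, {"id", "name"})
```

The difference is that PEP 827 runs the equivalent comprehension at the type level, so a checker would see HeroSummary’s shape statically instead of only at runtime.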

FastAPI’s CRUD Model Boilerplate — Gone

One of the most painful parts of building a FastAPI backend with SQLModel or Pydantic is that you end up manually writing four nearly-identical model classes for every entity: the database model, the public model, the create model, and the update model.

The FastAPI docs literally walk you through writing all of this by hand:

class HeroBase(SQLModel):
    name: str = Field(index=True)
    age: int | None = Field(default=None, index=True)

class Hero(HeroBase, table=True):
    id: int | None = Field(default=None, primary_key=True)
    secret_name: str

class HeroPublic(HeroBase):
    id: int

class HeroCreate(HeroBase):
    secret_name: str

class HeroUpdate(HeroBase):
    name: str | None = None
    age: int | None = None
    secret_name: str | None = None

That’s a lot of repetition. Every time Hero changes, you may need to touch several of these classes. With PEP 827, a framework like FastAPI could define Public, Create, and Update as computed types, and you’d just write:

class Hero(NewSQLModel, table=True):
    id: int | None = Field(default=None, primary_key=True)
    name: str = Field(index=True)
    age: int | None = Field(default=None, index=True)
    secret_name: str = Field(hidden=True)

type HeroPublic = Public[Hero]
type HeroCreate = Create[Hero]
type HeroUpdate = Update[Hero]
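Some of these operators could plausibly be built from the primitives shown earlier. As a hypothetical sketch (not the PEP’s actual definition, and with the primary-key name hardcoded for brevity):

```python
# Hypothetical composition, not from the PEP: an Update model is every field
# except the primary key, with every remaining field made optional.
type Update[T] = Partial[Omit[T, Literal["id"]]]
```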

The types would be fully evaluated both statically (by the type checker) and at runtime (by Pydantic for validation). Here’s what a Create type operator could look like under the hood:

type Create[T] = typing.NewProtocol[
    *[
        typing.Member[
            p.name,
            p.type,
            p.quals,
            GetDefault[p.init],
        ]
        for p in typing.Iter[typing.Attrs[T]]
        if not typing.IsAssignable[
            Literal[True],
            GetFieldItem[p.init, Literal["primary_key"]],
        ]
    ]
]

That iterates over every attribute of T, skips the primary key, and preserves defaults. The GetDefault helper is itself a conditional type alias that checks whether the attribute’s initializer is a Field and pulls out its default value if so.
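The conditional-type syntax makes that kind of helper compact. Here is a hypothetical sketch of GetDefault (the PEP’s exact definition may differ):

```python
# Hypothetical sketch, not the PEP's exact definition: resolve to the Field's
# "default" entry when the initializer is a Field, otherwise to Never.
type GetDefault[T] = (
    GetFieldItem[T, Literal["default"]]
    if typing.IsAssignable[T, Field]
    else typing.Never
)
```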

Dataclasses, But Your Own Version

Another motivating use case is generating __init__ methods. The standard @dataclass decorator does this, and @dataclass_transform was added by PEP 681 as a special-case mechanism to let type checkers understand that pattern. But any library that wants similar behavior either relies on that one narrow special case or writes a mypy plugin.

With PEP 827, you could define a reusable InitFnType type alias and a @dataclass_ish decorator:

type InitFnType[T] = typing.Member[
    Literal["__init__"],
    Callable[
        [
            typing.Param[Literal["self"], Self],
            *[
                typing.Param[
                    p.name,
                    p.type,
                    Literal["keyword"]
                    if typing.IsAssignable[GetDefault[p.init], Never]
                    else Literal["keyword", "default"],
                ]
                for p in typing.Iter[typing.Attrs[T]]
            ],
        ],
        None,
    ],
    Literal["ClassVar"],
]

def dataclass_ish[T](cls: type[T]) -> typing.UpdateClass[InitFnType[T]]:
    pass

Or the base-class version (Pydantic-style), where subclasses automatically get the computed __init__:

class Model:
    def __init_subclass__[T](
        cls: type[T],
    ) -> typing.UpdateClass[InitFnType[T]]:
        super().__init_subclass__()

No mypy plugin required. No special-case PEP. Just composable type-level programming.
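Usage would look like ordinary dataclass code. A hypothetical sketch of how a PEP 827-aware checker would read it:

```python
# Hypothetical usage: the checker synthesizes __init__ from InitFnType[Point],
# so keyword arguments and defaults are fully type-checked.
@dataclass_ish
class Point:
    x: int
    y: int = 0

p = Point(x=1)        # OK: y falls back to its default
q = Point(x=1, y=2)   # OK
r = Point(z=3)        # static type error: unknown parameter "z"
```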

NumPy Broadcasting as a Type

For the math nerds: the PEP also includes a full implementation of NumPy-style broadcasting rules at the type level, which lets the type checker verify that array shapes are compatible before your code even runs:

class Array[DType, *Shape]:
    def __add__[*Shape2](
        self,
        other: Array[DType, *Shape2]
    ) -> Array[DType, *Broadcast[tuple[*Shape], tuple[*Shape2]]]:
        raise NotImplementedError  # stub body; only the signature matters here

The Broadcast type alias is recursive, walking down the shapes from the right and applying a MergeOne check that handles the broadcasting rules (like Literal[1] broadcasting to any size). Type errors get raised via RaiseError[Literal["Broadcast mismatch"], T, S] — which is a proper static type error, not a runtime exception.
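The type-level algorithm mirrors what NumPy does with values. For reference, here is the value-level broadcasting rule that Broadcast encodes, written as a plain function (a sketch of the standard NumPy rule, not code from the PEP):

```python
# Value-level version of NumPy's broadcasting rule, for reference only.
def broadcast(s1: tuple[int, ...], s2: tuple[int, ...]) -> tuple[int, ...]:
    n = max(len(s1), len(s2))
    # Pad the shorter shape with leading 1s, then merge dimension by dimension.
    p1 = (1,) * (n - len(s1)) + s1
    p2 = (1,) * (n - len(s2)) + s2
    out = []
    for a, b in zip(p1, p2):
        if a == b or b == 1:
            out.append(a)       # equal dims, or b broadcasts up to a
        elif a == 1:
            out.append(b)       # a broadcasts up to b
        else:
            raise TypeError(f"Broadcast mismatch: {a} vs {b}")
    return tuple(out)
```

Here broadcast((3, 1), (1, 4)) yields (3, 4), while broadcast((2,), (3,)) raises, which is the same failure the type-level RaiseError surfaces statically.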


How It Differs From TypeScript

The blog post is careful to point out that this isn’t trying to make Python look like TypeScript. TypeScript’s mapped types use a custom syntax ({ [K in keyof T]: ... }) that’s pretty different from the rest of the language. Python’s version stays in Python: you’re writing comprehensions, conditionals, and attribute access using familiar constructs, just applied at the type level.

The side-by-side comparison from the blog post is illuminating. TypeScript’s Pick requires a special keyof operator and mapped type syntax. Python’s version uses a comprehension with an if filter. TypeScript’s Omit then has to compose on top of Pick in a non-obvious way; Python’s Omit is just Pick with the condition inverted — same structure, different filter.

TypeScript’s system is powerful but syntactically alien to normal TypeScript code. Python’s proposed system feels like Python.


Runtime Evaluation: The Hard Part

One thing that makes this proposal genuinely difficult is the requirement to support runtime evaluation. This isn’t just about satisfying mypy — FastAPI, Pydantic, and countless other frameworks need to evaluate type annotations at runtime to drive actual behavior.

That means the new conditional types (tt if tb else tf) and comprehension types (*[... for t in Iter[T]]) can’t just be opaque static annotations. They have to actually compute the right thing when called.

The PEP handles this through a special_form_evaluator context variable that allows a runtime evaluator library to hook into boolean and iteration evaluation. The proposal authors are planning to publish a third-party evaluator library (there’s already a demo at github.com/vercel/python-typemap), and there’s an in-progress proof-of-concept implementation in mypy that can already handle the ORM, FastAPI model derivation, and NumPy broadcasting examples.


Why Is Vercel Doing This?

Fair question. Vercel is a deployment platform. What are they doing writing Python PEPs?

The answer they give is honest and makes sense: they build across TypeScript and Python, and they want both ecosystems to be first-class. Their AI SDK is TypeScript, their infrastructure tooling intersects Python everywhere, and they clearly have engineers who care deeply about developer experience.

More specifically, Yury Selivanov is one of the key people behind Python’s asyncio and the author of the async/await syntax (PEP 492); he was at Facebook/Meta and EdgeDB before Vercel. So this isn’t a company pretending to care about Python: these are people who’ve been contributing to Python’s core for years.

The closing line of the blog post is worth quoting directly: “One might ask: in an age where agents are writing an increasing share of source code, should we even care about programming language syntax, tooling, or type system capabilities? We argue the answer is, more than ever, ‘yes’. We want type checkers to be more thorough and frameworks to be more expressive… The less boilerplate we have to maintain, the better.”

That last part is the real argument. Better types mean better autocomplete, safer AI-generated code, less debugging, fewer runtime surprises.


Where It Stands

PEP 827 is currently a Draft, submitted February 27, 2026, targeting Python 3.15. PEPs get debated, revised, and sometimes rejected — the process takes time, and there are open questions in this one around syntax alternatives, how strictly the type operators should be validated, and how UpdateClass handles evaluation order.

The proof-of-concept implementation in mypy covers the core use cases (ORM-style queries, FastAPI model derivation, NumPy broadcasting) but is still missing callable support and UpdateClass. The runtime evaluator demo is available to experiment with today.

If you want to track progress or contribute feedback, the discussion is live on Python Discourse and the reference implementation is at github.com/vercel/python-typemap.

This is one of the more ambitious Python typing proposals in years. Whether it lands in 3.15 or gets revised along the way, the fact that the community is finally seriously tackling TypeScript-style type manipulation for Python feels like a genuinely important moment for the language.


Source: https://vercel.com/blog/advancing-python-typing
PEP 827: https://peps.python.org/pep-0827/