
WASI 0.2 GA: Truly Composable WebAssembly


Updated: 2026-05-03

WASI 0.2, developed under the name Preview 2, was stabilized in January 2024, and with it the WebAssembly Component Model arrived in production. This is a decisive milestone: server-side WebAssembly moves from experiment to a platform capable of running composable, polyglot code. This article covers what changes in practice, when server-side Wasm makes sense, and which use cases are viable today.

Key takeaways

  • The Component Model resolves the cross-language portability that was blocking Wasm adoption outside the browser.
  • For edge functions with critical cold start, secure plugins, and untrusted-code sandboxing, Wasm is already viable today.
  • For traditional apps, Wasm does not replace containers — it complements them.
  • Rust is the lowest-friction language for Wasm components; others are closing the gap.
  • Sub-1 ms cold start vs. 100–1000 ms for a container is the differential advantage on the edge.

What the Component Model is

Before WASI 0.2, server-side WebAssembly ran on Preview 1: a Unix-like syscall surface and isolated modules. Every integration was manual, and one module's code could not compose with another's without ad-hoc glue.

The Preview 2 Component Model adds:

  • WIT (WebAssembly Interface Types): statically typed interface definitions shared across languages.
  • Components: a reusable unit of composition, like Rust crates or npm packages but multi-language.
  • Dependency injection: components consume interfaces that are plugged in at load time.
  • Type safety across language boundaries.

The result: write one component in Rust, another in Go, another in JavaScript, and compose them without manual glue code. Each compiles to a .wasm component; the host links them together.

WIT interface example

package example:greeter@1.0.0;

interface greeter {
    greet: func(name: string) -> string;
}

world host {
    export greeter;
}

Any language compiling to a Wasm component can implement or consume this interface. The toolchain validates types at compile time; integration errors are caught before deployment.
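As a sketch of the consumer side (the example:app package name is hypothetical), a second component declares the same interface as an import in its own world:

package example:app@1.0.0;

world consumer {
    // depend on the greeter interface from the example:greeter package
    import example:greeter/greeter@1.0.0;

    // entry point this component exposes to the host
    export run: func() -> string;
}

A composition tool such as wasm-tools can then link a component that exports example:greeter/greeter with this consumer; if the signatures disagree, linking fails before anything is deployed.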

Viable real cases today

Edge serverless

Fastly Compute[1], Cloudflare Workers[2] with Wasm, and Fermyon Spin[3] use server-side Wasm for:

  • Sub-1 ms cold start versus 100–1000 ms for a cold container.
  • Strict sandboxing without OS privileges.
  • Multi-runtime portability without recompiling.
  • Massive scaling without persistent state.

Secure plugins for host applications

A host application can load Wasm components as plugins without granting them privileged access: each plugin sees only the interfaces the host explicitly provides. Plugins are portable between runtimes without recompiling and have no access to the host system.
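A minimal sketch of such a plugin contract in WIT (the package, interface, and function names are illustrative, not from any specific project): the host grants only the capabilities it chooses to export, and the plugin's world cannot name anything else:

package host:plugins@0.1.0;

interface log {
    // the only capability the host grants to plugins
    info: func(msg: string);
}

world plugin {
    // the plugin may call this, and nothing else on the host
    import log;

    // the entry point the host invokes
    export run: func(input: string) -> string;
}

Because the plugin's world lists no filesystem or network interfaces, the runtime never instantiates them: isolation holds by construction rather than by policy.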

IoT and embedded

Wasm runs on devices with kilobytes of memory:

  • wasm-micro-runtime[7] (WAMR): for highly resource-constrained IoT.
  • Customisable firmware without reflashing.
  • Sandbox for third-party code on industrial devices.

Main runtimes

Runtime         | Focus                       | Notes
Wasmtime[8]     | Bytecode Alliance reference | Safest default
Wasmer[9]       | Commercial + OSS            | Broad ecosystem
WasmEdge[10]    | Cloud-native                | CNCF, K8s focus
Fermyon Spin[3] | Serverless Wasm             | Complete framework

Language support

Maturity varies significantly:

  • Rust: first-class citizen. cargo-component generates components with minimal friction.
  • C/C++: via wasi-sdk, mature.
  • Go: TinyGo supports Wasm with some limitations vs. full Go.
  • JavaScript: via Javy (for edge) and Componentize-JS.
  • Python: experimental via componentize-py.
  • .NET: emerging support.

Rust is the lowest-friction path today. For greenfield, start there.

Wasm versus containers

Not a replacement — a complement for specific cases:

Aspect     | Wasm component  | Container
Cold start | <1 ms           | 100 ms–1 s
Size       | KB–MB           | MB–GB
Isolation  | Very strong     | Good
Ecosystem  | Emerging        | Massive
OS access  | Limited (safer) | Full

For edge functions, plugins, and sandboxing, Wasm wins on cold start and isolation. For traditional apps with OS dependencies, containers remain the correct default.

Kubernetes and Wasm

Kubernetes integration is maturing through:

  • runwasi[11]: containerd runtime to run Wasm components directly.
  • kwasm[12]: operator to enable Wasm on K8s nodes without changing the base runtime.

Running Wasm on K8s without containers is possible today, especially for short-lived, edge-style stateless workloads.

Current limitations

  • Ecosystem still emerging: many third-party libraries lack Wasm bindings.
  • Debugging: harder than native or container.
  • Async: improving with WASI 0.3 on the roadmap, but not complete.
  • GC languages: Java, C#, and Python are less efficient when compiled to Wasm.

When to use Wasm today versus when to wait

Use today:

  • Edge functions where cold start is critical for UX.
  • Secure plugins for platforms where code is third-party.
  • Untrusted-code sandboxing (AI, user scripting).
  • Cross-platform portability (same binary on Mac, Linux, Windows, ARM, x86).

Wait:

  • Traditional web apps with no clear advantage over containers.
  • Disk- or network-intensive stateful workloads.
  • Heavy computation where native or GPU remain better.

Conclusion

WASI 0.2 GA is the inflection point where server-side WebAssembly stops being a promise. The Component Model resolves the cross-language portability that was blocking adoption. For teams building edge functions, platforms with plugins, or systems sandboxing third-party code, Wasm is already a technically solid choice. For the rest, it is a technology to follow closely. Convergence with Kubernetes through runwasi and adoption by Cloudflare Workers are clear signals the ecosystem is consolidating.

References
  1. Fastly Compute
  2. Cloudflare Workers
  3. Fermyon Spin
  4. Envoy Wasm filters
  5. Proxy-Wasm
  6. Istio WebAssembly plugins
  7. wasm-micro-runtime
  8. Wasmtime
  9. Wasmer
  10. WasmEdge
  11. runwasi
  12. kwasm

Written by

CEO - Jacar Systems

Passionate about technology, cloud infrastructure and artificial intelligence. Writes about DevOps, AI, platforms and software from Madrid.