
Fastly Compute: High-Performance Edge with WebAssembly

Updated: 2026-05-03

Fastly Compute[1] (formerly Compute@Edge) is Fastly’s bet on serious edge computing. While Cloudflare Workers uses V8 isolates and a mature JavaScript ecosystem, Fastly chose pure WebAssembly, originally on its own Lucet runtime and now on Wasmtime. Two different philosophies with implications for performance, portability, and ergonomics. This article honestly compares both and explains when Fastly is the right choice.

Fastly’s Technical Bet

Fastly didn’t build its own JavaScript runtime or lean on Node. It chose WebAssembly as the universal target:

  • Your code (Rust, Go, AssemblyScript, JS via Javy) compiles to .wasm.
  • Fastly deploys the binary to its ~70 global PoPs.
  • Each request invokes an ephemeral Wasm module instance.
  • Cold start: 35 microseconds (μs, not ms) — Fastly’s central marketing argument.

The number is real but deserves context: Workers also has sub-5ms cold starts. The 1ms→35μs difference isn’t visible to end users, but it does allow running thousands of function invocations per request without accumulated overhead.

Supported Languages

Four officially supported:

  • Rust: first-class citizen. Most mature tooling (fastly crate), most complete documentation.
  • JavaScript: JS plus a TypeScript subset, compiled ahead-of-time to Wasm via Javy[2].
  • Go: supported via TinyGo. Limitations vs standard Go.
  • AssemblyScript: TypeScript subset that compiles directly to Wasm.

For new projects, Rust is the recommendation. For pure-JS teams, support is good but doesn’t match Workers’ ergonomics, where the npm ecosystem is much more tightly integrated.

Execution Model

Each request lifecycle in Fastly Compute:

  1. Arrives at the nearest PoP to the user.
  2. Wasm module is instantiated (35μs cold start).
  3. Handler executes with access to request/response/stores.
  4. Instance is destroyed — stateless execution by default.
  5. Response is sent to the client.

No shared state between requests within the same process. Fastly prioritises determinism and strong isolation over state convenience.
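The stateless model can be sketched in plain Rust (no Fastly SDK; the `HandlerInstance` type is a hypothetical stand-in for a Wasm module instance): a fresh instance is constructed per request and destroyed afterwards, so per-instance state never survives into the next request.

```rust
// Sketch of Fastly's per-request isolation model in plain Rust (no SDK).
// Every request constructs a fresh handler instance, so state written
// during one request is gone before the next one runs.
struct HandlerInstance {
    requests_seen: u32, // local to this instance only
}

impl HandlerInstance {
    fn new() -> Self {
        HandlerInstance { requests_seen: 0 }
    }

    fn handle(&mut self, path: &str) -> String {
        self.requests_seen += 1;
        format!("{} handled (instance count: {})", path, self.requests_seen)
    }
}

fn serve(path: &str) -> String {
    // A new instance per request, destroyed afterwards: no shared state.
    let mut instance = HandlerInstance::new();
    instance.handle(path)
}

fn main() {
    // The counter never exceeds 1, because no instance outlives a request.
    println!("{}", serve("/a"));
    println!("{}", serve("/b"));
}
```

Anything that must survive across requests (sessions, counters, caches) has to live in an external store such as Fastly’s KV Store, not in module memory.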

Rust Hello World

```rust
use fastly::http::{Method, StatusCode};
use fastly::{Error, Request, Response};

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    match (req.get_method(), req.get_path()) {
        (&Method::GET, "/") => Ok(Response::from_status(StatusCode::OK)
            .with_body("Hello from Fastly Compute")),
        _ => Ok(Response::from_status(StatusCode::NOT_FOUND)),
    }
}
```

Deploy with one command: fastly compute publish.

Fastly Compute vs Cloudflare Workers

| Aspect | Fastly Compute | Cloudflare Workers |
|---|---|---|
| Runtime | WebAssembly (Wasmtime) | V8 isolates |
| Cold start | 35 μs | ~1–5 ms |
| Main languages | Rust, JS | JS, TypeScript |
| PoPs | ~70 | ~300+ |
| KV/store | KV Store (simple) | Workers KV (sophisticated), D1, R2 |
| Entry price | $50/mo | $5/mo |
| Portability | Standard Wasm | CF-locked |

For compute-intensive code in Rust/Go, Fastly performs better. For JS/TS apps with npm ecosystem, Workers is more productive. Cloudflare has a clear advantage in PoP count and entry pricing.

Where Fastly Clearly Wins

Scenarios where Fastly is the natural choice:

  • Portable Rust code: if you already have critical logic in Rust, porting is natural.
  • Enterprise compliance and control.
  • Integration with an existing Fastly CDN.
  • Predictable performance for high-concurrency workloads.
  • Serious custom edge logic: complex A/B tests, HTML rewriting, image transforms.

Where Workers Wins

Cloudflare Workers dominates in:

  • JS ecosystem: npm packages work directly.
  • Storage primitives: Durable Objects, KV, and D1.
  • Entry pricing.
  • Geographic coverage (~300+ PoPs).
  • Community resources and examples.

Common Real-World Patterns

  • API gateway: validate auth, transform requests, route by feature flags.
  • Image optimisation: resize and optimise on-the-fly at the edge.
  • HTML rewriting: inject consent banners or personalised content.
  • A/B testing: split traffic with deterministic logic.
  • Geo-based routing: serve different content by country or region.
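The A/B-testing pattern above relies on deterministic bucketing: hash a stable user identifier into 0–99 and compare against the experiment’s rollout percentage, so the same user always sees the same variant with no coordination between PoPs. A minimal plain-Rust sketch (names like `ab_bucket` and `variant` are illustrative, not Fastly APIs):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Deterministic A/B bucketing: hash (user_id, experiment) into 0..100.
// NOTE: DefaultHasher is not guaranteed stable across Rust versions;
// production code should use a hasher with a fixed, documented algorithm.
fn ab_bucket(user_id: &str, experiment: &str) -> u64 {
    let mut h = DefaultHasher::new();
    // Mix in the experiment name so different tests split independently.
    (user_id, experiment).hash(&mut h);
    h.finish() % 100
}

fn variant(user_id: &str, experiment: &str, percent_b: u64) -> &'static str {
    if ab_bucket(user_id, experiment) < percent_b { "B" } else { "A" }
}

fn main() {
    let v = variant("user-42", "new-checkout", 20);
    // Deterministic: repeated calls always return the same variant.
    assert_eq!(v, variant("user-42", "new-checkout", 20));
    println!("user-42 → {}", v);
}
```

Because the split is a pure function of the request, every PoP computes the same answer independently, which is exactly where the stateless execution model shines.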

Realistic Costs

Fastly Compute: $50/month Essential plan includes 1M requests, $0.50/1M additional. Cloudflare Workers: free tier at 100k requests/day, $5/month paid plan with 10M requests, $0.30/1M additional.
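A worked example makes the pricing gap concrete. Using the figures above (assumed current; check each vendor’s pricing page), at 50M requests/month:

```rust
// Monthly cost: base plan fee plus per-million overage beyond the
// included request volume. Prices taken from the figures in the text.
fn monthly_cost(base: f64, included_m: f64, per_extra_m: f64, requests_m: f64) -> f64 {
    let extra = (requests_m - included_m).max(0.0);
    base + extra * per_extra_m
}

fn main() {
    let requests_m = 50.0; // 50M requests/month
    let fastly = monthly_cost(50.0, 1.0, 0.50, requests_m);  // $50 base, 1M included
    let workers = monthly_cost(5.0, 10.0, 0.30, requests_m); // $5 base, 10M included
    println!("Fastly:  ${:.2}", fastly);  // $74.50
    println!("Workers: ${:.2}", workers); // $17.00
}
```

At this volume Workers is roughly 4× cheaper; the gap narrows at very high volumes, where the base fee is dominated by per-request overage.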

Current Limitations

  • Limited binary size (~100 MB).
  • Max time per request: ~60s.
  • No persistent native WebSockets.
  • Limited Wasm debugging vs JS devtools.
  • Fewer community resources and examples than Workers.

Conclusion

Fastly Compute is a serious technical bet on WebAssembly at the edge. For compute-intensive workloads, complex Rust logic, or teams already on Fastly CDN, it’s the more performant option. For the median case of “edge JS logic with npm ecosystem and low entry price”, Cloudflare Workers remains more practical. The choice isn’t ideological — it’s contextual: what language, what team, what scale, what compliance.

  1. Fastly Compute
  2. Javy

Written by

CEO - Jacar Systems

Passionate about technology, cloud infrastructure and artificial intelligence. Writes about DevOps, AI, platforms and software from Madrid.