
Deno + Elixir + Erlang/OTP WHY & HOW


Intelligent AI agents can write code that calls existing APIs from inside secure sandboxes, giving higher token efficiency, better performance, and lower per-turn latency. Deno’s Rust primitives can be embedded in such an agent harness and driven from the Elixir programming language on the Erlang/OTP runtime, which lets you construct agentic systems that are massively concurrent and fault-tolerant.


Evadne Wu

April 03, 2026

Transcript

1. Precepts — We want to build agents. They must interact with the outside world: either by calling APIs directly, or by calling MCP tools. We want to allow agents to do heavy, analytical work, such as munging data into CSVs, without doing it within the context window.
2. Why JavaScript? — Let the agents write code. We will execute the code in a secure sandbox. → Subsequent consideration: what language? We must choose a language that agents can write: “Worse is better”… JavaScript.
3. (High-Level) Requirements — We want high-performance, one-off JavaScript evals, with as little ceremony as possible; we want to not leak data; we want to retain as much code as we can on our existing Elixir stack, due to its concurrent and fault-tolerant attributes.
4. Design Principles —
    import() must work, so agents may import stuff …so there is no need to reinvent the wheel every time;
    fetch() must work, so agents can actually call APIs …but we don’t want to just allow connections anywhere;
    dangerous/unnecessary APIs are removed …but useful APIs need to be retained.
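The last principle can be pictured from inside the sandbox. A minimal sketch, assuming illustrative names only: which globals count as dangerous is a policy decision, and the deny-list below is not the deck's actual one.

```javascript
// Sketch: "dangerous APIs removed, useful APIs retained".
// The names in `banned` are illustrative, not the deck's real policy.
const banned = ["Deno", "WebAssembly"]; // e.g. host bindings, codegen-adjacent APIs
for (const name of banned) {
  delete globalThis[name]; // standard globals are configurable, so delete works
}
// Useful APIs (fetch, crypto, TextEncoder, URL, …) are deliberately left intact.
```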
5. Crateology: 4 Embed Options
    deno (cli/) ← The binary. Everything user-facing.
    └─ deno_lib (cli/lib/) ← Shared between CLI and denort; “highly unstable”
       └─ deno_runtime (runtime/) ← The embeddable runtime crate ★ We use this
          └─ ext/* ← Extension crates (operations exposed to JS)
             └─ deno_core (libs/core/) ← V8 bindings, JsRuntime, op system
                └─ rusty_v8 (v8 crate) ← Rust bindings to the V8 C++ API
6. Option 1: Raw V8 via rusty_v8 — This is what Rivet’s Secure Exec does: use raw V8 isolates. BUT V8 has no Web APIs! You cannot even run import(); you must reimplement fetch, crypto, TextEncoder, etc.
7. Option 1: Raw V8 via rusty_v8 — Rivet provides a “Kernel” (handling the VFS, file descriptors, processes, pipes, …). When code in V8 calls fs.readFile('/foo'), it goes through a bridge (host_call.rs → UDS → host) so the “Kernel” can serve it.
8. Option 1: Raw V8 via rusty_v8 — Honest conclusion: this option required too much IQ, which I did not have.
9. Option 2: Use deno_core — What you get: a JsRuntime struct (V8 isolate wrapper); execute_script() — synchronous eval; module loading (load_module() + mod_evaluate()); op registration (Rust functions callable from JS).
10. Option 2: Use deno_core — What you don’t get: no file I/O, no network, no permissions; none of the Web APIs — fetch, crypto, TextEncoder, setTimeout, … — not even import().
11. Option 3: Use the Deno CLI — Zero embedding effort. We get everything (a little bit too much): module resolution, npm, TypeScript, all permissions. BUT: some globalThis leakage (may be our problem), and all the I/O is still done on the Deno side unless we hack around it.
12.
    Extension    Provides
    deno_web     TextEncoder, TextDecoder, atob, btoa, URL, Blob, timers
    deno_fetch   fetch, Request, Response, Headers
    deno_crypto  crypto.subtle, crypto.getRandomValues
    deno_net     TCP/UDP/Unix sockets
13. [Architecture diagram: a Workspace Session with Llama, Lua, and Deno Services, each paired with a Carrier OS process that the Service calls and monitors over a Unix Domain Socket (UDS); Tasks flow between the Session and the Services.]
14. Recap: Our Solution — Embed deno_runtime in a Rust carrier process. Connect to the carrier over Unix Domain Sockets. All policy decisions are delegated to the host, including where fetch() may go… we just inject our own fetch and mask the native one.
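One way to read “inject our own fetch and mask the native one”: a wrapper closes over the real fetch and consults host policy before letting any request through. In the sketch below the allowlist stands in for the host-side policy call (which in the deck’s design travels over the UDS); everything else is illustrative.

```javascript
// Mask the native fetch behind a policy-checked wrapper (sketch).
const nativeFetch = globalThis.fetch;
const allowedHosts = new Set(["api.example.com"]); // stand-in for host policy

globalThis.fetch = async (input, init) => {
  const url = new URL(input instanceof Request ? input.url : String(input));
  if (!allowedHosts.has(url.hostname)) {
    throw new Error(`fetch to ${url.hostname} denied by host policy`);
  }
  return nativeFetch(input, init); // only approved requests reach the network
};
```

Agent code keeps calling plain `fetch()`; it simply cannot reach the unmasked one.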
15. Context Reuse — Creating a fresh MainWorker per eval = 14 ms. Instead, make a fresh V8 context and share the ContextState and ModuleMap from the main context. We used unsafe, but: 14 ms → 0.29 ms, and we can crash this carrier with impunity, without any pollution.
16. Pre-Baked Snapshots — deno_runtime uses 30+ extensions. Each extension includes JavaScript/TypeScript that must be compiled and evaluated to set up the global environment (Web APIs, crypto, fetch, etc.). Without optimisation: ~230 ms per new MainWorker.
17.
    Metric                           Value
    Snapshot blob size               ~8.1 MB
    First build time                 ~25 min (hundreds of crates)
    Incremental rebuild (Rust only)  ~2–5 min
    Binary size (release, thin LTO)  ~159 MB
18.
    Metric                            Without Snapshot  With Snapshot
    MainWorker creation               ~230 ms           ~14 ms
    Fresh V8 context (context-reuse)  ~0.29 ms          ~0.29 ms
    Carrier → READY                   ~300 ms           ~5 ms
19. Host-Side Fetch — Use Shun (an Elixir library) for protection. Every fetch() hits the Networking service, which checks the domain, then resolves any accepted domain to an IP and checks the IP (which defeats xip.io-style attacks); fetches are serialised to avoid resource exhaustion.
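The resolve-then-check step matters because an attacker-controlled name (the xip.io trick) can resolve to an internal address, so the host must judge the resolved IP rather than the hostname. A minimal IPv4 sketch of such a check — the helper is hypothetical; the deck’s host does this in Elixir with Shun:

```javascript
// Reject fetches whose *resolved* address lands in a private/internal range.
function isForbiddenIPv4(ip) {
  const [a, b] = ip.split(".").map(Number);
  return (
    a === 10 ||                          // 10.0.0.0/8
    a === 127 ||                         // loopback
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    (a === 169 && b === 254)             // link-local, incl. cloud metadata IPs
  );
}
```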
20.
    Scenario                  p50     p99      max
    noop (`0`)                380 μs  1.47 ms  2.21 ms
    arithmetic (`2 + 2`)      378 μs  1.51 ms  2.55 ms
    string concat             388 μs  1.59 ms  4.92 ms
    JSON object construction  389 μs  1.52 ms  7.06 ms
21.
    Scenario                          p50       p99       max
    SHA-256 (SubtleCrypto)            753 μs    2.37 ms   7.26 ms
    JSON parse+transform (100 items)  843 μs    2.31 ms   9.12 ms
    globalThis isolation check        383 μs    1.48 ms   9.59 ms
    new MainWorker, n=100             29.69 ms  52.23 ms  52.23 ms
    10 parallel refs                  701 μs    4.67 ms   8.36 ms
22. “Seven layers of abstraction, each one blissfully unaware of the horrors above and below it. That's not a stack; that's a turducken.” — Claude Opus 4.6, February 2026