Terminals for the web and machines.
Low-latency terminal streaming to any browser. Share with a link. Drive from AI agents. Tunnel into SSH hosts. Single binary, no required dependencies.
docker run --rm grab/blit-demo
curl -sf https://install.blit.sh | sh
irm https://install.blit.sh/install.ps1 | iex

Three layers, one socket.
The server parses PTY output into structured state, computes per-client binary diffs, and ships only the delta. The browser applies diffs in WASM and paints with WebGPU — falling back to WebGL2, then canvas2d. Keystrokes travel the reverse path with no queuing; latency is bounded by link RTT.
Above the Unix socket, everything is stateless: gateways and proxies can restart without touching your terminals.
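The delta path above can be sketched end to end: diff the previous and current cell grids into changed runs, compress, ship, apply. A minimal sketch in Python, using zlib as a stand-in for LZ4 (blit ships LZ4; the fixed-size grid and the run encoding here are illustrative, not blit's wire format):

```python
import zlib  # stand-in for LZ4; blit itself uses LZ4


def grid_diff(prev: bytes, curr: bytes) -> bytes:
    """Encode only the changed bytes as (offset, length, data) runs, then compress."""
    assert len(prev) == len(curr), "assumes a fixed-size grid (rows x cols)"
    runs, i, n = [], 0, len(curr)
    while i < n:
        if prev[i] == curr[i]:
            i += 1
            continue
        start = i
        while i < n and prev[i] != curr[i]:
            i += 1
        runs.append(start.to_bytes(4, "big") + (i - start).to_bytes(4, "big") + curr[start:i])
    return zlib.compress(b"".join(runs))


def apply_diff(prev: bytes, delta: bytes) -> bytes:
    """Patch the previous grid with decompressed (offset, length, data) runs."""
    out = bytearray(prev)
    buf = zlib.decompress(delta)
    j = 0
    while j < len(buf):
        off = int.from_bytes(buf[j:j + 4], "big")
        length = int.from_bytes(buf[j + 4:j + 8], "big")
        out[off:off + length] = buf[j + 8:j + 8 + length]
        j += 8 + length
    return bytes(out)
```

An unchanged frame costs a few bytes instead of a recompressed full grid, which is why idle terminals are nearly free on the wire.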
- Parser: alacritty_terminal
- Render: WebGPU · WebGL2 · canvas2d
- Compression: LZ4 · binary diff
- Encode: H.264 · AV1 · Vulkan
- Browser: applies LZ4-compressed binary diffs; reports render metrics back for pacing.
- Gateways & proxies: stateless, restartable; PTYs survive gateway and proxy restarts.
- Server: owns PTYs, scrollback, parsed state, per-client frame pacing, GPU encode.
Four commands, four superpowers.
Hand a teammate a URL.
WebRTC punches through NATs — no SSH keys, no port forwarding. For permanent access, name your SSH remotes once. blit auto-installs itself on the remote on the first connection.
$ blit share
Terminals at https://blit.sh/s#rf6e…
Read-only: https://blit.sh/s#EMPs…
$ blit remote add prod ssh:alice@prod.co
$ blit --on prod terminal list
2 build (running)
3 shell (idle)

Drive sessions from a script — or an LLM.
Non-interactive subcommands work over the parsed terminal grid: start a process, send keystrokes, wait for a regex, read the screen as text. We ship a computer-agent skill any model can pick up.
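What a wait-for-regex amounts to is a bounded poll over the parsed screen text. A minimal sketch in Python, assuming a `read_screen` callable that returns the current grid as text (the helper name and loop are illustrative, not blit's implementation):

```python
import re
import time


def wait_for(read_screen, pattern: str, timeout: float = 30.0, poll: float = 0.2) -> bool:
    """Poll the screen text until `pattern` matches or the timeout expires."""
    rx = re.compile(pattern)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if rx.search(read_screen()):
            return True
        time.sleep(poll)
    return False
```

Because the match runs over the parsed grid rather than the raw byte stream, escape sequences and redraws don't produce false positives.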
$ blit terminal start -t build make -j8
1
$ blit terminal wait 1 --pattern 'BUILD OK'
$ blit terminal show 1 | tail -2
[100%] Built target app
BUILD OK
$ blit terminal send 1 "q"

Stream graphical apps without a display.
Every blit terminal includes an experimental headless Wayland compositor. Launch foot, mpv, or a browser; capture the surface, click into it, type. Encoded as H.264 / AV1 with NVENC, VA-API, or software fallback.
$ blit terminal start foot
1
$ blit surface list
1 foot — 1024x768 — av1-vaapi
$ blit surface capture 1 screenshot.png
$ blit surface click 1 100 50
$ blit surface type 1 "hello{Return}"

Move bytes through the compositor clipboard.
Read and write the compositor clipboard with any MIME type — text, images, custom formats. Same API for local terminals, share links, and SSH remotes.
$ blit clipboard list
text/plain;charset=utf-8
text/plain
$ blit clipboard get
hello world
$ blit clipboard set "new value"

Built for the things other terminal-over-browser tools don't do.
We respect the existing tools — most served their generation well. Here is where blit diverges, in plain terms.
| Feature | blit | ttyd | gotty | Eternal Terminal | Mosh | xterm.js + node-pty |
|---|---|---|---|---|---|---|
| Architecture | Single binary | Single binary | Single binary | Client + daemon | Client + server | Library |
| Multiple PTYs | ✓ | — | — | — | — | ~ |
| Browser access | ✓ | ✓ | ✓ | — | — | ✓ |
| Delta updates | ✓ | — | — | — | ✓ | — |
| LZ4 compression | ✓ | — | — | — | — | — |
| Per-client backpressure | ✓ | — | — | ~ | — | — |
| GPU rendering (WebGPU/WebGL2) | ✓ | — | — | — | — | ~ |
| Embeddable (React/Solid) | ✓ | — | — | — | — | ✓ |
| Wayland compositor | ✓ | — | — | — | — | — |
| GUI app streaming | ✓ | — | — | — | — | — |
| Agent / CLI subcommands | ✓ | — | — | — | — | — |
| SSH tunneling built-in | ✓ | — | — | ✓ | ✓ | — |

✓ built-in · ~ partial or requires extra code · — not available
A skill any LLM can pick up.
A ready-made instruction set that teaches a model to drive blit's CLI — terminals, keystrokes, pattern waits, surface capture, clicks. One URL, no extra glue.
// In your agent system prompt or tools manifest:
read("https://install.blit.sh/SKILL.md")

import { BlitWorkspaceProvider, BlitTerminal,
useBlitFocusedSession } from '@blit-sh/react';
import { BlitWorkspace } from '@blit-sh/core';
// Create a workspace with a WebSocket connection
const workspace = new BlitWorkspace({
wasm,
connections: [{
id: "ws",
transport: { type: "websocket", url, passphrase },
}],
});
// Render — that's it
<BlitWorkspaceProvider workspace={workspace}>
<BlitTerminal sessionId={useBlitFocusedSession()?.id ?? null} />
</BlitWorkspaceProvider>

Drop a real terminal into your app.
@blit-sh/react and @blit-sh/solid give you a workspace context, hooks for sessions and focus, and a <BlitTerminal> component. WebGPU when available, with WebGL2 and canvas2d fallbacks. Zero DOM manipulation.
- @blit-sh/core: framework-agnostic core, transports, renderers
- @blit-sh/react: thin React wrapper (provider, hooks, components)
- @blit-sh/solid: thin Solid wrapper (primitives and components)
- @blit-sh/browser: WASM diff engine and vertex builder
Nothing to install. Optional accelerators.
Software H.264 and AV1 encoders are statically linked into blit; the CPU compositor works everywhere. GPU and audio kick in automatically when the right libraries are present: every shared object is loaded at runtime via dlopen, and missing ones are silently skipped.
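The load-or-skip pattern is easy to sketch with Python's ctypes, which wraps dlopen on Linux. The `try_load` helper and the probe order below are illustrative, not blit's code:

```python
import ctypes


def try_load(*names):
    """dlopen the first library that loads; return None instead of failing."""
    for name in names:
        try:
            return ctypes.CDLL(name)  # ctypes.CDLL uses dlopen on Linux
        except OSError:
            continue  # library absent or unloadable: skip silently
    return None


# Hypothetical accelerator probe: hardware encode only when the lib is present.
libva = try_load("libva.so.2")
vulkan = try_load("libvulkan.so.1")
mode = "vaapi" if libva else ("vulkan" if vulkan else "software")
```

Probing at runtime rather than linking at build time is what lets a single binary run on machines with or without GPU drivers installed.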
- Linux: full features, including the experimental Wayland compositor.
- macOS: PTY multiplexing, sharing, agent API. No GUI compositor.
- Windows: PTY multiplexing, sharing, agent API. No GUI compositor.
SSH remotes are auto-installed on first connection; this requires curl or wget and CA certificates on the remote.
| Library | Package(s) | Used for |
|---|---|---|
| libvulkan.so.1 | libvulkan1, mesa-vulkan-drivers | GPU compositing, Vulkan Video encode |
| libva.so.2 + libva-drm.so.2 | libva2, libva-drm2, va-driver-all | VA-API hardware encode (Intel/AMD) |
| libgbm.so.1 | libgbm1 | DMA-BUF zero-copy for VA-API |
| libcuda.so.1 + libnvidia-encode.so.1 | NVIDIA proprietary driver | NVENC hardware encode |
| Binary | Package | Used for |
|---|---|---|
| pipewire | pipewire | Audio daemon (private instance) |
| pipewire-pulse | pipewire-pulse | PulseAudio compatibility |
| libpipewire-0.3.so.0 | pipewire | Monitor capture (in-process) |
| dbus-daemon | dbus | Private D-Bus session |
| wireplumber | wireplumber | Session manager (optional) |
Without GPU libraries, the compositor falls back to CPU rendering and software encoding. Audio is disabled when PipeWire is not installed, or explicitly with BLIT_AUDIO=0.
Things to know.
We'd rather flag what's rough than oversell. blit's PTY core is stable; the experimental bits are clearly marked and shipping regularly.
Headless compositor and surface streaming are experimental. Some apps will draw fine; others won't. Linux only.
macOS and Windows both multiplex PTYs and stream them to browsers. The GUI compositor is Linux-only and not on the roadmap for either.
NVENC requires the proprietary NVIDIA driver; VA-API needs Intel/AMD with libva. Encoding falls back to software whenever hardware is unavailable.
Try it without installing.
The demo image runs unprivileged and starts a shareable terminal, with foot, mpv, neovim, and a few Wayland toys preloaded. Spin it up to feel the latency.
docker run --rm grab/blit-demo