
DevOps & Cloud @macstenk Updated 2/26/2026

📦 Skill Exporter OpenClaw Skill - ClawHub

Want your AI agent to automate Skill Exporter workflows? This free ClawHub skill handles DevOps & Cloud tasks without building custom tools from scratch.

What this skill does

Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration.

Install

npx clawhub@latest install skill-exporter

Full SKILL.md

| name | description | license |
| --- | --- | --- |
| skill-exporter | Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration. | MIT |

Skill Exporter

Transform Clawdbot skills into standalone, deployable microservices.

Workflow

Clawdbot Skill (tested & working)
         ↓
    skill-exporter
         ↓
Standalone Microservice
         ↓
Railway / Fly.io / Docker

Usage

Export a skill

python3 {baseDir}/scripts/export.py \
  --skill ~/.clawdbot/skills/instagram \
  --target railway \
  --llm anthropic \
  --output ~/projects/instagram-service

Options

| Flag | Description | Default |
| --- | --- | --- |
| `--skill` | Path to skill directory | required |
| `--target` | Deployment target: `railway`, `fly`, `docker` | `docker` |
| `--llm` | LLM provider: `anthropic`, `openai`, `none` | `none` |
| `--output` | Output directory | `./<skill-name>-service` |
| `--port` | API port | `8000` |

Targets

  • railway — Generates railway.json, optimized Dockerfile, health checks
  • fly — Generates fly.toml, multi-region ready
  • docker — Generic Dockerfile, docker-compose.yml

LLM Integration

When --llm is set, generates llm_client.py with:

  • Caption/prompt generation
  • Decision making helpers
  • Rate limiting and error handling
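The generated client's interface isn't documented in this listing; below is a minimal sketch of the rate-limiting and retry shape such a client might take. All names (`LLMClient`, `generate_caption`, `min_interval`) are assumptions, and the provider call is stubbed out rather than hitting a real SDK.

```python
import time


class LLMClient:
    """Hypothetical sketch of a generated llm_client.py: wraps a
    provider call with simple rate limiting and retry-on-error."""

    def __init__(self, provider="anthropic", min_interval=1.0, max_retries=3):
        self.provider = provider
        self.min_interval = min_interval  # minimum seconds between calls
        self.max_retries = max_retries
        self._last_call = 0.0

    def _throttle(self):
        # Sleep just long enough to honor the minimum call interval.
        wait = self.min_interval - (time.monotonic() - self._last_call)
        if wait > 0:
            time.sleep(wait)
        self._last_call = time.monotonic()

    def generate_caption(self, prompt):
        # Retry transient failures with exponential backoff.
        for attempt in range(self.max_retries):
            self._throttle()
            try:
                return self._call_provider(prompt)
            except RuntimeError:
                time.sleep(2 ** attempt)
        raise RuntimeError("LLM call failed after retries")

    def _call_provider(self, prompt):
        # Placeholder: a real generated client would call the
        # anthropic/openai SDK here, selected by self.provider.
        return f"[{self.provider}] caption for: {prompt}"
```

The exported service would import this client from `api.py` and keep all provider-specific details (API keys from `.env`, model names) out of the skill's own scripts.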

What Gets Generated

<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py

Requirements

The source skill must have:

  • SKILL.md with valid frontmatter
  • At least one script in scripts/
  • Scripts should be callable (functions, not just inline code)
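To illustrate the last requirement, here is a sketch of a script shape that qualifies: the work lives in a function the exporter can wrap, not in top-level side effects. The file name and function signature are illustrative, not part of the skill's spec.

```python
# scripts/post.py — illustrative skill script with a callable entry point.


def run(caption: str, dry_run: bool = True) -> dict:
    """Do the skill's work and return a JSON-serializable result."""
    if dry_run:
        return {"posted": False, "caption": caption}
    # Real side effects (API calls, file writes) would go here.
    return {"posted": True, "caption": caption}


if __name__ == "__main__":
    # Still usable standalone from the CLI.
    print(run("hello"))
```

A script written as bare top-level statements would execute on import, so the generated FastAPI wrapper would have nothing to call per-request.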

Post-Export

  1. Copy .env.example to .env and fill in secrets
  2. Test locally: docker-compose up
  3. Deploy: railway up or fly deploy

Original URL: https://github.com/openclaw/skills/blob/main/skills/macstenk/skill-exporter
