📦 Skill Exporter OpenClaw Skill - ClawHub
Do you want your AI agent to automate Skill Exporter workflows? This free skill from ClawHub helps with DevOps and cloud tasks without building custom tools from scratch.
What this skill does
Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration.
Install
npx clawhub@latest install skill-exporter
| name | description | license |
|---|---|---|
| skill-exporter | Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration. | MIT |
Skill Exporter
Transform Clawdbot skills into standalone, deployable microservices.
Workflow
Clawdbot Skill (tested & working)
↓
skill-exporter
↓
Standalone Microservice
↓
Railway / Fly.io / Docker
Usage
Export a skill
python3 {baseDir}/scripts/export.py \
--skill ~/.clawdbot/skills/instagram \
--target railway \
--llm anthropic \
--output ~/projects/instagram-service
Options
| Flag | Description | Default |
|---|---|---|
| --skill | Path to skill directory | required |
| --target | Deployment target: railway, fly, docker | docker |
| --llm | LLM provider: anthropic, openai, none | none |
| --output | Output directory | ./<skill-name>-service |
| --port | API port | 8000 |
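The documented flags can be mirrored with a small argparse sketch. Flag names, choices, and defaults are taken from the table above; the parser structure itself is an assumption, not the actual export.py source:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Sketch of a CLI matching the documented flags (structure assumed).
    parser = argparse.ArgumentParser(
        description="Export a Clawdbot skill as a standalone microservice")
    parser.add_argument("--skill", required=True,
                        help="Path to skill directory")
    parser.add_argument("--target", choices=["railway", "fly", "docker"],
                        default="docker", help="Deployment target")
    parser.add_argument("--llm", choices=["anthropic", "openai", "none"],
                        default="none", help="LLM provider")
    parser.add_argument("--output", default=None,
                        help="Output directory (defaults to ./<skill-name>-service)")
    parser.add_argument("--port", type=int, default=8000, help="API port")
    return parser

# Parsing the example invocation from the Usage section:
args = build_parser().parse_args(
    ["--skill", "~/.clawdbot/skills/instagram", "--target", "railway"])
```

Unspecified flags fall back to the documented defaults (llm=none, port=8000).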
Targets
railway → Generates railway.json, optimized Dockerfile, health checks
fly → Generates fly.toml, multi-region ready
docker → Generic Dockerfile, docker-compose.yml
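For the railway target, the generated railway.json plausibly follows Railway's config-as-code format, pointing at the Dockerfile and wiring up a health check. This is a sketch with assumed keys, not the exporter's actual output:

```python
import json

# Sketch of a minimal Railway config-as-code file (keys assumed,
# not taken from the exporter's real templates).
railway_config = {
    "build": {"builder": "DOCKERFILE", "dockerfilePath": "Dockerfile"},
    "deploy": {
        "healthcheckPath": "/health",
        "restartPolicyType": "ON_FAILURE",
    },
}
print(json.dumps(railway_config, indent=2))
```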
LLM Integration
When --llm is set, generates llm_client.py with:
- Caption/prompt generation
- Decision making helpers
- Rate limiting and error handling
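The rate-limiting and error-handling portion of llm_client.py could take a shape like the following. Class and method names are illustrative stand-ins, not the exporter's actual output, and the LLM call is abstracted behind a plain callable:

```python
import time

class RateLimitedClient:
    """Wraps any LLM call with a minimum interval between requests
    and retry-with-backoff on errors (illustrative sketch)."""

    def __init__(self, call, min_interval=1.0, max_retries=3):
        self.call = call            # e.g. a function hitting Anthropic/OpenAI
        self.min_interval = min_interval
        self.max_retries = max_retries
        self._last = 0.0

    def generate(self, prompt: str) -> str:
        for attempt in range(self.max_retries):
            # Enforce a minimum spacing between requests
            wait = self.min_interval - (time.monotonic() - self._last)
            if wait > 0:
                time.sleep(wait)
            self._last = time.monotonic()
            try:
                return self.call(prompt)
            except Exception:
                if attempt == self.max_retries - 1:
                    raise
                time.sleep(2 ** attempt)  # exponential backoff

# Usage with a stand-in for a real provider call:
client = RateLimitedClient(lambda p: f"caption for: {p}", min_interval=0.0)
```

Swapping the lambda for a real Anthropic or OpenAI call keeps the throttling and retry logic unchanged.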
What Gets Generated
<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py
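Assembling that layout boils down to copying the skill's scripts/ directory and writing the generated files next to it. A minimal stdlib sketch (file names from the tree above; the copy-and-write logic is assumed, and the real exporter renders full templates rather than placeholders):

```python
import shutil
from pathlib import Path

def assemble_service(skill_dir: str, output_dir: str,
                     target: str = "docker") -> Path:
    """Copy the skill's scripts and lay down generated files (sketch)."""
    src, out = Path(skill_dir), Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Copy the original skill's scripts verbatim
    if (src / "scripts").exists():
        shutil.copytree(src / "scripts", out / "scripts", dirs_exist_ok=True)
    # Placeholder contents; the real exporter renders full templates
    for name in ("Dockerfile", "docker-compose.yml", "api.py",
                 "requirements.txt", ".env.example"):
        (out / name).write_text(f"# generated {name}\n")
    if target == "railway":
        (out / "railway.json").write_text("{}\n")
    elif target == "fly":
        (out / "fly.toml").write_text("")
    return out
```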
Requirements
The source skill must have:
- SKILL.md with valid frontmatter
- At least one script in scripts/
- Scripts should be callable (functions, not just inline code)
Post-Export
- Copy .env.example to .env and fill in secrets
- Test locally: docker-compose up
- Deploy: railway up or fly deploy