
CLI Utilities @gitgoodordietrying Updated 2/26/2026

🐚 Shell Scripting OpenClaw Skill - ClawHub

Want your AI agent to automate shell-scripting workflows? This free skill from ClawHub handles CLI-utility tasks without requiring you to build custom tools from scratch.

What this skill does

Write robust, portable shell scripts. Use when parsing arguments, handling errors properly, writing POSIX-compatible scripts, managing temp files, running commands in parallel, managing background processes, or adding --help to scripts.

Install

npx clawhub@latest install shell-scripting

Full SKILL.md

name: shell-scripting
description: Write robust, portable shell scripts. Use when parsing arguments, handling errors properly, writing POSIX-compatible scripts, managing temp files, running commands in parallel, managing background processes, or adding --help to scripts.

Shell Scripting

Write reliable, maintainable bash scripts. Covers argument parsing, error handling, portability, temp files, parallel execution, process management, and self-documenting scripts.

When to Use

  • Writing scripts that others (or future you) will run
  • Automating multi-step workflows
  • Parsing command-line arguments with flags and options
  • Handling errors and cleanup properly
  • Running tasks in parallel
  • Making scripts portable across Linux and macOS
  • Wrapping complex commands with a simpler interface

Script Template

#!/usr/bin/env bash
set -euo pipefail

# Description: What this script does (one line)
# Usage: script.sh [options] <required-arg>

readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
readonly SCRIPT_NAME="$(basename "$0")"

# Defaults
VERBOSE=false
OUTPUT_DIR="./output"

usage() {
    cat <<EOF
Usage: $SCRIPT_NAME [options] <input-file>

Description:
  Process the input file and generate output.

Options:
  -o, --output DIR    Output directory (default: $OUTPUT_DIR)
  -v, --verbose       Enable verbose output
  -h, --help          Show this help message

Examples:
  $SCRIPT_NAME data.csv
  $SCRIPT_NAME -v -o /tmp/results data.csv
EOF
}

log() { echo "[$(date '+%H:%M:%S')] $*" >&2; }
debug() { $VERBOSE && log "DEBUG: $*" || true; }
die() { log "ERROR: $*"; exit 1; }

# Parse arguments
while [[ $# -gt 0 ]]; do
    case "$1" in
        -o|--output) OUTPUT_DIR="$2"; shift 2 ;;
        -v|--verbose) VERBOSE=true; shift ;;
        -h|--help) usage; exit 0 ;;
        --) shift; break ;;
        -*) die "Unknown option: $1" ;;
        *) break ;;
    esac
done

# Require the input file; show usage on stderr if it's missing
[[ $# -ge 1 ]] || { usage >&2; die "input file required"; }
INPUT_FILE="$1"
[[ -f "$INPUT_FILE" ]] || die "File not found: $INPUT_FILE"

# Main logic
main() {
    debug "Input: $INPUT_FILE"
    debug "Output: $OUTPUT_DIR"
    mkdir -p "$OUTPUT_DIR"

    log "Processing $INPUT_FILE..."
    # ... do work ...
    log "Done. Output in $OUTPUT_DIR"
}

main "$@"

Error Handling

set flags

set -e          # Exit on any command failure
set -u          # Error on undefined variables
set -o pipefail # Pipe fails if any command in the pipe fails
set -x          # Debug: print each command before executing (noisy)

# Combined (use this in every script)
set -euo pipefail

# Temporarily disable for commands that are allowed to fail
set +e
some_command_that_might_fail
exit_code=$?
set -e
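An alternative that avoids toggling `set -e` at all: capture the exit code with `||`, which never trips errexit. A minimal sketch (`false` stands in for the fallible command):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Capture a fallible command's exit code without set +e.
# The `|| exit_code=$?` arm runs only on failure, so the compound
# command as a whole always succeeds and errexit never fires.
exit_code=0
false || exit_code=$?
echo "exit code: $exit_code"   # exit code: 1

# Condition positions (if/while/&&/||) also suspend errexit:
if ! false; then
    echo "failure handled"
fi
```

This keeps errexit active for every other command in the script, whereas `set +e` disables it for the whole span.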

Trap for cleanup

# Cleanup on exit (any exit: success, failure, or signal)
TMPDIR=""
cleanup() {
    [[ -n "$TMPDIR" ]] && rm -rf "$TMPDIR"
}
trap cleanup EXIT

TMPDIR=$(mktemp -d)
# Use $TMPDIR freely — it's cleaned up automatically

# Trap specific signals
trap 'echo "Interrupted"; exit 130' INT    # Ctrl+C
trap 'echo "Terminated"; exit 143' TERM    # kill
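A related sketch: an `ERR` trap that reports where the script died, which pairs well with `set -e`. Note the single quotes, so `$LINENO` and `$?` expand when the trap fires, not when it is set:

```shell
#!/usr/bin/env bash
set -euo pipefail

# On any failing command, print the line number and exit code
# before errexit terminates the script.
trap 'echo "ERROR: line $LINENO exited with code $?" >&2' ERR

echo "step 1"
# false        # uncommenting this line triggers the ERR trap
echo "step 2"
```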

Error handling patterns

# Check command exists before using it
command -v jq >/dev/null 2>&1 || die "jq is required but not installed"

# Provide default values
NAME="${NAME:-default_value}"

# Required variable (fail if unset)
: "${API_KEY:?Error: API_KEY environment variable is required}"

# Retry a command
retry() {
    local max_attempts=$1
    shift
    local attempt=1
    while [[ $attempt -le $max_attempts ]]; do
        "$@" && return 0
        log "Attempt $attempt/$max_attempts failed. Retrying..."
        ((attempt++))
        sleep $((attempt * 2))
    done
    die "Command failed after $max_attempts attempts: $*"
}

retry 3 curl -sf https://api.example.com/health

Argument Parsing

Simple: positional + flags

# Manual parsing (no dependencies)
FORCE=false
DRY_RUN=false

while [[ $# -gt 0 ]]; do
    case "$1" in
        -f|--force) FORCE=true; shift ;;
        -n|--dry-run) DRY_RUN=true; shift ;;
        -o|--output)
            [[ -n "${2:-}" ]] || die "--output requires a value"
            OUTPUT="$2"; shift 2 ;;
        --output=*)
            OUTPUT="${1#*=}"; shift ;;
        -h|--help) usage; exit 0 ;;
        --) shift; break ;;  # End of options
        -*) die "Unknown option: $1" ;;
        *) break ;;  # Start of positional args
    esac
done

# Remaining args are positional
FILES=("$@")
[[ ${#FILES[@]} -gt 0 ]] || die "At least one file is required"

getopts (POSIX, short options only)

while getopts ":o:vhf" opt; do
    case "$opt" in
        o) OUTPUT="$OPTARG" ;;
        v) VERBOSE=true ;;
        f) FORCE=true ;;
        h) usage; exit 0 ;;
        :) die "Option -$OPTARG requires an argument" ;;
        ?) die "Unknown option: -$OPTARG" ;;
    esac
done
shift $((OPTIND - 1))

Temp Files and Directories

# Create temp file (automatically unique)
TMPFILE=$(mktemp)
echo "data" > "$TMPFILE"

# Create temp directory
TMPDIR=$(mktemp -d)

# Create temp with custom prefix/suffix
TMPFILE=$(mktemp /tmp/myapp.XXXXXX)
TMPFILE=$(mktemp --suffix=.json)  # GNU only

# Always clean up with trap
trap 'rm -f "$TMPFILE"' EXIT

# Portable pattern (works on macOS and Linux)
TMPDIR=$(mktemp -d 2>/dev/null || mktemp -d -t 'myapp')
trap 'rm -rf "$TMPDIR"' EXIT

Parallel Execution

xargs -P

# Run 4 commands in parallel
cat urls.txt | xargs -P 4 -I {} curl -sO {}

# Process files in parallel (4 at a time)
find . -name "*.csv" | xargs -P 4 -I {} ./process.sh {}

# Parallel with progress indicator (pass the filename as $1 rather
# than substituting {} inside sh -c, which risks quoting/injection bugs)
find . -name "*.jpg" | xargs -P 8 -I {} sh -c 'convert "$1" -resize 800x600 "resized/$1" && echo "Done: $1"' _ {}

Background jobs + wait

# Run tasks in background, wait for all
pids=()
for file in data/*.csv; do
    process_file "$file" &
    pids+=($!)
done

# Wait for all and check results
failed=0
for pid in "${pids[@]}"; do
    wait "$pid" || failed=$((failed + 1))   # not ((failed++)): that returns 1 when failed is 0, tripping set -e
done
[[ $failed -eq 0 ]] || die "$failed jobs failed"
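With bash 4.3+, `wait -n` supports bounded concurrency without external tools. A sketch (`work` is a stand-in for the real per-item task):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Keep at most MAX_JOBS tasks running; start the next one as soon
# as any running job finishes. Requires bash >= 4.3 for wait -n.
MAX_JOBS=4

work() {   # stand-in for the real per-item task
    sleep 0.1
    echo "done: $1"
}

for i in $(seq 1 10); do
    while (( $(jobs -rp | wc -l) >= MAX_JOBS )); do
        wait -n   # block until any one background job exits
    done
    work "$i" &
done
wait   # drain the remaining jobs
```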

GNU Parallel (if available)

# Process files with 8 parallel jobs
parallel -j 8 ./process.sh {} ::: data/*.csv

# With progress bar
parallel --bar -j 4 convert {} -resize 800x600 resized/{/} ::: *.jpg

# Pipe input lines
cat urls.txt | parallel -j 10 curl -sO {}

Process Management

Background processes

# Start in background
long_running_command &
BG_PID=$!

# Check if still running
kill -0 $BG_PID 2>/dev/null && echo "Running" || echo "Stopped"

# Wait for it
wait $BG_PID
echo "Exit code: $?"

# Kill on script exit
trap 'kill $BG_PID 2>/dev/null' EXIT

Process supervision

# Run a command, restart if it dies
run_with_restart() {
    local cmd=("$@")
    while true; do
        "${cmd[@]}" &
        local pid=$!
        log "Started PID $pid"
        wait $pid
        local exit_code=$?
        log "Process exited with code $exit_code. Restarting in 5s..."
        sleep 5
    done
}

run_with_restart ./my-server --port 8080

Timeout

# Kill command after 30 seconds
timeout 30 long_running_command

# With custom signal (SIGKILL after SIGTERM fails)
timeout --signal=TERM --kill-after=10 30 long_running_command

# Portable fallback (no timeout command). Caveat: the timer kills
# the whole script ($$), not just the command.
( sleep 30; kill $$ 2>/dev/null ) &
TIMER_PID=$!
long_running_command
kill $TIMER_PID 2>/dev/null

Portability (Linux vs macOS)

Common differences

# sed: macOS requires -i '' (empty backup extension)
# Linux:
sed -i 's/old/new/g' file.txt
# macOS:
sed -i '' 's/old/new/g' file.txt
# Portable:
sed -i.bak 's/old/new/g' file.txt && rm file.txt.bak

# date: different flags
# GNU (Linux):
date -d '2026-02-03' '+%s'
# BSD (macOS):
date -j -f '%Y-%m-%d' '2026-02-03' '+%s'

# readlink -f: doesn't exist on macOS
# Portable alternative:
real_path() { cd "$(dirname "$1")" && echo "$(pwd)/$(basename "$1")"; }

# stat: different syntax
# GNU: stat -c '%s' file
# BSD: stat -f '%z' file

# grep -P: not available on macOS by default
# Use grep -E instead, or install GNU grep
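One way to contain these differences: detect the OS once and wrap the divergent command in a function. A sketch (`sed_inplace` is a hypothetical helper name):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Dispatch on uname once; the rest of the script calls the wrapper.
if [[ "$(uname -s)" == "Darwin" ]]; then
    sed_inplace() { sed -i '' "$@"; }
else
    sed_inplace() { sed -i "$@"; }
fi

tmp=$(mktemp)
printf 'old text\n' > "$tmp"
sed_inplace 's/old/new/' "$tmp"
cat "$tmp"    # new text
rm -f "$tmp"
```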

POSIX-safe patterns

# Use printf instead of echo -e (echo behavior varies)
printf "Line 1\nLine 2\n"

# Use $() instead of backticks
result=$(command)   # Good
result=`command`    # Bad (deprecated, nesting issues)

# Use [[ ]] for tests (bash), [ ] for POSIX sh
[[ -f "$file" ]]   # Bash (safer, no word splitting)
[ -f "$file" ]     # POSIX sh

# Array check (bash only, not POSIX)
if [[ ${#array[@]} -gt 0 ]]; then
    echo "Array has elements"
fi

Config File Parsing

Source a config file

# Simple: source a key=value file
# config.env:
# DB_HOST=localhost
# DB_PORT=5432

# Validate before sourcing (security: reject command substitution;
# use grep -E, since -P is not available on stock macOS)
if grep -qE '^[A-Z_]+=.*[;`$(]' config.env; then
    die "Config file contains unsafe characters"
fi
source config.env

Parse INI-style config

# config.ini:
# [database]
# host = localhost
# port = 5432
# [app]
# debug = true

parse_ini() {
    local file="$1" section=""
    while IFS='= ' read -r key value; do
        [[ -z "$key" || "$key" =~ ^[#\;] ]] && continue
        if [[ "$key" =~ ^\[(.+)\]$ ]]; then
            section="${BASH_REMATCH[1]}"
            continue
        fi
        value="${value%%#*}"     # Strip inline comments
        value="${value%"${value##*[![:space:]]}"}"  # Trim trailing whitespace
        printf -v "${section}_${key}" '%s' "$value"
    done < "$file"
}

parse_ini config.ini
echo "$database_host"  # localhost
echo "$app_debug"      # true

Useful Patterns

Confirm before destructive action

confirm() {
    local prompt="${1:-Are you sure?}"
    read -rp "$prompt [y/N] " response
    [[ "$response" =~ ^[Yy]$ ]]
}

confirm "Delete all files in /tmp/data?" || die "Aborted"
rm -rf /tmp/data/*

Progress indicator

# Simple counter
total=$(wc -l < file_list.txt)
count=0
while IFS= read -r file; do
    count=$((count + 1))   # not ((count++)): that returns 1 when count is 0, tripping set -e
    printf "\rProcessing %d/%d..." "$count" "$total" >&2
    process "$file"
done < file_list.txt
echo "" >&2

Lock file (prevent concurrent runs)

LOCKFILE="/tmp/${SCRIPT_NAME}.lock"

acquire_lock() {
    if ! mkdir "$LOCKFILE" 2>/dev/null; then
        die "Another instance is running (lock: $LOCKFILE)"
    fi
    trap 'rm -rf "$LOCKFILE"' EXIT
}

acquire_lock
# ... safe to proceed, only one instance runs ...
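On Linux, `flock(1)` (from util-linux; not shipped with macOS) gives a more robust lock: it is tied to an open file descriptor, so the kernel releases it even if the script dies from `kill -9`. A sketch:

```shell
#!/usr/bin/env bash
set -euo pipefail

# flock-based lock (Linux). The lock lives on fd 200 and is released
# automatically by the kernel when the process exits; no trap needed.
LOCKFILE="/tmp/myscript.lock"

exec 200>"$LOCKFILE"
if ! flock -n 200; then
    echo "Another instance is running (lock: $LOCKFILE)" >&2
    exit 1
fi

echo "lock acquired"
# ... safe to proceed ...
```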

Stdin or file argument

# Read from file argument or stdin
input="${1:--}"   # Default to "-" (stdin)
if [[ "$input" == "-" ]]; then
    cat
else
    cat "$input"
fi | while IFS= read -r line; do
    process "$line"
done

Tips

  • Always start with set -euo pipefail. It catches most silent failures: unhandled errors, unset variables, and broken pipes.
  • Always use trap cleanup EXIT for temp files. Never rely on reaching the cleanup code at the end.
  • Quote all variable expansions: "$var" not $var. Unquoted variables break on spaces and globs.
  • Use [[ ]] instead of [ ] in bash. It handles empty strings, spaces, and pattern matching better.
  • shellcheck is the best linter for shell scripts. Run it with shellcheck myscript.sh; install it if you don't already have it.
  • readonly for constants prevents accidental overwrite: readonly DB_HOST="localhost".
  • Write a usage() function and call it on -h/--help and on missing required arguments. Future users (including you) will thank you.
  • Prefer printf over echo for anything that might contain special characters or needs formatting.
  • Test scripts with bash -n script.sh (syntax check) before running.
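The quoting tip in action, as a minimal sketch of what word splitting does to an unquoted expansion:

```shell
#!/usr/bin/env bash
set -euo pipefail

f="my file.txt"

# Unquoted: the shell splits on whitespace -> two words.
set -- $f
echo "unquoted: $# args"   # unquoted: 2 args

# Quoted: one word, spaces preserved.
set -- "$f"
echo "quoted: $# args"     # quoted: 1 args

# This is why `rm $f` would try to delete "my" and "file.txt".
```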
Original URL: https://github.com/openclaw/skills/blob/main/skills/gitgoodordietrying/shell-scripting
