# Developer Setup
One place for all the extra tools this repo uses—what they do, where they’re configured, and how to run them locally or in CI.
## Table of Contents
- At a Glance
- Prereqs & Versions
- Formatting & Linting
- Typing
- Tests & Coverage
- Mutation & Property Tests
- Security & Supply Chain
- Docs Toolchain
- API Tooling
- Packaging & Releases
- Git Hooks & Commit Hygiene
- CI Orchestration
- Config Files (Map)
## At a Glance

- Make is the front door; tox mirrors CI envs.
- Tool configs live in `config/` and root dotfiles.
- Most tasks have a 1:1 `make` target (`make help` shows all).
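For orientation, the two entry points look like this (the tox environment name is illustrative; run `make help` for the authoritative target list):

```bash
make help      # list every documented make target
tox -e py311   # illustrative env name: run one CI-mirrored environment
```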
## Prereqs & Versions

- Python 3.11–3.13 (recommend `pyenv`)
- Node.js (API validation / docs assets)
- Java 17 (OpenAPI Generator CLI)
- GNU Make, git
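A minimal sketch of getting the prerequisites in place (version pins illustrative; any 3.11–3.13 interpreter works):

```bash
pyenv install 3.11 && pyenv local 3.11    # pin an interpreter for this repo
python -m venv .venv && source .venv/bin/activate
java -version                             # OpenAPI Generator CLI needs Java 17
node --version && npm --version           # API validation / docs assets
```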
## Formatting & Linting
Run both locally and in CI:
make lint     # ruff format+lint, mypy, pyright, codespell, radon, pydocstyle (+ pytype ≤ 3.12)
make quality  # vulture, deptry, reuse, interrogate
How `make lint` works (per the current `makefiles/lint.mk`):

- Runs the following across `src/bijux_cli` and `tests`:
    - `ruff format` (code style)
    - `ruff check --fix` (lint autofix, config at `config/ruff.toml`)
    - `mypy --strict` (config at `config/mypy.ini`)
    - `codespell -I config/bijux.dic`
    - `pyright --project config/pyrightconfig.json`
    - `radon cc -s -a` (complexity)
    - `pydocstyle --convention=google` (docstring style)
- Then runs Pytype once over the same directories, only on Python ≤ 3.12; it is skipped on 3.13+. Single-file and single-directory variants are shown below.
**Make: Lint (`makefiles/lint.mk`)**
# Lint Configuration (no root cache pollution)
RUFF := $(ACT)/ruff
MYPY := $(ACT)/mypy
PYTYPE := $(ACT)/pytype
CODESPELL := $(ACT)/codespell
PYRIGHT := $(ACT)/pyright
PYDOCSTYLE := $(ACT)/pydocstyle
RADON := $(ACT)/radon
# Targets & dirs
LINT_DIRS ?= src/bijux_cli tests
LINT_ARTIFACTS_DIR ?= artifacts/lint
# Tool caches inside artifacts/lint
RUFF_CACHE_DIR ?= $(LINT_ARTIFACTS_DIR)/.ruff_cache
MYPY_CACHE_DIR ?= $(LINT_ARTIFACTS_DIR)/.mypy_cache
PYTYPE_OUT_DIR ?= $(LINT_ARTIFACTS_DIR)/.pytype
# In case these are not defined elsewhere
VENV_PYTHON ?= python3
.PHONY: lint lint-artifacts lint-file lint-dir lint-clean
lint: lint-artifacts
@echo "✔ Linting completed (logs in '$(LINT_ARTIFACTS_DIR)')"
lint-artifacts: | $(VENV)
@mkdir -p "$(LINT_ARTIFACTS_DIR)" "$(RUFF_CACHE_DIR)" "$(MYPY_CACHE_DIR)" "$(PYTYPE_OUT_DIR)"
@set -euo pipefail; { \
echo "→ Ruff format (check)"; \
$(RUFF) format --check --cache-dir "$(RUFF_CACHE_DIR)" $(LINT_DIRS); \
} 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/ruff-format.log"
@set -euo pipefail; $(RUFF) check --fix --config config/ruff.toml --cache-dir "$(RUFF_CACHE_DIR)" $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/ruff.log"
@set -euo pipefail; $(MYPY) --config-file config/mypy.ini --strict --cache-dir "$(MYPY_CACHE_DIR)" $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/mypy.log"
@set -euo pipefail; $(PYRIGHT) --project config/pyrightconfig.json 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/pyright.log"
@set -euo pipefail; $(CODESPELL) -I config/bijux.dic $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/codespell.log"
@set -euo pipefail; $(RADON) cc -s -a $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/radon.log"
@set -euo pipefail; $(PYDOCSTYLE) --convention=google $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/pydocstyle.log"
@if $(VENV_PYTHON) -c 'import sys; sys.exit(0 if sys.version_info < (3,13) else 1)'; then \
set -euo pipefail; $(PYTYPE) -o "$(PYTYPE_OUT_DIR)" --keep-going --disable import-error $(LINT_DIRS) 2>&1 | tee "$(LINT_ARTIFACTS_DIR)/pytype.log"; \
else \
echo "Pytype skipped on Python ≥3.13" | tee "$(LINT_ARTIFACTS_DIR)/pytype.log"; \
fi
@[ -d .pytype ] && echo "→ removing stray .pytype" && rm -rf .pytype || true
@[ -d .mypy_cache ] && echo "→ removing stray .mypy_cache" && rm -rf .mypy_cache || true
@[ -d .ruff_cache ] && echo "→ removing stray .ruff_cache" && rm -rf .ruff_cache || true
@printf "OK\n" > "$(LINT_ARTIFACTS_DIR)/_passed"
lint-file:
ifndef file
$(error Usage: make lint-file file=path/to/file.py)
endif
@$(call run_tool,RuffFormat,$(RUFF) format --cache-dir "$(RUFF_CACHE_DIR)")
@$(call run_tool,Ruff,$(RUFF) check --fix --config config/ruff.toml --cache-dir "$(RUFF_CACHE_DIR)")
@$(call run_tool,Mypy,$(MYPY) --config-file config/mypy.ini --strict --cache-dir "$(MYPY_CACHE_DIR)")
@$(call run_tool,Codespell,$(CODESPELL) -I config/bijux.dic)
@$(call run_tool,Pyright,$(PYRIGHT) --project config/pyrightconfig.json)
@$(call run_tool,Radon,$(RADON) cc -s -a)
@$(call run_tool,Pydocstyle,$(PYDOCSTYLE) --convention=google)
@if $(VENV_PYTHON) -c 'import sys; sys.exit(0 if sys.version_info < (3,13) else 1)'; then \
$(call run_tool,Pytype,$(PYTYPE) -o "$(PYTYPE_OUT_DIR)" --keep-going --disable import-error); \
else \
echo "→ Skipping Pytype (unsupported on Python ≥ 3.13)"; \
fi
lint-dir:
ifndef dir
$(error Usage: make lint-dir dir=<directory_path>)
endif
@$(MAKE) LINT_DIRS="$(dir)" lint-artifacts
lint-clean:
@echo "→ Cleaning lint artifacts"
@rm -rf "$(LINT_ARTIFACTS_DIR)" .pytype .mypy_cache .ruff_cache || true
@echo "✔ done"
##@ Lint
lint: ## Run all lint checks; save logs to artifacts/lint/ (ruff/mypy/pytype caches under artifacts/lint)
lint-artifacts: ## Same as 'lint' (explicit), generates logs
lint-file: ## Lint a single file (requires file=<path>)
lint-dir: ## Lint a directory (requires dir=<path>)
lint-clean: ## Remove lint artifacts, including caches
**Make: Quality (`makefiles/quality.mk`)**
# Quality Configuration (evidence → artifacts/quality)
INTERROGATE_PATHS ?= src/bijux_cli
QUALITY_PATHS ?= src/bijux_cli
VULTURE := $(ACT)/vulture
DEPTRY := $(ACT)/deptry
REUSE := $(ACT)/reuse
INTERROGATE := $(ACT)/interrogate
PYTHON := $(shell command -v python3 || command -v python)
QUALITY_ARTIFACTS_DIR ?= artifacts/quality
QUALITY_OK_MARKER := $(QUALITY_ARTIFACTS_DIR)/_passed
ifeq ($(shell uname -s),Darwin)
BREW_PREFIX := $(shell command -v brew >/dev/null 2>&1 && brew --prefix)
CAIRO_PREFIX := $(shell test -n "$(BREW_PREFIX)" && brew --prefix cairo)
QUALITY_ENV := DYLD_FALLBACK_LIBRARY_PATH="$(BREW_PREFIX)/lib:$(CAIRO_PREFIX)/lib:$$DYLD_FALLBACK_LIBRARY_PATH"
else
QUALITY_ENV :=
endif
.PHONY: quality interrogate-report quality-clean
quality:
@echo "→ Running quality checks..."
@mkdir -p "$(QUALITY_ARTIFACTS_DIR)"
@echo " - Dead code analysis (Vulture)"
@set -euo pipefail; \
{ $(VULTURE) --version 2>/dev/null || echo vulture; } >"$(QUALITY_ARTIFACTS_DIR)/vulture.log"; \
OUT="$$( $(VULTURE) $(QUALITY_PATHS) --min-confidence 80 2>&1 || true )"; \
printf '%s\n' "$$OUT" >>"$(QUALITY_ARTIFACTS_DIR)/vulture.log"; \
if [ -z "$$OUT" ]; then echo "✔ Vulture: no dead code found." >>"$(QUALITY_ARTIFACTS_DIR)/vulture.log"; fi
@echo " - Dependency hygiene (Deptry)"
@set -euo pipefail; \
{ $(DEPTRY) --version 2>/dev/null || true; } >"$(QUALITY_ARTIFACTS_DIR)/deptry.log"; \
$(DEPTRY) $(QUALITY_PATHS) 2>&1 | tee -a "$(QUALITY_ARTIFACTS_DIR)/deptry.log"
@echo " - License & SPDX compliance (REUSE)"
@set -euo pipefail; \
{ $(REUSE) --version 2>/dev/null || true; } >"$(QUALITY_ARTIFACTS_DIR)/reuse.log"; \
$(REUSE) lint 2>&1 | tee -a "$(QUALITY_ARTIFACTS_DIR)/reuse.log"
@echo " - Documentation coverage (Interrogate)"
@$(MAKE) interrogate-report
@echo "✔ Quality checks passed"
@printf "OK\n" >"$(QUALITY_OK_MARKER)"
interrogate-report:
@echo "→ Generating docstring coverage report (<100%)"
@mkdir -p "$(QUALITY_ARTIFACTS_DIR)"
@set +e; \
OUT="$$( $(QUALITY_ENV) $(INTERROGATE) --verbose $(INTERROGATE_PATHS) )"; \
rc=$$?; \
printf '%s\n' "$$OUT" >"$(QUALITY_ARTIFACTS_DIR)/interrogate.full.txt"; \
OFF="$$(printf '%s\n' "$$OUT" | awk -F'|' 'NR>3 && $$0 ~ /^\|/ { \
name=$$2; cov=$$6; gsub(/^[ \t]+|[ \t]+$$/, "", name); gsub(/^[ \t]+|[ \t]+$$/, "", cov); \
if (name !~ /^-+$$/ && cov != "100%") printf(" - %s (%s)\n", name, cov); \
}')"; \
printf '%s\n' "$$OFF" >"$(QUALITY_ARTIFACTS_DIR)/interrogate.offenders.txt"; \
if [ -n "$$OFF" ]; then printf '%s\n' "$$OFF"; else echo "✔ All files 100% documented"; fi; \
exit $$rc
quality-clean:
@echo "→ Cleaning quality artifacts"
@rm -rf "$(QUALITY_ARTIFACTS_DIR)"
##@ Quality
quality: ## Run Vulture, Deptry, REUSE, Interrogate; save logs to artifacts/quality/
interrogate-report: ## Save full Interrogate table + offenders list
quality-clean: ## Remove artifacts/quality
**Ruff config (`config/ruff.toml`)**
target-version = "py311"
line-length = 88
respect-gitignore = true
src = ["src", "tests"]
exclude = [
".git", ".hg", ".mypy_cache", ".pytest_cache", ".ruff_cache", ".tox", ".venv",
"build", "dist", "docs", "htmlcov", "__pycache__", "migrations", "*.egg-info"
]
[lint]
select = [
"E","F","I","B","UP","SIM","PT","N","A","C4","S","TID","PERF",
# "RUF","ARG","TRY","T20","BLE","ERA"
]
ignore = ["E501", "E203"]
[lint.per-file-ignores]
"tests/**" = ["S101"]
"__init__.py" = ["F401"]
[lint.isort]
force-sort-within-sections = true
known-first-party = ["bijux_cli"]
required-imports = ["from __future__ import annotations"]
[lint.flake8-tidy-imports]
ban-relative-imports = "parents"
[lint.mccabe]
max-complexity = 10
**Interrogate (`pyproject.toml`)**
Artifacts: Lint Artifacts · Quality Artifacts
## Typing
Strict static typing with mypy and pyright; pytype runs when supported (≤3.12), skipped on 3.13+.
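To reproduce a CI result locally, the checkers can be invoked directly with the same configs the makefile passes (invocations mirrored from `lint.mk`):

```bash
mypy --config-file config/mypy.ini --strict src/bijux_cli tests
pyright --project config/pyrightconfig.json
pytype -o artifacts/lint/.pytype --keep-going --disable import-error src/bijux_cli tests  # Python ≤ 3.12 only
```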
**Mypy config (`config/mypy.ini`)**
[mypy]
python_version = 3.11
strict = true
show_error_codes = true
pretty = true
warn_unreachable = true
warn_unused_configs = true
warn_unused_ignores = true
follow_imports = silent
mypy_path = src
files = src, tests
exclude = ^(\.venv|build|dist|docs|htmlcov|\.mypy_cache|\.pytest_cache|\.ruff_cache|\.tox|__pycache__|migrations|\.egg-info|node_modules)/
plugins = pydantic.mypy
[mypy-cookiecutter.*]
ignore_missing_imports = true
[mypy-bijux_cli.core.di]
disable_error_code = type-abstract
[pydantic-mypy]
init_typed = true
warn_required_dynamic_aliases = true
warn_untyped_fields = true
**Pyright config (`config/pyrightconfig.json`)**
{
"include": [
"../src/bijux_cli",
"../tests"
],
"extraPaths": [
"../src",
".."
],
"exclude": [
".venv",
"build",
"dist",
"htmlcov",
".pytest_cache",
".mypy_cache",
".pytype",
".ruff_cache",
".tox",
"**/__pycache__",
"node_modules"
],
"pythonVersion": "3.11",
"typeCheckingMode": "strict",
"useLibraryCodeForTypes": true,
"reportMissingImports": "warning",
"reportMissingTypeStubs": "none",
"reportUnusedImport": "error",
"reportPrivateUsage": "warning",
"reportUnnecessaryTypeIgnoreComment": "warning",
"reportUnnecessaryCast": "warning",
"reportUnnecessaryIsInstance": "warning",
"reportOptionalSubscript": "error",
"reportOptionalMemberAccess": "error",
"reportOptionalCall": "error",
"reportOptionalIterable": "error",
"reportOptionalContextManager": "error",
"reportOptionalOperand": "error",
"reportGeneralTypeIssues": "error",
"reportUntypedClassDecorator": "error",
"reportIncompatibleMethodOverride": "error",
"reportUnknownVariableType": "none",
"reportUnknownParameterType": "none",
"reportUntypedFunctionDecorator": "none",
"reportUnknownMemberType": "none",
"reportUnknownArgumentType": "none",
"reportUnknownLambdaType": "none",
"reportUnusedVariable": "information",
"reportMethodAssign": "none",
"executionEnvironments": [
{
"root": ".",
"extraPaths": [
"src"
]
}
]
}
Artifacts: Logs live under Lint Artifacts — direct sections: `mypy.log` · `pyright.log` · `pytype.log`
## Tests & Coverage
- pytest + pytest-cov, overall coverage gate ≥ 98% (see config).
- Per-layer markers for fast selection (unit/integration/functional/e2e).
make test
pytest --cov=bijux_cli --cov-report=term-missing
pytest --cov=bijux_cli --cov-report=html && open htmlcov/index.html
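A few selection patterns that follow from `pytest.ini` (`--strict-markers` means only registered markers such as `slow` are safe to pass; the keyword filter is illustrative):

```bash
pytest -m "not slow"                          # deselect tests marked slow
pytest -k "config" -x                         # illustrative keyword filter; stop on first failure
pytest --cov=bijux_cli --cov-fail-under=98    # enforce the coverage gate explicitly
```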
**Coverage config (`config/coveragerc.ini`)**

**Pytest defaults (`pytest.ini`)**
[pytest]
minversion = 8.0
testpaths = tests
pythonpath = src
norecursedirs =
.venv
.tox
.pytest_cache
.ruff_cache
.mypy_cache
.hypothesis
.benchmarks
build
dist
htmlcov
docs
artifacts
cache_dir = artifacts/test/.pytest_cache
asyncio_mode = auto
timeout = 300
timeout_method = thread
timeout_func_only = true
addopts =
-ra
--strict-markers
--tb=short
--cov=bijux_cli
--cov-branch
--cov-config=config/coveragerc.ini
--cov-report=term-missing:skip-covered
--cov-report=html
--cov-fail-under=98
# --hypothesis-show-statistics
xfail_strict = true
markers =
slow: mark test as slow (deselect with '-m "not slow"')
windows: mark tests for Windows-only
filterwarnings =
ignore:Not saving anything, no benchmarks have been run!
ignore:jsonschema\.exceptions\.RefResolutionError is deprecated:DeprecationWarning
ignore:.*forkpty.*:DeprecationWarning
Artifacts: Test Artifacts · HTML coverage report · JUnit report
## Mutation & Property Tests
- mutmut and Cosmic Ray validate assertion strength.
- Hypothesis for property-based tests.
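A minimal Hypothesis property test in the style this suite encourages (the helper under test is hypothetical):

```python
from __future__ import annotations

from hypothesis import given, strategies as st


def normalize(name: str) -> str:
    """Hypothetical helper: trim surrounding whitespace from an item name."""
    return name.strip()


@given(st.text())
def test_normalize_is_idempotent(name: str) -> None:
    # Property: applying normalize twice equals applying it once.
    assert normalize(normalize(name)) == normalize(name)
```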
**Cosmic Ray config (`config/cosmic-ray.toml`)**
[cosmic-ray]
python = "python3.11"
test-command = "pytest -q -x -k 'not slow' --tb=short --disable-warnings"
module-path = ["src/bijux_cli"]
timeout = 300.0
[cosmic-ray.distributor]
name = "local"
worker-processes = 4
[cosmic-ray.mutators]
enabled = [
"arithmetic",
"boolean",
"comparison",
"constant",
"return-value",
]
[cosmic-ray.logging]
level = "INFO"
file = "artifacts/cosmic-ray.log"
format = "plain"
## Security & Supply Chain

- bandit (SAST), pip-audit (CVE scan)
- CycloneDX SBOM generation (`artifacts/sbom/`)
- SPDX compliance via REUSE
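Run the whole gate via make, or call the scanners directly with the same flags `security.mk` passes:

```bash
make security                              # bandit + pip-audit, reports in artifacts/security/
bandit -r src/bijux_cli -x .venv,venv,build,dist,.tox,.mypy_cache,.pytest_cache
pip-audit --ignore-vuln PYSEC-2022-42969 --skip-editable --progress-spinner off
```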
**Make: Security (`makefiles/security.mk`)**
# Security Configuration (no SBOM here; SBOM is handled in sbom.mk)
SECURITY_PATHS ?= src/bijux_cli
BANDIT ?= $(if $(ACT),$(ACT)/bandit,bandit)
PIP_AUDIT ?= $(if $(ACT),$(ACT)/pip-audit,pip-audit)
VENV_PYTHON ?= $(if $(VIRTUAL_ENV),$(VIRTUAL_ENV)/bin/python,python)
SECURITY_REPORT_DIR ?= artifacts/security
BANDIT_JSON := $(SECURITY_REPORT_DIR)/bandit.json
BANDIT_TXT := $(SECURITY_REPORT_DIR)/bandit.txt
PIPA_JSON := $(SECURITY_REPORT_DIR)/pip-audit.json
PIPA_TXT := $(SECURITY_REPORT_DIR)/pip-audit.txt
SECURITY_IGNORE_IDS ?= PYSEC-2022-42969
SECURITY_IGNORE_FLAGS = $(foreach V,$(SECURITY_IGNORE_IDS),--ignore-vuln $(V))
PIP_AUDIT_CONSOLE_FLAGS ?= --skip-editable --progress-spinner off
PIP_AUDIT_INPUTS ?=
SECURITY_STRICT ?= 1
BANDIT_EXCLUDES ?= .venv,venv,build,dist,.tox,.mypy_cache,.pytest_cache
BANDIT_THREADS ?= 0
.PHONY: security security-bandit security-audit security-clean
security: security-bandit security-audit
security-bandit:
@mkdir -p "$(SECURITY_REPORT_DIR)"
@echo "→ Bandit (Python static analysis)"
@$(BANDIT) -r "$(SECURITY_PATHS)" -x "$(BANDIT_EXCLUDES)" -f json -o "$(BANDIT_JSON)" -n $(BANDIT_THREADS) || true
@$(BANDIT) -r "$(SECURITY_PATHS)" -x "$(BANDIT_EXCLUDES)" -n $(BANDIT_THREADS) | tee "$(BANDIT_TXT)"
security-audit:
@mkdir -p "$(SECURITY_REPORT_DIR)"
@echo "→ Pip-audit (dependency vulnerability scan)"
@set -e; RC=0; \
$(PIP_AUDIT) $(SECURITY_IGNORE_FLAGS) $(PIP_AUDIT_CONSOLE_FLAGS) $(PIP_AUDIT_INPUTS) \
-f json -o "$(PIPA_JSON)" >/dev/null 2>&1 || RC=$$?; \
if [ $$RC -ne 0 ]; then \
echo "! pip-audit invocation failed (rc=$$RC)"; \
if [ "$(SECURITY_STRICT)" = "1" ]; then exit $$RC; fi; \
fi
@set -o pipefail; \
PIPA_JSON="$(PIPA_JSON)" \
SECURITY_STRICT="$(SECURITY_STRICT)" \
SECURITY_IGNORE_IDS="$(SECURITY_IGNORE_IDS)" \
"$(VENV_PYTHON)" scripts/helper_pip_audit.py | tee "$(PIPA_TXT)"
security-clean:
@rm -rf "$(SECURITY_REPORT_DIR)"
##@ Security
security: ## Run Bandit and pip-audit; save reports to $(SECURITY_REPORT_DIR)
security-bandit: ## Run Bandit (screen + JSON artifact)
security-audit: ## Run pip-audit (JSON once) and gate via scripts/helper_pip_audit.py; prints concise summary
security-clean: ## Remove security reports
**Make: SBOM (`makefiles/sbom.mk`)**
# SBOM Configuration (pip-audit → CycloneDX JSON)
PACKAGE_NAME ?= bijux-cli
GIT_SHA ?= $(shell git rev-parse --short HEAD 2>/dev/null || echo unknown)
GIT_TAG_EXACT := $(shell git describe --tags --exact-match 2>/dev/null | sed -E 's/^v//')
GIT_TAG_LATEST := $(shell git describe --tags --abbrev=0 2>/dev/null | sed -E 's/^v//')
PYPROJECT_VERSION = $(call read_pyproject_version)
PKG_VERSION ?= $(if $(GIT_TAG_EXACT),$(GIT_TAG_EXACT),\
$(if $(PYPROJECT_VERSION),$(PYPROJECT_VERSION),\
$(if $(GIT_TAG_LATEST),$(GIT_TAG_LATEST),0.0.0)))
GIT_DESCRIBE := $(shell git describe --tags --long --dirty --always 2>/dev/null)
PKG_VERSION_FULL := $(if $(GIT_TAG_EXACT),$(PKG_VERSION),\
$(shell echo "$(GIT_DESCRIBE)" \
| sed -E 's/^v//; s/-([0-9]+)-g([0-9a-f]+)(-dirty)?$$/+\\1.g\\2\\3/'))
SBOM_VERSION := $(if $(PKG_VERSION_FULL),$(PKG_VERSION_FULL),$(PKG_VERSION))
SBOM_DIR ?= artifacts/sbom
SBOM_PROD_REQ ?= requirements/prod.txt
SBOM_DEV_REQ ?= requirements/dev.txt
SBOM_FORMAT ?= cyclonedx-json # pip-audit format
SBOM_CLI ?= cyclonedx # cyclonedx-cli for validation
SBOM_IGNORE_IDS ?= PYSEC-2022-42969
SBOM_IGNORE_FLAGS = $(foreach V,$(SBOM_IGNORE_IDS),--ignore-vuln $(V))
PIP_AUDIT := $(if $(ACT),$(ACT)/pip-audit,pip-audit)
PIP_AUDIT_FLAGS = --progress-spinner off --format $(SBOM_FORMAT)
SBOM_PROD_FILE := $(SBOM_DIR)/$(PACKAGE_NAME)-$(SBOM_VERSION)-$(GIT_SHA).prod.cdx.json
SBOM_DEV_FILE := $(SBOM_DIR)/$(PACKAGE_NAME)-$(SBOM_VERSION)-$(GIT_SHA).dev.cdx.json
.PHONY: sbom sbom-prod sbom-dev sbom-validate sbom-summary sbom-clean
sbom: sbom-clean sbom-prod sbom-dev sbom-summary
@echo "✔ SBOMs generated in $(SBOM_DIR)"
sbom-prod:
@mkdir -p "$(SBOM_DIR)"
@if [ -s "$(SBOM_PROD_REQ)" ]; then \
echo "→ SBOM (prod via $(SBOM_PROD_REQ))"; \
$(PIP_AUDIT) $(PIP_AUDIT_FLAGS) $(SBOM_IGNORE_FLAGS) \
-r "$(SBOM_PROD_REQ)" --output "$(SBOM_PROD_FILE)" || true; \
else \
echo "→ SBOM (prod fallback: current venv)"; \
$(PIP_AUDIT) $(PIP_AUDIT_FLAGS) $(SBOM_IGNORE_FLAGS) \
--output "$(SBOM_PROD_FILE)" || true; \
fi
sbom-dev:
@mkdir -p "$(SBOM_DIR)"
@if [ -s "$(SBOM_DEV_REQ)" ]; then \
echo "→ SBOM (dev via $(SBOM_DEV_REQ))"; \
$(PIP_AUDIT) $(PIP_AUDIT_FLAGS) $(SBOM_IGNORE_FLAGS) \
-r "$(SBOM_DEV_REQ)" --output "$(SBOM_DEV_FILE)" || true; \
else \
echo "→ SBOM (dev fallback: current venv)"; \
$(PIP_AUDIT) $(PIP_AUDIT_FLAGS) $(SBOM_IGNORE_FLAGS) \
--output "$(SBOM_DEV_FILE)" || true; \
fi
sbom-validate:
@if [ -z "$(SBOM_CLI)" ]; then echo "✘ SBOM_CLI not set"; exit 1; fi
@command -v $(SBOM_CLI) >/dev/null 2>&1 || { echo "✘ '$(SBOM_CLI)' not found. Install it or set SBOM_CLI."; exit 1; }
@if ! find "$(SBOM_DIR)" -maxdepth 1 -name '*.cdx.json' -print -quit | grep -q .; then \
echo "✘ No SBOM files in $(SBOM_DIR)"; exit 1; \
fi
@for f in "$(SBOM_DIR)"/*.cdx.json; do \
echo "→ Validating $$f"; \
$(SBOM_CLI) validate --input-format json --input-file "$$f"; \
done
sbom-summary:
@mkdir -p "$(SBOM_DIR)"
@if ! find "$(SBOM_DIR)" -maxdepth 1 -name '*.cdx.json' -print -quit | grep -q .; then \
echo "→ No SBOM files found in $(SBOM_DIR); skipping summary"; \
exit 0; \
fi
@echo "→ Writing SBOM summary"
@summary="$(SBOM_DIR)/summary.txt"; : > "$$summary"; \
if command -v jq >/dev/null 2>&1; then \
for f in "$(SBOM_DIR)"/*.cdx.json; do \
comps=$$(jq -r '(.components|length) // 0' "$$f"); \
echo "$$(basename "$$f") components=$$comps" >> "$$summary"; \
done; \
else \
tmp="$(SBOM_DIR)/_sbom_summary.py"; \
echo "import glob, json, os" > "$$tmp"; \
echo "sbom_dir = r'$(SBOM_DIR)'" >> "$$tmp"; \
echo "for f in glob.glob(os.path.join(sbom_dir, '*.cdx.json')):" >> "$$tmp"; \
echo " try:" >> "$$tmp"; \
echo " with open(f, 'r', encoding='utf-8') as fh:" >> "$$tmp"; \
echo " d = json.load(fh)" >> "$$tmp"; \
echo " comps = len(d.get('components', []) or [])" >> "$$tmp"; \
echo " except Exception:" >> "$$tmp"; \
echo " comps = '?'" >> "$$tmp"; \
echo " print(os.path.basename(f) + ' components=' + str(comps))" >> "$$tmp"; \
python3 "$$tmp" >> "$$summary" || true; \
rm -f "$$tmp"; \
fi; \
sed -n '1,5p' "$$summary" 2>/dev/null || true
sbom-clean:
@echo "→ Cleaning SBOM artifacts"
@mkdir -p "$(SBOM_DIR)"
@rm -f \
"$(SBOM_DIR)/$(PACKAGE_NAME)-0.0.0-"*.cdx.json \
"$(SBOM_DIR)/$(PACKAGE_NAME)--"*.cdx.json || true
##@ SBOM
sbom: ## Generate SBOMs for prod/dev (pip-audit → CycloneDX JSON) and a short summary
sbom-validate: ## Validate all generated SBOMs with CycloneDX CLI
sbom-summary: ## Write a brief components summary to $(SBOM_DIR)/summary.txt (best-effort)
sbom-clean: ## Remove stale SBOM artifacts from $(SBOM_DIR)
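Generated SBOMs are plain CycloneDX JSON, so a quick sanity check needs nothing beyond jq (this mirrors what `sbom-summary` records):

```bash
make sbom                                             # write *.cdx.json into artifacts/sbom/
jq '.components | length' artifacts/sbom/*.cdx.json   # component count per SBOM
```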
**REUSE config (`REUSE.toml`)**
version = 1
# Config/docs/assets: public domain
[[annotations]]
path = [
"**/*.png", "**/*.svg", "**/*.ico", "**/*.gif", "**/*.jpg", "**/*.jpeg",
"**/*.html", "**/*.toml", "**/*.ini", "**/*.cfg", "**/*.conf",
"**/*.env", "**/*.env.*", "**/*.yaml", "**/*.yml", "**/*.json",
"**/*.cff", "**/*.dic", ".coverage*", ".gitattributes", ".gitignore",
"changelog.d/**", "**/.editorconfig", "artifacts/**", "scripts/git-hooks/**",
"docs/assets/styles/**"
]
precedence = "override"
SPDX-License-Identifier = "CC0-1.0"
SPDX-FileCopyrightText = "© 2025 Bijan Mousavi"
# Templates: public domain
[[annotations]]
path = ["plugin_template/**"]
precedence = "override"
SPDX-License-Identifier = "CC0-1.0"
SPDX-FileCopyrightText = "© 2025 Bijan Mousavi"
# Code: MIT
[[annotations]]
path = ["**/*.py", "**/*.pyi", "**/*.sh", "**/*.mk", "Makefile", "Dockerfile", "Dockerfile.*"]
precedence = "closest"
SPDX-License-Identifier = "MIT"
SPDX-FileCopyrightText = "© 2025 Bijan Mousavi"
# Markdown docs: MIT
[[annotations]]
path = ["**/*.md"]
precedence = "closest"
SPDX-License-Identifier = "MIT"
SPDX-FileCopyrightText = "© 2025 Bijan Mousavi"
Artifacts: Security Artifacts · SBOM Artifacts
## Docs Toolchain

- MkDocs (Material) + mkdocstrings + literate-nav.
- `scripts/docs_builder/mkdocs_manager.py` copies top-level Markdown into `docs/`, generates `reference/**` API pages and `nav.md`, and ensures `{#top}` anchors.
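Local preview and a strict build use the standard MkDocs commands; `mkdocs.yml` already points `docs_dir`/`site_dir` at the generated tree under `artifacts/docs`:

```bash
mkdocs serve            # live preview at 127.0.0.1:8000; gen-files reruns mkdocs_manager.py
mkdocs build --strict   # fail on warnings; site lands in artifacts/docs/site
```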
**MkDocs config (`mkdocs.yml`)**
site_name: Bijux CLI
site_description: A modern, predictable Python CLI framework with plugins, DI, strict flag precedence, and an interactive REPL.
site_url: !ENV [SITE_URL, https://bijux.github.io/bijux-cli/]
site_author: Bijan Mousavi
repo_url: https://github.com/bijux/bijux-cli
repo_name: bijux/bijux-cli
edit_uri: ""
strict: true
use_directory_urls: true
dev_addr: 127.0.0.1:8000
docs_dir: artifacts/docs/docs
site_dir: artifacts/docs/site
theme:
name: material
custom_dir: docs/overrides
logo: assets/bijux_logo_hq.png
favicon: assets/bijux_icon.png
font:
text: Roboto
code: Roboto Mono
palette:
- scheme: default
primary: indigo
accent: indigo
toggle:
icon: material/toggle-switch-off-outline
name: Switch to dark mode
- scheme: slate
primary: teal
accent: lime
toggle:
icon: material/toggle-switch
name: Switch to light mode
features:
- navigation.tabs
- navigation.tabs.sticky
- navigation.sections
- navigation.indexes
- navigation.top
- navigation.path
- navigation.tracking
- navigation.instant
- toc.follow
- search.suggest
- search.highlight
- content.tabs.link
- content.code.annotate
- content.code.copy
- content.code.select
- content.action.edit
- content.action.view
- navigation.footer
icon:
repo: fontawesome/brands/github
plugins:
- gen-files:
scripts:
- scripts/docs_builder/mkdocs_manager.py
- literate-nav:
nav_file: nav.md
- include-markdown:
recursive: true
dedent: true
preserve_includer_indent: false
trailing_newlines: true
comments: false
rewrite_relative_urls: true
- mkdocstrings:
handlers:
python:
paths: [src]
options:
docstring_style: google
docstring_section_style: list
merge_init_into_class: true
show_source: true
show_if_no_docstring: false
show_root_heading: true
heading_level: 2
separate_signature: true
signature_crossrefs: true
filters: ["!^_"]
- search
- social:
enabled: !ENV [ENABLE_SOCIAL_CARDS, false]
cache_dir: artifacts/docs/.cache/social
- autorefs
- redirects: {}
- tags
- minify:
minify_html: true
minify_js: true
minify_css: true
- glightbox:
touchNavigation: true
loop: false
effect: zoom
markdown_extensions:
- admonition
- footnotes
- attr_list
- md_in_html
- toc:
permalink: true
- pymdownx.highlight:
anchor_linenums: true
- pymdownx.inlinehilite
- pymdownx.snippets:
base_path:
- .
- docs
- pymdownx.details
- pymdownx.superfences
- pymdownx.mark
- pymdownx.magiclink
- pymdownx.tasklist
- pymdownx.tabbed:
alternate_style: true
- pymdownx.keys
- pymdownx.arithmatex:
generic: true
- pymdownx.emoji:
emoji_index: !!python/name:material.extensions.emoji.twemoji
emoji_generator: !!python/name:material.extensions.emoji.to_svg
extra:
social:
- icon: fontawesome/brands/github
link: https://github.com/bijux/bijux-cli
- icon: fontawesome/brands/python
link: https://pypi.org/project/bijux-cli/
analytics:
provider: plausible
domain: bijux.github.io
copyright:
text: "© 2025 Bijan Mousavi"
release_tag: !ENV [RELEASE_TAG, "main"]
extra_css:
- assets/styles/extra.css
extra_javascript:
- https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js
watch:
- artifacts
- src
- scripts
- config
- tests
- changelog.d
**Docs generator (`scripts/docs_builder/mkdocs_manager.py`)**
# SPDX-License-Identifier: MIT
# Copyright © 2025 Bijan Mousavi
"""Main build manager for generating the MkDocs documentation site.
This script serves as the entrypoint for the `mkdocs-gen-files` plugin. It
orchestrates the entire documentation generation process, including:
- Materializing top-level project Markdown files (e.g., README, USAGE).
- Finding and processing Architecture Decision Records (ADRs).
- Generating API reference documentation from Python source files using
`mkdocstrings`.
- Creating index pages for all documentation sections.
- Building detailed pages for CI/CD artifacts (linting, testing, etc.).
- Composing a complete `nav.md` file for the `literate-nav` plugin to
construct the site navigation.
"""
from __future__ import annotations
import os
import sys
from pathlib import Path
from typing import Dict
from typing import List
from typing import Optional
from typing import Tuple
from typing import Callable
sys.path.insert(0, str(Path(__file__).resolve().parents[2]))
from scripts.docs_builder.artifacts_pages.api_page import APIArtifactPage
from scripts.docs_builder.artifacts_pages.citation_page import CitationArtifactPage
from scripts.docs_builder.artifacts_pages.lint_page import LintArtifactPage
from scripts.docs_builder.artifacts_pages.quality_page import QualityArtifactPage
from scripts.docs_builder.artifacts_pages.sbom_page import SBOMArtifactPage
from scripts.docs_builder.artifacts_pages.security_page import SecurityArtifactPage
from scripts.docs_builder.artifacts_pages.test_page import TestArtifactPage
from scripts.docs_builder.helpers import INDENT1
from scripts.docs_builder.helpers import INDENT2
from scripts.docs_builder.helpers import INDENT3
from scripts.docs_builder.helpers import NAV_FILE
from scripts.docs_builder.helpers import REPO_ROOT
from scripts.docs_builder.helpers import SRC_DIR
from scripts.docs_builder.helpers import ensure_top_anchor
from scripts.docs_builder.helpers import final_fixups
from scripts.docs_builder.helpers import fs_read_text
from scripts.docs_builder.helpers import nav_add_bullets
from scripts.docs_builder.helpers import nav_header
from scripts.docs_builder.helpers import pretty_title
from scripts.docs_builder.helpers import rewrite_links_general
from scripts.docs_builder.helpers import rewrite_links_tree
from scripts.docs_builder.helpers import write_if_changed
ADR_SRC_PRIMARY = REPO_ROOT / "ADR"
ADR_SRC_FALLBACK = REPO_ROOT / "docs" / "ADR"
ADR_DEST_DIR = Path("ADR")
PAGE_META_NO_EDIT = (
"---\nhide:\n - edit\n---\n\n"
)
def _pick_adr_source() -> Optional[Path]:
"""Selects the source directory for Architecture Decision Records (ADRs).
It prefers the top-level `ADR/` directory. If that does not exist, it
falls back to `docs/ADR/`.
Returns:
The path to the ADR source directory, or None if neither exists.
"""
if ADR_SRC_PRIMARY.is_dir():
return ADR_SRC_PRIMARY
if ADR_SRC_FALLBACK.is_dir():
return ADR_SRC_FALLBACK
return None
def _iter_adr_files(src_root: Path) -> List[Path]:
"""Lists all ADR Markdown files in a directory, sorted by name.
It excludes any `index.md` file from the list.
Args:
src_root: The directory to search for ADR files.
Returns:
A sorted list of paths to the ADR files.
"""
return sorted(
[p for p in src_root.glob("*.md") if p.is_file() and p.name != "index.md"],
key=lambda p: p.name,
)
def _adr_display_name(filename: str) -> str:
"""Formats a user-friendly title from an ADR filename.
For example, "0001-some-decision.md" becomes "ADR 0001: Some Decision".
Args:
filename: The name of the ADR file.
Returns:
A formatted, human-readable title string.
"""
stem = filename[:-3]
parts = stem.split("-", 1)
if len(parts) == 2 and parts[0].isdigit():
adr_num, title_raw = parts
return f"ADR {adr_num.zfill(4)}: {title_raw.replace('-', ' ').title()}"
return stem.replace("-", " ").title()
def _materialize_root_docs() -> None:
"""Copy key project files into the docs site; create fallbacks if absent."""
pairs: List[Tuple[Path, Path, Callable[[str], str]]] = [
(REPO_ROOT / "README.md", Path("index.md"), rewrite_links_general),
(REPO_ROOT / "USAGE.md", Path("usage.md"), rewrite_links_general),
(REPO_ROOT / "TESTS.md", Path("tests.md"), rewrite_links_general),
(REPO_ROOT / "PROJECT_TREE.md", Path("project_tree.md"), rewrite_links_tree),
(REPO_ROOT / "TOOLING.md", Path("tooling.md"), rewrite_links_general),
(REPO_ROOT / "SECURITY.md", Path("security.md"), rewrite_links_general),
(REPO_ROOT / "CODE_OF_CONDUCT.md",Path("code_of_conduct.md"), rewrite_links_general),
(REPO_ROOT / "CONTRIBUTING.md", Path("contributing.md"), rewrite_links_general),
(REPO_ROOT / "CHANGELOG.md", Path("changelog.md"), rewrite_links_general),
(REPO_ROOT / "LICENSES" / "MIT.txt", Path("license.md"), rewrite_links_general),
]
have_index = False
for src, dst, fixer in pairs:
if not src.exists():
continue
raw = fs_read_text(src)
md = ensure_top_anchor(fixer(raw))
md = final_fixups(md)
md = PAGE_META_NO_EDIT + md
write_if_changed(dst, md)
if dst.as_posix() == "index.md":
have_index = True
if not have_index:
fallback = PAGE_META_NO_EDIT + (
"# Bijux CLI {#top}\n\n"
"_Auto-generated skeleton page._\n\n"
"- [API Reference](reference/index.md)\n"
"- [Artifacts](artifacts/index.md)\n"
"- [Architecture Decision Records](ADR/index.md)\n"
)
write_if_changed("index.md", fallback)
def _materialize_adrs() -> None:
"""Copies ADRs from the source directory into the virtual docs filesystem.
This step is skipped if the ADRs are already located in the on-disk
`docs/ADR/` directory, as `mkdocs-gen-files` will pick them up automatically.
"""
src_root = _pick_adr_source()
if not src_root or src_root == ADR_SRC_FALLBACK:
return
for src in _iter_adr_files(src_root):
dst = ADR_DEST_DIR / src.name
raw = fs_read_text(src)
md = ensure_top_anchor(rewrite_links_general(raw))
md = final_fixups(md)
md = PAGE_META_NO_EDIT + md
write_if_changed(dst, md)
def _generate_adr_index() -> None:
"""Generates the `ADR/index.md` file in the virtual docs filesystem.
This ensures a correct and up-to-date index is always available,
regardless of whether an index file exists in the source directory.
"""
src_root = _pick_adr_source()
if not src_root:
return
files = _iter_adr_files(src_root)
if not files:
return
lines = [PAGE_META_NO_EDIT, "# Architecture Decision Records {#top}\n\n"]
for p in files:
lines.append(f"- [{_adr_display_name(p.name)}](./{p.name})\n")
write_if_changed(ADR_DEST_DIR / "index.md", "".join(lines))
def _generate_api_pages() -> Dict[str, List[Tuple[str, str]]]:
"""Walks the source directory and generates API reference pages.
For each Python module found in the `SRC_DIR`, this function creates a
corresponding Markdown file in the `reference/` virtual directory. The
content of each file is a `mkdocstrings` block configured to render the
API documentation for that module.
Returns:
A dictionary mapping each reference subdirectory to a list of pages
it contains. Each page is a tuple of (display_name, path). This
structure is used to build index pages and the site navigation.
"""
ref_dir_to_pages: Dict[str, List[Tuple[str, str]]] = {}
for root, _, files in os.walk(SRC_DIR):
rel_root = os.path.relpath(root, SRC_DIR)
section = None if rel_root == "." else rel_root
for file in files:
if not file.endswith(".py") or file.startswith("__") or file == "py.typed":
continue
module_name = file[:-3]
raw_md_path = os.path.join("reference", rel_root, f"{module_name}.md")
md_path = os.path.normpath(raw_md_path).replace("\\", "/")
is_command = (section or "").split(os.sep, 1)[0] == "commands"
header = f"# {module_name.capitalize()} Command API Reference\n" if is_command else f"# {module_name.capitalize()} Module API Reference\n"
blurb = f"This section documents the internals of the `{module_name}` command in Bijux CLI.\n" if is_command else f"This section documents the internals of the `{module_name}` module in Bijux CLI.\n"
full_module_path = f"bijux_cli.{module_name}" if section is None else f"bijux_cli.{section.replace(os.sep, '.')}.{module_name}"
content = (
PAGE_META_NO_EDIT
+ header
+ blurb
+ f"::: {full_module_path}\n"
+ " handler: python\n"
+ " options:\n"
+ " show_root_heading: true\n"
+ " show_source: true\n"
+ " show_signature_annotations: true\n"
+ " docstring_style: google\n"
)
write_if_changed(Path(md_path), content)
label = "Command" if is_command else "Module"
display_name = f"{pretty_title(Path(md_path).stem)} {label}"
ref_dir = os.path.dirname(md_path) or "reference"
ref_dir_to_pages.setdefault(ref_dir, []).append((display_name, md_path))
return ref_dir_to_pages
def _write_reference_indexes(ref_dir_to_pages: Dict[str, List[Tuple[str, str]]]) -> set[str]:
"""Creates `index.md` files for all API reference directories.
Args:
ref_dir_to_pages: The mapping of directories to pages from `_generate_api_pages`.
Returns:
A set of all directory paths within the API reference section.
"""
all_dirs: set[str] = {"reference"}
for ref_dir in ref_dir_to_pages:
parts = ref_dir.split("/")
for i in range(1, len(parts) + 1):
all_dirs.add("/".join(parts[:i]))
for ref_dir in sorted(all_dirs):
title = ref_dir.replace("reference", "Reference").strip("/").replace("/", " / ") or "Reference"
lines = [PAGE_META_NO_EDIT, f"# {title.title()} Index\n\n"]
for display_name, md_link in sorted(ref_dir_to_pages.get(ref_dir, []), key=lambda x: x[0].lower()):
lines.append(f"- [{display_name}]({Path(md_link).name})\n")
write_if_changed(Path(ref_dir) / "index.md", "".join(lines))
return all_dirs
def _compose_nav(ref_dir_to_pages: Dict[str, List[Tuple[str, str]]], all_dirs: set[str]) -> None:
"""Programmatically composes the entire site navigation in `nav.md`.
This function builds a Markdown list that `mkdocs-literate-nav` uses to
create the site's navigation tree. The structure is highly ordered and
builds several main sections, including top-level pages, a nested API
Reference section, ADRs, and artifact reports.
Args:
ref_dir_to_pages: The mapping of reference directories to pages.
all_dirs: A set of all reference directories that exist.
"""
nav = nav_header()
nav = nav_add_bullets(
nav,
[
"* [Home](index.md)",
"* [Usage](usage.md)",
"* [Project Overview](project_tree.md)",
"* [Tests](tests.md)",
"* [Tooling](tooling.md)",
"* API Reference",
f"{INDENT1}* [Index](reference/index.md)",
],
)
root_pages = ref_dir_to_pages.get("reference", [])
root_by_stem = {Path(p).stem.lower(): (name, p) for name, p in root_pages}
for stem in ("api", "cli", "httpapi"):
if stem in root_by_stem:
name, p = root_by_stem.pop(stem)
nav = nav_add_bullets(nav, [f"{INDENT1}* [{name}]({p})"])
for name, p in sorted(root_by_stem.values(), key=lambda x: x[0].lower()):
nav = nav_add_bullets(nav, [f"{INDENT1}* [{name}]({p})"])
SECTION_ORDER = ("commands", "contracts", "core", "infra", "services")
section_dirs = [f"reference/{s}" for s in SECTION_ORDER if f"reference/{s}" in all_dirs]
for section_dir in section_dirs:
section_name = section_dir.split("/", 1)[1].capitalize()
nav = nav_add_bullets(nav, [f"{INDENT1}* {section_name}", f"{INDENT2}* [Index]({section_dir}/index.md)"])
pages_here = sorted(ref_dir_to_pages.get(section_dir, []), key=lambda x: x[0].lower())
if pages_here:
bucket = "Commands" if section_dir.endswith("/commands") else "Modules"
nav = nav_add_bullets(nav, [f"{INDENT2}* {bucket}"])
for display_name, md_link in pages_here:
nav = nav_add_bullets(nav, [f"{INDENT3}* [{display_name}]({md_link})"])
subdirs = sorted(d for d in all_dirs if d.startswith(section_dir + "/"))
seen = {section_dir}
for sub_dir in subdirs:
if sub_dir in seen:
continue
seen.add(sub_dir)
subgroup_title = pretty_title(Path(sub_dir).name)
nav = nav_add_bullets(nav, [f"{INDENT2}* {subgroup_title}", f"{INDENT3}* [Index]({sub_dir}/index.md)"])
for display_name, md_link in sorted(ref_dir_to_pages.get(sub_dir, []), key=lambda x: x[0].lower()):
nav = nav_add_bullets(nav, [f"{INDENT3}* [{display_name}]({md_link})"])
src_root = _pick_adr_source()
if src_root and (files := _iter_adr_files(src_root)):
nav = nav_add_bullets(nav, ["* Architecture", f"{INDENT1}* [Decision Records](ADR/index.md)"])
for p in files:
nav = nav_add_bullets(nav, [f"{INDENT2}* [{_adr_display_name(p.name)}](ADR/{p.name})"])
nav = nav_add_bullets(nav, ["* [Changelog](changelog.md)"])
community_pages = [
("Code of Conduct", "code_of_conduct.md"),
("Contributing", "contributing.md"),
("Security Policy", "security.md"),
("License", "license.md"),
]
landing = [PAGE_META_NO_EDIT, "# Community {#top}\n\n",
"Project policies and how to get involved.\n\n"]
for title, path in community_pages:
landing.append(f"- [{title}]({path})\n")
write_if_changed(Path("community.md"), "".join(landing))
nav = nav_add_bullets(nav, ["* [Community](community.md)"])
for title, path in community_pages:
nav = nav_add_bullets(nav, [f"{INDENT1}* [{title}]({path})"])
artifacts = [
("Test Artifacts", "artifacts/test.md"),
("Lint Artifacts", "artifacts/lint.md"),
("Quality Artifacts", "artifacts/quality.md"),
("Security Artifacts", "artifacts/security.md"),
("API Artifacts", "artifacts/api.md"),
("SBOM Artifacts", "artifacts/sbom.md"),
("Citation Artifacts", "artifacts/citation.md")
]
landing = [PAGE_META_NO_EDIT, "# Artifacts {#top}\n\n",
"Collected CI/test reports and logs.\n\n"]
for title, path in artifacts:
landing.append(f"- [{title}]({Path(path).name})\n")
write_if_changed(Path("artifacts/index.md"), "".join(landing))
nav = nav_add_bullets(nav, ["* [Artifacts](artifacts/index.md)"])
for title, path in artifacts:
nav = nav_add_bullets(nav, [f"{INDENT1}* [{title}]({path})"])
write_if_changed(NAV_FILE, nav)
def main() -> None:
"""The main entrypoint for the documentation generation script.
Orchestrates the entire build process by calling functions in sequence to:
1. Materialize root documentation files.
2. Materialize and index ADRs.
3. Generate API reference pages and their indexes.
4. Build all artifact-specific documentation pages.
5. Compose the final site navigation file.
"""
_materialize_root_docs()
_materialize_adrs()
ref = _generate_api_pages()
print(f"[docs] generated {sum(len(v) for v in ref.values())} reference pages")
dirs = _write_reference_indexes(ref)
_generate_adr_index()
TestArtifactPage().build()
LintArtifactPage().build()
QualityArtifactPage().build()
SecurityArtifactPage().build()
APIArtifactPage().build()
SBOMArtifactPage().build()
CitationArtifactPage().build()
_compose_nav(ref, dirs)
if __name__ == "__main__":
main()
## API Tooling
- Validation: Prance, openapi-spec-validator, Redocly
- Codegen compatibility: OpenAPI Generator CLI
- Schemathesis contract tests against a running server
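End to end via make, or piecewise for debugging (host, port, and health path per the defaults in `api.mk`):

```bash
make api                                 # install → lint → Schemathesis
make api-serve-bg                        # background server; PID in artifacts/api/server.pid
curl -fsS http://127.0.0.1:8000/health   # the same readiness probe the makefile uses
make api-stop
```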
**Make: API (`makefiles/api.mk`)**
# API configuration — organized, zero root pollution
# ── Server / app
SHELL := /bin/bash
APP_DIR ?= src
API_HOST ?= 127.0.0.1
API_PORT ?= 8000
API_BASE_PATH ?= /v1
API_APP ?= app
API_MODULE ?= bijux_cli.httpapi
API_FACTORY ?=
API_WAIT_SECS ?= 30
HEALTH_PATH ?= /health
SCHEMA_URL ?= http://$(API_HOST):$(API_PORT)
# Workaround for older Schemathesis versions that may hang after successful test completion.
SCHEMATHESIS_TIMEOUT ?= 30
# ── Artifacts
API_ARTIFACTS_DIR ?= artifacts/api
API_LOG ?= $(API_ARTIFACTS_DIR)/server.log
API_LINT_DIR ?= $(API_ARTIFACTS_DIR)/lint
API_TEST_DIR ?= $(API_ARTIFACTS_DIR)/test
SCHEMA_BUNDLE_DIR ?= $(API_ARTIFACTS_DIR)/schemas
HYPOTHESIS_DB_API ?= $(API_TEST_DIR)/hypothesis
SCHEMATHESIS_JUNIT ?= $(API_TEST_DIR)/schemathesis.xml
SCHEMATHESIS_JUNIT_ABS := $(abspath $(SCHEMATHESIS_JUNIT))
# ── Node tool sandbox (no root pollution)
API_NODE_DIR ?= $(API_ARTIFACTS_DIR)/node
OPENAPI_GENERATOR_VERSION ?= 7.14.0
# Find schemas
ALL_API_SCHEMAS := $(shell find api -type f \( -name '*.yaml' -o -name '*.yml' \))
ALL_API_SCHEMAS_ABS := $(abspath $(ALL_API_SCHEMAS))
# Python CLIs (prefer ACT if present)
PRANCE := $(ACT)/prance
OPENAPI_SPEC_VALIDATOR := $(ACT)/openapi-spec-validator
SCHEMATHESIS := $(ACT)/schemathesis
SCHEMATHESIS_OPTS ?= \
--checks=all --max-failures=1 \
--report junit --report-junit-path $(SCHEMATHESIS_JUNIT_ABS) \
--request-timeout=5 --max-response-time=3 \
--max-examples=50 --seed=1 --generation-deterministic --exclude-checks=positive_data_acceptance \
--suppress-health-check=filter_too_much
# ── Absolute paths (safe if recipe cd's)
API_ARTIFACTS_DIR_ABS := $(abspath $(API_ARTIFACTS_DIR))
API_LINT_DIR_ABS := $(abspath $(API_LINT_DIR))
API_TEST_DIR_ABS := $(abspath $(API_TEST_DIR))
SCHEMA_BUNDLE_DIR_ABS := $(abspath $(SCHEMA_BUNDLE_DIR))
API_LOG_ABS := $(abspath $(API_LOG))
API_NODE_DIR_ABS := $(abspath $(API_NODE_DIR))
HYPOTHESIS_DB_API_ABS := $(abspath $(HYPOTHESIS_DB_API))
REDOCLY_ABS := $(API_NODE_DIR_ABS)/node_modules/.bin/redocly
OPENAPI_GENERATOR_ABS := $(API_NODE_DIR_ABS)/node_modules/.bin/openapi-generator-cli
# ── Uvicorn runner (force import from src/; tolerate unset PYTHONPATH)
ifneq ($(strip $(API_FACTORY)),)
API_CMD ?= PYTHONPATH="$(APP_DIR)$${PYTHONPATH:+:$$PYTHONPATH}" \
$(VENV_PYTHON) -c 'import sys, importlib, uvicorn; \
sys.path.insert(0,"$(APP_DIR)"); \
m=importlib.import_module("$(API_MODULE)"); \
app=getattr(m,"$(API_FACTORY)")(); \
uvicorn.run(app, host="$(API_HOST)", port=$(API_PORT))'
else
API_CMD ?= PYTHONPATH="$(APP_DIR)$${PYTHONPATH:+:$$PYTHONPATH}" \
$(VENV_PYTHON) -m uvicorn --app-dir "$(APP_DIR)" \
$(API_MODULE):$(API_APP) --host $(API_HOST) --port $(API_PORT)
endif
# ── Macro: validate one schema (use ABS CLI paths, no cd)
define VALIDATE_ONE_SCHEMA
@mkdir -p "$(API_LINT_DIR_ABS)"
@b="$$(basename "$(1)")"; \
in_abs="$(abspath $(1))"; \
log="$(API_LINT_DIR_ABS)/$${b}.log"; \
echo "→ Validating: $(1)"; \
{ \
$(PRANCE) validate "$$in_abs"; \
$(OPENAPI_SPEC_VALIDATOR) "$$in_abs"; \
"$(REDOCLY_ABS)" lint "$$in_abs"; \
NODE_NO_WARNINGS=1 "$(OPENAPI_GENERATOR_ABS)" validate -i "$$in_abs"; \
} 2>&1 | tee "$$log"
endef
.PHONY: api api-install api-lint api-test api-serve api-serve-bg api-stop api-clean node_deps node_bootstrap
## Orchestrator
api: api-install api-lint api-test
# ── Install toolchain (Python + Node sandbox)
api-install: | $(VENV) node_deps
@echo "→ Installing API Python deps..."
@command -v curl >/dev/null || { echo "✘ curl not found"; exit 1; }
@command -v java >/dev/null || { echo "✘ java not found"; exit 1; }
@$(VENV_PYTHON) -m pip install --quiet --upgrade prance openapi-spec-validator uvicorn schemathesis
@echo "✔ API toolchain ready."
api-lint: | node_deps
@if [ -z "$(ALL_API_SCHEMAS)" ]; then echo "✘ No API schemas found under api/*.y*ml"; exit 1; fi
@echo "→ Linting OpenAPI specs..."
$(foreach s,$(ALL_API_SCHEMAS),$(call VALIDATE_ONE_SCHEMA,$(s)))
@[ -f ./openapitools.json ] && echo "→ Removing stray openapitools.json (root)" && rm -f ./openapitools.json || true
@echo "✔ All schemas validated. Logs → $(API_LINT_DIR_ABS)"
# ── Start server, wait for readiness, run Schemathesis (sandboxed Hypothesis DB), stop server
api-test: | $(VENV) node_deps
@if [ -z "$(ALL_API_SCHEMAS)" ]; then echo "✘ No API schemas found under api/*.y*ml"; exit 1; fi
@mkdir -p "$(API_ARTIFACTS_DIR_ABS)" "$(API_TEST_DIR_ABS)"
@echo "→ Starting API server"
@script="$(API_ARTIFACTS_DIR_ABS)/run_api_test.sh"; \
rm -f "$$script"; \
echo '#!/usr/bin/env bash' >> "$$script"; \
echo 'set -euo pipefail' >> "$$script"; \
echo 'echo "→ Starting API server"' >> "$$script"; \
echo '$(API_CMD) >"$(API_LOG_ABS)" 2>&1 & PID=$$!' >> "$$script"; \
echo 'echo $$PID >"$(API_ARTIFACTS_DIR_ABS)/server.pid"' >> "$$script"; \
echo 'cleanup(){ kill $$PID >/dev/null 2>&1 || true; wait $$PID >/dev/null 2>&1 || true; }' >> "$$script"; \
echo 'trap cleanup EXIT INT TERM' >> "$$script"; \
echo 'echo "→ Waiting up to $(API_WAIT_SECS)s for readiness @ $(SCHEMA_URL)$(HEALTH_PATH)"' >> "$$script"; \
echo 'READY=' >> "$$script"; \
echo 'for i in $$(seq 1 $(API_WAIT_SECS)); do' >> "$$script"; \
echo ' if curl -fsS "$(SCHEMA_URL)$(HEALTH_PATH)" >/dev/null 2>&1; then READY=1; break; fi' >> "$$script"; \
echo ' sleep 1' >> "$$script"; \
echo ' if ! kill -0 $$PID >/dev/null 2>&1; then echo "✘ API crashed — see $(API_LOG_ABS)"; exit 1; fi' >> "$$script"; \
echo 'done' >> "$$script"; \
echo 'if [ -z "$$READY" ]; then echo "✘ API did not become ready in $(API_WAIT_SECS)s — see $(API_LOG_ABS)"; exit 1; fi' >> "$$script"; \
echo 'BASE_FLAG=$$($(SCHEMATHESIS) run -h 2>&1 | grep -q " --url " && echo --url || echo --base-url)' >> "$$script"; \
echo 'STATEFUL_ARGS=""' >> "$$script"; \
echo 'if $(SCHEMATHESIS) run -h 2>&1 | grep -q " --stateful"; then STATEFUL_ARGS="--stateful=links"; else echo "↪︎ Schemathesis: --stateful not supported; skipping"; fi' >> "$$script"; \
echo 'LOG="$(API_TEST_DIR_ABS)/schemathesis.log"; : > "$$LOG"' >> "$$script"; \
echo 'BUF=""; command -v stdbuf >/dev/null 2>&1 && BUF="stdbuf -oL -eL"' >> "$$script"; \
echo 'TO=""' >> "$$script"; \
echo 'if [ "$(SCHEMATHESIS_TIMEOUT)" -gt 0 ] 2>/dev/null; then' >> "$$script"; \
echo ' if command -v gtimeout >/dev/null 2>&1; then TO="gtimeout --kill-after=10 $(SCHEMATHESIS_TIMEOUT)";' >> "$$script"; \
echo ' elif command -v timeout >/dev/null 2>&1; then TO="timeout --kill-after=10 $(SCHEMATHESIS_TIMEOUT)";' >> "$$script"; \
echo ' fi' >> "$$script"; \
echo 'fi' >> "$$script"; \
echo 'if [ -n "$$TO" ]; then echo "↪︎ Using timeout wrapper: $$TO"; else echo "↪︎ No timeout wrapper in use"; fi' >> "$$script"; \
echo 'echo "→ Running Schemathesis against: $(SCHEMA_URL)$(API_BASE_PATH)"' >> "$$script"; \
echo 'EXIT_CODE=0' >> "$$script"; \
echo 'SCHEMA_BIN="$(SCHEMATHESIS)"; case "$$SCHEMA_BIN" in /*) ;; *) SCHEMA_BIN="$$(pwd)/$$SCHEMA_BIN";; esac' >> "$$script"; \
echo 'tmpdir=$$(mktemp -d); trap "rm -rf $$tmpdir" EXIT; cd "$$tmpdir"' >> "$$script"; \
echo 'for schema in $(ALL_API_SCHEMAS_ABS); do' >> "$$script"; \
echo ' echo " • $$schema" | tee -a "$$LOG"' >> "$$script"; \
echo ' set +e' >> "$$script"; \
echo ' ( $$TO $$BUF "$$SCHEMA_BIN" run "$$schema" $$BASE_FLAG "$(SCHEMA_URL)$(API_BASE_PATH)" $(SCHEMATHESIS_OPTS) $$STATEFUL_ARGS 2>&1 || [ $$? -eq 124 ] ) | tee -a "$$LOG"' >> "$$script"; \
echo ' rc=$${PIPESTATUS[0]}' >> "$$script"; \
echo ' set -e' >> "$$script"; \
echo ' if [ $$rc -ne 0 ] && [ $$EXIT_CODE -eq 0 ]; then EXIT_CODE=$$rc; fi' >> "$$script"; \
echo 'done' >> "$$script"; \
echo 'echo "→ Stopping API server"' >> "$$script"; \
echo 'cleanup' >> "$$script"; \
echo 'if [ $$EXIT_CODE -ne 0 ]; then echo "✘ Schemathesis reported failures (exit $$EXIT_CODE)"; fi' >> "$$script"; \
echo 'exit $$EXIT_CODE' >> "$$script"; \
chmod +x "$$script"; "$$script"
@[ -f ./openapitools.json ] && echo "→ Removing stray openapitools.json (root)" && rm -f ./openapitools.json || true
@echo "✔ Schemathesis finished. Log → $(API_TEST_DIR_ABS)/schemathesis.log"
@[ -f "$(SCHEMATHESIS_JUNIT)" ] && echo " JUnit → $(SCHEMATHESIS_JUNIT)" || true
@[ -d .hypothesis ] && echo "→ Removing stray .hypothesis (root)" && rm -rf .hypothesis || true
# ── Dev helpers
api-serve: | $(VENV)
@mkdir -p "$(API_ARTIFACTS_DIR_ABS)"
@echo "→ Serving API (foreground) @ $(SCHEMA_URL) — logs → $(API_LOG_ABS)"
@$(API_CMD)
api-serve-bg: | $(VENV)
@mkdir -p "$(API_ARTIFACTS_DIR_ABS)"
@echo "→ Serving API (background) @ $(SCHEMA_URL) — logs → $(API_LOG_ABS)"
@$(API_CMD) >"$(API_LOG_ABS)" 2>&1 & echo $$! >"$(API_ARTIFACTS_DIR_ABS)/server.pid"
@echo "PID $$(cat "$(API_ARTIFACTS_DIR_ABS)/server.pid")"
api-stop:
@if [ -f "$(API_ARTIFACTS_DIR_ABS)/server.pid" ]; then \
PID=$$(cat "$(API_ARTIFACTS_DIR_ABS)/server.pid"); \
echo "→ Stopping PID $$PID"; \
kill $$PID >/dev/null 2>&1 || true; \
wait $$PID >/dev/null 2>&1 || true; \
rm -f "$(API_ARTIFACTS_DIR_ABS)/server.pid"; \
else \
echo "→ No server.pid found (nothing to stop)"; \
fi
# ── Node deps (sandboxed). No root pollution, no repo-level openapitools.json.
node_deps: $(API_NODE_DIR_ABS)/.deps-ok
$(API_NODE_DIR_ABS)/.deps-ok:
@mkdir -p "$(API_NODE_DIR_ABS)" "$(API_NODE_DIR_ABS)/.npm-cache"
@command -v npm >/dev/null || { echo "✘ npm not found"; exit 1; }
@echo "→ Bootstrapping Node toolchain in $(API_NODE_DIR_ABS)"
@cd "$(API_NODE_DIR_ABS)" && { test -f package.json || npm init -y >/dev/null; }
@echo "→ Resolving openapi-generator-cli version (requested: $(OPENAPI_GENERATOR_VERSION))"
@cd "$(API_NODE_DIR_ABS)" && { \
PKG="@openapitools/openapi-generator-cli@$(OPENAPI_GENERATOR_VERSION)"; \
if ! npm view "$$PKG" version >/dev/null 2>&1; then \
echo "↪︎ Requested version not on npm; using @openapitools/openapi-generator-cli@latest"; \
PKG="@openapitools/openapi-generator-cli@latest"; \
fi; \
echo "→ Installing CLI deps in $(API_NODE_DIR_ABS)"; \
NPM_CONFIG_CACHE="$(API_NODE_DIR_ABS)/.npm-cache" \
npm install --no-fund --no-audit --loglevel=info \
--save-dev --save-exact \
@redocly/cli "$$PKG" \
> npm-install.log 2>&1 \
|| { echo "✘ npm install failed — see $(API_NODE_DIR_ABS)/npm-install.log"; tail -n 200 npm-install.log; exit 1; }; \
RESOLVED_GEN_VER="$$(node -p "require('./node_modules/@openapitools/openapi-generator-cli/package.json').version")"; \
RESOLVED_REDOC_VER="$$(node -p "require('./node_modules/@redocly/cli/package.json').version")"; \
printf "openapi-generator-cli=%s\nredocly-cli=%s\n" "$$RESOLVED_GEN_VER" "$$RESOLVED_REDOC_VER" > tool-versions.txt; \
echo "→ Installed: openapi-generator-cli=$$RESOLVED_GEN_VER, redocly-cli=$$RESOLVED_REDOC_VER"; \
}
@test -x "$(REDOCLY_ABS)" || { echo "✘ redocly CLI not found in sandbox"; exit 1; }
@test -x "$(OPENAPI_GENERATOR_ABS)" || { echo "✘ openapi-generator-cli not found in sandbox"; exit 1; }
@touch "$@"
# ── Cleanup
api-clean:
@echo "→ Cleaning API artifacts"
@rm -rf "$(API_ARTIFACTS_DIR_ABS)" || true
@echo "✔ Done"
##@ API
api: ## Run full API workflow (install → lint → test with Schemathesis); artifacts in artifacts/api/**
api-install: ## Install API toolchain (Python deps + sandboxed Node deps)
api-lint: ## Validate all OpenAPI specs; logs to artifacts/api/lint/*.log
api-test: ## Start server, wait for /health, run Schemathesis; logs & JUnit to artifacts/api/**
api-serve: ## Serve API in the foreground (dev)
api-serve-bg: ## Serve API in the background; PID to artifacts/api/server.pid
api-stop: ## Stop background API (if running)
api-clean: ## Remove all API artifacts
**OpenAPI schema (`api/v1/schema.yaml`)**
openapi: 3.0.3
info:
title: Bijux CLI API
version: 1.0.0
description: |
OpenAPI schema for Bijux CLI item management.
Includes robust error handling, pagination, and industry-standard response objects.
Designed for SDK/code generation, documentation, monitoring, and security review.
contact:
name: Bijux CLI Support
url: https://bijux-cli.dev/support
email: mousavi.bijan@gmail.com
license:
name: MIT
url: https://opensource.org/licenses/MIT
tags:
- name: Items
description: Operations related to items
servers:
- url: http://127.0.0.1:8000/v1
description: Local development server
- url: https://api.bijux-cli.dev/v1
description: |
Production API server (planned; not live yet).
This URL is a placeholder. API endpoints are not currently deployed.
security: []
paths:
/items:
get:
tags: [Items]
summary: List items
description: |
Return a paginated list of items.
Use `limit` and `offset` for pagination.
operationId: listItems
parameters:
- name: limit
in: query
description: Maximum number of items to return (default 10, max 100)
required: false
schema:
type: integer
minimum: 1
maximum: 100
default: 10
example: 20
- name: offset
in: query
description: Offset from the start of the list (for pagination)
required: false
schema:
type: integer
minimum: 0
default: 0
example: 0
responses:
'200':
description: A paginated list of items
content:
application/json:
schema:
$ref: '#/components/schemas/ItemList'
examples:
success:
value:
items:
- id: 1
name: "Item One"
description: "Description one"
- id: 2
name: "Item Two"
description: "Description two"
total: 2
'406':
$ref: '#/components/responses/NotAcceptable'
'422':
$ref: '#/components/responses/ValidationError'
'500':
$ref: '#/components/responses/ServerError'
post:
tags: [Items]
summary: Create a new item
description: Create a new item with the provided data. If an item with the same name exists, returns the existing item (idempotent).
operationId: createItem
requestBody:
description: Data for the new item
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/ItemCreate'
examples:
minimalItem:
summary: A request with only the required fields
value:
name: "Minimal Item"
fullItem:
summary: A request with all optional fields included
value:
name: "Full Item"
description: "This item has a detailed description."
responses:
'201':
description: Item created successfully
content:
application/json:
schema:
$ref: '#/components/schemas/Item'
examples:
created:
value:
id: 3
name: "New Item"
description: "A new item description"
links:
GetCreatedItem:
operationId: getItemById
parameters:
item_id: '$response.body#/id'
UpdateCreatedItem:
operationId: updateItemById
parameters:
item_id: '$response.body#/id'
DeleteCreatedItem:
operationId: deleteItemById
parameters:
item_id: '$response.body#/id'
'200':
description: Item already exists (idempotent create); existing resource is returned
content:
application/json:
schema:
$ref: '#/components/schemas/Item'
examples:
existed:
value:
id: 3
name: "Full Item"
description: "This item has a detailed description."
'406':
$ref: '#/components/responses/NotAcceptable'
'409':
$ref: '#/components/responses/Conflict'
'422':
$ref: '#/components/responses/ValidationError'
'500':
$ref: '#/components/responses/ServerError'
/items/{item_id}:
parameters:
- name: item_id
in: path
required: true
description: Unique item identifier
schema:
type: integer
minimum: 1
example: 1
get:
tags: [Items]
summary: Get item by ID
description: Retrieve item details by unique ID.
operationId: getItemById
responses:
'200':
description: Item details
content:
application/json:
schema:
$ref: '#/components/schemas/Item'
examples:
found:
value:
id: 1
name: "Item One"
description: "Description one"
'404':
$ref: '#/components/responses/NotFound'
'406':
$ref: '#/components/responses/NotAcceptable'
'422':
$ref: '#/components/responses/ValidationError'
'500':
$ref: '#/components/responses/ServerError'
put:
tags: [Items]
summary: Update item by ID
description: Update an existing item with the provided data. Raises conflict if the new name is already in use by another item.
operationId: updateItemById
requestBody:
description: Updated data for the item
required: true
content:
application/json:
schema:
$ref: '#/components/schemas/ItemCreate'
examples:
valid1:
value:
name: "Alpha"
description: "ok"
valid2:
value:
name: "Beta"
update:
value:
name: "Updated Item"
description: "Updated description"
responses:
'200':
description: Item updated successfully
content:
application/json:
schema:
$ref: '#/components/schemas/Item'
examples:
updated:
value:
id: 1
name: "Updated Item"
description: "Updated description"
links:
GetUpdatedItem:
operationId: getItemById
parameters:
item_id: '$response.body#/id'
DeleteUpdatedItem:
operationId: deleteItemById
parameters:
item_id: '$response.body#/id'
'404':
$ref: '#/components/responses/NotFound'
'406':
$ref: '#/components/responses/NotAcceptable'
'409':
$ref: '#/components/responses/Conflict'
'422':
$ref: '#/components/responses/ValidationError'
'500':
$ref: '#/components/responses/ServerError'
delete:
tags: [Items]
summary: Delete item by ID
description: Delete an item by its unique ID. Returns 404 if the item does not exist.
operationId: deleteItemById
responses:
'204':
description: Item deleted successfully
'404':
$ref: '#/components/responses/NotFound'
'406':
$ref: '#/components/responses/NotAcceptable'
'422':
$ref: '#/components/responses/ValidationError'
'500':
$ref: '#/components/responses/ServerError'
components:
schemas:
ItemCreate:
type: object
additionalProperties: false
properties:
name:
type: string
description: Item name (cannot be empty or only whitespace; trimmed on input).
minLength: 1
maxLength: 100
pattern: '.*\S.*'
description:
type: string
description: Optional item description
maxLength: 500
nullable: true
required: [name]
Item:
type: object
properties:
id:
type: integer
description: Item ID
example: 1
name:
type: string
description: Item name
example: "Item One"
description:
type: string
description: Optional item description
nullable: true
example: "Description one"
required: [id, name]
ItemList:
type: object
properties:
items:
type: array
items:
$ref: '#/components/schemas/Item'
total:
type: integer
description: Total items available
example: 2
required: [items, total]
Problem:
type: object
description: Error object in RFC7807 format
properties:
type:
type: string
description: Error type URI
example: "https://bijux-cli.dev/docs/errors/invalid-request"
title:
type: string
description: Short, human-readable summary
example: "Invalid request"
status:
type: integer
description: HTTP status code
example: 400
detail:
type: string
description: Detailed explanation
example: "Query parameter 'limit' must be between 1 and 100"
instance:
type: string
description: URI of the request/instance
example: "/items?limit=1000"
responses:
NotFound:
description: Item not found
content:
application/json:
schema:
$ref: '#/components/schemas/Problem'
examples:
notFound:
value:
type: "https://bijux-cli.dev/docs/errors/not-found"
title: "Not found"
status: 404
detail: "Item not found"
instance: "/items/99"
NotAcceptable:
description: Client must accept application/json
content:
application/json:
schema:
$ref: '#/components/schemas/Problem'
examples:
notAcceptable:
value:
type: "https://bijux-cli.dev/docs/errors/not-acceptable"
title: "Not Acceptable"
status: 406
detail: "Set 'Accept: application/json' for this endpoint"
instance: "/items/1"
Conflict:
description: Conflict (e.g., duplicate name)
content:
application/json:
schema:
$ref: '#/components/schemas/Problem'
examples:
conflict:
value:
type: "https://bijux-cli.dev/docs/errors/conflict"
title: "Conflict"
status: 409
detail: "Item with this name already exists"
instance: "/items"
ValidationError:
description: Unprocessable entity (validation failed)
content:
application/json:
schema:
$ref: '#/components/schemas/Problem'
examples:
validation:
value:
type: "https://bijux-cli.dev/docs/errors/validation-error"
title: "Validation error"
status: 422
detail: "Input validation failed"
instance: "/items"
ServerError:
description: Internal server error
content:
application/json:
schema:
$ref: '#/components/schemas/Problem'
examples:
serverError:
value:
type: "https://bijux-cli.dev/docs/errors/internal-server-error"
title: "Internal server error"
status: 500
detail: "An unexpected error occurred"
instance: "/items"
Artifacts: Browse generated API outputs under API Artifacts.
Packaging & Releases¶
- Build backend: hatch
- Versioning: hatch-vcs (from git tags)
- PyPI long description: hatch-fancy-pypi-readme
- Changelog: Towncrier (fragments in `changelog.d/`)
- Conventional Commits: Commitizen
- Publishing: GitHub Actions `publish.yml` → `make publish`
Hatch/metadata (`pyproject.toml`)
[tool.hatch.build]
include = [
"README.md",
"CHANGELOG.md",
"LICENSE",
"LICENSES/**",
"REUSE.toml",
"CITATION.cff",
"src/bijux_cli/py.typed",
]
[tool.hatch.version]
source = "vcs"
[tool.hatch.build.targets.wheel]
packages = ["src/bijux_cli"]
zip-safe = false
[tool.hatch.build.targets.wheel.package-data]
"bijux_cli" = ["py.typed"]
[tool.hatch.metadata.hooks.fancy-pypi-readme]
content-type = "text/markdown"
fragments = [
{ path = "README.md" },
{ path = "CHANGELOG.md" },
]
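Because hatch-vcs derives the version from git tags, no version string lives in `pyproject.toml`. A quick way to confirm what version was derived, assuming the package is installed (e.g. after `make install`); the distribution name `bijux-cli` is taken from this repo:

# Hedged sketch: read the hatch-vcs-derived version of the installed dist
from importlib.metadata import PackageNotFoundError, version

try:
    print(version("bijux-cli"))  # e.g. "1.2.3" after tagging v1.2.3
except PackageNotFoundError:
    print("bijux-cli is not installed; run `make install` first")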
Git Hooks & Commit Hygiene¶
- pre-commit hooks (see `.pre-commit-config.yaml`)
- Conventional Commits enforced (Commitizen)
- Auto-fragment creator: `scripts/git-hooks/prepare-commit-msg`
- Guard requiring a fragment: `scripts/check-towncrier-fragment.sh`
pre-commit config (`.pre-commit-config.yaml`)
ci:
autofix_prs: true
autoupdate_schedule: monthly
fail_fast: true
minimum_pre_commit_version: "3.7.0"
default_language_version:
python: python3.11
default_install_hook_types: [pre-commit, pre-push, commit-msg]
exclude: |
(?x)^(
node_modules/|
\.venv/|
dist/|
build/|
artifacts/|
site/|
plugin_template/|
.*/__pycache__/|
.*\.min\.(js|css)$
)
repos:
# --- Core hygiene
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
- id: check-ast
- id: check-merge-conflict
- id: check-case-conflict
- id: check-symlinks
- id: detect-private-key
- id: detect-aws-credentials
args: ["--allow-missing-credentials"]
- id: forbid-new-submodules
- id: end-of-file-fixer
exclude: |
(?x)^(
tests/(e2e|integration|functional)/test_fixtures/|
tests/.*/snapshots/
)
- id: trailing-whitespace
exclude: |
(?x)^(
tests/(e2e|integration|functional)/test_fixtures/|
tests/.*/snapshots/
)
- id: mixed-line-ending
args: ["--fix=lf"]
exclude: |
(?x)^(
tests/(e2e|integration|functional)/test_fixtures/|
tests/.*/snapshots/
)
- id: check-yaml
args: ["--allow-multiple-documents", "--unsafe"]
- id: check-json
files: \.json$
exclude: |
(?x)^(
tests/(e2e|integration|functional)/test_fixtures/.*\.json$
| tests/.*/snapshots/.*\.json$
)
- id: check-toml
- id: check-added-large-files
args: ["--maxkb=300"]
- id: debug-statements
- id: pretty-format-json
files: \.json$
exclude: |
(?x)^(
tests/(e2e|integration|functional)/test_fixtures/.*\.json$
)
args: ["--autofix", "--no-sort-keys"]
# --- Ruff (src/ and tests/)
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.6.8
hooks:
- id: ruff-format
types: [python]
files: ^(src/|tests/)
args: ["--config", "config/ruff.toml"]
stages: [pre-commit]
- id: ruff
types: [python]
files: ^(src/|tests/)
args: ["--fix", "--config", "config/ruff.toml"]
stages: [pre-commit]
# --- Markdown formatting
- repo: https://github.com/executablebooks/mdformat
rev: 0.7.17
hooks:
- id: mdformat
additional_dependencies: [mdformat-gfm, mdformat-frontmatter]
files: ^(docs/|README\.md|CONTRIBUTING\.md|USAGE\.md|SECURITY\.md|CODE_OF_CONDUCT\.md)$
args: ["--wrap", "88"]
# --- Spelling
- repo: https://github.com/codespell-project/codespell
rev: v2.3.0
hooks:
- id: codespell
args: ["-I", "config/bijux.dic"]
exclude: "^CHANGELOG\\.md$"
# --- Conventional commits
- repo: local
hooks:
- id: no-commit-to-main
name: prevent direct commits to main
entry: bash -c 'branch="$(git rev-parse --abbrev-ref HEAD)"; test "$branch" != "main" || { echo "Do not commit to main"; exit 1; }'
language: system
stages: [pre-commit]
- id: commitizen-check
name: commitizen check message
entry: cz check --commit-msg-file
language: system
stages: [commit-msg]
pass_filenames: true
# --- Heavy checks via Makefile gate
- repo: local
hooks:
- id: gate-pre-push
name: project gate (make pre-push)
entry: make pre-push
language: system
pass_filenames: false
stages: [pre-push]
# --- Towncrier
- repo: local
hooks:
- id: towncrier-fragment-required
name: towncrier - require fragment for user-facing commits
language: script
stages: [ commit-msg ]
entry: scripts/check-towncrier-fragment.sh
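With `default_install_hook_types` declared above, a plain `pre-commit install` should wire all three stages at once. The explicit equivalent, as a hedged sketch:

# Hedged sketch: install the three hook types the config declares
import subprocess

for hook_type in ("pre-commit", "pre-push", "commit-msg"):
    subprocess.run(["pre-commit", "install", "--hook-type", hook_type], check=True)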
prepare-commit-msg hook (`scripts/git-hooks/prepare-commit-msg`)
#!/usr/bin/env bash
# Automatically create a changelog fragment based on commit message
MSG_FILE="$1"
TYPE=$(awk 'NR==1{print $1}' "$MSG_FILE" | sed -E 's/(\(.*\))?!?:.*$//')  # first word of the subject line, with any scope and ':' stripped
DESC=$(awk -F': ' 'NR==1{print $2}' "$MSG_FILE")
# Map commit type to Towncrier fragment type
case "$TYPE" in
feat) KIND="feature" ;;
fix) KIND="bugfix" ;;
chore|refactor|docs|style) KIND="misc" ;;
*) KIND="misc" ;;
esac
if [ -n "$DESC" ]; then
ID=$(date +%s)
mkdir -p changelog.d
echo "$DESC" > "changelog.d/${ID}.${KIND}.md"
fi
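The type-to-fragment mapping is easy to unit-test in isolation; a minimal Python sketch of the same logic (the function name is illustrative, not part of the repo):

# Hedged sketch mirroring the hook's commit-type → Towncrier-kind mapping
def fragment_kind(commit_type: str) -> str:
    return {
        "feat": "feature",
        "fix": "bugfix",
    }.get(commit_type, "misc")  # chore/refactor/docs/style and anything else → misc

assert fragment_kind("feat") == "feature"
assert fragment_kind("fix") == "bugfix"
assert fragment_kind("docs") == "misc"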
Towncrier fragment guard (`scripts/check-towncrier-fragment.sh`)
#!/usr/bin/env bash
set -euo pipefail
if [ -z "${1:-}" ]; then
echo "Error: commit-msg hook did not receive filename argument." >&2
exit 1
fi
msg_file="$1"
subject=$(head -n1 "$msg_file")
if [ -t 2 ]; then
RED=$(tput setaf 1); YELLOW=$(tput setaf 3); BOLD=$(tput bold); RESET=$(tput sgr0)
else
RED=""; YELLOW=""; BOLD=""; RESET=""
fi
if [[ "$subject" =~ ^(Merge|Revert|fixup!|squash!)\ ]]; then exit 0; fi
if [[ "$subject" =~ ^chore(\(.*\))?:\ release ]]; then exit 0; fi
staged_files=$(git diff-index --cached --name-only HEAD)
if [ -z "$staged_files" ]; then exit 0; fi
if [ "$staged_files" == "CHANGELOG.md" ]; then exit 0; fi
if [ "${TOWNCRIER_ALLOW_SKIP:-0}" = "1" ]; then exit 0; fi
type=$(echo "$subject" | sed -nE 's/^(feat|fix|refactor|perf|docs)(\(.*\))?!?:.*/\1/p')
if [ -z "$type" ]; then
exit 0
fi
if echo "$staged_files" | grep -q 'changelog.d/.*\.md$'; then
exit 0
fi
echo "${RED}✘ Commit type '${BOLD}${type}${RESET}${RED}' requires a Towncrier fragment under changelog.d/${RESET}" >&2
echo " ${YELLOW}Please add a fragment, for example: 'changelog.d/123.${type}.md'${RESET}" >&2
echo " (To override this check in rare cases, run: ${BOLD}TOWNCRIER_ALLOW_SKIP=1 git commit ...${RESET})" >&2
exit 1
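The guard's decision boils down to one predicate; a hedged Python rendering of it, useful for testing the rules without committing (names are illustrative, and the `TOWNCRIER_ALLOW_SKIP` escape hatch is omitted):

# Hedged sketch: does this commit subject require a changelog fragment?
import re

USER_FACING = re.compile(r"^(feat|fix|refactor|perf|docs)(\(.*\))?!?:")
EXEMPT = re.compile(r"^(Merge|Revert|fixup!|squash!) |^chore(\(.*\))?: release")

def needs_fragment(subject: str, staged: list[str]) -> bool:
    if EXEMPT.search(subject) or not staged or staged == ["CHANGELOG.md"]:
        return False
    if not USER_FACING.match(subject):
        return False  # non-user-facing commit types pass without a fragment
    return not any(re.search(r"changelog\.d/.*\.md$", f) for f in staged)

assert needs_fragment("feat: add export", ["src/x.py"])
assert not needs_fragment("feat: add export", ["src/x.py", "changelog.d/1.feature.md"])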
CI Orchestration¶
- GitHub Actions: main CI, docs deploy, and publish pipelines.
- tox mirrors Make targets for matrix runs.
- Makefile is modularized under `makefiles/*.mk`.
Makefile (entrypoint) (`Makefile`)
# SPDX-License-Identifier: MIT
# Copyright © 2025 Bijan Mousavi
# Core Config
.DELETE_ON_ERROR:
.DEFAULT_GOAL := all
.SHELLFLAGS := -eu -o pipefail -c
SHELL := bash
PYTHON := python3
VENV := .venv
VENV_PYTHON := $(VENV)/bin/python
ACT := $(VENV)/bin
RM := rm -rf
.NOTPARALLEL: all clean
# Modular Includes
include makefiles/api.mk
include makefiles/build.mk
include makefiles/changelog.mk
include makefiles/citation.mk
include makefiles/dictionary.mk
include makefiles/docs.mk
include makefiles/lint.mk
include makefiles/mutation.mk
include makefiles/quality.mk
include makefiles/sbom.mk
include makefiles/security.mk
include makefiles/test.mk
include makefiles/publish.mk
include makefiles/hooks.mk
# Environment
$(VENV):
@echo "→ Creating virtualenv with '$$(which $(PYTHON))' ..."
@$(PYTHON) -m venv $(VENV)
install: $(VENV)
@echo "→ Installing dependencies..."
@$(VENV_PYTHON) -m pip install --upgrade pip setuptools wheel
@$(VENV_PYTHON) -m pip install -e ".[dev]"
bootstrap: $(VENV) install-git-hooks
.PHONY: bootstrap
# Cleanup
clean:
@$(MAKE) clean-soft
@echo "→ Cleaning (.venv) ..."
@$(RM) $(VENV)
clean-soft:
@echo "→ Cleaning (no .venv) ..."
@$(RM) \
.pytest_cache htmlcov coverage.xml dist build *.egg-info .tox demo .tmp_home \
.ruff_cache .mypy_cache .pytype .hypothesis .coverage.* .coverage .benchmarks \
spec.json openapitools.json node_modules .mutmut-cache session.sqlite site \
docs/reference artifacts usage_test usage_test_artifacts citation.bib .cache || true
@if [ "$(OS)" != "Windows_NT" ]; then \
find . -type d -name '__pycache__' -exec $(RM) {} +; \
fi
# Pipelines
all: clean install test lint quality security api docs build sbom citation
@echo "✔ All targets completed"
# Run independent checks in parallel; `.NOTPARALLEL` stays scoped to
# `all`/`clean` above — a bare `.NOTPARALLEL:` would force the whole run,
# including the -j4 sub-make below, to execute serially.
lint quality security api docs: | bootstrap
all-parallel: clean install
@$(MAKE) -j4 quality security api docs
@$(MAKE) build sbom citation
@echo "✔ All targets completed (parallel mode)"
# Pre-push gate (wired to the pre-commit 'gate-pre-push' hook)
pre-push:
@$(PYTEST) -q -m "not e2e and not slow"
@$(MAKE) quality
@$(MAKE) security
@$(MAKE) api
@$(MAKE) docs
@$(MAKE) changelog-check
@echo "✔ pre-push gate passed"
.PHONY: pre-push
# Utilities
define run_tool
printf "→ %s %s\n" "$(1)" "$$file"; \
OUT=`$(2) "$$file" 2>&1`; \
if [ $$? -eq 0 ]; then \
printf " ✔ %s OK\n" "$(1)"; \
else \
printf " ✘ %s failed:\n" "$(1)"; \
printf "%s\n" "$$OUT" | head -10; \
fi
endef
define read_pyproject_version
$(strip $(shell \
python3 -c 'import tomllib; \
print(tomllib.load(open("pyproject.toml","rb"))["project"]["version"])' \
2>/dev/null || echo 0.0.0 \
))
endef
help:
@awk 'BEGIN{FS=":.*##"; OFS="";} \
/^##@/ {gsub(/^##@ */,""); print "\n\033[1m" $$0 "\033[0m"; next} \
/^[a-zA-Z0-9_.-]+:.*##/ {printf " \033[36m%-20s\033[0m %s\n", $$1, $$2}' \
$(MAKEFILE_LIST)
.PHONY: help
##@ Core
clean: ## Remove virtualenv, caches, build, and artifacts
clean-soft: ## Remove build artifacts but keep .venv
install: ## Install project in editable mode into .venv
bootstrap: ## Setup environment & install git hooks
all: ## Run full pipeline (clean → citation)
all-parallel: ## Run pipeline with parallelized lint, quality, security, api, and docs
pre-push: ## Run pre-push gate: tests, quality, security, API, docs, changelog-check
help: ## Show this help
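The `help` target scrapes the `##` / `##@` annotations shown above. The same scrape in Python, as a hedged sketch — handy when debugging why a target is missing from `make help`:

# Hedged sketch: replicate the awk in `make help` over the Makefile sources
import re
import sys
from pathlib import Path

SECTION = re.compile(r"^##@ *(.*)")
TARGET = re.compile(r"^([a-zA-Z0-9_.-]+):.*## *(.*)")

for path in sys.argv[1:] or ["Makefile"]:
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        if m := SECTION.match(line):
            print(f"\n{m.group(1)}")          # section header
        elif m := TARGET.match(line):
            print(f"  {m.group(1):<20} {m.group(2)}")  # target + description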
tox config (`tox.ini`)
[tox]
minversion = 4.11
requires = tox>=4.11, tox-gh-actions>=3.1
isolated_build = true
skip_missing_interpreters = true
envlist = py311, py312, py313, lint, quality, security, api, docs, build, sbom, citation
[gh-actions]
python =
3.11: py311, lint, quality, security, api, docs, build, sbom, citation
3.12: py312
3.13: py313
[testenv]
package = editable
extras = dev
allowlist_externals = make
setenv =
MAKEFLAGS = VENV={envdir} ACT={envdir}/bin
PIP_DISABLE_PIP_VERSION_CHECK = 1
PYTHONDONTWRITEBYTECODE = 1
passenv =
CI
GITHUB_*
GITLAB_*
PIP_INDEX_URL
PIP_EXTRA_INDEX_URL
HTTP_PROXY
HTTPS_PROXY
NO_PROXY
SSH_AUTH_SOCK
LC_ALL
LANG
TC_BASE
SBOM_CLI
OPENAPI_GENERATOR_VERSION
API_HOST
API_PORT
API_BASE_PATH
API_FACTORY
commands =
make test-unit
[testenv:lint]
basepython = python3.11
commands = make lint
[testenv:quality]
basepython = python3.11
commands = make quality
[testenv:security]
basepython = python3.11
commands = make security
[testenv:api]
basepython = python3.11
commands = make api
[testenv:docs]
basepython = python3.11
commands = make docs
[testenv:build]
basepython = python3.11
commands = make build
[testenv:sbom]
basepython = python3.11
commands = make sbom
[testenv:citation]
basepython = python3.11
commands = make citation
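The `[gh-actions]` table is what keeps the CI matrix lean: the 3.12 and 3.13 runners only run their own test env, while the 3.11 runner also owns every tool env. A hedged sketch of that fan-out, restated as data:

# Hedged sketch of tox-gh-actions' python-version → env fan-out above
FANOUT = {
    "3.11": ["py311", "lint", "quality", "security", "api",
             "docs", "build", "sbom", "citation"],
    "3.12": ["py312"],
    "3.13": ["py313"],
}

def envs_for(python_version: str) -> list[str]:
    return FANOUT.get(python_version, [])

assert envs_for("3.12") == ["py312"]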
CI workflow: main (`.github/workflows/ci.yml`)
name: CI
on:
push:
branches: [main]
tags: ['v*']
pull_request:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions:
contents: read
env:
PIP_DISABLE_PIP_VERSION_CHECK: "1"
PYTHONUNBUFFERED: "1"
jobs:
# 1) Test matrix (3.11, 3.12, 3.13) — parallel
tests:
name: tests (${{ matrix.toxenv }})
runs-on: ubuntu-latest
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
include:
- python-version: "3.11"
toxenv: "py311"
- python-version: "3.12"
toxenv: "py312"
- python-version: "3.13"
toxenv: "py313"
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: ${{ matrix['python-version'] }}
cache: pip
cache-dependency-path: |
pyproject.toml
requirements/**/*.txt
- run: python -m pip install -U pip tox
- run: tox -vv -e ${{ matrix.toxenv }}
# Evidence collection (per-Python)
- name: Stage test artifacts
if: ${{ always() }}
shell: bash
run: |
set -euo pipefail
rm -rf artifacts/test_upload && mkdir -p artifacts/test_upload
if [ -d artifacts/test ]; then
find artifacts/test -type f \
\( -name 'junit*.xml' -o -name 'pytest*.xml' -o -name 'report.xml' -o -name 'coverage.xml' \) \
-not -path 'artifacts/test/tmp/*' \
-print -exec cp --parents -t artifacts/test_upload {} +
if [ -d artifacts/test/htmlcov ]; then
mkdir -p artifacts/test_upload/htmlcov
cp -r artifacts/test/htmlcov/. artifacts/test_upload/htmlcov/
fi
fi
if [ -d .tox ]; then
find .tox -type f \
\( -name 'junit*.xml' -o -name 'pytest*.xml' -o -name 'report.xml' -o -name 'coverage.xml' \) \
-print -exec cp --parents -t artifacts/test_upload {} + || true
fi
- name: Upload test artifacts
if: ${{ always() }}
uses: actions/upload-artifact@v4
with:
name: test-${{ matrix.toxenv }}
path: artifacts/test_upload/**
if-no-files-found: ignore
retention-days: 14
- name: Upload coverage (direct)
if: ${{ always() }}
uses: actions/upload-artifact@v4
with:
name: coverage-${{ matrix.toxenv }}
path: artifacts/test/coverage.xml
if-no-files-found: warn
retention-days: 14
# 2) Non-lint checks — parallel; pinned to 3.11
checks:
name: ${{ matrix.env }}
runs-on: ubuntu-latest
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
env: [quality, security, docs, build, api, sbom, citation]
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.11"
cache: pip
cache-dependency-path: |
pyproject.toml
requirements/**/*.txt
# Tooling only where needed
- uses: actions/setup-node@v4
if: ${{ matrix.env == 'api' }}
with:
node-version: "20"
cache: "npm"
- uses: actions/setup-java@v4
if: ${{ matrix.env == 'api' }}
with:
distribution: "temurin"
java-version: "17"
- run: python -m pip install -U pip tox
- run: tox -vv -e ${{ matrix.env }}
# --- QUALITY ---
- name: Upload quality artifacts
if: ${{ always() && matrix.env == 'quality' }}
uses: actions/upload-artifact@v4
with:
name: quality
path: artifacts/quality/**
if-no-files-found: warn
retention-days: 14
# --- SECURITY ---
- name: Upload security artifacts
if: ${{ always() && matrix.env == 'security' }}
uses: actions/upload-artifact@v4
with:
name: security
path: artifacts/security/**
if-no-files-found: warn
retention-days: 14
# --- DOCS (source + site produced by make docs) ---
- name: Upload docs artifacts
if: ${{ always() && matrix.env == 'docs' }}
uses: actions/upload-artifact@v4
with:
name: docs
path: artifacts/docs/**
if-no-files-found: warn
retention-days: 14
# --- DIST from the 'build' env (artifacts/build) ---
- name: Validate artifacts/build on tag
if: ${{ startsWith(github.ref, 'refs/tags/v') && matrix.env == 'build' }}
shell: bash
run: |
test -d artifacts/build && compgen -G "artifacts/build/*" >/dev/null \
|| { echo "::error::artifacts/build is missing or empty"; exit 1; }
- name: Upload dist artifacts
if: ${{ matrix.env == 'build' && hashFiles('artifacts/build/**') != '' }}
uses: actions/upload-artifact@v4
with:
name: dist
path: artifacts/build/**
if-no-files-found: error
retention-days: 14
# --- Evidence (optional, for releases) ---
- name: Upload evidence
if: ${{ matrix.env == 'build' && hashFiles('artifacts/evidence/**') != '' }}
uses: actions/upload-artifact@v4
with:
name: evidence
path: artifacts/evidence/**
if-no-files-found: ignore
retention-days: 30
# --- SBOM ---
- name: Upload SBOM
if: ${{ matrix.env == 'sbom' && hashFiles('artifacts/sbom/**') != '' }}
uses: actions/upload-artifact@v4
with:
name: sbom
path: artifacts/sbom/**
if-no-files-found: ignore
retention-days: 30
# --- Citation ---
- name: Upload citation
if: ${{ matrix.env == 'citation' && hashFiles('artifacts/citation/**') != '' }}
uses: actions/upload-artifact@v4
with:
name: citation
path: artifacts/citation/**
if-no-files-found: ignore
retention-days: 14
# --- API (full logs) ---
- name: Upload API artifacts (full)
if: ${{ always() && matrix.env == 'api' }}
uses: actions/upload-artifact@v4
with:
name: api-full
path: artifacts/api/**
if-no-files-found: warn
retention-days: 14
# --- API schema (release layout expected by 'Publish to PyPI' / Releases) ---
- name: Stage API schema (exact release layout)
if: ${{ matrix.env == 'api' && hashFiles('artifacts/api/v1/schema.yaml') != '' }}
shell: bash
run: |
set -euo pipefail
rm -rf _release_api && mkdir -p _release_api/v1
cp -f artifacts/api/v1/schema.yaml _release_api/v1/schema.yaml
- name: Upload API schema (release layout)
if: ${{ matrix.env == 'api' && hashFiles('artifacts/api/v1/schema.yaml') != '' }}
uses: actions/upload-artifact@v4
with:
name: api
path: _release_api/v1
if-no-files-found: error
retention-days: 14
# 3) Lint — ONLY if all tests + checks pass
lint:
name: lint
runs-on: ubuntu-latest
needs: [tests, checks]
timeout-minutes: 45
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.11"
cache: pip
cache-dependency-path: |
pyproject.toml
requirements/**/*.txt
- run: python -m pip install -U pip tox
- run: tox -vv -e lint
- name: Upload lint artifacts
if: ${{ always() }}
uses: actions/upload-artifact@v4
with:
name: lint
path: artifacts/lint/**
if-no-files-found: ignore
retention-days: 14
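The "Stage test artifacts" step is essentially `cp --parents` over a filename whitelist. The same path-preserving copy, as a hedged Python sketch:

# Hedged sketch: copy report files into artifacts/test_upload, keeping paths
import shutil
from pathlib import Path

PATTERNS = ("junit*.xml", "pytest*.xml", "report.xml", "coverage.xml")
DEST = Path("artifacts/test_upload")

for root in (Path("artifacts/test"), Path(".tox")):
    if not root.is_dir():
        continue
    for pattern in PATTERNS:
        for f in root.rglob(pattern):
            if root.name == "test" and "tmp" in f.parts:
                continue  # mirrors the -not -path 'artifacts/test/tmp/*' exclusion
            target = DEST / f  # re-root under DEST, preserving the source path
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)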
CI workflow: publish (`.github/workflows/publish.yml`)
name: Publish to PyPI
on:
push:
tags: ["v*"]
workflow_dispatch:
permissions:
contents: read
actions: read
concurrency:
group: publish-${{ github.sha }}
cancel-in-progress: true
jobs:
publish:
# Only run on tag push or manual dispatch
if: ${{ github.event_name == 'push' || github.event_name == 'workflow_dispatch' }}
runs-on: ubuntu-latest
outputs:
publish: ${{ steps.decide.outputs.publish }}
tag: ${{ steps.decide.outputs.tag }}
version: ${{ steps.decide.outputs.version }}
ci_run_id: ${{ steps.ci.outputs.run_id }}
pypi_ready: ${{ steps.pypi_ready.outputs.ok }}
steps:
- name: Determine target SHA
id: sha
run: echo "value=${{ github.sha }}" >> "$GITHUB_OUTPUT"
- name: Checkout (tags + history)
uses: actions/checkout@v4
with:
fetch-depth: 0
ref: ${{ steps.sha.outputs.value }}
- name: Ensure tags present
run: git fetch --tags --force --prune
- name: Decide publish (SemVer tag on this SHA?)
id: decide
shell: bash
run: |
set -euo pipefail
SHA="${{ steps.sha.outputs.value }}"
TAGS=$(git tag --points-at "$SHA" | grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' || true)
if [ -z "$TAGS" ] && [ "${{ github.event_name }}" = "push" ]; then
if [[ "${GITHUB_REF_NAME}" =~ ^v[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
TAGS="${GITHUB_REF_NAME}"
fi
fi
if [ -z "$TAGS" ]; then
echo "publish=false" >> "$GITHUB_OUTPUT"
exit 0
fi
TAG=$(echo "$TAGS" | head -n1)
VERSION=${TAG#v}
{
echo "publish=true"
echo "tag=$TAG"
echo "version=$VERSION"
} >> "$GITHUB_OUTPUT"
- name: Wait for CI success & capture run id
if: steps.decide.outputs.publish == 'true'
id: ci
uses: actions/github-script@v7
with:
script: |
const { owner, repo } = context.repo;
const sha = context.sha;
const timeoutMs = 20 * 60 * 1000, pollMs = 15000;
const deadline = Date.now() + timeoutMs;
async function latestCI() {
const res = await github.rest.actions.listWorkflowRunsForRepo({
owner, repo, head_sha: sha, per_page: 100
});
return (res.data.workflow_runs || [])
.filter(r => r.name === 'CI')
.sort((a,b) => new Date(b.run_started_at) - new Date(a.run_started_at))[0];
}
while (true) {
const run = await latestCI();
if (run && run.status === 'completed') {
if (run.conclusion === 'success') {
core.setOutput('run_id', String(run.id));
return;
}
core.setFailed(`CI concluded: ${run.conclusion}`);
return;
}
if (Date.now() > deadline) { core.setFailed('CI wait timeout'); return; }
await new Promise(r => setTimeout(r, pollMs));
}
- name: Check if version already on PyPI
if: steps.decide.outputs.publish == 'true'
id: precheck_pypi
shell: bash
run: |
VERSION="${{ steps.decide.outputs.version }}"
STATUS=$(curl -s -o /dev/null -w "%{http_code}" "https://pypi.org/pypi/bijux-cli/$VERSION/json" || true)
if [ "$STATUS" = "200" ]; then
echo "exists=true" >> "$GITHUB_OUTPUT"
else
echo "exists=false" >> "$GITHUB_OUTPUT"
fi
- name: Preflight — ensure PyPI token is set
if: steps.decide.outputs.publish == 'true'
shell: bash
run: |
if [ -z "${{ secrets.PYPI_API_TOKEN }}" ]; then
echo "::error::Missing PYPI_API_TOKEN secret"
exit 1
fi
- name: Download dist from CI artifacts
if: steps.decide.outputs.publish == 'true' && steps.precheck_pypi.outputs.exists == 'false'
env:
GH_TOKEN: ${{ github.token }}
shell: bash
run: |
set -euo pipefail
mkdir -p artifacts/build
gh run download "${{ steps.ci.outputs.run_id }}" -n dist -D artifacts/build || true
ls -la artifacts/build || true
- name: Set up Python (fallback build)
if: steps.decide.outputs.publish == 'true' && steps.precheck_pypi.outputs.exists == 'false'
uses: actions/setup-python@v5
with:
python-version: "3.11"
- name: Fallback build (only if dist missing)
if: steps.decide.outputs.publish == 'true' && steps.precheck_pypi.outputs.exists == 'false' && hashFiles('artifacts/build/**') == ''
shell: bash
run: |
set -euo pipefail
python -m pip install -U pip build
python -m build --outdir artifacts/build
# Fail if still no dist
if [ ! -d artifacts/build ] || ! compgen -G "artifacts/build/*" >/dev/null; then
echo "::error::No dist artifacts available after fallback build"
exit 1
fi
- name: Publish to PyPI
if: steps.decide.outputs.publish == 'true' && steps.precheck_pypi.outputs.exists == 'false'
uses: pypa/gh-action-pypi-publish@release/v1
with:
password: ${{ secrets.PYPI_API_TOKEN }}
packages-dir: artifacts/build
skip-existing: true
verbose: true
- name: Verify version is visible on PyPI
if: steps.decide.outputs.publish == 'true'
id: pypi_ready
shell: bash
run: |
set -euo pipefail
V="${{ steps.decide.outputs.version }}"
# If it already existed, we're good.
if [ "${{ steps.precheck_pypi.outputs.exists }}" = "true" ]; then
echo "ok=true" >> "$GITHUB_OUTPUT"
exit 0
fi
# Otherwise, poll for a short time after upload.
for i in {1..12}; do
code=$(curl -s -o /dev/null -w "%{http_code}" "https://pypi.org/pypi/bijux-cli/$V/json" || true)
if [ "$code" = "200" ]; then
echo "ok=true" >> "$GITHUB_OUTPUT"
exit 0
fi
sleep 5
done
echo "ok=false" >> "$GITHUB_OUTPUT"
echo "::warning::Version $V not visible on PyPI yet"
create-release:
name: Create GitHub Release
needs: publish
if: needs.publish.outputs.publish == 'true' && needs.publish.outputs.pypi_ready == 'true'
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Download all CI artifacts
env:
GH_TOKEN: ${{ github.token }}
shell: bash
run: |
set -euo pipefail
mkdir -p artifacts
gh run download "${{ needs.publish.outputs.ci_run_id }}" \
--repo "${{ github.repository }}" \
-D artifacts
echo "--- Downloaded artifacts layout ---"
find artifacts -maxdepth 2 -type d -print | sort || true
echo "-----------------------------------"
- name: Package curated bundles (zip)
shell: bash
run: |
set -euo pipefail
shopt -s nullglob
mkdir -p _release
# Tests: create test-<env>.zip for each env (e.g., artifacts/test-py311)
for d in artifacts/test-*; do
[ -d "$d" ] || continue
env="${d##*/test-}"
zip -r -q "_release/test-${env}.zip" "$d"
done
# Single-dir bundles
for name in lint quality security api docs sbom citation; do
if [ -d "artifacts/${name}" ]; then
zip -r -q "_release/${name}.zip" "artifacts/${name}"
fi
done
# Build bundle from dist (keep raw dist files as well)
if [ -d "artifacts/dist" ] && compgen -G "artifacts/dist/*" >/dev/null; then
zip -r -q "_release/build.zip" "artifacts/dist"
fi
# Checksums (avoid path issues by referencing full globs)
chks="_release/checksums-${{ needs.publish.outputs.tag }}.txt"
: > "$chks"
if compgen -G "_release/*.zip" >/dev/null; then
sha256sum _release/*.zip >> "$chks"
fi
if compgen -G "artifacts/dist/*" >/dev/null; then
sha256sum artifacts/dist/* >> "$chks"
fi
echo "Release payload:"
ls -lh _release || true
- name: Compute changelog anchor
id: anchor
shell: bash
run: |
ref="${{ needs.publish.outputs.version }}"
echo "anchor=v${ref//./-}" >> "$GITHUB_OUTPUT"
# Delete the existing release for this tag so assets are replaced cleanly
- name: Delete existing release (if it exists)
env:
GH_TOKEN: ${{ github.token }}
shell: bash
run: |
set -euo pipefail
gh release delete "${{ needs.publish.outputs.tag }}" --yes || true
- name: Create release (curated assets only)
uses: softprops/action-gh-release@v2
with:
tag_name: ${{ needs.publish.outputs.tag }}
name: ${{ needs.publish.outputs.tag }}
body: |
See the full changelog entry at https://bijux.github.io/bijux-cli/changelog/#${{ steps.anchor.outputs.anchor }}
files: |
_release/*.zip
_release/checksums-${{ needs.publish.outputs.tag }}.txt
artifacts/dist/*
fail_on_unmatched_files: false
make_latest: true
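Two small string rules drive the publish job: a strict SemVer tag gate in the decide step, and the changelog anchor used in the release body (`v1.2.3` → `v1-2-3`). A hedged sketch of both (function name is illustrative):

# Hedged sketch of the publish gate's tag parsing and changelog anchor
import re

SEMVER_TAG = re.compile(r"^v\d+\.\d+\.\d+$")  # same pattern as the decide step

def parse_tag(tag: str) -> tuple[str, str] | None:
    if not SEMVER_TAG.match(tag):
        return None                           # publish=false
    version = tag[1:]                         # v1.2.3 -> 1.2.3
    anchor = "v" + version.replace(".", "-")  # -> v1-2-3, linked in the release body
    return version, anchor

assert parse_tag("v1.2.3") == ("1.2.3", "v1-2-3")
assert parse_tag("v1.2.3rc1") is None  # pre-releases never publish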
Artifacts: Browse CI-produced reports under Artifacts.
Config Files (Map)¶
- Lint: `config/ruff.toml`
- Types: `config/mypy.ini`, `config/pyrightconfig.json`
- Coverage: `config/coveragerc.ini`
- Mutation: `config/cosmic-ray.toml`
- Dictionary (codespell): `config/bijux.dic`
- Config notes: `config/README.md`
- CI: `.github/workflows/`, `tox.ini`, `pytest.ini`
- Docs: `mkdocs.yml`, `scripts/helper_mkdocs.py`
- Security & licensing: `REUSE.toml`
- Packaging & release: `pyproject.toml`