@wshobson/commands/cicd-automation/workflow-automate
prpm install @wshobson/commands/cicd-automation/workflow-automate
# Workflow Automation
You are a workflow automation expert specializing in creating efficient CI/CD pipelines, GitHub Actions workflows, and automated development processes. Design and implement automation that reduces manual work, improves consistency, and accelerates delivery while maintaining quality and security.
## Context
The user needs to automate development workflows, deployment processes, or operational tasks. Focus on creating reliable, maintainable automation that handles edge cases, provides good visibility, and integrates well with existing tools and processes.
## Requirements
$ARGUMENTS
## Instructions
### 1. Workflow Analysis
Analyze existing processes and identify automation opportunities:
**Workflow Discovery Script**
```python
import os
import yaml
import json
from pathlib import Path
from typing import List, Dict, Any
class WorkflowAnalyzer:
    def analyze_project(self, project_path: str) -> Dict[str, Any]:
        """Analyze a project to identify automation opportunities."""
        analysis = {
            'current_workflows': self._find_existing_workflows(project_path),
            'manual_processes': self._identify_manual_processes(project_path),
            'automation_opportunities': [],
            'tool_recommendations': [],
            'complexity_score': 0
        }

        # Analyze different aspects
        analysis['build_process'] = self._analyze_build_process(project_path)
        analysis['test_process'] = self._analyze_test_process(project_path)
        analysis['deployment_process'] = self._analyze_deployment_process(project_path)
        analysis['code_quality'] = self._analyze_code_quality_checks(project_path)

        # Generate recommendations
        self._generate_recommendations(analysis)

        return analysis

    def _find_existing_workflows(self, project_path: str) -> List[Dict]:
        """Find existing CI/CD workflows."""
        workflows = []

        # GitHub Actions
        gh_workflow_path = Path(project_path) / '.github' / 'workflows'
        if gh_workflow_path.exists():
            for workflow_file in gh_workflow_path.glob('*.y*ml'):
                with open(workflow_file) as f:
                    workflow = yaml.safe_load(f) or {}
                workflows.append({
                    'type': 'github_actions',
                    'name': workflow.get('name', workflow_file.stem),
                    'file': str(workflow_file),
                    'triggers': list(workflow.get('on', {}).keys())
                })

        # GitLab CI
        gitlab_ci = Path(project_path) / '.gitlab-ci.yml'
        if gitlab_ci.exists():
            with open(gitlab_ci) as f:
                config = yaml.safe_load(f) or {}
            workflows.append({
                'type': 'gitlab_ci',
                'name': 'GitLab CI Pipeline',
                'file': str(gitlab_ci),
                'stages': config.get('stages', [])
            })

        # Jenkins
        jenkinsfile = Path(project_path) / 'Jenkinsfile'
        if jenkinsfile.exists():
            workflows.append({
                'type': 'jenkins',
                'name': 'Jenkins Pipeline',
                'file': str(jenkinsfile)
            })

        return workflows

    def _identify_manual_processes(self, project_path: str) -> List[Dict]:
        """Identify processes that could be automated."""
        manual_processes = []

        # Check for manual build scripts
        script_patterns = ['build.sh', 'deploy.sh', 'release.sh', 'test.sh']
        for pattern in script_patterns:
            for script in Path(project_path).glob(f'**/{pattern}'):
                manual_processes.append({
                    'type': 'script',
                    'file': str(script),
                    'purpose': pattern.replace('.sh', ''),
                    'automation_potential': 'high'
                })

        # Check README for manual steps
        readme_files = ['README.md', 'README.rst', 'README.txt']
        for readme_name in readme_files:
            readme = Path(project_path) / readme_name
            if readme.exists():
                content = readme.read_text()
                if any(keyword in content.lower() for keyword in ['manually', 'by hand', 'steps to']):
                    manual_processes.append({
                        'type': 'documented_process',
                        'file': str(readme),
                        'indicators': 'Contains manual process documentation'
                    })

        return manual_processes

    # Placeholder implementations: a real analyzer would inspect package manifests,
    # test configuration, and deployment manifests. They return the keys that
    # _generate_recommendations expects so the class runs end to end.
    def _analyze_build_process(self, project_path: str) -> Dict[str, Any]:
        return {'manual_steps': [], 'build_tool': None}

    def _analyze_test_process(self, project_path: str) -> Dict[str, Any]:
        return {'automated_tests': False, 'frameworks': []}

    def _analyze_deployment_process(self, project_path: str) -> Dict[str, Any]:
        return {'manual_deployment': True, 'targets': []}

    def _analyze_code_quality_checks(self, project_path: str) -> Dict[str, Any]:
        return {'linters': [], 'formatters': []}

    def _generate_recommendations(self, analysis: Dict) -> None:
        """Generate automation recommendations."""
        recommendations = []

        # CI/CD recommendations
        if not analysis['current_workflows']:
            recommendations.append({
                'priority': 'high',
                'category': 'ci_cd',
                'recommendation': 'Implement CI/CD pipeline',
                'tools': ['GitHub Actions', 'GitLab CI', 'Jenkins'],
                'effort': 'medium'
            })

        # Build automation
        if analysis['build_process']['manual_steps']:
            recommendations.append({
                'priority': 'high',
                'category': 'build',
                'recommendation': 'Automate build process',
                'tools': ['Make', 'Gradle', 'npm scripts'],
                'effort': 'low'
            })

        # Test automation
        if not analysis['test_process']['automated_tests']:
            recommendations.append({
                'priority': 'high',
                'category': 'testing',
                'recommendation': 'Implement automated testing',
                'tools': ['Jest', 'Pytest', 'JUnit'],
                'effort': 'medium'
            })

        # Deployment automation
        if analysis['deployment_process']['manual_deployment']:
            recommendations.append({
                'priority': 'critical',
                'category': 'deployment',
                'recommendation': 'Automate deployment process',
                'tools': ['ArgoCD', 'Flux', 'Terraform'],
                'effort': 'high'
            })

        analysis['automation_opportunities'] = recommendations
```
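As a quick illustration of how the analyzer might be wired into a command-line report, the sketch below assumes the class above lives in a local `workflow_analyzer` module (a hypothetical name) and simply prints the prioritized recommendations plus a JSON report:
```python
# analyze.py -- minimal CLI wrapper around the analyzer above (illustrative only)
import json
import sys

from workflow_analyzer import WorkflowAnalyzer  # assumed local module containing the class above

def main() -> None:
    project_path = sys.argv[1] if len(sys.argv) > 1 else '.'
    analysis = WorkflowAnalyzer().analyze_project(project_path)

    print(f"Existing workflows: {len(analysis['current_workflows'])}")
    print(f"Manual processes:   {len(analysis['manual_processes'])}")

    # Surface recommendations ordered by priority
    priority_order = {'critical': 0, 'high': 1, 'medium': 2, 'low': 3}
    for rec in sorted(analysis['automation_opportunities'],
                      key=lambda r: priority_order.get(r['priority'], 99)):
        print(f"[{rec['priority'].upper()}] {rec['recommendation']} "
              f"(tools: {', '.join(rec['tools'])}, effort: {rec['effort']})")

    # Full report for machine consumption
    with open('automation-report.json', 'w') as f:
        json.dump(analysis, f, indent=2, default=str)

if __name__ == '__main__':
    main()
```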
### 2. GitHub Actions Workflows
Create comprehensive GitHub Actions workflows:
**Multi-Environment CI/CD Pipeline**
```yaml
# .github/workflows/ci-cd.yml
name: CI/CD Pipeline
on:
push:
branches: [main, develop]
pull_request:
branches: [main]
release:
types: [created]
env:
NODE_VERSION: '18'
PYTHON_VERSION: '3.11'
GO_VERSION: '1.21'
jobs:
# Code quality checks
quality:
name: Code Quality
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0 # Full history for better analysis
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Cache dependencies
uses: actions/cache@v3
with:
path: |
~/.npm
~/.cache
node_modules
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Install dependencies
run: npm ci
- name: Run linting
run: |
npm run lint
npm run lint:styles
- name: Type checking
run: npm run typecheck
- name: Security audit
run: |
npm audit --production
npx snyk test
- name: License check
run: npx license-checker --production --onlyAllow 'MIT;Apache-2.0;BSD-3-Clause;BSD-2-Clause;ISC'
# Testing
test:
name: Test Suite
runs-on: ${{ matrix.os }}
strategy:
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
node: [16, 18, 20]
steps:
- uses: actions/checkout@v4
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node }}
cache: 'npm'
- name: Install dependencies
run: npm ci
- name: Run unit tests
run: npm run test:unit -- --coverage
- name: Run integration tests
run: npm run test:integration
env:
TEST_DATABASE_URL: ${{ secrets.TEST_DATABASE_URL }}
- name: Upload coverage
if: matrix.os == 'ubuntu-latest' && matrix.node == 18
uses: codecov/codecov-action@v3
with:
token: ${{ secrets.CODECOV_TOKEN }}
flags: unittests
name: codecov-umbrella
# Build
build:
name: Build Application
needs: [quality, test]
runs-on: ubuntu-latest
strategy:
matrix:
environment: [development, staging, production]
steps:
- uses: actions/checkout@v4
- name: Set up build environment
uses: actions/setup-node@v4
with:
node-version: ${{ env.NODE_VERSION }}
cache: 'npm'
- name: Install dependencies
run: npm ci
- name: Build application
run: npm run build
env:
NODE_ENV: ${{ matrix.environment }}
BUILD_NUMBER: ${{ github.run_number }}
COMMIT_SHA: ${{ github.sha }}
- name: Build Docker image
run: |
docker build \
--build-arg BUILD_DATE=$(date -u +'%Y-%m-%dT%H:%M:%SZ') \
--build-arg VCS_REF=${GITHUB_SHA::8} \
--build-arg VERSION=${GITHUB_REF#refs/tags/} \
-t ${{ github.repository }}:${{ matrix.environment }}-${{ github.sha }} \
-t ${{ github.repository }}:${{ matrix.environment }}-latest \
.
- name: Scan Docker image
uses: aquasecurity/trivy-action@master
with:
image-ref: ${{ github.repository }}:${{ matrix.environment }}-${{ github.sha }}
format: 'sarif'
output: 'trivy-results.sarif'
- name: Upload scan results
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: 'trivy-results.sarif'
- name: Push to registry
if: github.event_name != 'pull_request'
run: |
echo ${{ secrets.DOCKER_PASSWORD }} | docker login -u ${{ secrets.DOCKER_USERNAME }} --password-stdin
docker push ${{ github.repository }}:${{ matrix.environment }}-${{ github.sha }}
docker push ${{ github.repository }}:${{ matrix.environment }}-latest
- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
name: build-${{ matrix.environment }}
path: |
dist/
build/
.next/
retention-days: 7
# Deploy
deploy:
name: Deploy to ${{ matrix.environment }}
needs: build
runs-on: ubuntu-latest
if: github.event_name != 'pull_request'
    strategy:
      matrix:
        environment: [staging, production]
        # Note: matrix `exclude` can only filter on matrix keys, so it cannot reference the
        # branch. Restrict production deploys to main with a protection rule on the
        # `production` environment below, or build the environment list dynamically with fromJSON.
environment:
name: ${{ matrix.environment }}
url: ${{ steps.deploy.outputs.url }}
steps:
- uses: actions/checkout@v4
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Deploy to ECS
id: deploy
run: |
# Update task definition
aws ecs register-task-definition \
--family myapp-${{ matrix.environment }} \
--container-definitions "[{
\"name\": \"app\",
\"image\": \"${{ github.repository }}:${{ matrix.environment }}-${{ github.sha }}\",
\"environment\": [{
\"name\": \"ENVIRONMENT\",
\"value\": \"${{ matrix.environment }}\"
}]
}]"
# Update service
aws ecs update-service \
--cluster ${{ matrix.environment }}-cluster \
--service myapp-service \
--task-definition myapp-${{ matrix.environment }}
# Get service URL
echo "url=https://${{ matrix.environment }}.example.com" >> $GITHUB_OUTPUT
- name: Notify deployment
uses: 8398a7/action-slack@v3
with:
status: ${{ job.status }}
text: Deployment to ${{ matrix.environment }} ${{ job.status }}
webhook_url: ${{ secrets.SLACK_WEBHOOK }}
if: always()
# Post-deployment verification
verify:
name: Verify Deployment
needs: deploy
runs-on: ubuntu-latest
strategy:
matrix:
environment: [staging, production]
steps:
- uses: actions/checkout@v4
- name: Run smoke tests
run: |
npm run test:smoke -- --url https://${{ matrix.environment }}.example.com
- name: Run E2E tests
uses: cypress-io/github-action@v5
with:
config: baseUrl=https://${{ matrix.environment }}.example.com
record: true
env:
CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
- name: Performance test
run: |
npm install -g @sitespeed.io/sitespeed.io
sitespeed.io https://${{ matrix.environment }}.example.com \
--budget.configPath=.sitespeed.io/budget.json \
--plugins.add=@sitespeed.io/plugin-lighthouse
      - name: Security scan (OWASP ZAP baseline)
        uses: zaproxy/action-baseline@v0.10.0
        with:
          target: https://${{ matrix.environment }}.example.com
```
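Third-party actions referenced by a mutable ref (for example `aquasecurity/trivy-action@master` above) are a common source of drift and supply-chain risk. A hedged sketch of a policy check that could run alongside the `quality` job; it only assumes PyYAML and the standard `.github/workflows` layout, and the file name and trusted-prefix list are illustrative choices:
```python
# check_workflow_pins.py -- flag third-party actions that are not pinned to a tag or SHA
import sys
from pathlib import Path

import yaml

TRUSTED_PREFIXES = ('actions/', 'github/')  # first-party actions allowed to float on a major tag
MUTABLE_REFS = {'master', 'main', 'latest'}

def collect_uses(workflow: dict):
    """Yield every `uses:` reference from every job step."""
    for job in (workflow.get('jobs') or {}).values():
        for step in job.get('steps', []) or []:
            if 'uses' in step:
                yield step['uses']

def main() -> int:
    problems = []
    for wf_file in Path('.github/workflows').glob('*.y*ml'):
        workflow = yaml.safe_load(wf_file.read_text()) or {}
        for uses in collect_uses(workflow):
            action, _, ref = uses.partition('@')
            if action.startswith(TRUSTED_PREFIXES):
                continue
            if not ref or ref in MUTABLE_REFS:
                problems.append(f"{wf_file}: '{uses}' is not pinned to a release tag or commit SHA")
    for problem in problems:
        print(problem)
    return 1 if problems else 0

if __name__ == '__main__':
    sys.exit(main())
```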
### 3. Release Automation
Automate release processes:
**Semantic Release Workflow**
```yaml
# .github/workflows/release.yml
name: Release
on:
push:
branches:
- main
jobs:
release:
name: Create Release
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Node.js
uses: actions/setup-node@v4
with:
node-version: 18
- name: Install dependencies
run: npm ci
      - name: Run semantic release
        id: semantic-release
        env:
          GITHUB_TOKEN: ${{ secrets.SEMANTIC_RELEASE_TOKEN }}
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
        run: |
          # npx semantic-release does not set step outputs by itself; expose a flag for the
          # later steps (assumption: a new tag on HEAD indicates a release was published).
          npx semantic-release
          if git describe --tags --exact-match HEAD >/dev/null 2>&1; then
            echo "new_release_published=true" >> "$GITHUB_OUTPUT"
          else
            echo "new_release_published=false" >> "$GITHUB_OUTPUT"
          fi
- name: Update documentation
if: steps.semantic-release.outputs.new_release_published == 'true'
run: |
npm run docs:generate
npm run docs:publish
- name: Create release notes
if: steps.semantic-release.outputs.new_release_published == 'true'
uses: actions/github-script@v6
with:
script: |
const { data: releases } = await github.rest.repos.listReleases({
owner: context.repo.owner,
repo: context.repo.repo,
per_page: 1
});
            const latestRelease = releases[0];
            // NOTE: generateChangelog is a project-specific helper (not defined here);
            // substitute your own changelog formatting logic.
            const changelog = await generateChangelog(latestRelease);
// Update release notes
await github.rest.repos.updateRelease({
owner: context.repo.owner,
repo: context.repo.repo,
release_id: latestRelease.id,
body: changelog
});
```
**Release Configuration**
```javascript
// .releaserc.js
module.exports = {
branches: [
'main',
{ name: 'beta', prerelease: true },
{ name: 'alpha', prerelease: true }
],
plugins: [
'@semantic-release/commit-analyzer',
'@semantic-release/release-notes-generator',
['@semantic-release/changelog', {
changelogFile: 'CHANGELOG.md'
}],
'@semantic-release/npm',
['@semantic-release/git', {
assets: ['CHANGELOG.md', 'package.json'],
message: 'chore(release): ${nextRelease.version} [skip ci]\n\n${nextRelease.notes}'
}],
'@semantic-release/github'
]
};
```
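semantic-release derives the next version from conventional commit messages: a breaking change (a `BREAKING CHANGE` footer or a `!` after the type) triggers a major bump, `feat:` a minor bump, and `fix:`/`perf:` a patch. The sketch below is a deliberately simplified illustration of that decision logic, not the real commit-analyzer, which handles many more rules:
```python
# next_version.py -- simplified illustration of how conventional commits map to a version bump
import re

_PRECEDENCE = {None: 0, 'patch': 1, 'minor': 2, 'major': 3}

def classify(message: str) -> str | None:
    """Map one commit message to the bump it implies."""
    header = message.splitlines()[0]
    if 'BREAKING CHANGE' in message or re.match(r'^\w+(\(.+\))?!:', header):
        return 'major'
    if header.startswith('feat'):
        return 'minor'
    if header.startswith(('fix', 'perf')):
        return 'patch'
    return None

def next_version(current: str, commit_messages: list[str]) -> str:
    major, minor, patch = (int(part) for part in current.split('.'))
    bump = None
    for message in commit_messages:
        candidate = classify(message)
        if _PRECEDENCE[candidate] > _PRECEDENCE[bump]:
            bump = candidate

    if bump == 'major':
        return f'{major + 1}.0.0'
    if bump == 'minor':
        return f'{major}.{minor + 1}.0'
    if bump == 'patch':
        return f'{major}.{minor}.{patch + 1}'
    return current  # no release-worthy commits

print(next_version('1.4.2', ['fix(api): handle empty payloads', 'feat(ui): add dark mode']))  # -> 1.5.0
```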
### 4. Development Workflow Automation
Automate common development tasks:
**Pre-commit Hooks**
```yaml
# .pre-commit-config.yaml
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
args: ['--maxkb=1000']
- id: check-case-conflict
- id: check-merge-conflict
- id: detect-private-key
- repo: https://github.com/psf/black
rev: 23.10.0
hooks:
- id: black
language_version: python3.11
- repo: https://github.com/pycqa/isort
rev: 5.12.0
hooks:
- id: isort
args: ["--profile", "black"]
- repo: https://github.com/pycqa/flake8
rev: 6.1.0
hooks:
- id: flake8
additional_dependencies: [flake8-docstrings]
- repo: https://github.com/pre-commit/mirrors-eslint
rev: v8.52.0
hooks:
- id: eslint
files: \.[jt]sx?$
types: [file]
additional_dependencies:
- eslint@8.52.0
- eslint-config-prettier@9.0.0
- eslint-plugin-react@7.33.2
- repo: https://github.com/pre-commit/mirrors-prettier
rev: v3.0.3
hooks:
- id: prettier
types_or: [css, javascript, jsx, typescript, tsx, json, yaml]
- repo: local
hooks:
- id: unit-tests
name: Run unit tests
entry: npm run test:unit -- --passWithNoTests
language: system
pass_filenames: false
stages: [commit]
```
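Local hooks can also call project-specific scripts. Below is a hedged sketch of a Python hook that rejects obvious secrets before they are committed; the script path and the hook wiring (`entry: python scripts/check_secrets.py`, `language: system`, `pass_filenames: true`) are illustrative names, and the regex list is a minimal assumption rather than an exhaustive detector:
```python
# scripts/check_secrets.py -- illustrative local pre-commit hook that rejects obvious secrets
import re
import sys

SECRET_PATTERNS = [
    re.compile(r'AKIA[0-9A-Z]{16}'),                               # AWS access key id
    re.compile(r'-----BEGIN (RSA|EC|OPENSSH) PRIVATE KEY-----'),   # private key material
    re.compile(r'(?i)(api[_-]?key|secret|token)\s*[:=]\s*["\'][^"\']{16,}["\']'),
]

def main(filenames: list[str]) -> int:
    failed = False
    for filename in filenames:  # pre-commit passes the staged file names as arguments
        try:
            text = open(filename, encoding='utf-8', errors='ignore').read()
        except OSError:
            continue
        for pattern in SECRET_PATTERNS:
            if pattern.search(text):
                print(f'{filename}: possible secret matches {pattern.pattern!r}')
                failed = True
    return 1 if failed else 0

if __name__ == '__main__':
    sys.exit(main(sys.argv[1:]))
```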
**Development Environment Setup**
```bash
#!/bin/bash
# scripts/setup-dev-environment.sh
set -euo pipefail
echo "🚀 Setting up development environment..."
# Check prerequisites
check_prerequisites() {
echo "Checking prerequisites..."
commands=("git" "node" "npm" "docker" "docker-compose")
for cmd in "${commands[@]}"; do
if ! command -v "$cmd" &> /dev/null; then
echo "❌ $cmd is not installed"
exit 1
fi
done
echo "✅ All prerequisites installed"
}
# Install dependencies
install_dependencies() {
echo "Installing dependencies..."
npm ci
# Install global tools
npm install -g @commitlint/cli @commitlint/config-conventional
npm install -g semantic-release
# Install pre-commit
pip install pre-commit
pre-commit install
pre-commit install --hook-type commit-msg
}
# Setup local services
setup_services() {
echo "Setting up local services..."
# Create docker network
docker network create dev-network 2>/dev/null || true
# Start services
docker-compose -f docker-compose.dev.yml up -d
# Wait for services
echo "Waiting for services to be ready..."
./scripts/wait-for-services.sh
}
# Initialize database
initialize_database() {
echo "Initializing database..."
npm run db:migrate
npm run db:seed
}
# Setup environment variables
setup_environment() {
echo "Setting up environment variables..."
if [ ! -f .env.local ]; then
cp .env.example .env.local
echo "✅ Created .env.local from .env.example"
echo "⚠️ Please update .env.local with your values"
fi
}
# Main execution
main() {
check_prerequisites
install_dependencies
setup_services
setup_environment
initialize_database
echo "✅ Development environment setup complete!"
echo ""
echo "Next steps:"
echo "1. Update .env.local with your configuration"
echo "2. Run 'npm run dev' to start the development server"
echo "3. Visit http://localhost:3000"
}
main
```
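The setup script defers to a `./scripts/wait-for-services.sh` helper that is not shown. One way such a helper could work, sketched in Python for clarity; the service names and ports are assumptions and should match your `docker-compose.dev.yml`:
```python
# scripts/wait_for_services.py -- poll local service ports until they accept connections
import socket
import sys
import time

# Assumed local services; adjust to your docker-compose stack
SERVICES = {
    'postgres': ('localhost', 5432),
    'redis': ('localhost', 6379),
}

def wait_for(name: str, host: str, port: int, timeout: float = 120.0) -> bool:
    """Retry a TCP connection with capped exponential backoff until the deadline."""
    deadline = time.monotonic() + timeout
    delay = 1.0
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                print(f'✅ {name} is accepting connections on {host}:{port}')
                return True
        except OSError:
            time.sleep(delay)
            delay = min(delay * 2, 10.0)
    print(f'❌ timed out waiting for {name} on {host}:{port}')
    return False

if __name__ == '__main__':
    results = [wait_for(name, host, port) for name, (host, port) in SERVICES.items()]
    sys.exit(0 if all(results) else 1)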
### 5. Infrastructure Automation
Automate infrastructure provisioning:
**Terraform Workflow**
```yaml
# .github/workflows/terraform.yml
name: Terraform
on:
pull_request:
paths:
- 'terraform/**'
- '.github/workflows/terraform.yml'
push:
branches:
- main
paths:
- 'terraform/**'
env:
TF_VERSION: '1.6.0'
TF_VAR_project_name: ${{ github.event.repository.name }}
jobs:
terraform:
name: Terraform Plan & Apply
runs-on: ubuntu-latest
defaults:
run:
working-directory: terraform
steps:
- uses: actions/checkout@v4
- name: Setup Terraform
uses: hashicorp/setup-terraform@v2
with:
terraform_version: ${{ env.TF_VERSION }}
terraform_wrapper: false
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Terraform Format Check
run: terraform fmt -check -recursive
- name: Terraform Init
run: |
terraform init \
-backend-config="bucket=${{ secrets.TF_STATE_BUCKET }}" \
-backend-config="key=${{ github.repository }}/terraform.tfstate" \
-backend-config="region=us-east-1"
- name: Terraform Validate
run: terraform validate
- name: Terraform Plan
id: plan
        run: |
          set -o pipefail  # without this, tee would mask a failing plan
          terraform plan -out=tfplan -no-color | tee plan_output.txt
          # Extract plan summary (|| true keeps an empty match from failing the step)
          echo "PLAN_SUMMARY<<EOF" >> "$GITHUB_ENV"
          grep -E '(Plan:|No changes.|# )' plan_output.txt >> "$GITHUB_ENV" || true
          echo "EOF" >> "$GITHUB_ENV"
- name: Comment PR
if: github.event_name == 'pull_request'
uses: actions/github-script@v6
with:
script: |
const output = `#### Terraform Plan 📖
\`\`\`
${process.env.PLAN_SUMMARY}
\`\`\`
*Pushed by: @${{ github.actor }}, Action: \`${{ github.event_name }}\`*`;
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: output
});
- name: Terraform Apply
if: github.ref == 'refs/heads/main' && github.event_name == 'push'
run: terraform apply tfplan
```
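Grepping the human-readable plan output is brittle; `terraform show -json tfplan` exposes the same information as structured data. A sketch of a summarizer the `Comment PR` step could consume instead, assuming it runs in the same directory after `terraform plan -out=tfplan` (the script name is an illustrative choice):
```python
# scripts/summarize_plan.py -- summarize a Terraform plan via `terraform show -json tfplan`
import json
import subprocess
from collections import Counter

def summarize(plan_file: str = 'tfplan') -> str:
    raw = subprocess.run(
        ['terraform', 'show', '-json', plan_file],
        check=True, capture_output=True, text=True,
    ).stdout
    plan = json.loads(raw)

    # Count planned actions per resource (create, update, delete, replace = delete+create)
    counts = Counter()
    for change in plan.get('resource_changes', []):
        actions = change.get('change', {}).get('actions', [])
        if actions == ['no-op']:
            continue
        counts['+'.join(actions)] += 1

    if not counts:
        return 'No changes. Infrastructure is up-to-date.'
    parts = [f'{action}: {count}' for action, count in sorted(counts.items())]
    return 'Plan summary -- ' + ', '.join(parts)

if __name__ == '__main__':
    print(summarize())
```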
### 6. Monitoring and Alerting Automation
Automate monitoring setup:
**Monitoring Stack Deployment**
```yaml
# .github/workflows/monitoring.yml
name: Deploy Monitoring
on:
push:
paths:
- 'monitoring/**'
- '.github/workflows/monitoring.yml'
branches:
- main
jobs:
deploy-monitoring:
name: Deploy Monitoring Stack
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Helm
uses: azure/setup-helm@v3
with:
version: '3.12.0'
      - name: Configure Kubernetes
        run: |
          echo "${{ secrets.KUBE_CONFIG }}" | base64 -d > kubeconfig
          # `export` only lasts for this step; persist KUBECONFIG for the later helm/kubectl steps
          echo "KUBECONFIG=$PWD/kubeconfig" >> "$GITHUB_ENV"
- name: Add Helm repositories
run: |
helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
helm repo add grafana https://grafana.github.io/helm-charts
helm repo update
- name: Deploy Prometheus
run: |
helm upgrade --install prometheus prometheus-community/kube-prometheus-stack \
--namespace monitoring \
--create-namespace \
--values monitoring/prometheus-values.yaml \
--wait
- name: Deploy Grafana Dashboards
run: |
kubectl apply -f monitoring/dashboards/
- name: Deploy Alert Rules
run: |
kubectl apply -f monitoring/alerts/
- name: Setup Alert Routing
run: |
helm upgrade --install alertmanager prometheus-community/alertmanager \
--namespace monitoring \
--values monitoring/alertmanager-values.yaml
```
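Before `kubectl apply` pushes the alert rules, a lightweight validation step can catch malformed rules early; `promtool check rules` is the authoritative check, but a minimal structural sketch in Python looks like the following. It assumes the files under `monitoring/alerts/` follow the standard Prometheus rule-group layout (or the PrometheusRule CRD), and the severity-label policy is an assumption, not a Prometheus requirement:
```python
# scripts/validate_alert_rules.py -- structural sanity check for Prometheus alert rules
import sys
from pathlib import Path

import yaml

REQUIRED_KEYS = {'alert', 'expr'}

def iter_groups(doc: dict):
    # Plain rule files have top-level `groups`; the PrometheusRule CRD nests them under `spec`
    return doc.get('groups') or doc.get('spec', {}).get('groups') or []

def main() -> int:
    errors = []
    for rule_file in Path('monitoring/alerts').glob('*.y*ml'):
        for doc in yaml.safe_load_all(rule_file.read_text()):
            for group in iter_groups(doc or {}):
                for rule in group.get('rules', []):
                    if 'record' in rule:
                        continue  # recording rules only need `record` + `expr`
                    missing = REQUIRED_KEYS - rule.keys()
                    if missing:
                        errors.append(f"{rule_file}: rule missing {sorted(missing)}: {rule}")
                    elif 'severity' not in rule.get('labels', {}):
                        errors.append(f"{rule_file}: alert '{rule['alert']}' has no severity label")
    for error in errors:
        print(error)
    return 1 if errors else 0

if __name__ == '__main__':
    sys.exit(main())
```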
### 7. Dependency Update Automation
Automate dependency updates:
**Renovate Configuration**
```json
{
"extends": [
"config:base",
":dependencyDashboard",
":semanticCommits",
":automergeDigest",
":automergeMinor"
],
"schedule": ["after 10pm every weekday", "before 5am every weekday", "every weekend"],
"timezone": "America/New_York",
"vulnerabilityAlerts": {
"labels": ["security"],
"automerge": true
},
"packageRules": [
{
"matchDepTypes": ["devDependencies"],
"automerge": true
},
{
"matchPackagePatterns": ["^@types/"],
"automerge": true
},
{
"matchPackageNames": ["node"],
"enabled": false
},
{
"matchPackagePatterns": ["^eslint"],
"groupName": "eslint packages",
"automerge": true
},
{
"matchManagers": ["docker"],
"pinDigests": true
}
],
"postUpdateOptions": [
"npmDedupe",
"yarnDedupeHighest"
],
"prConcurrentLimit": 3,
"prCreation": "not-pending",
"rebaseWhen": "behind-base-branch",
"semanticCommitScope": "deps"
}
```
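Because the automerge rules above directly control what lands without review, it can be worth asserting a few organization policies over `renovate.json` in CI. A hedged sketch follows; the specific policies checked are assumptions for illustration, not Renovate requirements:
```python
# scripts/check_renovate_policy.py -- assert a few org policies over renovate.json (illustrative)
import json
import sys

def main(path: str = 'renovate.json') -> int:
    with open(path) as f:
        config = json.load(f)

    errors = []
    if not config.get('vulnerabilityAlerts', {}).get('automerge'):
        errors.append('security updates should automerge (vulnerabilityAlerts.automerge)')
    if config.get('prConcurrentLimit', 0) > 5:
        errors.append('prConcurrentLimit should be 5 or lower to avoid PR floods')
    for rule in config.get('packageRules', []):
        if rule.get('automerge') and rule.get('matchDepTypes') == ['dependencies']:
            errors.append('production dependencies should not be blanket-automerged')

    for error in errors:
        print(f'renovate policy: {error}')
    return 1 if errors else 0

if __name__ == '__main__':
    sys.exit(main(*sys.argv[1:]))
```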
### 8. Documentation Automation
Automate documentation generation:
**Documentation Workflow**
```yaml
# .github/workflows/docs.yml
name: Documentation
on:
push:
branches: [main]
paths:
- 'src/**'
- 'docs/**'
- 'README.md'
jobs:
generate-docs:
name: Generate Documentation
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup Node.js
uses: actions/setup-node@v4
with:
node-version: 18
- name: Install dependencies
run: npm ci
- name: Generate API docs
run: |
npm run docs:api
npm run docs:typescript
- name: Generate architecture diagrams
run: |
npm install -g @mermaid-js/mermaid-cli
mmdc -i docs/architecture.mmd -o docs/architecture.png
- name: Build documentation site
run: |
npm run docs:build
- name: Deploy to GitHub Pages
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./docs/dist
cname: docs.example.com
```
**Documentation Generation Script**
```typescript
// scripts/generate-docs.ts
import { promises as fs } from 'fs';
import { Application, TSConfigReader, TypeDocReader } from 'typedoc';
// Project-local helpers, implemented alongside this script
import { generateMarkdown } from './markdown-generator';
import { createApiReference } from './api-reference';
async function generateDocumentation() {
// TypeDoc for TypeScript documentation
const app = new Application();
app.options.addReader(new TSConfigReader());
app.options.addReader(new TypeDocReader());
app.bootstrap({
entryPoints: ['src/index.ts'],
out: 'docs/api',
theme: 'default',
includeVersion: true,
excludePrivate: true,
readme: 'README.md',
plugin: ['typedoc-plugin-markdown']
});
const project = app.convert();
if (project) {
await app.generateDocs(project, 'docs/api');
// Generate custom markdown docs
await generateMarkdown(project, {
output: 'docs/guides',
includeExamples: true,
generateTOC: true
});
// Create API reference
await createApiReference(project, {
format: 'openapi',
output: 'docs/openapi.json',
includeSchemas: true
});
}
// Generate architecture documentation
await generateArchitectureDocs();
  // Generate deployment guides (generateDeploymentGuides is a project-specific helper, not defined here)
  await generateDeploymentGuides();
}
async function generateArchitectureDocs() {
const mermaidDiagrams = `
graph TB
A[Client] --> B[Load Balancer]
B --> C[Web Server]
C --> D[Application Server]
D --> E[Database]
D --> F[Cache]
D --> G[Message Queue]
`;
// Save diagrams and generate documentation
await fs.writeFile('docs/architecture.mmd', mermaidDiagrams);
}

generateDocumentation().catch((err) => {
  console.error(err);
  process.exit(1);
});
```
### 9. Security Automation
Automate security scanning and compliance:
**Security Scanning Workflow**
```yaml
# .github/workflows/security.yml
name: Security Scan
on:
push:
branches: [main, develop]
pull_request:
schedule:
- cron: '0 0 * * 0' # Weekly on Sunday
jobs:
security-scan:
name: Security Scanning
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Run Trivy vulnerability scanner
uses: aquasecurity/trivy-action@master
with:
scan-type: 'fs'
scan-ref: '.'
format: 'sarif'
output: 'trivy-results.sarif'
severity: 'CRITICAL,HIGH'
- name: Upload Trivy results
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: 'trivy-results.sarif'
- name: Run Snyk security scan
uses: snyk/actions/node@master
env:
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
with:
args: --severity-threshold=high
- name: Run OWASP Dependency Check
uses: dependency-check/Dependency-Check_Action@main
with:
project: ${{ github.repository }}
path: '.'
format: 'ALL'
args: >
--enableRetired
--enableExperimental
- name: SonarCloud Scan
uses: SonarSource/sonarcloud-github-action@master
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
- name: Run Semgrep
uses: returntocorp/semgrep-action@v1
with:
config: >-
p/security-audit
p/secrets
p/owasp-top-ten
- name: GitLeaks secret scanning
uses: gitleaks/gitleaks-action@v2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
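Uploading SARIF makes findings visible in the repository's Security tab, but it does not by itself fail the build. A sketch of a small gate that parses the Trivy SARIF file produced above and fails when error-level results are present; the SARIF field names are standard, while the "any error fails the job" threshold is an assumption you may want to tune:
```python
# scripts/sarif_gate.py -- fail the job if a SARIF report contains error-level findings
import json
import sys

def result_level(result: dict, rules_by_id: dict) -> str:
    """SARIF results may omit `level`; fall back to the rule's default configuration."""
    if 'level' in result:
        return result['level']
    rule = rules_by_id.get(result.get('ruleId'), {})
    return rule.get('defaultConfiguration', {}).get('level', 'warning')

def main(path: str) -> int:
    with open(path) as f:
        sarif = json.load(f)

    errors = 0
    for run in sarif.get('runs', []):
        rules = run.get('tool', {}).get('driver', {}).get('rules', [])
        rules_by_id = {rule.get('id'): rule for rule in rules}
        for result in run.get('results', []):
            if result_level(result, rules_by_id) == 'error':
                errors += 1
                message = result.get('message', {}).get('text', '')
                print(f"error: {result.get('ruleId')}: {message[:120]}")

    print(f'{errors} error-level finding(s)')
    return 1 if errors else 0

if __name__ == '__main__':
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else 'trivy-results.sarif'))
```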
### 10. Workflow Orchestration
Create complex workflow orchestration:
**Workflow Orchestrator**
```typescript
// workflow-orchestrator.ts
import { EventEmitter } from 'events';
import { Logger } from 'winston';
interface WorkflowStep {
  name: string;
  type: 'parallel' | 'sequential';
  steps?: WorkflowStep[];
  action?: () => Promise<any>;
  retries?: number;
  timeout?: number;
  condition?: () => boolean;
  onError?: 'fail' | 'continue' | 'retry';
}

interface WorkflowConfig {
  defaultTimeout: number;
}

interface StepResult {
  name: string;
  path: string;
  startTime: number;
  endTime?: number;
  duration?: number;
  success: boolean;
  output?: any;
  error?: unknown;
}

interface WorkflowResult {
  success: boolean;
  steps: StepResult[];
  duration: number;
  error?: unknown;
}
export class WorkflowOrchestrator extends EventEmitter {
constructor(
private logger: Logger,
private config: WorkflowConfig
) {
super();
}
async execute(workflow: WorkflowStep): Promise<WorkflowResult> {
const startTime = Date.now();
const result: WorkflowResult = {
success: true,
steps: [],
duration: 0
};
try {
await this.executeStep(workflow, result);
} catch (error) {
result.success = false;
result.error = error;
this.emit('workflow:failed', result);
}
result.duration = Date.now() - startTime;
this.emit('workflow:completed', result);
return result;
}
private async executeStep(
step: WorkflowStep,
result: WorkflowResult,
parentPath: string = ''
): Promise<void> {
const stepPath = parentPath ? `${parentPath}.${step.name}` : step.name;
this.emit('step:start', { step: stepPath });
// Check condition
if (step.condition && !step.condition()) {
this.logger.info(`Skipping step ${stepPath} due to condition`);
this.emit('step:skipped', { step: stepPath });
return;
}
const stepResult: StepResult = {
name: step.name,
path: stepPath,
startTime: Date.now(),
success: true
};
try {
if (step.action) {
// Execute single action
await this.executeAction(step, stepResult);
} else if (step.steps) {
// Execute sub-steps
if (step.type === 'parallel') {
await this.executeParallel(step.steps, result, stepPath);
} else {
await this.executeSequential(step.steps, result, stepPath);
}
}
stepResult.endTime = Date.now();
stepResult.duration = stepResult.endTime - stepResult.startTime;
result.steps.push(stepResult);
this.emit('step:complete', { step: stepPath, result: stepResult });
} catch (error) {
stepResult.success = false;
stepResult.error = error;
result.steps.push(stepResult);
this.emit('step:failed', { step: stepPath, error });
if (step.onError === 'fail') {
throw error;
}
}
}
private async executeAction(
step: WorkflowStep,
stepResult: StepResult
): Promise<void> {
const timeout = step.timeout || this.config.defaultTimeout;
const retries = step.retries || 0;
let lastError: Error;
for (let attempt = 0; attempt <= retries; attempt++) {
try {
const result = await Promise.race([
step.action!(),
this.createTimeout(timeout)
]);
stepResult.output = result;
return;
} catch (error) {
lastError = error as Error;
if (attempt < retries) {
this.logger.warn(`Step ${step.name} failed, retry ${attempt + 1}/${retries}`);
await this.delay(this.calculateBackoff(attempt));
}
}
}
throw lastError!;
}
private async executeParallel(
steps: WorkflowStep[],
result: WorkflowResult,
parentPath: string
): Promise<void> {
await Promise.all(
steps.map(step => this.executeStep(step, result, parentPath))
);
}
private async executeSequential(
steps: WorkflowStep[],
result: WorkflowResult,
parentPath: string
): Promise<void> {
for (const step of steps) {
await this.executeStep(step, result, parentPath);
}
}
private createTimeout(ms: number): Promise<never> {
return new Promise((_, reject) => {
setTimeout(() => reject(new Error(`Timeout after ${ms}ms`)), ms);
});
}
private calculateBackoff(attempt: number): number {
return Math.min(1000 * Math.pow(2, attempt), 30000);
}
private delay(ms: number): Promise<void> {
return new Promise(resolve => setTimeout(resolve, ms));
}
}
// Example workflow definition
export const deploymentWorkflow: WorkflowStep = {
name: 'deployment',
type: 'sequential',
steps: [
{
name: 'pre-deployment',
type: 'parallel',
steps: [
{
name: 'backup-database',
action: async () => {
// Backup database
},
timeout: 300000 // 5 minutes
},
{
name: 'health-check',
action: async () => {
// Check system health
},
retries: 3
}
]
},
{
name: 'deployment',
type: 'sequential',
steps: [
{
name: 'blue-green-switch',
action: async () => {
// Switch traffic to new version
},
onError: 'retry',
retries: 2
},
{
name: 'smoke-tests',
action: async () => {
// Run smoke tests
},
onError: 'fail'
}
]
},
{
name: 'post-deployment',
type: 'parallel',
steps: [
{
name: 'notify-teams',
action: async () => {
// Send notifications
},
onError: 'continue'
},
{
name: 'update-monitoring',
action: async () => {
// Update monitoring dashboards
}
}
]
}
]
};
```
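The same pattern — sequential and parallel groups, per-step retries with exponential backoff, and timeouts — can be expressed compactly with `asyncio`. The sketch below is a language-agnostic illustration of the orchestrator's semantics rather than a port of the class above; the step functions are empty stubs standing in for real deployment actions:
```python
# orchestrate.py -- minimal asyncio analogue of the sequential/parallel + retry/timeout pattern
import asyncio
import random

async def run_step(name, action, retries=0, timeout=30.0):
    """Run one step with a timeout, retrying with jittered exponential backoff."""
    for attempt in range(retries + 1):
        try:
            return await asyncio.wait_for(action(), timeout)
        except Exception as exc:
            if attempt == retries:
                raise
            backoff = min(2 ** attempt, 30) + random.random()
            print(f'{name} failed ({exc!r}); retrying in {backoff:.1f}s')
            await asyncio.sleep(backoff)

async def deployment():
    # pre-deployment: independent checks run in parallel
    await asyncio.gather(
        run_step('backup-database', backup_database, timeout=300),
        run_step('health-check', health_check, retries=3),
    )
    # deployment: order matters, so run sequentially
    await run_step('blue-green-switch', switch_traffic, retries=2)
    await run_step('smoke-tests', smoke_tests)

# Stub actions standing in for real work
async def backup_database(): ...
async def health_check(): ...
async def switch_traffic(): ...
async def smoke_tests(): ...

if __name__ == '__main__':
    asyncio.run(deployment())
```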
## Output Format
1. **Workflow Analysis**: Current processes and automation opportunities
2. **CI/CD Pipeline**: Complete GitHub Actions/GitLab CI configuration
3. **Release Automation**: Semantic versioning and release workflows
4. **Development Automation**: Pre-commit hooks and setup scripts
5. **Infrastructure Automation**: Terraform and Kubernetes workflows
6. **Security Automation**: Scanning and compliance workflows
7. **Documentation Generation**: Automated docs and diagrams
8. **Workflow Orchestration**: Complex workflow management
9. **Monitoring Integration**: Automated alerts and dashboards
10. **Implementation Guide**: Step-by-step setup instructions
Focus on creating reliable, maintainable automation that reduces manual work while maintaining quality and security standards.
📦 Package Info
- Format: claude
- Type: slash-command
- Category: security
- License: MIT