Patterns and techniques for evaluating and improving AI agent outputs. Use this skill when: - Implementing self-critique and reflection loops - Building evaluator-optimizer pipelines for quality-critical generation - Creating test-driven code refinement workflows - Designing rubric-based or LLM-as-judge evaluation systems - Adding iterative improvement to agent outputs (code, reports, analysis) - Measuring and improving agent response quality
# Agentic Evaluation Patterns
Patterns for self-improvement through iterative evaluation and refinement.
## Overview
Evaluation patterns enable agents to assess and improve their own outputs, moving beyond single-shot generation to iterative refinement loops.
```
Generate → Evaluate → Critique → Refine → Output
              ↑                      │
              └──────────────────────┘
```
## When to Use
- **Quality-critical generation**: Code, reports, analysis requiring high accuracy
- **Tasks with clear evaluation criteria**: Defined success metrics exist
- **Content requiring specific standards**: Style guides, compliance, formatting
---
## Pattern 1: Basic Reflection
Agent evaluates and improves its own output through self-critique.
```python
import json

# Assumes an llm(prompt: str) -> str helper for model calls.
def reflect_and_refine(task: str, criteria: list[str], max_iterations: int = 3) -> str:
"""Generate with reflection loop."""
output = llm(f"Complete this task:\n{task}")
for i in range(max_iterations):
# Self-critique
critique = llm(f"""
Evaluate this output against criteria: {criteria}
Output: {output}
Rate each: PASS/FAIL with feedback as JSON.
""")
critique_data = json.loads(critique)
all_pass = all(c["status"] == "PASS" for c in critique_data.values())
if all_pass:
return output
# Refine based on critique
failed = {k: v["feedback"] for k, v in critique_data.items() if v["status"] == "FAIL"}
output = llm(f"Improve to address: {failed}\nOriginal: {output}")
return output
```
**Key insight**: Use structured JSON output for reliable parsing of critique results.
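Model responses are not guaranteed to be valid JSON, so parse defensively. A minimal sketch (the brace-extraction fallback is an illustrative assumption, not part of the pattern above):

```python
import json

def parse_critique(raw: str) -> dict | None:
    """Parse a critique response, tolerating extra prose around the JSON."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fall back to the first {...} span in the response, if any.
        start, end = raw.find("{"), raw.rfind("}")
        if start != -1 and end > start:
            try:
                return json.loads(raw[start:end + 1])
            except json.JSONDecodeError:
                pass
        return None  # caller should treat this as a failed iteration, not a crash
```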
---
## Pattern 2: Evaluator-Optimizer
Separate generation and evaluation into distinct components for clearer responsibilities.
```python
import json
class EvaluatorOptimizer:
def __init__(self, score_threshold: float = 0.8):
self.score_threshold = score_threshold
def generate(self, task: str) -> str:
return llm(f"Complete: {task}")
def evaluate(self, output: str, task: str) -> dict:
return json.loads(llm(f"""
Evaluate output for task: {task}
Output: {output}
Return JSON: {{"overall_score": 0-1, "dimensions": {{"accuracy": ..., "clarity": ...}}}}
"""))
def optimize(self, output: str, feedback: dict) -> str:
return llm(f"Improve based on feedback: {feedback}\nOutput: {output}")
def run(self, task: str, max_iterations: int = 3) -> str:
output = self.generate(task)
for _ in range(max_iterations):
evaluation = self.evaluate(output, task)
if evaluation["overall_score"] >= self.score_threshold:
break
output = self.optimize(output, evaluation)
return output
```
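Usage is a single call; the task string below is a placeholder:

```python
optimizer = EvaluatorOptimizer(score_threshold=0.9)
result = optimizer.run("Summarize this incident report in five bullet points")
```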
---
## Pattern 3: Code-Specific Reflection
Test-driven refinement loop for code generation.
```python
class CodeReflector:
def reflect_and_fix(self, spec: str, max_iterations: int = 3) -> str:
code = llm(f"Write Python code for: {spec}")
tests = llm(f"Generate pytest tests for: {spec}\nCode: {code}")
for _ in range(max_iterations):
result = run_tests(code, tests)
if result["success"]:
return code
code = llm(f"Fix error: {result['error']}\nCode: {code}")
return code
```
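`run_tests` is an assumed helper. A minimal sketch of one way it could work, writing the generated code and tests to a temporary directory and shelling out to pytest (the file names and returned dict shape are assumptions; the generated tests would need to import from `solution`):

```python
import subprocess
import tempfile
from pathlib import Path

def run_tests(code: str, tests: str) -> dict:
    """Run generated pytest tests against generated code in an isolated directory."""
    with tempfile.TemporaryDirectory() as tmp:
        Path(tmp, "solution.py").write_text(code)
        Path(tmp, "test_solution.py").write_text(tests)
        # Requires pytest to be installed in the current environment.
        proc = subprocess.run(
            ["python", "-m", "pytest", "test_solution.py", "-x", "-q"],
            cwd=tmp, capture_output=True, text=True, timeout=60,
        )
        return {"success": proc.returncode == 0, "error": proc.stdout + proc.stderr}
```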
---
## Evaluation Strategies
### Outcome-Based
Evaluate whether output achieves the expected result.
```python
def evaluate_outcome(task: str, output: str, expected: str) -> str:
return llm(f"Does output achieve expected outcome? Task: {task}, Expected: {expected}, Output: {output}")
```
### LLM-as-Judge
Use LLM to compare and rank outputs.
```python
def llm_judge(output_a: str, output_b: str, criteria: str) -> str:
return llm(f"Compare outputs A and B for {criteria}. Which is better and why?")
```
### Rubric-Based
Score outputs against weighted dimensions.
```python
import json
RUBRIC = {
"accuracy": {"weight": 0.4},
"clarity": {"weight": 0.3},
"completeness": {"weight": 0.3}
}
def evaluate_with_rubric(output: str, rubric: dict) -> float:
scores = json.loads(llm(f"Rate 1-5 for each dimension: {list(rubric.keys())}\nOutput: {output}"))
return sum(scores[d] * rubric[d]["weight"] for d in rubric) / 5
```
---
## Best Practices
| Practice | Rationale |
|----------|-----------|
| **Clear criteria** | Define specific, measurable evaluation criteria upfront |
| **Iteration limits** | Set max iterations (3-5) to prevent infinite loops |
| **Convergence check** | Stop if output score isn't improving between iterations |
| **Log history** | Keep full trajectory for debugging and analysis |
| **Structured output** | Use JSON for reliable parsing of evaluation results |
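For the convergence check, keep a score history across iterations and stop once improvement stalls. A minimal sketch:

```python
def has_converged(score_history: list[float], min_delta: float = 0.01) -> bool:
    """Stop refining when the last iteration improved the score by less than min_delta."""
    if len(score_history) < 2:
        return False
    return score_history[-1] - score_history[-2] < min_delta
```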
---
## Quick Start Checklist
```markdown
## Evaluation Implementation Checklist
### Setup
- [ ] Define evaluation criteria/rubric
- [ ] Set score threshold for "good enough"
- [ ] Configure max iterations (default: 3)
### Implementation
- [ ] Implement generate() function
- [ ] Implement evaluate() function with structured output
- [ ] Implement optimize() function
- [ ] Wire up the refinement loop
### Safety
- [ ] Add convergence detection
- [ ] Log all iterations for debugging
- [ ] Handle evaluation parse failures gracefully
```
Instrument a webapp to send useful telemetry data to Azure App Insights
# AppInsights instrumentation
This skill enables sending telemetry data from a webapp to Azure App Insights for better observability of the app's health.
## When to use this skill
Use this skill when the user wants to enable telemetry for their webapp.
## Prerequisites
The app in the workspace must be one of these kinds:
- An ASP.NET Core app hosted in Azure
- A Node.js app hosted in Azure
- A Python app hosted in Azure
## Guidelines
### Collect context information
Find out the (programming language, application framework, hosting) tuple of the application the user is trying to add telemetry support to. This determines how the application can be instrumented. Read the source code to make an educated guess, and confirm with the user on anything you don't know. You must always ask the user where the application is hosted (e.g. on a personal computer, in an Azure App Service as code, in an Azure App Service as container, in an Azure Container App, etc.).
### Prefer auto-instrument if possible
If the app is a C# ASP.NET Core app hosted in Azure App Service, use [AUTO guide](references/AUTO.md) to help user auto-instrument the app.
### Manually instrument
Manually instrument the app by creating the AppInsights resource and updating the app's code.
#### Create AppInsights resource
Use whichever of the following options best fits the environment.
- Add AppInsights to existing Bicep template. See [examples/appinsights.bicep](examples/appinsights.bicep) for what to add. This is the best option if there are existing Bicep template files in the workspace.
- Use Azure CLI. See [scripts/appinsights.ps1](scripts/appinsights.ps1) for what Azure CLI command to execute to create the App Insights resource.
No matter which option you choose, recommend that the user create the App Insights resource in a meaningful resource group that makes managing resources easier. A good candidate is the resource group that already contains the hosted app's resources in Azure.
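For orientation, a hedged Azure CLI sketch of the kind of command involved (the authoritative command is in scripts/appinsights.ps1; resource names below are placeholders):

```bash
# The App Insights commands live in a CLI extension
az extension add --name application-insights
# Create the resource in the same resource group as the hosted app (placeholder names)
az monitor app-insights component create \
  --app my-webapp-insights \
  --resource-group my-webapp-rg \
  --location eastus \
  --application-type web
```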
#### Modify application code
- If the app is an ASP.NET Core app, see [ASPNETCORE guide](references/ASPNETCORE.md) for how to modify the C# code.
- If the app is a Node.js app, see [NODEJS guide](references/NODEJS.md) for how to modify the JavaScript/TypeScript code.
- If the app is a Python app, see [PYTHON guide](references/PYTHON.md) for how to modify the Python code.
Performs comprehensive preflight validation of Bicep deployments to Azure, including template syntax validation, what-if analysis, and permission checks. Use this skill before any deployment to Azure to preview changes, identify potential issues, and ensure the deployment will succeed. Activate when users mention deploying to Azure, validating Bicep files, checking deployment permissions, previewing infrastructure changes, running what-if, or preparing for azd provision.
# Azure Deployment Preflight Validation
This skill validates Bicep deployments before execution, supporting both Azure CLI (`az`) and Azure Developer CLI (`azd`) workflows.
## When to Use This Skill
- Before deploying infrastructure to Azure
- When preparing or reviewing Bicep files
- To preview what changes a deployment will make
- To verify permissions are sufficient for deployment
- Before running `azd up`, `azd provision`, or `az deployment` commands
## Validation Process
Follow these steps in order. Continue to the next step even if a previous step fails—capture all issues in the final report.
### Step 1: Detect Project Type
Determine the deployment workflow by checking for project indicators:
1. **Check for azd project**: Look for `azure.yaml` in the project root
- If found → Use **azd workflow**
- If not found → Use **az CLI workflow**
2. **Locate Bicep files**: Find all `.bicep` files to validate
- For azd projects: Check `infra/` directory first, then project root
- For standalone: Use the file specified by the user or search common locations (`infra/`, `deploy/`, project root)
3. **Auto-detect parameter files**: For each Bicep file, look for matching parameter files:
- `<filename>.bicepparam` (Bicep parameters - preferred)
- `<filename>.parameters.json` (JSON parameters)
- `parameters.json` or `parameters/<env>.json` in same directory
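A rough shell sketch of this detection logic (illustrative only; the same checks can be done with file tools):

```bash
if [[ -f azure.yaml ]]; then
  echo "azd project: validate with 'azd provision --preview'"
else
  echo "standalone Bicep: validate with 'az deployment ... what-if'"
fi
# Locate Bicep files and any matching parameter files
for f in $(find infra deploy . -maxdepth 2 -name '*.bicep' 2>/dev/null); do
  base="${f%.bicep}"
  [[ -f "$base.bicepparam" ]] && echo "$f -> $base.bicepparam"
  [[ -f "$base.parameters.json" ]] && echo "$f -> $base.parameters.json"
done
```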
### Step 2: Validate Bicep Syntax
Run Bicep CLI to check template syntax before attempting deployment validation:
```bash
bicep build <bicep-file> --stdout
```
**What to capture:**
- Syntax errors with line/column numbers
- Warning messages
- Build success/failure status
**If Bicep CLI is not installed:**
- Note the issue in the report
- Continue to Step 3 (Azure will validate syntax during what-if)
### Step 3: Run Preflight Validation
Choose the appropriate validation based on project type detected in Step 1.
#### For azd Projects (azure.yaml exists)
Use `azd provision --preview` to validate the deployment:
```bash
azd provision --preview
```
If an environment is specified or multiple environments exist:
```bash
azd provision --preview --environment <env-name>
```
#### For Standalone Bicep (no azure.yaml)
Determine the deployment scope from the Bicep file's `targetScope` declaration:
| Target Scope | Command |
|--------------|---------|
| `resourceGroup` (default) | `az deployment group what-if` |
| `subscription` | `az deployment sub what-if` |
| `managementGroup` | `az deployment mg what-if` |
| `tenant` | `az deployment tenant what-if` |
**Run with Provider validation level first:**
```bash
# Resource Group scope (most common)
az deployment group what-if \
--resource-group <rg-name> \
--template-file <bicep-file> \
--parameters <param-file> \
--validation-level Provider
# Subscription scope
az deployment sub what-if \
--location <location> \
--template-file <bicep-file> \
--parameters <param-file> \
--validation-level Provider
# Management Group scope
az deployment mg what-if \
--location <location> \
--management-group-id <mg-id> \
--template-file <bicep-file> \
--parameters <param-file> \
--validation-level Provider
# Tenant scope
az deployment tenant what-if \
--location <location> \
--template-file <bicep-file> \
--parameters <param-file> \
--validation-level Provider
```
**Fallback Strategy:**
If `--validation-level Provider` fails with permission errors (RBAC), retry with `ProviderNoRbac`:
```bash
az deployment group what-if \
--resource-group <rg-name> \
--template-file <bicep-file> \
--validation-level ProviderNoRbac
```
Note the fallback in the report—the user may lack full deployment permissions.
### Step 4: Capture What-If Results
Parse the what-if output to categorize resource changes:
| Change Type | Symbol | Meaning |
|-------------|--------|---------|
| Create | `+` | New resource will be created |
| Delete | `-` | Resource will be deleted |
| Modify | `~` | Resource properties will change |
| NoChange | `=` | Resource unchanged |
| Ignore | `*` | Resource not analyzed (limits reached) |
| Deploy | `!` | Resource will be deployed (changes unknown) |
For modified resources, capture the specific property changes.
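To tally changes programmatically instead of reading the pretty-printed output, the what-if result can be requested as JSON, for example (assumes `jq` is available; `--no-pretty-print` switches what-if to JSON output):

```bash
az deployment group what-if \
  --resource-group <rg-name> \
  --template-file <bicep-file> \
  --no-pretty-print \
  | jq -r '.changes[].changeType' | sort | uniq -c
```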
### Step 5: Generate Report
Create a Markdown report file in the **project root** named:
- `preflight-report.md`
Use the template structure from [references/REPORT-TEMPLATE.md](references/REPORT-TEMPLATE.md).
**Report sections:**
1. **Summary** - Overall status, timestamp, files validated, target scope
2. **Tools Executed** - Commands run, versions, validation levels used
3. **Issues** - All errors and warnings with severity and remediation
4. **What-If Results** - Resources to create/modify/delete/unchanged
5. **Recommendations** - Actionable next steps
## Required Information
Before running validation, gather:
| Information | Required For | How to Obtain |
|-------------|--------------|---------------|
| Resource Group | `az deployment group` | Ask user or check existing `.azure/` config |
| Subscription | All deployments | `az account show` or ask user |
| Location | Sub/MG/Tenant scope | Ask user or use default from config |
| Environment | azd projects | `azd env list` or ask user |
If required information is missing, prompt the user before proceeding.
## Error Handling
See [references/ERROR-HANDLING.md](references/ERROR-HANDLING.md) for detailed error handling guidance.
**Key principle:** Continue validation even when errors occur. Capture all issues in the final report.
| Error Type | Action |
|------------|--------|
| Not logged in | Note in report, suggest `az login` or `azd auth login` |
| Permission denied | Fall back to `ProviderNoRbac`, note in report |
| Bicep syntax error | Include all errors, continue to other files |
| Tool not installed | Note in report, skip that validation step |
| Resource group not found | Note in report, suggest creating it |
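A hedged shell sketch of this collect-don't-abort principle:

```bash
# Collect issues instead of aborting on the first failure
issues=()
bicep build <bicep-file> --stdout > /dev/null 2> build.log \
  || issues+=("Bicep build failed: see build.log")
az deployment group what-if --resource-group <rg-name> --template-file <bicep-file> > whatif.log 2>&1 \
  || issues+=("What-if failed: see whatif.log")
# Everything collected here feeds the final report
printf '%s\n' "${issues[@]}"
```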
## Tool Requirements
This skill uses the following tools:
- **Azure CLI** (`az`) - Version 2.76.0+ recommended for `--validation-level`
- **Azure Developer CLI** (`azd`) - For projects with `azure.yaml`
- **Bicep CLI** (`bicep`) - For syntax validation
- **Azure MCP Tools** - For documentation lookups and best practices
Check tool availability before starting:
```bash
az --version
azd version
bicep --version
```
## Example Workflow
1. User: "Validate my Bicep deployment before I run it"
2. Agent detects `azure.yaml` → azd project
3. Agent finds `infra/main.bicep` and `infra/main.bicepparam`
4. Agent runs `bicep build infra/main.bicep --stdout`
5. Agent runs `azd provision --preview`
6. Agent generates `preflight-report.md` in project root
7. Agent summarizes findings to user
## Reference Documentation
- [Validation Commands Reference](references/VALIDATION-COMMANDS.md)
- [Report Template](references/REPORT-TEMPLATE.md)
- [Error Handling Guide](references/ERROR-HANDLING.md)
Manage Azure DevOps resources via CLI including projects, repos, pipelines, builds, pull requests, work items, artifacts, and service endpoints. Use when working with Azure DevOps, az commands, devops automation, CI/CD, or when user mentions Azure DevOps CLI.
# Azure DevOps CLI
This Skill helps manage Azure DevOps resources using the Azure CLI with Azure DevOps extension.
**CLI Version:** 2.81.0 (current as of 2025)
## Prerequisites
Install Azure CLI and Azure DevOps extension:
```bash
# Install Azure CLI
brew install azure-cli # macOS
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash # Linux
pip install azure-cli # via pip
# Verify installation
az --version
# Install Azure DevOps extension
az extension add --name azure-devops
az extension show --name azure-devops
```
## CLI Structure
```
az devops # Main DevOps commands
├── admin # Administration (banner)
├── extension # Extension management
├── project # Team projects
├── security # Security operations
│ ├── group # Security groups
│ └── permission # Security permissions
├── service-endpoint # Service connections
├── team # Teams
├── user # Users
├── wiki # Wikis
├── configure # Set defaults
├── invoke # Invoke REST API
├── login # Authenticate
└── logout # Clear credentials
az pipelines # Azure Pipelines
├── agent # Agents
├── build # Builds
├── folder # Pipeline folders
├── pool # Agent pools
├── queue # Agent queues
├── release # Releases
├── runs # Pipeline runs
├── variable # Pipeline variables
└── variable-group # Variable groups
az boards # Azure Boards
├── area # Area paths
├── iteration # Iterations
└── work-item # Work items
az repos # Azure Repos
├── import # Git imports
├── policy # Branch policies
├── pr # Pull requests
└── ref # Git references
az artifacts # Azure Artifacts
└── universal # Universal Packages
├── download # Download packages
└── publish # Publish packages
```
## Authentication
### Login to Azure DevOps
```bash
# Interactive login (prompts for PAT)
az devops login --organization https://dev.azure.com/{org}
# Login with PAT token
az devops login --organization https://dev.azure.com/{org} --token YOUR_PAT_TOKEN
# Logout
az devops logout --organization https://dev.azure.com/{org}
```
### Configure Defaults
```bash
# Set default organization and project
az devops configure --defaults organization=https://dev.azure.com/{org} project={project}
# List current configuration
az devops configure --list
# Enable Git aliases
az devops configure --use-git-aliases true
```
## Extension Management
### List Extensions
```bash
# List available extensions
az extension list-available --output table
# List installed extensions
az extension list --output table
```
### Manage Azure DevOps Extension
```bash
# Install Azure DevOps extension
az extension add --name azure-devops
# Update Azure DevOps extension
az extension update --name azure-devops
# Remove extension
az extension remove --name azure-devops
# Install from local path
az extension add --source ~/extensions/azure-devops.whl
```
## Projects
### List Projects
```bash
az devops project list --organization https://dev.azure.com/{org}
az devops project list --top 10 --output table
```
### Create Project
```bash
az devops project create \
--name myNewProject \
--organization https://dev.azure.com/{org} \
--description "My new DevOps project" \
--source-control git \
--visibility private
```
### Show Project Details
```bash
az devops project show --project {project-name} --org https://dev.azure.com/{org}
```
### Delete Project
```bash
az devops project delete --id {project-id} --org https://dev.azure.com/{org} --yes
```
## Repositories
### List Repositories
```bash
az repos list --org https://dev.azure.com/{org} --project {project}
az repos list --output table
```
### Show Repository Details
```bash
az repos show --repository {repo-name} --project {project}
```
### Create Repository
```bash
az repos create --name {repo-name} --project {project}
```
### Delete Repository
```bash
az repos delete --id {repo-id} --project {project} --yes
```
### Update Repository
```bash
az repos update --id {repo-id} --name {new-name} --project {project}
```
## Repository Import
### Import Git Repository
```bash
# Import from public Git repository
az repos import create \
--git-source-url https://github.com/user/repo \
--repository {repo-name}
# Import with authentication
az repos import create \
--git-source-url https://github.com/user/private-repo \
--repository {repo-name} \
--user {username} \
--password {password-or-pat}
```
## Pull Requests
### Create Pull Request
```bash
# Basic PR creation
az repos pr create \
--repository {repo} \
--source-branch {source-branch} \
--target-branch {target-branch} \
--title "PR Title" \
--description "PR description" \
--open
# PR with work items
az repos pr create \
--repository {repo} \
--source-branch {source-branch} \
--work-items 63 64
# Draft PR with reviewers
az repos pr create \
--repository {repo} \
--source-branch feature/new-feature \
--target-branch main \
--title "Feature: New functionality" \
--draft true \
--reviewers [email protected] [email protected] \
--required-reviewers [email protected] \
--labels "enhancement" "backlog"
```
### List Pull Requests
```bash
# All PRs
az repos pr list --repository {repo}
# Filter by status
az repos pr list --repository {repo} --status active
# Filter by creator
az repos pr list --repository {repo} --creator {email}
# Output as table
az repos pr list --repository {repo} --output table
```
### Show PR Details
```bash
az repos pr show --id {pr-id}
az repos pr show --id {pr-id} --open # Open in browser
```
### Update PR (Complete/Abandon/Draft)
```bash
# Complete PR
az repos pr update --id {pr-id} --status completed
# Abandon PR
az repos pr update --id {pr-id} --status abandoned
# Set to draft
az repos pr update --id {pr-id} --draft true
# Publish draft PR
az repos pr update --id {pr-id} --draft false
# Auto-complete when policies pass
az repos pr update --id {pr-id} --auto-complete true
# Set title and description
az repos pr update --id {pr-id} --title "New title" --description "New description"
```
### Checkout PR Locally
```bash
# Checkout PR branch
az repos pr checkout --id {pr-id}
# Checkout with specific remote
az repos pr checkout --id {pr-id} --remote-name upstream
```
### Vote on PR
```bash
az repos pr set-vote --id {pr-id} --vote approve
az repos pr set-vote --id {pr-id} --vote approve-with-suggestions
az repos pr set-vote --id {pr-id} --vote reject
az repos pr set-vote --id {pr-id} --vote wait-for-author
az repos pr set-vote --id {pr-id} --vote reset
```
### PR Reviewers
```bash
# Add reviewers
az repos pr reviewer add --id {pr-id} --reviewers [email protected] [email protected]
# List reviewers
az repos pr reviewer list --id {pr-id}
# Remove reviewers
az repos pr reviewer remove --id {pr-id} --reviewers [email protected]
```
### PR Work Items
```bash
# Add work items to PR
az repos pr work-item add --id {pr-id} --work-items {id1} {id2}
# List PR work items
az repos pr work-item list --id {pr-id}
# Remove work items from PR
az repos pr work-item remove --id {pr-id} --work-items {id1}
```
### PR Policies
```bash
# List policies for a PR
az repos pr policy list --id {pr-id}
# Queue policy evaluation for a PR
az repos pr policy queue --id {pr-id} --evaluation-id {evaluation-id}
```
## Pipelines
### List Pipelines
```bash
az pipelines list --output table
az pipelines list --query "[?name=='myPipeline']"
az pipelines list --folder-path 'folder/subfolder'
```
### Create Pipeline
```bash
# From local repository context (auto-detects settings)
az pipelines create --name 'ContosoBuild' --description 'Pipeline for contoso project'
# With specific branch and YAML path
az pipelines create \
--name {pipeline-name} \
--repository {repo} \
--branch main \
--yaml-path azure-pipelines.yml \
--description "My CI/CD pipeline"
# For GitHub repository
az pipelines create \
--name 'GitHubPipeline' \
--repository https://github.com/Org/Repo \
--branch main \
--repository-type github
# Skip first run
az pipelines create --name 'MyPipeline' --skip-run true
```
### Show Pipeline
```bash
az pipelines show --id {pipeline-id}
az pipelines show --name {pipeline-name}
```
### Update Pipeline
```bash
az pipelines update --id {pipeline-id} --name "New name" --description "Updated description"
```
### Delete Pipeline
```bash
az pipelines delete --id {pipeline-id} --yes
```
### Run Pipeline
```bash
# Run by name
az pipelines run --name {pipeline-name} --branch main
# Run by ID
az pipelines run --id {pipeline-id} --branch refs/heads/main
# With parameters
az pipelines run --name {pipeline-name} --parameters version=1.0.0 environment=prod
# With variables
az pipelines run --name {pipeline-name} --variables buildId=123 configuration=release
# Open results in browser
az pipelines run --name {pipeline-name} --open
```
## Pipeline Runs
### List Runs
```bash
az pipelines runs list --pipeline {pipeline-id}
az pipelines runs list --name {pipeline-name} --top 10
az pipelines runs list --branch main --status completed
```
### Show Run Details
```bash
az pipelines runs show --run-id {run-id}
az pipelines runs show --run-id {run-id} --open
```
### Pipeline Artifacts
```bash
# List artifacts for a run
az pipelines runs artifact list --run-id {run-id}
# Download artifact
az pipelines runs artifact download \
--artifact-name '{artifact-name}' \
--path {local-path} \
--run-id {run-id}
# Upload artifact
az pipelines runs artifact upload \
--artifact-name '{artifact-name}' \
--path {local-path} \
--run-id {run-id}
```
### Pipeline Run Tags
```bash
# Add tag to run
az pipelines runs tag add --run-id {run-id} --tags production v1.0
# List run tags
az pipelines runs tag list --run-id {run-id} --output table
```
## Builds
### List Builds
```bash
az pipelines build list
az pipelines build list --definition {build-definition-id}
az pipelines build list --status completed --result succeeded
```
### Queue Build
```bash
az pipelines build queue --definition {build-definition-id} --branch main
az pipelines build queue --definition {build-definition-id} --parameters version=1.0.0
```
### Show Build Details
```bash
az pipelines build show --id {build-id}
```
### Cancel Build
```bash
az pipelines build cancel --id {build-id}
```
### Build Tags
```bash
# Add tag to build
az pipelines build tag add --build-id {build-id} --tags prod release
# Delete tag from build
az pipelines build tag delete --build-id {build-id} --tag prod
```
## Build Definitions
### List Build Definitions
```bash
az pipelines build definition list
az pipelines build definition list --name {definition-name}
```
### Show Build Definition
```bash
az pipelines build definition show --id {definition-id}
```
## Releases
### List Releases
```bash
az pipelines release list
az pipelines release list --definition {release-definition-id}
```
### Create Release
```bash
az pipelines release create --definition {release-definition-id}
az pipelines release create --definition {release-definition-id} --description "Release v1.0"
```
### Show Release
```bash
az pipelines release show --id {release-id}
```
## Release Definitions
### List Release Definitions
```bash
az pipelines release definition list
```
### Show Release Definition
```bash
az pipelines release definition show --id {definition-id}
```
## Pipeline Variables
### List Variables
```bash
az pipelines variable list --pipeline-id {pipeline-id}
```
### Create Variable
```bash
# Non-secret variable
az pipelines variable create \
--name {var-name} \
--value {var-value} \
--pipeline-id {pipeline-id}
# Secret variable
az pipelines variable create \
--name {var-name} \
--secret true \
--pipeline-id {pipeline-id}
# Secret with prompt
az pipelines variable create \
--name {var-name} \
--secret true \
--prompt true \
--pipeline-id {pipeline-id}
```
### Update Variable
```bash
az pipelines variable update \
--name {var-name} \
--value {new-value} \
--pipeline-id {pipeline-id}
# Update secret variable
az pipelines variable update \
--name {var-name} \
--secret true \
--value "{new-secret-value}" \
--pipeline-id {pipeline-id}
```
### Delete Variable
```bash
az pipelines variable delete --name {var-name} --pipeline-id {pipeline-id} --yes
```
## Variable Groups
### List Variable Groups
```bash
az pipelines variable-group list
az pipelines variable-group list --output table
```
### Show Variable Group
```bash
az pipelines variable-group show --id {group-id}
```
### Create Variable Group
```bash
az pipelines variable-group create \
--name {group-name} \
--variables key1=value1 key2=value2 \
--authorize true
```
### Update Variable Group
```bash
az pipelines variable-group update \
--id {group-id} \
--name {new-name} \
--description "Updated description"
```
### Delete Variable Group
```bash
az pipelines variable-group delete --id {group-id} --yes
```
### Variable Group Variables
#### List Variables
```bash
az pipelines variable-group variable list --group-id {group-id}
```
#### Create Variable
```bash
# Non-secret variable
az pipelines variable-group variable create \
--group-id {group-id} \
--name {var-name} \
--value {var-value}
# Secret variable (will prompt for value if not provided)
az pipelines variable-group variable create \
--group-id {group-id} \
--name {var-name} \
--secret true
# Secret with environment variable
export AZURE_DEVOPS_EXT_PIPELINE_VAR_MySecret=secretvalue
az pipelines variable-group variable create \
--group-id {group-id} \
--name MySecret \
--secret true
```
#### Update Variable
```bash
az pipelines variable-group variable update \
--group-id {group-id} \
--name {var-name} \
--value {new-value} \
--secret false
```
#### Delete Variable
```bash
az pipelines variable-group variable delete \
--group-id {group-id} \
--name {var-name}
```
## Pipeline Folders
### List Folders
```bash
az pipelines folder list
```
### Create Folder
```bash
az pipelines folder create --path 'folder/subfolder' --description "My folder"
```
### Delete Folder
```bash
az pipelines folder delete --path 'folder/subfolder'
```
### Update Folder
```bash
az pipelines folder update --path 'old-folder' --new-path 'new-folder'
```
## Agent Pools
### List Agent Pools
```bash
az pipelines pool list
az pipelines pool list --pool-type automation
az pipelines pool list --pool-type deployment
```
### Show Agent Pool
```bash
az pipelines pool show --pool-id {pool-id}
```
## Agent Queues
### List Agent Queues
```bash
az pipelines queue list
az pipelines queue list --pool-name {pool-name}
```
### Show Agent Queue
```bash
az pipelines queue show --id {queue-id}
```
## Work Items (Boards)
### Query Work Items
```bash
# WIQL query
az boards query \
--wiql "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems WHERE [System.AssignedTo] = @Me AND [System.State] = 'Active'"
# Query with output format
az boards query --wiql "SELECT * FROM WorkItems" --output table
```
### Show Work Item
```bash
az boards work-item show --id {work-item-id}
az boards work-item show --id {work-item-id} --open
```
### Create Work Item
```bash
# Basic work item
az boards work-item create \
--title "Fix login bug" \
--type Bug \
--assigned-to [email protected] \
--description "Users cannot login with SSO"
# With area and iteration
az boards work-item create \
--title "New feature" \
--type "User Story" \
--area "Project\\Area1" \
--iteration "Project\\Sprint 1"
# With custom fields
az boards work-item create \
--title "Task" \
--type Task \
--fields "Priority=1" "Severity=2"
# With discussion comment
az boards work-item create \
--title "Issue" \
--type Bug \
--discussion "Initial investigation completed"
# Open in browser after creation
az boards work-item create --title "Bug" --type Bug --open
```
### Update Work Item
```bash
# Update state, title, and assignee
az boards work-item update \
--id {work-item-id} \
--state "Active" \
--title "Updated title" \
--assigned-to [email protected]
# Move to different area
az boards work-item update \
--id {work-item-id} \
--area "{ProjectName}\\{Team}\\{Area}"
# Change iteration
az boards work-item update \
--id {work-item-id} \
--iteration "{ProjectName}\\Sprint 5"
# Add comment/discussion
az boards work-item update \
--id {work-item-id} \
--discussion "Work in progress"
# Update with custom fields
az boards work-item update \
--id {work-item-id} \
--fields "Priority=1" "StoryPoints=5"
```
### Delete Work Item
```bash
# Soft delete (can be restored)
az boards work-item delete --id {work-item-id} --yes
# Permanent delete
az boards work-item delete --id {work-item-id} --destroy --yes
```
### Work Item Relations
```bash
# List relations
az boards work-item relation list --id {work-item-id}
# List supported relation types
az boards work-item relation list-type
# Add relation
az boards work-item relation add --id {work-item-id} --relation-type parent --target-id {parent-id}
# Remove relation
az boards work-item relation remove --id {work-item-id} --relation-id {relation-id}
```
## Area Paths
### List Areas for Project
```bash
az boards area project list --project {project}
az boards area project show --path "Project\\Area1" --project {project}
```
### Create Area
```bash
az boards area project create --path "Project\\NewArea" --project {project}
```
### Update Area
```bash
az boards area project update \
--path "Project\\OldArea" \
--new-path "Project\\UpdatedArea" \
--project {project}
```
### Delete Area
```bash
az boards area project delete --path "Project\\AreaToDelete" --project {project} --yes
```
### Area Team Management
```bash
# List areas for team
az boards area team list --team {team-name} --project {project}
# Add area to team
az boards area team add \
--team {team-name} \
--path "Project\\NewArea" \
--project {project}
# Remove area from team
az boards area team remove \
--team {team-name} \
--path "Project\\AreaToRemove" \
--project {project}
# Update team area
az boards area team update \
--team {team-name} \
--path "Project\\Area" \
--project {project} \
--include-sub-areas true
```
## Iterations
### List Iterations for Project
```bash
az boards iteration project list --project {project}
az boards iteration project show --path "Project\\Sprint 1" --project {project}
```
### Create Iteration
```bash
az boards iteration project create --path "Project\\Sprint 1" --project {project}
```
### Update Iteration
```bash
az boards iteration project update \
--path "Project\\OldSprint" \
--new-path "Project\\NewSprint" \
--project {project}
```
### Delete Iteration
```bash
az boards iteration project delete --path "Project\\OldSprint" --project {project} --yes
```
### List Iterations for Team
```bash
az boards iteration team list --team {team-name} --project {project}
```
### Add Iteration to Team
```bash
az boards iteration team add \
--team {team-name} \
--path "Project\\Sprint 1" \
--project {project}
```
### Remove Iteration from Team
```bash
az boards iteration team remove \
--team {team-name} \
--path "Project\\Sprint 1" \
--project {project}
```
### List Work Items in Iteration
```bash
az boards iteration team list-work-items \
--team {team-name} \
--path "Project\\Sprint 1" \
--project {project}
```
### Set Default Iteration for Team
```bash
az boards iteration team set-default-iteration \
--team {team-name} \
--path "Project\\Sprint 1" \
--project {project}
```
### Show Default Iteration
```bash
az boards iteration team show-default-iteration \
--team {team-name} \
--project {project}
```
### Set Backlog Iteration for Team
```bash
az boards iteration team set-backlog-iteration \
--team {team-name} \
--path "Project\\Sprint 1" \
--project {project}
```
### Show Backlog Iteration
```bash
az boards iteration team show-backlog-iteration \
--team {team-name} \
--project {project}
```
### Show Current Iteration
```bash
az boards iteration team show --team {team-name} --project {project} --timeframe current
```
## Git References
### List References (Branches)
```bash
az repos ref list --repository {repo}
az repos ref list --repository {repo} --query "[?name=='refs/heads/main']"
```
### Create Reference (Branch)
```bash
az repos ref create --name refs/heads/new-branch --object-type commit --object {commit-sha}
```
### Delete Reference (Branch)
```bash
az repos ref delete --name refs/heads/old-branch --repository {repo} --project {project}
```
### Lock Branch
```bash
az repos ref lock --name refs/heads/main --repository {repo} --project {project}
```
### Unlock Branch
```bash
az repos ref unlock --name refs/heads/main --repository {repo} --project {project}
```
## Repository Policies
### List All Policies
```bash
az repos policy list --repository {repo-id} --branch main
```
### Create Policy Using Configuration File
```bash
az repos policy create --config policy.json
```
### Update/Delete Policy
```bash
# Update
az repos policy update --id {policy-id} --config updated-policy.json
# Delete
az repos policy delete --id {policy-id} --yes
```
### Policy Types
#### Approver Count Policy
```bash
az repos policy approver-count create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id} \
--minimum-approver-count 2 \
--creator-vote-counts true
```
#### Build Policy
```bash
az repos policy build create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id} \
--build-definition-id {definition-id} \
--queue-on-source-update-only true \
--valid-duration 720
```
#### Work Item Linking Policy
```bash
az repos policy work-item-linking create \
--blocking true \
--branch main \
--enabled true \
--repository-id {repo-id}
```
#### Required Reviewer Policy
```bash
az repos policy required-reviewer create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id} \
--required-reviewers [email protected]
```
#### Merge Strategy Policy
```bash
az repos policy merge-strategy create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id} \
--allow-squash true \
--allow-rebase true \
--allow-no-fast-forward true
```
#### Case Enforcement Policy
```bash
az repos policy case-enforcement create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id}
```
#### Comment Required Policy
```bash
az repos policy comment-required create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id}
```
#### File Size Policy
```bash
az repos policy file-size create \
--blocking true \
--enabled true \
--branch main \
--repository-id {repo-id} \
--maximum-file-size 10485760 # 10MB in bytes
```
## Service Endpoints
### List Service Endpoints
```bash
az devops service-endpoint list --project {project}
az devops service-endpoint list --project {project} --output table
```
### Show Service Endpoint
```bash
az devops service-endpoint show --id {endpoint-id} --project {project}
```
### Create Service Endpoint
```bash
# Using configuration file
az devops service-endpoint create --service-endpoint-configuration endpoint.json --project {project}
```
### Delete Service Endpoint
```bash
az devops service-endpoint delete --id {endpoint-id} --project {project} --yes
```
## Teams
### List Teams
```bash
az devops team list --project {project}
```
### Show Team
```bash
az devops team show --team {team-name} --project {project}
```
### Create Team
```bash
az devops team create \
--name {team-name} \
--description "Team description" \
--project {project}
```
### Update Team
```bash
az devops team update \
--team {team-name} \
--project {project} \
--name "{new-team-name}" \
--description "Updated description"
```
### Delete Team
```bash
az devops team delete --team {team-name} --project {project} --yes
```
### Show Team Members
```bash
az devops team list-member --team {team-name} --project {project}
```
## Users
### List Users
```bash
az devops user list --org https://dev.azure.com/{org}
az devops user list --top 10 --output table
```
### Show User
```bash
az devops user show --user {user-id-or-email} --org https://dev.azure.com/{org}
```
### Add User
```bash
az devops user add \
--email [email protected] \
--license-type express \
--org https://dev.azure.com/{org}
```
### Update User
```bash
az devops user update \
--user {user-id-or-email} \
--license-type advanced \
--org https://dev.azure.com/{org}
```
### Remove User
```bash
az devops user remove --user {user-id-or-email} --org https://dev.azure.com/{org} --yes
```
## Security Groups
### List Groups
```bash
# List all groups in project
az devops security group list --project {project}
# List all groups in organization
az devops security group list --scope organization
# List with filtering
az devops security group list --project {project} --subject-types vstsgroup
```
### Show Group Details
```bash
az devops security group show --group-id {group-id}
```
### Create Group
```bash
az devops security group create \
--name {group-name} \
--description "Group description" \
--project {project}
```
### Update Group
```bash
az devops security group update \
--group-id {group-id} \
--name "{new-group-name}" \
--description "Updated description"
```
### Delete Group
```bash
az devops security group delete --group-id {group-id} --yes
```
### Group Memberships
```bash
# List memberships
az devops security group membership list --id {group-id}
# Add member
az devops security group membership add \
--group-id {group-id} \
--member-id {member-id}
# Remove member
az devops security group membership remove \
--group-id {group-id} \
--member-id {member-id} --yes
```
## Security Permissions
### List Namespaces
```bash
az devops security permission namespace list
```
### Show Namespace Details
```bash
# Show permissions available in a namespace
az devops security permission namespace show --namespace "GitRepositories"
```
### List Permissions
```bash
# List permissions for user/group and namespace
az devops security permission list \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project}
# List for specific token (repository)
az devops security permission list \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}"
```
### Show Permissions
```bash
az devops security permission show \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}"
```
### Update Permissions
```bash
# Grant permission
az devops security permission update \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}" \
--permission-mask "Pull,Contribute"
# Deny permission
az devops security permission update \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}" \
--permission-mask 0
```
### Reset Permissions
```bash
# Reset specific permission bits
az devops security permission reset \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}" \
--permission-mask "Pull,Contribute"
# Reset all permissions
az devops security permission reset-all \
--id {user-or-group-id} \
--namespace "GitRepositories" \
--project {project} \
--token "repoV2/{project}/{repository-id}" --yes
```
## Wikis
### List Wikis
```bash
# List all wikis in project
az devops wiki list --project {project}
# List all wikis in organization
az devops wiki list
```
### Show Wiki
```bash
az devops wiki show --wiki {wiki-name} --project {project}
az devops wiki show --wiki {wiki-name} --project {project} --open
```
### Create Wiki
```bash
# Create project wiki
az devops wiki create \
--name {wiki-name} \
--project {project} \
--type projectWiki
# Create code wiki from repository
az devops wiki create \
--name {wiki-name} \
--project {project} \
--type codeWiki \
--repository {repo-name} \
--mapped-path /wiki
```
### Delete Wiki
```bash
az devops wiki delete --wiki {wiki-id} --project {project} --yes
```
### Wiki Pages
```bash
# List pages
az devops wiki page list --wiki {wiki-name} --project {project}
# Show page
az devops wiki page show \
--wiki {wiki-name} \
--path "/page-name" \
--project {project}
# Create page
az devops wiki page create \
--wiki {wiki-name} \
--path "/new-page" \
--content "# New Page\n\nPage content here..." \
--project {project}
# Update page
az devops wiki page update \
--wiki {wiki-name} \
--path "/existing-page" \
--content "# Updated Page\n\nNew content..." \
--project {project}
# Delete page
az devops wiki page delete \
--wiki {wiki-name} \
--path "/old-page" \
--project {project} --yes
```
## Administration
### Banner Management
```bash
# List banners
az devops admin banner list
# Show banner details
az devops admin banner show --id {banner-id}
# Add new banner
az devops admin banner add \
--message "System maintenance scheduled" \
--level info # info, warning, error
# Update banner
az devops admin banner update \
--id {banner-id} \
--message "Updated message" \
--level warning \
--expiration-date "2025-12-31T23:59:59Z"
# Remove banner
az devops admin banner remove --id {banner-id}
```
## DevOps Extensions
Manage extensions installed in an Azure DevOps organization (different from CLI extensions).
```bash
# List installed extensions
az devops extension list --org https://dev.azure.com/{org}
# Search marketplace extensions
az devops extension search --search-query "docker"
# Show extension details
az devops extension show --ext-id {extension-id} --org https://dev.azure.com/{org}
# Install extension
az devops extension install \
--ext-id {extension-id} \
--org https://dev.azure.com/{org} \
--publisher {publisher-id}
# Enable extension
az devops extension enable \
--ext-id {extension-id} \
--org https://dev.azure.com/{org}
# Disable extension
az devops extension disable \
--ext-id {extension-id} \
--org https://dev.azure.com/{org}
# Uninstall extension
az devops extension uninstall \
--ext-id {extension-id} \
--org https://dev.azure.com/{org} --yes
```
## Universal Packages
### Publish Package
```bash
az artifacts universal publish \
--feed {feed-name} \
--name {package-name} \
--version {version} \
--path {package-path} \
--project {project}
```
### Download Package
```bash
az artifacts universal download \
--feed {feed-name} \
--name {package-name} \
--version {version} \
--path {download-path} \
--project {project}
```
## Agents
### List Agents in Pool
```bash
az pipelines agent list --pool-id {pool-id}
```
### Show Agent Details
```bash
az pipelines agent show --agent-id {agent-id} --pool-id {pool-id}
```
## Git Aliases
After enabling git aliases:
```bash
# Enable Git aliases
az devops configure --use-git-aliases true
# Use Git commands for DevOps operations
git pr create --target-branch main
git pr list
git pr checkout 123
```
## Output Formats
All commands support multiple output formats:
```bash
# Table format (human-readable)
az pipelines list --output table
# JSON format (default, machine-readable)
az pipelines list --output json
# JSONC (colored JSON)
az pipelines list --output jsonc
# YAML format
az pipelines list --output yaml
# YAMLC (colored YAML)
az pipelines list --output yamlc
# TSV format (tab-separated values)
az pipelines list --output tsv
# None (no output)
az pipelines list --output none
```
## JMESPath Queries
Filter and transform output:
```bash
# Filter by name
az pipelines list --query "[?name=='myPipeline']"
# Get specific fields
az pipelines list --query "[].{Name:name, ID:id}"
# Chain queries
az pipelines list --query "[?name.contains('CI')].{Name:name, ID:id}" --output table
# Get first result
az pipelines list --query "[0]"
# Get top N
az pipelines list --query "[0:5]"
```
## Global Arguments
Available on all commands:
- `--help` / `-h`: Show help
- `--output` / `-o`: Output format (json, jsonc, none, table, tsv, yaml, yamlc)
- `--query`: JMESPath query string
- `--verbose`: Increase logging verbosity
- `--debug`: Show all debug logs
- `--only-show-errors`: Only show errors, suppress warnings
- `--subscription`: Name or ID of subscription
## Common Parameters
| Parameter | Description |
| -------------------------- | ------------------------------------------------------------------- |
| `--org` / `--organization` | Azure DevOps organization URL (e.g., `https://dev.azure.com/{org}`) |
| `--project` / `-p` | Project name or ID |
| `--detect` | Auto-detect organization from git config |
| `--yes` / `-y` | Skip confirmation prompts |
| `--open` | Open in web browser |
## Common Workflows
### Create PR from current branch
```bash
CURRENT_BRANCH=$(git branch --show-current)
az repos pr create \
--source-branch $CURRENT_BRANCH \
--target-branch main \
--title "Feature: $(git log -1 --pretty=%B)" \
--open
```
### Create work item on pipeline failure
```bash
az boards work-item create \
--title "Build $BUILD_BUILDNUMBER failed" \
--type bug \
--org $SYSTEM_TEAMFOUNDATIONCOLLECTIONURI \
--project $SYSTEM_TEAMPROJECT
```
### Download latest pipeline artifact
```bash
RUN_ID=$(az pipelines runs list --pipeline {pipeline-id} --top 1 --query "[0].id" -o tsv)
az pipelines runs artifact download \
--artifact-name 'webapp' \
--path ./output \
--run-id $RUN_ID
```
### Approve and complete PR
```bash
# Vote approve
az repos pr set-vote --id {pr-id} --vote approve
# Complete PR
az repos pr update --id {pr-id} --status completed
```
### Create pipeline from local repo
```bash
# From local git repository (auto-detects repo, branch, etc.)
az pipelines create --name 'CI-Pipeline' --description 'Continuous Integration'
```
### Bulk update work items
```bash
# Query items and update in loop
for id in $(az boards query --wiql "SELECT [System.Id] FROM WorkItems WHERE [System.State]='New'" --query "[].id" -o tsv); do
az boards work-item update --id $id --state "Active"
done
```
## Best Practices
### Authentication and Security
```bash
# Use PAT from environment variable (most secure)
export AZURE_DEVOPS_EXT_PAT=$MY_PAT
az devops login --organization $ORG_URL
# Pipe PAT securely (avoids shell history)
echo $MY_PAT | az devops login --organization $ORG_URL
# Set defaults to avoid repetition
az devops configure --defaults organization=$ORG_URL project=$PROJECT
# Clear credentials after use
az devops logout --organization $ORG_URL
```
### Idempotent Operations
```bash
# Always use --detect for auto-detection
az devops configure --defaults organization=$ORG_URL project=$PROJECT
# Check existence before creation
if ! az pipelines show --id $PIPELINE_ID > /dev/null 2>&1; then
az pipelines create --name "$PIPELINE_NAME" --yaml-path azure-pipelines.yml
fi
# Use --output tsv for shell parsing
PIPELINE_ID=$(az pipelines list --query "[?name=='MyPipeline'].id" --output tsv)
# Use --output json for programmatic access
BUILD_STATUS=$(az pipelines build show --id $BUILD_ID --query "status" --output json)
```
### Script-Safe Output
```bash
# Suppress warnings and errors
az pipelines list --only-show-errors
# No output (useful for commands that only need to execute)
az pipelines run --name "$PIPELINE_NAME" --output none
# TSV format for shell scripts (clean, no formatting)
az repos pr list --output tsv --query "[].{ID:pullRequestId,Title:title}"
# JSON with specific fields
az pipelines list --output json --query "[].{Name:name, ID:id, URL:url}"
```
### Pipeline Orchestration
```bash
# Run pipeline and wait for completion
RUN_ID=$(az pipelines run --name "$PIPELINE_NAME" --query "id" -o tsv)
while true; do
STATUS=$(az pipelines runs show --run-id $RUN_ID --query "status" -o tsv)
if [[ "$STATUS" != "inProgress" && "$STATUS" != "notStarted" ]]; then
break
fi
sleep 10
done
# Check result
RESULT=$(az pipelines runs show --run-id $RUN_ID --query "result" -o tsv)
if [[ "$RESULT" == "succeeded" ]]; then
echo "Pipeline succeeded"
else
echo "Pipeline failed with result: $RESULT"
exit 1
fi
```
### Variable Group Management
```bash
# Create variable group idempotently
VG_NAME="production-variables"
VG_ID=$(az pipelines variable-group list --query "[?name=='$VG_NAME'].id" -o tsv)
if [[ -z "$VG_ID" ]]; then
VG_ID=$(az pipelines variable-group create \
--name "$VG_NAME" \
--variables API_URL=$API_URL API_KEY=$API_KEY \
--authorize true \
--query "id" -o tsv)
echo "Created variable group with ID: $VG_ID"
else
echo "Variable group already exists with ID: $VG_ID"
fi
```
### Service Connection Automation
```bash
# Create service connection using configuration file
# Use an unquoted heredoc delimiter so the $VARS below expand
cat > service-connection.json <<EOF
{
"data": {
"subscriptionId": "$SUBSCRIPTION_ID",
"subscriptionName": "My Subscription",
"creationMode": "Manual",
"serviceEndpointId": "$SERVICE_ENDPOINT_ID"
},
"url": "https://management.azure.com/",
"authorization": {
"parameters": {
"tenantid": "$TENANT_ID",
"serviceprincipalid": "$SP_ID",
"authenticationType": "spnKey",
"serviceprincipalkey": "$SP_KEY"
},
"scheme": "ServicePrincipal"
},
"type": "azurerm",
"isShared": false,
"isReady": true
}
EOF
az devops service-endpoint create \
--service-endpoint-configuration service-connection.json \
--project "$PROJECT"
```
### Pull Request Automation
```bash
# Create PR with work items and reviewers
PR_ID=$(az repos pr create \
--repository "$REPO_NAME" \
--source-branch "$FEATURE_BRANCH" \
--target-branch main \
--title "Feature: $(git log -1 --pretty=%B)" \
--description "$(git log -1 --pretty=%B)" \
--work-items $WORK_ITEM_1 $WORK_ITEM_2 \
--reviewers "$REVIEWER_1" "$REVIEWER_2" \
--required-reviewers "$LEAD_EMAIL" \
--labels "enhancement" "backlog" \
--open \
--query "pullRequestId" -o tsv)
# Set auto-complete when policies pass
az repos pr update --id $PR_ID --auto-complete true
```
## Error Handling and Retry Patterns
### Retry Logic for Transient Failures
```bash
# Retry function for network operations
retry_command() {
local max_attempts=3
local attempt=1
local delay=5
while [[ $attempt -le $max_attempts ]]; do
if "$@"; then
return 0
fi
echo "Attempt $attempt failed. Retrying in ${delay}s..."
sleep $delay
((attempt++))
delay=$((delay * 2))
done
echo "All $max_attempts attempts failed"
return 1
}
# Usage
retry_command az pipelines run --name "$PIPELINE_NAME"
```
### Check and Handle Errors
```bash
# Check if pipeline exists before operations
PIPELINE_ID=$(az pipelines list --query "[?name=='$PIPELINE_NAME'].id" -o tsv)
if [[ -z "$PIPELINE_ID" ]]; then
echo "Pipeline not found. Creating..."
az pipelines create --name "$PIPELINE_NAME" --yaml-path azure-pipelines.yml
else
echo "Pipeline exists with ID: $PIPELINE_ID"
fi
```
### Validate Inputs
```bash
# Validate required parameters
if [[ -z "$PROJECT" || -z "$REPO" ]]; then
echo "Error: PROJECT and REPO must be set"
exit 1
fi
# Check if branch exists
if ! az repos ref list --repository "$REPO" --query "[?name=='refs/heads/$BRANCH']" -o tsv | grep -q .; then
echo "Error: Branch $BRANCH does not exist"
exit 1
fi
```
### Handle Permission Errors
```bash
# Try operation, handle permission errors
if az devops security permission update \
--id "$USER_ID" \
--namespace "GitRepositories" \
--project "$PROJECT" \
--token "repoV2/$PROJECT/$REPO_ID" \
--allow-bit 2 \
--deny-bit 0 2>&1 | grep -q "unauthorized"; then
echo "Error: Insufficient permissions to update repository permissions"
exit 1
fi
```
### Pipeline Failure Notification
```bash
# Run pipeline and check result
RUN_ID=$(az pipelines run --name "$PIPELINE_NAME" --query "id" -o tsv)
# Wait for completion
while true; do
STATUS=$(az pipelines runs show --run-id $RUN_ID --query "status" -o tsv)
if [[ "$STATUS" != "inProgress" && "$STATUS" != "notStarted" ]]; then
break
fi
sleep 10
done
# Check result and create work item on failure
RESULT=$(az pipelines runs show --run-id $RUN_ID --query "result" -o tsv)
if [[ "$RESULT" != "succeeded" ]]; then
BUILD_NUMBER=$(az pipelines runs show --run-id $RUN_ID --query "buildNumber" -o tsv)
az boards work-item create \
--title "Build $BUILD_NUMBER failed" \
--type Bug \
--description "Pipeline run $RUN_ID failed with result: $RESULT\n\nURL: $ORG_URL/$PROJECT/_build/results?buildId=$RUN_ID"
fi
```
### Graceful Degradation
```bash
# Try to download artifact, fallback to alternative source
if ! az pipelines runs artifact download \
--artifact-name 'webapp' \
--path ./output \
--run-id $RUN_ID 2>/dev/null; then
echo "Warning: Failed to download from pipeline run. Falling back to backup source..."
# Alternative download method
curl -L "$BACKUP_URL" -o ./output/backup.zip
fi
```
## Advanced JMESPath Queries
### Filtering and Sorting
```bash
# Filter by multiple conditions
az pipelines list --query "[?name.contains('CI') && enabled==true]"
# Filter by status and result
az pipelines runs list --query "[?status=='completed' && result=='succeeded']"
# Sort by date (descending)
az pipelines runs list --query "sort_by([?status=='completed'], &finishTime | reverse(@))"
# Get top N items after filtering
az pipelines runs list --query "[?result=='succeeded'] | [0:5]"
```
### Nested Queries
```bash
# Extract nested properties
az pipelines show --id $PIPELINE_ID --query "{Name:name, Repo:repository.{Name:name, Type:type}, Folder:folder}"
# Query build details
az pipelines build show --id $BUILD_ID --query "{ID:id, Number:buildNumber, Status:status, Result:result, Requested:requestedFor.displayName}"
```
### Complex Filtering
```bash
# Find pipelines with specific YAML path
az pipelines list --query "[?process.type.name=='yaml' && process.yamlFilename=='azure-pipelines.yml']"
# Find PRs from specific reviewer
az repos pr list --query "[?contains(reviewers[?displayName=='John Doe'].displayName, 'John Doe')]"
# Find work items with specific iteration and state
az boards work-item show --id $WI_ID --query "{Title:fields['System.Title'], State:fields['System.State'], Iteration:fields['System.IterationPath']}"
```
### Aggregation
```bash
# Count completed runs by result (standard JMESPath has no group_by; use length() per bucket)
az pipelines runs list --query "length([?result=='succeeded'])"
az pipelines runs list --query "length([?result=='failed'])"
# Get unique reviewers (no unique_by in standard JMESPath; dedupe in the shell)
az repos pr list --query "[].reviewers[].displayName" -o tsv | sort -u
```
### Conditional Transformation
```bash
# JMESPath cannot reformat dates; post-process with jq instead
az pipelines runs list -o json | jq -r '.[] | "\(.id) \(.createdDate[0:16])"'
# Conditional output (JMESPath has no ternary; use && / || short-circuiting)
az pipelines list --query "[].{Name:name, Status:enabled && 'Enabled' || 'Disabled'}"
# Extract with defaults
az pipelines show --id $PIPELINE_ID --query "{Name:name, Folder:folder || 'Root', Description:description || 'No description'}"
```
### Complex Workflows
```bash
# Most recently queued successful builds
az pipelines build list --query "sort_by([?result=='succeeded'], &queueTime) | reverse(@) | [0:3].{ID:id, Number:buildNumber, Queued:queueTime}"
# PR count per reviewer (aggregate in the shell)
az repos pr list --query "[].reviewers[].displayName" -o tsv | sort | uniq -c | sort -rn
# List child work item IDs of a parent (parse relation URLs with jq)
az boards work-item relation show --id $PARENT_ID -o json | jq -r '.relations[] | select(.rel=="System.LinkTypes.Hierarchy-Forward") | .url | split("/") | last'
```
## Scripting Patterns for Idempotent Operations
### Create or Update Pattern
```bash
# Ensure pipeline exists, create if missing
ensure_pipeline() {
local name=$1
local yaml_path=$2
# A "[]" result from --query is never an empty string, so test the extracted ID instead
PIPELINE_ID=$(az pipelines list --query "[?name=='$name'] | [0].id" -o tsv)
if [[ -z "$PIPELINE_ID" ]]; then
echo "Creating pipeline: $name"
az pipelines create --name "$name" --yaml-path "$yaml_path"
else
echo "Pipeline exists: $name (ID: $PIPELINE_ID)"
fi
}
```
### Ensure Variable Group
```bash
# Create variable group if missing; print its ID on stdout
ensure_variable_group() {
local vg_name=$1
shift
local variables=("$@")
VG_ID=$(az pipelines variable-group list --query "[?name=='$vg_name'].id" -o tsv)
if [[ -z "$VG_ID" ]]; then
# Status messages go to stderr so callers can capture the ID cleanly
echo "Creating variable group: $vg_name" >&2
VG_ID=$(az pipelines variable-group create \
--name "$vg_name" \
--variables "${variables[@]}" \
--authorize true \
--query "id" -o tsv)
else
echo "Variable group exists: $vg_name (ID: $VG_ID)" >&2
fi
echo "$VG_ID"
}
```
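Because the helper's status messages go to stderr, callers can capture just the ID. A minimal usage sketch (names are illustrative):
```bash
# Capture only the variable group ID; progress messages still reach the terminal
VG_ID=$(ensure_variable_group "dev-vars" "ENV=dev" "API_URL=api-dev.com")
echo "Using variable group: $VG_ID"
```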
### Ensure Service Connection
```bash
# Check if service connection exists, create if not
ensure_service_connection() {
local name=$1
local project=$2
SC_ID=$(az devops service-endpoint list \
--project "$project" \
--query "[?name=='$name'].id" \
-o tsv)
if [[ -z "$SC_ID" ]]; then
echo "Service connection not found. Creating..."
# Create logic here
else
echo "Service connection exists: $name"
echo "$SC_ID"
fi
}
```
### Idempotent Work Item Creation
```bash
# Create work item only if none exists with the same title
create_work_item_if_new() {
local title=$1
local type=$2
WI_ID=$(az boards query \
--wiql "SELECT [System.Id] FROM WorkItems WHERE [System.WorkItemType]='$type' AND [System.Title]='$title'" \
--query "[0].id" -o tsv)
if [[ -z "$WI_ID" ]]; then
echo "Creating work item: $title" >&2
WI_ID=$(az boards work-item create --title "$title" --type "$type" --query "id" -o tsv)
else
echo "Work item exists: $title (ID: $WI_ID)" >&2
fi
echo "$WI_ID"
}
```
### Bulk Idempotent Operations
```bash
# Ensure multiple pipelines exist
declare -a PIPELINES=(
"ci-pipeline:azure-pipelines.yml"
"deploy-pipeline:deploy.yml"
"test-pipeline:test.yml"
)
for pipeline in "${PIPELINES[@]}"; do
IFS=':' read -r name yaml <<< "$pipeline"
ensure_pipeline "$name" "$yaml"
done
```
### Configuration Synchronization
```bash
# Sync variable groups from config file
sync_variable_groups() {
local config_file=$1
while IFS=',' read -r vg_name variables; do
# Remaining CSV fields land in $variables; turn commas into separate args
ensure_variable_group "$vg_name" ${variables//,/ }
done < "$config_file"
}
# config.csv format:
# prod-vars,API_URL=prod.com,API_KEY=secret123
# dev-vars,API_URL=dev.com,API_KEY=secret456
```
## Real-World Workflows
### CI/CD Pipeline Setup
```bash
# Setup complete CI/CD pipeline
setup_cicd_pipeline() {
local project=$1
local repo=$2
local branch=$3
# Create variable groups
VG_DEV=$(ensure_variable_group "dev-vars" "ENV=dev API_URL=api-dev.com")
VG_PROD=$(ensure_variable_group "prod-vars" "ENV=prod API_URL=api-prod.com")
# Create CI pipeline
az pipelines create \
--name "$repo-CI" \
--repository "$repo" \
--branch "$branch" \
--yaml-path .azure/pipelines/ci.yml \
--skip-run true
# Create CD pipeline
az pipelines create \
--name "$repo-CD" \
--repository "$repo" \
--branch "$branch" \
--yaml-path .azure/pipelines/cd.yml \
--skip-run true
echo "CI/CD pipeline setup complete"
}
```
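A sample invocation; the project, repository, and branch names are placeholders:
```bash
setup_cicd_pipeline "MyProject" "my-repo" "main"
```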
### Automated PR Creation
```bash
# Create PR from feature branch with automation
create_automated_pr() {
local branch=$1
local title=$2
# Get branch info
LAST_COMMIT=$(git log -1 --pretty=%B "$branch")
COMMIT_SHA=$(git rev-parse "$branch")
# Find related work items
WORK_ITEMS=$(az boards query \
--wiql "SELECT ID FROM WorkItems WHERE [System.ChangedBy] = @Me AND [System.State] = 'Active'" \
--query "[].id" -o tsv)
# Create PR
PR_ID=$(az repos pr create \
--source-branch "$branch" \
--target-branch main \
--title "$title" \
--description "$LAST_COMMIT" \
--work-items $WORK_ITEMS \
--auto-complete true \
--query "pullRequestId" -o tsv)
# Set required reviewers
az repos pr reviewer add \
--id $PR_ID \
--reviewers $(git log -1 --pretty=format:'%ae' "$branch") \
--required true
echo "Created PR #$PR_ID"
}
```
### Pipeline Monitoring and Alerting
```bash
# Monitor pipeline and alert on failure
monitor_pipeline() {
local pipeline_name=$1
local slack_webhook=$2
while true; do
# Get the pipeline definition ID, then its latest run
PIPELINE_ID=$(az pipelines list --query "[?name=='$pipeline_name'] | [0].id" -o tsv)
RUNS=$(az pipelines runs list --pipeline-ids $PIPELINE_ID --top 1)
LATEST_RUN_ID=$(echo "$RUNS" | jq -r '.[0].id')
RESULT=$(echo "$RUNS" | jq -r '.[0].result')
# Alert on failure (note: this re-alerts every cycle; track seen run IDs to dedupe)
if [[ "$RESULT" == "failed" ]]; then
# Send Slack alert
curl -X POST "$slack_webhook" \
-H 'Content-Type: application/json' \
-d "{\"text\": \"Pipeline $pipeline_name failed! Run ID: $LATEST_RUN_ID\"}"
fi
sleep 300 # Check every 5 minutes
done
}
```
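Since the function loops forever, it is typically backgrounded; the pipeline name and webhook URL below are placeholders:
```bash
# Run the monitor in the background and keep its PID for later cleanup
monitor_pipeline "nightly-build" "https://hooks.slack.com/services/T000/B000/XXXX" &
MONITOR_PID=$!
```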
### Bulk Work Item Management
```bash
# Bulk update work items based on query
bulk_update_work_items() {
local wiql=$1
shift
local updates=("$@")
# Query work items
WI_IDS=$(az boards query --wiql "$wiql" --query "[].id" -o tsv)
# Update each work item
for wi_id in $WI_IDS; do
az boards work-item update --id $wi_id "${updates[@]}"
echo "Updated work item: $wi_id"
done
}
# Usage: bulk_update_work_items "SELECT [System.Id] FROM WorkItems WHERE [System.State]='New'" --state "Active" --assigned-to "user@example.com"
```
### Branch Policy Automation
```bash
# Apply branch policies to all repositories
apply_branch_policies() {
local branch=$1
local project=$2
# Get all repositories
REPOS=$(az repos list --project "$project" --query "[].id" -o tsv)
for repo_id in $REPOS; do
echo "Applying policies to repo: $repo_id"
# Require minimum approvers
az repos policy approver-count create \
--blocking true \
--enabled true \
--branch "$branch" \
--repository-id "$repo_id" \
--minimum-approver-count 2 \
--creator-vote-counts true
# Require work item linking
az repos policy work-item-linking create \
--blocking true \
--branch "$branch" \
--enabled true \
--repository-id "$repo_id"
# Require build validation
BUILD_ID=$(az pipelines list --query "[?name=='CI'].id" -o tsv | head -1)
az repos policy build create \
--blocking true \
--enabled true \
--branch "$branch" \
--repository-id "$repo_id" \
--build-definition-id "$BUILD_ID" \
--queue-on-source-update-only true
done
}
```
### Multi-Environment Deployment
```bash
# Deploy across multiple environments
deploy_to_environments() {
local run_id=$1
shift
local environments=("$@")
# Download artifacts
ARTIFACT_NAME=$(az pipelines runs artifact list --run-id $run_id --query "[0].name" -o tsv)
az pipelines runs artifact download \
--artifact-name "$ARTIFACT_NAME" \
--path ./artifacts \
--run-id $run_id
# Deploy to each environment
for env in "${environments[@]}"; do
echo "Deploying to: $env"
# Get environment-specific variables
VG_ID=$(az pipelines variable-group list --query "[?name=='$env-vars'].id" -o tsv)
# Run deployment pipeline
DEPLOY_RUN_ID=$(az pipelines run \
--name "Deploy-$env" \
--variables ARTIFACT_PATH=./artifacts ENV="$env" \
--query "id" -o tsv)
# Wait for deployment
while true; do
STATUS=$(az pipelines runs show --run-id $DEPLOY_RUN_ID --query "status" -o tsv)
if [[ "$STATUS" != "inProgress" ]]; then
break
fi
sleep 10
done
done
}
```
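Environments are passed as trailing arguments after the run ID, for example:
```bash
# Deploy the artifacts of run 1234 to three environments in sequence
deploy_to_environments 1234 dev staging prod
```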
## Enhanced Global Arguments
| Parameter | Description |
| -------------------- | ---------------------------------------------------------- |
| `--help` / `-h` | Show command help |
| `--output` / `-o` | Output format (json, jsonc, none, table, tsv, yaml, yamlc) |
| `--query` | JMESPath query string for filtering output |
| `--verbose` | Increase logging verbosity |
| `--debug` | Show all debug logs |
| `--only-show-errors` | Only show errors, suppress warnings |
| `--subscription` | Name or ID of subscription |
| `--yes` / `-y` | Skip confirmation prompts |
## Enhanced Common Parameters
| Parameter | Description |
| -------------------------- | ------------------------------------------------------------------- |
| `--org` / `--organization` | Azure DevOps organization URL (e.g., `https://dev.azure.com/{org}`) |
| `--project` / `-p` | Project name or ID |
| `--detect` | Auto-detect organization from git config |
| `--yes` / `-y` | Skip confirmation prompts |
| `--open` | Open resource in web browser |
| `--subscription` | Azure subscription (for Azure resources) |
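These arguments compose freely. A sketch combining organization, project, query, and output formatting (the organization and project values are placeholders):
```bash
# List enabled pipelines in one project as a table
az pipelines list \
--org "https://dev.azure.com/contoso" \
--project "MyProject" \
--query "[?enabled].{Name:name, Folder:folder}" \
--output table
```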
## Getting Help
```bash
# General help
az devops --help
# Help for specific command group
az pipelines --help
az repos pr --help
# Help for specific command
az repos pr create --help
# Search for examples
az find "az repos pr create"
```

Analyze Azure resource groups and generate detailed Mermaid architecture diagrams showing the relationships between individual resources. Use this skill when the user asks for a diagram of their Azure resources or help in understanding how the resources relate to each other.
# Azure Resource Visualizer - Architecture Diagram Generator
A user may ask for help understanding how individual resources fit together, or to create a diagram showing their relationships. Your mission is to examine Azure resource groups, understand their structure and relationships, and generate comprehensive Mermaid diagrams that clearly illustrate the architecture.
## Core Responsibilities
1. **Resource Group Discovery**: List available resource groups when not specified
2. **Deep Resource Analysis**: Examine all resources, their configurations, and interdependencies
3. **Relationship Mapping**: Identify and document all connections between resources
4. **Diagram Generation**: Create detailed, accurate Mermaid diagrams
5. **Documentation Creation**: Produce clear markdown files with embedded diagrams
## Workflow Process
### Step 1: Resource Group Selection
If the user hasn't specified a resource group:
1. Use your tools to query available resource groups. If you do not have a tool for this, use `az` (see the sketch after this list).
2. Present a numbered list of resource groups with their locations
3. Ask the user to select one by number or name
4. Wait for user response before proceeding
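A minimal `az` sketch for the listing step (the output shaping is one option among several):
```bash
# Present a numbered list of resource groups for selection
az group list --query "[].{Name:name, Location:location}" -o tsv | nl -w2 -s') '
```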
If a resource group is specified, validate it exists and proceed.
### Step 2: Resource Discovery & Analysis
Once you have the resource group:
1. **Query all resources** in the resource group using Azure MCP tools or `az` (a CLI sketch follows this list).
2. **Analyze each resource** type and capture:
- Resource name and type
- SKU/tier information
- Location/region
- Key configuration properties
- Network settings (VNets, subnets, private endpoints)
- Identity and access (Managed Identity, RBAC)
- Dependencies and connections
3. **Map relationships** by identifying:
- **Network connections**: VNet peering, subnet assignments, NSG rules, private endpoints
- **Data flow**: Apps → Databases, Functions → Storage, API Management → Backends
- **Identity**: Managed identities connecting to resources
- **Configuration**: App Settings pointing to Key Vaults, connection strings
- **Dependencies**: Parent-child relationships, required resources
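A hedged `az` sketch for these discovery queries; the resource group and VNet names are illustrative:
```bash
# Inventory all resources in the group with their types
az resource list --resource-group rg-prod-app \
--query "[].{Name:name, Type:type, Location:location}" -o table
# Drill into one resource's network configuration
az network vnet show --resource-group rg-prod-app --name vnet-prod \
--query "{AddressSpace:addressSpace.addressPrefixes, Subnets:subnets[].name}"
```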
### Step 3: Diagram Construction
Create a **detailed Mermaid diagram** using the `graph TB` (top-to-bottom) or `graph LR` (left-to-right) format:
**Diagram Structure Guidelines:**
```mermaid
graph TB
%% Use subgraphs to group related resources
subgraph "Resource Group: [name]"
subgraph "Network Layer"
VNET[Virtual Network<br/>10.0.0.0/16]
SUBNET1[Subnet: web<br/>10.0.1.0/24]
SUBNET2[Subnet: data<br/>10.0.2.0/24]
NSG[Network Security Group]
end
subgraph "Compute Layer"
APP[App Service<br/>Plan: P1v2]
FUNC[Function App<br/>Runtime: .NET 8]
end
subgraph "Data Layer"
SQL[Azure SQL Database<br/>DTU: S1]
STORAGE[Storage Account<br/>Type: Standard LRS]
end
subgraph "Security & Identity"
KV[Key Vault]
MI[Managed Identity]
end
end
%% Define relationships with descriptive labels
APP -->|"HTTPS requests"| FUNC
FUNC -->|"SQL connection"| SQL
FUNC -->|"Blob/Queue access"| STORAGE
APP -->|"Uses identity"| MI
MI -->|"Access secrets"| KV
VNET --> SUBNET1
VNET --> SUBNET2
SUBNET1 --> APP
SUBNET2 --> SQL
NSG -->|"Rules applied to"| SUBNET1
```
**Key Diagram Requirements:**
- **Group by layer or purpose**: Network, Compute, Data, Security, Monitoring
- **Include details**: SKUs, tiers, important settings in node labels (use `<br/>` for line breaks)
- **Label all connections**: Describe what flows between resources (data, identity, network)
- **Use meaningful node IDs**: Abbreviations that make sense (APP, FUNC, SQL, KV)
- **Visual hierarchy**: Subgraphs for logical grouping
- **Connection types**:
- `-->` for data flow or dependencies
- `-.->` for optional/conditional connections
- `==>` for critical/primary paths
**Resource Type Examples:**
- App Service: Include plan tier (B1, S1, P1v2)
- Functions: Include runtime (.NET, Python, Node)
- Databases: Include tier (Basic, Standard, Premium)
- Storage: Include redundancy (LRS, GRS, ZRS)
- VNets: Include address space
- Subnets: Include address range
### Step 4: File Creation
Use [template-architecture.md](./assets/template-architecture.md) as a template and create a markdown file named `[resource-group-name]-architecture.md` with:
1. **Header**: Resource group name, subscription, region
2. **Summary**: Brief overview of the architecture (2-3 paragraphs)
3. **Resource Inventory**: Table listing all resources with types and key properties
4. **Architecture Diagram**: The complete Mermaid diagram
5. **Relationship Details**: Explanation of key connections and data flows
6. **Notes**: Any important observations, potential issues, or recommendations
## Operating Guidelines
### Quality Standards
- **Accuracy**: Verify all resource details before including in diagram
- **Completeness**: Don't omit resources; include everything in the resource group
- **Clarity**: Use clear, descriptive labels and logical grouping
- **Detail Level**: Include configuration details that matter for architecture understanding
- **Relationships**: Show ALL significant connections, not just obvious ones
### Tool Usage Patterns
1. **Azure MCP Search**:
- Use `intent="list resource groups"` to discover resource groups
- Use `intent="list resources in group"` with group name to get all resources
- Use `intent="get resource details"` for individual resource analysis
- Use `command` parameter when you need specific Azure operations
2. **File Creation**:
- Always create in workspace root or a `docs/` folder if it exists
- Use clear, descriptive filenames: `[rg-name]-architecture.md`
- Ensure Mermaid syntax is valid (test syntax mentally before output)
3. **Terminal (when needed)**:
- Use Azure CLI for complex queries not available via MCP
- Example: `az resource list --resource-group <name> --output json`
- Example: `az network vnet show --resource-group <name> --name <vnet-name>`
### Constraints & Boundaries
**Always Do:**
- ✅ List resource groups if not specified
- ✅ Wait for user selection before proceeding
- ✅ Analyze ALL resources in the group
- ✅ Create detailed, accurate diagrams
- ✅ Include configuration details in node labels
- ✅ Group resources logically with subgraphs
- ✅ Label all connections descriptively
- ✅ Create a complete markdown file with diagram
**Never Do:**
- ❌ Skip resources because they seem unimportant
- ❌ Make assumptions about resource relationships without verification
- ❌ Create incomplete or placeholder diagrams
- ❌ Omit configuration details that affect architecture
- ❌ Proceed without confirming resource group selection
- ❌ Generate invalid Mermaid syntax
- ❌ Modify or delete Azure resources (read-only analysis)
### Edge Cases & Error Handling
- **No resources found**: Inform user and verify resource group name
- **Permission issues**: Explain what's missing and suggest checking RBAC
- **Complex architectures (50+ resources)**: Consider creating multiple diagrams by layer
- **Cross-resource-group dependencies**: Note external dependencies in diagram notes
- **Resources without clear relationships**: Group in "Other Resources" section
## Output Format Specifications
### Mermaid Diagram Syntax
- Use `graph TB` (top-to-bottom) for vertical layouts
- Use `graph LR` (left-to-right) for horizontal layouts (better for wide architectures)
- Subgraph syntax: `subgraph "Descriptive Name"`
- Node syntax: `ID["Display Name<br/>Details"]`
- Connection syntax: `SOURCE -->|"Label"| TARGET`
### Markdown Structure
- Use H1 for main title
- Use H2 for major sections
- Use H3 for subsections
- Use tables for resource inventories
- Use bullet lists for notes and recommendations
- Use code blocks with `mermaid` language tag for diagrams
## Example Interaction
**User**: "Analyze my production resource group"
**Agent**:
1. Lists all resource groups in subscription
2. Asks user to select: "Which resource group? 1) rg-prod-app, 2) rg-dev-app, 3) rg-shared"
3. User selects: "1"
4. Queries all resources in rg-prod-app
5. Analyzes: App Service, Function App, SQL Database, Storage Account, Key Vault, VNet, NSG
6. Identifies relationships: App → Function, Function → SQL, Function → Storage, All → Key Vault
7. Creates detailed Mermaid diagram with subgraphs
8. Generates `rg-prod-app-architecture.md` with complete documentation
9. Displays: "Created architecture diagram in rg-prod-app-architecture.md. Found 7 resources with 8 key relationships."
## Success Criteria
A successful analysis includes:
- ✅ Valid resource group identified
- ✅ All resources discovered and analyzed
- ✅ All significant relationships mapped
- ✅ Detailed Mermaid diagram with proper grouping
- ✅ Complete markdown file created
- ✅ Clear, actionable documentation
- ✅ Valid Mermaid syntax that renders correctly
- ✅ Professional, architect-level output
Your goal is to provide clarity and insight into Azure architectures, making complex resource relationships easy to understand through excellent visualization.

When the user asks for guidance on which role to assign to an identity given desired permissions, this agent helps them find the role that meets those requirements with least-privilege access and shows how to apply it.
Use the 'Azure MCP/documentation' tool to find the minimal built-in role definition that matches the permissions the user wants to assign to an identity. If no built-in role matches, use the 'Azure MCP/extension_cli_generate' tool to create a custom role definition with the desired permissions. Then use 'Azure MCP/extension_cli_generate' to generate the CLI commands needed to assign that role to the identity, and use the 'Azure MCP/bicepschema' and 'Azure MCP/get_bestpractices' tools to provide a Bicep code snippet for adding the role assignment.

Helps create, configure, and deploy Azure Static Web Apps using the SWA CLI. Use when deploying static sites to Azure, setting up SWA local development, configuring staticwebapp.config.json, adding Azure Functions APIs to SWA, or setting up GitHub Actions CI/CD for Static Web Apps.
## Overview
Azure Static Web Apps (SWA) hosts static frontends with optional serverless API backends. The SWA CLI (`swa`) provides local development emulation and deployment capabilities.
**Key features:**
- Local emulator with API proxy and auth simulation
- Framework auto-detection and configuration
- Direct deployment to Azure
- Database connections support
**Config files:**
- `swa-cli.config.json` - CLI settings, **created by `swa init`** (never create manually)
- `staticwebapp.config.json` - Runtime config (routes, auth, headers, API runtime) - can be created manually
## General Instructions
### Installation
```bash
npm install -D @azure/static-web-apps-cli
```
Verify: `npx swa --version`
### Quick Start Workflow
**IMPORTANT: Always use `swa init` to create configuration files. Never manually create `swa-cli.config.json`.**
1. `swa init` - **Required first step** - auto-detects framework and creates `swa-cli.config.json`
2. `swa start` - Run local emulator at `http://localhost:4280`
3. `swa login` - Authenticate with Azure
4. `swa deploy` - Deploy to Azure
### Configuration Files
**swa-cli.config.json** - Created by `swa init`, do not create manually:
- Run `swa init` for interactive setup with framework detection
- Run `swa init --yes` to accept auto-detected defaults
- Edit the generated file only to customize settings after initialization
Example of generated config (for reference only):
```json
{
"$schema": "https://aka.ms/azure/static-web-apps-cli/schema",
"configurations": {
"app": {
"appLocation": ".",
"apiLocation": "api",
"outputLocation": "dist",
"appBuildCommand": "npm run build",
"run": "npm run dev",
"appDevserverUrl": "http://localhost:3000"
}
}
}
```
**staticwebapp.config.json** (in app source or output folder) - This file CAN be created manually for runtime configuration:
```json
{
"navigationFallback": {
"rewrite": "/index.html",
"exclude": ["/images/*", "/css/*"]
},
"routes": [
{ "route": "/api/*", "allowedRoles": ["authenticated"] }
],
"platform": {
"apiRuntime": "node:20"
}
}
```
## Command-line Reference
### swa login
Authenticate with Azure for deployment.
```bash
swa login # Interactive login
swa login --subscription-id <id> # Specific subscription
swa login --clear-credentials # Clear cached credentials
```
**Flags:** `--subscription-id, -S` | `--resource-group, -R` | `--tenant-id, -T` | `--client-id, -C` | `--client-secret, -CS` | `--app-name, -n`
### swa init
Configure a new SWA project based on an existing frontend and (optional) API. Detects frameworks automatically.
```bash
swa init # Interactive setup
swa init --yes # Accept defaults
```
### swa build
Build frontend and/or API.
```bash
swa build # Build using config
swa build --auto # Auto-detect and build
swa build myApp # Build specific configuration
```
**Flags:** `--app-location, -a` | `--api-location, -i` | `--output-location, -O` | `--app-build-command, -A` | `--api-build-command, -I`
### swa start
Start local development emulator.
```bash
swa start # Serve from outputLocation
swa start ./dist # Serve specific folder
swa start http://localhost:3000 # Proxy to dev server
swa start ./dist --api-location ./api # With API folder
swa start http://localhost:3000 --run "npm start" # Auto-start dev server
```
**Common framework ports:**
| Framework | Port |
|-----------|------|
| React/Vue/Next.js | 3000 |
| Angular | 4200 |
| Vite | 5173 |
**Key flags:**
- `--port, -p` - Emulator port (default: 4280)
- `--api-location, -i` - API folder path
- `--api-port, -j` - API port (default: 7071)
- `--run, -r` - Command to start dev server
- `--open, -o` - Open browser automatically
- `--ssl, -s` - Enable HTTPS
### swa deploy
Deploy to Azure Static Web Apps.
```bash
swa deploy # Deploy using config
swa deploy ./dist # Deploy specific folder
swa deploy --env production # Deploy to production
swa deploy --deployment-token <TOKEN> # Use deployment token
swa deploy --dry-run # Preview without deploying
```
**Get deployment token:**
- Azure Portal: Static Web App → Overview → Manage deployment token
- CLI: `swa deploy --print-token`
- Environment variable: `SWA_CLI_DEPLOYMENT_TOKEN`
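The token can also be fetched with the Azure CLI (a sketch; the app name is a placeholder):
```bash
# Retrieve the deployment token for an existing Static Web App
az staticwebapp secrets list --name my-swa-app --query "properties.apiKey" -o tsv
```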
**Key flags:**
- `--env` - Target environment (`preview` or `production`)
- `--deployment-token, -d` - Deployment token
- `--app-name, -n` - Azure SWA resource name
### swa db
Initialize database connections.
```bash
swa db init --database-type mssql
swa db init --database-type postgresql
swa db init --database-type cosmosdb_nosql
```
## Scenarios
### Create SWA from Existing Frontend and Backend
**Always run `swa init` before `swa start` or `swa deploy`. Do not manually create `swa-cli.config.json`.**
```bash
# 1. Install CLI
npm install -D @azure/static-web-apps-cli
# 2. Initialize - REQUIRED: creates swa-cli.config.json with auto-detected settings
npx swa init # Interactive mode
# OR
npx swa init --yes # Accept auto-detected defaults
# 3. Build application (if needed)
npm run build
# 4. Test locally (uses settings from swa-cli.config.json)
npx swa start
# 5. Deploy
npx swa login
npx swa deploy --env production
```
### Add Azure Functions Backend
1. **Create API folder:**
```bash
mkdir api && cd api
func init --worker-runtime node --model V4
func new --name message --template "HTTP trigger"
```
2. **Example function** (`api/src/functions/message.js`):
```javascript
const { app } = require('@azure/functions');
app.http('message', {
methods: ['GET', 'POST'],
authLevel: 'anonymous',
handler: async (request) => {
const name = request.query.get('name') || 'World';
return { jsonBody: { message: `Hello, ${name}!` } };
}
});
```
3. **Set API runtime** in `staticwebapp.config.json`:
```json
{
"platform": { "apiRuntime": "node:20" }
}
```
4. **Update CLI config** in `swa-cli.config.json`:
```json
{
"configurations": {
"app": { "apiLocation": "api" }
}
}
```
5. **Test locally:**
```bash
npx swa start ./dist --api-location ./api
# Access API at http://localhost:4280/api/message
```
**Supported API runtimes:** `node:18`, `node:20`, `node:22`, `dotnet:8.0`, `dotnet-isolated:8.0`, `python:3.10`, `python:3.11`
### Set Up GitHub Actions Deployment
1. **Create SWA resource** in Azure Portal or via Azure CLI
2. **Link GitHub repository** - workflow auto-generated, or create manually:
`.github/workflows/azure-static-web-apps.yml`:
```yaml
name: Azure Static Web Apps CI/CD
on:
push:
branches: [main]
pull_request:
types: [opened, synchronize, reopened, closed]
branches: [main]
jobs:
build_and_deploy:
if: github.event_name == 'push' || (github.event_name == 'pull_request' && github.event.action != 'closed')
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Build And Deploy
uses: Azure/static-web-apps-deploy@v1
with:
azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
repo_token: ${{ secrets.GITHUB_TOKEN }}
action: upload
app_location: /
api_location: api
output_location: dist
close_pr:
if: github.event_name == 'pull_request' && github.event.action == 'closed'
runs-on: ubuntu-latest
steps:
- uses: Azure/static-web-apps-deploy@v1
with:
azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
action: close
```
3. **Add secret:** Copy deployment token to repository secret `AZURE_STATIC_WEB_APPS_API_TOKEN`
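With the GitHub CLI this step can be scripted (a sketch; the token file name is an assumption):
```bash
# Store the deployment token as a repository secret from a local file
gh secret set AZURE_STATIC_WEB_APPS_API_TOKEN < deployment-token.txt
```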
**Workflow settings:**
- `app_location` - Frontend source path
- `api_location` - API source path
- `output_location` - Built output folder
- `skip_app_build: true` - Skip if pre-built
- `app_build_command` - Custom build command
## Troubleshooting
| Issue | Solution |
|-------|----------|
| 404 on client routes | Add `navigationFallback` with `rewrite: "/index.html"` to `staticwebapp.config.json` |
| API returns 404 | Verify `api` folder structure, ensure `platform.apiRuntime` is set, check function exports |
| Build output not found | Verify `output_location` matches actual build output directory |
| Auth not working locally | Use `/.auth/login/<provider>` to access auth emulator UI |
| CORS errors | APIs under `/api/*` are same-origin; external APIs need CORS headers |
| Deployment token expired | Regenerate in Azure Portal → Static Web App → Manage deployment token |
| Config not applied | Ensure `staticwebapp.config.json` is in `app_location` or `output_location` |
| Local API timeout | Default is 45 seconds; optimize function or check for blocking calls |
**Debug commands:**
```bash
swa start --verbose log # Verbose output
swa deploy --dry-run # Preview deployment
swa --print-config # Show resolved configuration
```

Expert-level browser automation, debugging, and performance analysis using Chrome DevTools MCP. Use for interacting with web pages, capturing screenshots, analyzing network traffic, and profiling performance.
# Chrome DevTools Agent
## Overview
A specialized skill for controlling and inspecting a live Chrome browser. This skill leverages the `chrome-devtools` MCP server to perform a wide range of browser-related tasks, from simple navigation to complex performance profiling.
## When to Use
Use this skill when:
- **Browser Automation**: Navigating pages, clicking elements, filling forms, and handling dialogs.
- **Visual Inspection**: Taking screenshots or text snapshots of web pages.
- **Debugging**: Inspecting console messages, evaluating JavaScript in the page context, and analyzing network requests.
- **Performance Analysis**: Recording and analyzing performance traces to identify bottlenecks and Core Web Vital issues.
- **Emulation**: Resizing the viewport or emulating network/CPU conditions.
## Tool Categories
### 1. Navigation & Page Management
- `new_page`: Open a new tab/page.
- `navigate_page`: Go to a specific URL, reload, or navigate history.
- `select_page`: Switch context between open pages.
- `list_pages`: See all open pages and their IDs.
- `close_page`: Close a specific page.
- `wait_for`: Wait for specific text to appear on the page.
### 2. Input & Interaction
- `click`: Click on an element (use `uid` from snapshot).
- `fill` / `fill_form`: Type text into inputs or fill multiple fields at once.
- `hover`: Move the mouse over an element.
- `press_key`: Send keyboard shortcuts or special keys (e.g., "Enter", "Control+C").
- `drag`: Drag and drop elements.
- `handle_dialog`: Accept or dismiss browser alerts/prompts.
- `upload_file`: Upload a file through a file input.
### 3. Debugging & Inspection
- `take_snapshot`: Get a text-based accessibility tree (best for identifying elements).
- `take_screenshot`: Capture a visual representation of the page or a specific element.
- `list_console_messages` / `get_console_message`: Inspect the page's console output.
- `evaluate_script`: Run custom JavaScript in the page context.
- `list_network_requests` / `get_network_request`: Analyze network traffic and request details.
### 4. Emulation & Performance
- `resize_page`: Change the viewport dimensions.
- `emulate`: Throttle CPU/network or emulate geolocation.
- `performance_start_trace`: Start recording a performance profile.
- `performance_stop_trace`: Stop recording and save the trace.
- `performance_analyze_insight`: Get detailed analysis from recorded performance data.
## Workflow Patterns
### Pattern A: Identifying Elements (Snapshot-First)
Always prefer `take_snapshot` over `take_screenshot` for finding elements. The snapshot provides `uid` values which are required by interaction tools.
```markdown
1. `take_snapshot` to get the current page structure.
2. Find the `uid` of the target element.
3. Use `click(uid=...)` or `fill(uid=..., value=...)`.
```
### Pattern B: Troubleshooting Errors
When a page is failing, check both console logs and network requests.
```markdown
1. `list_console_messages` to check for JavaScript errors.
2. `list_network_requests` to identify failed (4xx/5xx) resources.
3. `evaluate_script` to check the value of specific DOM elements or global variables.
```
### Pattern C: Performance Profiling
Identify why a page is slow.
```markdown
1. `performance_start_trace(reload=true, autoStop=true)`
2. Wait for the page to load/trace to finish.
3. `performance_analyze_insight` to find LCP issues or layout shifts.
```
## Best Practices
- **Context Awareness**: Always run `list_pages` and `select_page` if you are unsure which tab is currently active.
- **Snapshots**: Take a new snapshot after any major navigation or DOM change, as `uid` values may change.
- **Timeouts**: Use reasonable timeouts for `wait_for` to avoid hanging on slow-loading elements.
- **Screenshots**: Use `take_screenshot` sparingly for visual verification, but rely on `take_snapshot` for logic.

Build agentic applications with GitHub Copilot SDK. Use when embedding AI agents in apps, creating custom tools, implementing streaming responses, managing sessions, connecting to MCP servers, or creating custom agents. Triggers on Copilot SDK, GitHub SDK, agentic app, embed Copilot, programmable agent, MCP server, custom agent.
# GitHub Copilot SDK
Embed Copilot's agentic workflows in any application using Python, TypeScript, Go, or .NET.
## Overview
The GitHub Copilot SDK exposes the same engine behind Copilot CLI: a production-tested agent runtime you can invoke programmatically. No need to build your own orchestration - you define agent behavior, Copilot handles planning, tool invocation, file edits, and more.
## Prerequisites
1. **GitHub Copilot CLI** installed and authenticated ([Installation guide](https://docs.github.com/en/copilot/how-tos/set-up/install-copilot-cli))
2. **Language runtime**: Node.js 18+, Python 3.8+, Go 1.21+, or .NET 8.0+
Verify CLI: `copilot --version`
## Installation
### Node.js/TypeScript
```bash
mkdir copilot-demo && cd copilot-demo
npm init -y --init-type module
npm install @github/copilot-sdk tsx
```
### Python
```bash
pip install github-copilot-sdk
```
### Go
```bash
mkdir copilot-demo && cd copilot-demo
go mod init copilot-demo
go get github.com/github/copilot-sdk/go
```
### .NET
```bash
dotnet new console -n CopilotDemo && cd CopilotDemo
dotnet add package GitHub.Copilot.SDK
```
## Quick Start
### TypeScript
```typescript
import { CopilotClient } from "@github/copilot-sdk";
const client = new CopilotClient();
const session = await client.createSession({ model: "gpt-4.1" });
const response = await session.sendAndWait({ prompt: "What is 2 + 2?" });
console.log(response?.data.content);
await client.stop();
process.exit(0);
```
Run: `npx tsx index.ts`
### Python
```python
import asyncio
from copilot import CopilotClient
async def main():
client = CopilotClient()
await client.start()
session = await client.create_session({"model": "gpt-4.1"})
response = await session.send_and_wait({"prompt": "What is 2 + 2?"})
print(response.data.content)
await client.stop()
asyncio.run(main())
```
### Go
```go
package main
import (
"fmt"
"log"
"os"
copilot "github.com/github/copilot-sdk/go"
)
func main() {
client := copilot.NewClient(nil)
if err := client.Start(); err != nil {
log.Fatal(err)
}
defer client.Stop()
session, err := client.CreateSession(&copilot.SessionConfig{Model: "gpt-4.1"})
if err != nil {
log.Fatal(err)
}
response, err := session.SendAndWait(copilot.MessageOptions{Prompt: "What is 2 + 2?"}, 0)
if err != nil {
log.Fatal(err)
}
fmt.Println(*response.Data.Content)
os.Exit(0)
}
```
### .NET (C#)
```csharp
using GitHub.Copilot.SDK;
await using var client = new CopilotClient();
await using var session = await client.CreateSessionAsync(new SessionConfig { Model = "gpt-4.1" });
var response = await session.SendAndWaitAsync(new MessageOptions { Prompt = "What is 2 + 2?" });
Console.WriteLine(response?.Data.Content);
```
Run: `dotnet run`
## Streaming Responses
Enable real-time output for better UX:
### TypeScript
```typescript
import { CopilotClient, SessionEvent } from "@github/copilot-sdk";
const client = new CopilotClient();
const session = await client.createSession({
model: "gpt-4.1",
streaming: true,
});
session.on((event: SessionEvent) => {
if (event.type === "assistant.message_delta") {
process.stdout.write(event.data.deltaContent);
}
if (event.type === "session.idle") {
console.log(); // New line when done
}
});
await session.sendAndWait({ prompt: "Tell me a short joke" });
await client.stop();
process.exit(0);
```
### Python
```python
import asyncio
import sys
from copilot import CopilotClient
from copilot.generated.session_events import SessionEventType
async def main():
client = CopilotClient()
await client.start()
session = await client.create_session({
"model": "gpt-4.1",
"streaming": True,
})
def handle_event(event):
if event.type == SessionEventType.ASSISTANT_MESSAGE_DELTA:
sys.stdout.write(event.data.delta_content)
sys.stdout.flush()
if event.type == SessionEventType.SESSION_IDLE:
print()
session.on(handle_event)
await session.send_and_wait({"prompt": "Tell me a short joke"})
await client.stop()
asyncio.run(main())
```
### Go
```go
session, err := client.CreateSession(&copilot.SessionConfig{
Model: "gpt-4.1",
Streaming: true,
})
session.On(func(event copilot.SessionEvent) {
if event.Type == "assistant.message_delta" {
fmt.Print(*event.Data.DeltaContent)
}
if event.Type == "session.idle" {
fmt.Println()
}
})
_, err = session.SendAndWait(copilot.MessageOptions{Prompt: "Tell me a short joke"}, 0)
```
### .NET
```csharp
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-4.1",
Streaming = true,
});
session.On(ev =>
{
if (ev is AssistantMessageDeltaEvent deltaEvent)
Console.Write(deltaEvent.Data.DeltaContent);
if (ev is SessionIdleEvent)
Console.WriteLine();
});
await session.SendAndWaitAsync(new MessageOptions { Prompt = "Tell me a short joke" });
```
## Custom Tools
Define tools that Copilot can invoke during reasoning. When you define a tool, you tell Copilot:
1. **What the tool does** (description)
2. **What parameters it needs** (schema)
3. **What code to run** (handler)
### TypeScript (JSON Schema)
```typescript
import { CopilotClient, defineTool, SessionEvent } from "@github/copilot-sdk";
const getWeather = defineTool("get_weather", {
description: "Get the current weather for a city",
parameters: {
type: "object",
properties: {
city: { type: "string", description: "The city name" },
},
required: ["city"],
},
handler: async (args: { city: string }) => {
const { city } = args;
// In a real app, call a weather API here
const conditions = ["sunny", "cloudy", "rainy", "partly cloudy"];
const temp = Math.floor(Math.random() * 30) + 50;
const condition = conditions[Math.floor(Math.random() * conditions.length)];
return { city, temperature: `${temp}°F`, condition };
},
});
const client = new CopilotClient();
const session = await client.createSession({
model: "gpt-4.1",
streaming: true,
tools: [getWeather],
});
session.on((event: SessionEvent) => {
if (event.type === "assistant.message_delta") {
process.stdout.write(event.data.deltaContent);
}
});
await session.sendAndWait({
prompt: "What's the weather like in Seattle and Tokyo?",
});
await client.stop();
process.exit(0);
```
### Python (Pydantic)
```python
import asyncio
import random
import sys
from copilot import CopilotClient
from copilot.tools import define_tool
from copilot.generated.session_events import SessionEventType
from pydantic import BaseModel, Field
class GetWeatherParams(BaseModel):
city: str = Field(description="The name of the city to get weather for")
@define_tool(description="Get the current weather for a city")
async def get_weather(params: GetWeatherParams) -> dict:
city = params.city
conditions = ["sunny", "cloudy", "rainy", "partly cloudy"]
temp = random.randint(50, 80)
condition = random.choice(conditions)
return {"city": city, "temperature": f"{temp}°F", "condition": condition}
async def main():
client = CopilotClient()
await client.start()
session = await client.create_session({
"model": "gpt-4.1",
"streaming": True,
"tools": [get_weather],
})
def handle_event(event):
if event.type == SessionEventType.ASSISTANT_MESSAGE_DELTA:
sys.stdout.write(event.data.delta_content)
sys.stdout.flush()
session.on(handle_event)
await session.send_and_wait({
"prompt": "What's the weather like in Seattle and Tokyo?"
})
await client.stop()
asyncio.run(main())
```
### Go
```go
type WeatherParams struct {
City string `json:"city" jsonschema:"The city name"`
}
type WeatherResult struct {
City string `json:"city"`
Temperature string `json:"temperature"`
Condition string `json:"condition"`
}
getWeather := copilot.DefineTool(
"get_weather",
"Get the current weather for a city",
func(params WeatherParams, inv copilot.ToolInvocation) (WeatherResult, error) {
conditions := []string{"sunny", "cloudy", "rainy", "partly cloudy"}
temp := rand.Intn(30) + 50
condition := conditions[rand.Intn(len(conditions))]
return WeatherResult{
City: params.City,
Temperature: fmt.Sprintf("%d°F", temp),
Condition: condition,
}, nil
},
)
session, _ := client.CreateSession(&copilot.SessionConfig{
Model: "gpt-4.1",
Streaming: true,
Tools: []copilot.Tool{getWeather},
})
```
### .NET (Microsoft.Extensions.AI)
```csharp
using GitHub.Copilot.SDK;
using Microsoft.Extensions.AI;
using System.ComponentModel;
var getWeather = AIFunctionFactory.Create(
([Description("The city name")] string city) =>
{
var conditions = new[] { "sunny", "cloudy", "rainy", "partly cloudy" };
var temp = Random.Shared.Next(50, 80);
var condition = conditions[Random.Shared.Next(conditions.Length)];
return new { city, temperature = $"{temp}°F", condition };
},
"get_weather",
"Get the current weather for a city"
);
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-4.1",
Streaming = true,
Tools = [getWeather],
});
```
## How Tools Work
When Copilot decides to call your tool:
1. Copilot sends a tool call request with the parameters
2. The SDK runs your handler function
3. The result is sent back to Copilot
4. Copilot incorporates the result into its response
Copilot decides when to call your tool based on the user's question and your tool's description.
## Interactive CLI Assistant
Build a complete interactive assistant:
### TypeScript
```typescript
import { CopilotClient, defineTool, SessionEvent } from "@github/copilot-sdk";
import * as readline from "readline";
const getWeather = defineTool("get_weather", {
description: "Get the current weather for a city",
parameters: {
type: "object",
properties: {
city: { type: "string", description: "The city name" },
},
required: ["city"],
},
handler: async ({ city }) => {
const conditions = ["sunny", "cloudy", "rainy", "partly cloudy"];
const temp = Math.floor(Math.random() * 30) + 50;
const condition = conditions[Math.floor(Math.random() * conditions.length)];
return { city, temperature: `${temp}°F`, condition };
},
});
const client = new CopilotClient();
const session = await client.createSession({
model: "gpt-4.1",
streaming: true,
tools: [getWeather],
});
session.on((event: SessionEvent) => {
if (event.type === "assistant.message_delta") {
process.stdout.write(event.data.deltaContent);
}
});
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
});
console.log("Weather Assistant (type 'exit' to quit)");
console.log("Try: 'What's the weather in Paris?'\n");
const prompt = () => {
rl.question("You: ", async (input) => {
if (input.toLowerCase() === "exit") {
await client.stop();
rl.close();
return;
}
process.stdout.write("Assistant: ");
await session.sendAndWait({ prompt: input });
console.log("\n");
prompt();
});
};
prompt();
```
### Python
```python
import asyncio
import random
import sys
from copilot import CopilotClient
from copilot.tools import define_tool
from copilot.generated.session_events import SessionEventType
from pydantic import BaseModel, Field
class GetWeatherParams(BaseModel):
city: str = Field(description="The name of the city to get weather for")
@define_tool(description="Get the current weather for a city")
async def get_weather(params: GetWeatherParams) -> dict:
conditions = ["sunny", "cloudy", "rainy", "partly cloudy"]
temp = random.randint(50, 80)
condition = random.choice(conditions)
return {"city": params.city, "temperature": f"{temp}°F", "condition": condition}
async def main():
client = CopilotClient()
await client.start()
session = await client.create_session({
"model": "gpt-4.1",
"streaming": True,
"tools": [get_weather],
})
def handle_event(event):
if event.type == SessionEventType.ASSISTANT_MESSAGE_DELTA:
sys.stdout.write(event.data.delta_content)
sys.stdout.flush()
session.on(handle_event)
print("Weather Assistant (type 'exit' to quit)")
print("Try: 'What's the weather in Paris?'\n")
while True:
try:
user_input = input("You: ")
except EOFError:
break
if user_input.lower() == "exit":
break
sys.stdout.write("Assistant: ")
await session.send_and_wait({"prompt": user_input})
print("\n")
await client.stop()
asyncio.run(main())
```
## MCP Server Integration
Connect to MCP (Model Context Protocol) servers for pre-built tools. Connect to GitHub's MCP server for repository, issue, and PR access:
### TypeScript
```typescript
const session = await client.createSession({
model: "gpt-4.1",
mcpServers: {
github: {
type: "http",
url: "https://api.githubcopilot.com/mcp/",
},
},
});
```
### Python
```python
session = await client.create_session({
"model": "gpt-4.1",
"mcp_servers": {
"github": {
"type": "http",
"url": "https://api.githubcopilot.com/mcp/",
},
},
})
```
### Go
```go
session, _ := client.CreateSession(&copilot.SessionConfig{
Model: "gpt-4.1",
MCPServers: map[string]copilot.MCPServerConfig{
"github": {
Type: "http",
URL: "https://api.githubcopilot.com/mcp/",
},
},
})
```
### .NET
```csharp
await using var session = await client.CreateSessionAsync(new SessionConfig
{
Model = "gpt-4.1",
McpServers = new Dictionary<string, McpServerConfig>
{
["github"] = new McpServerConfig
{
Type = "http",
Url = "https://api.githubcopilot.com/mcp/",
},
},
});
```
## Custom Agents
Define specialized AI personas for specific tasks:
### TypeScript
```typescript
const session = await client.createSession({
model: "gpt-4.1",
customAgents: [{
name: "pr-reviewer",
displayName: "PR Reviewer",
description: "Reviews pull requests for best practices",
prompt: "You are an expert code reviewer. Focus on security, performance, and maintainability.",
}],
});
```
### Python
```python
session = await client.create_session({
"model": "gpt-4.1",
"custom_agents": [{
"name": "pr-reviewer",
"display_name": "PR Reviewer",
"description": "Reviews pull requests for best practices",
"prompt": "You are an expert code reviewer. Focus on security, performance, and maintainability.",
}],
})
```
## System Message
Customize the AI's behavior and personality:
### TypeScript
```typescript
const session = await client.createSession({
model: "gpt-4.1",
systemMessage: {
content: "You are a helpful assistant for our engineering team. Always be concise.",
},
});
```
### Python
```python
session = await client.create_session({
"model": "gpt-4.1",
"system_message": {
"content": "You are a helpful assistant for our engineering team. Always be concise.",
},
})
```
## External CLI Server
Run the CLI in server mode separately and connect the SDK to it. Useful for debugging, resource sharing, or custom environments.
### Start CLI in Server Mode
```bash
copilot --server --port 4321
```
### Connect SDK to External Server
#### TypeScript
```typescript
const client = new CopilotClient({
cliUrl: "localhost:4321"
});
const session = await client.createSession({ model: "gpt-4.1" });
```
#### Python
```python
client = CopilotClient({
"cli_url": "localhost:4321"
})
await client.start()
session = await client.create_session({"model": "gpt-4.1"})
```
#### Go
```go
client := copilot.NewClient(&copilot.ClientOptions{
CLIUrl: "localhost:4321",
})
if err := client.Start(); err != nil {
log.Fatal(err)
}
session, _ := client.CreateSession(&copilot.SessionConfig{Model: "gpt-4.1"})
```
#### .NET
```csharp
using var client = new CopilotClient(new CopilotClientOptions
{
CliUrl = "localhost:4321"
});
await using var session = await client.CreateSessionAsync(new SessionConfig { Model = "gpt-4.1" });
```
**Note:** When `cliUrl` is provided, the SDK will not spawn or manage a CLI process - it only connects to the existing server.
## Event Types
| Event | Description |
|-------|-------------|
| `user.message` | User input added |
| `assistant.message` | Complete model response |
| `assistant.message_delta` | Streaming response chunk |
| `assistant.reasoning` | Model reasoning (model-dependent) |
| `assistant.reasoning_delta` | Streaming reasoning chunk |
| `tool.execution_start` | Tool invocation started |
| `tool.execution_complete` | Tool execution finished |
| `session.idle` | No active processing |
| `session.error` | Error occurred |
## Client Configuration
| Option | Description | Default |
|--------|-------------|---------|
| `cliPath` | Path to Copilot CLI executable | System PATH |
| `cliUrl` | Connect to existing server (e.g., "localhost:4321") | None |
| `port` | Server communication port | Random |
| `useStdio` | Use stdio transport instead of TCP | true |
| `logLevel` | Logging verbosity | "info" |
| `autoStart` | Launch server automatically | true |
| `autoRestart` | Restart on crashes | true |
| `cwd` | Working directory for CLI process | Inherited |
## Session Configuration
| Option | Description |
|--------|-------------|
| `model` | LLM to use ("gpt-4.1", "claude-sonnet-4.5", etc.) |
| `sessionId` | Custom session identifier |
| `tools` | Custom tool definitions |
| `mcpServers` | MCP server connections |
| `customAgents` | Custom agent personas |
| `systemMessage` | Override default system prompt |
| `streaming` | Enable incremental response chunks |
| `availableTools` | Whitelist of permitted tools |
| `excludedTools` | Blacklist of disabled tools |
## Session Persistence
Save and resume conversations across restarts:
### Create with Custom ID
```typescript
const session = await client.createSession({
sessionId: "user-123-conversation",
model: "gpt-4.1"
});
```
### Resume Session
```typescript
const session = await client.resumeSession("user-123-conversation");
await session.send({ prompt: "What did we discuss earlier?" });
```
### List and Delete Sessions
```typescript
const sessions = await client.listSessions();
await client.deleteSession("old-session-id");
```
## Error Handling
```typescript
try {
const client = new CopilotClient();
const session = await client.createSession({ model: "gpt-4.1" });
const response = await session.sendAndWait(
{ prompt: "Hello!" },
30000 // timeout in ms
);
} catch (error) {
if (error.code === "ENOENT") {
console.error("Copilot CLI not installed");
} else if (error.code === "ECONNREFUSED") {
console.error("Cannot connect to Copilot server");
} else {
console.error("Error:", error.message);
}
} finally {
await client.stop();
}
```
## Graceful Shutdown
```typescript
process.on("SIGINT", async () => {
console.log("Shutting down...");
await client.stop();
process.exit(0);
});
```
## Common Patterns
### Multi-turn Conversation
```typescript
const session = await client.createSession({ model: "gpt-4.1" });
await session.sendAndWait({ prompt: "My name is Alice" });
await session.sendAndWait({ prompt: "What's my name?" });
// Response: "Your name is Alice"
```
### File Attachments
```typescript
await session.send({
prompt: "Analyze this file",
attachments: [{
type: "file",
path: "./data.csv",
displayName: "Sales Data"
}]
});
```
### Abort Long Operations
```typescript
const timeoutId = setTimeout(() => {
session.abort();
}, 60000);
session.on((event) => {
if (event.type === "session.idle") {
clearTimeout(timeoutId);
}
});
```
## Available Models
Query available models at runtime:
```typescript
const models = await client.getModels();
// Returns: ["gpt-4.1", "gpt-4o", "claude-sonnet-4.5", ...]
```
## Best Practices
1. **Always cleanup**: Use `try-finally` or `defer` to ensure `client.stop()` is called
2. **Set timeouts**: Use `sendAndWait` with timeout for long operations
3. **Handle events**: Subscribe to error events for robust error handling
4. **Use streaming**: Enable streaming for better UX on long responses
5. **Persist sessions**: Use custom session IDs for multi-turn conversations
6. **Define clear tools**: Write descriptive tool names and descriptions
## Architecture
```
Your Application
|
SDK Client
| JSON-RPC
Copilot CLI (server mode)
|
GitHub (models, auth)
```
The SDK manages the CLI process lifecycle automatically. All communication happens via JSON-RPC over stdio or TCP.
## Resources
- **GitHub Repository**: https://github.com/github/copilot-sdk
- **Getting Started Tutorial**: https://github.com/github/copilot-sdk/blob/main/docs/tutorials/first-app.md
- **GitHub MCP Server**: https://github.com/github/github-mcp-server
- **MCP Servers Directory**: https://github.com/modelcontextprotocol/servers
- **Cookbook**: https://github.com/github/copilot-sdk/tree/main/cookbook
- **Samples**: https://github.com/github/copilot-sdk/tree/main/samples
## Status
This SDK is in **Technical Preview** and may have breaking changes. Not recommended for production use yet.

GitHub CLI (gh) comprehensive reference for repositories, issues, pull requests, Actions, projects, releases, gists, codespaces, organizations, extensions, and all GitHub operations from the command line.
# GitHub CLI (gh)
Comprehensive reference for GitHub CLI (gh) - work seamlessly with GitHub from the command line.
**Version:** 2.85.0 (current as of January 2026)
## Prerequisites
### Installation
```bash
# macOS
brew install gh
# Linux
curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] https://cli.github.com/packages stable main" | sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
sudo apt update
sudo apt install gh
# Windows
winget install --id GitHub.cli
# Verify installation
gh --version
```
### Authentication
```bash
# Interactive login (default: github.com)
gh auth login
# Login with specific hostname
gh auth login --hostname enterprise.internal
# Login with token
gh auth login --with-token < mytoken.txt
# Check authentication status
gh auth status
# Switch accounts
gh auth switch --hostname github.com --user username
# Logout
gh auth logout --hostname github.com --user username
```
### Setup Git Integration
```bash
# Configure git to use gh as credential helper
gh auth setup-git
# View active token
gh auth token
# Refresh authentication scopes
gh auth refresh --scopes write:org,read:public_key
```
## CLI Structure
```
gh # Root command
├── auth # Authentication
│ ├── login
│ ├── logout
│ ├── refresh
│ ├── setup-git
│ ├── status
│ ├── switch
│ └── token
├── browse # Open in browser
├── codespace # GitHub Codespaces
│ ├── code
│ ├── cp
│ ├── create
│ ├── delete
│ ├── edit
│ ├── jupyter
│ ├── list
│ ├── logs
│ ├── ports
│ ├── rebuild
│ ├── ssh
│ ├── stop
│ └── view
├── gist # Gists
│ ├── clone
│ ├── create
│ ├── delete
│ ├── edit
│ ├── list
│ ├── rename
│ └── view
├── issue # Issues
│ ├── create
│ ├── list
│ ├── status
│ ├── close
│ ├── comment
│ ├── delete
│ ├── develop
│ ├── edit
│ ├── lock
│ ├── pin
│ ├── reopen
│ ├── transfer
│ ├── unlock
│ └── view
├── org # Organizations
│ └── list
├── pr # Pull Requests
│ ├── create
│ ├── list
│ ├── status
│ ├── checkout
│ ├── checks
│ ├── close
│ ├── comment
│ ├── diff
│ ├── edit
│ ├── lock
│ ├── merge
│ ├── ready
│ ├── reopen
│ ├── revert
│ ├── review
│ ├── unlock
│ ├── update-branch
│ └── view
├── project # Projects
│ ├── close
│ ├── copy
│ ├── create
│ ├── delete
│ ├── edit
│ ├── field-create
│ ├── field-delete
│ ├── field-list
│ ├── item-add
│ ├── item-archive
│ ├── item-create
│ ├── item-delete
│ ├── item-edit
│ ├── item-list
│ ├── link
│ ├── list
│ ├── mark-template
│ ├── unlink
│ └── view
├── release # Releases
│ ├── create
│ ├── list
│ ├── delete
│ ├── delete-asset
│ ├── download
│ ├── edit
│ ├── upload
│ ├── verify
│ ├── verify-asset
│ └── view
├── repo # Repositories
│ ├── create
│ ├── list
│ ├── archive
│ ├── autolink
│ ├── clone
│ ├── delete
│ ├── deploy-key
│ ├── edit
│ ├── fork
│ ├── gitignore
│ ├── license
│ ├── rename
│ ├── set-default
│ ├── sync
│ ├── unarchive
│ └── view
├── cache # Actions caches
│ ├── delete
│ └── list
├── run # Workflow runs
│ ├── cancel
│ ├── delete
│ ├── download
│ ├── list
│ ├── rerun
│ ├── view
│ └── watch
├── workflow # Workflows
│ ├── disable
│ ├── enable
│ ├── list
│ ├── run
│ └── view
├── agent-task # Agent tasks
├── alias # Command aliases
│ ├── delete
│ ├── import
│ ├── list
│ └── set
├── api # API requests
├── attestation # Artifact attestations
│ ├── download
│ ├── trusted-root
│ └── verify
├── completion # Shell completion
├── config # Configuration
│ ├── clear-cache
│ ├── get
│ ├── list
│ └── set
├── extension # Extensions
│ ├── browse
│ ├── create
│ ├── exec
│ ├── install
│ ├── list
│ ├── remove
│ ├── search
│ └── upgrade
├── gpg-key # GPG keys
│ ├── add
│ ├── delete
│ └── list
├── label # Labels
│ ├── clone
│ ├── create
│ ├── delete
│ ├── edit
│ └── list
├── preview # Preview features
├── ruleset # Rulesets
│ ├── check
│ ├── list
│ └── view
├── search # Search
│ ├── code
│ ├── commits
│ ├── issues
│ ├── prs
│ └── repos
├── secret # Secrets
│ ├── delete
│ ├── list
│ └── set
├── ssh-key # SSH keys
│ ├── add
│ ├── delete
│ └── list
├── status # Status overview
└── variable # Variables
├── delete
├── get
├── list
└── set
```
## Configuration
### Global Configuration
```bash
# List all configuration
gh config list
# Get specific configuration value
gh config get git_protocol
gh config get editor
# Set configuration value
gh config set editor vim
gh config set git_protocol ssh
gh config set prompt disabled
gh config set pager "less -R"
# Clear configuration cache
gh config clear-cache
```
### Environment Variables
```bash
# GitHub token (for automation)
export GH_TOKEN=ghp_xxxxxxxxxxxx
# GitHub hostname
export GH_HOST=github.com
# Disable prompts
export GH_PROMPT_DISABLED=true
# Custom editor
export GH_EDITOR=vim
# Custom pager
export GH_PAGER=less
# HTTP timeout
export GH_TIMEOUT=30
# Custom repository (override default)
export GH_REPO=owner/repo
# GitHub Enterprise hostname
export GH_ENTERPRISE_HOSTNAME=hostname
```
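These variables make `gh` scriptable; a minimal non-interactive sketch (repository name hypothetical):
```bash
# Run gh against a fixed repository without prompts
export GH_TOKEN="$(gh auth token)"
export GH_REPO=owner/repo
gh issue list --state open --limit 5
```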
## Authentication (gh auth)
### Login
```bash
# Interactive login
gh auth login
# Web-based authentication
gh auth login --web
# With clipboard for OAuth code
gh auth login --web --clipboard
# With specific git protocol
gh auth login --git-protocol ssh
# With custom hostname (GitHub Enterprise)
gh auth login --hostname enterprise.internal
# Login with token from stdin
gh auth login --with-token < token.txt
# Insecure storage (plain text)
gh auth login --insecure-storage
```
### Status
```bash
# Show all authentication status
gh auth status
# Show active account only
gh auth status --active
# Show specific hostname
gh auth status --hostname github.com
# Show token in output
gh auth status --show-token
# JSON output
gh auth status --json hosts
# Filter with jq
gh auth status --json hosts --jq '.hosts | add'
```
### Switch Accounts
```bash
# Interactive switch
gh auth switch
# Switch to specific user/host
gh auth switch --hostname github.com --user monalisa
```
### Token
```bash
# Print authentication token
gh auth token
# Token for specific host/user
gh auth token --hostname github.com --user monalisa
```
### Refresh
```bash
# Refresh credentials
gh auth refresh
# Add scopes
gh auth refresh --scopes write:org,read:public_key
# Remove scopes
gh auth refresh --remove-scopes delete_repo
# Reset to default scopes
gh auth refresh --reset-scopes
# With clipboard
gh auth refresh --clipboard
```
### Setup Git
```bash
# Setup git credential helper
gh auth setup-git
# Setup for specific host
gh auth setup-git --hostname enterprise.internal
# Force setup even if host not known
gh auth setup-git --hostname enterprise.internal --force
```
## Browse (gh browse)
```bash
# Open repository in browser
gh browse
# Open specific path
gh browse script/
gh browse main.go:312
# Open issue or PR
gh browse 123
# Open commit
gh browse 77507cd94ccafcf568f8560cfecde965fcfa63
# Open with specific branch
gh browse main.go --branch bug-fix
# Open different repository
gh browse --repo owner/repo
# Open specific pages
gh browse --actions # Actions tab
gh browse --projects # Projects tab
gh browse --releases # Releases tab
gh browse --settings # Settings page
gh browse --wiki # Wiki page
# Print URL instead of opening
gh browse --no-browser
```
## Repositories (gh repo)
### Create Repository
```bash
# Create new repository
gh repo create my-repo
# Create with description
gh repo create my-repo --description "My awesome project"
# Create public repository
gh repo create my-repo --public
# Create private repository
gh repo create my-repo --private
# Create with homepage
gh repo create my-repo --homepage https://example.com
# Create with license
gh repo create my-repo --license mit
# Create with gitignore
gh repo create my-repo --gitignore Python
# Create from a template repository
gh repo create my-repo --template owner/template-repo
# Create repository in organization
gh repo create org/my-repo
# Create from the current local repository
gh repo create my-repo --source=.
# Disable issues
gh repo create my-repo --disable-issues
# Disable wiki
gh repo create my-repo --disable-wiki
```
### Clone Repository
```bash
# Clone repository
gh repo clone owner/repo
# Clone to specific directory
gh repo clone owner/repo my-directory
# Pass flags through to git clone (after --)
gh repo clone owner/repo -- --branch develop
```
### List Repositories
```bash
# List all repositories
gh repo list
# List repositories for owner
gh repo list owner
# Limit results
gh repo list --limit 50
# Public repositories only
gh repo list --public
# Source repositories only (not forks)
gh repo list --source
# JSON output
gh repo list --json name,visibility,owner
# Skip the first line of table output
gh repo list --limit 100 | tail -n +2
# Filter with jq
gh repo list --json name --jq '.[].name'
```
### View Repository
```bash
# View repository details
gh repo view
# View specific repository
gh repo view owner/repo
# JSON output
gh repo view --json name,description,defaultBranchRef
# View in browser
gh repo view --web
```
### Edit Repository
```bash
# Edit description
gh repo edit --description "New description"
# Set homepage
gh repo edit --homepage https://example.com
# Change visibility
gh repo edit --visibility private
gh repo edit --visibility public
# Enable/disable features
gh repo edit --enable-issues
gh repo edit --disable-issues
gh repo edit --enable-wiki
gh repo edit --disable-wiki
gh repo edit --enable-projects
gh repo edit --disable-projects
# Set default branch
gh repo edit --default-branch main
# Rename repository
gh repo rename new-name
# Archive repository
gh repo archive
gh repo unarchive
```
### Delete Repository
```bash
# Delete repository
gh repo delete owner/repo
# Confirm without prompt
gh repo delete owner/repo --yes
```
### Fork Repository
```bash
# Fork repository
gh repo fork owner/repo
# Fork to organization
gh repo fork owner/repo --org org-name
# Clone after forking
gh repo fork owner/repo --clone
# Remote name for fork
gh repo fork owner/repo --remote-name upstream
```
### Sync Fork
```bash
# Sync fork with upstream
gh repo sync
# Sync specific branch
gh repo sync --branch feature
# Force sync
gh repo sync --force
```
### Set Default Repository
```bash
# Set default repository for current directory
gh repo set-default
# Set default explicitly
gh repo set-default owner/repo
# Unset default
gh repo set-default --unset
```
### Repository Autolinks
```bash
# List autolinks
gh repo autolink list
# Create autolink
gh repo autolink create JIRA- "https://jira.example.com/browse/<num>"
# Delete autolink
gh repo autolink delete 12345
```
### Repository Deploy Keys
```bash
# List deploy keys
gh repo deploy-key list
# Add deploy key (read-only by default)
gh repo deploy-key add ~/.ssh/id_rsa.pub --title "Production server"
# Add deploy key with write access
gh repo deploy-key add ~/.ssh/id_rsa.pub --title "CI deploy" --allow-write
# Delete deploy key
gh repo deploy-key delete 12345
```
### Gitignore and License
```bash
# List gitignore templates
gh repo gitignore list
# View gitignore template
gh repo gitignore view Python
# List license templates
gh repo license list
# View license template
gh repo license view mit
```
## Issues (gh issue)
### Create Issue
```bash
# Create issue interactively
gh issue create
# Create with title
gh issue create --title "Bug: Login not working"
# Create with title and body
gh issue create \
--title "Bug: Login not working" \
--body "Steps to reproduce..."
# Create with body from file
gh issue create --body-file issue.md
# Create with labels
gh issue create --title "Fix bug" --labels bug,high-priority
# Create with assignees
gh issue create --title "Fix bug" --assignee user1,user2
# Create in specific repository
gh issue create --repo owner/repo --title "Issue title"
# Create issue from web
gh issue create --web
```
### List Issues
```bash
# List all open issues
gh issue list
# List all issues (including closed)
gh issue list --state all
# List closed issues
gh issue list --state closed
# Limit results
gh issue list --limit 50
# Filter by assignee
gh issue list --assignee username
gh issue list --assignee @me
# Filter by labels
gh issue list --label bug,enhancement
# Filter by milestone
gh issue list --milestone "v1.0"
# Search/filter
gh issue list --search "is:open is:issue label:bug"
# JSON output
gh issue list --json number,title,state,author
# Table view
gh issue list --json number,title,labels --jq '.[] | [.number, .title, .labels[].name] | @tsv'
# Show comments count
gh issue list --json number,title,comments --jq '.[] | [.number, .title, .comments]'
# Sort via search qualifier
gh issue list --search "sort:created-desc"
```
### View Issue
```bash
# View issue
gh issue view 123
# View with comments
gh issue view 123 --comments
# View in browser
gh issue view 123 --web
# JSON output
gh issue view 123 --json title,body,state,labels,comments
# View specific fields
gh issue view 123 --json title --jq '.title'
```
### Edit Issue
```bash
# Edit interactively
gh issue edit 123
# Edit title
gh issue edit 123 --title "New title"
# Edit body
gh issue edit 123 --body "New description"
# Add labels
gh issue edit 123 --add-label bug,high-priority
# Remove labels
gh issue edit 123 --remove-label stale
# Add assignees
gh issue edit 123 --add-assignee user1,user2
# Remove assignees
gh issue edit 123 --remove-assignee user1
# Set milestone
gh issue edit 123 --milestone "v1.0"
```
### Close/Reopen Issue
```bash
# Close issue
gh issue close 123
# Close with comment
gh issue close 123 --comment "Fixed in PR #456"
# Reopen issue
gh issue reopen 123
```
### Comment on Issue
```bash
# Add comment
gh issue comment 123 --body "This looks good!"
# Edit your last comment
gh issue comment 123 --edit-last --body "Updated comment"
# Delete your last comment
gh issue comment 123 --delete-last
```
### Issue Status
```bash
# Show issue status summary
gh issue status
# Status for specific repository
gh issue status --repo owner/repo
```
### Pin/Unpin Issues
```bash
# Pin issue (pinned to repo dashboard)
gh issue pin 123
# Unpin issue
gh issue unpin 123
```
### Lock/Unlock Issue
```bash
# Lock conversation
gh issue lock 123
# Lock with reason
gh issue lock 123 --reason off-topic
# Unlock
gh issue unlock 123
```
### Transfer Issue
```bash
# Transfer to another repository
gh issue transfer 123 owner/new-repo
```
### Delete Issue
```bash
# Delete issue
gh issue delete 123
# Confirm without prompt
gh issue delete 123 --yes
```
### Develop Issue (Linked Branch)
```bash
# Create a linked branch for the issue
gh issue develop 123
# Create with a specific branch name
gh issue develop 123 --name fix/issue-123
# Create with base branch
gh issue develop 123 --base main
```
## Pull Requests (gh pr)
### Create Pull Request
```bash
# Create PR interactively
gh pr create
# Create with title
gh pr create --title "Feature: Add new functionality"
# Create with title and body
gh pr create \
--title "Feature: Add new functionality" \
--body "This PR adds..."
# Fill body from template
gh pr create --body-file .github/PULL_REQUEST_TEMPLATE.md
# Set base branch
gh pr create --base main
# Set head branch (default: current branch)
gh pr create --head feature-branch
# Create draft PR
gh pr create --draft
# Add assignees
gh pr create --assignee user1,user2
# Add reviewers
gh pr create --reviewer user1,user2
# Add labels
gh pr create --label enhancement,feature
# Link to an issue via closing keyword in the body
gh pr create --title "Fix login" --body "Closes #123"
# Create in specific repository
gh pr create --repo owner/repo
# Open in browser after creation
gh pr create --web
```
### List Pull Requests
```bash
# List open PRs
gh pr list
# List all PRs
gh pr list --state all
# List merged PRs
gh pr list --state merged
# List closed (not merged) PRs
gh pr list --state closed
# Filter by head branch
gh pr list --head feature-branch
# Filter by base branch
gh pr list --base main
# Filter by author
gh pr list --author username
gh pr list --author @me
# Filter by assignee
gh pr list --assignee username
# Filter by labels
gh pr list --label bug,enhancement
# Limit results
gh pr list --limit 50
# Search
gh pr list --search "is:open is:pr label:review-required"
# JSON output
gh pr list --json number,title,state,author,headRefName
# Show check status
gh pr list --json number,title,statusCheckRollup --jq '.[] | [.number, .title, .statusCheckRollup[]?.status]'
# Sort via search qualifier
gh pr list --search "sort:created-desc"
```
### View Pull Request
```bash
# View PR
gh pr view 123
# View with comments
gh pr view 123 --comments
# View in browser
gh pr view 123 --web
# JSON output
gh pr view 123 --json title,body,state,author,commits,files
# View diff
gh pr view 123 --json files --jq '.files[].path'
# View with jq query
gh pr view 123 --json title,state --jq '"\(.title): \(.state)"'
```
### Checkout Pull Request
```bash
# Checkout PR branch
gh pr checkout 123
# Checkout with specific branch name
gh pr checkout 123 --branch name-123
# Force checkout
gh pr checkout 123 --force
```
### Diff Pull Request
```bash
# View PR diff
gh pr diff 123
# View diff with color
gh pr diff 123 --color always
# Output to file
gh pr diff 123 > pr-123.patch
# List changed files only
gh pr diff 123 --name-only
```
### Merge Pull Request
```bash
# Merge PR
gh pr merge 123
# Merge with specific method
gh pr merge 123 --merge
gh pr merge 123 --squash
gh pr merge 123 --rebase
# Delete branch after merge
gh pr merge 123 --delete-branch
# Merge with comment
gh pr merge 123 --subject "Merge PR #123" --body "Merging feature"
# Bypass branch protection requirements (admin)
gh pr merge 123 --admin
# Enable auto-merge once requirements pass
gh pr merge 123 --auto --squash
```
### Close Pull Request
```bash
# Close PR without merging
gh pr close 123
# Close with comment
gh pr close 123 --comment "Closing due to..."
```
### Reopen Pull Request
```bash
# Reopen closed PR
gh pr reopen 123
```
### Edit Pull Request
```bash
# Edit interactively
gh pr edit 123
# Edit title
gh pr edit 123 --title "New title"
# Edit body
gh pr edit 123 --body "New description"
# Add labels
gh pr edit 123 --add-label bug,enhancement
# Remove labels
gh pr edit 123 --remove-label stale
# Add assignees
gh pr edit 123 --add-assignee user1,user2
# Remove assignees
gh pr edit 123 --remove-assignee user1
# Add reviewers
gh pr edit 123 --add-reviewer user1,user2
# Remove reviewers
gh pr edit 123 --remove-reviewer user1
```
### Ready for Review
```bash
# Mark draft PR as ready
gh pr ready 123
```
### Pull Request Checks
```bash
# View PR checks
gh pr checks 123
# Watch checks in real-time
gh pr checks 123 --watch
# Watch interval (seconds)
gh pr checks 123 --watch --interval 5
```
### Comment on Pull Request
```bash
# Add comment
gh pr comment 123 --body "Looks good!"
# Comment on a PR in another repository
gh pr comment 123 --repo owner/repo --body "Fix this"
# Edit your last comment
gh pr comment 123 --edit-last --body "Updated"
# Delete your last comment
gh pr comment 123 --delete-last
```
### Review Pull Request
```bash
# Review PR (opens editor)
gh pr review 123
# Approve PR
gh pr review 123 --approve --body "LGTM!"
# Request changes
gh pr review 123 --request-changes \
  --body "Please fix these issues"
# Comment on PR
gh pr review 123 --comment --body "Some thoughts..."
# Approve with body from a file
gh pr review 123 --approve --body-file review.md
```
### Update Branch
```bash
# Update PR branch with latest base branch
gh pr update-branch 123
# Update using rebase instead of a merge commit
gh pr update-branch 123 --rebase
```
### Lock/Unlock Pull Request
```bash
# Lock PR conversation
gh pr lock 123
# Lock with reason
gh pr lock 123 --reason off-topic
# Unlock
gh pr unlock 123
```
### Revert Pull Request
```bash
# Revert merged PR
gh pr revert 123
# Revert with specific branch name
gh pr revert 123 --branch revert-pr-123
```
### Pull Request Status
```bash
# Show PR status summary
gh pr status
# Status for specific repository
gh pr status --repo owner/repo
```
## GitHub Actions
### Workflow Runs (gh run)
```bash
# List workflow runs
gh run list
# List for specific workflow
gh run list --workflow "ci.yml"
# List for specific branch
gh run list --branch main
# Limit results
gh run list --limit 20
# JSON output
gh run list --json databaseId,status,conclusion,headBranch
# View run details
gh run view 123456789
# View full run log
gh run view 123456789 --log
# View specific job
gh run view 123456789 --job 987654321
# View in browser
gh run view 123456789 --web
# Watch run in real-time
gh run watch 123456789
# Watch with interval
gh run watch 123456789 --interval 5
# Rerun entire run
gh run rerun 123456789
# Rerun only failed jobs
gh run rerun 123456789 --failed
# Rerun specific job
gh run rerun 123456789 --job 987654321
# Cancel run
gh run cancel 123456789
# Delete run
gh run delete 123456789
# Download run artifacts
gh run download 123456789
# Download specific artifact
gh run download 123456789 --name build
# Download to directory
gh run download 123456789 --dir ./artifacts
```
### Workflows (gh workflow)
```bash
# List workflows
gh workflow list
# View workflow details
gh workflow view ci.yml
# View workflow YAML
gh workflow view ci.yml --yaml
# View in browser
gh workflow view ci.yml --web
# Enable workflow
gh workflow enable ci.yml
# Disable workflow
gh workflow disable ci.yml
# Run workflow manually
gh workflow run ci.yml
# Run with inputs
gh workflow run ci.yml \
  --raw-field version="1.0.0" \
  --raw-field environment="production"
# Run from specific branch
gh workflow run ci.yml --ref develop
```
### Action Caches (gh cache)
```bash
# List caches
gh cache list
# List for specific ref
gh cache list --ref refs/heads/main
# List with limit
gh cache list --limit 50
# Delete cache
gh cache delete 123456789
# Delete all caches
gh cache delete --all
```
### Action Secrets (gh secret)
```bash
# List secrets
gh secret list
# Set secret (prompts for value)
gh secret set MY_SECRET
# Set secret from environment
echo "$MY_SECRET" | gh secret set MY_SECRET
# Set secret for specific environment
gh secret set MY_SECRET --env production
# Set secret for organization
gh secret set MY_SECRET --org orgname
# Delete secret
gh secret delete MY_SECRET
# Delete from environment
gh secret delete MY_SECRET --env production
```
### Action Variables (gh variable)
```bash
# List variables
gh variable list
# Set variable
gh variable set MY_VAR --body "some-value"
# Set variable for environment
gh variable set MY_VAR --body "value" --env production
# Set variable for organization
gh variable set MY_VAR --body "value" --org orgname
# Get variable value
gh variable get MY_VAR
# Delete variable
gh variable delete MY_VAR
# Delete from environment
gh variable delete MY_VAR --env production
```
## Projects (gh project)
```bash
# List projects
gh project list
# List for owner
gh project list --owner owner
# Include closed projects
gh project list --closed
# View project
gh project view 123
# View project items
gh project view 123 --format json
# Create project
gh project create --title "My Project"
# Create for an organization
gh project create --title "Project" --owner orgname
# Set project readme
gh project edit 123 --readme "Description here"
# Edit project
gh project edit 123 --title "New Title"
# Delete project
gh project delete 123
# Close project
gh project close 123
# Copy project
gh project copy 123 --owner target-owner --title "Copy"
# Mark template
gh project mark-template 123
# List fields
gh project field-list 123
# Create field
gh project field-create 123 --name "Status" --data-type SINGLE_SELECT
# Delete field
gh project field-delete 123 --id 456
# List items
gh project item-list 123
# Create item
gh project item-create 123 --title "New item"
# Add existing issue or PR to project
gh project item-add 123 --owner owner --url https://github.com/owner/repo/issues/456
# Edit item
gh project item-edit 123 --id 456 --title "Updated title"
# Delete item
gh project item-delete 123 --id 456
# Archive item
gh project item-archive 123 --id 456
# Link project to a repository
gh project link 123 --repo owner/repo
# Unlink project from a repository
gh project unlink 123 --repo owner/repo
# View project in browser
gh project view 123 --web
```
## Releases (gh release)
```bash
# List releases
gh release list
# View latest release
gh release view
# View specific release
gh release view v1.0.0
# View in browser
gh release view v1.0.0 --web
# Create release
gh release create v1.0.0 \
--notes "Release notes here"
# Create release with notes from file
gh release create v1.0.0 --notes-file notes.md
# Create release with target
gh release create v1.0.0 --target main
# Create release as draft
gh release create v1.0.0 --draft
# Create pre-release
gh release create v1.0.0 --prerelease
# Create release with title
gh release create v1.0.0 --title "Version 1.0.0"
# Upload asset to release
gh release upload v1.0.0 ./file.tar.gz
# Upload multiple assets
gh release upload v1.0.0 ./file1.tar.gz ./file2.tar.gz
# Upload with a display label
gh release upload v1.0.0 './file.tar.gz#Linux build'
# Overwrite an existing asset
gh release upload v1.0.0 ./file.tar.gz --clobber
# Delete release
gh release delete v1.0.0
# Skip confirmation
gh release delete v1.0.0 --yes
# Also delete the git tag
gh release delete v1.0.0 --cleanup-tag
# Delete specific asset
gh release delete-asset v1.0.0 file.tar.gz
# Download release assets
gh release download v1.0.0
# Download specific asset
gh release download v1.0.0 --pattern "*.tar.gz"
# Download to directory
gh release download v1.0.0 --dir ./downloads
# Download archive (zip/tar)
gh release download v1.0.0 --archive zip
# Edit release
gh release edit v1.0.0 --notes "Updated notes"
# Verify release signature
gh release verify v1.0.0
# Verify specific asset
gh release verify-asset v1.0.0 file.tar.gz
```
## Gists (gh gist)
```bash
# List gists
gh gist list
# List public gists only
gh gist list --public
# List secret gists only
gh gist list --secret
# Limit results
gh gist list --limit 20
# View gist
gh gist view abc123
# View gist files
gh gist view abc123 --files
# Create gist
gh gist create script.py
# Create gist with description
gh gist create script.py --desc "My script"
# Create public gist
gh gist create script.py --public
# Create multi-file gist
gh gist create file1.py file2.py
# Create from stdin
echo "print('hello')" | gh gist create
# Edit gist
gh gist edit abc123
# Delete gist
gh gist delete abc123
# Rename gist file
gh gist rename abc123 old.py new.py
# Clone gist
gh gist clone abc123
# Clone to directory
gh gist clone abc123 my-directory
```
## Codespaces (gh codespace)
```bash
# List codespaces
gh codespace list
# Create codespace
gh codespace create
# Create with specific repository
gh codespace create --repo owner/repo
# Create with branch
gh codespace create --branch develop
# Create with specific machine
gh codespace create --machine premiumLinux
# View codespace details
gh codespace view
# SSH into codespace
gh codespace ssh
# SSH with specific command
gh codespace ssh --command "cd /workspaces && ls"
# Open codespace in VS Code
gh codespace code
# Open in browser-based VS Code
gh codespace code --web
# Open a specific codespace
gh codespace code --codespace my-codespace-name
# Stop codespace
gh codespace stop
# Delete codespace
gh codespace delete
# View logs
gh codespace logs
# Follow log output
gh codespace logs --follow
# View ports
gh codespace ports
# Forward port
gh codespace ports forward 8080:8080
# Rebuild codespace
gh codespace rebuild
# Edit codespace
gh codespace edit --machine standardLinux
# Jupyter support
gh codespace jupyter
# Copy files to/from codespace
gh codespace cp file.txt remote:/workspaces/file.txt
gh codespace cp remote:/workspaces/file.txt ./file.txt
```
## Organizations (gh org)
```bash
# List organizations
gh org list
# Limit results
gh org list --limit 50
# View organization details via the API
gh api orgs/orgname
# List organization members via the API
gh api orgs/orgname/members --jq '.[].login'
```
## Search (gh search)
```bash
# Search code
gh search code "TODO"
# Search in specific repository
gh search code "TODO" --repo owner/repo
# Search commits
gh search commits "fix bug"
# Search issues
gh search issues "label:bug state:open"
# Search PRs
gh search prs "is:open is:pr review:required"
# Search repositories
gh search repos "stars:>1000 language:python"
# Limit results
gh search repos "topic:api" --limit 50
# JSON output
gh search repos "stars:>100" --json name,description,stargazers
# Order results
gh search repos "language:rust" --order desc --sort stars
# Search with extensions
gh search code "import" --extension py
# Web search (open in browser)
gh search prs "is:open" --web
```
## Labels (gh label)
```bash
# List labels
gh label list
# Create label
gh label create bug --color "d73a4a" --description "Something isn't working"
# Create with hex color
gh label create enhancement --color "#a2eeef"
# Edit label
gh label edit bug --name "bug-report" --color "ff0000"
# Delete label
gh label delete bug
# Clone labels from repository
gh label clone owner/repo
# Clone to specific repository
gh label clone owner/repo --repo target/repo
```
## SSH Keys (gh ssh-key)
```bash
# List SSH keys
gh ssh-key list
# Add SSH key
gh ssh-key add ~/.ssh/id_rsa.pub --title "My laptop"
# Add key with type
gh ssh-key add ~/.ssh/id_ed25519.pub --type "authentication"
# Delete SSH key
gh ssh-key delete 12345
# Skip confirmation
gh ssh-key delete 12345 --yes
```
## GPG Keys (gh gpg-key)
```bash
# List GPG keys
gh gpg-key list
# Add GPG key (ASCII-armored export, e.g. from: gpg --armor --export <key-id>)
gh gpg-key add public-key.asc
# Delete GPG key
gh gpg-key delete 12345
# Delete by key ID
gh gpg-key delete ABCD1234
```
## Status (gh status)
```bash
# Show status overview
gh status
# Scope to an organization
gh status --org orgname
# Exclude specific repositories
gh status --exclude owner/repo
```
## Configuration (gh config)
```bash
# List all config
gh config list
# Get specific value
gh config get editor
# Set value
gh config set editor vim
# Set git protocol
gh config set git_protocol ssh
# Clear cache
gh config clear-cache
# Set prompt behavior
gh config set prompt disabled
gh config set prompt enabled
```
## Extensions (gh extension)
```bash
# List installed extensions
gh extension list
# Search extensions
gh extension search github
# Install extension
gh extension install owner/extension-repo
# Pin to a specific version
gh extension install owner/extension-repo --pin v1.0.0
# Upgrade extension
gh extension upgrade extension-name
# Remove extension
gh extension remove extension-name
# Create new extension
gh extension create my-extension
# Browse extensions
gh extension browse
# Execute extension command
gh extension exec my-extension --arg value
```
## Aliases (gh alias)
```bash
# List aliases
gh alias list
# Set alias
gh alias set prview 'pr view --web'
# Set shell alias
gh alias set co 'pr checkout' --shell
# Delete alias
gh alias delete prview
# Import aliases from a YAML file
gh alias import aliases.yml
```
## API Requests (gh api)
```bash
# Make API request
gh api /user
# Request with method
gh api --method POST /repos/owner/repo/issues \
--field title="Issue title" \
--field body="Issue body"
# Request with headers
gh api /user \
--header "Accept: application/vnd.github.v3+json"
# Request with pagination
gh api /user/repos --paginate
# Verbose output (request and response details)
gh api /user --verbose
# Include headers in output
gh api /user --include
# Silent mode (no progress output)
gh api /user --silent
# Input from file
gh api --input request.json
# jq query on response
gh api /user --jq '.login'
# Field from response
gh api /repos/owner/repo --jq '.stargazers_count'
# GitHub Enterprise
gh api /user --hostname enterprise.internal
# GraphQL query
gh api graphql \
-f query='
{
viewer {
login
repositories(first: 5) {
nodes {
name
}
}
}
}'
```
## Rulesets (gh ruleset)
```bash
# List rulesets
gh ruleset list
# View ruleset
gh ruleset view 123
# Check rules that would apply to a branch
gh ruleset check feature
# Check in a specific repository
gh ruleset check main --repo owner/repo
```
## Attestations (gh attestation)
```bash
# Download attestation for an artifact
gh attestation download ./artifact.tar.gz --repo owner/repo
# Verify attestation for an artifact
gh attestation verify ./artifact.tar.gz --repo owner/repo
# Get trusted root
gh attestation trusted-root
```
## Completion (gh completion)
```bash
# Generate shell completion
gh completion -s bash > ~/.gh-complete.bash
gh completion -s zsh > ~/.gh-complete.zsh
gh completion -s fish > ~/.gh-complete.fish
gh completion -s powershell > ~/.gh-complete.ps1
# Long-form flag
gh completion --shell=bash
gh completion --shell=zsh
```
## Preview (gh preview)
```bash
# List preview features
gh preview
# Run preview script
gh preview prompter
```
## Agent Tasks (gh agent-task)
```bash
# List agent tasks
gh agent-task list
# View agent task
gh agent-task view 123
# Create agent task
gh agent-task create --description "My task"
```
## Global Flags
Not every flag applies to every command (`--paginate` and `--cache`, for example, belong to `gh api`); check `gh <command> --help`.
| Flag | Description |
| -------------------------- | -------------------------------------- |
| `--help` / `-h` | Show help for command |
| `--version` | Show gh version |
| `--repo [HOST/]OWNER/REPO` | Select another repository |
| `--hostname HOST` | GitHub hostname |
| `--jq EXPRESSION` | Filter JSON output |
| `--json FIELDS` | Output JSON with specified fields |
| `--template STRING` | Format JSON using Go template |
| `--web` | Open in browser |
| `--paginate` | Make additional API calls |
| `--verbose` | Show verbose output |
| `--debug` | Show debug output |
| `--timeout SECONDS` | Maximum API request duration |
| `--cache CACHE` | Cache control (default, force, bypass) |
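These flags combine freely with subcommands, for example:
```bash
# Target another repository and emit filtered JSON in one call
gh issue list --repo owner/repo --json number,title --jq '.[0]'
```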
## Output Formatting
### JSON Output
```bash
# Basic JSON
gh repo view --json name,description
# Nested fields
gh repo view --json owner,name --jq '.owner.login + "/" + .name'
# Array operations
gh pr list --json number,title --jq '.[] | select(.number > 100)'
# Complex queries
gh issue list --json number,title,labels \
--jq '.[] | {number, title: .title, tags: [.labels[].name]}'
```
### Template Output
```bash
# Custom template
gh repo view \
--template '{{.name}}: {{.description}}'
# Multiline template
gh pr view 123 \
--template 'Title: {{.title}}
Author: {{.author.login}}
State: {{.state}}
'
```
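`--template` accepts Go template syntax, so ranges over list output work too; a sketch:
```bash
# Iterate over results with a Go template range
gh pr list --json number,title \
  --template '{{range .}}#{{.number}} {{.title}}{{"\n"}}{{end}}'
```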
## Common Workflows
### Create PR from Issue
```bash
# Create a linked branch from the issue and check it out
gh issue develop 123 --name feature/issue-123 --checkout
# Make changes, commit, push
git add .
git commit -m "Fix issue #123"
git push
# Create PR linking to issue
gh pr create --title "Fix #123" --body "Closes #123"
```
### Bulk Operations
```bash
# Close multiple issues
gh issue list --search "label:stale" \
--json number \
--jq '.[].number' | \
xargs -I {} gh issue close {} --comment "Closing as stale"
# Add label to multiple PRs
gh pr list --search "review:required" \
--json number \
--jq '.[].number' | \
xargs -I {} gh pr edit {} --add-label needs-review
```
### Repository Setup Workflow
```bash
# Create repository with initial setup
gh repo create my-project --public \
--description "My awesome project" \
--clone \
--gitignore Python \
--license mit
cd my-project
# Set up branches
git checkout -b develop
git push -u origin develop
# Create labels
gh label create bug --color "d73a4a" --description "Bug report"
gh label create enhancement --color "a2eeef" --description "Feature request"
gh label create documentation --color "0075ca" --description "Documentation"
```
### CI/CD Workflow
```bash
# Trigger the workflow
gh workflow run ci.yml --ref main
# Get the most recent run ID for that workflow
RUN_ID=$(gh run list --workflow ci.yml --limit 1 --json databaseId --jq '.[0].databaseId')
# Watch the run
gh run watch "$RUN_ID"
# Download artifacts on completion
gh run download "$RUN_ID" --dir ./artifacts
```
### Fork Sync Workflow
```bash
# Fork repository
gh repo fork original/repo --clone
cd repo
# Add upstream remote
git remote add upstream https://github.com/original/repo.git
# Sync fork
gh repo sync
# Or manual sync
git fetch upstream
git checkout main
git merge upstream/main
git push origin main
```
## Environment Setup
### Shell Integration
```bash
# Add to ~/.bashrc or ~/.zshrc
eval "$(gh completion -s bash)" # or zsh/fish
# Create useful aliases
alias gs='gh status'
alias gpr='gh pr view --web'
alias gir='gh issue view --web'
alias gco='gh pr checkout'
```
### Git Configuration
```bash
# Use gh as credential helper
gh auth setup-git
# Or configure the credential helper manually
git config --global credential."https://github.com".helper '!gh auth git-credential'
```
## Best Practices
1. **Authentication**: Use environment variables for automation
```bash
export GH_TOKEN=$(gh auth token)
```
2. **Default Repository**: Set default to avoid repetition
```bash
gh repo set-default owner/repo
```
3. **JSON Parsing**: Use jq for complex data extraction
```bash
gh pr list --json number,title --jq '.[] | select(.title | contains("fix"))'
```
4. **Pagination**: Use --paginate for large result sets
```bash
gh issue list --state all --paginate
```
5. **Caching**: Use cache control for frequently accessed data
```bash
gh api /user --cache force
```
## Getting Help
```bash
# General help
gh --help
# Command help
gh pr --help
gh issue create --help
# Help topics
gh help formatting
gh help environment
gh help exit-codes
gh help accessibility
```
## References
- Official Manual: https://cli.github.com/manual/
- GitHub Docs: https://docs.github.com/en/github-cli
- REST API: https://docs.github.com/en/rest
- GraphQL API: https://docs.github.com/en/graphql

Execute git commit with conventional commit message analysis, intelligent staging, and message generation. Use when user asks to commit changes, create a git commit, or mentions "/commit". Supports: (1) Auto-detecting type and scope from changes, (2) Generating conventional commit messages from diff, (3) Interactive commit with optional type/scope/description overrides, (4) Intelligent file staging for logical grouping
# Git Commit with Conventional Commits
## Overview
Create standardized, semantic git commits using the Conventional Commits specification. Analyze the actual diff to determine appropriate type, scope, and message.
## Conventional Commit Format
```
<type>[optional scope]: <description>
[optional body]
[optional footer(s)]
```
## Commit Types
| Type | Purpose |
| ---------- | ------------------------------ |
| `feat` | New feature |
| `fix` | Bug fix |
| `docs` | Documentation only |
| `style` | Formatting/style (no logic) |
| `refactor` | Code refactor (no feature/fix) |
| `perf` | Performance improvement |
| `test` | Add/update tests |
| `build` | Build system/dependencies |
| `ci` | CI/config changes |
| `chore` | Maintenance/misc |
| `revert` | Revert commit |
## Breaking Changes
```
# Exclamation mark after type/scope
feat!: remove deprecated endpoint
# BREAKING CHANGE footer
feat: allow config to extend other configs
BREAKING CHANGE: `extends` key behavior changed
```
## Workflow
### 1. Analyze Diff
```bash
# If files are staged, use staged diff
git diff --staged
# If nothing staged, use working tree diff
git diff
# Also check status
git status --porcelain
```
### 2. Stage Files (if needed)
If nothing is staged or you want to group changes differently:
```bash
# Stage specific files
git add path/to/file1 path/to/file2
# Stage by pattern
git add *.test.*
git add src/components/*
# Interactive staging
git add -p
```
**Never commit secrets** (.env, credentials.json, private keys).
### 3. Generate Commit Message
Analyze the diff to determine:
- **Type**: What kind of change is this?
- **Scope**: What area/module is affected?
- **Description**: One-line summary of what changed (present tense, imperative mood, <72 chars)
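For example, a diff that only adds pagination handling to an API module might yield (module name hypothetical):
```
feat(api): add pagination to list endpoints
```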
### 4. Execute Commit
```bash
# Single line
git commit -m "<type>[scope]: <description>"
# Multi-line with body/footer
git commit -m "$(cat <<'EOF'
<type>[scope]: <description>
<optional body>
<optional footer>
EOF
)"
```
## Best Practices
- One logical change per commit
- Present tense: "add" not "added"
- Imperative mood: "fix bug" not "fixes bug"
- Reference issues: `Closes #123`, `Refs #456`
- Keep description under 72 characters
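Putting these together, a hypothetical commit with scope, body, and issue footer:
```bash
git commit -m "$(cat <<'EOF'
fix(auth): handle expired SSO tokens

Retry token refresh once before failing the login flow.

Closes #123
EOF
)"
```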
## Git Safety Protocol
- NEVER update git config
- NEVER run destructive commands (--force, hard reset) without explicit request
- NEVER skip hooks (--no-verify) unless user asks
- NEVER force push to main/master
- If commit fails due to hooks, fix and create NEW commit (don't amend)

Create, update, and manage GitHub issues using MCP tools. Use this skill when users want to create bug reports, feature requests, or task issues, update existing issues, add labels/assignees/milestones, or manage issue workflows. Triggers on requests like "create an issue", "file a bug", "request a feature", "update issue X", or any GitHub issue management task.
# GitHub Issues
Manage GitHub issues using the `@modelcontextprotocol/server-github` MCP server.
## Available MCP Tools
| Tool | Purpose |
|------|---------|
| `mcp__github__create_issue` | Create new issues |
| `mcp__github__update_issue` | Update existing issues |
| `mcp__github__get_issue` | Fetch issue details |
| `mcp__github__search_issues` | Search issues |
| `mcp__github__add_issue_comment` | Add comments |
| `mcp__github__list_issues` | List repository issues |
## Workflow
1. **Determine action**: Create, update, or query?
2. **Gather context**: Get repo info, existing labels, milestones if needed
3. **Structure content**: Use appropriate template from [references/templates.md](references/templates.md)
4. **Execute**: Call the appropriate MCP tool
5. **Confirm**: Report the issue URL to user
## Creating Issues
### Required Parameters
```
owner: repository owner (org or user)
repo: repository name
title: clear, actionable title
body: structured markdown content
```
### Optional Parameters
```
labels: ["bug", "enhancement", "documentation", ...]
assignees: ["username1", "username2"]
milestone: milestone number (integer)
```
### Title Guidelines
- Start with type prefix when useful: `[Bug]`, `[Feature]`, `[Docs]`
- Be specific and actionable
- Keep under 72 characters
- Examples:
- `[Bug] Login fails with SSO enabled`
- `[Feature] Add dark mode support`
- `Add unit tests for auth module`
### Body Structure
Always use the templates in [references/templates.md](references/templates.md). Choose based on issue type:
| User Request | Template |
|--------------|----------|
| Bug, error, broken, not working | Bug Report |
| Feature, enhancement, add, new | Feature Request |
| Task, chore, refactor, update | Task |
## Updating Issues
Use `mcp__github__update_issue` with:
```
owner, repo, issue_number (required)
title, body, state, labels, assignees, milestone (optional - only changed fields)
```
State values: `open`, `closed`
## Examples
### Example 1: Bug Report
**User**: "Create a bug issue - the login page crashes when using SSO"
**Action**: Call `mcp__github__create_issue` with:
```json
{
"owner": "github",
"repo": "awesome-copilot",
"title": "[Bug] Login page crashes when using SSO",
"body": "## Description\nThe login page crashes when users attempt to authenticate using SSO.\n\n## Steps to Reproduce\n1. Navigate to login page\n2. Click 'Sign in with SSO'\n3. Page crashes\n\n## Expected Behavior\nSSO authentication should complete and redirect to dashboard.\n\n## Actual Behavior\nPage becomes unresponsive and displays error.\n\n## Environment\n- Browser: [To be filled]\n- OS: [To be filled]\n\n## Additional Context\nReported by user.",
"labels": ["bug"]
}
```
### Example 2: Feature Request
**User**: "Create a feature request for dark mode with high priority"
**Action**: Call `mcp__github__create_issue` with:
```json
{
"owner": "github",
"repo": "awesome-copilot",
"title": "[Feature] Add dark mode support",
"body": "## Summary\nAdd dark mode theme option for improved user experience and accessibility.\n\n## Motivation\n- Reduces eye strain in low-light environments\n- Increasingly expected by users\n- Improves accessibility\n\n## Proposed Solution\nImplement theme toggle with system preference detection.\n\n## Acceptance Criteria\n- [ ] Toggle switch in settings\n- [ ] Persists user preference\n- [ ] Respects system preference by default\n- [ ] All UI components support both themes\n\n## Alternatives Considered\nNone specified.\n\n## Additional Context\nHigh priority request.",
"labels": ["enhancement", "high-priority"]
}
```
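### Example 3: Close an Issue
**User**: "Close issue 42, it was fixed by the latest release"
**Action**: Call `mcp__github__update_issue` with (values hypothetical):
```json
{
  "owner": "github",
  "repo": "awesome-copilot",
  "issue_number": 42,
  "state": "closed"
}
```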
## Common Labels
Use these standard labels when applicable:
| Label | Use For |
|-------|---------|
| `bug` | Something isn't working |
| `enhancement` | New feature or improvement |
| `documentation` | Documentation updates |
| `good first issue` | Good for newcomers |
| `help wanted` | Extra attention needed |
| `question` | Further information requested |
| `wontfix` | Will not be addressed |
| `duplicate` | Already exists |
| `high-priority` | Urgent issues |
## Tips
- Always confirm the repository context before creating issues
- Ask for missing critical information rather than guessing
- Link related issues when known: `Related to #123`
- For updates, fetch current issue first to preserve unchanged fields

Process and manipulate images using ImageMagick. Supports resizing, format conversion, batch processing, and retrieving image metadata. Use when working with images, creating thumbnails, resizing wallpapers, or performing batch image operations.
# Image Manipulation with ImageMagick
This skill enables image processing and manipulation tasks using ImageMagick
across Windows, Linux, and macOS systems.
## When to Use This Skill
Use this skill when you need to:
- Resize images (single or batch)
- Get image dimensions and metadata
- Convert between image formats
- Create thumbnails
- Process wallpapers for different screen sizes
- Batch process multiple images with specific criteria
## Prerequisites
- ImageMagick installed on the system
- **Windows**: PowerShell with ImageMagick available as `magick` (or at `C:\Program Files\ImageMagick-*\magick.exe`)
- **Linux/macOS**: Bash with ImageMagick installed via package manager (`apt`, `brew`, etc.)
## Core Capabilities
### 1. Image Information
- Get image dimensions (width x height)
- Retrieve detailed metadata (format, color space, etc.)
- Identify image format
### 2. Image Resizing
- Resize single images
- Batch resize multiple images
- Create thumbnails with specific dimensions
- Maintain aspect ratios
### 3. Batch Processing
- Process images based on dimensions
- Filter and process specific file types
- Apply transformations to multiple files
## Usage Examples
### Example 0: Resolve `magick` executable
**PowerShell (Windows):**
```powershell
# Prefer ImageMagick on PATH
$magick = (Get-Command magick -ErrorAction SilentlyContinue)?.Source
# Fallback: common install pattern under Program Files
if (-not $magick) {
$magick = Get-ChildItem "C:\\Program Files\\ImageMagick-*\\magick.exe" -ErrorAction SilentlyContinue |
Select-Object -First 1 -ExpandProperty FullName
}
if (-not $magick) {
throw "ImageMagick not found. Install it and/or add 'magick' to PATH."
}
```
**Bash (Linux/macOS):**
```bash
# Check if magick is available on PATH
if ! command -v magick &> /dev/null; then
echo "ImageMagick not found. Install it using your package manager:"
echo " Ubuntu/Debian: sudo apt install imagemagick"
echo " macOS: brew install imagemagick"
exit 1
fi
```
### Example 1: Get Image Dimensions
**PowerShell (Windows):**
```powershell
# For a single image
& $magick identify -format "%wx%h" path/to/image.jpg
# For multiple images
Get-ChildItem "path/to/images/*" | ForEach-Object {
$dimensions = & $magick identify -format "%f: %wx%h`n" $_.FullName
Write-Host $dimensions
}
```
**Bash (Linux/macOS):**
```bash
# For a single image
magick identify -format "%wx%h" path/to/image.jpg
# For multiple images
for img in path/to/images/*; do
magick identify -format "%f: %wx%h\n" "$img"
done
```
### Example 2: Resize Images
**PowerShell (Windows):**
```powershell
# Resize a single image
& $magick input.jpg -resize 427x240 output.jpg
# Batch resize images
Get-ChildItem "path/to/images/*" | ForEach-Object {
& $magick $_.FullName -resize 427x240 "path/to/output/thumb_$($_.Name)"
}
```
**Bash (Linux/macOS):**
```bash
# Resize a single image
magick input.jpg -resize 427x240 output.jpg
# Batch resize images
for img in path/to/images/*; do
filename=$(basename "$img")
magick "$img" -resize 427x240 "path/to/output/thumb_$filename"
done
```
### Example 3: Get Detailed Image Information
**PowerShell (Windows):**
```powershell
# Get verbose information about an image
& $magick identify -verbose path/to/image.jpg
```
**Bash (Linux/macOS):**
```bash
# Get verbose information about an image
magick identify -verbose path/to/image.jpg
```
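Format conversion follows the same pattern on both platforms; a minimal sketch:
```bash
# Convert PNG to JPEG at 90% quality (Bash; invoke via `& $magick` in PowerShell)
magick input.png -quality 90 output.jpg
```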
### Example 4: Process Images Based on Dimensions
**PowerShell (Windows):**
```powershell
Get-ChildItem "path/to/images/*" | ForEach-Object {
$dimensions = & $magick identify -format "%w,%h" $_.FullName
if ($dimensions) {
$width,$height = $dimensions -split ','
if ([int]$width -eq 2560 -or [int]$height -eq 1440) {
Write-Host "Processing $($_.Name)"
& $magick $_.FullName -resize 427x240 "path/to/output/thumb_$($_.Name)"
}
}
}
```
**Bash (Linux/macOS):**
```bash
for img in path/to/images/*; do
dimensions=$(magick identify -format "%w,%h" "$img")
if [[ -n "$dimensions" ]]; then
width=$(echo "$dimensions" | cut -d',' -f1)
height=$(echo "$dimensions" | cut -d',' -f2)
if [[ "$width" -eq 2560 || "$height" -eq 1440 ]]; then
filename=$(basename "$img")
echo "Processing $filename"
magick "$img" -resize 427x240 "path/to/output/thumb_$filename"
fi
fi
done
```
## Guidelines
1. **Always quote file paths** - Use quotes around file paths that might contain spaces
2. **Use the `&` operator (PowerShell)** - Invoke the magick executable using `&` in PowerShell
3. **Store the path in a variable (PowerShell)** - Assign the ImageMagick path to `$magick` for cleaner code
4. **Wrap in loops** - When processing multiple files, use `ForEach-Object` (PowerShell) or `for` loops (Bash)
5. **Verify dimensions first** - Check image dimensions before processing to avoid unnecessary operations
6. **Use appropriate resize flags** - Consider using `!` to force exact dimensions or `^` for minimum dimensions
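A sketch of the resize flags from point 6 (single quotes keep the shell from expanding `!`):
```bash
# Force exact dimensions, ignoring aspect ratio
magick input.jpg -resize '427x240!' stretched.jpg
# Treat dimensions as a minimum, then crop to the exact size
magick input.jpg -resize '1920x1080^' -gravity center -extent 1920x1080 wallpaper.jpg
```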
## Common Patterns
### PowerShell Patterns
#### Pattern: Store ImageMagick Path
```powershell
$magick = (Get-Command magick).Source
```
#### Pattern: Get Dimensions as Variables
```powershell
$dimensions = & $magick identify -format "%w,%h" $_.FullName
$width,$height = $dimensions -split ','
```
#### Pattern: Conditional Processing
```powershell
if ([int]$width -gt 1920) {
& $magick $_.FullName -resize 1920x1080 $outputPath
}
```
#### Pattern: Create Thumbnails
```powershell
& $magick $_.FullName -resize 427x240 "thumbnails/thumb_$($_.Name)"
```
### Bash Patterns
#### Pattern: Check ImageMagick Installation
```bash
command -v magick &> /dev/null || { echo "ImageMagick required"; exit 1; }
```
#### Pattern: Get Dimensions as Variables
```bash
dimensions=$(magick identify -format "%w,%h" "$img")
width=$(echo "$dimensions" | cut -d',' -f1)
height=$(echo "$dimensions" | cut -d',' -f2)
```
#### Pattern: Conditional Processing
```bash
if [[ "$width" -gt 1920 ]]; then
magick "$img" -resize 1920x1080 "$outputPath"
fi
```
#### Pattern: Create Thumbnails
```bash
filename=$(basename "$img")
magick "$img" -resize 427x240 "thumbnails/thumb_$filename"
```
## Limitations
- Large batch operations may be memory-intensive
- Some complex operations may require additional ImageMagick delegates
- On older Linux systems, use `convert` instead of `magick` (ImageMagick 6.x vs 7.x)

Generate breadboard circuit mockups and visual diagrams using HTML5 Canvas drawing techniques. Use when asked to create circuit layouts, visualize electronic component placements, draw breadboard diagrams, mockup 6502 builds, generate retro computer schematics, or design vintage electronics projects. Supports 555 timers, W65C02S microprocessors, 28C256 EEPROMs, W65C22 VIA chips, 7400-series logic gates, LEDs, resistors, capacitors, switches, buttons, crystals, and wires.
# Legacy Circuit Mockups
A skill for creating breadboard circuit mockups and visual diagrams for retro computing and electronics projects. This skill leverages HTML5 Canvas drawing mechanisms to render interactive circuit layouts featuring vintage components like the 6502 microprocessor, 555 timer ICs, EEPROMs, and 7400-series logic gates.
## When to Use This Skill
- User asks to "create a breadboard layout" or "mockup a circuit"
- User wants to visualize component placement on a breadboard
- User needs a visual reference for building a 6502 computer
- User asks to "draw a circuit" or "diagram electronics"
- User wants to create educational electronics visuals
- User mentions Ben Eater tutorials or retro computing projects
- User asks to mockup 555 timer circuits or LED projects
- User needs to visualize wire connections between components
## Prerequisites
- Understanding of component pinouts from bundled reference files
- Knowledge of breadboard layout conventions (rows, columns, power rails)
## Supported Components
### Microprocessors & Memory
| Component | Pins | Description |
|-----------|------|-------------|
| W65C02S | 40-pin DIP | 8-bit microprocessor with 16-bit address bus |
| 28C256 | 28-pin DIP | 32KB parallel EEPROM |
| W65C22 | 40-pin DIP | Versatile Interface Adapter (VIA) |
| 62256 | 28-pin DIP | 32KB static RAM |
### Logic & Timer ICs
| Component | Pins | Description |
|-----------|------|-------------|
| NE555 | 8-pin DIP | Timer IC for timing and oscillation |
| 7400 | 14-pin DIP | Quad 2-input NAND gate |
| 7402 | 14-pin DIP | Quad 2-input NOR gate |
| 7404 | 14-pin DIP | Hex inverter (NOT gate) |
| 7408 | 14-pin DIP | Quad 2-input AND gate |
| 7432 | 14-pin DIP | Quad 2-input OR gate |
### Passive & Active Components
| Component | Description |
|-----------|-------------|
| LED | Light emitting diode (various colors) |
| Resistor | Current limiting (configurable values) |
| Capacitor | Filtering and timing (ceramic/electrolytic) |
| Crystal | Clock oscillator |
| Switch | Toggle switch (latching) |
| Button | Momentary push button |
| Potentiometer | Variable resistor |
| Photoresistor | Light-dependent resistor |
## Canvas Drawing Patterns
### Grid System
```javascript
// Standard breadboard grid: 20px spacing
const gridSize = 20;
const cellX = Math.floor(x / gridSize) * gridSize;
const cellY = Math.floor(y / gridSize) * gridSize;
```
### Component Rendering Pattern
```javascript
// All components follow this structure:
{
type: 'component-type',
x: gridX,
y: gridY,
width: componentWidth,
height: componentHeight,
rotation: 0, // 0, 90, 180, 270
properties: { /* component-specific data */ }
}
```
### Wire Connections
```javascript
// Wire connection format:
{
start: { x: startX, y: startY },
end: { x: endX, y: endY },
color: '#ff0000' // Wire color coding
}
```
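A hypothetical wire list for a minimal LED circuit, following the format above and the wire color conventions later in this document:
```javascript
// Power and ground runs from the rails to component rows (coordinates hypothetical)
const wires = [
  { start: { x: 40, y: 20 }, end: { x: 40, y: 200 }, color: '#ff0000' }, // +5V
  { start: { x: 60, y: 20 }, end: { x: 60, y: 200 }, color: '#000000' }  // GND
];
```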
## Step-by-Step Workflows
### Creating a Basic LED Circuit Mockup
1. Define breadboard dimensions and grid
2. Place power rail connections (+5V and GND)
3. Add LED component with anode/cathode orientation
4. Place current-limiting resistor
5. Draw wire connections between components
6. Add labels and annotations
### Creating a 555 Timer Circuit
1. Place NE555 IC on breadboard (pins 1-4 left, 5-8 right)
2. Connect pin 1 (GND) to ground rail
3. Connect pin 8 (Vcc) to power rail
4. Add timing resistors and capacitors
5. Wire trigger and threshold connections
6. Connect output to LED or other load
### Creating a 6502 Microprocessor Layout
1. Place W65C02S centered on breadboard
2. Add 28C256 EEPROM for program storage
3. Place W65C22 VIA for I/O
4. Add 7400-series logic for address decoding
5. Wire address bus (A0-A15)
6. Wire data bus (D0-D7)
7. Connect control signals (R/W, PHI2, RESB)
8. Add reset button and clock crystal
## Component Pinout Quick Reference
### 555 Timer (8-pin DIP)
| Pin | Name | Function |
|:---:|:-----|:---------|
| 1 | GND | Ground (0V) |
| 2 | TRIG | Trigger (< 1/3 Vcc starts timing) |
| 3 | OUT | Output (source/sink 200mA) |
| 4 | RESET | Active-low reset |
| 5 | CTRL | Control voltage (bypass with 10nF) |
| 6 | THR | Threshold (> 2/3 Vcc resets) |
| 7 | DIS | Discharge (open collector) |
| 8 | Vcc | Supply (+4.5V to +16V) |
### W65C02S (40-pin DIP) - Key Pins
| Pin | Name | Function |
|:---:|:-----|:---------|
| 8 | VDD | Power supply |
| 21 | VSS | Ground |
| 37 | PHI2 | System clock input |
| 40 | RESB | Active-low reset |
| 34 | RWB | Read/Write signal |
| 9-20, 22-25 | A0-A15 | Address bus |
| 26-33 | D0-D7 | Data bus |
### 28C256 EEPROM (28-pin DIP) - Key Pins
| Pin | Name | Function |
|:---:|:-----|:---------|
| 14 | GND | Ground |
| 28 | VCC | Power supply |
| 20 | CE | Chip enable (active-low) |
| 22 | OE | Output enable (active-low) |
| 27 | WE | Write enable (active-low) |
| 1-10, 21, 23-26 | A0-A14 | Address inputs |
| 11-13, 15-19 | I/O0-I/O7 | Data bus |
## Formulas Reference
### Resistor Calculations
- **Ohm's Law:** V = I × R
- **LED Current:** R = (Vcc - Vled) / Iled
- **Power:** P = V × I = I² × R
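- **Worked example:** Red LED (Vled ≈ 2.0V) on 5V at a 20mA target: R = (5.0 − 2.0) / 0.02 = 150Ω; the 220Ω used in Build 1 yields ≈ 13.6mA, safely within spec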
### 555 Timer Formulas
**Astable Mode:**
- Frequency: f = 1.44 / ((R1 + 2×R2) × C)
- High time: t₁ = 0.693 × (R1 + R2) × C
- Low time: t₂ = 0.693 × R2 × C
- Duty cycle: D = (R1 + R2) / (R1 + 2×R2) × 100%
**Monostable Mode:**
- Pulse width: T = 1.1 × R × C
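**Worked example (Build 2 astable values, R1 = 10kΩ, R2 = 100kΩ, C = 10µF):**
- f = 1.44 / ((10k + 2×100k) × 10µF) ≈ 0.69 Hz
- t₁ = 0.693 × 110k × 10µF ≈ 0.76 s high; t₂ = 0.693 × 100k × 10µF ≈ 0.69 s low
- Duty cycle = 110 / 210 × 100% ≈ 52%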
### Capacitor Calculations
- Capacitive reactance: Xc = 1 / (2πfC)
- Energy stored: E = ½ × C × V²
## Color Coding Conventions
### Wire Colors
| Color | Purpose |
|-------|---------|
| Red | +5V / Power |
| Black | Ground |
| Yellow | Clock / Timing |
| Blue | Address bus |
| Green | Data bus |
| Orange | Control signals |
| White | General purpose |
### LED Colors
| Color | Forward Voltage |
|-------|-----------------|
| Red | 1.8V - 2.2V |
| Green | 2.0V - 2.2V |
| Yellow | 2.0V - 2.2V |
| Blue | 3.0V - 3.5V |
| White | 3.0V - 3.5V |
## Build Examples
### Build 1 — Single LED
**Components:** Red LED, 220Ω resistor, jumper wires, power source
**Steps:**
1. Insert black jumper wire from power GND to row A5
2. Insert red jumper wire from power +5V to row J5
3. Place LED with cathode (short leg) in row aligned with GND
4. Place 220Ω resistor between power and LED anode
### Build 2 — 555 Astable Blinker
**Components:** NE555, LED, resistors (10kΩ, 100kΩ), capacitor (10µF)
**Steps:**
1. Place 555 IC straddling center channel
2. Connect pin 1 to GND, pin 8 to +5V
3. Connect pin 4 to pin 8 (disable reset)
4. Wire 10kΩ between pin 7 and +5V
5. Wire 100kΩ between pins 6 and 7
6. Wire 10µF between pin 6 and GND
7. Connect pin 3 (output) to LED circuit
## Troubleshooting
| Issue | Solution |
|-------|----------|
| LED doesn't light | Check polarity (anode to +, cathode to -) |
| Circuit doesn't power | Verify power rail connections |
| IC not working | Check VCC and GND pin connections |
| 555 not oscillating | Verify threshold/trigger capacitor wiring |
| Microprocessor stuck | Check RESB is HIGH after reset pulse |
## References
Detailed component specifications are available in the bundled reference files:
- [555.md](references/555.md) - Complete 555 timer IC specification
- [6502.md](references/6502.md) - MOS 6502 microprocessor details
- [6522.md](references/6522.md) - W65C22 VIA interface adapter
- [28256-eeprom.md](references/28256-eeprom.md) - AT28C256 EEPROM specification
- [6C62256.md](references/6C62256.md) - 62256 SRAM details
- [7400-series.md](references/7400-series.md) - TTL logic gate pinouts
- [assembly-compiler.md](references/assembly-compiler.md) - Assembly compiler specification
- [assembly-language.md](references/assembly-language.md) - Assembly language specification
- [basic-electronic-components.md](references/basic-electronic-components.md) - Resistors, capacitors, switches
- [breadboard.md](references/breadboard.md) - Breadboard specifications
- [common-breadboard-components.md](references/common-breadboard-components.md) - Comprehensive component reference
- [connecting-electronic-components.md](references/connecting-electronic-components.md) - Step-by-step build guides
- [emulator-28256-eeprom.md](references/emulator-28256-eeprom.md) - Emulating 28256-eeprom specification
- [emulator-6502.md](references/emulator-6502.md) - Emulating 6502 specification
- [emulator-6522.md](references/emulator-6522.md) - Emulating 6522 specification
- [emulator-6C62256.md](references/emulator-6C62256.md) - Emulating 6C62256 specification
- [emulator-lcd.md](references/emulator-lcd.md) - Emulating a LCD specification
- [lcd.md](references/lcd.md) - LCD display interfacing
- [minipro.md](references/minipro.md) - EEPROM programmer usage
- [t48eeprom-programmer.md](references/t48eeprom-programmer.md) - T48 programmer reference

Create new Agent Skills for GitHub Copilot from prompts or by duplicating this template. Use when asked to "create a skill", "make a new skill", "scaffold a skill", or when building specialized AI capabilities with bundled resources. Generates SKILL.md files with proper frontmatter, directory structure, and optional scripts/references/assets folders.
# Make Skill Template
A meta-skill for creating new Agent Skills. Use this skill when you need to scaffold a new skill folder, generate a SKILL.md file, or help users understand the Agent Skills specification.
## When to Use This Skill
- User asks to "create a skill", "make a new skill", or "scaffold a skill"
- User wants to add a specialized capability to their GitHub Copilot setup
- User needs help structuring a skill with bundled resources
- User wants to duplicate this template as a starting point
## Prerequisites
- Understanding of what the skill should accomplish
- A clear, keyword-rich description of capabilities and triggers
- Knowledge of any bundled resources needed (scripts, references, assets, templates)
## Creating a New Skill
### Step 1: Create the Skill Directory
Create a new folder with a lowercase, hyphenated name:
```
skills/<skill-name>/
└── SKILL.md # Required
```
### Step 2: Generate SKILL.md with Frontmatter
Every skill requires YAML frontmatter with `name` and `description`:
```yaml
---
name: <skill-name>
description: '<What it does>. Use when <specific triggers, scenarios, keywords users might say>.'
---
```
#### Frontmatter Field Requirements
| Field | Required | Constraints |
|-------|----------|-------------|
| `name` | **Yes** | 1-64 chars, lowercase letters/numbers/hyphens only, must match folder name |
| `description` | **Yes** | 1-1024 chars, must describe WHAT it does AND WHEN to use it |
| `license` | No | License name or reference to bundled LICENSE.txt |
| `compatibility` | No | 1-500 chars, environment requirements if needed |
| `metadata` | No | Key-value pairs for additional properties |
| `allowed-tools` | No | Space-delimited list of pre-approved tools (experimental) |
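As a quick sanity check before publishing, the `name` rules above can be tested in the shell; a minimal sketch (the regex also rejects leading, trailing, and consecutive hyphens, which the validator flags):
```bash
name="my-skill"   # placeholder skill name
if [[ "$name" =~ ^[a-z0-9]+(-[a-z0-9]+)*$ ]] && (( ${#name} <= 64 )); then
  echo "name ok"
else
  echo "invalid name" >&2
fi
```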
#### Description Best Practices
**CRITICAL**: The `description` is the PRIMARY mechanism for automatic skill discovery. Include:
1. **WHAT** the skill does (capabilities)
2. **WHEN** to use it (triggers, scenarios, file types)
3. **Keywords** users might mention in prompts
**Good example:**
```yaml
description: 'Toolkit for testing local web applications using Playwright. Use when asked to verify frontend functionality, debug UI behavior, capture browser screenshots, or view browser console logs. Supports Chrome, Firefox, and WebKit.'
```
**Poor example:**
```yaml
description: 'Web testing helpers'
```
### Step 3: Write the Skill Body
After the frontmatter, add markdown instructions. Recommended sections:
| Section | Purpose |
|---------|---------|
| `# Title` | Brief overview |
| `## When to Use This Skill` | Reinforces description triggers |
| `## Prerequisites` | Required tools, dependencies |
| `## Step-by-Step Workflows` | Numbered steps for tasks |
| `## Troubleshooting` | Common issues and solutions |
| `## References` | Links to bundled docs |
### Step 4: Add Optional Directories (If Needed)
| Folder | Purpose | When to Use |
|--------|---------|-------------|
| `scripts/` | Executable code (Python, Bash, JS) | Automation that performs operations |
| `references/` | Documentation agent reads | API references, schemas, guides |
| `assets/` | Static files used AS-IS | Images, fonts, templates |
| `templates/` | Starter code agent modifies | Scaffolds to extend |
## Example: Complete Skill Structure
```
my-awesome-skill/
├── SKILL.md # Required instructions
├── LICENSE.txt # Optional license file
├── scripts/
│ └── helper.py # Executable automation
├── references/
│ ├── api-reference.md # Detailed docs
│ └── examples.md # Usage examples
├── assets/
│ └── diagram.png # Static resources
└── templates/
└── starter.ts # Code scaffold
```
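This layout can be scaffolded in one step; a minimal sketch using the folder names from the tables above (`my-awesome-skill` is a placeholder):
```bash
mkdir -p my-awesome-skill/{scripts,references,assets,templates}
touch my-awesome-skill/SKILL.md      # add frontmatter and body next
touch my-awesome-skill/LICENSE.txt   # optional
```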
## Quick Start: Duplicate This Template
1. Copy the `make-skill-template/` folder
2. Rename to your skill name (lowercase, hyphens)
3. Update `SKILL.md`:
- Change `name:` to match folder name
- Write a keyword-rich `description:`
- Replace body content with your instructions
4. Add bundled resources as needed
5. Validate with `npm run skill:validate`
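A sketch of that flow, assuming skills live under a `skills/` folder and your repo exposes the validation script named in step 5:
```bash
cp -r skills/make-skill-template skills/my-new-skill   # steps 1-2 (placeholder name)
# step 3: edit skills/my-new-skill/SKILL.md -> name: my-new-skill, keyword-rich description
npm run skill:validate                                 # step 5
```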
## Validation Checklist
- [ ] Folder name is lowercase with hyphens
- [ ] `name` field matches folder name exactly
- [ ] `description` is 10-1024 characters
- [ ] `description` explains WHAT and WHEN
- [ ] `description` is wrapped in single quotes
- [ ] Body content is under 500 lines
- [ ] Bundled assets are under 5MB each
## Troubleshooting
| Issue | Solution |
|-------|----------|
| Skill not discovered | Improve description with more keywords and triggers |
| Validation fails on name | Ensure lowercase, no consecutive hyphens, matches folder |
| Description too short | Add capabilities, triggers, and keywords |
| Assets not found | Use relative paths from skill root |
## References
- Agent Skills official spec: <https://agentskills.io/specification>
Interface for MCP (Model Context Protocol) servers via CLI. Use when you need to interact with external tools, APIs, or data sources through MCP servers, list available MCP servers/tools, or call MCP tools from command line.
# MCP-CLI
Access MCP servers through the command line. MCP enables interaction with external systems like GitHub, filesystems, databases, and APIs.
## Commands
| Command | Output |
| ---------------------------------- | ------------------------------- |
| `mcp-cli` | List all servers and tool names |
| `mcp-cli <server>` | Show tools with parameters |
| `mcp-cli <server>/<tool>` | Get tool JSON schema |
| `mcp-cli <server>/<tool> '<json>'` | Call tool with arguments |
| `mcp-cli grep "<glob>"` | Search tools by name |
**Add `-d` to include descriptions** (e.g., `mcp-cli filesystem -d`)
## Workflow
1. **Discover**: `mcp-cli` → see available servers and tools
2. **Explore**: `mcp-cli <server>` → see tools with parameters
3. **Inspect**: `mcp-cli <server>/<tool>` → get full JSON input schema
4. **Execute**: `mcp-cli <server>/<tool> '<json>'` → run with arguments
## Examples
```bash
# List all servers and tool names
mcp-cli
# See all tools with parameters
mcp-cli filesystem
# With descriptions (more verbose)
mcp-cli filesystem -d
# Get JSON schema for specific tool
mcp-cli filesystem/read_file
# Call the tool
mcp-cli filesystem/read_file '{"path": "./README.md"}'
# Search for tools
mcp-cli grep "*file*"
# JSON output for parsing
mcp-cli filesystem/read_file '{"path": "./README.md"}' --json
# Complex JSON with quotes (use heredoc or stdin)
mcp-cli server/tool <<EOF
{"content": "Text with 'quotes' inside"}
EOF
# Or pipe from a file/command
cat args.json | mcp-cli server/tool
# Find all TypeScript files and read the first one
mcp-cli filesystem/search_files '{"path": "src/", "pattern": "*.ts"}' --json | jq -r '.content[0].text' | head -1 | xargs -I {} sh -c 'mcp-cli filesystem/read_file "{\"path\": \"{}\"}"'
```
## Options
| Flag | Purpose |
| ------------ | ------------------------- |
| `-j, --json` | JSON output for scripting |
| `-r, --raw` | Raw text content |
| `-d` | Include descriptions |
## Exit Codes
- `0`: Success
- `1`: Client error (bad args, missing config)
- `2`: Server error (tool failed)
- `3`: Network error
Look up Microsoft API references, find working code samples, and verify SDK code is correct. Use when working with Azure SDKs, .NET libraries, or Microsoft APIs—to find the right method, check parameters, get working examples, or troubleshoot errors. Catches hallucinated methods, wrong signatures, and deprecated patterns by querying official docs.
# Microsoft Code Reference
## Tools
| Need | Tool | Example |
|------|------|---------|
| API method/class lookup | `microsoft_docs_search` | `"BlobClient UploadAsync Azure.Storage.Blobs"` |
| Working code sample | `microsoft_code_sample_search` | `query: "upload blob managed identity", language: "python"` |
| Full API reference | `microsoft_docs_fetch` | Fetch URL from `microsoft_docs_search` (for overloads, full signatures) |
## Finding Code Samples
Use `microsoft_code_sample_search` to get official, working examples:
```
microsoft_code_sample_search(query: "upload file to blob storage", language: "csharp")
microsoft_code_sample_search(query: "authenticate with managed identity", language: "python")
microsoft_code_sample_search(query: "send message service bus", language: "javascript")
```
**When to use:**
- Before writing code—find a working pattern to follow
- After errors—compare your code against a known-good sample
- Unsure of initialization/setup—samples show complete context
## API Lookups
```
# Verify method exists (include namespace for precision)
"BlobClient UploadAsync Azure.Storage.Blobs"
"GraphServiceClient Users Microsoft.Graph"
# Find class/interface
"DefaultAzureCredential class Azure.Identity"
# Find correct package
"Azure Blob Storage NuGet package"
"azure-storage-blob pip package"
```
Fetch full page when method has multiple overloads or you need complete parameter details.
## Error Troubleshooting
Use `microsoft_code_sample_search` to find working code samples and compare with your implementation. For specific errors, use `microsoft_docs_search` and `microsoft_docs_fetch`:
| Error Type | Query |
|------------|-------|
| Method not found | `"[ClassName] methods [Namespace]"` |
| Type not found | `"[TypeName] NuGet package namespace"` |
| Wrong signature | `"[ClassName] [MethodName] overloads"` → fetch full page |
| Deprecated warning | `"[OldType] migration v12"` |
| Auth failure | `"DefaultAzureCredential troubleshooting"` |
| 403 Forbidden | `"[ServiceName] RBAC permissions"` |
## When to Verify
Always verify when:
- Method name seems "too convenient" (`UploadFile` vs actual `Upload`)
- Mixing SDK versions (v11 `CloudBlobClient` vs v12 `BlobServiceClient`)
- Package name doesn't follow conventions (`Azure.*` for .NET, `azure-*` for Python)
- Using an API for the first time
## Validation Workflow
Before generating code using Microsoft SDKs, verify it's correct:
1. **Confirm method or package exists** — `microsoft_docs_search(query: "[ClassName] [MethodName] [Namespace]")`
2. **Fetch full details** (for overloads/complex params) — `microsoft_docs_fetch(url: "...")`
3. **Find working sample** — `microsoft_code_sample_search(query: "[task]", language: "[lang]")`
For simple lookups, step 1 alone may suffice. For complex API usage, complete all three steps.
Query official Microsoft documentation to understand concepts, find tutorials, and learn how services work. Use for Azure, .NET, Microsoft 365, Windows, Power Platform, and all Microsoft technologies. Get accurate, current information from learn.microsoft.com and other official Microsoft websites—architecture overviews, quickstarts, configuration guides, limits, and best practices.
# Microsoft Docs
## Tools
| Tool | Use For |
|------|---------|
| `microsoft_docs_search` | Find documentation—concepts, guides, tutorials, configuration |
| `microsoft_docs_fetch` | Get full page content (when search excerpts aren't enough) |
## When to Use
- **Understanding concepts** — "How does Cosmos DB partitioning work?"
- **Learning a service** — "Azure Functions overview", "Container Apps architecture"
- **Finding tutorials** — "quickstart", "getting started", "step-by-step"
- **Configuration options** — "App Service configuration settings"
- **Limits & quotas** — "Azure OpenAI rate limits", "Service Bus quotas"
- **Best practices** — "Azure security best practices"
## Query Effectiveness
Good queries are specific:
```
# ❌ Too broad
"Azure Functions"
# ✅ Specific
"Azure Functions Python v2 programming model"
"Cosmos DB partition key design best practices"
"Container Apps scaling rules KEDA"
```
Include context:
- **Version** when relevant (`.NET 8`, `EF Core 8`)
- **Task intent** (`quickstart`, `tutorial`, `overview`, `limits`)
- **Platform** for multi-platform docs (`Linux`, `Windows`)
## When to Fetch Full Page
Fetch after search when:
- **Tutorials** — need complete step-by-step instructions
- **Configuration guides** — need all options listed
- **Deep dives** — user wants comprehensive coverage
- **Search excerpt is cut off** — full context needed
## Why Use This
- **Accuracy** — live docs, not training data that may be outdated
- **Completeness** — tutorials have all steps, not fragments
- **Authority** — official Microsoft documentation
Manage NuGet packages in .NET projects/solutions. Use this skill when adding, removing, or updating NuGet package versions. It enforces using `dotnet` CLI for package management and provides strict procedures for direct file edits only when updating versions.
# NuGet Manager
## Overview
This skill ensures consistent and safe management of NuGet packages across .NET projects. It prioritizes using the `dotnet` CLI to maintain project integrity and enforces a strict verification and restoration workflow for version updates.
## Prerequisites
- .NET SDK installed (typically .NET 8.0 SDK or later, or a version compatible with the target solution).
- `dotnet` CLI available on your `PATH`.
- `jq` (JSON processor) OR PowerShell (for version verification using `dotnet package search`).
## Core Rules
1. **NEVER** directly edit `.csproj`, `.props`, or `Directory.Packages.props` files to **add** or **remove** packages. Always use `dotnet add package` and `dotnet remove package` commands.
2. **DIRECT EDITING** is ONLY permitted for **changing versions** of existing packages.
3. **VERSION UPDATES** must follow the mandatory workflow:
- Verify the target version exists on NuGet.
- Determine if versions are managed per-project (`.csproj`) or centrally (`Directory.Packages.props`).
- Update the version string in the appropriate file.
- Immediately run `dotnet restore` to verify compatibility.
## Workflows
### Adding a Package
Use `dotnet add [<PROJECT>] package <PACKAGE_NAME> [--version <VERSION>]`.
Example: `dotnet add src/MyProject/MyProject.csproj package Newtonsoft.Json`
### Removing a Package
Use `dotnet remove [<PROJECT>] package <PACKAGE_NAME>`.
Example: `dotnet remove src/MyProject/MyProject.csproj package Newtonsoft.Json`
### Updating Package Versions
When updating a version, follow these steps:
1. **Verify Version Existence**:
Check if the version exists using the `dotnet package search` command with exact match and JSON formatting.
Using `jq`:
`dotnet package search <PACKAGE_NAME> --exact-match --format json | jq -e '.searchResult[].packages[] | select(.version == "<VERSION>")'`
Using PowerShell:
`(dotnet package search <PACKAGE_NAME> --exact-match --format json | ConvertFrom-Json).searchResult.packages | Where-Object { $_.version -eq "<VERSION>" }`
2. **Determine Version Management**:
- Search for `Directory.Packages.props` in the solution root. If present, versions should be managed there via `<PackageVersion Include="Package.Name" Version="1.2.3" />`.
- If absent, check individual `.csproj` files for `<PackageReference Include="Package.Name" Version="1.2.3" />`.
3. **Apply Changes**:
Modify the identified file with the new version string.
4. **Verify Stability**:
Run `dotnet restore` on the project or solution. If errors occur, revert the change and investigate.
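Putting the four steps together, a hedged end-to-end sketch assuming central package management and the `jq` variant above (package, version, and file paths are placeholders; GNU `sed` shown, macOS needs `sed -i ''`):
```bash
PKG=Newtonsoft.Json
VER=13.0.3

# Step 1: fail fast if the version is not on NuGet
dotnet package search "$PKG" --exact-match --format json \
  | jq -e --arg v "$VER" '.searchResult[].packages[] | select(.version == $v)' > /dev/null \
  || { echo "$PKG $VER not found" >&2; exit 1; }

# Steps 2-3: update the centrally managed version string
sed -i "s|\(<PackageVersion Include=\"$PKG\" Version=\"\)[^\"]*|\1$VER|" Directory.Packages.props

# Step 4: verify compatibility; revert the edit if restore fails
dotnet restore || git checkout -- Directory.Packages.props
```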
## Examples
### User: "Add Serilog to the WebApi project"
**Action**: Execute `dotnet add src/WebApi/WebApi.csproj package Serilog`.
### User: "Update Newtonsoft.Json to 13.0.3 in the whole solution"
**Action**:
1. Verify 13.0.3 exists: `dotnet package search Newtonsoft.Json --exact-match --format json` (and parse output to confirm "13.0.3" is present).
2. Find where it's defined (e.g., `Directory.Packages.props`).
3. Edit the file to update the version.
4. Run `dotnet restore`.
Generate ASCII art diagrams using PlantUML text mode. Use when user asks to create ASCII diagrams, text-based diagrams, terminal-friendly diagrams, or mentions plantuml ascii, text diagram, ascii art diagram. Supports: Converting PlantUML diagrams to ASCII art, Creating sequence diagrams, class diagrams, flowcharts in ASCII format, Generating Unicode-enhanced ASCII art with -utxt flag
# PlantUML ASCII Art Diagram Generator
## Overview
Create text-based ASCII art diagrams using PlantUML. Perfect for documentation in terminal environments, README files, emails, or any scenario where graphical diagrams aren't suitable.
## What is PlantUML ASCII Art?
PlantUML can generate diagrams as plain text (ASCII art) instead of images. This is useful for:
- Terminal-based workflows
- Git commits/PRs without image support
- Documentation that needs to be version-controlled
- Environments where graphical tools aren't available
## Installation
```bash
# macOS
brew install plantuml
# Linux (varies by distro)
sudo apt-get install plantuml # Ubuntu/Debian
sudo yum install plantuml # RHEL/CentOS
# Or download JAR directly
wget https://github.com/plantuml/plantuml/releases/download/v1.2024.0/plantuml-1.2024.0.jar
```
## Output Formats
| Flag | Format | Description |
| ------- | ------------- | ------------------------------------ |
| `-txt` | ASCII | Pure ASCII characters |
| `-utxt` | Unicode ASCII | Enhanced with box-drawing characters |
## Basic Workflow
### 1. Create PlantUML Diagram File
```plantuml
@startuml
participant Bob
actor Alice
Bob -> Alice : hello
Alice -> Bob : Is it ok?
@enduml
```
### 2. Generate ASCII Art
```bash
# Standard ASCII output
plantuml -txt diagram.puml
# Unicode-enhanced output (better looking)
plantuml -utxt diagram.puml
# Using JAR directly
java -jar plantuml.jar -txt diagram.puml
java -jar plantuml.jar -utxt diagram.puml
```
### 3. View Output
Output is saved as `diagram.atxt` (ASCII) or `diagram.utxt` (Unicode).
## Diagram Types Supported
### Sequence Diagram
```plantuml
@startuml
actor User
participant "Web App" as App
database "Database" as DB
User -> App : Login Request
App -> DB : Validate Credentials
DB --> App : User Data
App --> User : Auth Token
@enduml
```
### Class Diagram
```plantuml
@startuml
class User {
+id: int
+name: string
+email: string
+login(): bool
}
class Order {
+id: int
+total: float
+items: List
+calculateTotal(): float
}
User "1" -- "*" Order : places
@enduml
```
### Activity Diagram
```plantuml
@startuml
start
:Initialize;
if (Is Valid?) then (yes)
:Process Data;
:Save Result;
else (no)
:Log Error;
stop
endif
:Complete;
stop
@enduml
```
### State Diagram
```plantuml
@startuml
[*] --> Idle
Idle --> Processing : start
Processing --> Success : complete
Processing --> Error : fail
Success --> [*]
Error --> Idle : retry
@enduml
```
### Component Diagram
```plantuml
@startuml
[Client] as client
[API Gateway] as gateway
[Service A] as svcA
[Service B] as svcB
[Database] as db
client --> gateway
gateway --> svcA
gateway --> svcB
svcA --> db
svcB --> db
@enduml
```
### Use Case Diagram
```plantuml
@startuml
actor "User" as user
actor "Admin" as admin
rectangle "System" {
user -- (Login)
user -- (View Profile)
user -- (Update Settings)
admin -- (Manage Users)
admin -- (Configure System)
}
@enduml
```
### Deployment Diagram
```plantuml
@startuml
actor "User" as user
node "Load Balancer" as lb
node "Web Server 1" as ws1
node "Web Server 2" as ws2
database "Primary DB" as db1
database "Replica DB" as db2
user --> lb
lb --> ws1
lb --> ws2
ws1 --> db1
ws2 --> db1
db1 --> db2 : replicate
@enduml
```
## Command-Line Options
```bash
# Specify output directory
plantuml -txt -o ./output diagram.puml
# Process all files in directory
plantuml -txt ./diagrams/
# Include dot files (hidden files)
plantuml -txt -includeDot diagrams/
# Verbose output
plantuml -txt -v diagram.puml
# Specify charset
plantuml -txt -charset UTF-8 diagram.puml
```
## Ant Task Integration
```xml
<target name="generate-ascii">
<plantuml dir="./src" format="txt" />
</target>
<target name="generate-unicode-ascii">
<plantuml dir="./src" format="utxt" />
</target>
```
## Tips for Better ASCII Diagrams
1. **Keep it simple**: Complex diagrams don't render well in ASCII
2. **Short labels**: Long text breaks ASCII alignment
3. **Use Unicode (`-utxt`)**: Better visual quality with box-drawing chars
4. **Test before sharing**: Verify in terminal with fixed-width font
5. **Consider alternatives**: For complex diagrams, use Mermaid.js or graphviz
## Example Output Comparison
**Standard ASCII (`-txt`)**:
```
,---. ,---.
|Bob| |Alice|
`---' `---'
| hello |
|------------->|
| |
| Is it ok? |
|<-------------|
| |
```
**Unicode ASCII (`-utxt`)**:
```
┌─────┐ ┌─────┐
│ Bob │ │Alice│
└─────┘ └─────┘
│ hello │
│─────────────>│
│ │
│ Is it ok? │
│<─────────────│
│ │
```
## Quick Reference
```bash
# Create sequence diagram in ASCII
cat > seq.puml << 'EOF'
@startuml
Alice -> Bob: Request
Bob --> Alice: Response
@enduml
EOF
plantuml -txt seq.puml
cat seq.atxt
# Create with Unicode
plantuml -utxt seq.puml
cat seq.utxt
```
## Troubleshooting
| Issue | Solution |
|-------|----------|
| Garbled Unicode characters | Ensure the terminal supports UTF-8 and uses a suitable font |
| Diagram looks misaligned | Use a fixed-width font (Courier, Monaco, Consolas) |
| Command not found | Install PlantUML or run the JAR directly with `java -jar` |
| Output file not created | Check file permissions; ensure PlantUML has write access |