diff --git a/.github/workflows/README-staging-lint-checks.md b/.github/workflows/README-staging-lint-checks.md deleted file mode 100644 index b79d28e6..00000000 --- a/.github/workflows/README-staging-lint-checks.md +++ /dev/null @@ -1,108 +0,0 @@ -# Staging Lint Checks Workflow - -## Overview - -This workflow validates Azure OpenAPI validator rule changes by running selected rules against spec files from the -`azure-rest-api-specs` repository. It provides a staging environment to test new rules, rule modifications before they are -merged into the main codebase. - -## Purpose and Intent - -The workflow serves as a validation tool for linter rule development with the following workflow: - -1. **Rule Development**: Engineers write new validation rules or modify existing ones -2. **Testing Setup**: Engineers create a PR and specify which rules to test via labels -3. **Automated Validation**: The workflow runs specified rules against live specification files using AutoRest -4. **Result Analysis**: Engineers review the output to identify false positives, false negatives, or unexpected behavior -5. **Quality Assurance**: Engineers and reviewers validate that rules work correctly before production release - -**Important**: This workflow enforces three merge-blocking gates: - -1. **Rule changes require test rules** — If rule files are modified, the PR must specify rules to test via labels -2. **Command failures block merges** — AutoRest crashes or script errors cause the workflow to fail -3. **Validation errors require acknowledgment** — When errors are found, the author must add an `errors-acknowledged` label after reviewing them - -The workflow automatically re-runs when labels are added or removed. 
- -## Validation Requirements - -### When Rule Files Are Changed - -If your PR modifies any of these files, you **must** specify test rules: - -- `packages/rulesets/src/spectral/az-arm.ts` -- `packages/rulesets/src/spectral/az-common.ts` -- `packages/rulesets/src/spectral/az-dataplane.ts` -- `packages/rulesets/src/spectral/functions/*.ts` -- `packages/rulesets/src/native/legacyRules/**/*.ts` -- `packages/rulesets/src/native/functions/**/*.ts` -- `packages/rulesets/src/native/rulesets/**/*.ts` - -The workflow will fail until test rules are specified. - -### When Validation Errors Are Found - -1. Download the `linter-findings` artifact and review the errors -2. Add the `errors-acknowledged` label to confirm you have reviewed them -3. The workflow re-runs automatically and passes (reviewer approval is still required) - -Removing the label re-triggers the workflow and re-blocks the PR. - -## How to Use - -### Specifying Rules to Test - -Add labels to your pull request with the format `test-<RuleName>`: - -- `test-PostResponseCodes` -- `test-DeleteMustNotHaveRequestBody` -- `test-LongRunningOperationsWithLongRunningExtension` - -### Workflow Configuration - -The workflow can be configured through environment variables: - -```yaml -env: - SPEC_REPO: Azure/azure-rest-api-specs - MAX_FILES: "100" - ALLOWED_RPS: "compute,monitor,sql,hdinsight,network,resource,storage" -``` - -- **SPEC_REPO**: Source repository for OpenAPI specifications -- **MAX_FILES**: Maximum number of specification files to process -- **ALLOWED_RPS**: Comma-separated list of resource providers to include in testing - -**Note**: These values can be modified directly in the `.github/workflows/staging-lint-checks.yaml` file to adjust the -workflow behavior based on testing requirements.
- -### Labels - -| Label | Purpose | -| --------------------- | ------------------------------------------------------------- | -| `test-<RuleName>` | Specifies a rule to validate (e.g., `test-PostResponseCodes`) | -| `errors-acknowledged` | Confirms the PR author has reviewed validation errors | - -## Debugging and Troubleshooting - -### Viewing Results - -1. Navigate to the Actions tab in your repository -2. Find the workflow run for your PR -3. Download the `linter-findings` artifact to see detailed validation results -4. The artifact contains output logs and any errors found - -### Common Issues - -**No rules detected**: Ensure your PR labels follow the exact format `test-<RuleName>`. - -**Workflow fails**: Check the workflow logs for specific error messages. The artifact will still be uploaded -even if the workflow fails. - -**Missing resource provider**: If testing rules against specifications from RPs not in the default list, update -the `ALLOWED_RPS` environment variable. - -## Related Components - -- `.github/workflows/src/extract-rule-names-and-run-validation.js`: Single consolidated script that parses rule names from PR labels and runs AutoRest with the selected rules over allowed spec files. - GitHub Action workflow file: `.github/workflows/staging-lint-checks.yaml` orchestrates checkout, build, and script execution.
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 96365b34..10536901 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -8,6 +8,7 @@ - [Communicate the addition of a rule to API reviewers](#communicate-the-addition-of-a-rule-to-api-reviewers) - [Ensure TypeSpec supports the new rule](#ensure-typespec-supports-the-new-rule) - [Add a new rule to the staging pipeline](#add-a-new-rule-to-the-staging-pipeline) + - [Test rules using the staging lint checks workflow](#test-rules-using-the-staging-lint-checks-workflow) - [Determine which rules are ready for release](#determine-which-rules-are-ready-for-release) - [Create a release PR and generate the changelog](#create-a-release-pr-and-generate-the-changelog) - [Create a release tag](#create-a-release-tag) @@ -133,6 +134,77 @@ To add a rule to the staging pipeline: 2. Merge the new rule to the main branch. Once merged, your new rule will start running in the staging pipeline. You can verify the rule is running with the instructions in [Verify the deployed changes](#verify-the-deployed-changes). +## Test rules using the staging lint checks workflow + +The staging lint checks workflow (`.github/workflows/staging-lint-checks.yaml`) validates rule changes in this repository +by running selected rules against spec files from the `azure-rest-api-specs` repository before merging. It provides a +staging environment to test new rules and rule modifications with the following workflow: + +1. **Rule Development**: Engineers write new validation rules or modify existing ones +2. **Testing Setup**: Engineers create a PR and specify which rules to test via labels +3. **Automated Validation**: The workflow runs specified rules against live specification files using AutoRest +4. **Result Analysis**: Engineers review the output to identify false positives, false negatives, or unexpected behavior +5. 
**Quality Assurance**: Engineers and reviewers validate that rules work correctly before production release + +### Validation requirements + +**When rule files are changed:** If your PR modifies any of these files, you **must** specify rules to test via labels: + +- `packages/rulesets/src/spectral/az-arm.ts` +- `packages/rulesets/src/spectral/az-common.ts` +- `packages/rulesets/src/spectral/az-dataplane.ts` +- `packages/rulesets/src/spectral/functions/*.ts` +- `packages/rulesets/src/native/legacyRules/**/*.ts` +- `packages/rulesets/src/native/functions/**/*.ts` +- `packages/rulesets/src/native/rulesets/**/*.ts` + +The workflow will fail until test rules are specified. + +**When validation errors are found:** + +1. Download the `linter-findings` artifact and review the errors +2. Add the `errors-acknowledged` label to confirm you have reviewed them +3. The workflow re-runs automatically and passes (reviewer approval is still required) + +Removing the label re-triggers the workflow and re-blocks the PR. + +### Merge-blocking gates + +The workflow enforces three merge-blocking gates: + +1. **Rule changes require test rules** — If rule files are modified, the PR must specify rules to test via labels +2. **Command failures block merges** — AutoRest crashes or script errors cause the workflow to fail +3. 
**Validation errors require acknowledgment** — When errors are found, the author must add an `errors-acknowledged` label after reviewing them + +### Specifying rules to test + +Add labels to your pull request with the format `test-<RuleName>`: + +- `test-PostResponseCodes` +- `test-DeleteMustNotHaveRequestBody` +- `test-LongRunningOperationsWithLongRunningExtension` + +### Labels + +| Label | Purpose | +| --------------------- | ------------------------------------------------------------- | +| `test-<RuleName>` | Specifies a rule to validate (e.g., `test-PostResponseCodes`) | +| `errors-acknowledged` | Confirms the PR author has reviewed validation errors | + +### Workflow configuration + +The workflow can be configured through environment variables in `.github/workflows/staging-lint-checks.yaml`: + +- **SPEC_REPO**: Source repository for OpenAPI specifications (default: `Azure/azure-rest-api-specs`) + **MAX_FILES**: Maximum number of specification files to process (default: `100`) +- **ALLOWED_RPS**: Comma-separated list of resource providers to include in testing (default: + `compute,monitor,sql,hdinsight,network,resource,storage`) + +### Viewing results + +1. Navigate to the Actions tab in your repository +2. Find the workflow run for your PR +3. Download the `linter-findings` artifact to see detailed validation results +4. The artifact contains output logs and any errors found + ## Determine which rules are ready for release + +Use data from staging pipeline runs to determine if a rule is ready for release.