Conversation

@AkhilaIlla AkhilaIlla commented Aug 20, 2025

This PR introduces a workflow to validate linter rules that are under development. It runs these rules against real specs from azure-rest-api-specs, allowing us to build confidence in them before merging and releasing to production.

How the staging-lint-checks workflow runs selected rules

Root workflow: staging-lint-checks.yaml
Triggers: pull_request events (opened, edited, synchronize, labeled, unlabeled)
Flow:
Checks out this repo and sets up Node 18
Derives RULE_NAMES from PR labels prefixed with test- or from a "rules: A,B" line in the PR body
Checks out azure-rest-api-specs into specs/
Installs minimal deps for the runner
Runs the script run-selected-rules.js with env:
RULE_NAMES, SPEC_ROOT=specs/, FAIL_ON_ERRORS, OUTPUT_FILE
The runner filters to JSON files under specification/*/resource-manager/**/stable for key RPs and executes only the selected Spectral rules
Uploads artifacts/linter-findings.txt as a build artifact
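
To make the flow concrete, here is a minimal sketch of what staging-lint-checks.yaml could look like based on the steps above; step names, action versions, and the label-parsing shell are assumptions, not the actual file:

```yaml
name: staging-lint-checks

on:
  pull_request:
    types: [opened, edited, synchronize, labeled, unlabeled]

jobs:
  staging-lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: 18

      # Derive RULE_NAMES from "test-<RuleName>" labels; the real workflow
      # also falls back to a "rules: A,B" line in the PR body.
      - name: Derive rule names
        run: |
          names=$(echo '${{ toJson(github.event.pull_request.labels.*.name) }}' \
            | jq -r '[.[] | select(startswith("test-")) | sub("^test-"; "")] | join(",")')
          echo "RULE_NAMES=$names" >> "$GITHUB_ENV"

      - name: Check out azure-rest-api-specs
        uses: actions/checkout@v4
        with:
          repository: Azure/azure-rest-api-specs
          path: specs

      - name: Install runner deps
        run: npm ci   # minimal deps for the runner; exact install command assumed

      - name: Run selected rules
        run: node ./run-selected-rules.js
        env:
          RULE_NAMES: ${{ env.RULE_NAMES }}
          SPEC_ROOT: specs/
          FAIL_ON_ERRORS: "true"
          OUTPUT_FILE: artifacts/linter-findings.txt

      - name: Upload findings
        uses: actions/upload-artifact@v4
        with:
          name: linter-findings
          path: artifacts/linter-findings.txt
```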

Rules: PutResponseCodes

Latest workflow run: https://github.com/Azure/azure-openapi-validator/actions/runs/18791474287/job/53622542134
Output:
linter-findings.txt
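
As background on "executes only the selected Spectral rules": one way to do this (not necessarily what run-selected-rules.js actually does) is to generate a throwaway ruleset that extends the full aov ruleset with everything off and re-enables only the rules under test; the base ruleset path below is hypothetical:

```yaml
# selected-rules.yaml -- generated per run; the base ruleset path is hypothetical
extends:
  - ["./packages/rulesets/generated/spectral/az-arm.js", "off"]  # full aov ARM ruleset, all rules disabled
rules:
  PutResponseCodes: error  # re-enable only the rule(s) under test
```

Linting the filtered specs would then be something like `npx @stoplight/spectral-cli lint -r selected-rules.yaml "specs/specification/*/resource-manager/**/stable/**/*.json"`.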

@mikeharder mikeharder self-assigned this Aug 25, 2025
mikeharder and others added 2 commits August 25, 2025 11:31
… into akhilailla/add_workflow_to_simulate_test_pipeline_for-_linter_rules
@danieljurek danieljurek left a comment

This PR is moving things in the right direction. I have a few questions about how rules are evaluated and whether there's a way to infer what to check instead of relying on the user to supply that data.

mikeharder commented Sep 5, 2025

@AkhilaIlla: @danieljurek and I were discussing, and we think this might work better if we put the workflow in the specs repo instead. It would work like this:

  1. Trigger WF in specs repo manually
  2. When triggering, specify parameters like "aov PR number", "rules to test", "specs to test", etc.
  3. WF downloads private builds from aov PR artifacts (or clones and builds locally)
  4. WF runs the selected rules against the selected specs, using the private build of aov
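
To make the shape concrete, here is a minimal sketch of the dispatch surface such a WF might expose; input names and the TODO steps are assumptions, not the final design:

```yaml
name: test-aov-private-build

on:
  workflow_dispatch:
    inputs:
      aov_pr:
        description: "azure-openapi-validator PR number whose private build to test"
        required: true
      rules:
        description: "Comma-separated rule names to run"
        required: true
      specs:
        description: "Glob of specs to lint"
        required: false
        default: "specification/**/resource-manager/**/stable/**/*.json"

jobs:
  run-selected-rules:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Download the private build from the aov PR's workflow artifacts,
      # or clone the PR branch and build locally (mechanism TBD).
      - name: Get private aov build
        run: echo "TODO fetch or build aov from PR ${{ inputs.aov_pr }}"

      - name: Run selected rules against selected specs
        run: echo "TODO run '${{ inputs.rules }}' over '${{ inputs.specs }}'"
```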

This has several benefits:

  1. We can test specs in any branch of specs or specs-pr. If the WF lives in the aov repo, it can only test the public specs repo.
  2. specs repo already has a lot of shared code and example WFs
  3. WF could be slightly tweaked to allow testing published versions of aov in addition to private builds
  4. Long-term, I'd like to move aov into the specs repo, so we don't need to publish/consume at all

I can create a skeleton WF in the specs repo to help get you started.

My start on a WF in specs repo to test aov: Azure/azure-rest-api-specs#37180 #Resolved

@AkhilaIlla AkhilaIlla merged commit 868be4f into main Oct 31, 2025
9 checks passed
@AkhilaIlla AkhilaIlla deleted the akhilailla/add_workflow_to_simulate_test_pipeline_for-_linter_rules branch October 31, 2025 16:18