Litmus
Find where to start testing in a legacy codebase.
Litmus is a .NET global CLI tool that answers two questions:
- Where is it dangerous to leave code untested? — ranked by Risk Score
- Where can you actually start testing today? — ranked by Starting Priority
The result is a ranked table. Files that are dangerous and practically testable appear at the top. Files that are dangerous but heavily entangled appear lower, with a clear signal to introduce seams first.
Quick Start
```
# Install
dotnet tool install --global dotnet-litmus

# Run from the directory containing your .sln file
dotnet-litmus scan
```
That's it. The tool auto-detects the solution file, runs tests, collects coverage, and produces a prioritized report.
No tests yet?
```
# Analyze without running tests — ranks by churn, complexity, and testability
dotnet-litmus scan --no-coverage
```
Understanding the Output
```
Rank  File                           Commits  Coverage  Complexity  Dependency  Risk  Priority  Level
1     Services/OrderService.cs       47       12%       94          Low         1.42  1.42      High
2     Services/ReportFormatter.cs    22       31%       67          Low         0.71  0.71      High
3     Controllers/PaymentGateway.cs  31       8%        118         Very High   1.61  0.32      Medium
4     Data/LegacyDbSync.cs           41       0%        201         Very High   1.89  0.19      Low

4 files analyzed. 2 high-priority (start today), 1 medium-priority (next sprint).
2 high-risk file(s) need seam introduction before testing.
```
Reading the table
| Column | Meaning |
|---|---|
| Commits | Number of git commits touching this file in the analysis window |
| Coverage | Line coverage from the Cobertura report |
| Complexity | Cyclomatic complexity (sum across all methods) |
| Dependency | Cost of adding test seams: Low, Medium, High, Very High |
| Risk | How dangerous it is to leave untested (0-2.0) |
| Priority | Where to start testing today (0-2.0) |
| Level | Actionable tier based on Starting Priority |
PaymentGateway.cs has a higher Risk (1.61) than OrderService.cs (1.42), but its Very High dependency level pushes its Starting Priority down to 0.32 (Medium). The tool is telling you: "This file is dangerous, but introduce seams before attempting to test it."
Row colors
| Color | Meaning |
|---|---|
| Red | High priority — risky and testable now |
| Yellow | Medium priority — plan for next sprint |
| Default | Low priority — backlog or too entangled |
The Risk column is independently colored to highlight dangerous-but-entangled files.
Priority and risk levels
| Level | Score Range | Priority meaning | Risk meaning |
|---|---|---|---|
| High | >= 0.6 | Start here — testable now | Changes often, poorly tested, complex |
| Medium | >= 0.2 | Plan for next sprint | Moderate risk |
| Low | < 0.2 | Backlog or too entangled | Low churn, well-tested, or simple |
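As a sketch, the tier thresholds from the table map to a simple classifier (hypothetical code, not the tool's source):

```python
def level(score: float) -> str:
    """Map a score in [0, 2.0] to the tier shown in the Level column.

    Thresholds follow the table above: >= 0.6 is High, >= 0.2 is Medium,
    anything lower is Low.
    """
    if score >= 0.6:
        return "High"
    if score >= 0.2:
        return "Medium"
    return "Low"

# The Starting Priority values from the sample output:
print(level(1.42))  # OrderService    -> High
print(level(0.32))  # PaymentGateway  -> Medium
print(level(0.19))  # LegacyDbSync    -> Low
```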
Method-level drill-down
Use `--detailed` to expand the top 5 files with per-method coverage and complexity:

```
dotnet-litmus scan --detailed
```
```
Rank  File                          Commits  Coverage  Complexity  Dependency  Risk  Priority  Level
1     Services/OrderService.cs      47       12%       94          Low         1.42  1.42      High
        ProcessOrder                —        50%       25
        ValidateInput               —        0%        18
2     Services/ReportFormatter.cs   22       31%       67          Low         0.71  0.71      High
        FormatReport                —        10%       30
        BuildHeader                 —        80%       8
```
Method rows show coverage and complexity only — churn is a file-level signal (shown as —), and no method-level priority is computed since only 2 of 4 signals are available. Methods are sorted by complexity descending.
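A minimal sketch of that ordering rule, using made-up method data:

```python
# Hypothetical per-method rows for one file (only 2 of 4 signals exist
# at method level: coverage and complexity).
methods = [
    {"name": "BuildHeader", "coverage": 0.80, "complexity": 8},
    {"name": "FormatReport", "coverage": 0.10, "complexity": 30},
]

# Method rows are sorted by complexity, highest first.
for m in sorted(methods, key=lambda m: m["complexity"], reverse=True):
    print(f'{m["name"]}: {m["coverage"]:.0%} covered, complexity {m["complexity"]}')
```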
Commands
Litmus has two commands: scan runs tests and analyzes in one step; analyze skips testing and uses an existing coverage file.
scan — run tests and analyze in one step
```
# Auto-detect solution file from current directory
dotnet-litmus scan

# Specify solution explicitly
dotnet-litmus scan --solution MyApp.sln

# Target a specific test directory
dotnet-litmus scan --solution MyApp.sln --tests-dir tests/MyApp.Tests

# Export results
dotnet-litmus scan --output report.json

# Export an HTML report to share with the team
dotnet-litmus scan --output report.html
```
scan auto-detects the solution file when a single .sln or .slnx exists in the current directory. It then:
- Runs `dotnet test` with the XPlat Code Coverage collector
- Streams live output so you see build progress and test results in real time
- Discovers and merges all `coverage.cobertura.xml` files (one per test project)
- Runs the full analysis pipeline (git churn, complexity, seam detection, scoring)
- Cleans up temporary test results
analyze — use an existing coverage file
```
# Auto-detect solution, provide coverage file
dotnet-litmus analyze --coverage TestResults/.../coverage.cobertura.xml

# Specify solution explicitly
dotnet-litmus analyze --solution MyApp.sln --coverage coverage.xml
```
Use analyze when you already have a Cobertura XML coverage report (e.g., from CI).
CLI Reference
Shared options
| Option | Default | Description |
|---|---|---|
| `--solution` | auto-detect | Path to `.sln` or `.slnx`. Auto-detected when one exists in cwd. |
| `--since` | 1 year ago | Git history cutoff (ISO date format, e.g. `2025-01-01`) |
| `--top` | 20 | Number of files to display |
| `--exclude` | -- | Glob pattern(s) to exclude (repeatable) |
| `--output` | -- | Export to `.json`, `.csv`, or `.html` file |
| `--baseline` | -- | Previous JSON export for delta comparison |
| `--format` | `table` | Stdout format: `table`, `json`, `csv`, or `html` |
| `--verbose` | `false` | Show detailed intermediate scores |
| `--quiet` | `false` | Suppress all output except errors |
| `--fail-on-threshold` | -- | Exit with code 1 if any file's Risk Score or Starting Priority exceeds this value (0.0-2.0) |
| `--detailed` | `false` | Expand top-ranked files with per-method coverage and complexity |
| `--no-color` | `false` | Disable colored output |
scan-only options
| Option | Default | Description |
|---|---|---|
| `--tests-dir` | solution file | Directory or project to run `dotnet test` against |
| `--no-coverage` | `false` | Skip test execution and coverage collection |
| `--coverage-tool` | `coverlet` | Coverage collector: `coverlet` or `dotnet-coverage` |
| `--timeout` | 10 | Maximum minutes for test execution |
analyze-only options
| Option | Default | Description |
|---|---|---|
| `--coverage` | required | Path to Cobertura XML coverage file |
Prerequisites
- .NET 8 SDK or later (including .NET 9 and .NET 10)
- git installed and available on PATH
- For `scan`: test projects must reference `coverlet.collector` (or use `--coverage-tool dotnet-coverage`)
- For `scan --no-coverage`: no test projects or coverage tooling required
- For `analyze`: a pre-generated Cobertura XML coverage report
Installation
```
# From NuGet (recommended)
dotnet tool install --global dotnet-litmus

# Or from a local build
dotnet pack Litmus/Litmus.csproj -c Release
dotnet tool install --global --add-source Litmus/bin/Release dotnet-litmus

# Or run without installing
dotnet run --project Litmus -- scan
```
Examples
Scan the last 6 months, show top 10
```
dotnet-litmus scan --since 2025-08-01 --top 10
```
Use dotnet-coverage instead of coverlet
```
dotnet tool install --global dotnet-coverage
dotnet-litmus scan --coverage-tool dotnet-coverage
```
Exclude generated code
```
dotnet-litmus analyze \
  --coverage coverage.xml \
  --exclude "*.Generated.cs" \
  --exclude "**/ViewModels/*.cs" \
  --output report.json
```
Drill down into top files
```
dotnet-litmus scan --detailed --top 10
```
Shows method-level coverage and complexity for the top 5 files (out of the 10 displayed). Useful for identifying which specific methods inside a high-risk file need attention first.
Legacy codebase with no tests
```
dotnet-litmus scan --no-coverage --top 10
```
Compare against a baseline
```
# Save a baseline
dotnet-litmus scan --output baseline.json

# Later: compare
dotnet-litmus scan --baseline baseline.json
```
When --baseline is provided, a Delta column appears showing how each file's Starting Priority changed (+0.15 = degraded, -0.10 = improved, NEW = not in baseline). A summary reports: vs baseline: N improved, N degraded, N new, N removed.
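A sketch of how such a delta can be computed from two exports; the file-to-score mapping used here is an assumption for illustration, not Litmus's actual JSON schema:

```python
def compare(baseline: dict, current: dict, eps: float = 1e-9):
    """Classify each file's Starting Priority change versus a baseline.

    baseline/current map file path -> priority score.
    Returns (improved, degraded, new, removed) file lists.
    """
    improved, degraded, new = [], [], []
    for path, score in current.items():
        if path not in baseline:
            new.append(path)           # NEW — not in baseline
        elif score < baseline[path] - eps:
            improved.append(path)      # priority dropped
        elif score > baseline[path] + eps:
            degraded.append(path)      # priority rose
    removed = [p for p in baseline if p not in current]
    return improved, degraded, new, removed

base = {"A.cs": 0.50, "B.cs": 0.70, "C.cs": 0.30}
curr = {"A.cs": 0.65, "B.cs": 0.60, "D.cs": 0.40}
# A degraded, B improved, D is new, C was removed
print(compare(base, curr))
```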
Machine-readable output
```
# JSON to stdout
dotnet-litmus analyze --coverage coverage.xml --format json | jq '.[].file'

# CSV to stdout
dotnet-litmus analyze --coverage coverage.xml --format csv > results.csv

# Quiet mode: only exit code + file export
dotnet-litmus scan --quiet --output report.json
```
HTML report
```
# Self-contained HTML file with a sortable table — share in Slack or attach to a PR
dotnet-litmus scan --output report.html

# Or pipe to stdout
dotnet-litmus analyze --coverage coverage.xml --format html > report.html
```
CI/CD Integration
Litmus works well in CI pipelines for tracking test debt over time.
GitHub Actions example
```yaml
name: Litmus Analysis
on: [push]

jobs:
  litmus:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Full history needed for git churn
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'
      - name: Install Litmus
        run: dotnet tool install --global dotnet-litmus
      - name: Run analysis
        run: dotnet-litmus scan --output report.json --format json --quiet
      - name: Upload report
        uses: actions/upload-artifact@v4
        with:
          name: litmus-report
          path: report.json
```
Baseline comparison in CI
```yaml
- name: Download previous baseline
  uses: actions/download-artifact@v4
  with:
    name: litmus-baseline
  continue-on-error: true  # First run won't have a baseline

- name: Run analysis with baseline
  run: |
    if [ -f baseline.json ]; then
      dotnet-litmus scan --output report.json --baseline baseline.json
    else
      dotnet-litmus scan --output report.json
    fi

- name: Save as next baseline
  uses: actions/upload-artifact@v4
  with:
    name: litmus-baseline
    path: report.json
```
Quality gate
```
# Fail the build if any file scores above 1.0
dotnet-litmus scan --fail-on-threshold 1.0 --quiet
```
Key flags for CI
| Flag | Purpose |
|---|---|
| `--quiet` | No console output, only exit code and file export |
| `--output report.json` | Machine-readable export |
| `--output report.html` | Shareable HTML report |
| `--format json` | JSON to stdout for piping |
| `--no-color` | Disable ANSI codes in log output |
| `--baseline previous.json` | Track regressions over time |
| `--fail-on-threshold 1.0` | Fail the build if any file exceeds a score |
How Scores Are Calculated
Litmus cross-references four signals to produce its scores:
| Signal | What it measures |
|---|---|
| Git churn | How frequently a file changes |
| Code coverage | How well a file is tested |
| Cyclomatic complexity | How complex the file's logic is |
| Dependency entanglement | How many unseamed dependencies block testability |
A "seam" (from Michael Feathers' Working Effectively with Legacy Code) is a place where you can substitute a dependency without changing production code — typically via dependency injection or an interface. An "unseamed" dependency is one a test cannot replace, like a direct new HttpClient() or DateTime.Now call.
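The idea is language-agnostic; here is a minimal Python sketch of an unseamed call versus a seamed one (Litmus itself analyzes C#, where the seam would typically be an injected interface or delegate):

```python
import datetime

# Unseamed: the clock is hard-wired into the function, so a test
# cannot substitute it (analogous to calling DateTime.Now in C#).
def greeting_unseamed() -> str:
    hour = datetime.datetime.now().hour
    return "Good morning" if hour < 12 else "Good afternoon"

# Seamed: the clock is injected, so a test can pass a fake without
# changing the production code path.
def greeting(now=datetime.datetime.now) -> str:
    return "Good morning" if now().hour < 12 else "Good afternoon"

# A test substitutes a fixed clock through the seam:
fixed = lambda: datetime.datetime(2025, 1, 1, 9, 0)
print(greeting(now=fixed))  # deterministic: "Good morning"
```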
Phase 1 — Risk Score
```
RiskScore = ChurnNorm x (1 - CoverageRate) x (1 + ComplexityNorm)
```
Each factor is normalized to [0, 1]. Range: 0 to 2.0. A file that changes constantly, has no tests, and is highly complex scores near 2.0.
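A quick check of the formula at its extremes (a direct transcription of the equation above, not tool source):

```python
def risk_score(churn_norm: float, coverage_rate: float, complexity_norm: float) -> float:
    """RiskScore = ChurnNorm x (1 - CoverageRate) x (1 + ComplexityNorm).

    churn_norm and complexity_norm are normalized to [0, 1];
    coverage_rate is the line-coverage fraction in [0, 1].
    """
    return churn_norm * (1 - coverage_rate) * (1 + complexity_norm)

print(risk_score(1.0, 0.0, 1.0))  # worst case — max churn, 0% coverage, max complexity: 2.0
print(risk_score(0.5, 0.9, 0.2))  # moderate churn but well tested: scores low
```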
Phase 2 — Starting Priority
The dependency score measures unseamed dependencies — things a test cannot substitute. Six signals are detected via Roslyn:
| Signal | Weight | What it detects |
|---|---|---|
| Unseamed infrastructure calls | 2.0 | DateTime.Now, File.*, new HttpClient(), new DbContext() |
| Direct instantiation in methods | 1.5 | new ConcreteType() (excluding DTOs, exceptions, collections) |
| Concrete constructor parameters | 0.5 | Constructor params without interface convention |
| Static calls on non-utility types | 1.0 | MyHelper.Transform() (excluding Math, Convert, etc.) |
| Async seam calls | 1.5 | await _httpClient.GetAsync(), await _db.SaveChangesAsync() |
| Concrete downcasts | 1.0 | (ConcreteType)expr and expr as ConcreteType |
DI registration files (Program.cs, Startup.cs, files with AddScoped/AddSingleton/AddTransient) get a zeroed dependency score.
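As an illustration only, a raw dependency score could be a weighted sum of detected occurrences. The weights come from the table above, but the category names and the simple-sum aggregation here are assumptions, not the tool's internals:

```python
# Weights from the seam-detection table (category names are made up here).
WEIGHTS = {
    "infrastructure_call": 2.0,   # DateTime.Now, File.*, new HttpClient()
    "direct_instantiation": 1.5,  # new ConcreteType() in a method body
    "concrete_ctor_param": 0.5,   # constructor params without interface convention
    "static_call": 1.0,           # MyHelper.Transform() on a non-utility type
    "async_seam_call": 1.5,       # await _httpClient.GetAsync(...)
    "concrete_downcast": 1.0,     # (ConcreteType)expr, expr as ConcreteType
}

def raw_dependency_score(counts: dict) -> float:
    """Weighted sum of unseamed-dependency occurrences (aggregation is an assumption)."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# Three hard-wired infrastructure calls plus two static helper calls:
print(raw_dependency_score({"infrastructure_call": 3, "static_call": 2}))  # 8.0
```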
```
StartingPriority = RiskScore x (1 - DependencyNorm)
```

- Fully seamed (DependencyNorm = 0) -> Priority equals Risk.
- Maximally entangled (DependencyNorm = 1) -> Priority drops to 0.
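Checked against the sample output: PaymentGateway's Priority of 0.32 is consistent with its Risk of 1.61 under an assumed DependencyNorm of 0.8 (the actual normalization is internal to the tool):

```python
def starting_priority(risk_score: float, dependency_norm: float) -> float:
    """StartingPriority = RiskScore x (1 - DependencyNorm)."""
    return risk_score * (1 - dependency_norm)

print(starting_priority(1.42, 0.0))  # fully seamed: priority equals risk (1.42)
print(starting_priority(1.61, 0.8))  # heavily entangled (assumed norm): ~0.32
print(starting_priority(1.89, 1.0))  # maximally entangled: 0.0
```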
How is this different from SonarQube?
SonarQube is a code quality platform that reports code smells, bugs, and coverage gaps — but it doesn't tell you where to start testing. It has no concept of git churn, no seam detection, and no prioritized starting list.
Litmus is purpose-built for a different question: "I inherited a legacy codebase with little or no test coverage. Which files should I test first?"
| | SonarQube | Litmus |
|---|---|---|
| Goal | Broad code quality monitoring | Prioritized test starting list |
| Signals | Static analysis rules, coverage % | Git churn + coverage + complexity + seam detection |
| Output | Dashboard of issues | Ranked table: start here, plan next, introduce seams first |
| Setup | Server, database, CI integration | dotnet tool install, run from terminal |
| Delta tracking | Requires paid tier for branch analysis | --baseline flag (free, built-in) |
| Cost | Free tier limited; paid for full features | Free and open source |
They complement each other. Use SonarQube for ongoing quality gates; use Litmus to decide where to invest testing effort in a legacy codebase.
Exit Codes
| Code | Meaning |
|---|---|
| 0 | Success. Analysis completed (or no files found after filters — warning printed). |
| 1 | Error. Validation failure, missing dependencies, test failure with no coverage, runtime error, or `--fail-on-threshold` exceeded. |
Default exclusions
The following patterns are always excluded to reduce noise from auto-generated files:
- `*.Designer.cs`, `*.g.cs`, `*.g.i.cs`, `*.generated.cs`
- `*AssemblyInfo.cs`, `*GlobalUsings.g.cs`
- `*.xaml.cs`
- `**/Migrations/*.cs`, `*ModelSnapshot.cs`
- `Program.cs`, `Startup.cs`
- `**/obj/**`, `**/bin/**`, `**/wwwroot/**`
Use --exclude to add additional patterns on top of these.
Troubleshooting
No solution file found
If no --solution is provided and no single .sln/.slnx exists in the current directory:
```
# Move to the solution directory
cd /path/to/your/project
dotnet-litmus scan

# Or specify the path
dotnet-litmus scan --solution path/to/MyApp.sln
```
If multiple solution files exist, you must specify which one.
Tests fail and no coverage
If you see "No coverage files were generated because some tests failed", fix the failing tests first. Coverage cannot be collected from failed test runs.
If tests pass but no coverage is generated, your test projects are missing coverlet.collector:
```
dotnet add <test-project> package coverlet.collector
```
Or use dotnet-coverage which doesn't require a package reference:
```
dotnet tool install --global dotnet-coverage
dotnet-litmus scan --coverage-tool dotnet-coverage
```
No tests in the codebase
If your codebase has no tests yet, skip coverage collection entirely:
```
dotnet-litmus scan --no-coverage
```
This ranks files by git churn, cyclomatic complexity, and dependency analysis only. All files are treated as 0% coverage. Use this to find where to start writing tests, then re-run without --no-coverage once you have coverage data.
scan hangs during test execution
Usually caused by coverlet hanging after tests complete. Solutions in order of preference:
1. Use `dotnet-coverage`: avoids the coverlet data collector entirely.

   ```
   dotnet tool install --global dotnet-coverage
   dotnet-litmus scan --coverage-tool dotnet-coverage
   ```

2. Upgrade coverlet: update `coverlet.collector` to the latest version.

3. Increase timeout: for large solutions that just need more time.

   ```
   dotnet-litmus scan --timeout 30
   ```

4. Use `analyze` directly: generate coverage separately.

   ```
   dotnet-coverage collect "dotnet test MyApp.sln" -f cobertura -o coverage.xml
   dotnet-litmus analyze --coverage coverage.xml
   ```
Coverage prerequisites for analyze
`analyze` expects a Cobertura XML report, which you can generate with:

```
dotnet test --collect:"XPlat Code Coverage"
```
For multiple test projects, merge with ReportGenerator:
```
dotnet tool install -g dotnet-reportgenerator-globaltool
reportgenerator -reports:"**/coverage.cobertura.xml" -targetdir:"merged" -reporttypes:Cobertura
dotnet-litmus analyze --coverage merged/Cobertura.xml
```
The scan command does this merge automatically.
Solution Format Support
Both classic .sln files and the newer .slnx XML format are supported. The format is auto-detected from the file extension.
License
MIT