NAME
dns-benchmark-tool — Fast, comprehensive DNS performance testing with DNSSEC validation, DoH/DoT support, and enterprise features
SYNOPSIS
pip install dns-benchmark-tool
dns-benchmark benchmark --use-defaults
DESCRIPTION
Fast, comprehensive DNS performance testing with DNSSEC validation, DoH/DoT support, and enterprise features
README
DNS Benchmark Tool
Part of BuildTools - Network Performance Suite
Fast, comprehensive DNS performance testing with DNSSEC validation, DoH/DoT support, and enterprise features
pip install dns-benchmark-tool
dns-benchmark benchmark --use-defaults --formats csv,excel
🎉 1,400+ downloads this week! Thank you to our growing community.
📢 Want multi-region testing? Join the waitlist →
🎉 Today’s Release Highlights 
We’ve added three powerful CLI commands to make DNS benchmarking even more versatile:
🚀 top — quick ranking of resolvers by speed and reliability
📊 compare — side‑by‑side benchmarking with detailed statistics and export options
🔄 monitoring — continuous performance tracking with alerts and logging
# Quick resolver ranking
dns-benchmark top

Compare resolvers side-by-side
dns-benchmark compare Cloudflare Google Quad9 --show-details
Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --formats csv,excel --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
📈 Community Highlights
- ⭐ Stars: Grew from 7 → 110+ after posting on Hacker News
- 📦 Downloads: Rebounded to 200+/day after initially stalling
- 🐘 Mastodon: Shared there too, but the real surge came from HN
- 💬 Feedback: Constructive input from HN community directly shaped patches v0.3.0 → v0.3.1
- 🚀 Takeaway: Hacker News visibility was the catalyst for adoption momentum
Table of Contents
- DNS Benchmark Tool
- Part of BuildTools - Network Performance Suite
- 🎉 Today’s Release Highlights
- 📈 Community Highlights
- Table of Contents
- 🎯 Why This Tool?
- Quick start
- ✨ Key Features
- 🔧 Advanced Capabilities
- 💼 Use Cases
- 📦 Installation & Setup
- 📖 Usage Examples
- Inline input support for resolvers and domains
- 🔧 Utilities
- Complete usage guide
- 🔍 README Adjustments for Final Patch
- ⚡ CLI Commands
- 📊 Analysis Enhancements
- ⚡ Best Practices
- Feedback & Community Input
- ⚙️ Configuration Files
- Output formats
- Performance optimization
- Troubleshooting
- Automation & CI
- Screenshots
- Getting help
- Release workflow
- 🌐 Hosted Version (Coming Soon)
- 🛣️ Roadmap
- 🤝 Contributing
- ❓ FAQ
- 🔗 Links & Support
- License
🎯 Why This Tool?
DNS resolution is often the hidden bottleneck in network performance. A slow resolver can add hundreds of milliseconds to every request.
The Problem
- ⏱️ Hidden Bottleneck: DNS can add 300ms+ to every request
- 🤷 Unknown Performance: Most developers never test their DNS
- 🌍 Location Matters: "Fastest" resolver depends on where YOU are
- 🔒 Security Varies: DNSSEC, DoH, DoT support differs wildly
The Solution
dns-benchmark-tool helps you:
- 🔍 Find the fastest DNS resolver for YOUR location
- 📊 Get real data - P95, P99, jitter, consistency scores
- 🛡️ Validate security - DNSSEC verification built-in
- 🚀 Test at scale - 100+ concurrent queries in seconds
Perfect For
- ✅ Developers optimizing API performance
- ✅ DevOps/SRE validating resolver SLAs
- ✅ Self-hosters comparing Pi-hole/Unbound vs public DNS
- ✅ Network admins running compliance checks
Quick start
Installation
pip install dns-benchmark-tool
Run Your First Benchmark
# Test default resolvers against popular domains
dns-benchmark benchmark --use-defaults --formats csv,excel
View Results
Results are automatically saved to ./benchmark_results/ with:
- Summary CSV with statistics
- Detailed raw data
- Optional PDF/Excel reports
That's it! You just benchmarked 5 DNS resolvers against 10 domains.
✨ Key Features
🚀 Performance
- Async queries - Test 100+ resolvers simultaneously
- Multi-iteration - Run benchmarks multiple times for accuracy
- Statistical analysis - Mean, median, P95, P99, jitter, consistency
- Cache control - Test with/without DNS caching
🔒 Security & Privacy
- DNSSEC validation - Verify cryptographic trust chains
- DNS-over-HTTPS (DoH) - Encrypted DNS benchmarking
- DNS-over-TLS (DoT) - Secure transport testing
- DNS-over-QUIC (DoQ) - Experimental QUIC support
📊 Analysis & Export
- Multiple formats - CSV, Excel, PDF, JSON
- Visual reports - Charts and graphs
- Domain statistics - Per-domain performance analysis
- Error breakdown - Identify problematic resolvers
🏢 Enterprise Features
- TSIG authentication - Secure enterprise queries
- Zone transfers - AXFR/IXFR validation
- Dynamic updates - Test DNS write operations
- Compliance reports - Audit-ready documentation
🌐 Cross-Platform
- Linux, macOS, Windows - Works everywhere
- CI/CD friendly - JSON output, exit codes
- IDNA support - Internationalized domain names
- Auto-detection - Windows WMI DNS discovery
🔧 Advanced Capabilities
⚠️ These flags are documented for visibility but not yet implemented.
They represent upcoming advanced features.
- --doh → DNS-over-HTTPS benchmarking (coming soon)
- --dot → DNS-over-TLS benchmarking (coming soon)
- --doq → DNS-over-QUIC benchmarking (coming soon)
- --dnssec-validate → DNSSEC trust chain validation (coming soon)
- --zone-transfer → AXFR/IXFR zone transfer testing (coming soon)
- --tsig → TSIG-authenticated queries (coming soon)
- --idna → Internationalized domain name support (coming soon)
🚀 Performance & Concurrency Features
- Async I/O with dnspython - Test 100+ resolvers simultaneously
- Trio framework support - High-concurrency async operations
- Configurable concurrency - Control max concurrent queries
- Retry logic - Exponential backoff for failed queries
- Cache simulation - Test with/without DNS caching
- Multi-iteration benchmarks - Run tests multiple times for accuracy
- Warmup phase - Pre-warm DNS caches before testing
- Statistical analysis - Mean, median, P95, P99, jitter, consistency scores
Example:
dns-benchmark benchmark \
--max-concurrent 200 \
--iterations 5 \
--timeout 3.0 \
--warmup
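The statistics listed above (mean, median, P95, P99, jitter, consistency) can be sketched in plain Python. This is an illustrative implementation, not the tool's actual code: the nearest-rank percentile, jitter as the mean absolute difference between consecutive samples, and the stdev-based consistency score are all assumptions about how such metrics are commonly defined.

```python
import statistics

def latency_summary(samples_ms):
    """Summarize per-query latencies (milliseconds) into benchmark stats.

    Illustrative definitions: nearest-rank percentiles, jitter as the mean
    absolute difference between consecutive samples, and a consistency
    score of 1 - (stdev / mean), so steadier resolvers score higher.
    """
    ordered = sorted(samples_ms)

    def pct(p):
        # Nearest-rank percentile: smallest value covering p% of samples.
        rank = -(-len(ordered) * p // 100)  # ceiling division
        return ordered[max(rank, 1) - 1]

    mean = statistics.fmean(samples_ms)
    jitter = (
        statistics.fmean(abs(b - a) for a, b in zip(samples_ms, samples_ms[1:]))
        if len(samples_ms) > 1
        else 0.0
    )
    return {
        "mean": mean,
        "median": statistics.median(samples_ms),
        "p95": pct(95),
        "p99": pct(99),
        "jitter": jitter,
        "consistency": 1 - statistics.pstdev(samples_ms) / mean if mean else 0.0,
    }
```

For example, feeding in latencies of 1–100 ms yields a P95 of 95 and a P99 of 99 under the nearest-rank definition.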
🔒 Security & Privacy Features
- DNSSEC validation - Verify cryptographic trust chains
- DNS-over-HTTPS (DoH) - Encrypted DNS benchmarking via HTTPS
- DNS-over-TLS (DoT) - Secure transport layer testing
- DNS-over-QUIC (DoQ) - Experimental QUIC protocol support
- TSIG authentication - Transaction signatures for enterprise DNS
- EDNS0 support - Extended DNS features and larger payloads
Example:
# Test DoH resolvers
dns-benchmark benchmark \
--doh \
--resolvers doh-providers.json \
--dnssec-validate
🏢 Enterprise & Migration Features
- Zone transfers (AXFR/IXFR) - Full and incremental zone transfer validation
- Dynamic DNS updates - Test DNS write operations and updates
- EDNS0 support - Extended DNS options, client subnet, larger payloads
- Windows WMI integration - Auto-detect active system DNS settings
- Compliance reporting - Generate audit-ready PDF/Excel reports
- SLA validation - Track uptime and performance thresholds
Example:
# Validate DNS migration (--zone-transfer coming soon)
dns-benchmark benchmark \
  --resolvers old-provider.json,new-provider.json \
  --zone-transfer \
  --output migration-report/ \
  --formats pdf,excel
📊 Analysis & Reporting Features
- Per-domain statistics - Analyze performance by domain
- Per-record-type stats - Compare A, AAAA, MX, TXT, etc.
- Error breakdown - Categorize and count error types
- Comparison matrices - Side-by-side resolver comparisons
- Trend analysis - Performance over time (with multiple runs)
- Best-by-criteria - Find best resolver by latency/reliability/consistency
Example:
# Detailed analysis
dns-benchmark benchmark \
  --use-defaults \
  --domain-stats \
  --record-type-stats \
  --error-breakdown \
  --formats csv,excel,pdf
🌐 Internationalization & Compatibility
- IDNA support - Internationalized domain names (IDN)
- Multiple record types - A, AAAA, MX, TXT, CNAME, NS, SOA, PTR, SRV, CAA
- Cross-platform - Linux, macOS, Windows (native support)
- CI/CD integration - JSON output, proper exit codes, quiet mode
- Custom resolvers - Load from JSON, test your own DNS servers
- Custom domains - Test against your specific domain list
Example:
# Test internationalized domains
dns-benchmark benchmark \
--domains international-domains.txt \
--record-types A,AAAA,MX \
--resolvers custom-resolvers.json
💡 Most users only need basic features. These advanced capabilities are available when you need them.
💼 Use Cases
🔧 For Developers: Optimize API Performance
# Find fastest DNS for your API endpoints
dns-benchmark benchmark \
--domains api.myapp.com,cdn.myapp.com \
--record-types A,AAAA \
--resolvers production.json \
--iterations 10
Result: Reduce API latency by 100-300ms
🛡️ For DevOps/SRE: Validate Before Migration
# Test new DNS provider before switching (--dnssec-validate coming soon)
dns-benchmark benchmark \
  --resolvers current-dns.json,new-dns.json \
  --use-defaults \
  --dnssec-validate \
  --output migration-report/ \
  --formats csv,excel
Result: Verify performance and security before migration
🏠 For Self-Hosters: Prove Pi-hole Performance
# Compare Pi-hole against public resolvers (coming soon)
dns-benchmark compare \
--resolvers pihole.local,1.1.1.1,8.8.8.8,9.9.9.9 \
--domains common-sites.txt \
--rounds 10
Result: Data-driven proof your self-hosted DNS is faster (or not!)
📊 For Network Admins: Automated Health Checks
# Add to crontab for monthly reports
0 0 1 * * dns-benchmark benchmark \
--use-defaults \
--output /var/reports/dns/ \
--formats excel,csv \
--domain-stats \
--error-breakdown
Result: Automated compliance and SLA reporting
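A cron or CI job can also gate on the exported summary CSV and fail the run when a resolver breaches an SLA budget. The column names below are assumptions about the summary layout, not the tool's documented schema — adjust them to the actual header of your export.

```python
import csv

# Hypothetical column names -- adjust to the actual summary CSV header.
RESOLVER_COL = "resolver"
LATENCY_COL = "avg_latency_ms"

def check_sla(summary_csv, max_avg_ms=100.0):
    """Return resolvers whose average latency breaches the SLA budget."""
    with open(summary_csv, newline="") as fh:
        rows = csv.DictReader(fh)
        return [r[RESOLVER_COL] for r in rows if float(r[LATENCY_COL]) > max_avg_ms]

def main(argv):
    offenders = check_sla(argv[1])
    if offenders:
        print("SLA breach:", ", ".join(offenders))
        return 1  # non-zero exit fails the cron/CI job
    return 0
```

Wrap it with `sys.exit(main(sys.argv))` in a script so the non-zero return code propagates to the scheduler.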
🔐 For Privacy Advocates: Test Encrypted DNS
# Benchmark privacy-focused DoH/DoT resolvers (--doh coming soon)
dns-benchmark benchmark \
  --doh \
  --resolvers privacy-resolvers.json \
  --domains sensitive-sites.txt \
  --dnssec-validate
Result: Find fastest encrypted DNS without sacrificing privacy
📦 Installation & Setup
Requirements
- Python 3.9+
- pip package manager
Install from PyPI
pip install dns-benchmark-tool
Install from Source
git clone https://github.com/frankovo/dns-benchmark-tool.git
cd dns-benchmark-tool
pip install -e .
Verify Installation
dns-benchmark --version
dns-benchmark --help
First Run
# Test with defaults (recommended for first time)
dns-benchmark benchmark --use-defaults --formats csv,excel
📖 Usage Examples
Basic Usage
# Basic test with progress bars
dns-benchmark benchmark --use-defaults --formats csv,excel

Basic test without progress bars
dns-benchmark benchmark --use-defaults --formats csv,excel --quiet

Test with custom resolvers and domains
dns-benchmark benchmark --resolvers data/resolvers.json --domains data/domains.txt

Quick test with only CSV output
dns-benchmark benchmark --use-defaults --formats csv
Advanced Usage
# Export a machine-readable bundle
dns-benchmark benchmark --use-defaults --json --output ./results

Test specific record types
dns-benchmark benchmark --use-defaults --formats csv,excel --record-types A,AAAA,MX

Custom output location and formats
dns-benchmark benchmark \
  --use-defaults \
  --output ./my-results \
  --formats csv,excel

Include detailed statistics
dns-benchmark benchmark \
  --use-defaults \
  --formats csv,excel \
  --record-type-stats \
  --error-breakdown

High concurrency with retries
dns-benchmark benchmark \
  --use-defaults \
  --formats csv,excel \
  --max-concurrent 200 \
  --timeout 3.0 \
  --retries 3

Website migration planning
dns-benchmark benchmark \
  --resolvers data/global_resolvers.json \
  --domains data/migration_domains.txt \
  --formats excel,pdf \
  --output ./migration_analysis

DNS provider selection
dns-benchmark benchmark \
  --resolvers data/provider_candidates.json \
  --domains data/business_domains.txt \
  --formats csv,excel \
  --output ./provider_selection

Network troubleshooting
dns-benchmark benchmark \
  --resolvers "192.168.1.1,1.1.1.1,8.8.8.8" \
  --domains "problematic-domain.com,working-domain.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting

Security assessment
dns-benchmark benchmark \
  --resolvers data/security_resolvers.json \
  --domains data/security_test_domains.txt \
  --formats pdf \
  --output ./security_assessment

Performance monitoring
dns-benchmark benchmark \
  --use-defaults \
  --formats csv \
  --quiet \
  --output /var/log/dns_benchmark/$(date +%Y%m%d_%H%M%S)

New top commands
Run a basic benchmark (default: rank by latency)
dns-benchmark top
→ Tests all resolvers with sample domains, ranks by latency
Limit the number of resolvers shown
dns-benchmark top --limit 5
→ Shows only the top 5 resolvers
Rank by success rate
dns-benchmark top --metric success
→ Ranks resolvers by highest success rate
Rank by reliability (combined score: success rate + latency)
dns-benchmark top --metric reliability
→ Uses weighted score to rank resolvers
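A weighted score like the one behind --metric reliability can be sketched in a few lines. The 70/30 weighting and the 500 ms latency ceiling below are illustrative guesses at such a score, not the tool's actual formula.

```python
def reliability_score(success_rate, avg_latency_ms,
                      latency_ceiling_ms=500.0, weight_success=0.7):
    """Blend success rate (0-1) with normalized latency into one score.

    Latency is inverted and clamped: 0 ms maps to 1.0, and anything at or
    above latency_ceiling_ms maps to 0.0. Higher scores rank better.
    """
    latency_component = max(0.0, 1.0 - avg_latency_ms / latency_ceiling_ms)
    return weight_success * success_rate + (1 - weight_success) * latency_component
```

Sorting resolvers by this score descending reproduces a combined "success rate + latency" ranking.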
Filter resolvers by category
dns-benchmark top --category privacy
dns-benchmark top --category family
dns-benchmark top --category security
→ Tests only resolvers in the specified category
Use a custom domain list
dns-benchmark top --domains domains.txt
→ Loads domains from a text file instead of built-in sample list
Specify DNS record types
dns-benchmark top --record-types A,AAAA,MX
→ Queries multiple record types (comma-separated)
Adjust timeout and concurrency
dns-benchmark top --timeout 3.0 --max-concurrent 50
→ Sets query timeout to 3 seconds and limits concurrency to 50
Export results to JSON
dns-benchmark top --output results.json
→ Saves results in JSON format
Export results to CSV
dns-benchmark top --output results.csv
→ Saves results in CSV format
Export results to TXT
dns-benchmark top --output results.txt
→ Saves results in plain text format
Quiet mode (no progress bar, CI/CD friendly)
dns-benchmark top --quiet
→ Suppresses progress output
Example combined usage
dns-benchmark top --limit 10 --metric reliability --category privacy --output top_resolvers.csv
→ Benchmarks privacy resolvers, ranks by reliability, shows top 10, exports to CSV
New compare commands
Comparison of resolvers by name
dns-benchmark compare Cloudflare Google Quad9
^ Compares Cloudflare, Google, and Quad9 resolvers using default domains and record type A
Basic compare resolvers by IP address
dns-benchmark compare 1.1.1.1 8.8.8.8 9.9.9.9
^ Directly specify resolver IPs instead of names
Increase iterations for more stable results
dns-benchmark compare "Cloudflare" "Google" --iterations 5
^ Runs 5 rounds of queries per resolver/domain/record type
Use a custom domain list from file
dns-benchmark compare Cloudflare Google -d ./data/domains.txt
^ Loads domains from domains.txt instead of sample domains
Query multiple record types
dns-benchmark compare Cloudflare Google -t A,AAAA,MX
^ Tests A, AAAA, and MX records for each domain
Adjust timeout and concurrency
dns-benchmark compare Cloudflare Google --timeout 3.0 --max-concurrent 200
^ Sets query timeout to 3 seconds and allows 200 concurrent queries
Export results to JSON
dns-benchmark compare Cloudflare Google -o results.json
^ Saves comparison summary to results.json
Export results to CSV
dns-benchmark compare Cloudflare Google -o results.csv
^ Saves comparison summary to results.csv (via CSVExporter)
Suppress progress output
dns-benchmark compare Cloudflare Google --quiet
^ Runs silently, only prints final results
Show detailed per-domain breakdown
dns-benchmark compare Cloudflare Google --show-details
^ Prints average latency and success counts per domain for each resolver
New monitoring commands
Start monitoring with default resolvers and sample domains
dns-benchmark monitoring --use-defaults
^ Runs indefinitely, checking every 60s, using built-in resolvers and 5 sample domains
Monitor with a custom resolver list from JSON
dns-benchmark monitoring -r resolvers.json --use-defaults
^ Loads resolvers from resolvers.json, domains from defaults
Monitor with a custom domain list
dns-benchmark monitoring -d domains.txt --use-defaults
^ Uses default resolvers, but domains are loaded from domains.txt
Change monitoring interval to 30 seconds
dns-benchmark monitoring --use-defaults --interval 30
^ Runs checks every 30 seconds instead of 60
Run monitoring for a fixed duration (e.g., 1 hour = 3600 seconds)
dns-benchmark monitoring --use-defaults --duration 3600
^ Stops automatically after 1 hour
Set stricter alert thresholds
dns-benchmark monitoring --use-defaults --alert-latency 150 --alert-failure-rate 5
^ Alerts if latency >150ms or failure rate >5%
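The threshold logic behind --alert-latency and --alert-failure-rate reduces to a simple predicate. This is a sketch with illustrative names, mirroring the documented behavior (alert when either limit is exceeded):

```python
def should_alert(avg_latency_ms, failure_rate_pct,
                 alert_latency_ms=150.0, alert_failure_rate_pct=5.0):
    """Return the list of threshold breaches; an empty list means no alert."""
    reasons = []
    if avg_latency_ms > alert_latency_ms:
        reasons.append(f"latency {avg_latency_ms:.0f}ms > {alert_latency_ms:.0f}ms")
    if failure_rate_pct > alert_failure_rate_pct:
        reasons.append(
            f"failure rate {failure_rate_pct:.1f}% > {alert_failure_rate_pct:.1f}%"
        )
    return reasons
```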
Save monitoring results to a log file
dns-benchmark monitoring --use-defaults --output monitor.log
^ Appends results and alerts to monitor.log
Combine options: custom resolvers, domains, interval, duration, and logging
dns-benchmark monitoring -r resolvers.json -d domains.txt -i 45 --duration 1800 -o monitor.log
^ Monitors resolvers from resolvers.json against domains.txt every 45s, for 30 minutes, logging to monitor.log
Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
⚠️ Note for new commands: Resolvers with no successful queries are excluded from ranking and will display Avg Latency: N/A.
Inline input support for resolvers and domains
This patch introduces full support for comma-separated inline values for the
--resolvers and --domains flags, fixing issue #39 and improving CLI usability without breaking any existing workflows.
New capabilities
- Inline resolvers: --resolvers "1.1.1.1,8.8.8.8,9.9.9.9"
- Inline domains: --domains "google.com,github.com"
- Single values: --resolvers "1.1.1.1" or --domains "google.com"
- Named resolvers: --resolvers "cloudflare,google,quad9"
- Mixed input: --resolvers "1.1.1.1,cloudflare,8.8.8.8"
Backward compatibility
- All existing file-based configurations continue to work
- No breaking changes to the CLI
- File detection takes priority over inline parsing
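The precedence rule (file detection takes priority over inline parsing) can be sketched as follows; the function name and return shape are hypothetical, not the tool's API.

```python
import os

def parse_resolver_arg(value):
    """Resolve a --resolvers/--domains argument.

    If the value names an existing file, defer to the file loader
    (file detection takes priority); otherwise treat it as a
    comma-separated inline list of IPs, names, or domains.
    """
    if os.path.isfile(value):
        return {"source": "file", "path": value}
    items = [item.strip() for item in value.split(",") if item.strip()]
    return {"source": "inline", "items": items}
```

A single value like "1.1.1.1" simply yields a one-element inline list, which is why single values work for free.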
Usage Examples
Before (Only files worked)
dns-benchmark benchmark \
--resolvers data/resolvers.json \
--domains data/domains.txt
After (Both work)
# Inline (New)
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8,9.9.9.9" \
  --domains "google.com,github.com" \
  --timeout 10 \
  --retries 3 \
  --formats csv \
  --output ./troubleshooting

Files (STILL WORKS)
dns-benchmark benchmark \
  --resolvers data/resolvers.json \
  --domains data/domains.txt \
  --formats csv
Named resolvers
# Named resolvers
dns-benchmark benchmark \
--resolvers "Cloudflare,Google,Quad9" \
--domains "google.com,github.com" \
--timeout 10 \
--retries 3 \
--formats csv \
--output ./troubleshooting_named
Mixed input
# Mixed input
dns-benchmark benchmark \
--resolvers "1.1.1.1,Cloudflare,8.8.8.8" \
--domains "google.com,github.com" \
--timeout 10 \
--retries 3 \
--formats csv \
--output ./troubleshooting_mixed
Single
# Single
dns-benchmark benchmark \
--resolvers "1.1.1.1" \
--domains "google.com" \
--timeout 10 \
--retries 3 \
--formats csv \
--output ./troubleshooting
🔧 Utilities
Feedback
# Provide feedback
dns-benchmark feedback
Resolver management
# Show default resolvers and domains
dns-benchmark list-defaults

Browse all available resolvers
dns-benchmark list-resolvers
Browse with detailed information
dns-benchmark list-resolvers --details
Filter by category
dns-benchmark list-resolvers --category security
dns-benchmark list-resolvers --category privacy
dns-benchmark list-resolvers --category family
Export resolvers to different formats
dns-benchmark list-resolvers --format csv
dns-benchmark list-resolvers --format json
Domain management
# List all test domains
dns-benchmark list-domains

Show domains by category
dns-benchmark list-domains --category tech
dns-benchmark list-domains --category ecommerce
dns-benchmark list-domains --category social
Limit results
dns-benchmark list-domains --count 10
dns-benchmark list-domains --category news --count 5
Export domain list
dns-benchmark list-domains --format csv
dns-benchmark list-domains --format json
Category overview
# View all available categories
dns-benchmark list-categories
Configuration management
# Generate sample configuration
dns-benchmark generate-config --output sample_config.yaml

Category-specific configurations
dns-benchmark generate-config --category security --output security_test.yaml
dns-benchmark generate-config --category family --output family_protection.yaml
dns-benchmark generate-config --category performance --output performance_test.yaml
Custom configuration for specific use case
dns-benchmark generate-config --category privacy --output privacy_audit.yaml
Complete usage guide
Quick performance test
# Basic test with progress bars
dns-benchmark benchmark --use-defaults

Quick test with only CSV output
dns-benchmark benchmark --use-defaults --formats csv --quiet

Test specific record types
dns-benchmark benchmark --use-defaults --record-types A,AAAA,MX
Add-on analytics flags:
# Include domain and record-type analytics and error breakdown
dns-benchmark benchmark --use-defaults \
--domain-stats --record-type-stats --error-breakdown
JSON export:
# Export a machine-readable bundle
dns-benchmark benchmark --use-defaults --json --output ./results
Network administrator
# Compare internal vs external DNS
dns-benchmark benchmark \
  --resolvers "192.168.1.1,1.1.1.1,8.8.8.8,9.9.9.9" \
  --domains "internal.company.com,google.com,github.com,api.service.com" \
  --formats excel,pdf \
  --timeout 3 \
  --max-concurrent 50 \
  --output ./network_audit

Test DNS failover scenarios
dns-benchmark benchmark \
  --resolvers data/primary_resolvers.json \
  --domains data/business_critical_domains.txt \
  --record-types A,AAAA \
  --retries 3 \
  --formats csv,excel \
  --output ./failover_test
ISP & network operator
# Comprehensive ISP resolver comparison
dns-benchmark benchmark \
  --resolvers data/isp_resolvers.json \
  --domains data/popular_domains.txt \
  --timeout 5 \
  --max-concurrent 100 \
  --formats csv,excel,pdf \
  --output ./isp_performance_analysis

Regional performance testing
dns-benchmark benchmark \
  --resolvers data/regional_resolvers.json \
  --domains data/regional_domains.txt \
  --formats excel \
  --quiet \
  --output ./regional_analysis
Developer & DevOps
# Test application dependencies
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8" \
  --domains "api.github.com,registry.npmjs.org,pypi.org,docker.io,aws.amazon.com" \
  --formats csv \
  --quiet \
  --output ./app_dependencies

CI/CD integration test
dns-benchmark benchmark \
  --resolvers data/ci_resolvers.json \
  --domains data/ci_domains.txt \
  --timeout 2 \
  --formats csv \
  --quiet
Security auditor
# Security-focused resolver testing
dns-benchmark benchmark \
  --resolvers data/security_resolvers.json \
  --domains data/malware_test_domains.txt \
  --formats csv,pdf \
  --output ./security_audit

Privacy-focused testing
dns-benchmark benchmark \
  --resolvers data/privacy_resolvers.json \
  --domains data/tracking_domains.txt \
  --formats excel \
  --output ./privacy_analysis
Enterprise IT
# Corporate network assessment
dns-benchmark benchmark \
  --resolvers data/enterprise_resolvers.json \
  --domains data/corporate_domains.txt \
  --record-types A,AAAA,MX,TXT,SRV \
  --timeout 10 \
  --max-concurrent 25 \
  --retries 2 \
  --formats csv,excel,pdf \
  --output ./enterprise_dns_audit

Multi-location testing
dns-benchmark benchmark \
  --resolvers data/global_resolvers.json \
  --domains data/international_domains.txt \
  --formats excel \
  --output ./global_performance
🔍 README Adjustments for Final Patch
New CLI Options
| Option | Description | Example |
|---|---|---|
| --iterations, -i | Run the full benchmark loop N times | dns-benchmark benchmark --use-defaults -i 3 |
| --use-cache | Allow cached results to be reused across iterations | dns-benchmark benchmark --use-defaults -i 3 --use-cache |
| --warmup | Run a full warmup (all resolvers × domains × record types) | dns-benchmark benchmark --use-defaults --warmup |
| --warmup-fast | Run a lightweight warmup (one probe per resolver) | dns-benchmark benchmark --use-defaults --warmup-fast |
| --include-charts | Embed charts and graphs in PDF/Excel reports for visual performance analysis | dns-benchmark benchmark --use-defaults --formats pdf,excel --include-charts |
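The --use-cache behavior can be pictured as a per-run memo keyed by (resolver, domain, record type): the first iteration pays for the query, later iterations reuse the stored result. A simplified sketch under that assumption (class and method names are illustrative):

```python
class QueryCache:
    """Toy model of --use-cache: reuse results across iterations."""

    def __init__(self, enabled=True):
        self.enabled = enabled
        self._store = {}
        self.hits = 0

    def lookup_or_run(self, resolver, domain, rtype, run_query):
        """Return a cached result if available, else execute the query."""
        key = (resolver, domain, rtype)
        if self.enabled and key in self._store:
            self.hits += 1
            return self._store[key]
        result = run_query(resolver, domain, rtype)
        if self.enabled:
            self._store[key] = result
        return result
```

With three iterations over the same query, only the first one hits the network; the other two count as cache hits, which is what the "Cache hits" line in the summary reports.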
⚡ CLI Commands
The DNS Benchmark Tool now includes three specialized commands for different workflows:
🚀 Top
Quickly rank resolvers by speed and reliability.
# Rank resolvers quickly
dns-benchmark top

Use custom domain list
dns-benchmark top -d domains.txt
Export results to JSON
dns-benchmark top -o results.json
📊 Compare
Benchmark resolvers side‑by‑side with detailed statistics.
# Compare Cloudflare, Google, and Quad9
dns-benchmark compare Cloudflare Google Quad9

Compare by IP addresses
dns-benchmark compare 1.1.1.1 8.8.8.8 9.9.9.9
Show detailed per-domain breakdown
dns-benchmark compare Cloudflare Google --show-details
Export results to CSV
dns-benchmark compare Cloudflare Google -o results.csv
🔄 Monitoring
Continuously monitor resolver performance with alerts.
# Monitor default resolvers continuously (every 60s)
dns-benchmark monitoring --use-defaults

Monitor with custom resolvers and domains
dns-benchmark monitoring -r resolvers.json -d domains.txt

Run monitoring for 1 hour with alerts
dns-benchmark monitoring --use-defaults --interval 30 --duration 3600 \
  --alert-latency 150 --alert-failure-rate 5 --output monitor.log
🌟 Command Showcase
| Command | Purpose | Typical Use Case | Key Options | Output |
|---|---|---|---|---|
| top | Quick ranking of resolvers by speed and reliability | Fast check to see which resolver is best right now | --domains, --record-types, --output | Sorted list of resolvers with latency & success rate |
| compare | Side‑by‑side comparison of specific resolvers | Detailed benchmarking across chosen resolvers/domains | --domains, --record-types, --iterations, --output, --show-details | Table of resolvers with latency, success rate, per‑domain breakdown |
| monitoring | Continuous monitoring with alerts | Real‑time tracking of resolver performance over time | --interval, --duration, --alert-latency, --alert-failure-rate, --output, --use-defaults | Live status indicators, alerts, optional log file |
📊 Analysis Enhancements
- Iteration count: displayed when more than one iteration is run.
- Cache hits: shows how many queries were served from cache (when --use-cache is enabled).
- Failure tracking: resolvers with repeated errors are counted and can be inspected with get_failed_resolvers().
- Cache statistics: available via get_cache_stats(), showing the number of cached entries and whether the cache is enabled.
- Warmup results: warmup queries are marked with iteration=0 in raw data, making them easy to filter out in analysis.
Example summary output:
=== BENCHMARK SUMMARY ===
Total queries: 150
Successful: 140 (93.33%)
Average latency: 212.45 ms
Median latency: 198.12 ms
Fastest resolver: Cloudflare
Slowest resolver: Quad9
Iterations: 3
Cache hits: 40 (26.7%)
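The percentages in that summary follow directly from the raw rows once warmup queries (iteration=0) are filtered out. A sketch of the aggregation, assuming each raw row is a dict with iteration, success, and cached keys (the row shape is an assumption, not the tool's schema):

```python
def summarize(results):
    """Aggregate raw query rows into the summary fields shown above.

    Warmup rows (iteration == 0) are excluded first, per the note that
    warmup queries are marked with iteration=0 in the raw data.
    """
    rows = [r for r in results if r["iteration"] != 0]
    total = len(rows)
    successful = sum(1 for r in rows if r["success"])
    cache_hits = sum(1 for r in rows if r.get("cached"))
    return {
        "total": total,
        "success_pct": round(100 * successful / total, 2) if total else 0.0,
        "cache_hit_pct": round(100 * cache_hits / total, 1) if total else 0.0,
    }
```

With 150 non-warmup queries, 140 successes, and 40 cache hits, this reproduces the 93.33% and 26.7% figures from the example output.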
⚡ Best Practices
| Mode | Recommended Flags | Purpose |
|---|---|---|
| Quick Run | --iterations 1 --timeout 1 --retries 0 --warmup-fast | Fast feedback, minimal retries, lightweight warmup. Good for quick checks. |
| Thorough Run | --iterations 3 --use-cache --warmup --timeout 5 --retries 2 | Multiple passes, cache enabled, full warmup. Best for detailed benchmarking. |
| Debug Mode | --iterations 1 --timeout 10 --retries 0 --quiet | Long timeout, no retries, minimal output. Useful for diagnosing resolver issues. |
| Balanced Run | --iterations 2 --use-cache --warmup-fast --timeout 2 --retries 1 | A middle ground: moderate speed, some retries, cache enabled, quick warmup. |
Feedback & Community Input
We value your input! Help us improve dns-benchmark by sharing your experience and DNS challenges.
Feedback Command
Open the feedback form directly from CLI:
dns-benchmark feedback
This command:
- Opens the feedback survey in your default browser
- Takes ~2 minutes to complete
- Directly shapes our roadmap and priorities
- Automatically marks feedback as given (won't prompt again)
Survey link: https://forms.gle/BJBiyBFvRJHskyR57
Smart Feedback Prompts
To avoid being intrusive, dns-benchmark uses intelligent prompting:
When prompts appear:
- After your 5th, 15th, and 30th benchmark run
- With a 24-hour cooldown between prompts
- Only if you haven't already given feedback
Auto-dismiss conditions:
- You've already submitted feedback
- You've dismissed the prompt 3 times
- You've opted out via environment variable
Example prompt:
──────────────────────────────────────────────────────────
📢 Quick feedback request

Help shape dns-benchmark! Share your biggest DNS challenge.

→ https://forms.gle/BJBiyBFvRJHskyR57 (2 min survey)
→ Or run: dns-benchmark feedback
──────────────────────────────────────────────────────────
Show this again? (y/n) [y]:
Privacy & Data Storage
What we store locally:
dns-benchmark stores feedback prompt state in ~/.dns-benchmark/feedback.json
Contents:
{
"total_runs": 15,
"feedback_given": false,
"dismissed_count": 0,
"last_shown": 1699876543,
"version": "1.0"
}
Privacy notes:
- ✅ All data stored locally on your machine
- ✅ No telemetry or tracking
- ✅ No automatic data transmission
- ✅ File is only read/written during benchmark runs
- ✅ Safe to delete at any time
What we collect (only when you submit feedback):
- Whatever you choose to share in the survey
- We never collect usage data automatically
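The prompting rules above (milestone runs 5/15/30, a 24-hour cooldown, the three-dismissal limit, and the DNS_BENCHMARK_NO_FEEDBACK opt-out) can be sketched as one predicate over the state file shown earlier. The function name is illustrative; the state keys follow the feedback.json example.

```python
import os
import time

PROMPT_RUNS = {5, 15, 30}          # milestone runs that may trigger a prompt
COOLDOWN_SECONDS = 24 * 60 * 60    # 24-hour cooldown between prompts

def should_prompt(state, now=None):
    """Decide whether to show the feedback prompt for this run.

    `state` mirrors ~/.dns-benchmark/feedback.json.
    """
    now = time.time() if now is None else now
    if os.environ.get("DNS_BENCHMARK_NO_FEEDBACK") == "1":
        return False
    if state.get("feedback_given") or state.get("dismissed_count", 0) >= 3:
        return False
    if state.get("total_runs") not in PROMPT_RUNS:
        return False
    return now - state.get("last_shown", 0) >= COOLDOWN_SECONDS
```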
Opting Out
Method 1: Dismiss the prompt
When prompted, type n to dismiss:
Show this again? (y/n) [y]: n
✓ Got it! We won't ask again. Thanks for using dns-benchmark!
After 3 dismissals, prompts stop permanently.
Method 2: Environment variable (complete disable)
# Bash/Zsh
export DNS_BENCHMARK_NO_FEEDBACK=1

Windows PowerShell
$env:DNS_BENCHMARK_NO_FEEDBACK="1"

Permanently (add to ~/.bashrc or ~/.zshrc)
echo 'export DNS_BENCHMARK_NO_FEEDBACK=1' >> ~/.bashrc
Method 3: Delete state file
rm ~/.dns-benchmark/feedback.json
Method 4: CI/CD environments Feedback prompts are automatically disabled when:
CI=trueenvironment variable is set (standard in GitHub Actions, GitLab CI, etc.)--quietflag is used
Reset for testing (developers):
dns-benchmark reset-feedback # Hidden command
⚙️ Configuration Files
Resolvers JSON format
{
"resolvers": [
{
"name": "Cloudflare",
"ip": "1.1.1.1",
"ipv6": "2606:4700:4700::1111"
},
{
"name": "Google DNS",
"ip": "8.8.8.8",
"ipv6": "2001:4860:4860::8888"
}
]
}
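A loader for this format can be sketched in a few lines; the required keys follow the example above, while the IP validation step is an assumption about sensible input checking, not the tool's documented behavior.

```python
import ipaddress
import json

def load_resolvers(path):
    """Load and lightly validate a resolvers JSON file."""
    with open(path) as fh:
        data = json.load(fh)
    resolvers = []
    for entry in data["resolvers"]:
        ipaddress.ip_address(entry["ip"])        # raises ValueError on bad IPs
        if "ipv6" in entry:
            ipaddress.ip_address(entry["ipv6"])  # IPv6 entry is optional
        resolvers.append(entry)
    return resolvers
```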
Domains text file format
# Popular websites
google.com
github.com
stackoverflow.com

# Corporate domains
microsoft.com
apple.com
amazon.com

# CDN and cloud
cloudflare.com
aws.amazon.com
Output formats
CSV outputs
- Raw data: individual query results with timestamps and metadata
- Summary statistics: aggregated metrics per resolver
- Domain statistics: per-domain metrics (when --domain-stats)
- Record type statistics: per-record-type metrics (when --record-type-stats)
- Error breakdown: counts by error type (when --error-breakdown)
Excel report
- Raw data sheet: all query results with formatting
- Resolver summary: comprehensive statistics with conditional formatting
- Domain stats: per-domain performance (optional)
- Record type stats: per-record-type performance (optional)
- Error breakdown: aggregated error counts (optional)
- Performance analysis: charts and comparative analysis
PDF report
- Executive summary: key findings and recommendations
- Performance charts: latency comparison; optional success rate chart
- Resolver rankings: ordered by average latency
- Detailed analysis: technical deep‑dive with percentiles
📄 Optional PDF Export
By default, the tool supports CSV and Excel exports.
PDF export requires the extra dependency weasyprint, which is not installed automatically to avoid runtime issues on some platforms.
Install with PDF support
pip install dns-benchmark-tool[pdf]
Usage
Once installed, you can request PDF output via the CLI:
dns-benchmark benchmark --use-defaults --formats pdf --output ./results
If weasyprint is not installed and you request PDF output, the CLI will show:
[-] Error during benchmark: PDF export requires 'weasyprint'. Install with: pip install dns-benchmark-tool[pdf]
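That error message suggests a lazy optional-dependency import: weasyprint is only imported when PDF output is actually requested, so the base install never pays for it. A common sketch of the pattern (the function name is illustrative, not the tool's API):

```python
def get_pdf_exporter():
    """Import weasyprint only when PDF output is requested."""
    try:
        import weasyprint  # optional extra: dns-benchmark-tool[pdf]
    except ImportError as exc:
        raise RuntimeError(
            "PDF export requires 'weasyprint'. "
            "Install with: pip install dns-benchmark-tool[pdf]"
        ) from exc
    return weasyprint
```

Keeping the import inside the function means CSV/Excel runs work even on systems where the WeasyPrint system libraries are absent.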
⚠️ WeasyPrint Setup (for PDF export)
The DNS Benchmark Tool uses WeasyPrint to generate PDF reports.
If you want PDF export, you need extra system libraries in addition to the Python package.
🛠 Linux (Debian/Ubuntu)
sudo apt install python3-pip libpango-1.0-0 libpangoft2-1.0-0 \
libharfbuzz-subset0 libjpeg-dev libopenjp2-7-dev libffi-dev
🛠 macOS (Homebrew)
brew install pango cairo libffi gdk-pixbuf jpeg openjpeg harfbuzz
🛠 Windows
Install GTK+ libraries using one of these methods:
MSYS2: Download MSYS2, then run:
pacman -S mingw-w64-x86_64-gtk3 mingw-w64-x86_64-libffi
GTK+ 64‑bit Installer: Download GTK+ Runtime and run the installer.
Restart your terminal after installation.
✅ Verify Installation
After installing the system libraries, install the Python extra:
pip install dns-benchmark-tool[pdf]
Then run:
dns-benchmark benchmark --use-defaults --formats pdf --output ./results
JSON export
- Machine‑readable bundle including:
- Overall statistics
- Resolver statistics
- Raw query results
- Domain statistics
- Record type statistics
- Error breakdown
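Because the JSON bundle is machine-readable, it plugs directly into scripts. The sketch below ranks resolvers by average latency; the key names (`resolver_statistics`, `avg_latency_ms`) are assumptions for illustration, so inspect a real export to confirm the actual structure.

```python
import json

def rank_resolvers(bundle_text: str) -> list[str]:
    """Order resolvers fastest-first from a JSON export bundle.

    Key names ('resolver_statistics', 'avg_latency_ms') are illustrative
    assumptions about the bundle layout, not a documented schema.
    """
    bundle = json.loads(bundle_text)
    stats = bundle["resolver_statistics"]
    return sorted(stats, key=lambda name: stats[name]["avg_latency_ms"])

bundle = json.dumps({
    "resolver_statistics": {
        "Google": {"avg_latency_ms": 18.0},
        "Cloudflare": {"avg_latency_ms": 13.2},
    }
})
print(rank_resolvers(bundle))  # → ['Cloudflare', 'Google']
```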
Generate Sample Config
dns-benchmark generate-config \
--category privacy \
--output my-config.yaml
Performance optimization
# Large-scale testing (1000+ queries)
dns-benchmark benchmark \
  --resolvers data/many_resolvers.json \
  --domains data/many_domains.txt \
  --max-concurrent 50 \
  --timeout 3 \
  --quiet \
  --formats csv

# Unstable networks
dns-benchmark benchmark \
  --resolvers data/backup_resolvers.json \
  --domains data/critical_domains.txt \
  --timeout 10 \
  --retries 3 \
  --max-concurrent 10

# Quick diagnostics
dns-benchmark benchmark \
  --resolvers "1.1.1.1,8.8.8.8" \
  --domains "google.com,cloudflare.com" \
  --formats csv \
  --quiet \
  --timeout 2
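To see how `--max-concurrent`, `--timeout`, and `--retries` interact, here is a small asyncio sketch of the pattern: a semaphore bounds in-flight queries, each attempt gets its own timeout, and failed attempts are retried. The `fake_lookup` stand-in and the exact retry semantics are assumptions for illustration, not the tool's implementation.

```python
import asyncio

async def fake_lookup(domain: str) -> float:
    """Stand-in for a DNS query; returns a pretend latency in ms."""
    await asyncio.sleep(0.001)
    return 12.5

async def run_benchmark(domains, max_concurrent=50, timeout=3.0, retries=3):
    """Bounded-concurrency benchmark loop with per-attempt timeouts."""
    sem = asyncio.Semaphore(max_concurrent)  # mirrors --max-concurrent

    async def one(domain):
        async with sem:
            for attempt in range(retries):  # mirrors --retries
                try:
                    return await asyncio.wait_for(fake_lookup(domain), timeout)
                except asyncio.TimeoutError:
                    if attempt == retries - 1:
                        return None  # give up after the last attempt
    return await asyncio.gather(*(one(d) for d in domains))

results = asyncio.run(run_benchmark(["google.com", "cloudflare.com"]))
print(results)  # → [12.5, 12.5]
```

This is why lowering `--max-concurrent` helps on unstable networks: fewer simultaneous queries means less self-inflicted congestion and fewer spurious timeouts.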
Troubleshooting
# Command not found
pip install -e .
python -m dns_benchmark.cli --help

# PDF generation fails (Ubuntu/Debian)
sudo apt-get install libcairo2 libpango-1.0-0 libpangocairo-1.0-0 \
  libgdk-pixbuf2.0-0 libffi-dev shared-mime-info

# Or skip PDF
dns-benchmark benchmark --use-defaults --formats csv,excel
Network timeouts
dns-benchmark benchmark --use-defaults --timeout 10 --retries 3
dns-benchmark benchmark --use-defaults --max-concurrent 25
Debug mode
# Verbose run
python -m dns_benchmark.cli benchmark --use-defaults --formats csv

# Minimal configuration
dns-benchmark benchmark --resolvers "1.1.1.1" --domains "google.com" --formats csv
Automation & CI
Cron jobs
# Daily monitoring (note: % must be escaped as \% in crontab command fields)
0 2 * * * /usr/local/bin/dns-benchmark benchmark --use-defaults --formats csv --quiet --output /var/log/dns_benchmark/daily_$(date +\%Y\%m\%d)

# Time-based variability (every 6 hours)
0 */6 * * * /usr/local/bin/dns-benchmark benchmark --use-defaults --formats csv --quiet --output /var/log/dns_benchmark/$(date +\%Y\%m\%d_\%H)
GitHub Actions example
- name: DNS Performance Test
run: |
pip install dns-benchmark-tool
dns-benchmark benchmark \
--resolvers "1.1.1.1,8.8.8.8" \
--domains "api.service.com,database.service.com" \
--formats csv \
--quiet
Screenshots
Place images in docs/screenshots/:
docs/screenshots/cli_run.png
docs/screenshots/excel_report.png
docs/screenshots/pdf_summary.png
docs/screenshots/pdf_charts.png
docs/screenshots/excel_charts.png
docs/screenshots/real_time_monitoring.png
1. CLI Benchmark Run
2. Excel Report Output
3. PDF Executive Summary
4. PDF Charts
5. Excel Charts
6. Real Time Monitoring
Getting help
dns-benchmark --help
dns-benchmark benchmark --help
dns-benchmark list-resolvers --help
dns-benchmark list-domains --help
dns-benchmark list-categories --help
dns-benchmark generate-config --help
Common scenarios:
# I'm new — where to start?
dns-benchmark list-defaults
dns-benchmark benchmark --use-defaults

# Test specific resolvers
dns-benchmark list-resolvers --category security
dns-benchmark benchmark --resolvers data/security_resolvers.json --use-defaults

# Generate a management report
dns-benchmark benchmark --use-defaults --formats excel,pdf \
  --domain-stats --record-type-stats --error-breakdown --json \
  --output ./management_report
Release workflow
Prerequisites
- GPG key configured: run make gpg-check to verify.
- Branch protection: main requires signed commits and passing CI.
- CI publish: triggered on signed tags matching vX.Y.Z.
Prepare release (signed)
Patch/minor/major bump:
make release-patch   # or: make release-minor / make release-major
- Updates versions.
- Creates or reuses release/X.Y.Z.
- Makes a signed commit and pushes the branch.
Open PR: from release/X.Y.Z into main, then merge once CI passes.
Tag and publish
Create signed tag and push:
make release-tag VERSION=X.Y.Z
- Tags main with vX.Y.Z (signed).
- CI publishes to PyPI.
Manual alternative
Create branch and commit signed:
git checkout -b release/$NEXT_VERSION   # follow the release/X.Y.Z pattern
git add .
git commit -S -m "Release $NEXT_VERSION"
git push origin release/$NEXT_VERSION
Open PR and merge into main. Then tag:
make release-tag VERSION=$NEXT_VERSION
Notes
- Signed commits: git commit -S ...
- Signed tags: git tag -s vX.Y.Z -m "Release vX.Y.Z"
- Version sources: pyproject.toml and src/dns_benchmark/__init__.py
🌐 Hosted Version (Coming Soon)
CLI stays free forever. The hosted version adds features impossible to achieve locally:
🌍 Multi-Region Testing
Test from US-East, US-West, EU, Asia simultaneously. See how your DNS performs for users worldwide.
📊 Historical Tracking
Monitor DNS performance over time. Identify trends, degradation, and optimize continuously.
🚨 Smart Alerts
Get notified via Email, Slack, PagerDuty when DNS performance degrades or SLA thresholds are breached.
👥 Team Collaboration
Share results, dashboards, and reports across your team. Role-based access control.
📈 SLA Compliance
Automated monthly reports proving DNS provider meets SLA guarantees. Audit-ready documentation.
🔌 API Access
Integrate DNS monitoring into your existing observability stack. Prometheus, Datadog, Grafana.
Join the Waitlist → | Early access gets 50% off for 3 months
🛣️ Roadmap
✅ Current Release (CLI Edition)
- Benchmark DNS resolvers across domains and record types
- Export to CSV, Excel, PDF, JSON
- Statistical analysis (P95, P99, jitter, consistency)
- Automation support (CI/CD, cron)
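The statistics listed above (P95, P99, jitter) can be reproduced from raw latency samples in a few lines. The definitions below are common conventions (nearest-rank percentile, jitter as standard deviation) and are not necessarily the tool's exact formulas:

```python
import statistics

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile; one simple convention among several."""
    ordered = sorted(samples)
    k = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[k]

latencies = [11.0, 12.0, 12.0, 13.0, 14.0, 15.0, 18.0, 21.0, 35.0, 90.0]
p95 = percentile(latencies, 95)          # tail latency, dominated by outliers
jitter = statistics.stdev(latencies)     # jitter as sample standard deviation
print(p95)  # → 90.0
```

Note how a single 90 ms outlier drives P95 far above the 14 ms median: tail percentiles are exactly why averaging alone is not enough when comparing resolvers.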
🚧 Hosted Version (Q1 2026)
CLI stays free forever. Hosted adds:
- 🌍 Multi-region testing (US, EU, Asia, custom)
- 📊 Historical tracking with charts and trends
- 🚨 Alerts (Email, Slack, PagerDuty, webhooks)
- 👥 Team collaboration and sharing
- 📈 SLA compliance reporting
- 🔌 API access and integrations
Join Waitlist for early access
🔜 More Network Tools (Q1-Q2 2026)
Part of BuildTools - Network Performance Suite:
- 🔍 HTTP/HTTPS Benchmark - Test API endpoints and CDNs
- 🔒 SSL Certificate Monitor - Never miss renewals
- 📡 Uptime Monitor - 24/7 availability tracking
- 🌐 API Health Dashboard - Complete network observability
💡 Your Input Matters
Help shape our roadmap:
- 📝 2-minute feedback survey
- 💬 GitHub Discussions
- ⭐ Star us if this helps you!
🤝 Contributing
We love contributions! Here's how you can help:
Ways to Contribute
- 🐛 Report bugs - Open an issue
- 💡 Suggest features - Start a discussion
- 📝 Improve docs - README, examples, tutorials
- 🔧 Submit PRs - Bug fixes, features, tests
- ⭐ Star the repo - Help others discover the tool
- 📢 Spread the word - Tweet, blog, share
🛠 Development & Makefile Commands
This project includes a Makefile to simplify installation, testing, and code quality checks.
.PHONY: install install-dev uninstall mypy black isort flake8 cov test clean cli-test

# 🔧 Install package (runtime only)
install:
	pip install .

# 🔧 Install package with dev extras (pytest, mypy, flake8, black, isort, etc.)
install-dev:
	pip install .[dev]

# 🔧 Uninstall package
uninstall:
	pip uninstall -y dns-benchmark-tool

# Runtime dependencies:
#   dnspython pandas aiohttp click pyfiglet colorama Jinja2 weasyprint openpyxl pyyaml tqdm matplotlib
# Dev dependencies:
#   mypy black flake8 autopep8 pytest coverage isort

mypy:
	mypy .

isort:
	isort .

black:
	black .

flake8:
	flake8 src tests --ignore=E126,E501,E712,F405,F403,E266,W503 --max-line-length=88 --extend-ignore=E203

cov:
	coverage erase
	coverage run --source=src -m pytest -vv -s
	coverage html

test: mypy black isort flake8 cov

clean:
	rm -rf __pycache__ .pytest_cache htmlcov .coverage coverage.xml \
		build dist *.egg-info .eggs benchmark_results

# Run only the CLI smoke tests marked with @pytest.mark.cli
cli-test:
	pytest -vv -s -m cli tests/test_cli_commands.py
Common usage
# Install runtime only
make install

# Install with dev dependencies
make install-dev

# Run type checks, linting, formatting, and tests
make test

# Run CLI smoke tests only
make cli-test

# Clean build/test artifacts
make clean
Code Guidelines
- Follow PEP 8 style guide
- Add tests for new features
- Update documentation
- Keep PRs focused and atomic
❓ FAQ
Why is my ISP's DNS not fastest?
Local ISP DNS often has caching advantages but may lack:
- Global anycast network (slower for distant domains)
- DNSSEC validation
- Privacy features (DoH/DoT)
- Reliability guarantees
Test both and decide based on YOUR priorities!
How often should I benchmark DNS?
- One-time: When choosing DNS provider
- Monthly: For network health checks
- Before migration: When switching providers
- After issues: To troubleshoot performance
Can I test my own DNS server?
Yes! Just add it to a custom resolvers JSON file:
{
"resolvers": [
{"name": "My DNS", "ip": "192.168.1.1"}
]
}
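If you manage many internal servers, you can generate such a file programmatically instead of editing it by hand. A small sketch in the format shown above (the filename and helper are hypothetical):

```python
import json
from pathlib import Path

def write_resolver_file(path: str, name: str, ip: str) -> None:
    """Write a one-entry custom resolvers file in the format shown above."""
    doc = {"resolvers": [{"name": name, "ip": ip}]}
    Path(path).write_text(json.dumps(doc, indent=2))

write_resolver_file("my_resolvers.json", "My DNS", "192.168.1.1")
# then: dns-benchmark benchmark --resolvers my_resolvers.json --use-defaults
```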
What's the difference between CLI and hosted version?
CLI (Free Forever):
- Run tests from YOUR location
- Save results locally
- Manual execution
- Open source
Hosted (Coming Soon):
- Test from MULTIPLE regions
- Historical tracking
- Automated scheduling
- Alerts and integrations
Is this tool safe to use in production?
Yes! The tool only performs DNS lookups (read operations). It does NOT:
- Modify DNS records
- Perform attacks
- Send data to external servers (unless you enable hosted features)
All tests are standard DNS queries that any resolver handles daily.
Why do results vary between runs?
DNS performance varies due to:
- Network conditions
- DNS caching (resolver and intermediate)
- Server load
- Geographic routing changes
Run multiple iterations (--iterations 5) for more consistent results.
🔗 Links & Support
Official
- Website: buildtools.net
- PyPI: dns-benchmark-tool
- GitHub: frankovo/dns-benchmark-tool
Community
- Feedback: 2-minute survey
- Discussions: GitHub Discussions
- Issues: Bug Reports
Stats
- Downloads: 1,400+ (this week)
- Active Users: 600+
License
This project is licensed under the MIT License — see the LICENSE file for details.
Built with ❤️ by @frankovo
Part of BuildTools - Network Performance Suite