PRs Merged / Week
Last updated: January 13, 2026
Overview
The PRs merged / Week report helps you understand your engineering team's delivery velocity by measuring how many pull requests individual contributors merge per week on average. This metric provides a normalized view of productivity that accounts for out-of-office time, making it fair to compare developers across different teams and time periods.
What This Report Measures
PRs merged / Week shows the average number of pull requests merged per week by active contributors. The metric is calculated as:
PRs merged / week = Total PRs merged on active days ÷ Total active coding days
Key features:
✅ Normalized for time off - Only counts PRs merged on active working days, so vacations and leave don't skew the numbers
✅ Developer-focused - Only includes contributors classified as software engineers
✅ Fair comparison - Enables apples-to-apples comparisons across teams with different work patterns
✅ Trend visibility - Track changes over time to spot velocity shifts
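For readers who want to reproduce the calculation from exported data, here is a minimal Python sketch. The MergedPR record, the active_days mapping, and the 5-day week conversion are illustrative assumptions, not Span's actual data model or API.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, simplified record of a merged PR.
@dataclass
class MergedPR:
    author: str
    merged_on: date

def prs_merged_per_week(prs: list[MergedPR],
                        active_days: dict[str, set[date]]) -> dict[str, float]:
    """Average PRs merged per week for each contributor.

    Only PRs merged on one of the contributor's active coding days count,
    and the rate is normalized by active days.
    """
    rates: dict[str, float] = {}
    for author, days in active_days.items():
        if not days:
            continue
        merged = sum(1 for pr in prs
                     if pr.author == author and pr.merged_on in days)
        weeks = len(days) / 5  # assumption: 5 active days ~ 1 working week
        rates[author] = round(merged / weeks, 1)
    return rates
```

Under these assumptions, 8 PRs merged across 10 active days would come out to 4.0 PRs/week.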
Accessing the Report
Navigation Path
Navigate to Insights > Productivity > Velocity in the main navigation
Select the Delivery tab
Click PRs merged / week
Understanding the Data
Main Visualization
The report displays a time series chart showing PR merge rates over your selected time period. Each data point represents the average number of PRs merged per week during that period.
What you'll see:
Current metric value - Your team's current PR merge rate (e.g., "5.2 PRs/week")
Trend line - Weekly progression showing velocity changes
Benchmark comparison - How you rank against your organization (percentile)
Period-over-period comparison - Change vs. previous time period
Breakdown Views
You can break down the metric by multiple dimensions:
| Dimension | What It Shows | Use Case |
| --- | --- | --- |
| Team/Group | PR merge rates for each team | Identify high-performing teams or bottlenecks |
| IC Level | Compare junior, mid-level, and senior engineers | Understand productivity patterns by experience |
| Individual Contributors | Each developer's merge rate | Performance reviews, coaching opportunities |
| Repository | Which codebases have the most merge activity | Resource allocation, codebase health |
| Time | Weekly trends over the selected period | Spot seasonal patterns or project impacts |
Grid View
The breakdown table shows:
Individual contributor names and teams
Their PR merge rate (to one decimal place)
Percentile ranking within their peer group
Change from previous period (↑ or ↓)
Customizing Your View
Time Range Options
Last 7 days
Last 2 weeks
Last 4 weeks (recommended for stable trends)
Last 3 months
Last 6 months
Last 12 months
Custom date range
💡 Tip: Use 4-6 week windows to smooth out weekly variations and get more reliable trends.
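If you want to apply this tip outside the UI, a trailing rolling mean over weekly values does the same smoothing. The sketch below assumes you have already exported weekly rates (for example via the CSV export described later).

```python
def rolling_mean(weekly_rates: list[float], window: int = 4) -> list[float]:
    """Smooth noisy weekly PR merge rates with a trailing rolling mean.

    A 4-6 week window damps single-week spikes and dips so the underlying
    trend is easier to see.
    """
    smoothed = []
    for i in range(len(weekly_rates)):
        chunk = weekly_rates[max(0, i - window + 1): i + 1]
        smoothed.append(round(sum(chunk) / len(chunk), 2))
    return smoothed

# A single-week dip (e.g. 5.1, 5.4, 2.0, 5.2, 5.3) barely moves the smoothed series.
```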
Filters Available
Basic Filters:
Date Range - Select the time period you want to analyze
Team/Group - Focus on specific teams or compare multiple teams
Person - Drill into individual contributors
Advanced Filters:
Active Contributors Only - Filter out inactive developers
Repository - Analyze specific codebases
AI Code Ratio - Filter PRs by amount of AI-generated code (if enabled)
Aggregation Options
Choose how to aggregate the data:
Average (default) - Mean PR merge rate
P50 - Median value
P75, P90 - Upper percentiles
Sum - Total PRs merged
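As a rough illustration of what these options compute, here is a sketch using Python's statistics module. The exact percentile method Span uses is not documented here, so treat this as an approximation.

```python
import statistics

def aggregate(rates: list[float], how: str = "average") -> float:
    """Aggregate per-contributor PR merge rates.

    Mirrors the report's aggregation picker in spirit; 'sum' here simply
    totals the supplied values.
    """
    if how == "average":
        return statistics.mean(rates)
    if how == "p50":
        return statistics.median(rates)
    if how in ("p75", "p90"):
        q = {"p75": 75, "p90": 90}[how]
        # quantiles(n=100) returns the 1st..99th percentile cut points
        return statistics.quantiles(rates, n=100)[q - 1]
    if how == "sum":
        return sum(rates)
    raise ValueError(f"unknown aggregation: {how}")
```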
How to Use This Report
1. Monitor Team Velocity
What to do:
Set a baseline by reviewing your team's historical PR merge rate over the past 3-6 months
Track week-over-week trends to identify velocity changes
Compare current periods against previous quarters
What to look for:
Consistent delivery rates indicate steady velocity
Sudden drops may signal blockers, process issues, or team capacity problems
Gradual increases often indicate process improvements or team growth
2. Compare Teams Fairly
What to do:
Break down by Team/Group dimension
Review percentile rankings to understand relative performance
Consider team size and project complexity when comparing
What to look for:
Teams with similar structures should have comparable merge rates
Large discrepancies may indicate different PR sizing practices or process inefficiencies
Top-performing teams often have better CI/CD, clearer requirements, or a habit of keeping PRs small
3. Track Individual Performance
What to do:
Use the grid view to see individual contributor metrics
Break down by IC Level to compare within experience tiers
Review trends over 4-6 week periods (not single weeks)
What to look for:
Onboarding progress - New hires should show increasing merge rates as they ramp up
Consistent contributors - Steady rates indicate healthy productivity
Outliers - Both high and low outliers deserve investigation (could indicate issues or exceptional work)
4. Identify Process Bottlenecks
What to do:
Compare PR merge rates with PR Cycle Time metrics (available in the same Delivery category)
Look for patterns where merge rates are low despite high review activity
Red flags:
Low merge rate + high cycle time = slow review/approval process
Low merge rate + high commit activity = PRs sitting in "draft" or blocked state
Declining merge rates across entire team = systemic process issue
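The red flags above can be turned into simple rule-of-thumb checks. The thresholds in this sketch (half the team baseline, 5 days of cycle time, 20 commits/week) are illustrative placeholders, not values recommended by the report.

```python
def flag_bottlenecks(merge_rate: float, cycle_time_days: float,
                     commits_per_week: float, team_baseline: float) -> list[str]:
    """Rule-of-thumb checks for the bottleneck patterns above.

    Tune the thresholds to your own baselines before relying on them.
    """
    flags = []
    low_merge = merge_rate < 0.5 * team_baseline
    if low_merge and cycle_time_days > 5:
        flags.append("Low merge rate + high cycle time: slow review/approval process")
    if low_merge and commits_per_week > 20:
        flags.append("Low merge rate + high commit activity: PRs stuck in draft or blocked")
    return flags
```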
5. Set Healthy Baselines
What to do:
Establish your team's normal range based on historical data (e.g., "Our team typically merges 4-6 PRs/week per person")
Use this baseline to investigate deviations, not to set rigid targets
Consider your team's context: codebase complexity, testing requirements, code review standards
❌ Avoid:
Setting competitive targets between individuals (encourages unhealthy behaviors like artificial PR splitting)
Comparing against industry benchmarks without considering your context
Using this as the sole metric for performance evaluation
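If you want to derive the "normal range" mentioned above from exported data, a band such as mean ± one standard deviation over recent weekly values is one simple, illustrative approach:

```python
import statistics

def baseline_range(weekly_rates: list[float]) -> tuple[float, float]:
    """A simple 'normal range' derived from 3-6 months of weekly team data.

    Mean +/- one standard deviation is an illustrative band, not a target;
    values outside it are prompts to investigate, not judgments.
    """
    mean = statistics.mean(weekly_rates)
    spread = statistics.stdev(weekly_rates)
    return round(mean - spread, 1), round(mean + spread, 1)
```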
6. Investigate Anomalies
When to investigate:
📉 Sustained drops (2+ weeks) - Possible burnout, disengagement, or blockers
📈 Unusual spikes - May indicate corner-cutting, reduced code review rigor, or emergency deployments
🚩 One person far above/below peers - Different work type, process bypass, or need for support
How to investigate:
Drill into the individual's PR details
Review PR size and complexity trends
Check for correlation with other metrics (review cycles, rework rate)
Talk to the developer or team lead
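A sustained-drop check like the one described above could look like the following; the two-week window and 70% threshold are example values to adjust against your own baseline.

```python
def sustained_drop(weekly_rates: list[float], baseline: float,
                   weeks: int = 2, threshold: float = 0.7) -> bool:
    """Flag a sustained drop: the last `weeks` values all fall below
    `threshold` * baseline (e.g. 70% of the team's normal rate)."""
    if len(weekly_rates) < weeks:
        return False
    recent = weekly_rates[-weeks:]
    return all(rate < threshold * baseline for rate in recent)

# Example: baseline 5.0 and two recent weeks at 3.0 and 2.8 would be flagged.
```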
Understanding Benchmarks
The report shows how your metric compares to:
Organization average - Your company's overall PR merge rate
Team average - Your team's collective merge rate
Percentile ranking - Where you fall relative to peers (e.g., 75th percentile means your merge rate is higher than 75% of your peers')
💡 Tip: Use benchmarks as context, not targets. Your team's goals, codebase requirements, and quality standards matter more than external comparisons.
Best Practices
✅ Do's
Look at trends over 4-6 weeks - Weekly data can be noisy; focus on patterns
Combine with other metrics - Use alongside PR Cycle Time, Review Depth, and Test Coverage for full context
Segment by experience level - Junior engineers naturally have different rates than seniors
Normalize for project phases - Expect lower rates during planning/research phases and higher rates during implementation sprints
Celebrate improvements - Recognize teams that improve their velocity sustainably
❌ Don'ts
Don't create individual targets - Encourages gaming the metric (tiny PRs, bypassing review)
Don't compare across very different projects - Infra work ≠ feature work ≠ bug fixes
Don't ignore context - One developer working on complex architecture may legitimately have lower merge rates
Don't react to single-week variations - Wait for sustained patterns before acting
Don't use as sole performance indicator - Combine with quality metrics, collaboration metrics, and qualitative feedback
Frequently Asked Questions
Why does this metric matter?
PR merge rate is a leading indicator of team velocity and throughput. It helps you:
Plan capacity - Understand realistic delivery rates for sprint planning
Spot problems early - Declining rates often signal issues before they become critical
Support team members - Identify when someone might need help or coaching
Measure process improvements - See if changes to CI/CD, review processes, or tooling improve velocity
How is "active coding day" defined?
An active coding day is any day when a developer:
Made commits to a repository
Opened, reviewed, or merged a pull request
Had development activity tracked by Span's integrations
Was not marked as out-of-office in your HRIS system
This ensures the metric only reflects actual working time.
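In code form, the definition above roughly corresponds to a predicate like this; the per-developer date sets are hypothetical inputs, not fields exposed by Span.

```python
from datetime import date

def is_active_coding_day(day: date,
                         commit_days: set[date],
                         pr_activity_days: set[date],
                         tracked_activity_days: set[date],
                         ooo_days: set[date]) -> bool:
    """Approximation of the 'active coding day' definition above.

    A day counts if there was any tracked development activity and the
    developer was not marked as out-of-office in the HRIS system.
    """
    if day in ooo_days:
        return False
    return (day in commit_days
            or day in pr_activity_days
            or day in tracked_activity_days)
```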
What counts as a "merged PR"?
A PR is counted when:
✅ Status is "Merged" (not draft, closed, or reverted)
✅ Merged by a developer contributor (not bots or automation)
✅ Merged on an active working day
❌ Does NOT count: Reverted PRs, draft PRs, closed-without-merge PRs
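Expressed as a filter, the counting rules above look roughly like this; the field names are hypothetical.

```python
from datetime import date

def counts_as_merged(pr: dict, active_days: set[date]) -> bool:
    """Mirror of the counting rules above (illustrative field names)."""
    return (
        pr["status"] == "merged"             # not draft, closed, or reverted
        and not pr["reverted"]
        and not pr["author_is_bot"]          # exclude bots and automation
        and pr["merged_on"] in active_days   # merged on an active working day
    )
```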
Why might my merge rate be lower than expected?
Common reasons:
Larger PRs - Fewer but bigger PRs will show lower rates
Higher review standards - More thorough reviews slow merge velocity
Complex codebases - Infrastructure or architecture work takes longer
Recent team changes - New team members or restructuring affects velocity
Project phase - Research, planning, or POC phases have naturally lower merge rates
What's a "good" PR merge rate?
There's no universal "good" rate - it depends entirely on your context:
Team structure - Small teams often have higher rates per person
Codebase type - Frontend features merge faster than backend infrastructure
Quality standards - Teams with stringent testing/security requirements merge slower
PR sizing practices - Teams that break work into smaller PRs have higher rates
Use YOUR historical data as the benchmark, not external industry numbers.
Can I export this data?
Yes! Use the export options available on the report page:
CSV export - Download raw data for further analysis
PDF report - Generate a formatted report for sharing
Share link - Share a live view with team members
Schedule email reports - Receive regular updates in your inbox
Related Metrics
Combine this report with related metrics for deeper insights:
| Metric | What It Shows | Why Use Together |
| --- | --- | --- |
| PR Cycle Time | Time from PR open to merge | If merge rate is low but cycle time is high, you have a review bottleneck |
| Commits per Week | Commit frequency | High commits but low merges may indicate PRs stuck in review |
| PR Diff Size | Average lines changed per PR | Smaller PRs should correlate with higher merge rates |
| % PRs Merged with Tests | Test coverage in PRs | Ensure high merge rates don't come at the cost of quality |
| PR Revert Rate | How often PRs are reverted | A high merge rate with a high revert rate signals quality issues |
Appendix: Technical Details
Calculation Formula
PRs merged / week = Σ PRs merged on active days ÷ Σ active coding days
Data Sources
VCS Integrations: GitHub, GitLab, Azure DevOps
HRIS Integration: For out-of-office data (optional)
Refresh Rate: Near real-time (hourly incremental updates)
Data Quality
Only includes PRs with sufficient analyzable code (30%+ of lines)
Excludes bot-generated PRs and automated merges
Normalizes for time zone differences