PR cycle time measures the elapsed time from when a pull request is opened to when it is merged. It captures the full code review and iteration process — writing the description, getting reviews, addressing feedback, waiting for CI, and finally merging.

What Periscope tracks

Periscope computes cycle time from GitHub PR merge events. The dashboard shows:
  • Percentiles — p50, p75, and p95 cycle times (in hours)
  • Average cycle time
  • Weekly trends showing how cycle time changes over time
  • Individual PR data for identifying outliers

How it is calculated

cycle_time = pr.mergedAt - pr.createdAt
Periscope captures this directly from GitHub’s pull_request webhook event when a PR is closed and merged, using the created_at and merged_at timestamps GitHub supplies. Percentiles are computed across all merged PRs in the selected time range for your monitored repositories.

Interpreting the data

  • p50 under 24 hours is a strong indicator of healthy review practices and good team flow.
  • p50 over 72 hours typically signals bottlenecks — slow reviews, large PRs, or CI pipeline issues.
  • A large gap between p50 and p95 means most PRs flow well but some get stuck. Investigate the tail: are they large PRs, PRs from specific contributors, or PRs to specific services?
  • An increasing weekly trend may indicate growing review load from a larger team, accumulating tech debt, or process friction.
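Investigating the tail can start from the raw cycle times. A minimal sketch using nearest-rank percentiles (the docs do not specify Periscope's exact percentile method, and the sample values are made up):

```python
def percentile(values: list[float], p: float) -> float:
    """Nearest-rank percentile; assumes at least one value."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * (len(ordered) - 1))))
    return ordered[k]

# Hypothetical cycle times in hours for one week of merged PRs.
cycle_times = [2, 3, 4, 5, 6, 8, 10, 12, 18, 96]

p50 = percentile(cycle_times, 50)   # 6
p95 = percentile(cycle_times, 95)   # 96
# PRs at or beyond p95 are the stuck tail worth a closer look.
stuck = [t for t in cycle_times if t >= p95]
```

Here p50 looks healthy while one 96-hour PR dominates the tail, the pattern the bullet above describes.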

Common causes of long cycle times

  • Large PRs that are hard to review (see size vs time)
  • Insufficient reviewer capacity or unclear ownership
  • Slow CI pipelines blocking merge
  • Timezone misalignment between author and reviewers
  • PRs waiting for manual QA or product sign-off

Reducing cycle time

  • Break work into smaller PRs (under 400 lines)
  • Set review SLAs and use PR assignment or CODEOWNERS
  • Invest in faster CI — flaky or slow tests are the biggest hidden tax
  • Use draft PRs to get early feedback before the full review
  • Automate what you can — auto-merge when CI passes and approvals are met
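The auto-merge suggestion can be wired up with GitHub's gh CLI, which merges a PR automatically once required approvals and status checks pass. A sketch (the PR number and squash strategy are placeholders):

```python
import subprocess

def automerge_command(pr_number: int, strategy: str = "squash") -> list[str]:
    """Build a `gh pr merge --auto` invocation; gh completes the merge
    once branch protection requirements (CI, approvals) are satisfied."""
    return ["gh", "pr", "merge", str(pr_number), "--auto", f"--{strategy}"]

cmd = automerge_command(1234)
# subprocess.run(cmd, check=True)  # uncomment to actually enable auto-merge
```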

Cycle time vs lead time

These two metrics are related but measure different things:
  Metric                 | Measures                      | Data source
  PR cycle time          | PR open to merge              | GitHub
  Lead time for changes  | PR merge to production deploy | GitHub + CI/CD
Together they capture the full journey: how long it takes to get code reviewed and merged (cycle time), and then how long until that merged code is live (lead time).
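Since the two metrics share the merge timestamp, they add up to the full journey. A worked example with illustrative timestamps only:

```python
from datetime import datetime

def hours_between(start: str, end: str) -> float:
    """Elapsed hours between two ISO 8601 UTC timestamps."""
    def parse(s: str) -> datetime:
        return datetime.fromisoformat(s.replace("Z", "+00:00"))
    return (parse(end) - parse(start)).total_seconds() / 3600

created_at  = "2024-05-01T09:00:00Z"   # PR opened
merged_at   = "2024-05-02T15:00:00Z"   # PR merged
deployed_at = "2024-05-02T18:00:00Z"   # change reaches production

cycle_time = hours_between(created_at, merged_at)    # 30.0
lead_time  = hours_between(merged_at, deployed_at)   # 3.0
total      = hours_between(created_at, deployed_at)  # 33.0 (full journey)
```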

MCP tool

Query PR cycle time from your AI coding assistant:
get_pr_cycle_time(time_range: "30d")
Returns p50, p75, p95, average, weekly trend data, and up to 20 sample PRs.