
Sprint Performance Report in Time in Status – FAQ

The Sprint Performance Report in the Time in Status app gives Scrum teams a full, visual picture of a completed sprint: velocity, workload, completion rate, scope changes, and where time actually went. Instead of manually piecing together multiple Jira views, you get a single report that explains what happened in the sprint and why—ready to support planning and retrospectives.

1. What is the Sprint Performance Report in the Time in Status app?

The Sprint Performance Report is a visual, read-only sprint analysis inside the Time in Status app for Jira. It:

  • Works on boards with sprints enabled.

  • Uses the board’s estimation method: Story Points, Work Item Count, or Original Time Estimate.

  • Shows completed sprint data in sections: Sprint information, Team Velocity, Workload, Completion rate, Committed, Completed, and Scope change.

It’s designed to give you a comprehensive sprint story (context, execution, scope changes, and outcomes) rather than just raw numbers.

2. How does the Sprint Performance Report decide which work items and estimation metric to use?

Two rules define what you see:

  1. Sprint scope

    • The report includes issues based on the board’s JQL filter and the sprint they belong to.

    • Only issues within the selected sprint and visible to that board are analyzed.

  2. Estimation metric

    • The report calculates everything using the same estimation method as the board:

      • Story Points – uses the story points field.

      • Work Item Count – counts issues.

      • Original Time – uses Original Time Estimate on issues.

    • You choose the estimation method in the board:

      • Board settings → Configure board → Estimation (or equivalent setting in team-managed/“next-gen” projects).

All metrics (Committed, Completed, Velocity, Workload, and Scope Change) are then calculated in that unit.
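
If you want to sanity-check the report’s numbers outside the app, the sketch below shows one way to pick an issue’s contribution for each estimation method. The SprintIssue shape and field names (storyPoints, originalEstimateSeconds) are illustrative assumptions for this example, not the app’s internal model.

type EstimationMethod = "storyPoints" | "workItemCount" | "originalTime";

interface SprintIssue {
  key: string;
  storyPoints?: number;             // Story Points custom field (assumed name)
  originalEstimateSeconds?: number; // Original Time Estimate in seconds (assumed name)
}

// Return the value an issue contributes, in the board's estimation unit.
function estimationValue(issue: SprintIssue, method: EstimationMethod): number {
  switch (method) {
    case "storyPoints":
      return issue.storyPoints ?? 0;
    case "workItemCount":
      return 1; // every issue counts as one work item
    case "originalTime":
      return issue.originalEstimateSeconds ?? 0;
  }
}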

3. How do I open the Sprint Performance Report in Jira?

You can access the report directly from the board:

  • Go to your board with sprints enabled.

  • Click More in the top right corner of the board.

  • Select Sprint Performance Report from the menu.

You’ll then pick the sprint you want to analyze and, if needed, the estimation metric (Story Points, Work Item Count, or Original Time).

4. What does the “Sprint information” section show and how should I use it?

The Sprint information section gives you the sprint’s basic context plus time-based signals:

  • Sprint name, date range, and goals – what you planned and when.

  • Flagged Work Items – total number of flagged issues during the sprint.

  • Logged Time – total worklog time captured between sprint start and completion.

  • Status Time – total time issues spent in statuses during the sprint (excluding the first and last board status).

  • Sprint Work Item Structure chart – percentage share of each issue type in the sprint (e.g., Stories vs Bugs vs Tasks).

Use it to quickly answer:

  • Was the sprint heavy on bugs, maintenance, or new features?

  • Were many items flagged? Does that correlate with delays or missed goals?

  • Did we spend most time in “active” statuses or in waiting/review statuses?

5. How is Team Velocity calculated in the Sprint Performance Report?

The Team Velocity section shows Committed vs Completed work over the last 7 completed sprints, including the selected sprint:

  • Committed

    • Total estimation of issues in the sprint at the start, excluding items already in the board’s final status.

    • Measured in Story Points, Original Time, or Work Item Count.

  • Completed

    • Total estimation of issues that reached the board’s final status by the sprint’s end.

  • Average Velocity

    • Average completed value across the last 7 sprints:

    • Average Velocity = (Completed Sprint 1 + … + Completed Sprint 7) / 7
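
As a quick illustration, here is a minimal sketch of that averaging, assuming you already have the Completed value of each recent sprint in the board’s estimation unit (the function name and inputs are examples, not the app’s code):

// Average velocity across the last completed sprints (up to 7).
function averageVelocity(completedPerSprint: number[]): number {
  const recent = completedPerSprint.slice(-7); // keep the last 7 completed sprints
  if (recent.length === 0) return 0;
  return recent.reduce((sum, value) => sum + value, 0) / recent.length;
}

// Example: averageVelocity([21, 18, 25, 20, 19, 22, 24]) ≈ 21.3 story points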

Use this section to:

  • Check if your forecast is honest (committed vs completed).

  • See if velocity is stable, rising, or inconsistent, which directly affects planning and stakeholder expectations.

6. What does the Workload section tell me about assignees?

The Workload section shows how work was distributed among team members during the sprint:

  • Each assignee is represented by a stacked bar with:

    • Committed work at sprint start.

    • Added work assigned during the sprint.

    • Removed work taken away during the sprint.

  • Positive portions (above zero) show committed + added.

  • Negative portions (below zero) show removed work.

  • Unassigned items may also appear as part of the chart.
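
For readers who want to reproduce the chart’s totals, here is a rough sketch of the tallying, assuming a simplified change record per issue (the SprintChange shape is an assumption for this example, not the app’s data model):

interface SprintChange {
  assignee: string;   // assignee display name, or "Unassigned"
  estimation: number; // value in the board's estimation unit
  kind: "committed" | "added" | "removed";
}

// Sum committed, added, and removed work per assignee.
function workloadByAssignee(changes: SprintChange[]) {
  const totals = new Map<string, { committed: number; added: number; removed: number }>();
  for (const change of changes) {
    const row = totals.get(change.assignee) ?? { committed: 0, added: 0, removed: 0 };
    row[change.kind] += change.estimation;
    totals.set(change.assignee, row);
  }
  return totals; // positive bar = committed + added; negative bar = removed
}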

This section answers:

  • Who started the sprint overloaded or underloaded?

  • Who had a lot of scope added mid-sprint?

  • Did we shift too much work away from some people and onto others?

It’s ideal for workload balancing, preventing hidden hero bottlenecks, and spotting scope churn at the individual level.

7. How are Completion rate, Incompleted, and Carryover calculated? Why can completion exceed 100%?

The Completion rate block shows how much of the committed work was finished, plus what was carried over:

  • Completion rate (%)

    • Completed / Committed * 100

  • Completed (%)

    • Same formula, focusing only on finished work compared to initial commitment.

    • Can exceed 100% if the team completes more work than originally committed (e.g., extra tasks added mid-sprint and finished).

  • Incompleted (%)

    • Incompleted / Committed * 100

    • Incompleted = committed work that wasn’t completed by the sprint end.

  • Carryover (%)

    • Carryover / Committed * 100

    • Carryover = incomplete work that was moved into a later sprint (whether that sprint is active or already completed).

    • Often Incompleted ≈ Carryover, unless some unfinished tasks are intentionally not continued.
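
The percentages above are simple ratios against the initial commitment. A minimal sketch, assuming all four inputs are already in the board’s estimation unit:

// Percentage metrics relative to the initial commitment.
function completionMetrics(committed: number, completed: number, incompleted: number, carryover: number) {
  const pct = (value: number) => (committed > 0 ? (value / committed) * 100 : 0);
  return {
    completionRate: pct(completed),   // can exceed 100% when extra work is finished
    incompletedPct: pct(incompleted),
    carryoverPct: pct(carryover),
  };
}

// Example: completionMetrics(40, 48, 8, 8)
// → { completionRate: 120, incompletedPct: 20, carryoverPct: 20 }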

Use this section to evaluate:

  • Are we consistently over-committing?

  • Are we hitting 120% completion because of scope chaos, or controlled bonus work?

  • How much work regularly spills over to the next sprint?

8. What do the Committed, Completed, and Scope change sections reveal about priorities and scope stability?

These three sections provide a priority- and scope-focused view:

  • Committed (by priority)

    • Shows how initial sprint commitment is distributed across priorities (e.g., Critical, High, Medium).

    • Helps verify that the team planned around the right priorities.

  • Completed (by priority)

    • Shows how much work was actually finished per priority category by sprint end.

    • Lets you check if high-priority items were truly delivered, not just started.

  • Scope change (Added vs Removed)

    • Pie chart with two slices: Added and Removed work during the sprint.

    • Measured in the board’s estimation unit (Story Points, Original Time, or Work Item Count).

    • Scope change intensity can be expressed as:

      • (Added – Removed) / Committed * 100
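
A one-line sketch of that intensity formula, assuming Added, Removed, and Committed are measured in the same estimation unit:

// Net scope change as a percentage of the initial commitment.
function scopeChangeIntensity(added: number, removed: number, committed: number): number {
  return committed > 0 ? ((added - removed) / committed) * 100 : 0;
}

// Example: scopeChangeIntensity(12, 4, 40) → 20 (the sprint grew by 20% net)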

These views help answer:

  • Did we complete the right work, not just the most visible work?

  • Did mid-sprint scope change derail our original plan?

  • Are we constantly adding “urgent” items but not removing anything in return?

9. How can I use the Sprint Performance Report in sprint reviews and retrospectives?

The Sprint Performance Report is designed to be a shared, objective reference for reviews and retros. You can:

  • Start with Sprint information & Velocity

    • Clarify context (goal, issue types, flagged items) and pace (committed vs completed over 7 sprints).

    • Set realistic expectations for future sprint commitments.

  • Use Completion rate & Workload to talk about planning and workload health

    • Discuss over- or under-commitment based on completion rate and carryover.

    • Check if some people had too much work or too many mid-sprint changes.

  • Use Committed/Completed by priority & Scope change to align with business goals

    • Confirm that priority work was delivered.

    • Show stakeholders the impact of repeated scope changes on completion.

  • Tie it back to Time in Status reports

    • If the Sprint report shows problems, drill into Time in Status, Status Count, or Transition Count reports to find exact bottlenecks and rework loops.

This shifts the retro from opinions (“we were blocked a lot”) to evidence-based coaching conversations (“our completion rate dropped when scope increased by 30% and review statuses doubled in time”).

10. How does the Sprint Performance Report work with other Time in Status features like dashboards and presets?

The Sprint Performance Report is part of the broader Time in Status toolkit:

  • Use Sprint Performance Report for end-of-sprint analysis and retrospectives.

  • Use Dashboard Gadgets (Time in Status gadgets) to monitor ongoing sprint health:

    • Aging issues, Time in Status, Assignee Time, Status Count, etc.

  • Use Status Groups, User Groups, calendars, and time formats in other reports to define how you measure cycle time, lead time, and workload.

  • Combine Sprint Performance insights with core reports (Time in Status, Assignee Time, Average Time, Status Count, Transition Count, Time in Status per Date) to move from “what happened” to “where exactly it happened and how to fix it”.

Together, these features let you automate sprint reporting, keep dashboards live, and base improvement decisions on consistent, repeatable data rather than ad-hoc snapshots.

 If you need help or want to ask questions, please contact SaaSJet Support or email us at support@saasjet.atlassian.net

Haven’t worked with the add-on yet? Give it a try
