Our Evolution of Sprint Reviews

Since going full-fledged Agile in the Fall of 2011, Valpak’s approach to Sprint Reviews has evolved. As we’ve scaled Agile, we’ve been regularly inspecting and adapting (as you would expect) across our 9 Scrum teams and 1 Kanban team. Today, we have a Sprint Review approach in place that operates like a well-oiled machine and scales well as you add more teams. Let me share with you the evolution of our Sprint Reviews. But first, some background on our Scrum process …

At Valpak, we sprint in 2-week iterations running from a Monday to a Friday across 9 teams and 50+ team members.  Because of heavily integrated systems and environments, all teams sprint on the same common sprint schedule.  This means that all teams hold Sprint Planning and Sprint Reviews on the same day.

Sprint Review 1.0: The Every-Team-For-Themselves Approach

Our first take at Sprint Reviews was pure madness! Each ScrumMaster would schedule a separate 1- to 2-hour meeting on the same Friday for their Scrum team. That’s nine 1- to 2-hour meetings within an 8-hour business day, and on a Friday no less! That approach works fine for Sprint Planning, which doesn’t involve the stakeholders, but royally sucks for Sprint Reviews. As much as the ScrumMasters tried to coordinate the independent Sprint Reviews, stakeholders would receive overlapping meeting notices and were forced to decide which Sprint Review to attend and which to miss. Amid much confusion and chaos, we quickly discovered that teams had many stakeholders in common and that we needed a Sprint Review approach that allows stakeholders to see demonstrations from all the Scrum teams they have a stake in. It’s just not possible to be in two, three, or even nine places at once! Logistically, this approach just didn’t scale.

Sprint Review 2.0:  The 180-Minute-Marathon-Mob Approach

Our second take at Sprint Reviews solved the logistical problems of the 1.0 approach and scaled much better, but still wasn’t quite right. Our 2.0 approach was basically a single Sprint Review across all 9 Scrum teams. Quite frankly, it was an exhausting 180-minute marathon with a mob of stakeholders in the biggest room we could find. Each team would go up to the front of the room and present for about 20 minutes. This included a review of the Sprint accomplishments and metrics from the ScrumMaster, followed by a demonstration of working software by the team members. This approach certainly proved that we could successfully aggregate the Sprint Reviews across all teams for all stakeholders, but it had some challenges of its own. First off, trying to pay attention to anything or anyone for 3 hours straight is a challenge, even more so when the presenters aren’t necessarily professional speakers. Second, there was just too much dead air and too many awkward pauses with the constant switching of speakers, teams, laptops, decks, etc. The result was a disengaged audience of stakeholders.

Sprint Review 3.0:  The Science Fair Approach

The challenges of the 2.0 approach to our Sprint Reviews led us to a modified approach we call “The Science Fair”. Our 180-minute marathon of Sprint Reviews was shortened to 90 minutes, with each team presenting for just 10 minutes. Within that 10 minutes, the ScrumMaster provides a condensed version of the Sprint accomplishments and metrics, and the team demonstrates only the sexy stuff (via live demo and/or presentation format, depending on the feature). Oh, and we now incorporate handheld and table-top microphones to better project the voices of the speakers. Following that 90-minute period, we hold a one-hour Science Fair in the cube areas where the teams sit. Just like parents viewing student experiments at a school Science Fair, this hour is for stakeholders to make the rounds, at their own pace, to each team they have an interest in for any given sprint. Stakeholders can informally visit with each of the teams (and their ScrumMasters and Product Owners) at their everyday work spaces to ask questions and see greater detail on what was accomplished during the sprint. This might be in the form of casual Q&A or real-life demonstrations of working software. Over time, we have found the shortened Sprint Reviews keep the audience engaged, while the Science Fair meets the needs of stakeholders wanting greater detail or more personalized attention.

One thing is for certain: change! We are bound to further evolve our approach to Sprint Reviews again and again and again. That’s right, we most certainly will inspect and adapt, and this time next year, we might be doing something completely different. All in all, we are doing what works for us today in keeping with our Agile ways.

About Stephanie Davis

Stephanie Davis is Senior Director of Enterprise Agility & Digital Product Leadership at Cox Target Media. In this role, Stephanie champions agility across the enterprise, leads the Agile PMO, and manages the CTM portfolio while also overseeing the Digital Product Leadership team. Stephanie leads the team of Agile Project Leaders in the roles of ScrumMaster, Kanban Lead, and/or Agile Project Manager as well as overseeing the IT Business Analysts. Stephanie is a career project leader with over 16 years in the field, including past positions with AT&T Business and IBM Global Services, and has maintained the Project Management Professional (PMP) certification since early in her career. She also maintains the PMI-Agile Certified Practitioner (ACP) and Certified Scrum Master (CSM) credentials. Her academic credentials include a BS in Marketing and an MBA in International Business. Stephanie serves and supports Agile within her community and beyond as organizer for the Tampa Bay Agile Meetup and the Agile Open Florida. Most recently, Stephanie has been elected to the Agile Alliance board for the 2015 to 2017 term. View all posts by Stephanie Davis

7 responses to “Our Evolution of Sprint Reviews”

  • Mark Levison

    Stephanie – the story of your Sprint Reviews echoes many themes I’ve heard in the past. You might be interested in reading the approach that Bob Galen and his team took: http://www.infoq.com/articles/agile-project-manager-viola

  • Rathina

    Stephanie – the Science Fair approach is a good one, but it could become a little chaotic if all the stakeholders try to jump onto one team due to something exciting in the demo. Sometimes teams do not get enough visibility from stakeholders because of more exciting features from other teams.
    One way to manage this is to give sticky notes to all the stakeholders and ask them to post their first 3 (1, 2, and 3) preferences of interactions with the teams. Then we can see on the wall if any team is not getting enough attention or any team is generating too much enthusiasm, and we can have a more “Balanced Science Fair”.

  • Where Are We In Our Agile Journey?: Agile Survey Results | i.am.agile

    […] all agile process elements were perceived to have some value, Sprint Review/Science Fair, Retrospectives, and Sound Bites had low value scores (not surprising, since 3 out of 4 are […]

  • charlesbradley

    Stephanie,

    One of the main purposes of the Sprint Review is to adapt the Product Backlog based on the feedback solicited by the demos. Does your process include a re-joining of the group after the Science Fair to do this?
