Since going full-fledged Agile in the Fall of 2011, Valpak’s approach to Sprint Reviews has evolved. As we’ve scaled Agile, we’ve been regularly inspecting and adapting (as you would expect) across our 9 Scrum teams and 1 Kanban team. Today, we have a Sprint Review approach in place that operates like a well-oiled machine and also scales well as you add more teams. Let me share with you the evolution of our Sprint Reviews. But first, some background on our Scrum process …
At Valpak, we sprint in 2-week iterations running from a Monday to a Friday across 9 teams and 50+ team members. Because of heavily integrated systems and environments, all teams sprint on the same common sprint schedule. This means that all teams hold Sprint Planning and Sprint Reviews on the same day.
Sprint Review 1.0: The Every-Team-For-Themselves Approach
Our first take at Sprint Reviews was pure madness! Each ScrumMaster would schedule a separate 1- to 2-hour meeting on the same Friday for each Scrum team. That’s nine 1- to 2-hour meetings within an 8-hour business day, and on a Friday no less! That approach works fine for Sprint Planning, which doesn’t involve the stakeholders, but royally sucks for Sprint Reviews. As much as the ScrumMasters tried to coordinate the independent Sprint Reviews, stakeholders would receive overlapping meeting notices and were forced to decide which Sprint Review to attend and which to miss. With much confusion and chaos, we quickly discovered that teams had many stakeholders in common and that we needed a Sprint Review approach that allows stakeholders to see demonstrations from all the Scrum teams they have a stake in. It’s just not possible to be in two, three, or even nine places at once! Logistically, this approach just didn’t scale.
Sprint Review 2.0: The 180-Minute-Marathon-Mob Approach
Our second take at Sprint Reviews solved the logistical problems of the 1.0 approach and scaled much better, but still wasn’t quite right. Our 2.0 approach was basically a single Sprint Review across all 9 Scrum teams. Quite frankly, it was an exhausting 180-minute marathon with a mob of stakeholders in the biggest room we could find. Each team would go up to the front of the room and present for about 20 minutes. This included a review of the Sprint accomplishments and metrics from the ScrumMaster, followed by a demonstration of working software by the team members. This approach certainly proved that we can successfully aggregate the Sprint Reviews across all teams for all stakeholders, but it had some challenges of its own. First off, trying to pay attention to anything or anyone for 3 hours straight is a challenge; even more so when you have people who aren’t necessarily professional speakers. Second, there was just too much dead air and too many awkward pauses with the constant switching of speakers, teams, laptops, decks, etc. The result was a disengaged audience of stakeholders.
Sprint Review 3.0: The Science Fair Approach
The challenges of the 2.0 approach to our Sprint Reviews led us to a modified approach we call “The Science Fair”. Our 180-minute marathon of Sprint Reviews was shortened to 90 minutes, with each team presenting for just 10 minutes. Within those 10 minutes, the ScrumMaster provides a condensed version of the Sprint accomplishments and metrics, and the team demonstrates only the sexy stuff (via live demo and/or presentation format, depending on the feature). Oh, and we now incorporate handheld and table-top microphones to better project the voices of the speakers. Following that 90-minute period, we hold a one-hour Science Fair in the cube areas where the teams sit. Just like parents viewing student experiments at a school Science Fair, this hour is for stakeholders to make the rounds, at their own pace, to each team they have an interest in for any given sprint. Stakeholders can informally visit with each of the teams (and their ScrumMasters and Product Owners) at their everyday work spaces to ask questions and see greater detail on what was accomplished during the sprint. This might be in the form of casual Q&A or real-life demonstrations of working software. Over time, we have found the shortened Sprint Reviews keep the audience engaged, and the Science Fair meets the needs of stakeholders wanting greater detail or more personalized attention.
One thing is for certain … and that is change! We are bound to further evolve our approach to Sprint Reviews again and again and again. That’s right, we most certainly will inspect and adapt, and, this time next year, we might be doing something completely different. All in all, we are doing what works for us today in keeping with our Agile ways.
June 15th, 2012 at 5:00 pm
Stephanie – the story of your Sprint Reviews echoes many themes I’ve heard in the past. You might be interested in reading the approach that Bob Galen and his team took: http://www.infoq.com/articles/agile-project-manager-viola
June 19th, 2012 at 5:54 pm
Stephanie – the Science Fair approach is a good one, but it could become a little chaotic if all the stakeholders try to jump onto one team due to something exciting in the demo. Sometimes teams do not get enough visibility from stakeholders because of more exciting features from other teams.
One way to manage this is to give sticky notes to all the stakeholders and ask them to post their top 3 preferences (ranked 1, 2, and 3) for interactions with the teams. Then we can see on the wall whether any team is not getting enough attention, or whether any team is generating too much enthusiasm. With that visibility, we can run a more “Balanced Science Fair”.
August 1st, 2013 at 1:40 pm
[…] all agile process elements were perceived to have some value, Sprint Review/Science Fair, Retrospectives, and Sound Bites had low value scores (not surprising, since 3 out of 4 are […]
October 30th, 2013 at 10:22 pm
One of the main purposes of the Sprint Review is to adapt the product backlog based on the feedback solicited by the demos. Does your process include a re-joining of the group after the Science Fair to do this?
October 31st, 2013 at 2:27 pm
Of course! The POs work with their teams and stakeholders offline and during groomings to take into account Sprint Review feedback to adapt the product backlog.
October 31st, 2013 at 3:23 pm
Do you think, in a multi-team/one-product situation, there is a lot of value in providing transparency to this kind of “after the demo” feedback and product backlog evolution as part of the Sprint Review event itself? i.e., where everyone sharing the same product ROI goal gets to see how the product and backlog will evolve as a result of their collective work?
November 4th, 2013 at 1:33 pm