As Agile Practitioners, we are usually the only one on our team. Nobody questions our decisions, apart from the development team (constantly and nonstop 😉, which we appreciate!). But we do not have another person from the same profession challenging what we do. Who makes sure that we deliver high-quality work? How do we learn from each other? How do we get new ideas? How do we stay on top of our game? How do we stay aligned in our approach to agile development? And how do we make sure that everyone can handle the workload and the pushback we get from the teams?
Recently, these questions were flying around the Stuart Agile team, and we were looking for ways to answer them. Of course, we had already implemented initiatives that mitigated some of these risks. For example, we created the “brain meeting”, a space to deep-dive into agile discussions and align on practices, and team meetings to talk about organisational topics, workload, and happiness. But we asked ourselves: is that enough?
What would we suggest to our development teams if they came to us with these issues and questions? The answer was super clear: code reviews.
Why is this standard tool, which we expect all of our development teams to use, not part of our own way of working? We decided to look into it and figure out how we could adapt the best practice of code reviews to our work as agile practitioners.
What we proposed
First, we needed to clarify what our “code” was, or in other words, what exactly we wanted to have reviewed. We brainstormed some ideas and quickly found out this was not a trivial topic. We decided to call them “deliverables”, which included things like workshops, training sessions, retrospectives, kick-off meetings, how we run stand-ups, etc. It was clear that not all of these deliverables could be reviewed in the exact same way, or with the same cadence, if we wanted to make the process effective and avoid overwhelming the team.
For deliverables that consisted of something we created (a training session, documentation, an article…) it was apparent that we could review them asynchronously and collect feedback. The challenge was the non-tangible things, such as reviewing certain meetings. On those occasions, we suggested organising a short session, around 30 minutes, with the reviewer to have a conversation. The proposed agenda for those conversations was:
- Giving context about the team and the goal you had in mind with the session
- Explaining how the session went (sticking to the facts) and what the outcome was
- Commenting on challenges and potential issues you encountered during the session: for instance, things that didn’t work as expected, problems with engaging people, pushback, etc.
- Finally, receiving feedback from the reviewer on all the points above, together with suggested alternatives when appropriate
Additionally, as this was a new practice, we proposed ending each session by collecting feedback about the session itself. We wanted to know whether the reviews were useful for the team and to spot potential areas of improvement.
The next question we needed to answer was when the reviews should happen. Should we do one every time we work on something intended to help our teams? After every stand-up? When does it make sense to do an agile review, and when doesn’t it?
We concluded that this also depended on what we were reviewing. So we proposed dividing deliverables into two groups: Deliverables that need reviews every time and deliverables that only need it periodically.
Deliverables that required a review every time included things such as retrospective outcomes, training sessions, big changes happening in the teams, and kick-off sessions for new teams.
On the other hand, deliverables that don’t require a review every time included things like one-to-one sessions, refinement sessions, preparations for retrospectives, and stand-ups. In this group, we agreed that reviews could be requested on demand whenever we felt it was necessary, but we also established a baseline cadence. For instance, every quarter we review how we approach stand-ups.
Finally, we proposed the creation of a specific Slack channel to request these reviews. Any member of the Agile team could volunteer to review any deliverable from the others. Multiple reviewers were also allowed where it made sense.
Ah! We also needed a name, since Code Reviews didn’t make sense for us. So we decided to simply call our new method Agile Reviews.
Implementation of the idea
After developing the idea, we ran two test reviews to try it out, get a feel for what it is like to be reviewed and to be the reviewer, and gather some initial ideas on where we could improve. With the insights from these tests, we made some adjustments to the agenda and to the expected time needed for specific points.
We also talked to some developers to deepen our knowledge of code reviews and pick up some tips and tricks from them. The main insight from these conversations was that we should add the following ground rules:
- Making sure to give feedback with empathy
- Not being too picky
As the next step, we presented the idea to the team, discussed the details, and got their buy-in to try it out. One learning from that session was that we had to define the deliverables in more detail, as well as make the goals and non-goals very explicit. The goals we agreed on were:
- Learn from each other
- Ensure the quality of our work
- Align on “how to do things” (identify areas in which we want to be aligned, similar to coding standards in the engineering world)
- Identify blockers or hidden problems
- Identify hidden personal struggles (is the agile practitioner happy in the team, stressed, unhappy, or dealing with personal issues in the team?)
- Have everyone work the same way
How did it go?
We have been using agile reviews for three months in the team and so far we have done over 60 reviews in a team of eight people (including the review of this blog article!).
The feedback has been generally positive, helping us improve the quality of the work we deliver to the teams. It has also helped us share information. A good example of this was a small presentation about how to improve the way we give and receive feedback. This presentation was originally created for a specific team to help them address some issues. However, after the agile reviews, some of the reviewers were interested in using the same materials with their own teams.
Another thing we have observed over this period is that there is a lot of interest in knowing what others in the Agile team are doing. When preparing this proposal, one of our concerns was that it could become difficult to find volunteers to be reviewers and that it could create too much overhead in the team. Far from it! What we have seen is that on many occasions there are multiple volunteers for a review. This is great because we are enabling people to share and learn new things, and it also allows the reviewee to get more opinions.
We can corroborate that we achieved some of our goals with the feedback that we have collected. Some examples are:
- “Overall it has helped me see other structures and different points of view.”
- “I like the feedback very much (when I present something that I did), it’s an improvement goal for me and it works.”
- “The quality of our work has gone up, as has team collaboration and engagement.”
- “I feel the sharing of ideas has been really fruitful, it allows us to synthesise our thoughts and to learn from one another”
It’s also fair to mention that some people said the reviews made them a bit anxious; the most recurring points were:
- Fear of missing out on what was going on if they couldn't review certain things
- Having a bit of a "competition" feeling when they were not submitting things for review while others did
- Feeling that introducing the reviews may have slowed us down, as the process sometimes takes too long
These are undesired side effects, and we are going to discuss them with the team. But we believe we can learn a lot from these insights as well: developers may have similar concerns about code reviews, and this experience can help us understand how they feel.
Overall, we are happy with the results and we have learned a lot from them. This practice will definitely continue and, at the same time, we will keep collecting feedback and making changes to remove pain points and help participants feel less anxious.
After this experience, we are convinced that, just as code reviews are a best practice for development teams, agile reviews are a great way to boost the quality, knowledge sharing, and collaboration in a team of agile practitioners. We would definitely recommend trying something similar to all the agile teams out there!