

April 10, 2018

My Take on Agile Assessments


Are you at the start of your Agile journey? Or do you want an outside view of how well your journey is going? We find that organizations often request an Agile assessment to learn more about their current state and what the next steps ought to be.

The Challenge

The challenge is that most assessments are one-size-fits-all approaches. Do a search on the term “Agile Assessment” in Google and you get oodles of approaches and views. Go ahead, I’ll wait…


All that stuff sounds good, right? Yes… and I have two main worries with most of these.

The first is their one-size-fits-all approach, which limits their ability to gauge an organization’s agility. Most of these assessments emphasize the use of specific practices. A few look more closely at the interactions between people, but even when that deeper consideration is made, little real analysis occurs once the question is ‘answered.’ The perfect little scorecards create the impression that the consultant or coach is all-knowing and thoroughly familiar with the organization’s challenges.

This brings me to my second worry. Without deeper analysis, it becomes very difficult to provide meaningful insights and recommend possible next steps, such as experiments for the organization to attempt or future lines of inquiry. The common assessment frameworks also don’t offer any tools for organizations to understand their choices and to analyze results as they move forward on their own.

Exploration, Not Assessment

In my work, I prefer to avoid the word ‘assessment’ because it sounds so judgmental. Thanks to our own Hareem Mannan (@okhareem), I now use a different term: I do Explorations. Even if a client wants me to call it an assessment, I treat it more like a spelunking expedition. I dive deep into the organization and don’t limit myself to understanding how ‘agile’ they are; I focus on their effectiveness.

So, what is the difference? The work involves a deeper series of conversations and the resulting output reflects this greater detail. If someone starts discussing problems with their work, I don’t restrict myself to examining, for example, how well product ownership is done or how well user stories are created and prioritized. I focus instead on asking open-ended questions to get a sense of their perceptions. I’ll observe the work patterns to see how these perceptions play out over time. The final output becomes a deep analysis that isn’t intended to be ‘right,’ but rather aims to provoke a conversation about what was observed and/or described.

This tailors the resulting output to each organization, and I don’t mean just a different version of the same bar graph or radar chart. The artifacts I create embed cultural models and use them to describe how the culture manifests within the organization (and perhaps how employees might want to see it change). I may use causal loops or other diagrams to explain systemic effects within the organization. I may create other visualizations to explore perceptions of the business model and important work processes. I may leverage conceptual models to foster conversation on important attributes like trust, collaboration, or communication. In this way, the organization learns tools it can use with or without Excella’s coaching. If an external coach’s goal is to eventually wean an organization off needing them, I might as well start building that independence from the beginning.

A Different Artifact

How I present this final artifact also differs and is customized to the client. It usually becomes a presentation with lengthy conversation throughout; sometimes I integrate exercises to help stakeholders determine actions they can take or to help reinforce an important concept. These strategies are particularly effective with executives or managers who can foster change within the organization.

As an example, I recently worked with a senior manager who insisted his teams were working on only one project at a time; he didn’t realize how many efforts his staff members were simultaneously supporting. To bring this home to him, I ran an exercise during the debrief in which people anonymously noted how many projects they were working on. I collected the responses and ran the numbers in real time, and the senior manager saw what each of his staff members was actually supporting: six projects simultaneously, on average. This shifted his perspective, surfaced an important issue, and got him thinking about how he might help solve it.

After all, if you are hiring us, we want the engagement to be the start of a good Agile coaching relationship, as our partner Jeff Gallimore has described.
