How We Work
Minimum Viable Games
At the heart of how we work is a backlog of Minimum Viable Games.
Each Minimum Viable Game is designed so that the team takes on moving a measure that correlates with what is currently constraining Time to Market. For example, a team's first Minimum Viable Game might be to move the team's Say/Do ratio from, say, 50% to 80%.
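As a rough sketch of the arithmetic (definitions of Say/Do vary by team; the one below is an assumption for illustration, not a prescribed formula), a Say/Do ratio can be computed as the share of committed items actually delivered by the date they were committed for:

```python
def say_do_ratio(committed: int, delivered_as_committed: int) -> float:
    """Return the Say/Do ratio as a fraction between 0 and 1.

    committed: number of items the team said it would deliver.
    delivered_as_committed: how many of those landed by the committed date.
    """
    if committed == 0:
        return 0.0
    return delivered_as_committed / committed

# A team that commits to 10 items and lands 5 on time is at 50%;
# a first Minimum Viable Game might target 80%.
print(f"{say_do_ratio(10, 5):.0%}")  # 50%
print(f"{say_do_ratio(10, 8):.0%}")  # 80%
```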
In the process of playing an MVG, what shows up as a blocker is something missing in one of three areas: Tools, Training, or Between Work.
When something is identified as missing in Tools or Training, it's straightforward enough for Company Leaders, Product Owners and Development Teams to deal with. People either have the tools they need to get the job done or they don't. People either know what to do and how to do it or they don't.
Dealing effectively with the area of Between Work however is quite different.
Whereas with Tools and Training we generally make a difference by adding what is missing - tools where tools are missing, training where training is missing - in the case of Between Work the reverse is true.
Most of what is required to be effective in causing teams and teamwork is about making connections and removing barriers - something we call Between Work.
Between Work includes...
- paying attention to what is between ourselves and reality - what gets in the way of doing what we say we are going to do by when we said we would do it,
- paying attention to what is between ourselves and others in teams - what gets in the way of straight communication within and between teams, and
- paying attention to what is between ourselves and fulfilling our intentions - what gets in the way of staying true to what really matters to people about the work they do together.
It is in Doing the Between Work that Bullet Proof Kickass Teams are Forged.
Team Fitness Measures
The program backlog of Minimum Viable Games is built based on the identification of constraints that are impeding flow in the value stream.
Just as a Product Owner might organize their product backlog into a series of Minimum Viable Product Increments for release in market, the program backlog is organized into a series of Minimum Viable Games for the teams participating in or contributing to the value stream.
Measures such as Monthly Active Users (MAU), Customer Lifetime Value (CLTV), Cost of Customer Acquisition (COCA) and AARRR (Acquisition, Activation, Revenue, Retention, Referral) inform the design of Minimum Viable Games that impact the performance of what Product Owners are accountable for - that is, what's constraining growth in adoption of products in-market.
Measures such as Defects in Production, Developer Retention, Velocity, Volatility, and the ratios of Planned versus Unplanned Work and Defects versus Development inform the design of Minimum Viable Games that impact the performance of what Development Teams are accountable for - that is, what's constraining Time to Market.
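To make two of these fitness measures concrete (the exact definitions are team-specific; the ones below are assumptions for illustration), velocity volatility can be taken as the coefficient of variation of sprint velocities, and the planned-versus-unplanned ratio as the planned share of completed work:

```python
from statistics import mean, pstdev

def volatility(velocities: list[float]) -> float:
    """Velocity volatility as coefficient of variation: stdev / mean.

    One common definition, assumed here for illustration; lower is steadier.
    """
    return pstdev(velocities) / mean(velocities)

def planned_ratio(planned: int, unplanned: int) -> float:
    """Fraction of completed work items that were planned."""
    return planned / (planned + unplanned)

# Hypothetical data: story points per sprint, and a month of work items.
velocities = [20, 34, 18, 40]
print(round(volatility(velocities), 2))
print(round(planned_ratio(planned=42, unplanned=18), 2))  # 0.7
```

A high volatility or a low planned ratio would be candidate constraints for an MVG to take on.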
Program Performance Measures
A typical engagement starts with a discovery and planning cycle, during which a value stream and a measure to move for the program are identified.
The measure to move for the program is the basis for the economic justification that underlies the business case - it answers the question “Given everything else we are up to and everything else we are dealing with, why proceed with this program at all?”
For existing products operating in market and at scale, the program measure to move is typically Time to Market for Development Teams.
For emerging products, that is products with little or no traction in market, the program measures to move are typically Time to Product/Market Fit for Product Owners and Feature Cycle Time for Development Teams.
For a Company struggling to meet publicly announced deadlines and/or commitments made to channel partners - especially when what's at stake is the credibility of the company's brand - the program measure to move would likely be Say/Do at the Company Level.
Wash, Rinse, Repeat
The engagement model is designed to be applied incrementally and iteratively and, as with any iterative and incremental process, learnings along the way allow the engagement to rapidly converge on the pathway that delivers the highest value soonest.
This approach is also very well suited to scaling up and scaling down with the lifecycle of an engagement.
In the early stages (after the initial discovery and planning stage), an engagement typically begins with a thin vertical slice through the organization that validates key assumptions about where the value stream is constrained and what it takes to remove the blockers.
In the mid stages MVGs can be taken on by multiple teams, including being taken on in ways that are designed to tackle the Between Work for teams of teams (true Partnership).
At the latter stages MVGs can become highly targeted, designed to deal with particular areas where constraints are problematic for the business.
Typically, the Wash, Rinse, Repeat cycle (where teams take on playing MVGs) continues until continued investment in the program is no longer warranted or the Performance Measure for the program has been met.