From analysis to action: What to do after the sprint


You’ve completed your Enterprise Design Sprint. Congrats! After you’ve gathered feedback, there are two things left to do: figure out exactly what you’ll do after the sprint, and then actually execute the plan.

If you were drawn to the Enterprise Design Sprint series, you were probably facing issues in your product development cycle such as:

  • The product design cycle was taking too long.
  • You were delivering value but not getting the response from customers and stakeholders that you were hoping for.
  • You had presentations, white papers, or sketches about what could be done. Some of it may even have been validated with customers. But you didn’t have a plan for implementing the concepts, or you ran into issues when trying to implement them.
  • You were working on problems involving multiple stakeholders, business lines, and solution partners who weren’t on the same page.
  • You didn’t have one single, completely informed decision maker who could make the final call.
  • You were having difficulties reconciling modern experiences with legacy systems and business processes.
  • You had “wicked problems” to tackle but your organization was ignoring them in favor of low-hanging fruit. And you were starting to run out of low-hanging fruit.
  • You were having difficulty helping other people focus on anything beyond the latest fire.

 

Going through the Enterprise Design Sprint exercises should help with many of these problems. By the end of the week:

  • You have sped up the design cycle, going from problem to plan in about a week.
  • You have gathered valuable feedback from users and validated feasibility with stakeholders.
  • Stakeholders are on the same page about the problem you’re solving, potential solutions, and phasing.
  • You have used success criteria and user needs to decide what’s important in the absence of a single decision maker.
  • You understand the as-is state, the to-be state, and a path from one to the other, so you can make a realistic plan for execution.
  • You have gotten the team to think about strategy for at least a week, while staying mindful of quick wins.

 

Getting that far is a major triumph. You spent a week breaking down a complex problem and providing some clarity around what’s needed and what’s possible. However, if no investment decisions are made based on this information, then it was just an educational exercise. Not a horrible way to spend time, but we changemakers like #GTD.

What do you do next? How do you make sure something actually happens with this information? When do you start building things and getting them out to real customers? How do you move past the inertia and make difficult things happen in a complex enterprise?

 

Action items after the sprint

To move from analysis to action you’ll need to go through three steps:

Step 1) Revisit your assumptions
Step 2) Determine how this information will change investment decisions
Step 3) Set up collaboration systems

 

If the topic made it into the sprint, it was already prioritized and deemed important enough to spend time and energy on. What you want to decide now is whether to continue full steam ahead, abandon the idea, pivot slightly, or change course dramatically. Verifying the assumptions you made during the week will help answer that question, and setting up coordination channels will help teams stay aligned over time, whether that means delivering value or changing direction.

 

Step 1) Revisit your assumptions

After you’ve completed the reviews with stakeholders, gather the team back together for a debriefing meeting. Share the feedback you’ve gathered from each perspective, writing key insights on sticky notes and posting them on a wall. Be sure to mark each comment’s source and role, either as a note on the sticky note itself or by color coding. As you walk through each of the following assumptions, organize the notes into clusters based on the themes you notice.
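
If part of your team is remote, or you simply want a searchable backup of the sticky-note wall, the same information can be captured digitally. The sketch below is just one way to do it, in Python, with invented field names (comment, source, role, theme); use whatever spreadsheet or tool your team already has, as long as every comment keeps its source, role, and theme.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class FeedbackNote:
    """One sticky note: the comment, who it came from, their role, and its theme."""
    comment: str
    source: str  # who made the comment
    role: str    # e.g. "end user", "business stakeholder", "technical partner"
    theme: str   # cluster the team assigns during the debrief

def cluster_by_theme(notes):
    """Group notes by theme, mirroring the clusters on the wall."""
    clusters = defaultdict(list)
    for note in notes:
        clusters[note.theme].append(note)
    return clusters

# Hypothetical debrief notes
notes = [
    FeedbackNote("Loves the idea but worried about onboarding time",
                 "Dana", "end user", "adoption"),
    FeedbackNote("Needs a policy change before rollout",
                 "Priya", "business stakeholder", "feasibility"),
]

for theme, grouped in cluster_by_theme(notes).items():
    print(theme, [f"{n.source} ({n.role})" for n in grouped])
```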

As a reminder, these were the assumptions you were looking to test qualitatively:

  • Users feel that they have the problem you identified and agree that it’s the most important problem to solve
  • Users think that this product/service/solution solves that problem
  • Business stakeholders are able and willing to support the change
  • Technical partners are able and willing to support the change
  • The solution is in alignment with high-level business and technical strategic goals and initiatives
  • Your draft roadmap and product or business model phasing are desirable and feasible for users and stakeholders

 

Users feel that they have the problem you identified and agree that it’s the most important problem to solve

Look for evidence that supports or contradicts the assumption. For example, in your user reviews did your users mention that the problem you’re trying to solve was important to them and that they would love a solution to it? Or did they shift the conversation to another problem that they have? Did you understand their process and their current customer journey correctly or did you make assumptions about certain steps? Refer back to your original customer profiles and as-is customer journey and add more notes.

If your users don’t seem to have the problem you identified, that could be a sign that implementing the minimum viable product (MVP) you designed will be a waste of time, unless you use the MVP as a way to collect data, learn more about your users, and provide something better in the future. Sometimes people react differently to an idea in the abstract than when they can actually purchase the solution, and it can take time for people to realize that they need it. If you didn’t see evidence of product/market fit, it could also be that the users you talked to didn’t have the same needs as your target population.

When evaluating problems, also look at any aggregate quantitative data you’ve found about the market or business. Maybe you were right about the severity of the problem for one person, but not many people actually encounter it.

 

Users think that this product/service/solution solves that problem

Next you’re looking for how well users feel that the solution solves the problem. Look for what they liked, didn’t care for, or thought was missing. Refer back to the to-be customer journey you created and mark any areas that were positive, negative, or neutral. Maybe your users felt that the solution partially solved their problem but there’s still an unaddressed pain point.

 

Business stakeholders are able and willing to support the change

After gathering feedback from business stakeholders, review how likely they are to support the change. Would the solution require a business process change? A policy change? New funding? Pay attention to any hesitation that suggests the change could take months to implement or get stuck in bureaucracy. Hesitation isn’t necessarily a sign to stop moving forward, but it can tell you that the change is unlikely to be supported organically and that you’ll need to work on getting executive sponsorship.

If users like the idea but you are met with resistance from business stakeholders, gently ask why they feel that way. There is probably a story behind their reaction, perhaps rooted in their knowledge, incentives, or assumptions about the solution. For example, stakeholders may resist a solution if they think it will cost too much time and money for the value it provides. Clarifying what’s really required to implement it, and what the value will be, may help.

 

Technical partners are able and willing to support the change

Follow the same approach with technical changes, looking for signs that it will be difficult for partners to support you. Does the change require large-scale refactoring? The purchase of new technology? Think about how critical that change is to both your MVP design and the vision. Do the technical teams share similar priorities, or will it take months or years to align? Resistance from technical partners could stem from lower priority or a lack of skills, funding, or technology in place. Some of those issues can be resolved by executive support to realign priorities and obtain extra funding. Others may need to be solved by changing the solution or working with alternative partners.

 

The solution is in alignment with high-level business and technical strategic goals and initiatives

After sharing with executives and other stakeholders, you probably learned at least one of three things:

  1. Other initiatives are a higher priority at the enterprise level. That’s important information so you can understand why and how those initiatives will influence your roadmap.
  2. There is existing work in this same area which means that your roadmap should align with theirs. It would make sense to start regular check-ins with the other teams working on those initiatives to share information and to stay aware of updates.
  3. Through the Enterprise Design Sprint the team identified important initiatives that were not being considered at the enterprise level. Your work during the sprint to explore information and gather customer feedback could be the grassroots initiative that will influence executive priorities.

 

Your draft roadmap and product or business model phasing are desirable and feasible for users and stakeholders

Maybe everyone’s on board with the MVP and the long-term vision but not on the same page about the relative phasing. This is a great place to be. Create a committee of stakeholders and partners and meet regularly to discuss implementation of the MVP and to share the results of data collection. Use the information from the pilot to decide which component to implement next, or run another design sprint if you’ve learned a lot of new information about your customers, systems, and business since the initial MVP.

 

Dealing with conflicting feedback

As you go through the comments and look for patterns, here are four factors to help you prioritize when you find conflicting feedback and aren’t sure what to focus on when deciding next steps (a simple weighted-scoring sketch follows the list):

  1. The source of the comment. How much power and interest does that person have in the final solution? Is he or she your target customer? Prioritize feedback from the people you are trying to serve with your solution, or anyone who has significant power to influence the success of the outcome.
  2. The impact of the comment on the risk of the system. Pay extra attention to anything that would negatively impact safety, security, or the “ilities,” such as affordability, reliability, availability, maintainability, and usability of the system. The relative priority of these will depend on your organization’s objectives.
  3. The relative severity of the needs expressed. People tend to love brainstorming and will point out new ideas if you show them a product. You want to be aware of which features are “need to haves” vs. “nice to haves” and how they map back to your stakeholders’ needs, gains, and pain points. That goes for everyone involved in the product, not just the end users. The “nice to haves” may be clues to areas to explore in the future as you work down the roadmap, while the “need to haves” should be considered earlier.
  4. The impact on your strategic goals. You might stumble upon a great opportunity that just isn’t a good fit for the current goals of your organization. For example, you might prioritize feedback differently if you’re a non-profit, a venture-backed start-up, or a self-funded start-up. You might hear about an employment problem when your mission is to solve a healthcare one.
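
If the team is stuck debating conflicting comments, it can help to make the four factors explicit with a rough score. Below is a minimal sketch, assuming a 1-5 rating for each factor; both the weights and the scale are placeholders to tune for your organization, not a formula to follow blindly.

```python
# Weights for the four prioritization factors above; the values are
# assumptions to adjust for your own organization's objectives.
WEIGHTS = {
    "source_influence": 0.3,  # power/interest of the person behind the comment
    "risk_impact": 0.3,       # effect on safety, security, and the "ilities"
    "need_severity": 0.2,     # need-to-have vs. nice-to-have
    "strategic_fit": 0.2,     # alignment with current organizational goals
}

def feedback_priority(ratings):
    """Combine 1-5 ratings for each factor into a single weighted score."""
    return sum(WEIGHTS[factor] * rating for factor, rating in ratings.items())

comment_a = {"source_influence": 5, "risk_impact": 2, "need_severity": 4, "strategic_fit": 5}
comment_b = {"source_influence": 2, "risk_impact": 5, "need_severity": 3, "strategic_fit": 2}

print(round(feedback_priority(comment_a), 2))  # 3.9
print(round(feedback_priority(comment_b), 2))  # 3.1
```

Treat the numbers as a conversation starter: if a low-scoring comment still feels important, that usually means one of the factors is weighted wrong for your context.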

 

Step 2) Determine how this information will change investment decisions

You’ve reviewed all of the feedback, organized it, and have some ideas of where the potential successes and pitfalls are. Now you need to make a decision about what will happen next. This decision will be based on the risk tolerance of your organization and the impact of moving forward with implementation. In an enterprise environment, most likely the sprint team will be making a recommendation or presenting the pros and cons of the choices to a higher-level decision maker (or group of decision makers).

The minimum viable product (MVP) is the first step in the journey of implementing the results of your Enterprise Design Sprint. In the short term you have four options:

  1. Stop all work on the MVP
  2. Defer all work on the MVP
  3. Pivot direction
  4. Move forward with the MVP

 

Stop all work on the MVP

This course of action makes sense if you identified a red flag, a lack of funding, no support, or no need for the solution. In other words, the “V” (viable) in MVP is questionable, but you might be getting pressure from some people to do it anyway. Examples of red flags include indications that the customer experience will get worse if you move forward, or that there’s a safety or reputational risk for your organization.

Some risk can be mitigated, but be on the lookout for any sign that the risk is too high to continue. Run another sprint or resolve one of the barriers before reconsidering implementation. In this case, it’s important that the findings be escalated to higher-level decision makers so they understand the downside of moving forward with the current plan. An idea that sounds great from the 10,000-foot view may be disastrous given the current state of systems and resources on the ground.

 

Defer all work on the MVP

Sometimes you’ll want to run an Enterprise Design Sprint to sort through ideas around a big, hairy problem so you can start understanding how to slice up the work for the future. At the end of the sprint you might realize that there’s a need for the solution and some support, but schedules may not yet align, or there could be a dependency on something else, like purchasing a new technology, assembling a team, or waiting for a previous initiative to finish before the partner team can shift its focus to this one.

 

Pivot direction

Maybe you found that the MVP solution could work with some changes. Depending on the nature of the changes you might decide to run through another Enterprise Design Sprint or use your typical product development processes to refine the design before implementation.

 

Move forward with the MVP

If users like the solution, partners can support it in the short term, and it’s still a high priority, then there’s no reason to delay implementation. Finish out any requirements, design, or development work your process calls for, finalize the questions you want to answer, define your metrics, and launch!
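
Writing the launch questions and metrics down in one place keeps the pilot honest. The sketch below is illustrative, with hypothetical questions, metric names, and targets; the point is simply to pair every open question with a measurable signal and a target before you launch.

```python
# Hypothetical launch plan: each open question from the sprint is paired
# with a metric and a target to check once pilot data comes in.
launch_plan = [
    {"question": "Do users adopt the new request form?",
     "metric": "weekly_active_users", "target": 200},
    {"question": "Can users finish the task without help?",
     "metric": "task_completion_rate", "target": 0.8},
]

def missed_targets(plan, results):
    """Return the launch questions whose metrics fell short of their targets."""
    return [item["question"] for item in plan
            if results.get(item["metric"], 0) < item["target"]]

# Example pilot results (made up)
pilot_results = {"weekly_active_users": 240, "task_completion_rate": 0.65}
print(missed_targets(launch_plan, pilot_results))
# ['Can users finish the task without help?']
```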

 

Step 3) Set up collaboration systems

You’ve figured out what to do with your MVP. Now you need to make sure something happens with the roadmap and end vision; otherwise, it all may be forgotten a few weeks after the sprint. For the longer-term vision and roadmap, it matters less right now whether your solution is perfect than whether you have a process for revisiting the roadmap, validating it, and aligning priorities and roadmaps across teams.

Assuming your roadmap will require multiple teams to implement it, there are some pieces that will need to be in place:

  • Regular check-in meetings with the other teams to review roadmaps, timelines, schedules, and priorities. This could be on a weekly or monthly basis depending on the speed of change in your organization.
  • Somewhere to store this information, preferably in a visual online tool like these roadmapping tools, which will also help you map your sprint topic and resulting features to user personas, strategic goals, and enterprise initiatives (a sketch of the kind of record to keep follows this list).
  • Somewhere to store process and user needs information captured during the sprint for analysts and designers to refer to in the future. Some of the roadmapping tools provide space for this info.
  • A joint process for validating and designing items in the roadmap. Scheduling another Enterprise Design Sprint with both teams represented could be the best option. At a minimum, making sure that both teams are invited to each other’s requirements sessions can help to ensure that the systems interface and overall functionality in the end-to-end process make sense.
  • A joint process for reviewing the results of metrics and lessons learned from user feedback. This could happen during your regular check-ins or in planned requirements sessions depending on the feedback. If you’re not reviewing the performance of the MVP to learn more about your users and inform future decisions, then you’re wasting a great opportunity.
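
If you don’t have a dedicated roadmapping tool in place yet, even a simple shared record of each roadmap item and what it traces back to is better than nothing. The sketch below is illustrative, assuming Python and made-up feature, persona, goal, and team names; the traceability from feature to persona to strategic goal matters more than the format.

```python
# Illustrative roadmap records linking each feature back to the personas,
# strategic goals, and enterprise initiatives it supports. All names and
# values are placeholders.
roadmap = [
    {"feature": "Self-service status tracker",
     "phase": "MVP",
     "personas": ["Case worker"],
     "strategic_goals": ["Reduce call-center volume"],
     "enterprise_initiatives": ["Digital front door"],
     "owning_team": "Portal team"},
    {"feature": "Automated document intake",
     "phase": "Phase 2",
     "personas": ["Applicant", "Case worker"],
     "strategic_goals": ["Cut processing time"],
     "enterprise_initiatives": ["Back-office modernization"],
     "owning_team": "Integrations team"},
]

def items_for_goal(roadmap, goal):
    """Find roadmap features that support a given strategic goal."""
    return [item["feature"] for item in roadmap if goal in item["strategic_goals"]]

print(items_for_goal(roadmap, "Cut processing time"))  # ['Automated document intake']
```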

 

Once you have these decisions and systems in place, it will be easier to relax, knowing that you’re moving forward where you can, coordinating where needed, and preserving information that can’t be acted on right now. Now it’s time to execute the plan or start the next Enterprise Design Sprint!

 

Next steps

You’ve learned about why Enterprise Design Sprints are useful, how to pick a great topic, what to expect during the week, and what to do after the sprint to ensure that your analysis translates to action. The next post in the series will be about troubleshooting common issues that facilitators run into.

Update: This blog series about Enterprise Design Sprints has been expanded into a guide for facilitators with all of the content, plus 50+ worksheets and some other surprises. Check it out in the store. 
