How to Use Quick-win Analysis to Plan a Roadmap
No matter how big the expectations are for immediate ROI, a long-term vision for how you plan to use a new software platform can’t be rolled out overnight.
- But where should you start?
- How do you prioritize?
- How can you move as fast as possible?
This article outlines recommended steps for any organization looking to build a long-term strategy in a new software platform while creating quick wins along the way.
Keep reading to learn more about Quick-win analyses — and be sure to download the Quick-win analysis template to get started working through your plans.
What is a Quick-win Analysis?
A quick-win analysis is a calculated method for evaluating the return on investment (ROI) of a program when speed to launch is the priority. The emphasis on speed is important: other ROI models weigh factors such as overall financial gain or time savings, while a quick-win analysis focuses solely on speed of delivery relative to anticipated value.
Conducting a Quick-win Analysis
Building momentum for your long-term roadmap starts with identifying the programs that will keep you visible to management and part of the budget conversations.
If you already have your long-term vision defined, you’ll need to identify the launch order of your programs by conducting a quick-win analysis.
Conducting a Quick-win Analysis involves four steps:
- Measuring Impact
- Estimating Level of Effort
- Calculating Values
- Prioritizing programs
Step 1: Measure Impact
To measure the impact of each program within your long-term vision, you will need to first assess individual levels of breadth and depth.
You can do that by answering the following questions:
- Do you have a working program now?
- Does your desired program work across departments or teams?
- How many people will be or are impacted by this program?
- How much executive visibility exists with this program?
The answers to these questions can help you quickly see the volume of people or teams your programs could help, as well as the exposure level of management to rally support.
Step 2: Estimate Level of Effort
Next, you’ll want to determine how much time and how many team members would be involved in changing your program.
You can do that by answering the following questions:
- Are your process requirements well defined?
- Can you use out-of-the-box software functionality?
- How many relationships does the program have to other technology applications?
- How many stakeholders are involved?
The answers to these questions help you understand any contingencies and your overall readiness to deploy. We recommend that any process not yet well defined be automatically moved lower on the prioritization list. Clearly defined processes and a complete understanding of technology contingencies are critical to a successful program launch and long-term adoption.
Step 3: Calculate Values
For a systematic approach to your Quick-win Analysis, assign numeric values to your answers from steps 1 and 2 (the Measuring Impact and Estimating Level of Effort questions). This step adds a rubric for quantitative, data-driven decision-making.
To calculate the value of each program in your long-term vision, subtract the level of effort from the total impact.
[Impact Value – LOE Value = Total Value]
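As a quick worked example, the formula can be applied directly; the scores below are hypothetical, not from the source:

```python
# Hypothetical example: the four impact answers sum to 65
# and the four level-of-effort answers sum to 30.
impact_value = 65
loe_value = 30

# Total Value = Impact Value - LOE Value
total_value = impact_value - loe_value
print(total_value)  # 35
```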
Reference the examples below to learn how you can apply numeric values to your questions and answers.
Measuring Impact Values:
- Do you have a working program now?
- Yes = 5
- Yes, but updates are needed = 10
- No = 25
- Does this program work across departments or teams?
- Enterprise-wide = 25
- Multiple departments/teams = 10
- One department/team = 5
- How many people will be or are impacted by this program?
- Enterprise-wide = 25
- Multiple departments/teams = 10
- One department/team = 5
- How much executive visibility exists with this program?
- High = 25
- Some = 10
- Low = 5
Estimating Level of Effort Values:
- Are your process requirements well defined?
- Yes = 5
- Yes, but updates are needed = 10
- No = 25
- Can you use out-of-the-box software functionality?
- Yes = 5
- Yes, but updates are needed = 10
- No = 25
- How many relationships does the program have to other technology applications?
- Many = 25
- Few = 10
- None = 5
- How many stakeholders are involved?
- Many = 25
- Few = 10
- None = 5
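A minimal sketch of how these rubrics could be encoded, using the point values above; the question keys, answer labels, and the example program's answers are illustrative assumptions:

```python
# Point values from the rubrics above; every answer maps to 5, 10, or 25.
IMPACT_POINTS = {
    "working_program": {"yes": 5, "yes_needs_updates": 10, "no": 25},
    "cross_team_reach": {"enterprise": 25, "multiple": 10, "one": 5},
    "people_impacted": {"enterprise": 25, "multiple": 10, "one": 5},
    "executive_visibility": {"high": 25, "some": 10, "low": 5},
}
EFFORT_POINTS = {
    "requirements_defined": {"yes": 5, "yes_needs_updates": 10, "no": 25},
    "out_of_the_box": {"yes": 5, "yes_needs_updates": 10, "no": 25},
    "integrations": {"many": 25, "few": 10, "none": 5},
    "stakeholders": {"many": 25, "few": 10, "none": 5},
}

def total_value(impact_answers, effort_answers):
    """Quick-win total value = impact score minus level-of-effort score."""
    impact = sum(IMPACT_POINTS[q][a] for q, a in impact_answers.items())
    effort = sum(EFFORT_POINTS[q][a] for q, a in effort_answers.items())
    return impact - effort

# Illustrative answers for a hypothetical program
score = total_value(
    {"working_program": "no", "cross_team_reach": "enterprise",
     "people_impacted": "enterprise", "executive_visibility": "high"},
    {"requirements_defined": "yes", "out_of_the_box": "yes",
     "integrations": "few", "stakeholders": "few"},
)
print(score)  # 100 - 30 = 70
```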
Step 4: Prioritize Your Programs
Now that you have a total value for each program included in your long-term vision, you can prioritize your programs by sorting from highest total value to lowest.
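With hypothetical total values in hand, the sort itself is straightforward; the program names and scores below are illustrative:

```python
# Hypothetical total values for each program in the long-term vision
totals = {
    "Policy management": 70,
    "Risk management": 35,
    "Internal audit": 20,
    "Compliance management": 55,
}

# Sort from highest total value to lowest to get the launch order
launch_order = sorted(totals, key=totals.get, reverse=True)
print(launch_order)
```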
From here, you can plot each program in sequence order against a timeline by phase:
- Goals & requirements gathering
- Process workflow & report design
- Configure your software to reflect desired automation & reporting
- Test your functionality in a non-production environment
- Evangelize the updated program & the value in adopting
- Launch to your full user base
Implementation Best Practices
Example Quick-win Analysis
The scenario outlined below follows steps 1 – 4 of the Quick-win Analysis process using a Governance, Risk & Compliance (GRC) long-term roadmap as an example.
Financial Services Company
Goal: Move entire GRC program out of legacy software system and into Onspring
Requirement: Launch elements of GRC in Onspring within 30 days of purchase
Quick-win Analysis was conducted for the following GRC programs:
- Risk management
- Third-party risk management
- Compliance management
- Policy management
- Internal audit
- Incident management
- Business continuity & recovery
Steps 1 & 2: Measure Impact & Level of Effort of Each Program
The GRC team at this financial services company completed the Program Impact and Level of Effort assessments for each of the seven GRC programs included in their long-term roadmap.
The results of the assessments prioritized the launch order in Onspring for each program. Programs with the highest quick-win values were prioritized for launch, while programs with the lowest quick-win values were moved to later in the roadmap.
Quick-win Analysis results determined the launch schedule:
1. Policy management
2. Compliance management
3. Business continuity & recovery
4. Risk management
5. Incident management
6. Internal audit
7. Third-party risk management
Building the Launch Timeline
Now that the GRC team had a prioritized list of the seven programs in the overall GRC roadmap, they moved on to mapping the implementation of each program across a timeline.
To estimate the length of time that might be needed to complete each stage in technology implementation for all seven programs, the team referenced the program roadmap planning guide included in the Quick-win Analysis Template.
This guide takes into consideration the average length of time teams within an organization need in order to establish program goals and gather requirements, design their end-state process, configure a solution inside Onspring’s no-code self-administration platform, and test and communicate across teams before launch.
The baselines below can help you estimate the length of time needed to implement each program based on its maturity stage.
| Type of Program | Goals & Requirements | Design | Configure | Test | Evangelize | Launch |
|---|---|---|---|---|---|---|
| Existing working process | 1 week | 1 week | 1 week | 1 week | 2 weeks | 1 week |
| Existing process that needs refinement | 2 weeks | 2 weeks | 1 week | 1 week | 1 week | 1 week |
| Existing process that needs overhauling | 2 weeks | 2 weeks | 1 week | 2 weeks | 2 weeks | 1 week |
| New process that can be modeled after an existing process | 3 weeks | 1 week | 2 weeks | 1 week | 2 weeks | 1 week |
| New process with limited number of stakeholders or impacted groups | 3 weeks | 3 weeks | 2 weeks | 2 weeks | 2 weeks | 1 week |
| New process with large number of stakeholders or impacted groups and high visibility | 3 weeks | 3 weeks | 3 weeks | 3 weeks | 2 weeks | 1 week |
Implementation timeline per program from this example:
1. Policy management (5 weeks)
2. Compliance management (5 weeks)
3. Business continuity & recovery (5 weeks)
4. Risk management (5 weeks)
5. Incident management (5 weeks)
6. Internal audit (5 weeks)
7. Third-party risk management (5 weeks)
To further expedite the overall GRC program launch, this financial services organization overlapped implementation stages across programs where appropriate. For example, when Onspring configuration began for the first program, policy management, requirements gathering and goal documentation began for the next program in the queue, compliance management.
Overlapping activities enabled efficiencies and time savings across the full roadmap, ultimately enabling the company to launch all seven programs within six months.
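A rough way to model that time savings, assuming seven programs of five weeks each (as in the example above) and assuming the next program kicks off two weeks into the previous one; the two-week offset is an illustrative assumption, not a figure from the source:

```python
# Illustrative model: 7 programs of 5 weeks each
programs = 7
weeks_per_program = 5

# Sequential: each program waits for the previous one to finish entirely
sequential_total = programs * weeks_per_program

# Overlapped: the next program's requirements gathering starts when the
# previous program enters configuration (an assumed 2-week offset here)
start_offset = 2
overlapped_total = (programs - 1) * start_offset + weeks_per_program

print(sequential_total, overlapped_total)  # 35 vs 17
```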
Conclusion
When you are working with a large group of stakeholders to define where your organization should focus and prioritize, using a Quick-win Analysis can be incredibly helpful because it considers the soft, qualitative variables and turns them into measurable elements that can be prioritized by value.
Coupling the Quick-win Analysis prioritization approach with the roadmap planning timeline also makes an excellent communication tool across your enterprise, giving teams a calculated, visual view of what to expect and when.