The first metric we’ll dive into in the Instance Insights series is time to install. The time it takes for end users to get a vendor’s software up and running in a new environment is a key indicator of the quality of a software product’s packaging, configuration, testing, and delivery documentation. Weakness in any of these areas can result in a lengthy and cumbersome installation process.
Reducing time to install lets ISVs complete more installations in less time, which helps customers see value faster during crucial POC or onboarding phases. Improving this metric also reduces the time engineers spend assisting with installs, freeing them to work on other things, like improving the product. Measuring this metric across all installation attempts allows software vendors to assess the overall performance of application delivery into customer environments and identify key areas for improvement.
In our experience, best-in-class vendors generally measure the 80th percentile of time to install, and complete 80% of installs in under 2 hours.
Time to install is a multi-faceted metric and can be complicated to measure, so it’s important to define what it means to your business to ensure you measure what matters to you. While deciding on your definition of time to install, think about what outcomes you want to drive by focusing on this metric. Are you looking to reduce the support burden on engineering teams that are pulled in to support customer installations? Are you looking to reduce the time it takes to progress prospective customers through a proof of concept? Are you looking to reduce the minimum hours of a professional services package for new customers?
The specific start and end points for measuring time to install can vary, and you can measure a few different combinations of timestamps to understand installation performance. In this article, we’ll consider two possible start points: license creation and the moment the customer begins the installation attempt.
The graphic below shows a high-level summary of a common installation workflow:
While this article primarily focuses on time to install, the end goal of any customer installation effort is not only to get software up and running in a customer environment, but also to reach the point where the application is delivering value to the customer.
Time to value - this concept of delivering value may encompass the completion of any or all of the following:
Time to install - While there are many steps on the journey from intent to deploy up to software delivering value, we find the most valuable place to focus measurement and improvement is time to install - that is, the time it takes to get software installed, running, and healthy.
While we primarily focus on time to install, it’s important to keep time to value in mind. If you speed up your time to install but slow down the overall time to value by adding time to other steps in the process, that is an overall regression in delivery performance. With that in mind, there are three related measurements to consider (a sketch of how they might be computed follows the list):
1. Time to Install (license) - time from license creation to software live
2. Time to Install (instance) - time from installation attempt start (usually initiating a CLI command) to software live
3. Time to Value - time from license creation to software delivering value
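To make these definitions concrete, here is a minimal sketch of how the three measurements could be computed from recorded timestamps. The `InstallRecord` structure and its field names are hypothetical bookkeeping, not part of any Replicated API; the sketch simply assumes you record license creation, install start, first-ready, and value-delivered times for each instance.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class InstallRecord:
    """Hypothetical per-instance record; field names are illustrative."""
    license_created_at: datetime          # when the customer license was created
    install_started_at: datetime          # e.g. when the install CLI command was run
    first_ready_at: Optional[datetime]    # when the app first reported healthy, if ever
    value_delivered_at: Optional[datetime] = None  # when the customer first saw value

def time_to_install_license(r: InstallRecord) -> Optional[timedelta]:
    """1. Time to Install (license): license creation -> software live."""
    return (r.first_ready_at - r.license_created_at) if r.first_ready_at else None

def time_to_install_instance(r: InstallRecord) -> Optional[timedelta]:
    """2. Time to Install (instance): installation attempt start -> software live."""
    return (r.first_ready_at - r.install_started_at) if r.first_ready_at else None

def time_to_value(r: InstallRecord) -> Optional[timedelta]:
    """3. Time to Value: license creation -> software delivering value."""
    return (r.value_delivered_at - r.license_created_at) if r.value_delivered_at else None
```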
Where you start and end measuring time to install will be impacted by your business practices and installation methods. These are likely to change over time, so it’s important to reevaluate your definition regularly to make sure you’re still measuring what matters.
Before setting a goal, you’ll want to gauge your current time to install so that the goal you set is attainable. Even if you haven’t been measuring time to install (or don’t yet have a method for measurement), you should at minimum be able to estimate what your happy-path installs look like compared to your unhappy-path ones.
Example:
It’s recommended to set both a short-term and a long-term goal for time to install, and each should be specific, measurable, attainable, relevant, and time-bound. The goals should be based on current performance and on your ability to identify the why behind the installs that end up on the unhappy path. If you know the why, and it’s something you’re able to address and improve, you’re more likely to attain a loftier goal.
Example:
The method of measurement may vary depending on how you defined time to install.
Starting point - if you defined this as license creation (the intent to deploy), it will have a well-defined date. Otherwise, unless your installation tooling tracks and reports this information, the point at which the customer starts the installation will need to be manually recorded and tracked.
Ending point - if you defined this as when the software is live, the end point can be captured manually (e.g. while on a call with the end customer helping them complete the install) or with tooling that is able to capture first-ready data. High-quality application health checks can help with the automated collection of this timestamp.
If you are measuring the end point of an air gap install or are using value delivery as the end point, the data will likely need to be manually recorded and tracked.
Scope - You should record the start and end points for each customer instance you deploy so that you can compute statistical summaries (median, mean, and percentiles) across your entire customer base. If you have multiple instances at a single customer site, it is worth recording each instance’s time to install separately. In general, if an instance never becomes live or never reaches a point of value delivery, it should not be included in your time to install data. We’ll go into more detail on measuring installation success rate in an upcoming blog post.
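As a rough illustration of the scope guidance above, here is a minimal sketch of how those summary statistics could be computed across instances, with never-live installs excluded. The function name and sample data are illustrative and assume install durations have already been derived from your recorded start and end timestamps.

```python
from statistics import mean, median, quantiles
from typing import List, Optional

def summarize_install_times(durations_hours: List[Optional[float]]) -> dict:
    """Summarize time to install across customer instances.

    `durations_hours` holds one entry per instance: the measured time to
    install in hours, or None for instances that never became live (those
    are excluded here and belong in an install success rate metric instead).
    """
    completed = [d for d in durations_hours if d is not None]
    if not completed:
        return {}
    return {
        "count": len(completed),
        "mean_hours": mean(completed),
        "median_hours": median(completed),
        # quantiles(..., n=10) returns the nine deciles; index 7 is the 80th percentile
        "p80_hours": quantiles(completed, n=10)[7],
    }

# Illustrative data: hours per instance, None = install never became healthy
print(summarize_install_times([1.5, 2.0, 0.8, 30.0, None, 3.2, 1.1]))
```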
The graph below depicts the average time to install over time compared to the goal that was set. You can see where the goal was also adjusted after an initial short term goal was reached.
It can also be helpful to look at a histogram of all install attempts for a specific time period, with the median and 80th percentile identified, as depicted above. Tracking time to install using these data points can help your team understand the distribution of install times across instances and makes it easier to visualize if you’ve missed, met, or exceeded your goal(s).
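If you want to produce a similar visualization yourself, a minimal sketch using numpy and matplotlib might look like the following; the install times shown are made-up illustrative values, not real customer data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative install times in hours, one per completed install attempt
install_hours = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.7, 2.0, 2.5, 3.0, 4.0,
                          6.0, 8.0, 12.0, 24.0, 72.0, 130.0])

median_hours = np.median(install_hours)
p80_hours = np.percentile(install_hours, 80)

plt.hist(install_hours, bins=20)
plt.axvline(median_hours, color="green", linestyle="--",
            label=f"median = {median_hours:.1f} h")
plt.axvline(p80_hours, color="red", linestyle="--",
            label=f"80th percentile = {p80_hours:.1f} h")
plt.xlabel("Time to install (hours)")
plt.ylabel("Number of installs")
plt.title("Distribution of install times")
plt.legend()
plt.show()
```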
Visualizations of the data can highlight the installations that end up on the unhappy path and help you understand if your goals are truly driving the outcomes that matter to your business. Looking at the example graph above, you might decide that you’re okay with 2 installs being in the 5+ days range, or you might decide to revise your performance metric to capture a maximum time to install instead of the 80th percentile.
As you measure your time to install performance, it’s important to identify edge cases and pain points that are negatively impacting progress towards your goals. Below are our recommendations for identifying those pain points and making adjustments to address them:
At Replicated, we are actively working on improving reporting and telemetry within our product to help ISVs understand the health and performance of customer instances and measure key indicators of performance like time to install.
Replicated can assist with measuring time to install by providing the license creation date for the starting point and the first-seen timestamp (for online installs) in the install information on Replicated’s new instance details page (see below). More detail on our improved customer instance telemetry can be found here.
Replicated also provides computations for instance and license time to install based on information about when the instance is first seen, and when it first reports a ready state during an update check. These are listed as “beta” because we’re still evolving how we compute and display this information for software vendors.
In addition to providing insight into the data, Replicated also has resources and features that can be used to make adjustments to improve time to install:
Continue iterating on the define-evaluate-goal-measure-adjust cycle as you make progress towards your ideal time to install. Once time to install is approaching best in class, vendors tend to look at improving the number of installations that can be performed fully self-service, without vendor assistance. If you measure time to install, we’d love to hear from you on what has worked for you and what your goals are.
Stay tuned for our next post in the Instance Insights series, on install success rate.