Bleeding Edge to Trailing Edge: Assessing Supply Chain Technology Readiness

Supply chain managers have a wealth of technology tools available to them, far more than they can deploy effectively. A good supply chain leader is therefore also a selector of technologies. Much of a supply chain manager's attention today goes to winnowing tech options and assembling a bouquet of effective solutions whose impact multiplies when they are used in concert. Over my career I have been on the buying, advising, and selling sides of this process, and I think it's fair to say that there are no good instruments or metrics for supply chain managers to assess the adoption readiness of new technologies. This article introduces one approach in hopes of filling that gap, or at least sparking further research and discussion.

What About the Hype Cycle?

Some readers may now be asking themselves: "what about the Gartner Hype Cycle?" That is indeed an instrument for judging the degree to which a technology has been adopted by the industry, but it is flawed in several important ways. First, as noted in a separate article, it lacks the final stage of technology adoption: obsolescence and replacement. Second, its primary axis measures emotional engagement with, or trust in, the technology. That is part of what makes a technology adoption-ready, but far from most of it. Third, it is simply imprecise: the curve graphic implies continuity but gives no guidance on what a smidge left or right really means in a factual sense. I believe the scale is more anecdotal than quantitative.

Updated Hype Curve

Two Principal Dimensions

I suggest there are two primary dimensions for evaluating the adoption readiness of a new supply chain technology. The first is the robustness of the technology, i.e. does the tech work in the real world? The second is the marginal improvement it offers over competing technologies, i.e. what is the return on investment for adopting this tech compared to the alternatives (assuming it works as expected)? A simple two-dimensional space is shown below; it will be refined further on in the article.

Tech Readiness Matrix

Supply Chain Technology Robustness Scale

Supply chain managers are not the first to need to assess the robustness of a new technology. Just because something works in a lab, or in limited tests, does not mean it is ready to support full-scale operations. Recently, within the entrepreneurship community, the NASA technology readiness level has been pointed to as an example of a robustness assessment. Here is the scale, courtesy of Wikipedia:

Level Description
  1. Basic principles observed and reported
This is the lowest “level” of technology maturation. At this level, scientific research begins to be translated into applied research and development.
  2. Technology concept and/or application formulated
Once basic physical principles are observed, then at the next level of maturation, practical applications of those characteristics can be ‘invented’ or identified. At this level, the application is still speculative: there is no experimental proof or detailed analysis to support the conjecture.
  3. Analytical and experimental critical function and/or characteristic proof of concept
At this step in the maturation process, active research and development (R&D) is initiated. This must include both analytical studies to set the technology into an appropriate context and laboratory-based studies to physically validate that the analytical predictions are correct. These studies and experiments should constitute “proof-of-concept” validation of the applications/concepts formulated at level 2.
  4. Component and/or breadboard validation in laboratory environment
Following successful “proof-of-concept” work, basic technological elements must be integrated to establish that the “pieces” will work together to achieve concept-enabling levels of performance for a component and/or breadboard. This validation must be devised to support the concept that was formulated earlier, and should also be consistent with the requirements of potential system applications. The validation is “low-fidelity” compared to the eventual system: it could be composed of ad hoc discrete components in a laboratory.
  5. Component and/or breadboard validation in relevant environment
At this level, the fidelity of the component and/or breadboard being tested has to increase significantly. The basic technological elements must be integrated with reasonably realistic supporting elements so that the total applications (component-level, sub-system level, or system-level) can be tested in a ‘simulated’ or somewhat realistic environment.
  6. System/subsystem model or prototype demonstration in a relevant environment (ground or space)
A major step in the level of fidelity of the technology demonstration follows the completion of level 5. At level 6, a representative model or prototype system – which would go well beyond ad hoc, ‘patch-cord’ or discrete component level breadboarding – would be tested in a relevant environment. At this level, if the only ‘relevant environment’ is the environment of space, then the model/prototype must be demonstrated in space.
  7. System prototype demonstration in a space environment
Level 7 is a significant step beyond level 6, requiring an actual system prototype demonstration in a space environment. The prototype should be near or at the scale of the planned operational system and the demonstration must take place in space.


This scale removes ambiguity about the “proven robustness” of a technology. It serves both as an assessment tool and as an indicator of how to move a technology toward being more provably robust. In short, it is a useful scale for NASA, but it needs to be adapted for supply chain contexts. Here is my attempt to do so, the Supply Chain Technology Robustness Scale:

Level Description
  1. Concept
A speculative vision is created to show how the new technology can be used to meet a supply chain need.
  2. Analytical Proof of Concept
Proofs of concept are completed analytically, validating expected use cases and outcomes from the concept formulated in level 1.
  3. Experimental Proof of Concept
The new technology is used as planned according to the level 2 concept, in support of actual supply chain needs, but with a limited scope and under the guise of an experiment.
  4. Limited Scope Rollout
The concept is deployed into a real and complete supply chain environment, as part of the new standard technology landscape. The scope of coverage is limited or incomplete.
  5. Standard
The concept is deployed in full. No part of the supply chain is waiting to use this technology. It is the standard.


The scale proposed above is simpler than the NASA scale, mostly because supply chain technology selection doesn’t usually concern itself with the earliest stages of converting basic research into applied research. I therefore assume a “level 1” robustness is simply a good concept of how to use a given technology in a supply chain context. Autonomous long-haul trucks are at level 1 robustness (as of mid-2014): there are some formulations of how to use this technology, but no proofs of concept, either analytical (e.g. business models) or experimental (e.g. a DARPA-challenge style event for commercial long-haul trucking). Note that the scale proposed here could be used to assess industry-wide robustness or company-wide robustness. In other words, it’s up to the supply chain manager to decide whether to consider only their own company’s experience or to look at other companies’ experiences as well.
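To make the scale concrete, here is a minimal sketch of how it might be encoded in practice. All names here are illustrative assumptions, not part of any standard; the autonomous-trucks example follows the assessment above.

```python
# Supply Chain Technology Robustness Scale, levels 1-5 (illustrative encoding)
ROBUSTNESS_LEVELS = {
    1: "Concept",
    2: "Analytical Proof of Concept",
    3: "Experimental Proof of Concept",
    4: "Limited Scope Rollout",
    5: "Standard",
}


def next_milestone(level: int) -> str:
    """Return the next robustness level a technology champion should target."""
    if level not in ROBUSTNESS_LEVELS:
        raise ValueError(f"robustness level must be 1-5, got {level}")
    if level == 5:
        return "none: already the standard"
    return ROBUSTNESS_LEVELS[level + 1]


# Autonomous long-haul trucks (as of mid-2014): a concept exists, but no
# analytical or experimental proof yet, so robustness is level 1.
print(next_milestone(1))  # Analytical Proof of Concept
```

The same lookup works for either industry-wide or company-wide assessment; only the evidence behind the level number changes.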

But technology robustness is not the only dimension that is needed to select supply chain technologies. The other dimension is marginal improvement over other technologies, and that’s what we’ll discuss now.

Comparative ROI Attractiveness

Nearly every technology being evaluated by a supply chain leader has alternatives. Emerging technologies need to be robust (i.e. low risk) but also ROI attractive in comparison to their alternatives. This is the second dimension of importance, and like the first dimension I’m going to propose a quantitative scale for it.

Level Description
  1. Unpredictable Cost to Serve
The cost to deploy the technology is essentially unknown.
  2. Bleeding Edge
The cost to deploy is predictably higher than that of the next best technology option.
  3. Cost Neutral
The technology can be deployed at no net change to business costs over an agreed time frame.
  4. ROI Positive
The technology can be deployed and generate a measurable return over an agreed time horizon when compared to the next best technology option.


Putting Together Robustness and Attractiveness

I hate to add to the glut of business concept matrices out there, but here is my suggestion for how to plot technology adoption readiness for supply chain applications. I present the plain graphic below, followed by some additional detail. The first diagram shows the matrix of robustness plotted against attractiveness for a technology.

Tech Readiness Matrix

The matrix already shows one area blacked out because, short of gross incompetence, it’s just not possible to roll out a technology without having some good idea of its cost to serve.
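The matrix logic can be sketched as a simple validity check. This is a hypothetical helper of my own devising, using the level numbers from the two scales proposed earlier (robustness 1 to 5, ROI attractiveness 1 to 4):

```python
def plot_position(robustness: int, roi_attractiveness: int) -> tuple:
    """Validate and return a technology's position on the readiness matrix.

    robustness:         1 = Concept .. 5 = Standard
    roi_attractiveness: 1 = Unpredictable Cost to Serve .. 4 = ROI Positive
    """
    if not 1 <= robustness <= 5:
        raise ValueError("robustness must be between 1 and 5")
    if not 1 <= roi_attractiveness <= 4:
        raise ValueError("ROI attractiveness must be between 1 and 4")
    # The blacked-out region: a limited or full rollout while the cost
    # to serve is still essentially unknown is not a credible combination.
    if robustness >= 4 and roi_attractiveness == 1:
        raise ValueError("implausible: rollout with unpredictable cost to serve")
    return (robustness, roi_attractiveness)


# A technology in experimental proof of concept at bleeding-edge cost:
print(plot_position(3, 2))  # (3, 2)
```

Rejecting the implausible cell up front keeps a group evaluation honest: any claim of a rollout must come with at least a predictable cost to serve.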

Tech Readiness Path

The second diagram may or may not pique more readers’ interest. Here I lay out in red the kind of path many technologies take as they mature and become the new standard. Notice the two-step dance occurring: first we conceptualize, then we attempt to test that concept. Early conceptualizations often can’t provide clear or reliable ROI attractiveness. After all, it’s unproven technology: can we really give an accurate ROI projection? So it needs to be tested first. Naturally, an unknown or bleeding-edge technology will only be tested on a limited scope, and the maximum scale of proof-of-concept testing tends to be larger when there is a conceptual model suggesting an attractive ROI should the technology succeed. Ultimately this is a recursive feedback loop: we conceptualize how to use new technologies in a supply chain context, but need to test them in practice to understand their ROI attractiveness. As we test these concepts we find ways to improve the ROI expectation, leading to a potential next-round test. Successful tests cycle toward larger tests and eventual technology landscape dominance in the form of the new standard.

How to put this to use:

Set aside for a moment the fact that this instrument can and will be refined to better reflect and inform the supply chain community’s needs: it’s understood that the version proposed here is a starting point. The question I want to pose now is simply “how can this be used by supply chain leaders immediately?” I believe there are three ways:

  1. The instrument provides comparative formalization, which helps clarify how to assess competing technologies. That’s important because a muddled or informal process eventually leads to decisions driven by politics and enthusiasm rather than substance when a company decides to adopt or bypass a new technology. Using this instrument within a group of decision makers helps guide and structure the evaluation in plainly useful ways.
  2. The instrument can provide clear guidance on how to match supply chain technology adoption to the company’s overall competitive strategy. Some companies, such as Dell or Wal-Mart, may strategically decide to invest more heavily in the lower left of the matrix in an attempt to capture bleeding-edge early-adoption advantages. Other companies will focus on fast-following and invest somewhere in the middle right. And pure laggards will only invest in top-right technologies. Once a CIO or CSCO sets organizational values and priorities along these lines, the rest of the supply chain team can see when and what would be considered investable. In short, it orients the supply chain management team toward the right kind of technology for the company’s strategy.
  3. It provides what-next guidance to those championing the adoption of new technologies. At any moment, the candidate technology can be plotted and then a plan formed to move it up and to the right in order to improve its adoption readiness.

Summary:

The NASA technology readiness level is a great stepping stone to forming a technology adoption readiness scale particular to supply chains. In this article I offer these viewpoints:

  1. Supply chain technology adoption readiness is mostly measured along two scales: robustness and ROI attractiveness. As a starting point for further research or discussion, I offer a quantitative scale for each dimension.
  2. There seems to be a typical path technologies travel in terms of improving their robustness and ROI… this is a cycle of conceptualization and testing where the concepts get stronger and stronger in terms of likely ROI and the testing gets larger and larger in scope.
  3. The instrument and insights around assessing technology investment readiness are useful because they formalize technology selection, focus an organization on what kinds of technology adoption behavior the company will pursue, and provide guidance to champions of emerging technologies on how to improve that tech’s adoption readiness.

Good luck to those who start using this, and feedback welcomed as usual.
