August 4, 2015, By Chris Durney, Used with permission from PhaseOne 

In 2014, the newly created US Digital Service published the US Digital Services Playbook (DSP) because “the U.S. Government needs a new approach…[t]o increase the success rate of these [digital services] projects.” The DSP comprises “13 key plays drawn from successful practices from the private sector and government that, if followed together[italics added], will help government build effective digital services.” The question, of course, is why we should believe that the DSP will have better success than any other “best practices” approach promoted in the history of federal IT.

My graduate study was on innovation management, so I tend to see IT development first and foremost as an innovation management challenge. Thus, to answer the digital services question for myself, I borrowed from the classic study in the innovation field, Everett Rogers’ Diffusion of Innovations.

Early in his book, Rogers lists five important characteristics of successful innovations:

  • Relative advantage is the degree to which the innovation is perceived as providing more benefit than the thing it replaces. The perceived advantage may be economic, social, personal convenience, or some other characteristic important to the adopting community. The greater the perceived advantage, the greater the rate of adoption.
  • Compatibility is the rightness-of-fit of the innovation with the sociotechnical culture, that is, the degree to which an innovation is consistent with the user’s experiences. If the new thing appears too alien to users, it may be too threatening to their life style to be adopted.
  • Complexity is the degree to which an innovation is perceived as difficult to understand and use. Complexity slows the rate of adoption; simplicity increases it.
  • Trialability is the degree to which an innovation can be experimented with on a limited basis. The more an innovation can be tried out, the more risk can be squeezed out during the trial period. Lower risks contribute to lower user barriers and a higher rate of adoption.
  • Observability is the degree to which the results are visible to users and others and therefore lower uncertainty. Also, observable benefits spur discussion among members of the adopting community—Is that new? How do you like it? Does it work well?—and this discussion results in a lowering of the knowledge barrier and an increase in the impetus for change.

So, if the 13 plays in the DSP are followed together, might they create results that demonstrate relative advantage, compatibility, simplicity (the inverse of complexity), trialability, and observability, and thereby raise the chances that the services produced will be successfully adopted by the user community?

The first two plays—“Understand what people need” and “Address the whole experience from start to finish”—really try to get at the value that the digital service will produce for the users. If we successfully understand what people need, we should be able to demonstrate clearly the relative advantage of the new service in light of the benefits provided and the “pain points” eliminated from the users’ current situation. Good measures of how well the service meets the users’ needs will help make the case for the delivery of real benefits; but the users’ own experiences will be the final arbiter of whether the perceived relative advantage actually exists or not.

Also in the first two plays, we determine the ways the service will fit into the users’ lives. That is, how can we make the service compatible with the way our users live? We are encouraged to “spend time with current and prospective users of the service” to get to know their needs and wants, behaviors and preferences. Then Play 8 requires that we incorporate modern technology solutions “that match what successful modern consumer and enterprise software companies would choose today.” By choosing technologies now in use in the private sector, we increase the chances that our users will already have experience with these technologies through their access to private sector services. If this is the case, our users will experience the services that we are developing for them as highly compatible with their growing expectations for what a digital experience entails.

Play 3 tells us to “make it simple and intuitive.” That couldn’t be more in line with Rogers’ third characteristic. The easier it is for the user to employ the service to achieve the desired benefits, the faster the rate at which the new service will be adopted. Play 3 encourages us to make sure all users can access the service easily and that we use language and design consistently as we develop the service (use of a design style guide is recommended).

The DSP is all about trialability. Play 1 tells us to “test prototypes of solutions with real people, in the field if possible.” Play 4 is designed to “get working software into users’ hands as early as possible to give the design and development team opportunities to adjust based on user feedback about the service.” Play 5 suggests that we “Budget for Delivery,” making sure that the “[b]udget includes research, discovery, and prototyping activities.” Rapid, incremental delivery allows us to get things into the hands of users early and reduce the risk of failure over the long term. Play 10, “Automate testing and deployments,” ensures that “new features can be added often and be put into production easily.”

Finally, I think the most important game changer of the DSP may be the quick delivery of observable benefits to the users at the earliest possible time. This is the one area in which the traditional waterfall method, no matter how well executed, cannot compete with the digital services approach. The benefits of the waterfall approach cannot be observed until at least the first iteration of the final product is delivered, often more than a year, sometimes two, after the project begins. That is a long time for users to wait, for technologies to change, and for senior sponsors to move on to other positions.

The DSP, on the other hand, calls for a “functioning ‘minimum viable product’ (MVP)” to be shipped “as soon as possible, no longer than three months from the beginning of the project.” This increment provides observable relative advantage to the user because it “solves a core user need.”

In my quick estimation, then, the DSP approach aligns well—much better than the waterfall approach—with the characteristics of a successful innovation. Within three months, users have trialable functionality that is simple and intuitive to use, fits compatibly into their way of life, and provides measurable benefits…

But does this alignment guarantee the success of digital services projects that follow the playbook? Perhaps only if the digital services sponsors live most faithfully by Play 7—“Bring in experienced teams!”
