State of Change, Chapter 2: Business Technology Evolution with a Capital “R”
Is “the cloud” something your business actually needs? You’re no doubt aware of the ongoing transition in how business information services are delivered, and your own organization is probably in the midst of an IT transition at some level. Even so, this is the question you are quite likely still asking.
There’s good reason. Marketing literature is laced with references to our changing business climate and our rapid-paced world, two observations which never failed to be true during the lifetimes of anyone alive today. When evidence of the same lacing, if you will, turns up in technology journalism, you could easily forget whether you’re reading a Web site or a sales brochure. You read about the “cloud bandwagon” (marketing can mix metaphors with reckless abandon) and whether it’s leaving your business behind. You’ve probably come across the pitfalls in “migrating to the cloud paradigm” (see what I mean?). And you read the latest war dispatches from the “cloud revolution” where nothing from the client/server era is left standing in the wake of the march against the stalwart loyalists and renegade, PC-based holdouts.
So, as the articles and the brochures and the literature keep trumpeting: What does your business need to do to get to the cloud before your competitors do?
In just the last quarter-century, information technology revolutions seem to have become so commonplace that the adjective “revolutionary” by itself evokes as much skepticism as upheaval. At a typical business convention, you’ll see a buzzword topic like “virtualization” or “cloud” or “Agile” presented to attendees as revolutionary. And there, you’ll often find presenters stopping just short of condemning you for having failed to take part. For something that’s supposed to be powerful enough to remake the world in its own image, you have to wonder why your reticence alone is capable of stopping it.
A real trend is one that’s self-sustaining, self-motivating, “self-propelled.” Although marketing does tend to portray products or ideas as trends, it’s often wishful thinking. Viewed in that light, a real revolution changes everything, sometimes whether you like it or not.
Thus far, cloud dynamics has not quite changed everything about the global nature of business or the economy. We have yet to awaken and find ourselves in a Technicolor Land of Oompa-Loompas where the rules of the modern economy have been whimsically rewritten by elves. It’s premature to declare “cloud” the subject of a revolution. Historically, only successful events have been worthy of the moniker. The English monarchy would not be referring to the American Revolution as such, had it not been completely and brilliantly successful.
Can cloud dynamics become revolutionary, however? Yes. Most of the basic ingredients are there. The formula is in place for achieving widespread and sweeping improvement to the way we work. Whether history will judge the still-emerging concept as revolutionary will depend not upon how much has been disrupted, but instead upon how much the resulting reformation yielded measurable, lasting improvement. Disruptions are pointless unto themselves.
Revolutions in IT have been tried before. Most of them have failed, some spectacularly. Veteran CIOs, admins, and IT managers are wary of jumping onto another bandwagon simply because it’s there. They see this thing called “cloud” and, with good reason, they wonder why such a fluffy euphemism is necessary for a technology that supposedly has such innate potential. They’re not ready to take a bow for the new revolution just yet. Actually, their more likely response is like that of Marvin, the Paranoid Android from The Hitchhiker’s Guide, when told a whole new life awaits him: “Oh, no, not another one!”
What a cloud revolution clearly lacks at present is the same ingredient that was lacking the last several times business revolutions have been tried: consensus.
Ground Zero: Competitive Advantage and the “Value Chain”
The way to gauge whether cloud dynamics is a more effective catalyst for an IT revolution than its predecessors is to look at how those predecessors fared through history. The idea that information technology can revolutionize business is not only old but, in some respects, actually stale.
The 1960s saw the rise of the first business computers as essentially clerical aids. In the following decade, the first management-oriented applications were geared toward improving the decision-making process for businesses. These two decades constituted the first “eras” of the short history of business computing, such as it was at the time. But just as ’60s computers did not automatically improve corporations’ accounting abilities, ’70s computers did not lead to better decisions. Few people besides those tasked with the day-to-day operation of the machines truly understood what they were and what they were doing. And the software (if it could be called that) was not even intended to be directly used by the people whom it was supposed to directly benefit.
So it was in the late 1970s that the trend of management information systems (MIS) was stalling out. To reverse this trend, a few corporations made a limited, though noteworthy, effort to make the adoption and implementation of information systems (IS) in the workplace follow a standardized model. At this time in history, the real point of this model was to help executives understand that change was slow, and that this native slowness could be moderated by considering the process as naturally divisible into stages, like the growth of an adolescent or the overcoming of an addiction.
In 1973, Harvard Business School Professor Richard L. Nolan noticed a correlation between the innovativeness of corporations and the amounts of their expenditures and investments in IS. In studying the patterns in which IS was integrated into institutions, and co-founding a consulting firm to advise firms about his discoveries, Nolan had identified six stages in the management lifecycle (as it’s called now) of information systems by 1979.
The Nolan Model was not an ideal projection of how IS should work, but rather a composite of observations about how it did work, in the era before the PC. Business computing in an organization basically evolved like this:
Initiation – First, well, you buy a computer. You know it’ll pay off sometime, but first, you’ve got to install it. Only a few people know how to use the computer at this point, and only a subset of those (sometimes as few as one person, sometimes zero) is appointed the planner, who figures out what should be done with the thing.
Contagion – Upper-level management finally catches on to what this planner person has been saying, and begins assuming the planning role for itself. Inevitably, too many people get involved in this phase, and there’s an over-investment in applications, many of whose purposes overlap.
Control – This is when executives apply the brakes to the limitless spending phase, and insist that some formalized standards be applied. Here is where the evolution stops being about data processing, as it was perceived in the ‘70s, and starts following the dictums and methodologies of information systems. The role of the first IS departments is to execute and manage the formal plan, in order to keep costs under control.
Integration – During this phase, naturally, the costs of keeping costs under control, coupled with the costs of hiring new people to ascertain the costs of keeping costs under control, skyrocket. (Either Nolan has a brilliant sense of humor or he’s very keenly perceptive.) But the rise of costs does start to taper off, as the recommendations of IS departments (at one time called “steering committees,” reflecting the intentionally non-permanent nature of the role) start to take effect.
Data administration – A new role emerges from the steering committees, dedicated not so much to managing people’s involvement and interaction with information as to managing the information itself. The IS administrator is the person assigned the task of integrating the functions of all the applications that were purchased during stages 1 and 2, in an era before operating systems offered any assistance in this regard. What we now call “cut-and-paste” was a process reached through considerable toil, trial, and error by the administrator.
Maturity – This is, well, the goal post. Nolan observed that organizations reach this final stage of evolution once users of applications accept and even embrace their subordinate roles with regard to applications (they no longer claim ownership of them just because they know how to use them), acknowledging the administrator’s authority and assisting where possible in the maintenance and integrity of the data.
In this early phase of business computing, Nolan recognized that businesses tend to dive into the deep waters of technology without knowing what they’re getting into. The learning process becomes a coping process, as well as one of mitigating the damage caused by inexperience and poor planning — characteristics which Nolan unashamedly applied to essentially everyone. This is why the Nolan Model looks more psychological than technological. Through consultations with Nolan and his peers, organizations came to recognize the six phases and accept them as necessary to their overall growth.
Then businesspeople saw their own sons and daughters successfully programming astrophysics simulations on their Apple IIs and TRS-80s, and they became envious. Here were individuals, outliers, kids who appeared to have jumped all the way to Stage 6 without the aid of a business plan, a formal methodology, an administrator, or even a high school diploma. Their astounding success with the first microcomputers was an indication to businesses that, with all respect to Prof. Nolan, they were missing something pretty obvious.
The first real information systems revolution in the enterprise became a face-saving effort.
The Dawn of the CIO
The notion that someone in the organization at an executive level, rather than management, should be ultimately responsible for information systems strategy was first put forth in 1981 by Dr. William Gruber and William R. Synnott, in a book entitled Information Resource Management: Opportunities and Strategies for the 1980s. (They actually had the idea in 1976, but publishers’ lead times during this period could be excruciating.) Their thinking was this: The first data processing systems were devoted to financial information and, as such, the responsibility for procuring, managing, and eventually replacing those systems typically fell upon the Chief Financial Officer. Gruber and Synnott realized a fundamental truth which, over three decades, has not changed even slightly although it tends to be forgotten: The people responsible for any project in a large organization are the ones who pay for it.
Because the CFO is typically involved with day-to-day operations (the COO role had yet to take root in large companies at this time), the two pointed out, he is not in a position to perceive long-term strategies. Senior management and executives’ needs with respect to business information are long-term, and how they utilize this information — Synnott and Gruber realized this for the first time — is critical to the structure and function of their organization. So because long-term strategy and short-term operations should be separate functions, they proposed the Chief Information Officer, an executive responsible not just for information systems but for the information they produce.
In a 2010 interview with CIO Insight magazine, Dr. Gruber referred back to his and Synnott’s book:
In the 30 years since I co-invented the CIO, the trend was as forecasted in my book: The CIO has become more strategic, more involved with the strategic direction of their companies, more visible as a corporate spokesperson, and less involved directly with the lower level operations of IS such as technology operations and new system development.
The seeds for the first concerted effort toward an IS revolution were planted by Prof. Michael E. Porter in 1985, with the publication of Competitive Advantage: Creating and Sustaining Superior Performance. Five years earlier, Porter had the vision of how firms establish competitive position, either by delivering cost advantages (discounts) or quality advantages (premiums) to the customer — to the customer, there’s really no other way. But since 1980, Porter had been playing with the idea of how differentiation in business processes enables one or the other to happen.
As a result, Competitive Advantage was one of the first published works to explain value as the cumulative sum of all the processes involved in delivering the product or service to the customer, including research, requisition, negotiation, delivery, and support.
Perceiving value similarly to how a plant accumulates nutrients from the air and soil to produce oxygen, Porter plotted what he called the value chain, and postulated that all the systems in an organization contribute to its final quantification of value. As Porter and Victor E. Millar explained for Harvard Business Review, “A business is profitable if the value it creates exceeds the cost of performing the value activities. To gain competitive advantage over its rivals, a company must either perform these activities at a lower cost or perform them in a way that leads to differentiation and a premium price (more value).”
Wrote Porter in Competitive Advantage in 1985:
Competitive advantage cannot be understood by looking at a firm as a whole. It stems from the many discrete activities a firm performs in designing, producing, marketing, delivering, and supporting its product. Each of these activities can contribute to a firm’s relative cost position and create a basis for differentiation. A cost advantage, for example, may stem from such disparate sources as a low-cost physical distribution system, a highly efficient assembly process, or superior sales force utilization. Differentiation can stem from similarly diverse factors, including the procurement of high quality raw materials, a responsive order entry system, or a superior product design... The value chain disaggregates a firm into its strategically relevant activities in order to understand the behavior of costs and the existing and potential sources of differentiation.
The seeds are very deeply planted in this paragraph, but their tips are showing: “order entry system,” “sales force utilization.” Porter approached the topic of business value from a business perspective, rather than one of information systems — which, at the time, might have sounded convoluted. But business experts soon inferred that information systems played a core role in all of the processes that Porter used as examples of cost advantage and differentiation.
Like a message passed between kids playing the “telephone game,” the Porter model tends to lose its distinctiveness when re-explained by a variety of people. As presented here, using Porter’s original phraseology, all of the vertical primary activities are interwoven with all of the horizontal support activities — all of the support activities pertain to the primaries. Note Porter did not actually call the support functions “secondary activities,” despite the proclivity of subsequent experts to assume that anything that doesn’t belong in category #1 must belong to category #2. Porter did not — did not — characterize “technology development” as secondary or subservient. This is a misinterpretation. In fact, everything in the horizontal tiers is critical to everything in the vertical. (Unless you care to explain why “planning” is a secondary function.)
Despite being one of four support activities in Porter’s value chain model, IS had become the cornerstone of his competitive advantage methodology. Partnered with Victor E. Millar, he suggested a five-step process for ascertaining how IS could be used to exploit latent competitive advantages, back at a time before IS was being utilized everywhere. Essentially, the process was this:
1. Assess how much information each business unit uses, for what Porter and Millar called information intensities.
2. Determine how the utilization of that information would change for each unit once IS is employed there. Not for the company as a whole first, but unit by unit.
3. Using some of Millar’s charts as guides, identify and rank on a quantitative scale the degree to which utilization of IS could create competitive advantage.
4. Investigate how the generation of that advantage may be exploited a second time (not unlike a turbocharger’s role in a car’s drivetrain) to create new business functions, exploiting new products or services.
5. Develop a strategic plan for the implementation of IS, keeping those subsequent opportunities in mind.
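The ranking step in particular lends itself to a simple illustration. Here is a minimal sketch, in Python, of how a planner might tabulate information intensity and rank business units by potential advantage. The unit names, scores, and scoring formula are all invented for illustration; they are not drawn from Porter and Millar’s actual charts.

```python
# Hypothetical sketch of ranking business units by the potential competitive
# advantage of deploying IS, loosely following Porter and Millar's five steps.
# All unit names, scores, and the scoring formula are invented.

from dataclasses import dataclass


@dataclass
class BusinessUnit:
    name: str
    information_intensity: int  # step 1: how much information the unit uses (0-10)
    projected_change: int       # step 2: how much IS would change its work (0-10)

    def advantage_score(self) -> int:
        # Step 3: a crude quantitative ranking -- units that are both
        # information-heavy and highly changeable by IS rank first.
        return self.information_intensity * self.projected_change


units = [
    BusinessUnit("order entry", 9, 8),
    BusinessUnit("physical distribution", 6, 7),
    BusinessUnit("assembly", 4, 3),
]

# Steps 4 and 5 (finding second-order opportunities and writing the
# strategic plan) would start from this ranked list.
for unit in sorted(units, key=BusinessUnit.advantage_score, reverse=True):
    print(f"{unit.name}: {unit.advantage_score()}")
```

The point of the sketch is only that the method asks for a unit-by-unit, quantitative comparison before any company-wide plan is drawn up.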
In order for this five-step plan to even be feasible, however, Millar and Porter suggested in “How Information Gives You Competitive Advantage,” the strategic function of IS should be wrested from the monopoly of the data processing department (EDP). Workers whose sole purview was the support activities of the value chain, they argued, cannot strategize with respect to the value chain as a whole. Instead, they suggested that “an IS manager should coordinate the architecture and standards of the many applications throughout the organization, as well as provide assistance and coaching in systems development.”
Coordination sounds like fun. But it doesn’t happen automatically. This is where business leaders decided to build a comprehensive strategy for coordination.
The First Shot Heard ‘Round the World: SISP
Strategic Information Systems Planning is the first genuine effort at a business revolution triggered by the need to pare down Nolan’s six stages, and implement Porter and Millar’s five steps. SISP was born during the era when business conferences were preaching that companies could achieve higher goals by repeating mantras and even praying to new gods that wore silk neckties. Amid all that, SISP took the more practical route, professing this: If systems of work were patterned after systems of information processing, then improving the way we process would in turn improve how we work. And that improvement, in turn, would enable organizations to think beyond their existing capabilities, and conceive new and formerly unachievable goals.
Put another way, if businesses knew what they were getting into to start with, the path to maturity wouldn’t always have to be a coping mechanism.
To accomplish this goal, SISP rebuilt Porter’s value chain into something of an IS flowchart. It demonstrated that an alignment of the goals of IS leaders with those of business leaders would lead to a system where any improvements to IS would directly translate to increased business value, and thus competitive advantage.
At this time in history, “I/T” with a slash referred to the skill of designing and implementing computer technology, while “I/S” with a slash referred to the installed base of hardware and software. Remember, this was still a time when folks thought a “network” was NBC or CBS, “services” were held on Sunday mornings, and “strategy” was something lacking in the US’ pursuit of victory in Vietnam.
Strategy was indeed something people typically associated with warfare rather than business. So one of the first consultants to draw the connection between strategy and IS framed the resulting issue in the light of combat. Charles Wiseman produced a series of articles throughout the 1980s that re-cast Porter’s competitive value as a prize seized by businesses on the field of battle. In the 1984 article “Creating Competitive Weapons from Information Systems,” co-authored with NYU Professor Ian C. MacMillan, Wiseman presented real-world examples of major IS projects, such as American Airlines’ SABRE reservation system, to prove a point: Information systems could take down competitors the way combat systems take down enemies.
Here, Wiseman and MacMillan coined the “SIS” part of the phrase, and defined it thus: “computerized information systems used to support an organization’s competitive strategy, its strategy for gaining advantage over its competitors.” They continued:
Emerging from the convergence of technological innovations in information processing (including telecommunications) and competitive forces reshaping industry landscapes, strategic information systems form a new variety of information system, radically different in organizational use from traditional management information systems and from the more recent decision support systems. As competition increases, information systems will become critical to gaining a competitive edge.
In their primordial version of the SISP planning grid, they suggested businesses begin their planning by making two classes of choices, which they called the thrust and the target. The thrust was the basic goal, which they also perceived as the propellant for the proverbial missile: whether it’s differentiation of the nature of labor from the way competitors work; cost reduction that can be passed on to customers; or innovation, usually with respect to the marketing and/or distribution channels. (Note they distinguished “differentiation” from “innovation,” rather than equating them.) The target was the party most affected by the deployment (another combat-derived term): suppliers, customers, or competitors.
There were supplemental issues to be determined, such as the direction of thrust (whether the information was to be used as input to the system or the output from it), and to what extent the three “generic skills” of I/S (processing, storage, and transmission) could be utilized. At the time, these must have seemed like esoteric questions, because much of the later literature on SISP that acknowledges Wiseman’s contribution actually omits the gist of it, especially the combat metaphors. Finally, there was this stipulation:
Developing an effective information system is not done without effort. To achieve maximum benefit requires a joint effort on the part of line management and information systems managers. The best way to do this systematically is to have them jointly develop a competitive strategy for the business unit. As most planners are well aware, this process involves (1) assessing the current competitive strategy and position, (2) assessing environmental factors affecting the business, and (3) developing a strategy to meet the anticipated challenges over the planning period. What is less obvious is that substantial strategic information system opportunities can be uncovered at all stages in the process.
Here was the first clear mandate that a real strategy, derived from the world of real competitive advantage, can only be accomplished when the people responsible for production (the “line management”) and those maintaining the systems, work together. The common thread of all SISP documents — including the numerous ones that followed claiming to be the first one ever — was the ideal of aligning IS/IT with management.
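Wiseman and MacMillan’s planning grid reduces to a small, concrete set of choices, which can be sketched as plain data. The pairings below are hypothetical examples of my own, not Wiseman’s; only the three thrusts and three targets come from the article.

```python
# Minimal sketch of Wiseman and MacMillan's planning grid as data:
# each strategic information system (SIS) candidate pairs a thrust
# (the basic goal) with a target (the party most affected).
# The example projects are hypothetical.

from enum import Enum


class Thrust(Enum):
    DIFFERENTIATION = "differentiation"
    COST_REDUCTION = "cost reduction"
    INNOVATION = "innovation"


class Target(Enum):
    SUPPLIERS = "suppliers"
    CUSTOMERS = "customers"
    COMPETITORS = "competitors"


# Invented candidates; the first is in the mold of an airline
# reservation system aimed at winning customers.
candidates = [
    ("reservation system", Thrust.INNOVATION, Target.CUSTOMERS),
    ("automated reordering", Thrust.COST_REDUCTION, Target.SUPPLIERS),
]

for name, thrust, target in candidates:
    print(f"{name}: {thrust.value}, aimed at {target.value}")
```

The grid’s value was in forcing planners to name both choices explicitly before any system was built.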
Strategic Re-alignment
Too many of the world’s would-be revolutions are memorialized by post-mortem analyses. A 1988 study by Profs. Albert Lederer and Vijay Sethi published in MIS Quarterly, which examined a multitude of different methodologies and techniques for applying SISP, revealed why SISP did not appear to be working for too many enterprises. Their findings surprised readers then, but would not do so today: When executives or senior management initiated the SISP plan, IT (or “IS”) didn’t get interested in it. Conversely, when IT initiated the plan, neither executives nor senior management were there to support it. In organizations without a CIO, where the “Director of Data Processing” (DDP) did not report to an executive, little was accomplished. When either side of the company tried to make the problem simpler by breaking it down into departments or silos, they only slowed the plan down. And among those who completed the plan, fewer than half registered satisfaction with the results.
In August 1989, John C. Henderson and N. Venkatraman, two researchers with MIT’s Sloan School of Management, published a brief work entitled “Strategic Alignment: A Framework for Strategic Information Technology Management.” Here, Henderson and Venkatraman equated Porter’s concept of value in terms of business performance and customer expectations with value in terms of information systems efficiency; it effectively said these values are one and the same. They believed that, if the business management and information management branches of an organization not only worked together but set common objectives for each other, the organization could craft new competitive advantages through the re-architecture of the value chain — advantages that were not only impossible before but inconceivable.
The only way this could be accomplished, the researchers claimed, was the way you solve a complex, multi-variable algebra equation: by factoring business processes into units of work that can be automated. The modern concept of business process management began with this idea.
Given that the researchers used the phrase “I/S” or “I/S infrastructure” to refer to what we usually call “IT infrastructure” today, here is how they introduced the notion that information systems could redefine business:
The cross-domain alignment between business strategy and I/S infrastructure and processes depicts a classic linkage view prevalent today. The other type of alignment between I/T strategy and organizational infrastructure and processes reflects a view of automation of the work environment. Specifically, creating a linkage between business strategy and I/S infrastructures and processes requires the specification of work processes, roles and authority structures in order to relate how the I/S products and services will impact the business strategy. That is, the business strategy must be decomposed into work processes in order to define the requirements of the I/S infrastructure and processes.
The automation type of cross-domain alignment represents the potential for emerging technology to change or alter organizational processes. This view emphasizes the potential value of I/T and how the I/S infrastructure and processes provide a service organization to support this potential.
“I/S journalists” of the time dubbed this the “SISP Revolution.” It was the beginning of this ideal: If we break down everything a business does into its indivisible, constituent processes, then the simple automation of these processes would point to improvements in efficiency and perceived value that can lead to competitive advantage. It’s like the way a mathematician simplifies a formula by breaking it down into its basic steps, and then eliminating the redundant ones. If we get rid of inefficiency, the experts concluded, we reduce costs. Then through reinvestment, we can fortify the business to produce goods and services in new and innovative ways — and to the extent that customers perceive them as innovative, they will accumulate new and quantifiable business value.
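The decomposition ideal can be illustrated with a toy example. The sketch below, with invented process names, breaks a workflow into constituent steps and eliminates the redundant ones — the same simplification, in miniature, that the SISP literature imagined applying to whole businesses.

```python
# Toy illustration of the SISP decomposition ideal: break a workflow
# into its indivisible, constituent steps, then eliminate redundancy.
# The workflow and its step names are invented.

order_to_cash = [
    "take order",
    "check credit",
    "check inventory",
    "check credit",   # a redundant step, repeated by a second department
    "ship goods",
    "invoice customer",
]


def eliminate_redundancy(steps):
    """Keep each step's first occurrence, preserving order."""
    seen = set()
    streamlined = []
    for step in steps:
        if step not in seen:
            seen.add(step)
            streamlined.append(step)
    return streamlined


# The second credit check adds cost but no value, so it drops out.
print(eliminate_redundancy(order_to_cash))
```

The hard part, as the rest of this section shows, was never the arithmetic of simplification; it was getting two departments to agree on what the steps were.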
It made such sense at the time. But as a 1993 study for MIS Quarterly by Prof. Michael J. Earl of London Business School made clear, at least one-third of participants were dissatisfied with the results of various approaches to SISP, including any approach at all in which cross-domain alignment was a critical factor. Citing Lederer and Sethi directly, Prof. Earl wrote:
Implementation was a common concern. Even where SISP was judged to have been successful, the resultant strategies or plans were not always followed up or fully implemented. Even though clear directions might be set and commitments made to develop new applications, projects often were not initiated and systems development did not proceed... Evidence from the interviews suggests that typically resources were not made available, management was hesitant, technological constraints arose, or organizational resistance emerged. Where plans were implemented, other concerns arose, including technical quality, the time and cost involved, or the lack of benefits realized. Implementation concerns were raised most by IS directors, perhaps because they are charged with delivery or because they hoped SISP would provide hitherto elusive strategic direction of their function.
SISP — a methodology rooted in the idea that business management and information systems management must align both their goals and processes with one another’s to achieve unrealized business value — was failing in organizations where these departments failed to align. On its face, that should not shock anyone. But Prof. Earl’s hard analysis revealed that desire alone was not a sufficient impetus for the two disparate departments of an organization to coordinate all the resources, training, and documentation necessary for them to achieve harmony. Both sides were relying on each other to provide the revolution. In the next article in this series, we’ll pick up the story from this point, revealing how SISP’s leaders tried to save the ideal with some patches and bug fixes.