State of Change, Chapter 12: Distribution and the Supply Chain
Building a company that distributes goods and services to a widely diverse customer base is both an art and a science. The art is the ability to envision a clockwork production and distribution mechanism whose scale — which could very well encompass the planet — is matched by its efficiency. There is no instruction manual for cultivating this ability.
But there are plenty of words to explain, or attempt to explain, the science. A supply chain is a machine, and the environment in which it operates is an economy. The function of an economy is to provide for the needs of customers; and the very reason warehouses exist in the first place is to meet customers’ anticipated needs. The science of producing just enough product to meet these needs for a reasonable period of time, such as three months, is guided by many theories. Nearly all of these theories, at least in the last half-century, are predicated upon the sharing of information among the responsible parties: suppliers, manufacturers, shippers, packagers, and marketers. And in fewer and fewer scenarios around the world these days are all of those parties attached to a single corporation.
In an earlier article in this series, I brought up the topic of Electronic Data Interchange. Every manufacturer that deals with a major retailer, every producer that works with a shipper, and every business whose operations depend on or consist entirely of logistics is perhaps more intimately familiar with EDI than it would care to be. EDI tends to be the motivating force for business revolutions when business leaders frame it as an ideal rather than an interface — as a set of policies for ensuring that transactions between partnered businesses maintain a prescribed format and workflow.
Of the many models of inter-organizational systems (IOS for short, no relation to Apple) created over the last century, EDI is perhaps the only one to emerge from the realm of theory into practical use. By documenting the stages of a business process, and producing those documents in a form that business partners can readily interpret, EDI provides a language for explaining how business is done. The inter-organizational model allows partners to perceive their precise role in the process.
Distinguishing the Cloud from the Silver Lining
In recent months, vendors have been re-introducing EDI in the context of some kind of cloud motif or framework, as though the concept natively belonged there. So far, I’ve isolated four distinct classes of EDI deployments that their service providers characterize as “in the cloud,” each of which may be cloud-based to a greater or lesser degree:
Webforms-based EDI processing services, very similar if not identical to the kind proliferated since the 1990s. These services present members of a supply chain with rather straightforward ways to produce common business documents, such as invoices and shipping notices, in formats that are translated from HTML to a common EDI format in the background. Vendors here are trying to integrate a few aspects of cloud dynamics where they can, usually by introducing metered services (payment by the month, or by quantities of documents processed).
Multi-tenant EDI applications hosted by SaaS providers, some of which offer services to EDI trading partners throughout the supply chain. The main benefit offered by hosted applications such as Logility Cloud Services is a flexible licensing model. The potential benefit for multiple tenants, however, lies in the service acting as a go-between, translating EDI documents to whatever formats they may require. Arguably, such applications may not be “true cloud,” though they may indeed run on cloud platforms such as Amazon AWS, and may serve to resell cloud infrastructure to SaaS customers.
Outsourced EDI specialists such as RedTail Solutions, which provide customers with a kind of service portal for EDI processing, including transactions that may require human oversight (for example, shipping label printing and distribution). This class of specialists is aware that its customers have already invested in enterprise resource planning (ERP) applications such as Sage 100. So rather than hosting an alternative to ERP, RedTail serves as an outsourcing candidate for business processing, giving Sage direct links to its service portal that replace a company’s internal network connections to its operations division.
EDI service platforms such as Microsoft’s new BizTalk Services, which operates on its Windows Azure cloud platform (at the time of this writing, as a “preview”). This class is intended for a company’s internal developers who build and maintain their own custom apps on-premise. BizTalk is made available as Web services that are contacted by apps running either natively or in a browser. Such services can be set up to connect directly with on- or off-premise ERP applications from providers such as SAP and PeopleSoft.
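To make that last class a bit more concrete, here is a minimal sketch of what an EDI exchange reduced to a Web service call might look like from a developer's chair. The endpoint, payload shape, and authentication scheme are hypothetical stand-ins of my own, not BizTalk Services' actual interface; the point is only that the document exchange becomes an ordinary API call from the company's own application.

```python
import requests

# Hypothetical cloud EDI bridge endpoint -- a stand-in for illustration,
# not an actual BizTalk Services URL or message format.
EDI_SERVICE_URL = "https://edi-bridge.example.com/v1/transactions"
API_KEY = "..."  # issued by the hypothetical service provider

# A business document expressed in the application's own terms; the hosted
# service would be responsible for mapping it to the partner's format.
invoice = {
    "document_type": "invoice",
    "trading_partner": "RETAILER-001",
    "invoice_number": "INV-2013-0042",
    "lines": [
        {"sku": "WIDGET-7", "quantity": 100, "unit_price": 9.25},
    ],
}

response = requests.post(
    EDI_SERVICE_URL,
    json=invoice,
    headers={"Authorization": "Bearer " + API_KEY},
    timeout=30,
)
response.raise_for_status()
print("Accepted, tracking ID:", response.json().get("tracking_id"))
```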
As you can see, the phrase “cloud-based EDI” can mean dramatically different things, depending on who’s giving the definition. In truth, EDI is an automation process for business, whereas cloud computing is a methodology for technology. Both can be extended in ways that improve business processes — EDI by way of management, and cloud by way of organization. But they are different classes of technology.
EDI gave rise to the first wave of business information networks, long before the Internet and its low-level networking protocols entered everyday business use. It established a conversational model of business, not through the sharing of common information resources but rather through the exchange of documents (e.g., purchase orders, invoices, shipping notices) across bridges that connect those resources. The encoding of those documents in a common format was originally conceived as a paper saver — an alternative to printing them out and mailing them between business partners, or stamping them out on punch cards and shipping them in crates.
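For readers who have never actually seen one of these documents, here is roughly what one looks like on the wire, along with a toy parser. The example is a sketch loosely modeled on an ANSI X12 purchase order, heavily simplified: a real interchange carries ISA/GS envelopes, control numbers, and many more segments than shown here.

```python
# A drastically simplified purchase-order transaction set, loosely modeled
# on ANSI X12: segments end with "~", elements are separated by "*".
RAW = (
    "ST*850*0001~"
    "BEG*00*SA*PO-10042**20130301~"
    "PO1*1*100*EA*9.25**VP*WIDGET-7~"
    "CTT*1~"
    "SE*5*0001~"
)

def parse_segments(raw, segment_terminator="~", element_separator="*"):
    """Split a flat EDI string into (segment_id, elements) pairs."""
    segments = []
    for chunk in raw.split(segment_terminator):
        if not chunk:
            continue
        elements = chunk.split(element_separator)
        segments.append((elements[0], elements[1:]))
    return segments

for seg_id, elements in parse_segments(RAW):
    print(seg_id, elements)

# Both partners can read this because the segment and element layout is
# fixed by the standard, not negotiated document by document.
```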
Ideally, with EDI, fewer humans are directly involved in the process of overseeing every single document that businesses must exchange. Instead, the existence of trading partner agreements enables partners to expressly validate entire classes of documents that they should expect to exchange in the normal course of business. As long as documents are processed correctly and in the sequence those agreements specify, they are presumed valid.
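Here is a crude sketch of what that presumption of validity can look like in practice: the agreement whitelists the document types a partner may send and the sequence in which they should arrive, so anything that conforms passes without a human reading it. The structure and field names below are my own illustration, not drawn from any particular EDI product.

```python
# Illustrative trading partner agreement: which transaction sets the partner
# may send, and the order they are expected to arrive in for one purchasing
# cycle. (Field names are invented for this sketch.)
AGREEMENT = {
    "partner": "RETAILER-001",
    "allowed": {"850", "855", "856", "810"},   # PO, PO ack, ship notice, invoice
    "expected_order": ["850", "855", "856", "810"],
}

def validate_exchange(received_docs, agreement):
    """Accept the batch only if every document type is permitted and the
    sequence matches the agreement; otherwise flag it for human review."""
    for doc_type in received_docs:
        if doc_type not in agreement["allowed"]:
            return False, "document type %s not covered by agreement" % doc_type
    if received_docs != agreement["expected_order"][: len(received_docs)]:
        return False, "documents arrived out of the agreed sequence"
    return True, "presumed valid"

print(validate_exchange(["850", "855", "856"], AGREEMENT))
print(validate_exchange(["850", "810"], AGREEMENT))  # invoice before ship notice
```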
The alternative to the adoption of a network IOS is for all the partners in business transactions to share a common database, or to have secured access to one. There have been times, especially during the period when Web portals were being pushed as business panaceas, when this notion was not treated as ludicrous. Some so-called “collaboration platforms” consist of systems where business partners collectively view common, shared documents pertaining to the status of ongoing transactions — and, now that you think about it, you can probably recall exactly what I’m talking about. And, oh yes, these things are indeed being marketed nowadays as cloud collaboration platforms.
Where such collaboration platforms fail (which happens more frequently than is typically reported), it’s usually because shared documents such as order tracking lists, advance shipment notices, and inventory lists do not maintain adequate audit trails. Mistakes cannot be rolled back. Those responsible for errors plead innocence, and non-repudiation — the state where the validity of every document in the transaction chain can withstand any challenge — cannot be guaranteed. Errors end up being everyone’s fault.
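A tamper-evident audit trail is one of the simpler safeguards these shared-document platforms tend to omit. The sketch below chains each recorded event to the hash of the one before it, so a quiet after-the-fact edit breaks the chain. True non-repudiation would also require each party to digitally sign its entries, which I've left out for brevity.

```python
import hashlib
import json

def append_event(log, actor, action, document_id, payload):
    """Append an audit entry whose hash covers the previous entry,
    making silent, after-the-fact edits detectable."""
    previous_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "actor": actor,
        "action": action,
        "document_id": document_id,
        "payload": payload,
        "previous_hash": previous_hash,
    }
    serialized = json.dumps(entry, sort_keys=True).encode("utf-8")
    entry["hash"] = hashlib.sha256(serialized).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Recompute every hash; any mutation or deletion breaks the chain."""
    previous_hash = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["previous_hash"] != previous_hash:
            return False
        serialized = json.dumps(body, sort_keys=True).encode("utf-8")
        if hashlib.sha256(serialized).hexdigest() != entry["hash"]:
            return False
        previous_hash = entry["hash"]
    return True

log = []
append_event(log, "supplier", "issued", "ASN-7781", {"cartons": 14})
append_event(log, "retailer", "accepted", "ASN-7781", {})
print(verify(log))                   # True
log[0]["payload"]["cartons"] = 12    # someone quietly "corrects" history
print(verify(log))                   # False
```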
Put another way, there’s a reason it’s called “supply chain.” Suppliers need the ability to perceive the status of the shipping process, and both suppliers and shippers need to perceive the status of manufacturers... who often get their marching orders from marketing, or sales, or both.
The Visibility Dilemma
By the early 1990s, still years before the advent of the Web, business experts were compiling very long lists of reasons why EDI implementations were failing. Most of these lists had one common element: Once businesses standardized a process, it stuck to that standard. And its unyielding rigidity kept it from adapting to the often equally rigid requirements of potential partners in the supply chain.
As a result, when one component of the supply chain made necessary strategic shifts, the other links in the chain that relied on that fixed, standardized way of working were forced to make reactive adjustments.
In 1992, then-Stanford professor Hau Lee and then-HP fellow Corey Billington noted some of these more frequently occurring shifts in the supply chain, in an article for MIT’s Sloan Management Review. In one instance, a parts manufacturing plant opted to cut its inventory, since its performance was being judged by how low its inventory was kept. This increased the plant’s response times to its customer’s distribution centers, forcing them to respond by keeping their inventory levels high in order to reduce the number of pending orders.
When the chain of transactions in an EDI workflow becomes tighter, Lee and Billington found, the flow of events in that chain makes it vibrate like a string, carrying the force of each event’s side-effects down the line for everyone else to feel. Any reactionary event is more apt to cause, quite understandably, a chain reaction. In the early ’90s, such circumstances typically led business experts and consultants to advise their clients to become more flexible, less rigid.
This trend was noted in 1994 by a team led by Ian Graham of the University of Edinburgh, in a report for the European Commission on EDI’s social impact in Europe. Graham’s team wrote:
One perspective is to view EDI as a socially neutral technology, in which the existing relationships between organisations are “edified;” documents previously transferred by mail being transferred electronically. This narrow vision predicts negligible impacts on industrial structure and modest employment implications as routine clerical operations are eliminated at the interface between firms and in the postal system, but leaving the internal operations of organisations untouched. The alternative paradigm is to view EDI as an enabler of more radical changes in the relationships between organisations. EDI may be viewed as a technology which, by reducing the cost of transferring information, may fundamentally alter industrial structure, for example, by replacing intermediary brokers with an electronic market, or by dispersing operational control through the supply-chain. Interest in this radical conception is closely related to the modish concept of “business process redesign,” which focuses on the use of IT internally as a central element in radical change.
The word we started hearing with respect to that “radical change” was “agile.” The introduction of this notion led to the first movement away from EDI.
But the first movement was short-lived. Soon after the Web portal “revolution” began and ended in the late ‘90s, businesses needed some mechanism for standardizing the events being stored in a new breed of business databases. They turned back to EDI, noting that it appeared to fulfill three fundamental functions:
It applied a common vocabulary for business transactions, ensuring that multiple companies potentially stationed across the globe would at least agree upon the meanings of certain things;
In the era before virtualization and the cloud, it prescribed a common technological infrastructure, to ensure that messages from one EDI-compliant system were structurally (if not syntactically) compatible with messages from another; and
It gave businesses a common, sensible list of things that needed to be secured, so that they could begin to understand how to secure them.
EDI divided the responsible parties in the document exchange process into discrete components. By the 2000s, the list of those components had become standardized:
The transaction application serves as the business’ electronic switchboard and liaison, sending and receiving EDI documents on the company’s behalf.
Value-added network (VAN) is a name that someone’s marketing department coined for a multi-lingual mailbox service. The VAN encodes transacted documents in one of the standardized formats that, in many cases, governments actually prescribe for doing business in America (ANSI X12) and Europe (UN/EDIFACT). In the pre-Internet era, the VAN carried the business’ electronic mail, which was delivered using one of the many information services (UUNET, GEIS, MCI) that would eventually agree to permanently link their services to make the Internet possible.
An EDI translator is a go-between that validates the format and syntax of EDI transactions, and gathers a log of all such transactions between business partners for validation and auditing.
An EDI mapper is the counterpart to the translator, specifying the formats and attributes of transactions as required by the party on the receiving end. Some hosted services offer a combination of translator and mapper; or of translator, mapper, and VAN, as something they call a data bridge, in hopes that groups of partnered companies would collectively subscribe to them.
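To see how these four roles divide the labor, consider a minimal end-to-end sketch: the transaction application emits a document in the business's own terms, the mapper reshapes it to the receiving partner's layout, the translator renders and validates a standard format and keeps the log, and the VAN carries it. Everything here, from the names to the toy "standard," is my own illustration rather than any particular EDI product.

```python
# Illustrative division of labor among the four standardized EDI components.

def transaction_application(order):
    """The business's electronic switchboard: hands off documents to be sent."""
    return {"type": "invoice", "partner": "RETAILER-001", "body": order}

def edi_mapper(document, partner_requirements):
    """Reshape the document's fields into the layout the receiving partner requires."""
    layout = partner_requirements[document["partner"]]
    return {field: document["body"].get(source) for field, source in layout.items()}

def edi_translator(mapped, audit_log):
    """Validate the mapped document, render a toy 'standard' format, and log it."""
    if any(value is None for value in mapped.values()):
        raise ValueError("mapped document is missing required elements")
    rendered = "~".join("%s*%s" % (k, v) for k, v in mapped.items())
    audit_log.append(rendered)
    return rendered

def value_added_network(rendered, partner):
    """The multi-lingual mailbox: queue the document for the partner to pick up."""
    print("delivering to %s: %s" % (partner, rendered))

partner_requirements = {
    "RETAILER-001": {"INV": "invoice_number", "AMT": "total", "CUR": "currency"},
}
audit_log = []
doc = transaction_application(
    {"invoice_number": "INV-2013-0042", "total": 925.00, "currency": "USD"}
)
mapped = edi_mapper(doc, partner_requirements)
value_added_network(edi_translator(mapped, audit_log), doc["partner"])
```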
As business strategists applied themselves to maximizing the usefulness of the components in this standardized toolkit, the goal they eventually foresaw was a kind of unification of business databases — a means, with the aid of automated translators and negotiators, for a company to see into another company’s database. One potential use of such an ability would be to foresee a partner’s customers’ needs. This concept was marketed as supply chain visibility or value chain visibility.
Initially, technology companies put forth the idea that if everything a company produces and ships could be tracked in real-time — a sort of “Internet of things” — then any set of partnered companies utilizing these EDI components would be operating as though they had synchronized their databases. Some have positioned value chain visibility and global data synchronization networks as the same product, and continue to do so today.
A 2010 presentation by the US-branch of non-profit standards organization GS1 (PDF available here) explains it this way: “GS1 manages a global system that allows companies all around the world to globally and uniquely identify their physical things like trade items (products & services), assets, logistic units, shipments, and physical locations and logical things like a corporation or a service relationship between provider and recipient. When this powerful identification system is combined with GS1 BarCodes, EPC tags, eCom business messages, and the Global Data Synchronization Network (GDSN), the connection is made between these physical or logical things and the information the supply chain needs about them. With the connection made, one world of global commerce comes into view.”
In what may be interpreted with 20/20 hindsight as a hedging of its bets, in 2007, IBM began positioning its own value chain visibility initiative as an alternative to EDI — a way to break down the barriers between organizations that EDI components must recognize to do their jobs, and integrate all data, every bit of it, into a new model. In this model, sophisticated business intelligence (BI) tools would marshal the process of deciding which data can or should be seen by which clients. IBM’s 2007 doctrine (PDF available here) attempted to spark a second movement away from EDI by dividing businesses into two camps: those who used EDI, and those who knew what they were doing.
Supply chain executives are at different points in building smarter collaborative visibility capabilities. Some are still struggling with transactional level exchanges and breaking down the silos among functions within the enterprise. In sharing information with their value chain partners, they rely primarily on electronic data interchange (EDI) and are working through standardization and data management approaches to make sense of the information...
But the “Visionaries”... they are pushing ahead by using collaboration among their network partners with business intelligence to make collective and fast decisions. They are using business intelligence and advanced analytics to analyze, monitor and detect changes, from the highest priority events to the most minute transaction, that influence customer service. From adjustments in forecasts due to real-time point of sale or actual orders, to production schedule adjustments from a supplier to in-transit shipment status from a carrier they are aware and reacting quickly.
For IBM to break back into this business, it needed to dispense with the prevailing component model of business document exchange. It tried to do this by decoupling transactions from documents in its customers’ minds, altering the role of computers from liaison officers for their respective businesses, to overseers of a vast store of valuable analytics waiting to be tapped.
The outcry against a potentially dangerous path of worldwide database integration, whether through EDI or some kind of “cloud,” preceded this marketing campaign, beginning as early as 1996. In one of the most prescient warnings about information ever published, Dr. Virginia Rezmierski, an IT policy director at the University of Michigan, foresaw not only the problem that would plague the Web’s architects two decades later, but also the strange, new, cloudy context in which that plague would appear:
If everything is electronically available, and sorting for any number of variables can be done with ease, then information tends to become a commodity, simply zeros and ones out of which fascinating conclusions can be drawn, without sensitivity to the privacy of the individuals presented. Once databases are electronically available, this ability to match them becomes one of the most serious pitfalls on the EDI trail...
If we start along this trail unclear about our purpose, unclear about who is setting the pace, unclear, perhaps, about the trail itself, then we may become distracted by the side trails, the new technological developments that can solve yet one more problem that may not even have been identified. A trail that doesn’t quite get you to where you need to go — but gets you somewhere else — might be a serendipitous achievement, but it is more likely to be a disastrous detour.
Reconciliation
“IBM believes that cloud is a fundamental shift in the delivery model of IT,” observes Ric Telford, IBM’s vice president of cloud services, in a recent interview with me. Telford divides business computing into three eras: the mainframe era, the client/server era, and the cloud era (today). But he cautions against the perception that any technology that visionaries may use in the present age should necessarily render obsolete an older model of working.
“IBM’s strategy has been to guide clients through cloud computing. Cloud should be about improving business value, not just about going on to the next cool thing. You mentioned there’s a lot of, ‘Hey, the water’s fine, jump right in,’” Telford tells me. “The water’s not the same temperature for everybody.”
Telford’s approach represents a more moderate tone from IBM, with respect both to cloud computing and the technologies that cloud would seek to absorb — inter-organizational models being one of them. For instance, in the late 1990s and 2000s, vendors were moving their transactional systems to a model very different from EDI, where client-side applications made transactional requests of server-side functions, and servers either complied or they didn’t. It’s the model that gave rise to the Web.
“Back then, some of those models made assumptions about the fact that everything was going to move to client/server. We know now that that didn’t happen,” Telford continues, “and there’s good reasons why.”
There are a lot of workloads that made sense then, and make sense today, to run on a mainframe. Now, you could argue that the mainframe has evolved as well to be more client/server-like, and that’s a fair argument. Yet when you fundamentally think about the difference in those two styles, there are plenty of workloads today that still run on mainframes, and mainframes continue to be a very important part of IT. That will continue, as will some of client/server.
The other part of that, which we tell our clients, is this: You shouldn’t be going in with a mindset of saying, “Now, we’re going to go 100% cloud.” You should go in with a mindset that says, “I have some new tools in my toolbox that I can use where it makes sense.”
One can draw the following conclusion from Telford’s comments to me: No single IOS translates into everyday computing processes in such a way that all of any organization’s workflows are completely modeled. As the number of requirements among all of an organization’s business partners collectively rises, the number of points of consensus between those partners (that everything must always be done the same way, along the same business model) is reduced, perhaps proportionately. Put another way, the more there is that everyone must agree to standardize, the less standardization there can be.
It gives rise to a rational argument away from the entirely integrated, “collaborational” approach. And it’s a step back from the picture of omniscient value chain visibility that IBM painted in 2007, though it’s probably much more realistic.
John Patrick Saldanha discovered for his 2006 Penn State B.A. thesis that, as more partners standardize their business chain and cost reduction management systems, the costs of the reductions themselves tend to rise. So even when standardization can enable smooth workflows, and transaction costs can be pared down, the up-front cost of doing so can become prohibitive.
What’s more, getting too wound up in the transaction standardization process tends to distance the people involved from one another — trying to make a flowchart do what a phone call can do better. As Saldanha wrote:
IOS is not the only or the best way to organize inter-organizational communication and information. Adopting an IOS requires investment in process re-engineering to fully exploit its advantages. Additionally adoption of IOS such as the EDI can be expensive. The conventional phone, fax and e-mail may offer simpler and cost effective solutions to reduce uncertainty by simple reconfiguration of current operations.
This might have been the final ride into the sunset for the last sequel in the EDI franchise. But cloud dynamics has a way of changing the fundamental aspects of business itself, such that the circumstances that prevented EDI from being fully effective the first, second, third, and successive times may actually no longer exist.
There’s People in the Cloud
The case-in-point for the comeback of EDI involves vendor-managed inventory (VMI). It’s a concept that emerged after the turn of the century, as part of the rise of value chain visibility. Its principle is this: If the proper trading partner agreements are in place, a supplier/seller should be able to detect in a timely fashion when a customer/buyer needs replenishment, and by how much. This would especially apply to raw materials suppliers, whose clients could be furniture manufacturers or home builders. If suppliers could see what customers needed before the customers themselves knew they needed it, they could perform a valuable service.
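The calculation a supplier wants to run on its customer's behalf is not, at its core, exotic; what makes VMI hard is getting the inputs. A minimal sketch, assuming the supplier can already see the customer's recent point-of-sale history and current on-hand inventory (the sharing the rest of this section is about):

```python
def replenishment_quantity(daily_sales, on_hand, lead_time_days, safety_days=7):
    """Order-up-to style VMI calculation: cover expected demand over the
    supplier's lead time plus a safety buffer, less what is already on hand."""
    avg_daily_demand = sum(daily_sales) / len(daily_sales)
    target_level = avg_daily_demand * (lead_time_days + safety_days)
    return max(0, round(target_level - on_hand))

# Last two weeks of POS sell-through for one SKU at one distribution center.
pos_history = [41, 38, 52, 47, 35, 60, 44, 39, 42, 55, 48, 37, 51, 46]
print(replenishment_quantity(pos_history, on_hand=320, lead_time_days=5))
```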
There are two approaches to this problem. The first, and most obvious, is to codify the existing EDI transaction sequence between partners and simply expedite its automation, which the Graham team from the EC study might classify as “edification” (pun obviously intended). The second is a radical restructuring of the business process, this time through the application of cloud dynamics. Here, the roles of EDI translator and EDI mapper would be mutually absorbed into a single service hosted by a cloud service provider. That provider would effectively serve as the keeper of the unified database, but would manage it in a form of escrow.
It would be value chain visibility, all right, but with a third party entrusted to play the role of “visionary.”
In a recent self-produced video, Ron Burnett, product director for SaaS provider Logility, explained the rise, fall, and subsequent rise of VMI. For it to work properly, Burnett explained, the seller needs to be able to glean insights directly from the buyer’s point-of-sale (POS) data. It’s the type of prescience that the typical exchange of EDI documents cannot provide. An atypical exchange could conceivably be created, but it would require the customer to codify an exchange of its own private data with its supplier(s) at regular intervals. As Burnett describes it, “I think over time, too much emphasis was being placed back on the seller of the product, and the buyer was taking more of a hands-off focus on the VMI process.”
I think it was the supplier that was more interested in the VMI relationship, and that’s why it tended to be more one-sided. I think in today’s environment, as we’ve seen our economy change over the last few years, the buyers have gained more interest in some type of collaborative process with their suppliers. And this is why VMI is starting to regain popularity, and why I’ve started to use the term “collaborative VMI” to talk about this re-emphasis on the vendor managed inventory process.
Surprisingly, Burnett goes on, direct oversight of the customer’s POS data did not by itself yield much insight. Typical business intelligence requires an analyst to correlate various time-synchronized data points in order to extrapolate trends; any single set of data points on its own cannot yield such insight. More information needs to be shared, he states. Yet once IT was brought in to help integrate more data into the sharing process, fewer and fewer of the people actually responsible for the transactions knew they were taking place. In effect, IT was setting up a process and automating it to run itself. Automation meant that processes had to be done in a rigid, uncompromising fashion that ended up being difficult for prospective partners to follow.
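Burnett's point about correlation is easy to illustrate. The POS feed alone says how fast product is selling; only when it is lined up against a second shared series, such as the buyer's on-hand inventory, does a trend worth acting on appear. A toy example with invented numbers:

```python
# Two time-synchronized weekly series shared by the buyer (invented numbers).
weekly_pos_units = [300, 320, 360, 410, 470]   # sell-through is accelerating
weekly_on_hand   = [900, 860, 780, 650, 470]   # inventory is draining faster

for week, (sold, on_hand) in enumerate(zip(weekly_pos_units, weekly_on_hand), 1):
    weeks_of_supply = on_hand / sold
    flag = "  <-- replenish" if weeks_of_supply < 1.5 else ""
    print("week %d: %.1f weeks of supply%s" % (week, weeks_of_supply, flag))

# Neither series alone shows the problem; dividing one by the other does.
```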
When the Web first took up the EDI cause, it promised to standardize the automation process, including for VMI, by introducing conventional webforms — Web page versions of everyday documents. That approach standardized the interface for partnered businesses, but at the expense of agility. In another recent video, Jeff Franklin, vice president for product development at EDI services provider RedTail Solutions, explains what he calls the “webforms predicament:”
It’s great, when you start off. You have one person downloading the orders, they’re re-keying them into MAS 90/200. It’s a great way to start out, and it’s inexpensive. But as you start growing, and it grows to a hundred orders, and all of a sudden there’s a hundred invoices, and advance ship notices, and printing out labels, and then as you keep growing, you’re adding more and more people. Now, if you throw in a third-party logistics [3PL] or a warehouse that you’re working through as well, it throws another curveball into it and becomes a huge issue.
The cloud-based service that RedTail is offering is an example of incorporating human-powered roles as part of the cloud process. From Franklin’s perspective, there are any number of specialized jobs that organizations look to outside specialists to handle. Managing transactions should not be considered a “core competency,” any more than janitorial service, legal representation, or parcel delivery. “EDI is a direct natural for outsourcing as well,” he says.
He explains RedTail’s role like this: First, its system ties in directly with the client’s trading partners, acquiring each of their specific EDI mapping specifications. It tests and validates each of those maps, ensuring their compatibility with enterprise resource planning applications such as Sage 100 ERP (a.k.a., MAS 90/200). After notifying the client directly that the shipping process is about to proceed, it then assembles the necessary UCC/EAN-128 shipping labels (there’s plenty of alphabet soup in EDI) required by advance ship notices, and acquires each trading partner’s approval for those notices. Appropriate notices are also served to any of the client’s outsourced warehouses or 3PLs, potentially automating the entire advance ship notice process if the client uses a warehouse management system (WMS). Then RedTail translates and validates all messages between parties during the shipping process.
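Laid out as a sequence, the workflow Franklin describes looks something like the sketch below. The step names, function signatures, and data shapes are my own paraphrase of his description, not RedTail's actual interfaces, and each step in reality involves its own validation and human checkpoints.

```python
# Paraphrase of the outsourced EDI shipping workflow as a simple pipeline.
# Function names and data shapes are invented for illustration.

def acquire_partner_maps(trading_partners):
    """Pull each trading partner's EDI mapping specification."""
    return {p: {"asn_format": "856", "label": "UCC/EAN-128"} for p in trading_partners}

def test_maps_against_erp(maps, erp_system="Sage 100 ERP"):
    """Confirm every map can be populated from the client's ERP records."""
    return all("asn_format" in spec for spec in maps.values())

def build_shipment(order_id, cartons):
    """Assemble shipping label data and the advance ship notice (ASN)."""
    labels = ["SSCC-%s-%03d" % (order_id, i) for i in range(1, cartons + 1)]
    return {"order_id": order_id, "cartons": cartons, "labels": labels}

def notify_parties(asn, partner, warehouses):
    """Send the ASN to the trading partner and to any 3PL/warehouse involved."""
    for recipient in [partner] + warehouses:
        print("ASN for order %s sent to %s" % (asn["order_id"], recipient))

maps = acquire_partner_maps(["RETAILER-001"])
if test_maps_against_erp(maps):
    asn = build_shipment(order_id="SO-5521", cartons=3)
    notify_parties(asn, "RETAILER-001", warehouses=["3PL-WEST"])
```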
Keep in mind that EDI was created decades ago ostensibly as a means of specifying, codifying, and automating what were described at the time as fundamental business processes. These were the functions that defined businesses to their partners: the integrity of the transactions spoke for the integrity of the company.
But now that cloud dynamics makes it possible for the management of transactions, including both personal and impersonal aspects, to be completely outsourced — for all four EDI components to fall outside the company — it’s becoming acceptable to perceive transaction management as an ancillary function. It’s not just a matter of tipping the axis of the job role 90 degrees on Porter’s value chain, from primary to support. It’s a declaration that, since EDI is by definition standardized and undifferentiated, it should no longer be treated as a source of competitive advantage.
This is contrary to the viewpoint of advocates for collaborative VMI — an argument that oversight is an internal support role, and the cloud may serve merely to host the application. It’s an unresolved question, for which there is no easy answer in the short term: To what extent should cloud dynamics enable personally managed business processes to move outside the company?
It’s a question we’ll examine in further detail in the next article in this series, which covers the manufacturing sector.