Techno Babel

September 2, 1996

The development of hot new technologies and the breakneck growth of high-tech firms mean little, economically speaking, if these new capabilities are not built into the productive process. A great many pundits fear this is the case today. U.S. research and development labs routinely discover pharmaceuticals to treat cancers and devise biological processes to produce such compounds cheaply and safely. Entrepreneurs and investment bankers grow rich and millions of workers find employment in firms hawking miracle fibers, hyper-fast digital switches, or products that generate power from the sun, the wind, or the tides. But despite this cornucopia of hustle and invention, income gains over the past quarter century appear slim when compared to the pace established by the U.S. economy across the age of modern economic growth, a record running back to the early nineteenth century.

There is a good deal of debate about the validity of the growth statistics. Productivity in the huge services sector is especially hard to measure. But if the figures are valid, they reveal a paradox: a society producing marvelous technical wonders that it may not be able to absorb into the productive process.

Concern has focused primarily on the economy's ability to exploit the new information technologies. Our ability to shuffle and shift information -- representations of reality ranging from alphanumeric characters to full-motion video -- has progressed incredibly swiftly over the past quarter-century. If we could eat bytes (digitized characters), or MIPS (millions of instructions per second), or bauds (signaling events per second), our larder would be a thousand times larger today than it was in 1970. Information technology, moreover, now touches nearly all businesses, professions, and industries. And computing and telecommunications equipment alone -- not including software -- accounts for one-third of all business equipment investment. One might think that IT's breathtaking advances, combined with this level of investment, would have led to dramatic income and output gains throughout the U.S. economy. But as Massachusetts Institute of Technology economist Robert Solow has quipped, "You can see the computer age everywhere but in the productivity statistics."
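To gauge that pace: a thousandfold expansion over the twenty-six years from 1970 works out to compound growth of roughly 30 percent a year,

$$1000^{1/26} \approx 1.30,$$

a rate sustained across the entire quarter-century.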

Bringing advanced IT into the productive process turns out to be hardly straightforward. One hurdle is a standard trap in the game of technical progress -- what J.S. Metcalfe and Ian Miles of the University of Manchester call "the tyranny of combinatorial explosion." Cluttering the marketplace today are too many hardware platforms, operating systems, communication paradigms, and software applications. The resulting confusion can paralyze users and degrade the rate of technical progress.

A second problem stems from the fact that IT is "process," not "product" technology. It is how firms make goods and services, not the design of those goods and services. Introducing new products requires a struggle in the marketplace; but absorbing powerful process technologies generally involves an organizational "restructuring" -- a shift in work flows, skill demands, authority relationships, and staffing levels. Such changes don't come easily, and bungled implementations can leave firms worse off.

Finally, recent advances have raised a set of unresolved issues specific to information technology. IT is business technology; it affects the way organizations see the world and make decisions. In some areas, the technology needs complementary advances to realize its promise. In other areas, the advances themselves raise managerial problems that, for the moment, seem beyond the reach of technical solutions.

 

THE INVESTMENT DECISION

Picking new technology is never easy. But it seemed far simpler in the 1970s, muses Kavin Moody, executive-in-residence at Babson College's Center for Information Management Studies, in Wellesley, Massachusetts. Moody, former head of information technology at Boston's Gillette Corporation and technology executive at Bank of Boston, says adoption decisions were largely controlled by internal corporate hurdle rates -- a minimum return on investment (ROI) a new technology had to yield. Firms set different hurdle rates for "revenue-generating" and "cost-cutting" investments. Moody believes companies nearly always used the latter, the lower of the two, for IT shops were then cost centers. They were part of the comptroller's department, responsible for processing back-office paperwork, and organizations generally bought technology when it let them do that work less expensively.

For any technology, estimating future product costs is often the trickiest part of the purchasing decision. This is because technology costs -- in IT especially -- tend to fall over time. If other firms adopt the technology, then vendors can often expand, realize economies of scale, and lower their prices; other vendors will surface to offer complementary or competitive products; and the supply of workers able to operate and service the technology will rise. Technologies attractive today thus generally become better values tomorrow. Delay also lets users avoid getting caught on technology's "bleeding" edge, or trapped in a technical dead-end.

Counterbalancing these cautions, and also hard to estimate, are the payoffs. These include the immediate gains and those to be captured down the road. The longer a shop uses a new technology, the more adept it becomes. When making a purchase-or-wait decision, companies must estimate these learning-by-doing effects. If a firm's competitors adopt an important technology, there is also a competitive risk in standing apart from the herd: adopting the new technology eliminates a source of difference between the firm and its competitors. The ROI calculation for technology investments thus requires a critical exercise of judgment about the enterprise's ability to exploit the new capabilities, and about the future course of its industry and its suppliers.
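A minimal sketch of this purchase-or-wait arithmetic, with entirely hypothetical numbers for the price decline, learning curve, and discount rate (none of them drawn from the article):

```python
# Purchase-or-wait sketch: compare the net present value (NPV) of adopting
# a technology now versus waiting, when prices fall over time but
# learning-by-doing rewards early adoption. Every parameter is hypothetical.

def adoption_npv(start_year, price0=100_000.0, price_decline=0.15,
                 base_benefit=30_000.0, learning_gain=0.10,
                 discount=0.10, horizon=10):
    """NPV of buying in `start_year` and running the system to `horizon`."""
    # Vendors scale up and cut prices, so waiting buys the gear cheaper.
    price = price0 * (1 - price_decline) ** start_year
    npv = -price / (1 + discount) ** start_year
    for year in range(start_year, horizon):
        experience = year - start_year           # years of learning-by-doing
        benefit = base_benefit * (1 + learning_gain) ** experience
        npv += benefit / (1 + discount) ** (year + 1)  # benefit at year end
    return npv

for wait in range(4):
    print(f"adopt in year {wait}: NPV = {adoption_npv(wait):10,.0f}")
```

With these particular parameters the learning effect outweighs the falling price and early adoption wins; tilt the price decline upward or the learning gain downward and the answer reverses -- which is precisely the judgment call described above.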

When the returns on a piece of technology did look attractive, Moody launched a well-developed adoption routine. It began with a proof-of-concept, continued through system compatibility and performance trials, and ended, if all went well, in a standard rollout exercise. In his early days, Moody's organization brought in computers to automate routine clerical tasks. As staffing costs absorbed an increasing portion of the corporate IT budget, a software industry emerged and companies bought packaged programs that trimmed development, testing, training, and maintenance costs. Whatever the new technology, the firm required continuous and rock-solid data processing, so the adoption process proceeded with a great deal of care.

 

COMBINATORIAL EXPLOSION

In the old days, IBM's dominant position in the IT industry simplified the investment process. Big Blue offered a pretty clear picture of what its new and soon-to-arrive technologies could do, and how they fit together. So Moody and other corporate users could plan their purchases and build up in-house expertise in IBM systems and software.

Today, the corporate IT world is altogether different. Firms now purchase a great many products from a great many vendors. For hardware, they pick from mainframes, minicomputers, PCs, and/or workstations. For operating systems, they could use Windows (3.1, 95, and/or NT), OS/2, VMS, MVS, a flavor of UNIX, and/or Java. For applications, they could buy WordPerfect, Word, Lotus 1-2-3, Excel, dBase II, DB2, SAP, Oracle, Sybase, and/or Informix, and that's just the beginning. For communications, they have LANs, WANs, client-server applications, and/or intranets.
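The "explosion" is simple multiplication. Counting only the examples just listed -- four hardware classes, eight operating systems, ten applications, and four communication schemes -- the space of possible configurations already holds

$$4 \times 8 \times 10 \times 4 = 1280$$

combinations, before versions, vendors, and peripherals enter the count.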

Products appear, get eclipsed, or are forced to interoperate with one another so swiftly that IT executives are loath to invest in developing in-house expertise. Increasingly they seek sophistication from vendors or from the huge new IT consulting industry that sells expertise by the hour. While this is often the only response possible, it further separates corporate IT executives from developments in the underlying technology.

Further complicating matters is a parallel fracturing of corporate control over IT investments. Ever since the late 1970s, when turnkey departmental "solutions" popped up, often running on new minicomputers, IT has spread beyond the glass-room confines of the comptroller's back-office paperwork shop. VPs of manufacturing, human resources, purchasing, distribution, marketing, sales, you name it, bought self-contained systems to process information. They often bought systems to raise quality, revenue, or market share -- payoffs far harder to quantify than cutting clerical or programming costs. This complicated the already tricky ROI calculations. Then the PC and packaged software explosion of the 1980s put IT resources on the desks of individual workers, from secretaries to research scientists. The PCs often arrived without any explicit ROI justification, says Moody, for the payoffs seemed both large and amorphous. Corporate control over technology investments had all but broken down.

 

ANTI-ENTROPY

To bring order out of chaos, firms created a new official -- the "Chief Information Officer" -- in the mid-1980s. The CIO generally reports to the Chief Executive or Chief Operating Officer and is responsible for deploying IT in a way that boosts the firm's overall performance; the relevant ROI is the shareholder ROI. Based on his experience at Gillette and the Bank of Boston, however, Moody likens the job to drinking from a hydrant.

CIOs need to make sense of developments in the technology and bring their judgments to bear on corporate IT investments. They do this by defining organization-wide IT standards and architectures covering hardware, software, data structures, and organizational responsibilities. The firm can thus economize on support and training costs. A more important benefit, but harder to quantify, is that users around the organization can access each others' files and expertise. And by limiting the "tyranny of combinatorial explosion," CIOs simplify the IT planning process for users throughout the organization.

Despite the utility they bring, says Jerry Kanter, also at Babson College, the average tenure of a CIO runs only about three years. Bringing order is a thankless task, says Kanter, too often viewed by the orderees as an unwanted intrusion. Users often have their own technical staffs, minds of their own, and significant clout in the organization. And expectations often run too high about what IT can do for the firm. But the source of these conflicts, and the reason why CIO tenures are short, lies in the technology itself -- in IT's relentless, unpredictable advance.

 

THE TRAVAILS OF REENGINEERING

What also makes IT advances hard to exploit is the fact that information technology, as "process" technology, must generally be integrated into a preexisting production system.

All workplaces are interlocking bundles of machines, workers, work routines, quirky work-arounds, and shop-specific cultures. When management introduces a new and improved technology -- whether cybernetic or metal-punching -- it requires adjustments throughout the workplace. The greater the productivity gain, the more far-reaching the task, and powerful new information technologies typically demand different employee skills, threaten jobs, disrupt hierarchies, and generally disorient workers.

The problem is a classic one. The father of "scientific management," Frederick W. Taylor, had been instrumental in developing "high-speed steel," one of a series of advances that let turn-of-the-century manufacturers process metal with unprecedented economy, speed, and precision. As machining costs plummeted, what became expensive was the cost of getting metal on and off the machines and on to the next station, the cost of inventory stock-outs and buildups, and the cost of muddled administrative assignments. Taylor reasoned that the road to efficiency lay not through speedier ways to work metal; that was already cheap. Progress required better training and rationally reorganized workflows, inventory controls, and supervisory assignments -- the stuff of scientific management and industrial (re)engineering.

Modern information technology raises similar issues. Many leaders in the business process reengineering movement say that organizations today possess enormously powerful IT resources. Poor shop floor implementation, however, explains why they have so little to show for it. Going beyond Taylor, these critics fault management for failing to recast the business to best exploit the new technology. Thus in their 1995 book, Creative Destruction, Richard Nolan and David Croson, of the Harvard Business School and the Wharton School of Business, assert that shifting to a "customer-driven, IT-enabled network organization" could halve the corporate head count without cutting output, and raise the firm's long-term growth rate.

 

A SUCCESS STORY

One New England enterprise to make this transition is Acme Insurance (not its real name), a property and casualty insurer. As recently as 1990, Acme operated truly archaic equipment -- an IBM 1401 with keypunch machines as the input device. Today, the business runs off a spiffy client-server network, with document imaging, high-speed links to outside data suppliers, and custom front-ends supporting its customer-contact agents.

Central to the company's success, says CIO Al Smith (not his real name), is management's leadership in recasting the business. Smith, with thirty years' experience under his belt, could have brought in new systems to run the old business cheaper and faster. This would have been a standard upgrade for a process engineer. But the company president had a new business vision. He saw an opportunity to aggressively expand sales of auto and homeowners' insurance using the new information technology. His business model called for a corps of telephone agents who, when called by a potential customer, could issue price quotations and bind insurance coverage by the end of the initial conversation. Smith's task was to provide this capability.

Projects like this succeed more often, says Tom Johnson, of Tenex Consulting in Burlington, Massachusetts, when senior business executives rely on senior IT managers like Smith. Hot-shot kids with the latest technical chops are often ignorant of the business and are out of sync with older executives. They also tend to lack the maturity needed to manage the delicate adoption process.

One benefit of Smith's maturity has been his take on the tradeoff between progress and perfection. Both Smith and Acme believe in "total quality management," a movement that sets as its performance target "six-sigma" consistency -- less than four defects per million. Smith's upbringing in corporate computing reinforces this commitment to rock-solid reliability. But he knows that progress today is not so accommodating. Acme's new system uses five different computers, layers of interconnected software products, and webs of communication links. So it's "down" more often than the old IBM. That's part of the price of the new capability, says Smith. He expects that TQM's "continuous improvement" process will steadily raise the sigma count. But he hopes users and customers will keep demanding more functionality, which will often come at a cost in reliability. If the system ever ran at six sigmas, says Smith, "it could show that people in the firm had run out of ideas for positive change."
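The "four defects per million" target comes from the usual six-sigma convention, which allows the process mean to drift 1.5 standard deviations off target, leaving defects in the normal tail beyond $4.5\sigma$:

$$P(Z > 6 - 1.5) \;=\; 1 - \Phi(4.5) \;\approx\; 3.4 \times 10^{-6},$$

or about 3.4 defects per million opportunities.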

Maturity also helps handle perhaps the greatest barrier to capturing gains from process technologies -- employee resistance. Workers will always fight initiatives that cut the firm's head count in half, as Frederick Taylor should have learned. This was not the main problem at Acme. The company expanded its "numerator" -- it took business away from competitors that used traditional brokerage channels -- so it did not reduce its head count "denominator." More problematic at Acme was the inevitable restructuring of jobs and relationships.

Reengineering redefined three types of jobs at Acme Insurance. Clerks who had simply moved paper now operate the scanners and index the documents they process. The telephone agents now decide when to request outside information (at a cost), when to continue an interview (at a cost of time), when to terminate, when to quote, and when to refer a more complicated prospect to an underwriter. And the underwriters now concentrate on "macro-underwriting" -- pricing risk factors, not individual policies. To ease these transitions, Smith involved employees throughout the redesign process.

While the changes were uniformly upgrades, the sharp decline in the value of a worker's former contribution has been unnerving. And employees had to be willing and able to upgrade their skills and assume greater responsibility. MIT's Erik Brynjolfsson, in a study of a reengineering project, found that "a surprisingly large subset of workers expressed no desire to become empowered." Smith reports similar problems. He says the firm now tries to hire ambitious and flexible workers; it even wants telephone agents with college degrees. Finding and keeping such workers, however, has been a challenge.

Such shifts in the demand for labor point to a wider concern. New process technologies undermine the earning power of workers who lack complementary skills and personal characteristics. So the income gains promised by the new technology will not be distributed equally, and will not be achieved until the workforce can adjust to these new demands.

 

THE PARADOX OF PLENTY

A final problem in absorbing IT is specific to the technology. IT is business technology. It helps us see the world and make decisions, and thus forms part of an organization's management system. IT's rapid advance and its spread across the enterprise have produced what Marc Willinger and Ehud Zuscovitch, economists at France's Université Louis Pasteur, call information-intensive production systems, with large numbers of workers doing analytical and managerial work. Willinger and Zuscovitch see significant economies of scope in such organizations -- the more information processors at work, the greater the potential for cross-fertilization and the ability to develop custom products and solutions to business demands. But as IT has empowered these workers and firms, it has also raised the demand for information in ways that currently, and perhaps permanently, outstrip the supply.
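One stylized way to see these scope economies (an illustration, not Willinger and Zuscovitch's own formulation): the number of potential pairwise exchanges among $n$ information workers is

$$\binom{n}{2} = \frac{n(n-1)}{2},$$

which grows with the square of $n$, so doubling the analytical staff roughly quadruples the channels for cross-fertilization.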

The best IT-enabled organizations are full of workers with initiative and impressive analytical skills. Using powerful computers and software, they can model the world and make predictions about the state of affairs. They can quickly design new materials, machines, or packaging; research markets; analyze financial statements; or develop sophisticated business plans. What they often lack is good information about the business. Corporate "information environments are appalling," writes Thomas Davenport of the University of Texas in his forthcoming Information Ecology; managers, says Davenport, generally have "exceedingly little accessible information about their employees, their customers, or even their products."

Davenport's observation is truly ironic, for trunkloads of information lie all about the IT-enabled organization. As Shoshana Zuboff of the Harvard Business School observes in her acute In the Age of the Smart Machine, information technology is unique in that it can automatically throw off trace data recording the workings of a business or technical process. Data entered into a modern electronic cash register not only flows into the standard accounting routines and newer inventory control systems. It can also be gathered, organized, and queried to yield information on customers or products, or on how shelf-space, pricing, and the price of a complementary product influence sales of a particular item.
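A schematic sketch of the kind of question such trace data can answer -- here, how a price cut and shelf placement moved a single item. The records, field names, and figures are all invented for illustration:

```python
# Sketch: the same register records that feed accounting can be re-queried
# for merchandising questions. All records and field names are hypothetical.

from collections import defaultdict

transactions = [
    # (date, item, unit_price, qty_sold, shelf_placement)
    ("1996-08-01", "cola-12pk", 3.49, 40, "end-cap"),
    ("1996-08-01", "cola-12pk", 3.49, 22, "aisle"),
    ("1996-08-08", "cola-12pk", 2.99, 71, "end-cap"),
    ("1996-08-08", "cola-12pk", 2.99, 35, "aisle"),
]

# Group unit sales by (price, shelf placement) to see what drove the jump.
sales = defaultdict(int)
for date, item, price, qty, shelf in transactions:
    if item == "cola-12pk":
        sales[(price, shelf)] += qty

for (price, shelf), qty in sorted(sales.items()):
    print(f"price ${price:.2f}  shelf={shelf:8s}  units={qty}")
```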

Accessing this data, however, is hardly straightforward. Managers, analysts, and workers can hardly jump willy-nilly into the organization's on-line data systems, for they could muck them up or degrade response times with an ill-constructed query. So organizations have created separate management information systems. Data stewards, another new occupational category, extract data from the on-line systems and ship it to data "warehouses" that supply the organization's managerial and analytical needs.

The logistics alone pose a major technical challenge, says Edward Peters of Intersolv, a Rockville, Maryland vendor. Firms are just learning how to quality-test and transport such data, and where to site the warehouses. A deeper problem lies in the data itself. Information arriving from around the enterprise is rarely defined in a consistent fashion, says Ralph Loftin, a consultant in Newton, Massachusetts. A category as rudimentary as "customer" lists companies in the accounts receivable database and human contacts in sales. Government agencies likewise provide monthly employment and wage data for the fabricated metals industry, but nothing for software. Imposing greater consistency and relevance seems essential. But managers constantly develop "hip-pocket" dialects to help them understand and organize their work environments, says Loftin, so imposing complete consistency -- and accessibility -- is an exercise in hubris, costing far too much in time, money, and meanings lost.
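A toy sketch of the steward's reconciliation chore: receivables defines a "customer" as a company with a balance, sales defines one as a human contact, and the warehouse must map both onto a single record. All schemas and records here are invented:

```python
# Sketch of the data steward's chore: two source systems define "customer"
# differently, and the warehouse must impose one schema on both.
# All field names and records are invented for illustration.

receivables = [  # "customer" = a company with an open balance
    {"acct_no": "A-1017", "company": "Acme Insurance", "balance": 4200.00},
]
sales_contacts = [  # "customer" = a human contact at a prospect
    {"rep": "Jones", "contact": "Pat Lee", "employer": "Acme Insurance"},
]

def to_warehouse(receivables, sales_contacts):
    """Merge both source notions of 'customer' into one record per company."""
    rows = {}
    for r in receivables:
        row = rows.setdefault(r["company"],
                              {"company": r["company"],
                               "balance": 0.0, "contacts": []})
        row["balance"] += r["balance"]
    for s in sales_contacts:
        row = rows.setdefault(s["employer"],
                              {"company": s["employer"],
                               "balance": 0.0, "contacts": []})
        row["contacts"].append(s["contact"])
    return list(rows.values())

print(to_warehouse(receivables, sales_contacts))
```

Even this trivial mapping keys on the company name -- exactly the sort of fragile choice that makes Loftin skeptical of complete consistency.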

 

RIDING A ROCKET

James Champy, one of the founding fathers of the business reengineering movement and now head of Perot Systems' Boston consulting office, finds IT's most serious failing in the management suite. He says the general decentralization of corporate authority in IT-enabled organizations has overloaded traditional oversight systems. Top managers are responsible for the performance of the enterprise, he says, but can no longer effectively supervise their subordinates.

IT allows managers of business units to access information, model the world, and test out options as never before. As a result, says Champy, they now propose to senior managers discontinuous jumps in the scale and scope of their operations. Competitors are likewise making bolder moves that violate expectations in a unit's business plan. Because of this instability, controls such as five-year plans with ROI and cash-flow projections have lost a great deal of value. Product demonstrations, reviews of analytical reports, and in-depth discussions of business plans can provide the necessary understanding. But this takes an enormous amount of time and attention. IT can augment this process of discovery, says Davenport, but can never be its substitute.

So the IT-enabled organization is like an automobile fitted with a new jet engine. The larger the enterprise and the more resources it has, the sounder its information infrastructure and the more ambitious its workers, the greater the potential for cross-fertilization and the more powerful this engine might be. But so long as human response times remain constant, the person atop the enterprise, who is driving the car, won't go much faster than he or she did in the past. As the jet always threatens to accelerate out of control, the driver might even lack the confidence to continue at the former rate of speed. Until organizations fix this fundamental management problem -- with or without the help of IT -- Champy says we will not fully harness the productive power of the new technology.

 

Selected Sources

Books

Richard L. Nolan and David C. Croson, Creative Destruction: A Six-Stage Process for Transforming the Organization, Harvard Business School Press, 1995.

Shoshana Zuboff, In the Age of the Smart Machine, Basic Books, 1988.

Thomas H. Davenport, Information Ecology: Managing Information and Knowledge as if People Mattered, Oxford University Press, forthcoming.

Articles

Erik Brynjolfsson, Amy Austin Renshaw, and Marshall Van Alstyne, "The Matrix of Change," draft, Sloan School of Management, Massachusetts Institute of Technology.

Richard R. Nelson, "Recent Evolutionary Theorizing About Economic Change," Journal of Economic Literature 33 (March 1995), pp. 48-90.

Jerry Kanter, "The Successful Chief Information Officer (CIO): You Gotta Know the Territory," Working Paper Series #96-04, Center for Information Management Studies (CIMS), Babson College, 1996.

J.S. Metcalfe and Ian Miles, "Standards, Selection and Variety: An Evolutionary Approach," Information Economics and Policy 6 (1994), pp. 243-68.
