To Market, to Market

September 2, 1996

It doesn't take a rocket scientist to see that making money in a high-technology business is no sure thing. Established and successful firms, like Digital Equipment Corp., have stumbled. Promising companies, like Thinking Machines, have disintegrated. Now-forgotten start-ups in biotech and software once attracted piles of cash and mountains of effort, but were unable to get their products off the ground. Technological superiority provides no guarantees. The failure of the NeXT workstation is just one example of a "better" product that flopped in the marketplace.

In some ways, high-tech firms are much like other ventures. They prosper by anticipating the market, figuring out which products to produce, and how to promote and price them. They choose when to form alliances, when to play hardball, and when to "live and let live." Their commitments require investment -- in research and development, in production facilities, and in marketing and distribution capacity -- that may be costly, even impossible, to reverse.

Rapid and uncertain technological evolution makes for added problems. It can be particularly hard for high-tech firms to figure out who their customers might be and what they might want. Ask anyone trying to sell products over the Internet. Industries in a state of flux can bring new competitors out of the woodwork. Small start-ups operating below the radar screen are one worry; behemoths from adjacent industries with deep pockets and a desire for fresh territory are another.

Technological surprise can upset even the best-laid plans and make room for newcomers. Consider the predicament faced by makers of traditional ulcer remedies -- Tagamet, Zantac, Pepcid, and Axid -- after the unexpected discovery that ulcers can be cured with antibiotics. The vicissitudes of public opinion and government regulation are another risk factor. They might make life easier, as they recently have for companies trying to get AIDS drugs to market; or tougher, as they have for high-tech manufacturers that work with environmentally sensitive materials. In a rapidly changing environment, high-tech companies that sit still are courting trouble.

High-tech firms frequently operate in a world of increasing returns, where there are advantages to size, and the winners win big. So the risks are there, but the rewards can be enormous. There are no simple rules to follow, as artful strategies depend on many contingencies. But this leaves room for the ingenuity of a firm that can imagine the future and devise a path that maneuvers among the risks. So, let the strategizing begin.



All companies face uncertainty, but high-tech companies face more, faster. In traditional industries, such as oil and paper production, firms think ahead a decade or more. They deliberate for months about capital investments that may well be in use twenty years hence. In contrast, many high-tech companies consider the long term to be eighteen months to three years -- maybe five years at most, notes David Mason, president of Northeast Consulting Resources, a Boston consulting firm specializing in technology. In this rapidly changing environment, firms must make quick judgments about investing in expensive equipment that can swiftly become obsolete.

High-tech markets are thus fraught with risk. Maybe things won't go as expected, or an unpleasant surprise may reduce the value of the firm's resources and activities. But uncertainty can also be a source of strategic opportunity, observes Elizabeth Teisberg, of the University of Virginia. A changing environment is especially fertile ground for creating new possibilities. And there are potentially high payoffs for companies prepared to take advantage of them.

To manage the opportunities and risks, many high-tech companies use some version of scenario or decision analysis in charting their course. Firm managers enumerate potential risks (market, technological, political, regulatory, and so on), and develop and think through the logic of different possible accounts of the future. As they appraise each scenario, managers may find that the sequence in which uncertainties are likely to be resolved, or some other logic, links some outcomes to others. Thus, the future may be less uncertain than previously believed. Or they might realize that the firm's decisions would be identical under all scenarios, making the uncertainty merely a worry, not critical to strategy development. Often, the analysis involves role playing. It can be instructive to view the future from the vantage point of competitors, customers, and suppliers. Some outcomes may very much depend on actions of the players, whereas others will not.

Firms can thus clarify which risks are subject to their control. And they can respond by reducing these risks at a variety of spots along the value chain, notes Teisberg. Thus, Genzyme, a Cambridge, Massachusetts, biotech company, chose to swing for the "base hit" rather than the "home run," and develop lower-return products to provide early revenue and knowledge useful in the future. It also hedged against a rise in the price of a key input through a long-term supply contract. Canon, with a strategic commitment to offering the best color desktop printer, decided to spread its research and development efforts among several competing technologies to protect its investment in production facilities, marketing, and service networks. Adjusting the pace and timing of strategic investments by making them contingent, stage by stage, on test marketing or other outcomes is, says Teisberg, an especially effective way to reduce risk and increase the rewards to a successful venture.

But no amount of planning eliminates uncertainty, nor is minimizing risk desirable. Avoiding strategic commitments that carry significant risks might consign a firm to average returns. For example, Genzyme reduced its risk when it piggybacked its efforts to fight Gaucher's disease on earlier government research that had identified the crucial molecule. But in doing so, Genzyme gave up a chance at getting a patent on the eventual drug, possibly sacrificing part of the upside potential. Many other decisions that a firm can make to spread risk, hedge, or invest in flexibility are expensive -- especially for a small startup with limited resources.

So the issue is not only which risks to minimize, but also which to accept. And, most important, how to improve the chance of success and increase the reward to winning, once the bet is made. Perhaps the firm can preempt an undesirable response from competitors. Or influence the course of regulation. Or redefine the ways firms interact so as to avoid costly competitive battles. Actions that alter industry structure or change the way the game is played work best. After all, the firm that chooses the game and makes the rules is likely to gain the upper hand.



Changing the game often means enlarging it, since high-tech business tends to exhibit increasing returns. In companies with large research and development expenses relative to manufacturing costs, size confers a cost advantage: the more product sold, the lower the unit costs.
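The arithmetic behind this cost advantage is worth seeing with concrete numbers. A minimal sketch, using hypothetical figures not drawn from the article: when a large fixed R&D outlay is spread over more units sold, the average cost per unit falls toward the (small) marginal cost of manufacturing.

```python
# Illustrative only: hypothetical numbers, not from the article.
# With large fixed R&D costs and low marginal manufacturing costs,
# average cost per unit falls as sales volume grows.

def average_cost(fixed_rd, marginal_cost, units):
    """Average cost per unit = (fixed R&D / units sold) + marginal cost."""
    return fixed_rd / units + marginal_cost

FIXED_RD = 100_000_000   # $100 million spent on R&D before the first unit ships
MARGINAL = 5             # $5 to manufacture each additional unit

for units in (1_000_000, 10_000_000, 100_000_000):
    print(f"{units:>12,} units -> ${average_cost(FIXED_RD, MARGINAL, units):,.2f} per unit")
```

Each tenfold increase in volume pushes the average cost closer to the $5 marginal cost, so the largest seller can always undercut smaller rivals.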

Increasing returns may be even more important on the revenue side. Network externalities make high-tech products more valuable if they are compatible with a network of other users. Fax machines, for example, are only useful when other people have compatible fax machines. And customers are more willing to buy a high-tech product once a standard becomes established. High-tech products also can be complicated, requiring trained personnel or investment in compatible equipment to perform at their optimum. Customers, concerned about stranded investment, may prefer buying from a large, established firm that is unlikely to go out of business. For all these reasons, the firm that gets big can hope to see its unit costs fall, its sales grow even more as a consequence, and, ideally, its products entrenched as the industry standard.
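One common (if rough) way to quantify a network externality, offered here as a heuristic rather than a claim from the article, is to count the connections a network makes possible: a network of n compatible users contains n(n-1)/2 distinct pairs, so potential value grows roughly with the square of the user base.

```python
# Rough sketch of a network externality (a Metcalfe-style heuristic,
# not a model from the article): each pair of compatible users -- say,
# two fax machine owners -- adds value, so possible connections grow
# quadratically as the network expands.

def possible_connections(n_users):
    """Number of distinct user pairs in a network of n_users."""
    return n_users * (n_users - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:>5} users -> {possible_connections(n):>7,} possible connections")
```

Going from 10 users to 100 multiplies users by ten but possible connections by more than a hundred, which is why each new buyer makes the product more attractive to the next.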

Thus, strategic alliances between a high-tech firm and one of its suppliers or a producer of a complementary product are commonplace. In a recent, high-profile example, Netscape Communications joined with Sun Microsystems and Oracle to increase demand for personal computers by developing a simple, low-cost version which can be managed through the Internet or other network.

A strategy to expand the market may pay off, even when it means creating additional competitors, say Adam Brandenburger of the Harvard Business School and Barry Nalebuff of the Yale School of Management in their book, Co-opetition. They argue that Intel profited from relinquishing its monopoly of the 8086 microprocessor and agreeing to provide a second-sourcing license to IBM and eleven other manufacturers in the late 1970s. The reason, say Brandenburger and Nalebuff, was greatly increased sales.

IBM, lagging behind Apple in PCs at the time, hoped to close the gap quickly by using Intel's microprocessor and Microsoft's operating system. It also decided on an open-architecture policy to enable other firms to develop compatible products and sidestep possible antitrust concerns. But IBM worried about its dependence on Intel, whose manufacturing reliability was at that point untested. And it wanted assurances of price competition. So it predicated its commitment to Intel on the right to license. Intel agreed and gave up its monopoly, hoping that IBM's success would expand the PC market and drive demand for Intel chips. When PC sales did take off, Intel was able to grow far more quickly by controlling a smaller share (30 percent in 1987) of a much larger market.

Still, this arrangement encouraged Intel to open up an edge over the competition by quickly jumping to the next-generation chip, Brandenburger and Nalebuff note. And IBM's decision in favor of open architecture led to the rise of the clones -- Leading Edge, Dell, and Compaq. Their entry as buyers of microprocessors significantly improved Intel's bargaining power with IBM (indeed, with all PC manufacturers), as IBM was now a smaller share of all customers. "IBM nurtured cooperation," observes Neil Niman of the University of New Hampshire, "only to discover that having made it to the dance, its partner went off courting others, leaving it all alone."

Thus, the balance of power shifted. When it came time to produce the 286 generation of chips, Intel was able to limit licensing to five companies and retain a 75 percent market share. For the 386 chip and beyond, Intel regained most of its monopoly, granting a single license to IBM, good only for internal use. The market for PCs grew, and Intel became entrenched as the industry standard. Ultimately, IBM turned to Apple and Motorola in a belated and still struggling effort to create a competitor to Intel chips, the Power PC.

The economics of increasing returns surfaces again as Intel searches for ways to drive demand for its post-Pentium generation of chips, observe Brandenburger and Nalebuff. They cite Intel's investment in ProShare, which makes a video conferencing system that sits on top of a desktop computer and consumes tons of processing power. But video conferencing machines are useless unless others also have them; and only mass production can lower prices enough for the equipment to become commonplace. So, Intel has developed alliances with various phone companies, which have agreed to offer ProShare to their customers at a discount (at the same time, increasing demand for the ISDN phone lines necessary to handle transmission of video images). Intel also has negotiated an agreement with Compaq to include ProShare in Compaq business computers as a way of creating momentum, bringing prices down and, hopefully, driving demand for the next generation of chips.



If increasing returns confers an advantage to being big, it stands to reason that it also confers an advantage to being first. After all, the first firm to market has an unparalleled opportunity to entrench its product, grow large, and reap the advantages of low cost and customer lock-in. Once dominant, the firm can defend its market power against interlopers and extend it into the future.

This certainly has been the conventional wisdom. Entrepreneurs and established firms race to be first. Several studies from the 1980s appear to support this proclivity. Gerard Tellis of the University of Southern California and Peter Golder of New York University report the findings: the average market share across a range of businesses (not just high-tech firms) was about 30 percent for market pioneers, 19 percent for early followers, and 13 percent for late entrants. In some cases, the market leader maintains its dominance through several technological advances. Polaroid, the inventor of instant photography, was able to extend its lead to the next product generation for many years.

But the statistical studies that support this view suffer from several design defects, contend Tellis and Golder. Most sampled only surviving firms, which biased their results. After careful reanalysis, Tellis and Golder find that the first firms to sell in a new product category were not as dominant as conventional wisdom supposed. Nearly half failed to survive. In only 11 percent of product categories did they still control the largest market share, and they were even less dominant in categories that began after World War II. By contrast, "early entrants," firms that arrived an average of thirteen years after the market pioneers, rarely failed. They exhibited a high rate of leadership, typically assumed early in the product life cycles and, on average, had a market share three times that of the first entrants. "The best example is Microsoft," notes Neil Niman. "Microsoft has never been first to market in anything, but always seems to earn the lion's share of business and become the dominant player." So strategic success depends upon more than simply being first.

In fact, technological advance under some circumstances can create an opening for newcomers. When the advance represents a radical departure from previous technique, rather than merely an incremental move, established firms may not be able to extend their monopoly power into the new regime. They may hang back from making needed investments in the new technology because they are unwilling to cannibalize their existing product line. And within the organization, "core competencies" can harden into core rigidities, observes Dorothy Leonard-Barton of the Harvard Business School, making it difficult for leaders to keep pace in the transition to the next technology.

Organizational rigidities seem to account for change of leadership in the photolithography industry, which makes sophisticated capital equipment used by semiconductor manufacturers. After each major technological innovation, the dominant firm was supplanted by an entrant. This was so, claims Rebecca Henderson of MIT, even though the innovations were not "radical." That is, established firms expected to be able to retain their market power, and most incumbent companies invested heavily to develop the new innovations. Any concern about cannibalizing their earlier generations of product did not seem to slow their strategic spending. Nevertheless, the leading firm lost its position each time technology advanced.

The market leader changed, Henderson suggests, because the innovations rendered part of the leading firms' knowledge obsolete. This knowledge was so deeply embedded in routines and procedures that the obsolescence may have been difficult to observe. Henderson cites the example of Kasper Instruments, which responded inappropriately to early complaints about its product's accuracy. It incorrectly attributed the trouble to distortions introduced during processing -- which had, in fact, been a source of many problems under the previous technology -- and overlooked the true source of the difficulty, an error in a particular mechanism. Kasper also failed to detect a significant technical advance in a competitor's product because the company used evaluative criteria developed under an earlier regime. Thus, Henderson suggests that old ways of thinking and dated procedures made the research efforts of established firms less productive than those of new entrants.



In the early development of a technologically advancing industry, firms tend to be small. They offer a variety of competing products, and technological change is rapid, especially among new entrants. The industry likely contains a fair number of companies, with a high rate of both entry and exit. Market shares move around, and there may be no consistent market leader.

As the technology evolves and the market expands, so does the number of firms. Diversity in competing product technologies peaks and eventually begins to decline, while increasing effort is made at process innovation.

What happens next? Being first (or at least early) creates enormous advantages. The incumbent frequently can establish itself as the industry standard, take advantage of scale economies, and extend its market power into the next generation of technology. Not every firm has this chance -- being first is partly a matter of proficiency and partly luck. But a firm that is first chooses a strategy and invests its resources accordingly to increase the market and its own capacity, to entrench its customers, to protect itself from imitators, and, if it can, to defend against the inertia and obsolescence that incumbency breeds.

Then one day, a new entrant, maybe a small startup or an established company from a related industry, comes on the scene with a better idea. The idea is so good that it just might be able to neutralize the incumbent's advantages and leapfrog over the existing leader. The newcomer must pick its spot: decide which markets to enter and whether to attack the leader directly or take a less confrontational, differentiated approach. The leader must then decide whether and how to fight back. And the winner... ?

The inescapable fact of high-tech business is uncertainty. There is no way around it; it is the one constant of high-tech life. Industries are born, mature, and mutate into new forms, even whole new industries. Just as Microsoft and Intel caught IBM, so may Netscape and its partners overtake the giant Microsoft. Or maybe not. So far, Bill Gates has been able to respond quickly and reorient Microsoft's strategy. How long its dominance will continue is anybody's guess. Yet, it is this very uncertainty that opens the way for high-tech firms to innovate. For it is only when answers are not yet known that there is the incentive to experiment.



"The computer industry hasn't made a dime...Intel and Microsoft make money, but look at all the people who are losing money all the world over. It is doubtful the industry has yet broken even," said Peter Drucker, in a recent interview in Wired magazine. A provocative statement, but is it true?

Paul Gompers of the Harvard Business School and Alon Brav of the University of Chicago think Drucker is probably mistaken. They looked at companies that went public from 1975 to 1992, most of which were high-tech firms, and found their rate of return to be about average, once they adjusted for risk and company size. Only companies that were both small and without venture capital backing showed anything close to a zero return. And even for them, it is impossible to be certain whether this low return resulted from some unusual circumstance that is unlikely to be repeated (such as a dearth of capital for small firms that occurred in the early 1980s) or from foolish investing.

Moreover, the social return from a new technology's increased efficiency and lower price is very likely to exceed the private return. Intense competition may have reduced profits and even bankrupted some disk drive makers, points out Josh Lerner, of the Harvard Business School, but consumers continue to enjoy the cheap prices and increased capacity that resulted. Although getting an exact measure is difficult, Edwin Mansfield of the University of Pennsylvania found in his pioneering 1977 study that the median social return from investment in a fairly routine variety of industrial innovations was 56 percent, a very high figure. More recently, Manuel Trajtenberg, of Tel Aviv University, estimated that the social return to the development of a particular medical technology, CT scanners, might be as high as 270 percent.



IBM might have prevented the deterioration of its position in the PC market had it pursued an open-architecture policy without bringing in outside sourcing from Intel and Microsoft, contends Neil Niman of the University of New Hampshire. Or, alternatively, had it relied on outside sourcing without allowing clones of its hardware. But taken together, open architecture and outside sourcing permitted Intel and Microsoft to become monopoly suppliers of an essential input and, therefore, eroded IBM's bargaining strength.

Perhaps IBM could have salvaged the situation by fixing the terms of its future license agreements when its bargaining power was still strong. IBM did negotiate a long-term contract with Intel. But long-term contracts are hard to write and harder to enforce, especially when rapid technological change means they have to encompass changing product specifications and quality. Intel signed such a ten-year contract in 1981 with Advanced Micro Devices, another chip manufacturer, then proceeded to spend several years in a costly legal battle over its interpretation. Thus, long-term contracts are no guarantee.

Business professors Adam Brandenburger and Barry Nalebuff have a different suggestion. In their view, IBM should have demanded that Intel pay with equity for its right to share in the expanded PC market. IBM did purchase a 20 percent equity stake in Intel during 1982 and 1983, and received warrants to buy another 10 percent. But after helping Intel through a period of capital investment and financial difficulty, IBM sold out for $625 million in 1986 and 1987. Ten years later, this share would have been worth $18 billion.

Likewise, IBM had a chance to buy 30 percent of Microsoft for $300 million in 1986, a stake also worth $18 billion by 1995. By insisting that Intel and Microsoft fork over equity, IBM could have used its early strength to secure a share of these future profits.
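The scale of those foregone gains can be checked with simple compounding arithmetic on the article's round numbers: growing $625 million into $18 billion over roughly ten years implies an annual return of about 40 percent. A minimal check:

```python
# Compounding check on the article's figures: IBM sold its Intel stake
# for about $625 million in 1986-87; roughly ten years later the stake
# would have been worth about $18 billion. The implied compound annual
# growth rate (CAGR) over that horizon:

def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

rate = cagr(625e6, 18e9, 10)
print(f"Implied annual return: {rate:.1%}")  # roughly 40 percent per year
```

The ten-year horizon is the article's round number; shaving a year or two off it only pushes the implied annual return higher.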


Selected Sources


Adam Brandenburger and Barry Nalebuff, Co-opetition, Doubleday, 1996.

Richard Foster, Innovation: The Attacker's Advantage, Simon and Schuster, 1986.

Dorothy Leonard-Barton, Wellsprings of Knowledge: Building and Sustaining the Sources of Innovation, Harvard Business School Press, 1995.

Richard Rumelt, Dan Schendel and David Teece, Fundamental Issues in Strategy: A Research Agenda, Harvard Business School Press, 1994.


Richard Caves, "Economic Analysis and the Quest for Competitive Advantage," American Economic Review, May 1984, pp. 127-132.

Rebecca Henderson, "Underinvestment and Incompetence as Responses to Radical Innovation: Evidence from the Photolithographic Alignment Industry," RAND Journal of Economics, Vol. 24, No. 2, Summer 1993, pp. 248-270.

Marvin Lieberman and David Montgomery, "First-Mover Advantages," Strategic Management Journal, Vol. 9, 1988, pp. 41-58.

Ruth Raubitschek, "Multiple Scenario Analysis and Business Planning," Advances in Strategic Management, Vol. 5, 1988, pp. 181-205.

Gerard Tellis and Peter Golder, "First to Market, First to Fail? Real Causes of Enduring Market Leadership," Sloan Management Review, Winter 1996, pp. 65-75.
