The Era of Fragmentation, Part 1: Load Factor

By the early 1980s, the roots of what we know now as the Internet had been established – its basic protocols designed and battle-tested in real use – but it remained a closed system almost entirely under the control of a single entity, the U.S. Department of Defense. Soon that would change, as it expanded to academic computer science departments across the U.S. with CSNET. It would continue to grow from there within academia, before finally opening to general commercial use in the 1990s.

But that the Internet would become central to the coming digital world, the much-touted “information society,” was by no means obvious circa 1980. Even for those who had heard of it, it remained little more than a very promising academic experiment. The rest of the world did not stand still, waiting with bated breath for its arrival. Instead, many different visions for bringing online services to the masses competed for money and attention.

Personal Computing

By about 1975, advances in semiconductor manufacturing had made possible a new kind of computer. A few years prior, engineers had figured out how to pack the core processing logic of a computer onto a single microchip – a microprocessor. Companies such as Intel began to offer high-speed short-term memory on chips as well, to replace the magnetic core memory of previous generations of computers. This brought the most central and expensive parts of the computer under the sway of Moore’s Law, which, in turn, drove the unit price of chip-based computing and memory relentlessly downward for decades to come. By the middle of the decade, this process had already brought the price of these components low enough that a reasonably comfortable middle-class American might consider buying and building a computer of his or her own. Such machines were called microcomputers (or, sometimes, personal computers).

The claim to the title of the first personal computer has been fiercely contested, with some looking back as far as Wes Clark’s LINC or the Lincoln Labs TX-0, which, after all, were wielded interactively by a single user at a time. Putting aside strict questions of precedence, any claimant to significance based on historical causality must concede to one obvious champion: no other machine had the catalytic effect that the MITS Altair 8800 had in bringing about the explosion of microcomputing in the late 1970s.

The Altair 8800, atop an optional 8-inch floppy disk unit

The Altair fell into the electronic hobbyist community like a seed crystal. It convinced hobbyists that it was possible for a person to build and own their own computer at a reasonable price, and they coalesced into communities to discuss their new machines, like the Homebrew Computer Club in Menlo Park. Those hobbyist cells then launched the much wider wave of commercial microcomputing based on mass-produced machines that required no hardware skills to bring to life, such as the Apple II and Radio Shack TRS-80.

By 1984, 8% of U.S. households had their own computer, a total of some seven million machines.[1] Meanwhile, businesses were acquiring their own fleets of personal computers at the rate of hundreds of thousands per year, mostly the IBM 5150 and its clones.[2] At the higher end of the price range for single-user computers, a growing market had also appeared for workstations from the likes of Silicon Graphics and Sun Microsystems – beefier computers that came standard with high-end graphical displays and networking hardware, intended for use by scientists, engineers, and other technical specialists.

None of these machines would be invited to play in the rarefied world of ARPANET. Yet many of their users wanted access to the promised fusion of computers and communications that academic theorists had been talking up in the popular press since Licklider and Taylor’s 1968 “The Computer as a Communication Device,” and even before. As far back as 1966, computer scientist John McCarthy had promised in Scientific American that “[n]o stretching of the demonstrated technology is required to envision computer consoles installed in every home and connected to public-utility computers through the telephone system.” The range of services such a system could offer, he averred, would be impossible to enumerate, but he put forth a few examples: “Everyone will have better access to the Library of Congress than the librarian himself now has. …Full reports on current events, whether baseball scores, the smog index in Los Angeles or the minutes of the 178th meeting of the Korean Truce Commission, will be available for the asking. Income tax returns will be automatically prepared on the basis of continuous, cumulative annual records of income, deductions, contributions and expenses.”

Articles in the popular press described the possibilities for electronic mail, digital games, and services of all kinds, from legal and medical advice to online shopping. But how, practically, would all these imaginings take shape? Many answers were in the offing. In hindsight, this era bears the aspect of a broken mirror. All of the services and concepts that would characterize the commercial internet of the 1990s – and then some – were manifest in the 1980s, but in fragments, scattered piecemeal across dozens of different systems. With a few exceptions,[3] these systems did not interconnect: each stood isolated from the others, a “walled garden,” in later terminology. Users on one system had no way to communicate or interact with those on another, and the quest to attract more users was thus for the most part a zero-sum game.

In this installment, we’ll consider one set of participants in this new digital land grab: time-sharing companies looking to diversify into a new market with attractive characteristics.

Load Factor

In 1892, Samuel Insull, a protégé of Thomas Edison, headed west to lead a new branch of Edison’s electrical empire, the Chicago Edison Company. There he consolidated many of the core principles of modern utility management, among them the concept of the load factor – the average load on the electrical system divided by its highest load. The higher the load factor the better, because any value below 1 represents waste – expensive capital capacity that is needed to handle the peak of demand but left idle in the troughs. Insull therefore set out to fill in the troughs in the demand curve by developing new classes of customers that would use electricity at different times of day (or even in different seasons), even if it meant offering them discounted rates. In the early years of electrical power, the primary demand came from domestic lighting, concentrated in the evening, so Insull promoted the use of electricity for industrial machinery to fill the daytime hours. This still left dips in the morning and evening rush, so he convinced the Chicago streetcar systems to convert to electrical traction. And so Insull maximized the value of his capital investments, even though it often meant offering lower prices.[4]
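To make the arithmetic concrete, here is a minimal sketch in Python of the load-factor calculation. The hourly demand figures are invented for illustration, not drawn from Chicago Edison’s actual records.

```python
# Load factor = average load / peak load; 1.0 would mean no idle capacity.
def load_factor(hourly_load_mw):
    return sum(hourly_load_mw) / len(hourly_load_mw) / max(hourly_load_mw)

# A hypothetical lighting-only utility: demand spikes in the evening and
# the plant sits nearly idle overnight. (Invented numbers, in megawatts.)
lighting_only = [1, 1, 1, 1, 1, 2, 3, 3, 2, 2, 2, 2,
                 2, 2, 2, 3, 6, 10, 12, 12, 10, 6, 3, 1]

# The same utility after adding daytime factories and streetcar traction
# to fill the troughs, Insull-style. (Again, invented numbers.)
diversified = [3, 3, 3, 3, 4, 6, 8, 9, 9, 9, 9, 9,
               9, 9, 9, 9, 10, 12, 12, 12, 10, 7, 5, 3]

print(f"lighting only: {load_factor(lighting_only):.2f}")  # 0.31
print(f"diversified:   {load_factor(diversified):.2f}")    # 0.63
```

The peak load – and thus the generating capacity that must be built – is the same in both scenarios, but the diversified utility sells roughly twice as much electricity against the same fixed investment, which is why new off-peak customers were worth courting even at a discount.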

Insull in 1926, when he was pictured on the cover of Time magazine.


The same principles still applied to capital investments in computers nearly a century later, and it was exactly the desirability of a balanced load factor and the incentive to offer lower off-peak prices that made possible two new online services for microcomputers, which launched nearly simultaneously in the summer of 1979: CompuServe and The Source.

CompuServe

In 1969, the newly formed Golden United Life Insurance company of Columbus, Ohio, created a subsidiary called the Compu-Serv Network. The founder of Golden United wanted it to be a cutting-edge, high-tech company with computerized records, and so he had hired a young computer science grad named John Goltz to lead the effort. Goltz, however, was gulled by a DEC salesman into buying a PDP-10, an expensive machine with far more computing power than Golden United currently needed. The idea behind Compu-Serv was to turn that error into an opportunity, by selling the excess computer power to paying customers who would dial into the Compu-Serv PDP-10 via a remote terminal. In the late 1960s this time-sharing model for selling computer service was spreading rapidly, and Golden United wanted its own cut of the action. In the 1970s the time-sharing subsidiary spun off to operate independently, re-branded itself as CompuServe, and built its own packet-switching network in order to offer affordable, nationwide access to its computer centers in Columbus.

A national market not only gave the company access to more potential customers, it also extended the demand curve for computer time, by spreading it across four time zones. Nonetheless, there was still a large gulf of time between the end of business hours in California and the start of business on the East Coast, not to mention the weekends. CompuServe CEO Jeff Wilkins saw an opportunity in the growing fleet of home computers, many of whose owners whiled away their evening and weekend hours on their electronic hobby. What if they were offered access to email, message boards, and games on CompuServe computers, at discounted rates for evening and weekend access ($5 an hour, versus $12 during the work day[5])?

So Wilkins launched a trial of a service he called MicroNET (intentionally held at arm’s length from the main CompuServe brand), and after a slow start it proved a resounding success. Because of CompuServe’s national data network, most users only had to dial a local number to reach MicroNET, and thus avoided long-distance telephone charges, despite the fact that the actual computers they were connecting to resided in Ohio. His experiment having proved itself, Wilkins dropped the MicroNET name and folded the service under the CompuServe brand. Soon the company began to offer services tailored to the needs of microcomputer users, such as games and other software available for sale on-line.

But by far the most popular services were the communications platforms. For long-lived public content and discussions there were the forums, ranging across every topic from literature to medicine, from woodworking to pop music. The forums were generally left to their own devices by CompuServe, administered and moderated by ordinary users who took on the role of “sysops” for each forum. The other main communications platform was the “CB Simulator,” coded up over a weekend by Sandy Trevor, a CompuServe executive. Named after citizens band (CB) radio, a popular hobby at the time, it allowed users to have text-based chats in real time in dedicated channels, a model similar to the ‘talk’ programs offered on many time-sharing systems. Many dedicated users would hang out for hours on CB Simulator, shooting the breeze, making friends, or even finding lovers.

The Source

Hot on the heels of MicroNET – launching just eight days later in July of 1979 – came another on-line service for microcomputers, one that arrived at essentially the same place as Jeff Wilkins despite starting from a very different angle. William (Bill) Von Meister, the son of German immigrants (his father had helped establish zeppelin service between Germany and the U.S.), was a serial entrepreneur. He no sooner got some new enterprise off the ground than he lost interest, or was forced out by disgruntled financial backers. He could not have been more different from the steady Wilkins. As of the mid-1970s, his greatest successes to date were in electronic communications – Telepost, a service which sent messages across the country electronically to the switching center nearest the recipient, then covered the last mile via next-day mail; and TDX, which used computers to optimize the routing of telephone calls, reducing the cost of long-distance telephone service within large businesses.

Having, predictably, lost interest in TDX, Von Meister turned his enthusiasm in the late 1970s to Infocast, which he planned to launch from McLean, Virginia. In effect, it was an extension of the Telepost concept, except instead of using mail for the last-mile delivery, he would use the FM radio sideband (basically the same mechanism used to transmit station identification, artist, and song title to the screens of modern radios) to deliver digital data to computer terminals. In particular, he planned to target highly distributed businesses with lots of locations that needed regular information updates from their central office, such as banks, insurance companies, and grocery stores.

Bill Von Meister

But what Von Meister really wanted to build was a national network to deliver data into homes, to terminals by the millions, not thousands. Convincing a business to spend $1000 on a special FM receiver and terminal was one thing; asking the same of consumers was quite another matter. So Von Meister went casting about for another means to deliver news, weather, and other information into homes; and he found it in the hundreds of thousands of microcomputers that were sprouting like mushrooms in American offices and dens, in homes ready-equipped with telephone connections. He partnered with Jack Taub, a deep-pocketed and well-connected businessman who loved the concept and wanted to invest. Taub and Von Meister initially called the new service CompuCom, a mix of truncation and compounding typical for a computer company of the day, but later settled on a much more abstract and visionary name – The Source.

The main problem they faced was a lack of any technical infrastructure with which to deliver this vision. To get it they partnered with two companies that held, collectively, the same resources as CompuServe – time-shared computers and a national data communications network, both of which sat mostly idle on evenings and weekends. Dialcom, headquartered across the Potomac in Silver Spring, Maryland, provided the computing muscle. Like CompuServe, it had begun in 1970 as a time-sharing service,[6] though by the end of the decade it offered many other digital services. Telenet, the packet-switched network spun off by Bolt, Beranek and Newman earlier in the decade, provided the communications infrastructure. By paying discounted rates to Dialcom and Telenet for off-peak service, Taub and Von Meister were able to offer access to The Source for $2.75 an hour on nights and weekends, after an initial $100 membership fee.[7]

Other than the pricing structure, the biggest difference between The Source and CompuServe was how they expected people to use their systems. The early services that CompuServe offered, such as email, the forums, CB, and the software exchange, generally assumed that users would form their own communities and build their own superstructures atop a basic hardware and software foundation, much like corporate users of time-sharing systems. Taub and Von Meister, however, had no cultural background in time-sharing. Their business plan centered on providing large amounts of information for the upscale, professional consumer: a New York Times database, United Press International news wires, stock information from Dow Jones, airline pricing, local restaurant guides, wine lists. Perhaps the single most telling detail was that Source users were welcomed by a menu of service options on log-in; CompuServe users, by a command line.

In keeping with the personality differences between Wilkins and Von Meister, the launch of The Source was as grandiose as MicroNET’s was subtle, including a guest appearance by Isaac Asimov to announce the arrival of science fiction become science fact. Likewise in keeping with Von Meister’s personality and his past, his tenure at The Source would not be lengthy. The company immediately ran into financial difficulties due to his massive overspending. Taub and his brother had a large enough ownership share to oust Von Meister, and they did just that in October of 1979, just a few months after the launch party.

The Decline of Time-Sharing

The last company to enter the microcomputing market due to the logic of load factor was General Electric Information Services (GEIS), a division of the electrical engineering giant. Founded in the mid-1960s, when GE was still trying to compete in the computer manufacturing business, GEIS was conceived as a way to outflank IBM’s dominant position in computer sales. Why buy from them, GE pitched, when you can rent from us? The effort made little dent in IBM’s market share, but made enough money to receive continued investment into the 1980s, by which point GEIS owned a worldwide data network and two major computing centers, one in Cleveland, Ohio, and the other in Europe.

In 1984, someone at GEIS noticed the growth of The Source and CompuServe (the latter had, by that time, over 100,000 users), and saw a way to put their computing centers to work in off-peak hours. To build their own consumer offering they recruited a CompuServe veteran, Bill Louden. Louden, disgruntled with managers from the corporate sales side who had begun muscling in on the increasingly lucrative consumer business, had jumped ship with a group of fellow defectors to try to build their own online service in Atlanta, called Georgia OnLine. They tried to turn their lack of access to a national data network into a virtue by offering services tailored for the local market, such as an events guide and classified ads. But the company went bust, so Louden was very receptive to the offer from GEIS.

Louden called the new service GEnie, a backronym for General Electric Network for Information Exchange. It offered all of the services that The Source and CompuServe had by now made table stakes in the market – a chat application (a CB Simulator equivalent), bulletin boards, news, weather, and sports information.

GEnie was the last personal computing service born out of the time-sharing industry and the logic of the load factor. By the mid-1980s, the entire economic balance of power had begun to shift. As small computers proliferated in the millions, offering digital services to the mass market became a more and more enticing business in its own right, rather than simply a way to leverage existing capital. In the early days, The Source and CompuServe were tiny, with only a few thousand subscribers each in 1980. A decade later, millions of subscribers paid monthly for on-line services in the U.S. – with CompuServe at the forefront of the market, having absorbed its erstwhile rival, The Source. The same process also made time-sharing less attractive to businesses – why pay all the telecommunications costs and overhead of accessing a remote computer owned by someone else, when it was becoming so easy to equip your own office with powerful machines? Not until fiber optics drove the unit cost of communications into the ground would this logic reverse direction again.

Time-sharing companies were not the only route to the consumer market, however. Rather than starting with mainframe computers and looking for places to put them to work, others started from the appliance that millions already had in their homes, and looked for ways to connect it to a computer.


Further Reading

Michael A. Banks, On the Way to the Web (2008)

Jimmy Maher, “A Net Before the Web,” filfre.net (2017)

 


  1. https://www.ntia.doc.gov/legacy/ntiahome/fttn99/appendix.html
  2. Andrew Pollack, “Big I.B.M. Has Done It Again,” New York Times, March 27, 1983.
  3. Companies did sometimes make data exchange deals, such as when CompuServe agreed to exchange email with MCI. “MCI Mail Link with Compuserve,” New York Times, Feb. 25, 1986.
  4. Thomas P. Hughes, Networks of Power (1983), 216-225.
  5. Equivalent to about $24 and $58 in 2020 dollars, respectively.
  6. Incidentally, a Dialcom terminal in his high school provided a young Eric Schmidt, future Google CEO, with his first exposure to computers.
  7. Equivalent to about $13 an hour and a $480 signup cost in 2020 dollars.
