This post contains some rough notes on Andrew Russell's Open Standards and the Digital Age, which I just finished reading. If you are interested in the broad historical arc of standards development, particularly as it relates to the history of the Internet and telecommunications in the United States, then this is the book for you. In keeping with the spirit of Paul Edwards's reading advice, these notes aren't really meant to convey the full picture of Open Standards; they focus on the somewhat selfish task of reminding myself why I read it, what my personal stakes in the book were, and what I got out of it.

Before you begin, figure out why you are reading this particular book, and how you are going to read it. If you don’t have reasons and strategies of your own — not just those of your teacher — you won’t learn as much.

I wanted to read Open Standards because I've been thinking that researching web archiving practices will also require understanding the standards that make it possible. In addition to his findings about the development of the Internet, I was also curious about what theories, methods and sources Russell used in his attempt to investigate standards as "recipes for reality" (p. 16).

Russell is a [Science and Technology Studies] (STS) scholar who, in addition to starting the STS program at Stevens Institute of Technology, also helped start The Maintainers conference, which I've wanted to attend for the past couple of years but couldn't for a variety of reasons. My interest in repair, in particular how it relates to digital preservation, was sparked a few years ago by encountering Jackson (2014). After hearing Jackson speak at UMD, and through more reading I've dug into since then, I've come to see the connections between repair, maintenance and infrastructure studies as one strand in a tradition of STS work. STS is a very interdisciplinary field that (I think) fits well in an iSchool environment, as well as in the field of digital humanities, because of its study of technology using historical and ethnographic methods.

Aside: Digital humanities often gets articulated as the use of digital tools in humanities research. But I think it's fair to say that DH also includes the study of information technologies using theories and methods from the humanities. I'm hoping that this is not a radical claim, so that I can connect my interest in information studies with digital humanities.

Open Standards is centered around two concepts. The first is that our technologies are shaped by our politics, and that in turn our technologies shape our politics. Technologies, in other words, are embodiments and expressions of ideology. This theme draws on research on the politics and social construction of technologies (Hughes, 1987; Winner, 1980), which is in the news a lot today as we grapple with the effects of algorithms and big data on society. While Russell doesn't come out and say it, I think both of these ideas connect with theories of coproduction (Jasanoff, 2006) and Actor-Network Theory (Law & Singleton, 2013), which treat the human and the material as equal actors.

The second core idea in Open Standards is the role of critique in the study of technology, and of standards specifically. Critique functions here as what Foucault (1997) describes as an instrument of resistance, a challenge to the status quo. By the end of the book Russell has positioned standards themselves as critiques:

I have argued that we should understand these hybrid organizations -- and the standards they create -- as value-laden expressions of ideology, or ideas about how society should be ordered and how power should be exercised. I also have argued that innovation in network standards is a form of critique; these innovations do not merely challenge what is, they take productive action and make what could be. In this way, Open Standards and the Digital Age is an attempt to bring themes from business, economic and organizational history into closer conversation with constructivist histories of science and technology. In the process, we can bring a richer historical perspective to the question of how control persists in the midst of decentralization.

This approach reflects Russell's interest in using critique not just as a means of deconstruction, but also as a means of creation and invention, taking things apart in order to remake the world (Raunig, 2008). Both of these themes strongly recall Jackson's use of the word repair, and of breakdown as a site for design and innovation.

According to Russell there are three types of standards: de facto (market driven), de jure (legally enforced) and voluntary consensus. Russell's contribution is in distinguishing this third type, achieved through a process of voluntary consensus, as a kind of middle ground between standards that are developed in the marketplace and standards that are enforced by law. Russell's interest in this middle category is what allows him to provide a much richer historical description of the processes that give rise to standards. Otherwise the story would have been reduced to either an economic or a political analysis, which would miss the distinctive stories present in the book.

Standardization in the US in the 19th century happened largely in committees as part of organizations like the American Society of Mechanical Engineers (ASME), the American Institute of Electrical Engineers (AIEE) and the American Society for Testing Materials (ASTM). These efforts were largely driven by attempts to improve efficiency and reliability. These organizations welcomed participation from all types of interested parties, typically engineers and managers from businesses that were directly affected by the standards. The associations had rules to try to ensure that no one party exerted too much control, and the standards committees were linked directly to the professions that hosted them. [Herbert Hoover], himself a successful businessman and philanthropist, mobilized the idea of efficiency, with standardization in associations as a way to achieve it. Paul Gough Agnew of the American Engineering Standards Committee was another significant figure, one who had a role in the formation of the ISO after WW2, and who pointed out that:

... the human difficulties are usually much more serious than the technical ones. (p. 68)

Which is putting it mildly. The American Standards Association developed processes to encourage standardization without requiring centralized intervention by the federal government, though Hoover as President certainly endorsed these standards development organizations. De facto standards in the United States were being jointly approved at the national scale as a technique for opening markets.

AT&T’s standardization processes were mammoth.

By 1929, AT&T had created standards for an astonishing variety of functions, including telephone plant design; underground cables; raw materials; manufacture, distribution, installation, inspection, and maintenance of new equipment; business and accounting methods; nontechnical supplies (such as office furniture, appliance, janitor’s supplies, cutlery, and china); and provisions for safety, health, and even sleet storms. By the 1980s, the index alone of the Bell System Practices filled 969 pages; the volumes filled more than 80 cubic feet. (pp. 109-110)

The abbreviated form of the Bell System Practices as "The Practice" is a literal indicator that standards are in fact coded instructions for understanding practices.

Despite AT&T’s monopoly, Bancroft Gherardi Jr. emerges as a very likeable figure because of his role shaping the engineering efforts of AT&T over a period of rapid growth, and also for his role developing standards at the AIEE, ASME and ASA. Participation in these bodies allowed AT&T not only to reinforce its own position in the marketplace, but also to keep up with innovations happening elsewhere in the industry. The use of standards allowed them to take an incremental approach to innovation, rather than one of radical disruption. This incrementalism allowed for improving services and infrastructure while also maintaining them, and reminds me a bit of the early development of the web. Engineers at AT&T applied lessons learned during the development of railroads and the telegraph, using standards to control network effects. In some ways they were too successful, which led to regulation by the DoJ and FCC.

The FCC itself was born of a mistrust of business self-regulation, which was seen as having led to the Great Depression. Interest in decentralization was further fueled by interest in developing electronics and using market competition to drive that development.

[ARPANET] was developed by the Defense Department during the Cold War using packet switching, which was theorized by Paul Baran at RAND and Donald Davies at the National Physical Laboratory in England. AT&T had nothing to do with the effort; the network just ran over their lines. The Defense Department funded graduate student researchers at universities, who met as the Network Working Group (NWG). The NWG was a closed network, very unlike the standards bodies that existed at the time. The NWG initiated the RFC series since no other standards body would take the work on. The RFCs were informal documents because the group kept expecting the real designers to arrive, as Steve Crocker at UCLA recalled:

Most of us were graduate students … we kept expecting that an official protocol design team would announce itself. (p. 169)

Discussion of ARPANET was opened up in 1972 at the First International Conference on Computer Communications in DC. This meeting brought ARPANET people into conversation with others working on different approaches to packet switched networks in the UK and France ([Cyclades]), and led to the formation of the International Network Working Group (INWG), led by then UCLA grad student Vint Cerf. Cerf and Robert Kahn were able to secure lots of research money from ARPA to work on new ways of delivering packets between different types of networks (inter-networking). Kuo at the NWG issued a recommendation that the various large standards bodies not issue a standard, because more experimentation was needed. But there was too much interest, and companies and standards organizations couldn’t resist.

A significant debate was between virtual circuit and datagram based protocols. Both were based on packet switching, but in the former the network made sure all the packets were delivered, whereas in the latter only the endpoints (the sender and receiver) kept track of that. Datagrams came from Louis Pouzin at the Cyclades project. The virtual circuit approach simulated a connection, but datagrams were connectionless. Virtual circuits were more like the old telephonic technologies. With datagrams the network didn’t care what the data was; only the edges did. This became known as the [End-to-End Principle]. Pouzin comes across as a very likeable character in this story because of his understanding of these standards as critiques of the political and economic order, which (for computer networks) was largely defined by IBM.
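The datagram model survives today in UDP. As a minimal sketch (my own, not from the book) using Python's standard `socket` API, a datagram can be fired off with no prior connection setup; the network layer makes no delivery promises, and only the two endpoints interpret the bytes, which is the end-to-end idea in miniature:

```python
import socket

# Connectionless (datagram) exchange over UDP on the loopback interface.
# There is no circuit to establish: the sender just addresses a packet.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)     # fire and forget, no handshake

data, _ = receiver.recvfrom(1024)
print(data)                       # b'hello' (no guarantee in general,
                                  # but loopback delivery is reliable)
sender.close()
receiver.close()
```

A TCP version of the same exchange would first have to `connect()` and `accept()`, establishing something much closer to the virtual circuit the telephone companies favored.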

X.25 was an early standard for computer networks developed by the telecommunications industry; it was desirable to the telephone corporations because it was circuit based. Roberts (the first director of IPTO at DARPA and “Father of the ARPANET”) was trying to start a company, and tried to rush X.25 through while he was presiding over the group that drafted it. Pouzin opposed X.25 for technical as well as political/economic reasons, which was unusual (Pouzin, 1976).

Failure to get datagrams into X.25, and TCP into INWG 96, led Cerf to leave Stanford and direct the Internet Program at IPTO with Kahn.

Cerf, as we have seen, assumed a new leadership role in ARPA and, before long, stopped attending INWG meetings. He, along with Kahn, would eventually be singled out and celebrated for their pioneering roles in the creation of the Internet. The pair’s fame stems from a fateful strategic decision: to abandon the international standards process in order to build a network for their wealthy and powerful client, the American military. Although there were many different developments beyond the control of Cerf and Kahn that contributed to the eventual growth of the Internet, it is tempting to wonder what might have happened if the leaders of INWG had been able to build and sustain a meaningful international consensus design for datagram networks.

INWG 96 basically went nowhere because of the corporate and governmental forces arrayed against it. Pouzin was vocal about the adverse effect that the telecommunications industry was having on the standardization process, and his Cyclades project was punished by the French government; still, Cerf called him the "datagram guru". The telecommunications industry's push for X.25, and attempts to standardize Open Systems Interconnection (OSI) at the ISO, failed because of industry and national pressures. In an interesting aside, [Charles Bachman] of Honeywell talks about the ISO’s approval of the OSI reference model in 1982:

I would say that 75% was based on my work at Honeywell. It was well documented. We worked, as you sometimes have to do, to cover the traces. (p. 213)

In doing historical work around standards it becomes important to understand what these traces were, and why they were important to efface. The citation structure of Open Standards shows Russell transitioning from citing archival material to citing content that is on the web. It really is striking how much of the history of the Internet is in emails on the Internet. This detail reminds me of [Kelty:2008]'s insight about the recursive public, where the technology of the Internet is used to create the Internet.

One of the most compelling findings Open Standards makes is that the Internet's ideology of openness obscures its autocratic origins:

The Internet, like many technological novelties that preceded it, provokes fantasies that its history does not support. Its origins were autocratic, not democratic. Its design and standardization indicates the central importance of organizational boundaries and the alliances that mobilize across them; it does not portend the death of organizational gatekeepers or of organizations altogether. (p. 263)

It is useful to look at the development of the Internet as part of the larger historical arc of standardization, one that focuses on the organizations putting standards into motion. Russell credits Galambos (2012) with this idea of using the history of organizational change to look at technology. Galambos was on Russell's PhD dissertation committee, so there is an intellectual lineage there to look at. Above all, Russell's method of attending to standards, and to the words people use, in order to illuminate the practices and discourse of Internet engineering could serve as a good method for my own work.

Rather than focus on the lessons that past information networks can teach us in the present, I have chosen to study how the designers of networks responded to their own circumstances and how they have seen these through their own eyes. Inspired by Henry Demarest Lloyd's observation that "history is condensed in the catchwords of the people," I have been especially attentive to the discourses and practices of standardization through which network architects, engineers, and users sought to exercise power, impose order, create stability, and pursue openness.

In some ways Open Standards connects with James Scott's [Seeing Like a State], which has greatly influenced my own thinking. In the United States it is not only the state that is seeing, but also a complex assemblage of organizations and infrastructure working to make certain things legible. How do Scott's ideas of resistance to these modes of seeing play into this? Thinking of Scott also highlights one of the potential blind spots of Russell's analysis. Much of the book rests on a foundation of American exceptionalism, in which consensus based standards were developed because of an American fear of centralized control: organizations needed ways of creating efficiencies in the marketplace without resorting to de jure standardization, as much of Europe was doing. However the development of de facto standards, and the history of practice, can be traced back further than American distrust of authority. These are histories of practice with a longer arc that it would be useful to at least gesture at (Sennett, 2008). But perhaps that would undermine the historical tale being told here.

Despite arguments that the Internet’s development was the result of open standards processes, OSI’s failure was largely attributed to its openness. The development model of TCP/IP rejected membership and voting. Others such as Turner (2006) have linked this skepticism to the counterculture of the 1960s. However the convergence of telephony and computer technologies (ICTs) gave rise to a recognition that national, if not global, standards were required. The criticisms of OSI mobilized by the TCP/IP camp were largely expressed as problems of anticipatory standardization, as opposed to codifying existing practice. Kahn and Cerf pursued a different (and older) approach than OSI, in which implementation preceded standardization. The whole OSI vs TCP/IP story is very reminiscent of the Semantic Web vs REST visions of the Web... which is itself a story of standardization and openness.

There's lots to chew on in this book, but at the end of the day the main thing I can bring into my own work is the idea of standards as critiques: as documents that organize and challenge the status quo in particular ways. Being able to articulate how standards operate, and whom they operate on, is a key element in understanding their historical context.

References

Foucault, M. (1997). The politics of truth. In S. Lotringer & L. Hochroth (Eds.) (pp. 23–82). New York: Semiotext(e).

Galambos, L. (2012). The creative society, and the price Americans paid for it. Cambridge University Press.

Hughes, T. P. (1987). The social construction of technological systems: New directions in the sociology and history of technology. In W. E. Bijker, T. P. Hughes, & T. J. Pinch (Eds.) (pp. 51–82). MIT Press, Cambridge, MA.

Jackson, S. J. (2014). Media technologies: Essays on communication, materiality and society. In P. Boczkowski & K. Foot (Eds.). MIT Press. Retrieved from http://sjackson.infosci.cornell.edu/RethinkingRepairPROOFS(reduced)Aug2013.pdf

Jasanoff, S. (2006). States of knowledge: The co-production of science and the social order. Routledge.

Law, J., & Singleton, V. (2013). ANT and politics: Working in and on the world. Qualitative Sociology, 36(4), 485–502.

Pouzin, L. (1976). Virtual circuits vs. datagrams: Technical and political problems. In Proceedings of the june 7-10, 1976, National Computer Conference and Exposition (pp. 483–494). ACM. Retrieved from https://www.computer.org/csdl/proceedings/afips/1976/5084/00/50840483.pdf

Raunig, G. (2008). What is critique? Suspension and re-composition in textual and social machines. Transversal, 113. Retrieved from http://eipcp.net/transversal/0808/raunig/en

Sennett, R. (2008). The craftsman. Yale University Press.

Turner, F. (2006). From counterculture to cyberculture: Stewart Brand, the Whole Earth Network, and the rise of digital utopianism. University of Chicago Press.

Winner, L. (1980). Do artifacts have politics? Daedalus, 121–136.