Being Responsible in the Age of Social Media, Cryptocurrency and Smart Weapons/Cars/Phones
In this episode we will consider several aspects of ambivalent technological determinism. Through this analysis we will discover the political dimension of technology.
One form of ambivalent technological determinism is what Langdon Winner refers to as "technological drift" (Winner, 88-100). In an age of pervasive technology, our society "drifts" on currents produced by the "directionless imposition of [technologically influenced] structures, interactions and values without meaningful [human] participation." (Stivers, 53) Though technological drift is unintentional, it is not necessarily bad and, in fact, is a predictable consequence of contemporary technology. As Winner states,
technology is most productive when its ultimate range of results is neither foreseen nor controlled. To put it differently, technology always does more than we intend; we know this so well that it has actually become part of our intentions. Positive side effects are in fact a latent expectation or desire implicit in any plan of innovation. Negative side effects, similarly, are experienced as necessary evils that we are obligated to endure. Each [technological] intention, therefore, contains a concealed "unintention," which is just as much a part of our calculations as the immediate end in view. (Winner, 98)
This raises the issue of how the inevitability of technological drift can be responsibly anticipated.
The Technological Imperative: Invention is the mother of necessity
A second form of ambivalent technological determinism is linked to a strong innovative and deterministic force generated by technology itself. This property of technology is captured in Kranzberg's second law. Standing traditional wisdom on its head, Kranzberg claims, "Invention is the mother of necessity." Stated simply, this means that "Every technical innovation seems to require additional technical advances to make it fully effective." (Kranzberg, 548) A prime example is hybrid cars, which are only now maturing into an established technology.
This property of technology is seen vividly when viewed at the level of large technological systems. Here it is clear that as systems develop within specific social contexts, new properties of the system emerge and are discovered, and new problems (technical, social, and moral) are revealed. Each of these aspects of technological development and application spawns change.
As Kranzberg indicates, the automobile is a prime example of this phenomenon:
[The development of the automobile] brought whole new industries into being and turned existing industries in new directions by its need for rubber tires, petroleum products, and new tools and materials. Furthermore, large scale use of the automobile demanded a host of auxiliary technological activities -- roads and highways, garages and parking lots, traffic signals, and parking meters. While it may be said that each of these other developments occurred in response to a specific need, I claim that it was the original invention [but not its purpose] that mothered that necessity. (Kranzberg, 549)
Recall that earlier in this series of blogs we defined technology as the extension or enhancement of human capacity or power by artificial means. In the case of automobile technology, the intent to expand human mobility has resulted in the need to develop technologies (such as oil-spill clean-up technologies) far removed from the original intent. Without the development of these additional technologies, the automobile-based transportation system as we know it would be impossible. With the advent of hybrid cars, in the context of global warming, this story of invention being the mother of necessity continues. Take, for instance, innovations in battery technology that are spinning off into many other technological realms, and the emergence of electric plug-in fueling stations in parking lots. Necessities related to smart ("autonomous") vehicles are already spinning out new technologies that will become tomorrow's normal.
This property of technology goes by several names. Kranzberg calls it "technological imbalance," apparently wishing to stress the potential for the introduction of disequilibrium into the technological and social systems involved--what we now call "disruption." Winner stresses the ineluctable character of technology by referring to this phenomenon as the "technological imperative." [emphasis mine] Winner uses the concept of technological imperative specifically to describe "the system's need to control supply, distribution, and the full range of circumstances affecting its operation." (Winner, 251) I expand it to include all aspects of technological ordering.
The technological imperative has become critical to the ordering of human life in a technological society. Such societies are characterized by large-scale systems, such as the Internet, especially as it is being expanded and enhanced by blockchain and cellular communication technologies. The superstructure of contemporary society includes a vast array of technological and non-technological structures, the primary purpose of which is the maintenance of technology (including, increasingly, its security). This superstructure creates demands on the society that must be met if the society is to function and prosper in its current and foreseeable form--as a technologically extended and enhanced society.
Technology as a political phenomenon
For Winner (and other critics of technological society, such as Daniel Bell, Herbert Marcuse, Jacques Ellul, and Neil Postman) the ultimate issue related to the technological imperative is human control--or, more precisely, the lack of control--over megatechnological systems. These critics argue that contemporary technology tends to co-opt political control from human beings.
Here, "political" is not understood solely in terms of institutions or political parties, though they, too, are caught up in the technological imperative, as evidenced by the furor over Russian technological interference in the 2016 US election. Rather, politics is viewed as the processes and structures related to social power and authority, and to social policy formation and implementation.
This raises fundamental questions regarding freedom in determining corporate and individual destiny and responsibility within the structures of technological society. Just consider the effects of presidential tweeting (a rather new and powerful communication technology related to social media) on virtually all aspects of governance in the United States since the 2016 election.
According to Winner, technology has the "capacity to transform, order, and adapt animate and inanimate objects to accord with purely technical structures and processes." (Winner, 237) If this does not ring true, try taking a ride in an autonomous vehicle! Or try to reroute the next airline flight you get bumped from!! Or try to stop your smartphone from doing whatever it is doing that you cannot get it to stop doing!!!
Winner's point can be applied to the general context of human agency and moral responsibility. Kranzberg's Laws apply. Technological phenomena are neither good nor bad in and of themselves. However, technology does tend to engender a technological ordering of life that becomes, in an age of pervasive technology, pervasive! It is thus a primary context for responsibility.
The implication is that technology should be viewed not as neutral, but as a political phenomenon. In a sense, technological systems "legislate" social policy in contemporary societies. As Winner suggests, "New technologies are institutional structures within an evolving constitution that gives shape to a new polity, the technopolis." (Winner, 323) The question then becomes: what mechanisms for control of this situation are possible and necessary for the maintenance of human agency and democracy?
Next time in Being Responsible in the Age of Social Media, Cryptocurrency, and Smart Weapons/Cars/Phones -- "Ambivalent Technology 3: Forced Options"
This is an updated version of a portion of "Complex Responsibility in an Age of Technology," in Living Responsibly in Community, ed. Fredrick E. Glennon, et al. (University Press of America, 1997): 255-257.
Langdon Winner, Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought (Cambridge, MA: MIT Press, 1977).
Robert L. Stivers, Hunger, Technology, and Limits to Growth (Minneapolis: Augsburg Press, 1984).
Melvin Kranzberg, "Technology and History: 'Kranzberg's Laws,'" Technology and Culture 27/3 (July 1986): 544-560.
In this series
Introduction: Being Responsible in the Age of Social Media, Cryptocurrency, and Smart Weapons/Cars/Phones - June 1, 2018.
Episode 1: "What Does It Mean To Be Responsible? "- June 5, 2018.
Episode 2: "Technology Revealed as a Mode of Human Activity" - June 16, 2018.
Episode 3: "Homo technicus as the Responsible Self" - June 30, 2018.
Episode 4: "The Scope of Responsibility in an Age of Pervasive Technology" - July 12, 2018.
Episode 5: "Ambivalent Technology 1: Technological Determinism" - August 10, 2018.
Episode 6: "Ambivalent Technology 2: The Political Dimension of Technology" - August 15, 2018.
Episode 7: "Ambivalent Technology 3: Forced Options"
Episode 8: "Ambivalent Technology 4: The Ideological Dimension of Technology"
Episode 9: "Ambivalent Technology 5: The Ethics of Self-Limitation"
Episode 10: "The Responsible Self as Homo technicus: Complex Responsibility"
© 2018 Russell E. Willis