The Ethics of Engineering

Nick Doty

UC Berkeley, School of Information

December 30, 2019

Status of This Document

This is a chapter of a published dissertation: Enacting Privacy in Internet Standards.

2 The Ethics of Engineering

2.1 Engineering is inherently ethically laden

In studying the ethical implications of the Internet and the Web (or indeed of technology in general), one might reasonably ask: why study the engineers at all? Why not just study the users of technology, or the business incentives for tech-focused corporations, or the specific details of software artifacts? Scholars of science, technology and society do examine all those things, with different research focuses,1 but I see a strong philosophical basis for exploring the ethics and ethos of those who engineer and design technology. While arguments against a view of tools as purely neutral are widespread and diverse, I am guided by the view described by José Ortega y Gasset of technology as a particularly human act.

Ortega’s argument is more essentialist than other arguments for the ethical implications of technology. Technology is not ethical just because it has a particular set of consequences and those consequences happen to be ethical ones; rather, technology is a set of choices about the good life.

To follow the argument step by step (Ortega y Gasset and Miller 1962):

  1. technology is the distinctly human act of changing or reforming nature;
  2. it is characteristic of man to employ technology, “the adaptation of the medium to the individual” (p. 96), to address her necessities;
  3. but technology is not limited to satisfying biological necessities;
  4. and indeed man seems to consider those “superfluous” things to be essential to life: “Not being, but well-being, is the fundamental necessity of man, the necessity of necessities” (p. 99); to sum it up:

Man, technology, well-being are, in the last instance, synonymous. (p. 100)

Ortega concludes from the synonymity that the direction of technology is inherently subjective, as a result of the different views of the good life:

Whereas life in the biological sense is a fixed entity defined for each species once and for all, life in the human sense of good life is always mobile and infinitely variable. And with it, because they are a function of it, vary human necessities; and since technology is a system of actions called forth and directed by these necessities, it likewise is of Protean nature and ever changing. (p. 101)

If you accept the subjectivity of our desires and what makes for a good life and you accept Ortega’s argument that technology just is the reformation of nature to bring about those various superfluous aims, then, he argues, you should not accept that there is a singular progression of technology. To do so would be to “assume that man’s vital desires are always the same and that the only thing that varies in the course of time is the progressive advance towards their fulfillment. But this is as wrong as wrong can be” (p. 102).

The synonymity of technology and well-being, and the potential for technology to be lost as desires or circumstances change, lead to the argument that engineering ought to be conceived of broadly, rather than as a narrow, neutral, technical pursuit. Ortega is arguing for a holistic view of technology and the good life.

para ser ingeniero, no basta con ser ingeniero2

Engineers have to be more than just engineers because their work is the work of constructing human well-being, and that view of well-being may change: “the social, economic and political conditions under which he works are changing rapidly” (p. 104).

2.2 Separation vs. integration

There are two fundamentally competing impulses over the role of the engineer and the engineering process in the ethical implications of a system. One is towards separation. It’s considered sound engineering practice to maintain a “separation of concerns”: the efficiency, modularity, reusability and testability of code all benefit from making each component self-contained and focused on its own task. An analogous philosophy argues for that separation in the process of developing new technology; the engineer focuses on the mechanism, not the policy (“Mechanism Not Policy” 2005), on the how, rather than the what. A developer might say that the choice for how the system is supposed to work is “above my pay grade” or that the quality of a piece of code is determined by whether it meets the specification provided by the customer or the manager. Engineers may choose to “punt” a decision to be resolved later or elsewhere, either for pragmatic reasons or to enable flexibility or choice for some other party (Doty 2015). Architects of the Internet have recommended a principle of accommodating the “tussle” between the differing priorities of stakeholders, including by “designing for choice” for the different parties in a communication, because conflict is inevitable and unresolvable (Clark et al. 2002).3
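
To make the separation concrete, consider a minimal sketch of “mechanism, not policy” (a hypothetical Python example; the names `retry` and `should_retry` are illustrative, not drawn from any particular library). The mechanism’s author builds a generic capability and deliberately punts the contested decision, how many failures to tolerate, to whoever supplies the policy:

```python
from typing import Callable, TypeVar

T = TypeVar("T")

def retry(operation: Callable[[], T],
          should_retry: Callable[[int, Exception], bool]) -> T:
    """Mechanism: run `operation`, retrying on failure.

    The policy (how many failures to tolerate, which errors warrant
    another attempt) is deliberately left to the caller.
    """
    attempt = 0
    while True:
        try:
            return operation()
        except Exception as exc:
            attempt += 1
            if not should_retry(attempt, exc):
                raise

# The policy is decided elsewhere ("above my pay grade" for the
# mechanism's author): here, give up after three attempts.
def three_strikes(attempt: int, exc: Exception) -> bool:
    return attempt < 3
```

The same pattern recurs at the level of protocol design: a specification can define the mechanism (header fields, negotiation) while leaving the policy (what to permit or prefer) to implementers or users.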

Even the holistic view of technology as ethics can include this perspective: at times even Ortega ranks engineers below “poets, philosophers [and] politicians,” because the engineer is dependent on their analysis of the values of human life. Or, to use the cogent example of the development of the atomic bomb, Richard Sennett (2008) introduces us to the argument over the engineer’s ethical culpability and involvement, positioning Hannah Arendt (1958) as arguing for the subservience of the engineer.

A counteracting impulse is toward integration of ethical concerns into the development process. Scholars and practitioners alike have argued that technical decisions are not “pure,” “apolitical” or “neutral.” There are infamous examples of choices of technical architecture with profound, concrete and durable impacts on basic questions of public policy, like the height of overpasses and the inaccessibility of parks to people without wealth (Caro 1975).4 These cases of technological delegation emphasize the impracticality of a separation approach.5 At times, scientists and engineers have spoken up to express their strong ethical perspectives, bolstered by their knowledge of and participation in the development of influential technologies; to continue with the atomic bomb case, Einstein co-authored a post-war manifesto (1955), directed towards politicians and government leaders and arguing for pacifism.

This impulse towards intentional integration prompted the creation of a proto-field of academic study in “values in design” (VID): a community of interdisciplinary scholars recommending consideration of ethics and human values in the design of technology and infrastructure, rather than waiting for those implications to be seen and addressed after the fact (Knobel and Bowker 2011). But recognizing that values considerations can be relevant to technical design decisions does not automatically make it easy to integrate them (Flanagan, Howe, and Nissenbaum 2008):

It is one thing to subscribe, generally, to these ideals [either the ideals of liberty, justice, enlightenment, etc. or the responsibility to take them into account], even to make a pragmatic commitment to them, but putting them into practice, which can be considered a form of political or moral activism, in the design of technical systems is not straightforward.

With that inherent integration recognized, there have been prominent attempts in global politics to use the design process proactively to buttress values of interest.6 Regulators increasingly call for “privacy-by-design”7 with the hope that software built to support privacy will have fewer of the unanticipated and troubling breaches of privacy during its use. Privacy by design may include: default settings for more private modes; data minimization so that technical systems collect or retain only the granularity of information needed for a particular purpose; and audits and organizational controls to limit misuse of personal data.
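
As a brief illustration, here is a hedged sketch of the first two of those tactics, private defaults and data minimization (hypothetical Python; the class, field and function names are invented for this example, not taken from any real system):

```python
from dataclasses import dataclass

@dataclass
class SharingSettings:
    """Private defaults: the more private mode is the default,
    so a user must opt in to sharing rather than opt out."""
    share_location: bool = False
    share_browsing_history: bool = False

def minimized_location(lat: float, lon: float) -> tuple[float, float]:
    """Data minimization: keep only coarse, roughly city-level
    coordinates (two decimal places, about 1 km) when that is all
    the hypothetical feature needs, instead of the precise position."""
    return (round(lat, 2), round(lon, 2))

# A precise reading is coarsened before it is ever stored.
print(minimized_location(37.871593, -122.272743))  # (37.87, -122.27)
```

The third tactic, audits and organizational controls, operates largely outside the code itself.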

With perhaps less political attention, similar reasoning has been used to promote a security development lifecycle (Lipner 2004) and to consider other system properties (internationalization, accessibility, performance) throughout software development. In each case we can see struggles to enact those values and system properties, including at least epistemological barriers (what really is the value in each case?) and practical ones (what methods and practices best bring it about?) (Flanagan, Howe, and Nissenbaum 2008).

It might be taken as obvious, or simply accepted, that the design of new communications technology has impacts on important public policy values and that there are ethical implications to the design decisions that engineers make. But these impulses in tension explain why that widely accepted importance does not translate straightforwardly into knowing how most ethically to design technology. Our study must recognize these competing principles and engineering practices. The ethics of engineering will include both accommodating diverse, conflicting uses and embedding some fundamental values.

2.3 Ethics in organizations, professions and individuals

Questions of how architectural decisions with ethical implications are made are often answered with high-level explanations based on economic incentives or legal constraints. (“Why does Google track my online browsing activity?” “Because tracking provides a higher return-on-investment in online advertising and Alphabet Inc. is a for-profit shareholder-value-maximizing firm.”) Market dynamics are no doubt important in the direction of technology firms and economic explanations will be useful in explaining corporate actions. But in this work I will primarily seek to examine the backgrounds, motivations and decisions of individuals (including software engineers and other participants in technical standard-setting) and the dynamics of working groups and professional communities.

I believe economic arguments do not have the explanatory power or richness that other social scientific analyses can provide and that free-market economics alone cannot account for the relevant architectural decisions made by engineers and others in the development of the Internet and the Web. This belief is informed by my understanding of:

That particular architectural decisions by individual engineers have meaningful ethical consequences is also informed by the philosophical arguments of the previous sections. If we accept the holistic view, as Ortega argues, that technology doesn’t just have ethical implications but by its nature defines what a good life is, and if we accept that integrating values into the design of technical systems is at least sometimes preferable, then we should, as researchers, look at the perspectives and practices of individuals engaged in engineering and design to more fully understand these ethical-technical decisions.

2.3.1 An ethos of engineering

To understand the ethical practices and commitments of this Internet engineering community, it is useful also to consider its ethos: its character or guiding concepts. Coleman (2012) describes the interplay of hacker ethics and aesthetics. While she is careful to point out that there is no singular hacker ethic, Coleman identifies political strains of liberalism (free speech, inalienable labor) connected to the deep satisfaction (eudaimonia, even) of tinkering and subversion of systems among F/OSS contributors.

Software engineering shares with other types of engineering an impulse to “build,” “make” or “create.” That impulse can develop into an ethic to do something, to build something, in part exactly because one can do so (Doty 2013). Solving a difficult problem, even without a particularly remunerative or societally valuable outcome, is often considered sufficient reward in itself. A common method of recruiting software engineers is extolling the set of “hard problems” to work on. We can see here both exploratory motivations (à la climbing a mountain “because it’s there”) and a motivating sense of independence (showing that you can do it on your own, through use of technology).

At the same time, technology faces a challenge in response: just because you can do something, should you?

“Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” A quote from the movie Jurassic Park, now commonly used as a meme to humorously indicate that some novelty is foolish or irresponsible. For example: “Hey Jim Comey, listen to Jeff!”

Or, almost conversely, given the privilege of those few who can make potentially great differences through the creation and use of technology, are engineers doing their best to live up to that opportunity? This became a pointed question in the responses to the suicide of Aaron Swartz (aaronsw) and in local debates about tech company social responsibility, displacement and housing in San Francisco. And it spawned many recitations and riffs on the opening from “Howl” (Ginsberg 1955), applied to the apparent lack of ambition or importance of software development among Web giants:

The best minds of my generation are thinking about how to make people click ads.

While the earliest version of the quote is from Jeff Hammerbacher (formerly of Facebook) in 2011 (Vance 2011), it became a common, even blasé, criticism of priorities in software development.8 It’s not clear how precise the analogy is: whether Hammerbacher or others intended a reference to drug addiction, homosexuality or the artist as an outcast in materialist society as described by the Beat poets, or whether “I saw the best minds of my generation” is simply a memorable introduction that can somewhat ironically be used to describe well-educated, ambitious computer programmers. The usage is consistent, though, in suggesting that the tech industry and individual software engineers make substantial impacts and in lamenting the loss of an opportunity to apply that intellectual energy to some higher goal. The ethos of capability and impact is tied to an ethical aspiration.

2.3.2 Professional ethics

Ethical norms spread through a profession, as well as through organizational hierarchies or personal social ties. Famously, the Hippocratic Oath is used as a formal example of an ethical code in medicine, a shared agreement that, among other commitments, doctors shall first do no harm. In traditional engineering (civil and mechanical, in particular), a similar moral commitment is present in the Ritual of the Calling of an Engineer (written by Rudyard Kipling) or in the oath of the Order of the Engineer (“Obligation” 2018):

As an engineer, I pledge to practice Integrity and Fair Dealing, Tolerance, and Respect, and to uphold devotion to the standards and dignity of my profession, conscious always that my skill carries with it the obligation to serve humanity by making best use of the Earth’s precious wealth.

Professionalization can be a way for an obligation to the public to be maintained, even when it runs contrary to the particular interests of an individual or firm. The sociology of law offers evidence that the professional background and training of lawyers can distribute norms across national boundaries (Carruthers and Halliday 2006). Software engineering may not have a code of ethics or professional societies as pervasive as those in medicine, law or engineering, but ethical codes, ethical education and professional organizations9 are present, and professionals in engineering and technology are clearly asking the same questions as other professions about their commitments to society.

Calls for a Hippocratic Oath or more rigorous ethical codes of conduct for practitioners in software engineering and data science are widespread. That might be an indication that the existing codes of professional ethics are not widely known. But criticism of engineering ethics codes and their utility or focus is long-standing.

For example, Luegenbiehl and Puka (1983) note the historical basis of ethical codes in engineering as driven by an interest in professionalization, criticize them (unfairly, I would say) for not being exhaustive guides to ethical conduct, and note an individualism (perhaps inherited from legal and medical ethics) that may not be appropriate for engineering practices that we know affect the wider public. Lynch and Kline (2000) argue for considering the ethics of everyday and non-technical parts of engineering practice, rather than focusing too narrowly on whistleblower moments and case studies of conflicts with amoral management. Davis (1991) argues for the utility of a code of ethics as part of a profession, in solving the coordination problem of individual, ethically-minded engineers resisting a client’s or manager’s request. But he also concludes that engineers have this professional and ethical motivation not because of any familiarity with the text of a code of ethics, but because it’s part of “thinking like an engineer.”

How codes conceive of their obligations can explain (or indicate the presence of) a cultural perspective towards the profession. For example, some codes focus on an obligation to the public while others emphasize the responsibility to a particular client, with likely different results in professional attitudes. Stark and Hoffmann (2019) identify different motivating metaphors in ethical codes that represent different professions (or different parts of a broader computing or data science profession) and that correspond to different values and different prioritized constituencies. Professional codes can contribute to credibility or to benevolence or both, and computer engineering has unfortunately not had a focus on benevolence. They quote Kate Crawford in noting, “data ethics needs to ask, ‘what kind of world do we want to live in?’” Indeed, if we see engineering as technical work inherently and explicitly asking that question (as Ortega suggests), how tools will shape a different world and which different world we want, then we can see engineering ethics not just as professional behavior or appropriate stakeholder harm-reduction, but as an essential aspect of engineering.

Each of these reviews of the ethical impact of engineering, and of the construction or focus of an ethical code for the engineering profession, emphasizes the distinctive practice of engineering itself, whether that is “thinking like an engineer” or the day-to-day practice of engineering work. The ethos of engineering is difficult to specify, but it meaningfully shapes how ethical practices are approached by individuals and communicated across organizations and professions.

2.4 Engineering impacts for values of the Internet

This chapter has reviewed the inherent ethical impacts of engineering. Given the outsized role that Internet engineering and the choices of many individual software engineers play for values such as privacy, this research seeks to understand how privacy is or is not supported by those who develop the Internet and the Web. We must ask: how do the designers of the Internet’s underlying protocols view privacy as a value? And how do their views ultimately affect the privacy of Internet and Web users?10

For this research project, I focus on public policy areas with a special valence for the Internet and the Web: security and privacy. Why those values, and how responsibility for those values is allocated and distributed, is explained in the following chapter: Privacy and Security: Values for the Internet.

References

Arendt, Hannah. 1958. The Human Condition. University of Chicago Press.
Caro, Robert A. 1975. The Power Broker: Robert Moses and the Fall of New York. New York: Vintage Books. http://www.amazon.com/The-Power-Broker-Robert-Moses/dp/0394720245.
Carruthers, Bruce G, and Terence C Halliday. 2006. “Negotiating Globalization: Global Scripts and Intermediation in the Construction of Asian Insolvency Regimes.” Law & Social Inquiry 31 (3): 521–84. https://doi.org/10.1111/j.1747-4469.2006.00022.x.
Clark, David D., John Wroclawski, Karen R. Sollins, and Robert Braden. 2002. “Tussle in Cyberspace: Defining Tomorrow’s Internet.” In Proceedings of the 2002 Conference on Applications, Technologies, Architectures, and Protocols for Computer Communications, 347–56. SIGCOMM ’02. New York, NY, USA: ACM. https://doi.org/10.1145/633025.633059.
Coleman, E. Gabriella. 2012. Coding Freedom: The Ethics and Aesthetics of Hacking. Princeton University Press. http://www.amazon.com/dp/0691144613.
Davis, Michael. 1991. “Thinking Like an Engineer: The Place of a Code of Ethics in the Practice of a Profession.” Philosophy & Public Affairs, 150–67.
Doty, Nick. 2013. “Because You Can Is Reason Enough to Do Something.” Bcc (blog). August 17, 2013. http://bcc.npdoty.name/because-you-can-is-reason-enough-to-do-something.
———. 2015. “Interesting Questions I Heard from Students in Class: Standard-Setting and ‘Punting’ Decisions.” Known.npdoty.name (blog). February 24, 2015. http://known.npdoty.name/2015/interesting-questions-i-heard-from-students-in-class-standard-setting-and.
Einstein, Albert, and Bertrand Russell. 1955. “The Russell-Einstein Manifesto.” Proceedings of the First Pugwash Conference on Science and World Affairs. https://pugwash.org/1955/07/09/statement-manifesto/.
Flanagan, M., D. Howe, and H. Nissenbaum. 2008. “Embodying Values in Technology: Theory and Practice.” Information Technology and Moral Philosophy, 322–53.
Ginsberg, Allen. 1955. Howl. https://www.poetryfoundation.org/poems/49303/howl.
Knobel, Cory, and Geoffrey C. Bowker. 2011. “Values in Design.” Commun. ACM 54 (7): 26–28. https://doi.org/10.1145/1965724.1965735.
Lipner, S. 2004. “The Trustworthy Computing Security Development Lifecycle.” In 20th Annual Computer Security Applications Conference, 2–13. https://doi.org/10.1109/CSAC.2004.41.
Luegenbiehl, Heinz C., and Bill Puka. 1983. “Codes of Ethics and the Moral Education of Engineers [with Commentary].” Business & Professional Ethics Journal 2 (4): 41–66.
Lynch, William T., and Ronald Kline. 2000. “Engineering Practice and Engineering Ethics.” Science, Technology, & Human Values 25 (2): 195–225.
“Mechanism Not Policy.” 2005, July. http://c2.com/cgi/wiki?MechanismNotPolicy.
Morozov, Evgeny. 2013. To Save Everything, Click Here: The Folly of Technological Solutionism. Public Affairs.
“Obligation.” 2018. Order of The Engineer. 2018. http://www.order-of-the-engineer.org/?page_id=6.
Ortega y Gasset, José, and John William Miller. 1962. History as a System and Other Essays Toward a Philosophy of History. Translated by Helene Weyl. New York: W. W. Norton & Company.
Sennett, Richard. 2008. The Craftsman. London: Allen Lane.
Stark, Luke, and Anna Lauren Hoffmann. 2019. “Data Is the New What? Popular Metaphors & Professional Ethics in Emerging Data Culture.” Journal of Cultural Analytics. https://doi.org/10.22148/16.036.
Tufekci, Zeynep. 2016. “The Real Bias Built In at Facebook.” The New York Times, May 19, 2016, sec. Opinion. https://www.nytimes.com/2016/05/19/opinion/the-real-bias-built-in-at-facebook.html.
Vance, Ashlee. 2011. “This Tech Bubble Is Different.” Bloomberg Businessweek, April 14, 2011. https://www.bloomberg.com/news/articles/2011-04-14/this-tech-bubble-is-different.
Winner, Langdon. 1980. “Do Artifacts Have Politics?” Daedalus 109 (1): 121–36. http://www.jstor.org/stable/20024652.

  1. The designer-artifact-user spectrum is only one way of classifying researchers or the object of research in technology and society, but it can be an interesting one for information science colleagues. South Hall whiteboards have explored this in spectrum and triangle form.↩︎

  2. In English: to be an engineer, it is not enough to be an engineer. I first encountered this quote in Morozov (2013).↩︎

  3. Clark et al. explicitly consider “mechanism, not policy” and describe it as “too simplistic” but still a valuable principle in trying to separate out pieces of the system more or less likely to involve tussles between parties.↩︎

  4. However, note that while Caro’s example of overpasses to prevent public transit access is illustrative and well-documented, most of The Power Broker portrays Moses not as the skilled engineer (indeed, he has no engineering training), but as a skilled legislative aide, manager and architect of public opinion.↩︎

  5. This conversation can very easily get confused when we talk about the attribution of values. Some get upset when Latour writes about technical agents and accuse him of a basic fallacy of attributing mental states and intentions to inanimate objects. When critics speak of the bias or politics of algorithms (Tufekci 2016; Winner 1980), some technical audiences are confused because the algorithm itself has no apparent political content or skewed intent. It is often the choice of algorithm that has a political impact and the decision of the designer of a robotic agent that carries a moral weight.↩︎

  6. Flanagan, Howe and Nissenbaum called this the “pragmatic turn” to “values as a design aspiration” (2008).↩︎

  7. Cavoukian popularized the term and devised one specific process, but the approach in more general terms has been adopted by policymakers and software firms around the world.↩︎

  8. I mostly stopped keeping track after 2014, but even then, it was turning into a meta-joke because it was so widespread: https://pinboard.in/u:npdoty/t:bestminds/.↩︎

  9. The Association for Computing Machinery is currently revising its code of ethics. More specific organizations develop training and certification for sub-fields; IAPP for privacy, for example.↩︎

  10. These make up dissertation Research Question 2.↩︎