This is a chapter of a published dissertation: Enacting Privacy in Internet Standards.
What this leaves for the future is the question, or rather, the challenge, of what practices we could use in technical standard-setting to more effectively enact privacy and security for the Internet and the Web.
Throughout this research project and throughout my personal and professional efforts to support privacy on the Web, I have seen how potential improvements can involve three distinct but connected areas: people, processes and tools.
These might conceivably apply to questions of values in the design of technology generally, but I have observed them explicitly in the collaborative, rough consensus standard-setting process in particular.
Leaders have played a substantial role in the support of security and privacy in Internet and Web protocols, perhaps especially because the standard-setting process doesn’t rely on a single firm’s hierarchical model but instead pushes for interoperability and collaboration between disparate and competing organizations. Leaders have provided both a backstop and a motivation for security and privacy to be considered more directly in the design of the Internet and the Web.
Given the inherently ethically-laden nature of engineering, the shift towards integrating values in design, and the potential for techno-policy standards that explicitly involve values such as privacy, particular combinations of expertise are increasingly useful. Those with both technical and legal backgrounds may be able to recognize and evaluate possible socio-technical configurations. While it may not be a rapid intervention, education can help meet this need. Schools of Information pursue an interdisciplinary approach, typically combining computer science topics with social science, law and policy, and user-centered design.1 Technology & Delegation – a seminar class, a lab class and a set of curricular resources2 – has been an explicit project to encourage students with varying backgrounds to confront direct intersections of technology and policy.
The need for technologists engaged in public interest work, helping civil society and philanthropy, has been described as a “pivotal moment” (Freedman et al. 2016). Scholars, foundations and practitioners have sought to develop a new field of public interest technology (Eaves et al. 2020), not unlike our earlier notion of the “citizen technologist” (Doty and Panger 2015). A network of universities is committed to “growing a new generation of civic-minded technologists” – an urgent and important goal.3 Clinics provide students with experiential learning, while fellowships directly integrate technologists into traditional policymaking spaces.4
While motivated individuals and community leaders have made a significant difference, organizational processes can bring broader and more systematic consideration of privacy, security and other values to Internet standard-setting. Rotating assignments in the Security Directorate at the IETF are credited with improving the consistency of security reviews of Internet protocols, and similar attempts have been made to trigger wide review – including privacy reviews and architectural design reviews – at the W3C.5 Procedural requirements can also be a hook for interested individuals to provide feedback on features that affect important values like privacy.
Clear and systematic process also provides an opportunity for more confidence in how consensus technical standard-setting can apply to policy-related topics. Reducing uncertainty could prevent confusion and even encourage cooperation. Along the same lines, we might ask for clearer roles from policymakers participating in consensus techno-policy standardization – how invested they are and what they aim to contribute, whether that’s requirements, democratic legitimacy, incentives to participate or the power to enforce standards.
Finally, process implies, or perhaps even requires, continual application. Systematization, clarity, established roles – these would all benefit from repeated, ongoing processes that address the next tech policy challenge and continually review and revise existing systems. Periodic events, or development lifecycles that follow a linear waterfall model, don’t provide the same opportunities for building relationships and effective institutions.
Technologists must not forget the tools that influence tool-building. These tools range from simple automated checks to comprehensive high-level design principles. For security and privacy considerations in Internet standards, automated prompts can ensure that specification authors are at least aware of the need to directly address those values in new protocols. But simple, blunt requirements alone will prove insufficient (Doty 2015). Detailed guidance may be more fruitful, perhaps especially for those interdisciplinary or values-minded individuals who want to directly address privacy or security details in their domain of interest. For my part, I have tried to contribute guidance on mitigating browser fingerprinting (Doty 2019), a detailed privacy topic that accumulates across different features and could benefit from a coordinated and comprehensive response.
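As an illustrative sketch only (not the behavior of any actual IETF or W3C tool, though checkers in the spirit of idnits exist), such an automated prompt might do nothing more than warn when a draft lacks the sections that require authors to address these values:

```python
import re

# Hypothetical sketch of an automated check: warn specification authors
# when a draft lacks the sections that prompt explicit attention to
# security and privacy. Section names here follow common IETF practice.
REQUIRED_SECTIONS = ["Security Considerations", "Privacy Considerations"]

def missing_considerations(draft_text: str) -> list:
    """Return the required consideration sections absent from a draft."""
    return [
        section
        for section in REQUIRED_SECTIONS
        if not re.search(re.escape(section), draft_text, re.IGNORECASE)
    ]

draft = """
1. Introduction
2. Protocol Overview
3. Security Considerations
   This protocol assumes a trusted channel...
"""

print(missing_considerations(draft))  # → ['Privacy Considerations']
```

A check this blunt cannot judge whether the considerations are adequate – only that the author was prompted to write them, which is exactly the limitation of simple requirements noted above.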
Tools may be most effective, though, when they work in concert with people and processes. Questionnaires, for example, allow experts who are close to a particular domain but not necessarily trained in privacy or policy to identify potential areas that may need further review6 and to collect details that a privacy expert unfamiliar with the domain can use in evaluating implications. The W3C now sees widespread use of a self-review questionnaire for both security and privacy (“Self-Review Questionnaire: Security and Privacy” 2020), and a similar questionnaire is included in the IETF’s privacy considerations guidance (Hansen et al. 2013).
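This issue-spotting role can be sketched as follows; the question keys and wording are hypothetical inventions that loosely paraphrase the spirit of such questionnaires, not the actual W3C or IETF text:

```python
# Hypothetical sketch: a self-review questionnaire as structured data,
# where affirmative answers from a domain expert flag a feature for
# further review by a privacy expert. Keys and wording are illustrative.
QUESTIONNAIRE = [
    ("exposes_new_data", "Does this feature expose new information to websites?"),
    ("persistent_state", "Does this feature add state that persists across sessions?"),
    ("cross_origin", "Is any information shared across origins?"),
]

def needs_expert_review(answers: dict) -> list:
    """Return the questions whose affirmative answers warrant privacy review."""
    return [question for key, question in QUESTIONNAIRE if answers.get(key)]

# A domain expert's self-review answers for a hypothetical feature:
answers = {"exposes_new_data": True, "persistent_state": False, "cross_origin": False}
print(needs_expert_review(answers))
# → ['Does this feature expose new information to websites?']
```

The value of the real questionnaires lies less in any mechanical flagging than in the structured details they collect, which give a reviewer outside the domain enough context to evaluate implications.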
In the longer term, though, support for privacy, security and other values could be more efficiently maintained if those values were designed in from the beginning, rather than spotted as potential problems along the way. Higher-level design principles could be tools for these more fundamental changes, but privacy-by-design can be difficult to put into practice, even for engineers who already share an ethical commitment to it. Design patterns are documentation tools that codify and communicate abstract solutions to common engineering problems. Privacy design patterns, then, may:7
As documentation, privacy design patterns can also communicate detailed engineering practice to lawyers and policymakers. Anti-patterns can classify the misapplication of a technique, warn of its unintended consequences (Doty and Gupta 2013), or document the common problems that lead to a lack of privacy in Web standards (Snyder 2019).
I have argued that privacy and security are values of distinctive salience to the Internet and the Web. But those concepts are complex, contested and likely to take on new senses over time. Even in the course of writing this dissertation, the distinctive, topical senses of privacy have changed. Fairness emerged as a privacy-relevant topic, with the idea that privacy might protect against unfair, society-wide inferences about oneself or one’s community. Or perhaps privacy is freedom from the harassment and abuse that trolling and dog-piling on social media have made so easy. More recently still, toxic disinformation uses those same social network channels to target not just an individual, but an entire society’s sense of what is real or reliable.
Seen through the handoffs model, there are likely to be many more shifts in how values are maintained (or not) in different socio-technical configurations and how responsibility is distributed. Some paradigmatic shifts around security may be linear trends away from discretion or false reliance on assumptions of goodwill or end user expertise – like the ongoing march toward encrypting the Web. But there will also be the possibility of handoffs to more distributed approaches that involve communication among people, technology and regulatory systems.
Technical standard-setting – or specifically what I have called techno-policy standard-setting – provides an opportunity for multistakeholderism’s promise of democratic and technocratic advantages, in the line of new governance as well as the bridging property of boundary organizations. Standard-setting’s practical focus on interoperability suits it for handoffs to cooperative configurations developed by diverse parties – if those various organizations have incentives to pursue it and that heterogeneous group of individuals can work together. The handoff model encourages holism and asks us to look at the broader socio-technical system and the network of actors involved. Any multistakeholder process takes place embedded in the context of ongoing technical, social, organizational and policy changes that influence it.
Whether these concerns are all considered senses of privacy or not, we face tech policy issues that are urgent, complex and have large impacts on public policy, including criminal justice, equal access to digital public fora, democracy and public health. We need comprehensive responses that integrate technical expertise, policy details and ethical understanding. To respond effectively and promptly, we must use what we have learned from our attempts to enact privacy on the Internet.
What an iSchool is remains an open question welcoming constant refinement and definition, but see for example the iSchools organization: https://ischools.org/About.↩︎
Most recently taught in Fall 2019, with a wiki of Techdel resources.↩︎
See, for example, TechCongress: https://www.techcongress.io/↩︎
This theme was highlighted in Doty (2015) and I believe systematization has slowly increased since.↩︎
This is sometimes called “issue spotting,” inspired by the term from legal practice.↩︎
These project goals are taken directly from the collaborative privacypatterns.org project: https://privacypatterns.org/about/.↩︎