
DL Seminar | The Digital Complicity of Facebook's Growth Hackers & Chip-implanting Biohackers

Updated: Jan 8, 2019

By Natalie Friedman | MA Student | Cornell Tech


Illustration by Gary Zamchick

DL Seminar Speaker Joseph Reagle

On Thursday, October 11th at Cornell Tech, Dr. Joseph Reagle (pictured above), an expert in digital communications from Northeastern University, spoke to us about the ethics and complicity of Facebook employees. He began his talk by defining digital complicity as “the intent and embrace of problematic technology” and, using that definition, posed the question: “Is Facebook complicit in spreading fake news?”


To answer questions like this, Dr. Reagle created a complicit blameworthiness equation, using three distinct components to assess accountability for complicity. The variables in this equation included:


1. Badness factor and responsibility factor: voluntariness, knowledge of the contribution, knowledge of its wrongness.


2. Contribution factor: centrality, proximity, reversibility, temporality, planning role, responsiveness. This captures questions such as “How much was I involved in the crime?” or “Did I know I was contributing to that wrong?”


3. A shared purpose with the wrongdoers.

Dr. Reagle referred to elements of this equation throughout his lecture. He focused on engineers' intentional manipulation of user addiction, asking, “Did Facebook have knowledge of their contribution to this harm?” He answered with a quote from Sean Parker, an early investor in Facebook: “We understood it consciously and we did it anyway.” This would indicate a high badness factor, because there was clear knowledge of both the contribution and its wrongness.


He then asked, did Facebook know this wrongness would happen? He answered with a quote from someone in a similarly high-powered role at Facebook: “yes, [the consequences] was in the back of our minds... but not like this.” Dr. Reagle then posed the question: what does “not like this” really mean? The fact that the consequences were “in the back” of their minds indicates that Facebook engineers knew, and were ignoring the knowledge they had. When Dr. Reagle factored these elements into his complicit blameworthiness equation, he said Facebook should receive a 0.7.
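
Dr. Reagle's exact formula was not spelled out in the talk, but a minimal sketch of how such a score might be computed, assuming for illustration that each component is rated on a 0-to-1 scale and the components are simply averaged, could look like the following. The specific ratings are hypothetical, chosen only to echo the discussion above, and are not Reagle's own numbers.

def blameworthiness(badness, responsibility, contribution, shared_purpose):
    """Average four 0-1 component ratings into a single 0-1 blameworthiness score.
    This is an illustrative sketch, not Dr. Reagle's actual equation."""
    components = [badness, responsibility, contribution, shared_purpose]
    if not all(0.0 <= c <= 1.0 for c in components):
        raise ValueError("each component must be between 0 and 1")
    return sum(components) / len(components)

# Hypothetical ratings loosely following the Facebook discussion above:
# high badness and responsibility (clear knowledge of the contribution and
# its wrongness), a substantial contribution, and some shared purpose.
score = blameworthiness(badness=0.8, responsibility=0.8,
                        contribution=0.7, shared_purpose=0.5)
print(round(score, 1))  # prints 0.7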


So, what is the context in which people behave in this complicit way? Dr. Reagle explained that the social media marketplace is relatively new, and its very novelty makes it hard to know what role Facebook will end up playing. But again, it seems Facebook understood the consequences before things “got bad.” Additionally, in this marketplace, if one company will not offer an important feature, another company can succeed by offering it. In other words, it is a “winner takes all” situation.


In conclusion, many technology companies could be evaluated against high standards, just as Dr. Reagle has done here. If we take into account contribution, responsibility, “badness,” and purpose when evaluating technology companies, we can better understand when and where there is problematic or “blameworthy” complicity.

