In 2015, the data visualisation firm The Office for Creative Research (OCR) collaborated with the theatre troupe Elevator Repair Service (ERS) to craft a live performance called A Sort of Joy, using the metadata from MoMA’s collection database — all 123,951 records of it — as the source material for its script. The performance seemed innocent at first. It started with a group of white men standing in the centre of the room, facing out towards the audience. They wore headphones, and each held out an iPad so the audience could see its screen. A group of white women walked in a circle among them, also holding iPads. The men then began to recite the first names that appeared on their screens — John (Baldessari, Cage, Lennon, Waters, etc.), then a few long seconds later, Robert, then after much longer, David, and so on. The women kept pacing around them, their eyes on the men’s screens, without saying a word for a full three minutes.
Then a ‘Mary’ appeared on the screen, and the women said the name, almost as an exclamation. Then it started to make sense to the audience — who previously were probably wondering: why are the men in the inner circle? Why are the women encircling them as if they were prey? Whose names are being recited? — that the names belonged to some organisation, community, or institution where representation is skewed towards men. They then started to anticipate more female names, until ‘Joan’ came out a minute later, then ‘Barbara’, and then the male names appeared more and more. We never heard a ‘Khalil’, a ‘Salama’, a ‘Dinesh’, an ‘Eun-Jung’, or a ‘Kwame’ — nothing that was not a Western name — because the names being recited were the first names of the artists with the most works in the collection. This is why ‘John’ came first, followed by more white male names, before we ever heard a white female name.
The performance — a result of the many ways my brain links pieces of information together — was what first came to mind when I watched Netflix’s The Social Dilemma last week. The docudrama, directed by Jeff Orlowski, paints a picture of the many ways social media and Big Tech have sown trouble at both the personal and societal levels. As someone who had the opportunity to be educated in the STS (Science and Technology Studies) field during my brief academic years — speaking not from a place of expertise, but from a place of sharing — I am not here to refute the messages laid out in the docudrama, nor to question the redeeming intentions of the tech personalities portrayed in it.
But I couldn’t help pointing out one glaring element the tech industry still lacks after all these years — the thing behind this ‘oversight’, this “has a mind of its own” (it doesn’t; your values are embedded in the innovations you build), this “none of us ever anticipated/intended this” (of course, because you don’t seem to want to listen to others) — and that is the problem of representation. The overwhelmingly male and white workforce (or the deliberate hiring of whatever population is already the majority in an industry), plus the persistent assurance that all of this “was never intended”, is how Robert Moses came up with overpass bridges so low that the buses carrying people of colour and the less affluent could not pass underneath to access the beautiful parks of Long Island. He never intended it — I mean, how could he have seen it? He never lived the lives of the people whose access to those parks he hindered. Oops! The lack of representation of minorities in tech is also how Black people were classified as ‘gorillas’ by AI facial recognition, how trans people get pulled aside at airports for security checks, how Black defendants were assigned higher recidivism scores than their white counterparts, how women get hired less and less, how people can literally get killed — and how we still have this shit today. Can you believe how tired we are?
(And can you believe that 1) despite undergoing — I presume — multiple peer reviews, none of the reviewers ever stood up and said, “hey, uhm, I think this is wrong”, and 2) the guy who looked the most trustworthy looked like Mark Z??? BYE)
These are just a small sample of the ‘oopsies’ that happen when your innovations do not consider the real lives of others, and fixing that has to start with representation.
Not only that: Black and PoC scholars have been doing the work and sounding the alarm for YEARS, yet in the docudrama the first non-white interviewee shows up only after 50 minutes. The producers knew they had a representation problem — probably a little too late — which is why they threw in the Black dad in the cringe-inducing dramatic re-enactment, his appearances minor and his lines few.
Now — I’m not saying we should not listen to the messages in The Social Dilemma; you can hold your own industry in high regard and still offer criticism in the name of wanting us to do better. The trouble with The Social Dilemma is that these problems — fake news, dark patterns, psychological and behavioural tinkering in the name of ‘growth hacking’, etc. — seem not to count as problems until a prodigal tech bro wakes up one day and decides to undo the harms he has been complicit in through slide decks and TED talks. I’m not saying it isn’t a noble cause, but remember that your ideas are built on the shoulders of giants — as some of us academics have been trained to do, always check whether someone has done the work before you, and always cite, cite, cite. And if it helps, ask yourself these four questions before embarking on a project you think is the first of its kind:
- Are you listening to experts and vulnerable communities?
- Can you join existing efforts?
- Can your technology do what you say it’s going to do?
- How does your technology shift power?
If you liked The Social Dilemma and found it compelling for your own reasons, I would suggest you follow and read the works of these scholars of the sociocomputing field: Ruha Benjamin, Safiya Noble, Joy Buolamwini, Sarah Roberts, Marie Hicks, Sasha Costanza-Chock, Zeynep Tufekci, Charlton McIlwain, Anita Say Chan, Wendy Chun, Lisa Nakamura, Virginia Eubanks, and Beth Coleman. And because this is by no means an exhaustive list: Mozilla has a thread on the works of more of these amazing scholars, plus some resources you can read if you want to know more about algorithmic bias.
Walking into the new week with this much freedom, vigour, and wholesomeness:
Reading in my tabs:
- How to run a small social network site for your friends.
- If there is one thing you have to choose to read from my newsletter today, read this.
- “Our report suggests that ethics owners engage in outreach to advocacy groups in order to bring the experiences of those outside of tech companies into these decision-making rooms, and encourages sharing lessons-learned across the industry. But these measures cannot replace true inclusion, particularly when it comes to Black, Indigenous, and people of colour (BIPOC) ethics owners. Ultimately, companies must change who is already in such rooms so that those sitting around tables “thinking hard” about the ethical implications of products and services include people who have experienced the world in diverse ways.” Examining race in tech company ethics.
- I joined a webinar about Hagia Sophia — a monument I find very majestic (and I am sure I am not alone in this) — from the perspective of cultural heritage, and one of the panellists spoke about this very interesting acoustic archaeology project.
- ““It’s not because they are women that they were nonviolent and innovated tactics better,” Chenoweth said. “It’s that their particular position, the gendered roles that they had in society, gave them access to knowledge about social power.”” Why social movements led by women tend to succeed more.
- The Internet can be vile, but trans people have found love and support within it — and that’s among the times online anonymity matters most.
- Speaking of online anonymity, Telepath is a new, kinder social network that requires you to sign up with your legal name (with exceptions for people who choose to go by other names, such as trans people). But I couldn’t stop thinking — is unkindness the problem with today’s social networks, or the refusal to think critically? I mean, Abeba, a Black woman scientist, was chided for her ‘lack of civility’ when she called out this very racist piece of scientific work.
- Reveal the specific user-tracking technologies on the site and who’s getting your data using this real-time website privacy inspector. Related: the high privacy cost of a ‘free’ website.
- MIT’s biology department is offering a new online class called COVID-19, SARS-CoV-2 and the Pandemic.
- “They worked / and they died / They died broke / They died owing.”
- Reading: Catherine D’Ignazio and Lauren F. Klein’s Data Feminism, Sarah Burton’s The Strange Adventures of H, and Pablo Neruda’s Los Versos del Capitan.
- Listening: A client recommended the music of Malian guitarist Samba Touré to me, and it’s all I’ve played the entire week.
- Watching: Ocean Vuong on the yin/yang of creativity — yin is akin to “fishing with a large net, casting your net wide and waiting for the bounty to fill” and yang is “the decisive moment where decisions are made where order happens — architectures built in service of a final goal — yang is when you harvest the fish and chop it up and send it to market and produce it and tend to communities or the rest of the larger world.” Also, Enola Holmes.
- Food & Drink: The enhanced lockdown was lifted yesterday (yay!) and despite still being wary of the rising cases nationwide, I managed to go to the local market and get some mackerel pieces and turned them into spicy grilled fish!