
The Humanities Can't Save Big Tech From Itself

The problem with tech, many declare, is its quantitative inclination, its “hard” math deployed in the softer human world. Tech is Mark Zuckerberg: all turning pretty girls into numbers and raving about the social wonders of the metaverse while so awkward in every human interaction that he is instantly memed. The human world contains Zuck, but it is also everything he fails at so spectacularly. That failure, the lack of social and ethical chops, is one many believe he shares with the industry with which he is so associated.

And so, because Big Tech is failing at understanding humans, we often hear that its workforce simply needs to employ more people who do understand. Headlines like “Liberal arts majors are the future of the tech industry” and “Why computer science needs the humanities” have been a recurring feature of tech and business articles over the past few years. It has been suggested that social workers and librarians might help the tech industry curb, respectively, social media’s harm to Black youth and its proliferation of disinformation. Many anthropologists, sociologists, and philosophers—especially those with advanced degrees who are feeling the financial squeeze of academia’s favoring of STEM—are rushing to demonstrate their utility to tech behemoths whose starting salaries would make the average humanities professor blush.

I’ve been studying nontechnical workers in the tech and media industries for the past several years. Arguments to “bring in” sociocultural experts elide the truth that these roles and workers already exist in the tech industry and, in varied ways, always have. For example, many current UX researchers have advanced degrees in sociology, anthropology, and library and information sciences. And teachers and EDI (Equity, Diversity, and Inclusion) experts often occupy roles in tech HR departments.

Recently, however, the tech industry has begun exploring where nontechnical expertise might counter some of the social problems associated with its products. Increasingly, tech companies look to law and philosophy professors to help them through the legal and moral intricacies of platform governance, to activists and critical scholars to help protect marginalized users, and to other specialists to assist with platform challenges like algorithmic oppression, disinformation, community management, user wellness, and digital activism and revolutions. These data-driven industries are trying hard to augment their technical know-how and troves of data with social, cultural, and ethical expertise, or what I often refer to as “soft” data.

But you can add all of the soft data workers you want and little will change unless the industry values that kind of data and expertise. In fact, many academics, policy wonks, and other sociocultural experts in the AI and tech ethics space are noticing a disturbing trend of tech companies seeking their expertise and then disregarding it in favor of more technical work and workers.

Such experiences make especially clear how fraught this moment is in the burgeoning field of AI ethics, in which the tech industry may claim to incorporate nontechnical roles while actually adding ethical and sociocultural framings to job titles that are ultimately meant to be held by the “same old” technologists. More importantly, in our affection for these often underappreciated “soft” professions, we must not ignore their limitations when it comes to achieving the lofty goals set out for them.

While it is important to champion the critical work performed by these underappreciated and under-resourced professions, there is no reason to believe their members are inherently more equipped to be the arbiters of what’s ethical. These individuals have very real and important social and cultural expertise, but their fields are all reckoning with their own structural dilemmas and areas of weakness.

Take anthropology, a discipline that emerged as part and parcel of the Western colonial project. Though cultural anthropology now often espouses social justice aims, there are no guarantees that an anthropologist (85% of whom are white in the US) would orient or deploy algorithms in a less biased way than, say, a computer scientist. Perhaps the most infamous example is PredPol, the multimillion-dollar predictive policing company that Ruha Benjamin called part of “the New Jim Code.” PredPol was created by Jeff Brantingham, an anthropology professor at UCLA.

Other academic communities championed by those pushing for soft data are similarly conflicted. Sociology’s early surveillance and quantification of Black populations played a role in today’s surveillance technologies that overwhelmingly monitor Black communities. My own research area, critical internet studies, skews very white and has failed to center concerns around race and racism. Indeed, I am often one of only a handful of Black and brown researchers in attendance at our field’s conferences. There have been times I was surrounded by more diversity in tech industry spaces than in the academic spaces from which the primary critiques of Big Tech derive.

Social workers would likely add some much-needed diversity to tech. Social work is overwhelmingly performed by women and is a pretty diverse profession: over 22% Black and 14% Hispanic/Latinx. However, social workers are also implicated in state violence toward marginalized communities. For example, a social worker coauthored a controversial paper with Brantingham extending his predictive policing work to automated gang classification.

For another example of efforts to integrate social work and automated prediction, look to Columbia University’s SAFELab. One of the first major projects out of the lab focused on gang violence, using social workers, data scientists, and local community members to collect and annotate social media posts by youth in Chicago. The lab developed a tool to identify “aggressive” posts and predict which might lead to offline violence. Although certainly well-intentioned—the researchers stress they want social worker and not police intervention—the project inevitably concentrates this surveillance on Black and brown communities. And while social workers often have good intentions, Dorothy Roberts points out that the child welfare system is a racist institution that disproportionately breaks up Black families. Ultimately, as Chris Gilliard recently wrote: “Surveillance technology always ‘finds its level.’ Its gaze is always going to wind up focused on Black folks—even if that was not the ‘intent’ of the inventor. […] Surveillance systems, no matter their origin, will always exist to serve power.” This truism is borne out by how quickly SAFELab’s work has been embraced by police, and by the grant of over half a million dollars the researchers received from DARPA to expand their automated gang “aggression indicator” into a similar tool for investigating ISIS recruitment on social media. Sarah T. Hamid, policing tech campaign lead at the Carceral Tech Resistance Network, cited this grant as an example of how police technology is tested domestically in racialized communities “in order to design war tools for similar communities overseas.”

Finally, though the librarian profession is often cited as one that might save Big Tech from its disinformation dilemmas, some in LIS (Library and Information Science) argue the profession collectively has a long way to go before it’s up to the task. Safiya Noble has noted that the profession (just over 83% white) harbors a “colorblind” ideology and a sometimes troubling commitment to neutrality. This commitment, the book Knowledge Justice explains, leads many librarians to believe, “Since we serve everyone, we must allow materials, ideas, and values from everyone.” In other words, librarians often defend allowing racist, transphobic, and other harmful information to stand alongside other materials by saying they must entertain “all sides” and allow people to find their way to the “best” information. This is the same error platforms often make in allowing disinformation and abhorrent content to flourish online.

The struggles of these professions and disciplines aren’t unique—every American institution is confronting similar realities. Detailing these fields’ limitations does not mean they aren’t in many ways miles ahead of the tech industry’s ahistorical, “colorblind” techno-utopianism that often discounts and devalues social and cultural experiences and impacts. And I would never argue, as universities engage in an often-losing fight to fund nontechnical disciplines, that those same disciplines aren’t crucial to STEM majors and their future workplaces. Of course they are. However, we must be honest about what can realistically be accomplished by these piecemeal attempts at stitching sociocultural expertise into technical teams and organizations that ultimately hold all of the power.

Putting aside the structural shortcomings of these fields, saddling these professionals with tech’s ongoing failings (especially without robust resources and real power) is not only doomed from the start but potentially quite harmful. For example, the diversity these roles might add would likely not be enough to cause tech companies to treat marginalized people any better, especially those brought in to point out problems. To ask women—especially Black women—to “fix” tech is a problem in itself, and it uncomfortably echoes the larger American political scene of the past few years, which asks Black women to save democracy while failing to support them. And tech has a long record of ignoring Black women and other marginalized communities. Plus, the kind of work soft data workers are meant to do will ultimately be the most difficult in tech. (Indeed, the task of achieving true artificial intelligence pales in comparison to making Facebook resemble a safe democratic online space.) The messy work of considering people, society, and culture must ultimately be a responsibility for the industry as a whole, not an afterthought that those most hurt by the industry must fight an uphill battle to impose.

While soft data may currently be in vogue, this focus on sociocultural expertise may ultimately represent a passing trend for the industry. And even as we argue its value to tech, we cannot allow the funding and success of the humanities to hinge on sustained interest from technology industries and STEM disciplines. If, as Meredith Whittaker has detailed so well, the tech industry and military have done an impressive job of capturing technical academic AI research, there is no reason to think the constantly under-fire humanities and social sciences are any more likely to withstand their overwhelming power. We must arm these fields with the funding and other resources necessary to withstand Big Tech’s whims and influence. And that means respecting the humanities’ existence and expertise independent of STEM, AI, and other technological pursuits. Anything less almost guarantees their efforts will be folded into, and even used to optimize or disguise, the harmful impacts of Big Tech’s products.

Unquestionably, the tech industry needs to think of creative and meaningful ways to marry technical and sociocultural expertise throughout the development and deployment of technologies. The first attempts at doing this will be crucial. One requirement for these efforts to be truly effective will be familiar to many technologists: they will need to scale. These new soft data roles cannot be few and far between in the industry. Tech will need enough of these workers to ensure they aren’t isolated voices but respected teams or team members. This will also allow for much-needed intracommunity disagreement and discussion. No field or community is a monolith, and to ensure any of these experts have a true impact in tech, there need to be enough of them, in stable enough positions, to engage with and provide oversight for one another.

This will assist with another core requirement: Companies will need to know the right roles for the right problems. To a tech industry long comfortable siloing itself off from other industries and ways of thinking, a social worker may seem interchangeable with a sociologist, or even a librarian. However, each of these professions has its own area of expertise, and it will take time, and probably a lot of trial and error, to figure out where different fields and professions best integrate with more technical workers. Without companies attending closely to needs like these, soft data workers will be put in positions in which they have no community or power and will function simply to provide an ethical sheen to policies and practices that tech executives had planned all along.

So back to Mark Zuckerberg, who works as a stand-in for the socially disinclined tech industry. Picture assigning billionaire Zuck a sociologist buddy, an ethics chaperone, a social worker. Imagine they were all women of color, probably with multiple degrees and more letters behind their names than it takes to spell metaverse. Do we seriously think doing so would orient his decisions toward equality, true diversity, democracy, or even fairness—all at the cost of profits? Or would we be sending some real human-loving people into the lion’s den? Now is the time for us to consider what a truly balanced tech industry might look like and to clear the way so people with all forms of knowledge have a real say in that future.

