
Digital Privacy Is a Class Issue


When you see an ad for online gambling, it is never a matter of chance. Take, for example, the story of Sportsbet, an Australian company owned by the global gambling-industry behemoth Paddy Power. A recent investigation found that the company spent hundreds of millions of dollars on highly integrated promotional activities. Traditional advertising and celebrity ambassador programs were complemented by a web presence on major platforms, informed by data collected from a wide range of sources. Banks, for instance, are among the providers of data to the analytics firms that gambling companies rely on to ensure their advertising dollars are well spent. The data may be de-identified, but in the grand scheme of things, this is irrelevant. As we move around the web, leaving a trail of data, it gets swept up and sold in all sorts of covert and unexpected ways.

And it’s not just gambling. The digital age has allowed a host of businesses—from online lending to for-profit education—to access detailed information about potential customers, the platforms that can target them, and the money to put it all together.

Often when we think about the right to privacy, we think of the right to be let alone, unmolested by the state. But the digital age has given rise to industrialized data mining, content curation, and automated decision-making, all of which undermine democracy and intensify social divisions. These developments call for a more sophisticated understanding of privacy—one that can appreciate both the collective and individual nature of this right.

Our digital spaces are increasingly organized around our capacity and propensity to spend. The apparatus of observational intelligence that dominates the web was established by private companies and assesses our worth as individuals through the lens of consumerism, as Shoshana Zuboff describes in her book The Age of Surveillance Capitalism. All kinds of companies pick through our online behavior for clues about how we might be convinced to spend money. These practices particularly affect poor people, who are more dependent on cheap or free online services. The services appear to cost nothing, but payment is in data rather than dollars. Such a transaction renders the user into a source to be mined for information and—in a way that is not obvious to the web consumer—transforms the experience of online life into one saturated by the logic of the market. As Michael Fertik, the founder of reputation.com, bluntly put it: “The rich see a different Internet than the poor.”

The for-profit education sector is an illustrative example. Nearly every for-profit college in the U.S. earns most of its revenue from multi-billion-dollar federal financial aid programs. These colleges specifically target poor and vulnerable people, luring them with the promise of social mobility. “A potential student’s first click on a for-profit college website,” wrote Cathy O’Neil in her book Weapons of Math Destruction, “comes only after a vast industrial process has laid the groundwork.” This includes finding the perfect moment at which people are most open to taking drastic steps to improve their situation, as revealed through Google searches or college questionnaires. Companies also buy information from websites that post fake job ads, or ads promising to help people obtain Medicaid or food stamps, and then ask those responding to the ads if they are interested in a college education. The poor are basically stalked into enrolling, and colleges routinely spend more on marketing than they do on instruction. “The for-profit colleges do not bother targeting rich students,” O’Neil observed. “They and their parents know too much.”

It’s not that poor people are stupid or gormless; it’s that an entire economy has evolved around exploiting their online lives, like a parasitic vine choking the tropical ecosystem of human experience. Numerous predatory industries, such as gambling and payday lending, shell out big dollars to place targeted advertisements before segmented audiences. They play on some of the most powerful forces in our psychology—shame, desire, guilt—for the purpose of making money. The base drivers of capitalism strip away the complexities of our personalities and pulp them in the name of consumerism. They redefine our personal history, rendering it into data to be consumed by others, framed around our inclinations to buy. It represents a coup against our consciousness: a takeover of the parts of us that we instinctively believe ought to be under our control.

There is no suggestion that this market for eyeballs is based on choice or meaningful, informed consent to the sharing of information. Most people have little sense that data disclosed for one purpose might be sold to third parties for countless other purposes. And even if they did, there is no way to truly and permanently opt out. Once enough information is known about the basic behaviors of certain segmented audiences, it is possible to draw conclusions about those who fit that demographic, even if they have never shared a thing. Our digital experiences are crafted for us on the basis of our membership in a class, an intersecting set of collective traits, rather than our standing as individuals with agency or dignity. To suggest that a person can challenge these practices alone, by taking individual responsibility for their privacy, fundamentally misunderstands our social and political environment.

These approaches to data management also influence the delivery of public services. In her recent book Automating Inequality, Virginia Eubanks argued that current models of data collection and algorithmic decision-making create what she calls a “digital poorhouse,” which serves to control collective resources, police our social behavior, and criminalize non-compliance. Eubanks, who is co-founder of the grassroots anti-poverty organization Our Knowledge, Our Power, showed that government services like child-welfare monitoring, the provision of food stamps, and Medicaid are subject to biases that arise from substandard approaches to data collection, or just plain bad analysis. The result: a waste of public resources—and significant human misery.

This misuse is allowed to happen because the traditional understanding of privacy is so poorly suited to the current digital moment. There is no sense that people should have a say in what government is permitted to know about them, or in what kinds of data can be used to make decisions about public programs.

For these reasons, the right to privacy is a paradoxical combination of the individual and collective. It is the right to be taken seriously as a unique person with agency, not a set of characteristics that can be used to make assumptions about future behavior. But it is also the right to be able to collaborate and communicate in shared spaces without being judged—to be part of a group on our own terms.

If we are to reclaim our agency over our online lives, and reclaim the potential of the digital age, we need to break the business model of surveillance capitalism. We need laws and policies that enshrine respect for the intrinsic value of personal data, rather than allow it to be treated as a resource that can be exploited for profit. Similar to the way rules now govern how doctors or lawyers must treat information given to them in a professional capacity, there should be stronger standards for companies that hold personal data gleaned from web use. Such outfits should not be permitted to profit from selling this information if it was not given with informed consent. They should be required to keep transparent and accessible records on what they have, how they got it, and how they use it. And companies should only collect and hold information that is necessary for the delivery of their product or service.

The right to privacy is an important bulwark against state overreach, and is rightly relied on by whistle-blowers and journalists. But to be meaningful, it also has to find application among everyday people who are tired of being manipulated and categorized in pursuit of profit. As it now stands, the internet experience undermines universal concepts of social welfare, fairness, and public participation for far too many. A positive vision of privacy treats it as a communal right—one that the poor, rather than the rich, have the greatest stake in exercising.

