Have you ever hailed a ride from an unrated Uber driver? Dined at a zero-star restaurant? Made a pricey online order from the lowest-rated Amazon vendor?
Likely not. That's because rating systems have overhauled the way we travel, eat and shop. Born from the early days of e-commerce on sites like eBay, ratings help weed out scammers and lend some semblance of order to a fast-changing online marketplace.
But there's a darker side to this reliance on ratings and rankings, says Marion Fourcade, a UC Berkeley sociology professor and director of Social Science Matrix. Supercharged with AI technology, they're increasingly changing how we value each other and ourselves.
In her new book, The Ordinal Society, Fourcade and co-author Kieran Healy, a sociology professor at Duke University, explore the origins of the technology that gave rise to the systems that sort us today. They trace the rapid evolution of online search from the early days of Yahoo to the digital capitalism made mainstream through Google. And, they warn, the process companies use to gather information — from data on our online browsing to our biological measurements — can shape markets, societies and social life.
"In domain after domain," they wrote, "it is changing the overall distribution of opportunity, the everyday experience of status, and the nature of economic competition. In its wake, our moral intuitions about merit and personal worth are changing, too."
Berkeley News spoke with Fourcade about why she wrote this book now, why sorting can make society more accepting of inequality, and about her stark warning that we all could soon be defined by a single number — if we aren't already.
Berkeley News: In the opening pages, you write that 'our moral intuitions about merit and personal worth are changing.' That sounds ominous. What do you mean by that?
Marion Fourcade: What we call "society" is a process of sorting people into positions and statuses. Today, this process is increasingly driven by the automated scoring of personal data. Think about your credit score, your fitness score or your Uber rating as a passenger or a driver. Somehow, you cannot help but keep an eye on those. Increasingly, you're thinking that these scores are saying something important about you. You care about them being "high" because it makes you feel good about yourself. You think it says that you're a good credit risk, that you're healthy or that you're a nice person. You get a pang of satisfaction when your app sends you a congratulations message, when you realize you're rated better than the person next to you or when your credit score rises. And when you get rewarded on that basis, you feel like you have earned that reward.
Now, this is not exactly new. We've been scored for a long time. It begins in school. From early on, kids learn to compete against each other through grades. In fact, the term "meritocracy" refers to a society whose social hierarchies are legitimated by this kind of educational grading. If you did well in school, you should be rewarded with better job opportunities and a good social position overall. That's the meritocratic ideal, and even though it is riddled with all kinds of problems, it is tenacious. Today, it seems to have proliferated well beyond education.
Credit is one of those areas where a new form of meritocracy has arisen. Your credit score, calculated on the basis of your credit behavior, becomes a general index of your trustworthiness. Banks use it to determine the terms of loans, but it creeps into a lot of other domains. It becomes incorporated into insurance scoring, perhaps tenant scoring. Some dating websites may find it a useful piece of information to match you with prospective dates.
And I'm assuming that's how we arrive at this 'ordinal society,' right? Why does this matter so much that you'd name your book after it?
We wanted to speak to the universality of this process. There are very few social institutions today that are not being reformatted by it.
There are two things that are important about the word "ordinal." One, the French word for computer is "ordinateur," so ordinal captures the fact that not just any machine, but specifically computers, are running this reprogramming of social life. But ordinal is also a reference to the ordering effects of digital scores and rankings. In that respect, it speaks to the social sciences' interest in social stratification and inequality.
In other words, the idea of an ordinal society is not simply about the computerization or quantification of social life. It is about the allocation of goods, services and rights. It is about the distribution of opportunities and recognition. These processes of ordinalization (or computer-generated rank-orderings) are increasingly common across domains of social life.
On an individual level, we track everything from sleep to steps. But you're using this data-saturated individual experience to speak to bigger systems, right?
Every individual is being scored on every possible dimension now. Every organization, too. You cannot pick a new restaurant for dinner without looking up how it is scored by one or more recommender systems. It's really embedded in our lives. And there are plenty of ways in which this individual level matters. People try to game it. They buy fake reviews, they trade ratings with their drivers, they fake their level of activity, you name it. As the saying goes, "When a measure becomes a target, it ceases to be a good measure."
But the other deeper way to think about this is as an invisible and, yes, systemic infrastructure that is transforming the whole social process — the process by which people are sorted into social positions.
Tracking is important not simply because of its effects on individual psyches and behaviors, but because we live in a society where digital surveillance is now the normal state of affairs. Scoring matters not simply because we are self-conscious about the eye of the algorithm on us, but because of its social effects. It reorganizes social trajectories; as many have shown, it partly encodes old inequalities. But it also creates new pathways to opportunity and failure. In our view, that's sort of the more powerful way to think about it.
To put it another way: On the individual level, you may think that a specific score is saying something about you as a person, as we just discussed. But what it is really saying is something about what the organization that originated the score wants to get out of you.
This poses the much bigger question of the distribution of power, social control and profit.
And on a broader level, this poses the question of what a society that is regulated by these methods looks like. Once this fundamentally individualistic instrument of the score regulates many different aspects of our lives, how do we foster solidarity? How do we mobilize? How do we build a good society?
This is also a book about the rapid rise and evolution of the internet that has made this ordinal society possible. As a social scientist, how do you think about this change?
The emergence of the internet was sustained primarily by two institutions: on the one hand, government funding; on the other, the much more decentralized and hobbyist hacker community, later joined by venture capital. And after that came a pretty steady movement of concentration around a few large players. You can see this pattern in wave after wave of innovation — from the commercial web to social media to generative AI today.
Very large platforms have arisen at every stage — as intermediaries in the market for goods or the market for attention. These platforms today are enormously powerful, and they have tended to use their powers in ways that make them even more powerful. And as others have shown, that ends up undermining not only personal freedoms (via privacy invasions and lack of options), but also the quality of the service that these platforms provide.
Just look at recommendations on Amazon. They used to surface things you were likely to be interested in. Now, it's pretty much all sponsored results.
What about the argument that technology has been a net good for social change?
Everywhere, there is the claim that these systems will empower individuals and will be fairer than what was there before. And there's some truth to that. Who today would want to go back to the days before the internet, when, if you wanted to access knowledge, you had to go through a card catalog at the library? It has massively expanded opportunities for connection, opportunities for gathering knowledge, opportunities for self-expression, getting a job, etc.
If we think specifically of behavioral scoring systems, like credit scores, they also have the potential to be more inclusive. They are designed to be blind to the kinds of social differences that mattered a lot in earlier eras. In that sense, a system that relies on credit scores is fairer than one that relies on, say, redlining, which was openly discriminatory. People are being scored on the basis, essentially, of their past behavior rather than some demographic characteristic (neighborhood as a proxy for race) that would exclude them a priori.
In fact, behavioral scoring has blossomed, in part, because governments have outlawed these forms of discrimination. But — and that’s an important but — you'd be foolish to believe that by using these new methods you have eliminated all the social forces that gave us redlining in the first place. Determining who is a good and who is a bad risk is not a neutral proposition, because present society is not neutral, and history even less so.
Where does our ordinal society go from here?
There are two possibilities. One, the most likely, is that we'll live in a society regulated by specific scores in different areas; these scores will be tailored to whichever good or service you're trying to access.
Another possibility is that we're going to see the emergence of something that looks like a superscore that will be widely used everywhere. China developed this idea with its social credit score, although in practice it is still a hodgepodge of different systems, which vary across geographical locations and do not yet have much bite in everyday life. But the ambition is that this will someday be some sort of superscore that will close or open doors across domains.
In the U.S. right now, the one score that is the best candidate for this role and that is already used in this very flexible way is the credit score. The reason is that its predictive power is high, even in areas that are only tangentially related to credit. So we have some sort of metascore that already exists.
The second possibility, then, is the further elaboration of the credit score with more and more data (from other payment sources, perhaps some network data, some social media, etc.) being channeled toward it, with flexible variations around a core technology. We're already seeing this for people who have a so-called thin credit file.
Do you find this all to be exhausting or a bit overwhelming?
The last sentence of the book is: "Life in the ordinal society may well be unbearable." What we mean by that is that living under the eye of machines that constantly adjust to what you do and whose logic you don’t understand can be incredibly exacting. This is especially true for the most vulnerable populations, because the decisions matter a lot more for their life chances.
People are pushing back against the implementation of some of these tools in various domains, particularly at work, but they often don’t have much leverage. So that's our first worry. Our second worry is that the individualizing tendencies of these technologies are making it harder to build or imagine solidaristic institutions. Even the safeguards that exist in some contexts (for instance, in the EU now) are not reversing the trend of turning social relations into algorithms.
Since you mentioned the book's last line, I highlighted one above it: 'Public goods and collective goals are being dissolved in the acid bath of individualization and competition, leaving us increasingly alone in a hyperconnected world whose social ordering is precisely metered and, in its factitious way, inarguably "right."' That's provocative!
We were trying to get readers to think about what kind of people this ordinalization of life is turning us into. The book does not offer solutions. It's not like we don't have any ideas — we do. But it didn't feel like this was where we wanted to put them. We just wanted to encourage people to grapple with some of the complexities that we talk about.
One of them is the fact that these systems, for all their faults, are wildly attractive, and not simply to the organizations that deploy them. They are also attractive to those who are subject to them. Nobody wants to be under the eye of the ranking system, unless they do very well. But people like it when others are scored. So that's very pernicious! What kind of morality is this producing? How are we going to look at each other? What kind of institutions are we building if we put these systems at the core of our social sorting process?
We also wanted to think about what it means to be a citizen in a society that is regulated in this way. One important implication is that this may offer a good argument to think hard about universal programs again. If you have a universal good, you don't need to sort. And if you don't need to sort, you don't need to pass a moral judgment on people. You don't have to determine who deserves it or not because everybody gets access.
We need to rethink our institutions in ways that make the necessity to sort less overwhelming, so that the ordinal society and its unbearableness lose much of their edge and much of their power.