Tag Archives: implicit bias

New on SSRN: “The New Public Accommodations”

A full draft of my article “The New Public Accommodations,” coauthored with Aaron Belzer, is now available on SSRN. Here’s the abstract:

The sharing economy raises important new questions about public accommodation laws. Such laws originally were enacted to prohibit establishments open to the public—for example, hotels, restaurants, taxi services, and retail businesses—from discriminating on the basis of characteristics such as race, color, religion, and national origin. Sharing economy businesses are functional substitutes for these traditional public accommodations. Yet existing public accommodation laws are not always a good fit for the unique features of the sharing economy.

This Article is the first to argue that public accommodation laws must evolve to address race discrimination in the sharing economy. Available evidence suggests that, in many circumstances, race discrimination affects the sharing economy in much the same way it affects the traditional economy. Sharing economy businesses use online platforms to connect providers of goods and services (drivers; landlords) with users of those goods and services (passengers; renters). These platforms often make race visible to both providers and users by requiring that they create profiles that include names, photographs, and other information. Such profiles may trigger conscious and unconscious bias and result in discrimination even if the parties never meet in person. Moreover, sharing economy businesses encourage or even require providers to rate users. Rating systems aggregate biases, and users who are members of disfavored racial categories may begin to receive worse service, or, eventually, to be denied service altogether.

This Article examines existing public accommodation laws—Title II of the Civil Rights Act of 1964, 42 U.S.C. § 1981, 42 U.S.C. § 1982, and the Fair Housing Act—and concludes that they hold considerable promise for remedying discrimination in the sharing economy. Nonetheless, the sharing economy presents new issues that existing laws do not entirely address. To the extent that sharing economy businesses perform the same function as traditional public accommodations yet escape existing laws, we argue that those laws should be amended and briefly describe the form the new laws should take.
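To make the rating-aggregation point concrete, here is a small, purely hypothetical simulation (it is not from the article; every number, group label, and cutoff below is an invented assumption). It shows how a modest penalty applied by only some raters, once averaged into a single score, can push members of a disfavored group below a service cutoff even though no single rater ever refuses service.

import random

# Purely hypothetical sketch of how individual rating biases aggregate.
# Every value below is invented for illustration.

random.seed(42)

TRUE_QUALITY = 4.6          # assume every user "deserves" the same rating
BIAS_PENALTY = 0.5          # assumed penalty applied by biased raters to group B
SHARE_BIASED_RATERS = 0.3   # assumed fraction of raters who apply that penalty
SERVICE_CUTOFF = 4.5        # assumed score below which service degrades

def average_rating(group, n_ratings=500):
    """Average of n_ratings noisy one-to-five-star ratings for a user in `group`."""
    total = 0.0
    for _ in range(n_ratings):
        rating = TRUE_QUALITY + random.gauss(0, 0.3)
        if group == "B" and random.random() < SHARE_BIASED_RATERS:
            rating -= BIAS_PENALTY
        total += min(5.0, max(1.0, rating))  # clamp to the one-to-five-star scale
    return total / n_ratings

for group in ("A", "B"):
    avg = average_rating(group)
    status = "full service" if avg >= SERVICE_CUTOFF else "worse or denied service"
    print(f"group {group}: average rating {avg:.2f} -> {status}")

The point is not the particular numbers but the mechanism: the bias lives in the aggregate score, so the resulting denial of service cannot be traced to any one rater.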

Feedback is very much welcome; please feel free to contact either me or Aaron.

In Progress: The New Public Accommodations

I’m still taking a break from blogging while I continue trying to resolve my hand issues. I have, however, posted to SSRN a short summary of my work in progress on race discrimination in the sharing economy. The piece is called “The New Public Accommodations.” Feedback is welcome, and if you would like to see a longer draft in progress (one I’m not quite ready to post publicly), please feel free to email me.

New Salon.com Piece: “The Sharing Economy Has A Race Problem”

I have a piece in Salon today about racial bias in the sharing economy. How can we prevent the race discrimination that affects businesses in the traditional economy from infecting the new sharing economy as well? The Salon piece gestures at a larger project I’m currently working on, which will probably take the form of a law review article, tentatively titled “The New Public Accommodations,” examining how we can prevent private-actor race discrimination within the sharing economy.

Labels, or, Why It Is Not Helpful to Call Someone a Racist

In both the media and public discourse, the question of whether a particular public figure “is a racist” is an extremely common one. Is Justin Bieber a racist? Is Donald Sterling a racist? Is George W. Bush a racist? Is Barack Obama a racist? Is Paula Deen a racist? Is Cliven Bundy a racist? Is George Zimmerman a racist? Is Miley Cyrus a racist?

For several reasons, I do not think this is a helpful question.

One reason is that, in at least one sense of the term, pretty much everyone is a racist. By this I mean that, inadvertently or otherwise, people often treat other people differently because of race. Consider a few examples of research from the past few months. One study found that law firm partners who were asked to evaluate a writing sample gave it significantly higher ratings when they thought the author was white than when they thought the author was black, even though the resume attached to the writing sample was identical in both instances. Another study of 6,500 professors at 259 colleges and universities found that professors respond more frequently and more positively to requests for mentorship from male students with white-sounding names than they do to identically phrased requests from female students and from students with non-white-identified names. Still another study found that twice as many drivers failed to yield at crosswalks for black pedestrians as for white pedestrians, and that black pedestrians waited on average one-third longer to cross the street.

Again, these are just a few examples from the past few months. I could list literally thousands more, but I won’t, because this is a blog post. And while some people might devote effort to poking holes in any individual study, at some point the most straightforward explanation (Occam’s razor, if you will) is simply that implicit racial bias exists and that it affects most, and perhaps all, people in at least some situations.

Uber, Privacy, and Discrimination

ETA: On the basis of the blog post below, I was invited to discuss possible problems with Uber’s passenger rating system on NPR’s “On the Media.” You can listen here.

For the uninitiated, Uber is a taxi-like company that, with the touch of a button on a ridiculously easy-to-use app, sends a driver in a classy black car to your location using GPS.  I recently learned that Uber not only stores data about its passengers, but also allows its drivers to rate passengers and makes the ratings available to other drivers.

I’ve long known that customers rate Uber drivers; when you order a car, you can see the one-to-five-star rating of the driver who is picking you up. And I’ve always assumed that the company keeps records of when you use the app and where you go. But it turns out that drivers also rate passengers, and that, at least anecdotally, drivers use your rating to decide whether and how quickly to pick you up. Before you order a car, the app shows where nearby cars are in real time and an estimate of how long it will take one to reach you.
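Uber has not published how (or whether) passenger ratings feed into dispatch, so the following is only a hypothetical sketch of the kind of driver-side rule the anecdotes suggest; the function and its thresholds are invented. If pickup decisions key off an aggregated passenger score, whatever bias is baked into that score translates directly into longer waits or outright refusals.

from dataclasses import dataclass

# Hypothetical sketch only: Uber's actual logic is not public, and every
# rule and threshold here is invented for illustration.

@dataclass
class RideRequest:
    passenger_id: str
    passenger_avg_rating: float  # 1.0-5.0, aggregated from prior drivers' ratings

def pickup_decision(request: RideRequest) -> str:
    """Toy rule: accept quickly, accept slowly, or decline, based on the score."""
    if request.passenger_avg_rating >= 4.7:
        return "accept immediately"
    if request.passenger_avg_rating >= 4.3:
        return "accept, but lower priority (longer wait)"
    return "decline"

for rating in (4.9, 4.5, 4.1):
    request = RideRequest(passenger_id="example", passenger_avg_rating=rating)
    print(f"average rating {rating}: {pickup_decision(request)}")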
