The Ninth Circuit made a potentially big decision yesterday – Fair Housing Council v. Roommates.com – that could significantly increase website companies’ liability for content posted by their users (I’ve posted the decision below the fold as an iPaper). Eugene Volokh and Susan Crawford both have interesting things to say about the case. Volokh thinks it’s both correct and fairly inconsequential. I’m not sure I agree. To me, this case illustrates the stark divide between law-in-theory and law-in-practice. As a purely theoretical matter, I tend to agree with Volokh that the decision isn’t a big deal. But when you consider the practical implications, I think it becomes quite harmful.
First, some background. Let’s say that, in the comments section here, Gary Farber accuses “Cleek” of having poor indie-rock sensibilities. Outraged, Cleek sues Typepad (the host of our blog) for defamation. Typepad, however, didn’t actually say anything about Cleek’s music tastes; it just made blog space available for others to provide comments.
In this case, Cleek would be out of luck because of 47 U.S.C. § 230. Section 230 grants immunity to websites and service providers for content posted by other parties that use their sites. Thus, you can’t sue Yahoo for offensive statements made on chat boards, nor can you sue Google for content that its search engine pulls up. It’s a good law, and it’s been interpreted quite broadly over the years to avoid “chilling” Internet activity. For instance, Blogger wouldn’t exist if the company could have been sued for every random comment on blogspot sites.
With that in mind, the next relevant law is the Fair Housing Act. To be grossly general, it prevents housing brokers from discriminating on the basis of, among other things, sex and family status. Because brokers can’t refuse to sell to families, they generally can’t even ask whether you have children.
Enter Roommates.com. As the name suggests, this site allows people to find and provide housing. To get started, the site’s users must enter information about themselves in response to various prompts (drop-down menus, as I understand it). Some of these prompts asked questions about children and gender (e.g., “Children present” or “Children not present”) that would normally be illegal for a housing broker to ask.
Long story short, Roommates got sued under the Fair Housing Act. Roommates responded that it was immune under Section 230 because its users were the ones who actually entered the information. The court disagreed, finding – and here’s the key – that the structure of the question prompts was itself illegal content creation. In other words, the prompts themselves were illegal because they required people to answer illegal questions. Critically, the court went on to find that the site’s search engine also lacked immunity because its results were based on these illegally structured questions and answers. (This is a very brief description, so read the opinion below the fold if you want more detail.)
As a purely theoretical matter, the decision seems harmless enough. As Volokh explains, Roommates was “channel[ing] the speech in likely illegal directions.” Also, as Crawford notes, the logic of the decision could be limited to this case’s more egregious facts. If it is so limited, the decision won’t be that big a deal.
The trouble, though, is that the case will create big problems in the real world. Specifically, it will be impossible to cabin the case’s logic to these specific facts. Vague holdings, like children, tend to grow more expensive over time. If the Supreme Court lets the decision stand, I predict that it will significantly increase litigation and chill Internet activity (e.g., sites like Roommates will become much less efficient as structured prompts give way to open-ended bulletin boards).
The reason I’m skeptical is that litigation is often done in bad faith. As any real litigator will tell you, the point of litigation isn’t necessarily to vindicate a right, but to harass an opponent with discovery, document productions, and other expensive tactics. So long as a claim is plausible, you can inflict real damage (and maybe get a favorable settlement) even if you think you will ultimately lose.
The beauty of broad, bright-line immunity under Section 230 is that it allows parties to end this type of litigation quickly. If Google or Craigslist gets sued for content its users create, then the companies can immediately file a summary judgment motion (or a motion to dismiss) and end the case before discovery begins. If that line becomes less bright, then it’s harder to shut down frivolous (or at least losing) litigation early.
As Crawford notes, this decision clearly blurs that line. The real-world implication is that litigation will become easier to start and harder to finish. If Google attempts to dismiss a lawsuit, the plaintiff can always respond that broad, expensive discovery is needed to look at the inner workings of the search engine algorithms, or to depose the people who engineered the prompts, or whatever. In fact, the mere potential of expensive litigation will require companies to spend scarce resources on compliance and oversight to make sure their programs don’t cross the line.
Appellate judges need to realize that making bright lines fuzzy vastly increases litigation costs and business expenses. These increased costs are not necessarily a sufficient reason to abandon bright lines. But when you think about the immense benefits of keeping the Internet humming, we need to be very wary of imposing these sorts of costs.
Case below - (by the way, this feature seems pretty cool):