8.7 Privacy Peril: Beacon and the TOS Debacle
Learning Objectives
After studying this section you should be able to do the following:
- Understand the difference between opt-in and opt-out efforts.
- Recognize how user issues and procedural implementation can derail even well-intentioned information systems efforts.
- Recognize the risks associated with being a pioneer in new media efforts, and understand how missteps led to Facebook and its partners being embarrassed (and in some cases sued) over Beacon’s design and rollout issues.
Conventional advertising may grow into a great business for Facebook, but the firm was clearly sitting on something that was unconventional compared to prior generations of Web services. Could the energy and viral nature of social networks be harnessed to offer truly useful consumer information to its users? Word of mouth is considered the most persuasive (and valuable) form of marketing (Kumar, et al., 2007), and Facebook was a giant word-of-mouth machine. What if the firm worked with vendors and grabbed consumer activity at the point of purchase to put into the News Feed and post to a user’s profile? If you rented a video, bought a cool product, or dropped something in your wish list, your buddies could get a heads-up, and they might ask you about it. The person being asked feels like an expert, the person with the question gets a frank opinion, and the vendor providing the data just might get another sale. It looked like a home run.
This effort, named Beacon, was announced in November 2007. Some forty e-commerce sites signed up, including Blockbuster, Fandango, eBay, Travelocity, Zappos, and the New York Times. Zuckerberg was so confident of the effort that he stood before a group of Madison Avenue ad executives and declared that Beacon would represent a “once-in-a-hundred-years” fundamental change in the way media works.
As with News Feed, user reaction was swift and brutal. The commercial activity of Facebook users began showing up without their consent. The biggest problem with Beacon was that it was “opt-out” instead of “opt-in.” Facebook (and its partners) assumed users would agree to sharing data in their feeds. A pop-up box did appear briefly on most sites supporting Beacon, but it disappeared after a few seconds (Nakashima, 2007). Many users, blind to these sorts of alerts, either clicked through or ignored the warnings. And well…there are some purchases you might not want to broadcast to the world.
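The opt-out versus opt-in distinction above can be sketched in a few lines of code. This is a hypothetical illustration, not Facebook’s actual implementation; the function and parameter names are invented for the example. The key point is what happens when a user gives no explicit answer, as when Beacon’s pop-up vanished before it was noticed:

```python
# Hypothetical sketch of opt-out vs. opt-in defaults (not Facebook's real code).
# Under opt-out, a purchase is shared unless the user explicitly declines;
# under opt-in, silence means nothing is shared.

def should_share(user_response, default_policy):
    """Return True if a purchase may be posted to the feed.

    user_response:  "allow", "decline", or None (user missed or ignored the prompt)
    default_policy: "opt-out" shares by default; "opt-in" requires explicit consent
    """
    if user_response == "allow":
        return True
    if user_response == "decline":
        return False
    # No explicit answer -- the default decides. This is exactly where
    # Beacon's original design went wrong.
    return default_policy == "opt-out"

# Beacon's original design: a missed pop-up still meant the purchase was shared.
assert should_share(None, "opt-out") is True
# The revised design: no explicit consent, no post.
assert should_share(None, "opt-in") is False
```

The sketch makes the design lesson concrete: the two systems behave identically when users respond, and differ only in the silent default, which is precisely the case Beacon’s fleeting pop-up guaranteed would be common.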
“Facebook Ruins Christmas for Everyone!” screamed one headline from MSNBC.com. Another from U.S. News and World Report read “How Facebook Stole Christmas.” The Washington Post ran the story of Sean Lane, a twenty-eight-year-old tech support worker from Waltham, Massachusetts, who got a message from his wife just two hours after he bought a ring on Overstock.com. “Who is this ring for?” she wanted to know. Facebook had not only posted a feed that her husband had bought the ring, but also that he got it for a 51 percent discount! Overstock quickly announced that it was halting participation in Beacon until Facebook changed its practice to opt in (Nakashima, 2007).
MoveOn.org started a Facebook group and online petition protesting Beacon. The Center for Digital Democracy and the U.S. Public Interest Research Group asked the Federal Trade Commission to investigate Facebook’s advertising programs. And a Dallas woman sued Blockbuster for violating the Video Privacy Protection Act (a 1988 U.S. law prohibiting unauthorized access to video store rental records).
To Facebook’s credit, the firm acted swiftly. Beacon was switched to an opt-in system, in which users must give consent before partner data is sent to their feeds. Zuckerberg would later say regarding Beacon: “We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it” (McCarthy, 2007). Beacon was eventually shut down, and $9.5 million was donated to various privacy groups as part of its legal settlement (Brodkin, 2009). Despite the Beacon fiasco, new users continued to flock to the site, and loyal users stuck with Zuck. Perhaps a bigger problem was that many of those forty A-list e-commerce sites that took a gamble with Facebook now had their names associated with a privacy screw-up that made headlines worldwide. A manager so burned isn’t likely to sign up first for the next round of experimentation.
From the Prada example in Chapter 3 “Zara: Fast Fashion from Savvy Systems” we learned that savvy managers look beyond technology and consider complete information systems—not just the hardware and software of technology but also the interactions among the data, people, and procedures that make up (and are impacted by) information systems. Beacon’s failure is a cautionary tale of what can go wrong if a firm fails to broadly consider the impact and implications of an information system on all those it can touch. Technology’s reach is often farther, wider, and more significant than we originally expect.
Reputation Damage and Increased Scrutiny—The Facebook TOS Debacle
Facebook also suffered damage to its reputation, brand, and credibility, further reinforcing perceptions that the company acts brazenly, without considering user needs, and plays fast and loose with privacy and user notification. Facebook worked through the feeds outrage, eventually convincing users of the benefits of feeds. But Beacon was a fiasco. And now users, the media, and watchdogs were on the alert.
When the firm modified its terms of service (TOS) policy in Spring 2009, the uproar was immediate. As a cover story in New York magazine summed it up, Facebook’s new TOS appeared to state, “We can do anything we want with your content, forever,” even if a user deletes their account and leaves the service (Grigoriadis, 2009). Yet another privacy backlash!
Activists organized, the press crafted juicy, attention-grabbing headlines, and the firm was forced once again to backtrack. But here’s where others can learn from Facebook’s missteps and response. The firm was contrite and reached out to explain and engage users. The old TOS were reinstated, and the firm posted a proposed new version that gave the firm broad latitude in leveraging user content without claiming ownership. And the firm renounced the right to use this content if a user closed their Facebook account. This new TOS was offered in a way that solicited user comments, and it was submitted to a community vote, considered binding if 30 percent of Facebook users participated. Zuckerberg’s move appeared to have turned Facebook into a democracy and helped empower users to determine the firm’s next step.
Despite the uproar, only about 1 percent of Facebook users eventually voted on the measure, but the 74 percent to 26 percent ruling in favor of the change gave Facebook some cover to move forward (Smith, 2009). This event also demonstrates that a tempest can be generated by a relatively small number of passionate users. Firms ignore the vocal and influential at their own peril!
In Facebook’s defense, the broad TOS was probably more a form of legal protection than any nefarious attempt to exploit all user posts ad infinitum. The U.S. legal environment does require that explicit terms be defined and communicated to users, even if these are tough for laypeople to understand. But a “trust us” attitude toward user data doesn’t work, particularly for a firm considered to have committed ham-handed gaffes in the past. Managers must learn from the freewheeling Facebook community. In the era of social media, your actions are now subject to immediate and sustained review. Violate the public trust and expect the equivalent of a high-powered investigative microscope examining your every move, and a very public airing of the findings.
Key Takeaways
- Word of mouth is the most powerful method for promoting products and services, and Beacon was conceived as a giant word-of-mouth machine with win-win benefits for firms, recommenders, recommendation recipients, and Facebook.
- Beacon failed because it was an opt-out system that was not thoroughly tested beforehand, and because user behavior, expectations, and system procedures were not completely taken into account.
- Partners associated with the rapidly rolled out, poorly conceived, and untested effort were embarrassed. Several faced legal action.
- Facebook also reinforced negative perceptions regarding the firm’s attitudes toward users, notification, and privacy. These missteps only served to focus a continued spotlight on the firm’s efforts, and users became even less forgiving.
- Activists and the media were merciless in criticizing the firm’s Terms of Service changes. Facebook’s democratizing response offers lessons for other organizations regarding user scrutiny, public reaction, and stakeholder engagement.
Questions and Exercises
- What is Beacon? Why was it initially thought to be a good idea? What were the benefits to firm partners, recommenders, recommendation recipients, and Facebook? Who were Beacon’s partners and what did they seek to gain through the effort?
- Describe “the biggest problem with Beacon.” Would you use Beacon? Why or why not?
- How might Facebook and its partners have avoided the problems with Beacon? Could the effort be restructured while still delivering on its initial promise? Why or why not?
- Beacon shows the risk in being a pioneer—are there risks in being too cautious and not pioneering with innovative, ground-floor marketing efforts? What kinds of benefits might a firm miss out on? Is there a disadvantage in being late to the party with these efforts, as well? Why or why not?
- Why do you think Facebook changed its Terms of Service? Did these changes concern you? Were users right to rebel? What could Facebook have done to avoid the problem? Did Facebook do a good job in follow-up? How would you advise Facebook to apply lessons learned from the TOS controversy?
References
Brodkin, J., “Facebook Shuts Down Beacon Program, Donates $9.5 Million to Settle Lawsuit,” NetworkWorld, December 8, 2009.
Grigoriadis, V., “Do You Own Facebook? Or Does Facebook Own You?” New York, April 5, 2009.
Kumar, V., J. Andrew Petersen, and Robert Leone, “How Valuable Is Word of Mouth?” Harvard Business Review 85, no. 10 (October 2007): 139–46.
McCarthy, C., “Facebook’s Zuckerberg: ‘We Simply Did a Bad Job’ Handling Beacon,” CNET, December 5, 2007.
Nakashima, E., “Feeling Betrayed, Facebook Users Force Site to Honor Their Privacy,” Washington Post, November 30, 2007.
Smith, J., “Facebook TOS Voting Concludes, Users Vote for New Revised Documents,” Inside Facebook, April 23, 2009.