Monday, 19 July 2010

Facebook must stop tricking its users, says Danah Boyd

Danah Boyd, the researcher who accused Google and Facebook of failing users on privacy earlier this year, has called on Facebook to embrace “radical transparency”. In a post on her blog at the weekend, Boyd recalled the reaction to her speech at the South By Southwest conference in Texas:

“After my talk, I received numerous emails from folks at Google, including the PM in charge of Buzz. The tenor was consistent, effectively: ‘we f—– up, we’re trying to fix it, please help us.’ What startled me was the radio silence from Facebook…”

Boyd joins a growing number of technology experts criticising Facebook’s approach to privacy. Her argument carries particular weight because she backs it with data gathered from her own research into how people actually use these services.

“Youth are actually much more concerned about exposure than adults these days. Why? Probably because they get it. And it’s why they’re using fake names and trying to go on the DL (down-low).

“A while back, I was talking with a teenage girl about her privacy settings and noticed that she had made lots of content available to friends-of-friends. I asked her if she made her content available to her mother. She responded with, ‘of course not!’ I had noticed that she had listed her aunt as a friend of hers and so I surfed with her to her aunt’s page and pointed out that her mother was a friend of her aunt, thus a friend-of-a-friend. She was horrified. It had never dawned on her that her mother might be included in that grouping.”

This kind of confusion is understandable given the complexity of Facebook’s privacy settings. Elliot Schrage, Facebook’s vice president for public policy, told the New York Times last week that Facebook can’t win: it is criticised for not allowing enough control when the privacy settings are simple, and criticised for being too confusing when it allows more granular control.

However, Boyd says Facebook can do more:

“If Facebook wanted radical transparency, they could communicate to users every single person and entity who can see their content. They could notify them when the content is accessed by a partner. They could show them who all is included in ‘friends-of-friends’ (or at least a number of people). They hide behind lists because people’s abstractions allow them to share more. When people think ‘friends-of-friends’ they don’t think about all of the types of people that their friends might link to; they think of the people that their friends would bring to a dinner party if they were to host it. When they think of everyone, they think of individual people who might have an interest in them, not 3rd party services who want to monetize or redistribute their data. Users have no sense of how their data is being used and Facebook is not radically transparent about what that data is used for. Quite the opposite. Convolution works. It keeps the press out.”

Boyd closes by emphasising the importance of choice for Facebook users:

“The battle that is underway is not a battle over the future of privacy and publicity. It’s a battle over choice and informed consent. It’s unfolding because people are being duped, tricked, coerced, and confused into doing things where they don’t understand the consequences. Facebook keeps saying that it gives users choices, but that is completely unfair. It gives users the illusion of choice and hides the details away from them ‘for their own good.’”

However, Facebook is closing in on 500 million users. It might calculate that it can afford to annoy a few of them in order to get what it wants.
