Here I am once again pulling back in frustration and anger from my use of Facebook.
I don't want to have to do this. I want there to be an easy way for me to keep up with people I like, to promote things I like and help them succeed, and to engage in good conversations about what's happening in the world. I want the web to be smart and save me redundant effort. I want it to be easier for information and reactions to it to flow throughout multiple sites. I want people to be able to use the tools they are most comfortable with and for that choice to be independent from the content encountered through the tools.
But I don't want my or my friends' demographic information and details of our activity to be continually and pervasively leveraged for corporate marketing purposes. I want it to be possible to share and participate without providing a neatly packaged commodity that can be used to alter our perception of the online world.
Facebook's new Instant Personalization feature reminds me of exactly that manipulation of our online reality. Certainly the list of wants I gave above would be a powerful change to my online reality, but such power demands great responsibility. It demands transparency – what is being changed, by whom, and in what other ways might it be presented to someone else? – and it demands control – opt in, not opt out. Facebook is not demonstrating that responsibility, nor does its history or the statements of its CEO suggest it is likely to.
In many specifics this is a design problem. My experience with the feature so far is that it's very hard to see what is happening or why. I want to give permission before a site can use my data or my friends'; that's something I should decide, not Facebook's business partnerships team. I want a way to lift the hood and see just what's being done underneath. I want a way to have a certain thing not done – and I want that way to be very obvious and easy.
Think about the difference between visiting an Instant Personalization site and visiting a site where Firefox asks whether I want to allow pop-up windows. With the former, I get the surprise of seeing a list of articles my Facebook contacts commented on – on a site I didn't even know they were reading – and then have to figure out how to stop that sharing of my data with this site. The former is confusing and opaque; the latter is clear and easily controlled. I don't want to fumble around trying to figure out how to prohibit the undesired action after the fact; I want to be asked first and given the option to set a policy for that site henceforth.
Why does this matter? I am very confident that Facebook's marketing and business growth aims do not map onto whom I actually trust. Just because Facebook thinks a particular company should be allowed to receive a package of social data (me, my demographics, who my friends are, and all their demographics, for instance) doesn't mean I would ever choose to package up all that info for the site myself.
"But it's public information!" you might say. Perhaps – though my confidence about what will and won't be shared is shaky given Facebook's company culture – but the information wasn't shared by me (or my friends) for this purpose or context. It wasn't packaged by us for use across the web. There's a difference between me saying to Facebook "my friends can know when my birthday is" and Facebook saying to an online store "this user falls into this demographic group by age, gender, and location" so that the store can adjust its pricing based on market research of what that particular group is willing to pay. That's a hypothetical example off the top of my head, but it certainly seems to fit within the existing capabilities of the feature.
What compounds all these concerns is that Facebook friends can share your data. User A can go to Site X and, simply by not blocking the feature, tell it all kinds of things about his friend User B. Maybe User B never goes to Site X because she doesn't trust it with her information, but that information has passed out of her control now.
In the course of removing all my "friends" on Facebook (and letting each know we're still friends in non-Facebook contexts), I was chatting about these concerns with my friend Glenda Bautista and she brought up a great analogy:
When you add a friend on Facebook, or allow someone to add you as a friend, you become responsible for each other: each person's choices can expose the other's data. As she said, it's messed-up logic to have to treat a tool like this as an STD, but that's just what it is: socially transmitted.
Play safe, gang.