Great power, great responsibility, and unsolved challenges

Here I am once again pulling back in frustration and anger from my use of Facebook.

I don't want to have to do this. I want there to be an easy way for me to keep up with people I like, to promote things I like and help them succeed, and to engage in good conversations about what's happening in the world. I want the web to be smart and save me redundant effort. I want it to be easier for information, and reactions to it, to flow across multiple sites. I want people to be able to use the tools they are most comfortable with, and for that choice to be independent of the content encountered through the tools.

But I don't want my or my friends' demographic information and details of our activity to be continually and pervasively leveraged for corporate marketing purposes. I want it to be possible to share and participate without providing a neatly packaged commodity that can be used to alter our perception of the online world.

What Facebook's new Instant Personalization feature reminds me of is that manipulation of our online reality. Certainly the list of wants I gave above would be a change of great power to my online reality, but such a change demands great responsibility. It demands transparency (what is being changed, by whom, and in what other ways might it be presented to someone else?) and it demands control (opt in, not opt out). Facebook is not demonstrating that responsibility, nor does its history or the statements of its CEO suggest it is likely to.

In many specifics this is a design problem. My experience with the feature so far is that it's very hard to see what is happening or why. I want to give permission before a site can use my data or my friends'; that's something I should decide, not Facebook's business partnerships team. I want a way to lift the hood and see just what's being done underneath. I want a way to have a certain thing not done – and I want that way to be very obvious and easy.

Think about the difference between visiting an Instant Personalization site (e.g. the surprise of seeing a list of articles my Facebook contacts have commented on when I visit a site I didn't even know they were reading, and then trying to figure out how to stop that sharing of data with this site) versus visiting a site and having Firefox ask if I want to allow this site to open pop-up windows. The former is confusing and opaque, the latter clear and easily controlled. I don't want to have to fumble around trying to figure out how to prohibit the undesired action after the fact; I want to be asked first and given the option to set a policy for this site henceforth.

Why does this matter? I am very confident that Facebook's marketing and business-growth aims do not map exactly onto the boundaries of my trust. Just because they might think a particular company should be allowed to receive a package of social data (me, my demographics, who my friends are, and all their demographics, for instance) doesn't mean I would ever choose to package up all that info for the site myself.

"But it's public information!" you might say. Perhaps – though my confidence over what will and won't be shared is shaky given Facebook's company culture  – but the information wasn't shared by me (or my friends) for this purpose or context. It wasn't packaged by us for use across the web. There's a difference between me saying to Facebook "my friends can know when my birthday is" and Facebook saying to an online store "this user falls into this demographic group by age, gender, and location" so that they can adjust their pricing based on market research of what that particular group is willing to pay for their products. That's a hypothetical example off the top of my head, but it certainly seems to fit within the existing capabilities of the feature.

What compounds all these concerns is the fact that Facebook friends can share your data. User A can go to Site X and, by not blocking the feature, tell it all kinds of things about his friend User B. Maybe User B never goes to Site X because she does not trust it with her information, but that information has passed out of her control now.

In the course of removing all my "friends" on Facebook (and letting each know we're still friends in non-Facebook contexts), I was chatting about these concerns with my friend Glenda Bautista and she brought up a great analogy:

When you add a friend on Facebook or allow someone to add you as a friend, you end up being responsible for each other through your actions. As she said, it's messed-up logic to have to treat a tool like this as an STD, but that's just what it is: socially transmitted.

Play safe, gang.

4 thoughts on “Great power, great responsibility, and unsolved challenges”

  1. A few more thoughts on the topic.
    This isn’t a move driven by fear as much as it is irritation at the way they run their business. If their default was “opt in” instead of “opt out” I’d feel differently, but there’s clearly a disconnect between the way they encourage people to use Facebook and how the business behaves (& what the CEO says).
    For example, they create a feeling of privacy, of just talking to your friends, but then they add on a feature like this Instant Personalization, which – unless you make quite a bit of effort to notice the change and find out how to turn it off – will pass along information on you and your friends to third parties that they’ve decided to form a business relationship with. They’re encouraging a business culture of “opt out” as the default model, which runs counter to existing, better models of privacy and respect for users. It feels like their first focus is making quick money, not creating wildly happy long-term users/customers.
    Think about the bullshit that credit card companies were doing, which had to be legislated against to stop them from hiding the ways they made it hard for people to be good managers of their money. It’s going to be a while before good user experiences are mandated to keep companies from making it hard for their users to be good managers of their information. In the meantime, vote with your time, energy, and money. I don’t get enough payoff from personal use of Facebook for it to be worth encouraging the business-practice shift I think they’re trying to make, or to feel comfortable doing corporate demographics research for free for every company they partner with.
    It’s like deciding to buy my food from a local business that takes good care of its employees and my community instead of from some place owned by a distant megacorp that doesn’t. A little Discardian choice of where I will and won’t apply my time/energy/focus.

  2. Bravo!
    I really don’t like Facebook either and have deactivated my account once already, but I had to reactivate it for a work context, to communicate with a team through a page. That was two years ago, and it has tumbled downhill since.
    Last night I got to the point where I wanted to deactivate again, and a bunch of my Twitter friends tried to talk me out of it by giving me elaborate & time-costly ways of protecting my information, sanity, and friends.
    I just don’t think that it is worth it. Like you, I would rather have an opt-in than opt-outs that are really, really hard to find and activate.
    Basically, Facebook is now all set up to exploit its users, not create a social network.

  3. My solution was to send a “Still friends! See me elsewhere!” note to everyone who had friended my personal profile (as opposed to friends or “likers” of the Discardia page) and then unfriend everyone.
    Unsurprisingly, I’ve gotten a lot more done since eliminating that low-payoff, high-temptation distraction. 🙂

  4. I appreciate the tension between building an awesome social infrastructure and then needing to exploit it for commercial gain. It seems to me the net has been through that cycle more times than you can count. There have been social networks that have died under the weight of spam (Usenet), networks that couldn’t monetize based on advertising revenue and thus went private (Ning comes to mind), and networks that got assimilated by something much larger so they didn’t have to justify themselves quite so carefully (Blogger). And then there are all the ones that died (Compuserve, Prodigy, Delphi…) as technology marched forward.
    I’m personally fond of the worlds that I can inhabit (like Typepad) where there is a low but non-zero cost of entry; Metafilter lives in this space too.
