In another place, in another life (i.e., my day job), I manage social media for a nonprofit. If that sounds boring, that’s because I don’t use ridiculous phrases like “social media guru”—because I’m a self-respecting human being.

Anywho. I do the social mediaz for a living, and I should say that for the longest time my platform of choice was Facebook—until it went public, that is. After the company launched its disappointing IPO, things began to change. Facebook changed its algorithm, essentially quashing the small-publisher (and, might I add, small-nonprofit) user base. We began to see more clickbait, fewer wedding announcements. The platform that began with “[Your name] is…” has arrived at a place where personal connections matter less than referral traffic.

I noticed something else along the way. As a libertarian, I was beginning to see more links, statuses, memes, and rants from my fellow libertarian friends—and less and less from my college friends, 99% of whom were liberal, or from my family, who are 99% conservative. What gives? I like those people and enjoy talking with them, even when we don’t agree. In fact, I probably enjoy talking to those people more when we disagree, because they’re less likely to pull a No True Scotsman or straw-man me and send me into a blind, seething rage that makes me want to reach through a computer screen and shake the person by the shoulders screaming, “WHY DO YOU EXIST?!”

Why have these people, with whom I generally enjoyed conversing on a regular basis on my Facebook page (oh, sorry, excuse me—my “timeline”), suddenly vanished? Likely because Facebook was giving me what it thought I wanted. After all, if Facebook determines what I want to see more of from the data I put into it—the things I “like” and what I click through to read—why wouldn’t it think that I wanted to see more from my libertarian friends? Social-media-conscious woman though I am, even I am not immune to the impulse to “like” something I agree with and move on with my life.

Therein lies the problem. Facebook’s algorithm (or, really, any algorithm based on spontaneous feedback) thinks it knows me and attempts to give me more of what it thinks I want. And to Casual Reader Gina, that may be what I want: to skim a page, see all my biases and worldviews confirmed, and reaffirm that I am indeed Awesome and Right.
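For the technically curious, the feedback loop described above can be boiled down to a toy model. This is emphatically not Facebook’s actual algorithm—just a minimal sketch, assuming the simplest possible rule: each “like” boosts a viewpoint’s score, so the feed drifts toward whatever you already agree with.

```python
# Toy model of an engagement-driven feed (NOT Facebook's real system).
# Each "like" raises a viewpoint's affinity score, so posts matching
# your past likes float to the top of the feed.
from collections import Counter

def rank_feed(posts, like_history):
    """Sort posts by how often the reader has liked that viewpoint before."""
    affinity = Counter(like_history)  # viewpoint -> number of past likes
    return sorted(posts, key=lambda p: affinity[p], reverse=True)

likes = ["libertarian", "libertarian", "libertarian", "liberal"]
feed = rank_feed(["conservative", "liberal", "libertarian"], likes)
print(feed)  # ['libertarian', 'liberal', 'conservative']
```

Notice there is no notion of what the reader *needs* anywhere in the model—only a tally of what she has already clicked. That is the whole problem in nine lines.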

But that is not what I need. That is not what anyone needs, to be constantly affirmed in what they believe. It is fundamentally good to be confronted with ideas that you don’t agree with, even when they seem outrageous or offensive.

The problem is that there’s no easy way to get what you need through Facebook or other algorithm-based feeds. Sure, you can go in, create lists, and make a point of constantly checking the list of people you respect but don’t agree with. But doing that is kinda like eating Brussels sprouts: you know you should do it, but if it’s a chore, you’re going to do it a lot less often.

There’s no survey, no settings to fiddle with, no one to contact at Headquarters to say “Hey, can you reset my algorithm preferences? Kthxbai.” If I wanted to counter the Facebook algorithm, I’d have to spend days, weeks, years, going through and consciously liking things I didn’t agree with just to get them to my screen. So the only people getting what they need are people who care enough to make the conscious effort—and quite a large effort at that.

Is this what our world of ideas has come to?

Don’t get me wrong. I am not a technophobe. I do not think that technology inherently creates an echo chamber (although I do think algorithm-based feeds get us much closer to one than we have ever been). Human beings instinctively seek out what they want, not what they need, and have always done so. Technology just enables us to do more of what we have always done. But it is a little disconcerting that, with the miracles technology affords us, getting what we need seems to be much more difficult than getting what we want—at least in this instance.

I feel like there might be a point in here somewhere about markets delivering what we want but not what we need—but it could also just be a matter of Facebook not delivering a full suite of services to its customers. Whichever is the case, it behooves those of us who can’t shake the Facebook addiction to be aware of just how we are receiving information—and to consider that getting what you need may be worth the brief sacrifice in pride that comes from “liking” something on Facebook that you hate.

  • In this case isn’t diversity of opinions what you want though?

    • Well, sometimes what you need and what you want coincide, but like any human being, I want to have my biases confirmed—and as a rational person, I know that I need to hear conflicting viewpoints.

      Just like I want to eat chips and drink soda and not get fat, but I know I need to eat other things too. On some level, I want that as well, because I don’t want to die, but we’re just talking basic definitions of want and need here.

  • Christopher Shafer

    Echo chambers are dangerous things. It’s the mental equivalent of avoiding vaccines because you think they make you autistic.

    • Ironically, everyone here is in agreement.

  • AuntMerryweather

    Excellent piece, Gina. But Facebook isn’t going to give you what you need, only what you click on. No algorithm can parse the human psyche, or Internet snark for that matter. No program can differentiate a hate-click from an actual click. Facebook is the new “vast wasteland” that TV was from the ’60s through the ’90s. I expect that those who are willing and able to disengage from the matrix and live an “unquantified life” from time to time will report higher levels of mental health and well-being than those who opt to stay plugged in (as though plugged-in were always the default state of being, instead of an invention of the last decade).

    • Excellent piece, Gina.

      :ginormous grin: Seriously. Praise from you is high praise indeed. 😀

      But Facebook isn’t going to give you what you need, only what you click
      on. No algorithm can parse the human psyche, or Internet snark for that
      matter.

      I agree, which is why I suggest that people, myself included, click and like things that they dislike, to try and push the algorithm back a bit. It spits out what you put into it. But it’s also hella difficult to resist the impulse to like the things that confirm your biases and just ignore and move on from those that don’t.

      I’m still working on it m’self.

      I expect that those who are willing and able to disengage from the
      matrix and live an “unquantified life” from time to time will report
      higher levels of mental health and well-being than those who opt to stay
      plugged in (as though plugged-in were always the default state of
      being, instead of an invention of the last decade).

      Yeah, I think so too, to a certain extent. I’ve never had a problem unplugging (whereas I see lots of people checking Facebook on their phones in the middle of dinner or a visit with a friend—during non-idle time; it drives me nuts), and I usually go most weekends w/o FB.

      Still, I find it a useful content distributor. I just think that ever since the FB algorithm shift, I’ve had fewer interesting conversations and more just “likes” from the same people over and over again—and trolls.

      It’s sad. I used to have really, really good conversations on FB.