Some Thoughts on the Issues to Consider Before Leaving Facebook
Weighing the benefits of staying against compromising one’s personal values and supporting a possibly harmful business.
Listen to this week's Think Queerly Podcast episode for a more in-depth discussion:
Prefer to watch this episode? Subscribe to my YouTube channel:
Last week, a friend of mine wrote a post on Facebook to say that she was strongly considering leaving the platform forever.
While she benefits from some aspects of the platform, she wrote,
“I also feel out of integrity supporting a platform that is detrimental to the current and future society I want to live in. Simply by being here, I encourage their business model and reward their decision-making. From a values perspective, I feel like I should go.”
She concluded by asking people to offer contrarian perspectives. Here is what I wrote:
I’m sharing some thoughts off the top of my head that I’ve been thinking about for some time. Thanks to your post, these ideas are now going to get some air. Please note this is somewhat of a stream of consciousness, and I look forward to other people’s insights.
There are a great number of issues to consider around leaving Facebook.
These include your reasons for using the platform in the first place and how it compares with other social media platforms, as well as the weight of the value and benefits it offers you versus the weight of the “higher” values it contravenes.
Then there are the issues that surround the structure of Facebook itself. In the United States, I believe, Facebook is not considered a news media corporation and so is not subject to the laws and regulations that govern the news media.
When it comes to how the platform disseminates information, this is where, I believe, the platform should be broken apart. We cannot deny that Facebook has become a source of news, with many people going to Facebook instead of buying a newspaper, watching the news, or reading it online directly from its source, say, The Guardian. If the part of Facebook that acts as a news service were governed as a news organization, that might alleviate some of the major challenges around the spread of false information.
Oversight of Facebook is required for greater public safety. Can users of the platform have a direct and meaningful impact? How could we come up with a democratic process such that the people who keep the platform alive — the individual users — have a vote of some form that results in measurable, structural, “positive” changes?
Most of us have forgotten what life was like about 15 years ago, before Facebook existed.
What did we do back then? We maintained relationships by visiting in person, making phone calls, sending emails, and writing letters.
Facebook came along and slowly made most of those options virtual, but it also expanded the number of people we could call “friends”. Most of the people we call friends on Facebook are not. Just because Facebook uses the label “Friends” doesn’t mean they are, not unlike walking into an independent coffee shop and ordering a “Grande”, a term coined by Starbucks.
That’s not meant to be mean or disrespectful to the people following someone’s profile; it is simply the truth. As social beings, even the most gregarious person doesn’t have more than a handful of very close friends (not counting family), and perhaps no more than a few dozen other people they might also call friends, or acquaintances.
For those of us who have a message to share with the world, or who want to put the work we do at the forefront of social media, Facebook has continuously adjusted its algorithm to make organic reach next to impossible, unless you pay to advertise, which is now the only way to get your posts to the top of the feeds of the people who follow you.
Organic engagement has dropped to pretty much zero. For me, it raises the question: why can’t people read what I have to say and interact with me without my having to pay to make what I share visible in their newsfeed?
Facebook does not allow individual users to control how their own feed is prioritized, which I think should be possible at both the macro and micro levels. The option, for example, to “see less recommended posts or people” is infuriating: I don’t want to see recommendations at all. The algorithm is already controlling my feed, so don’t gaslight me with the “idea” that I have a choice about what I get to see and control.
Is there not enough profit for Facebook to earn from selling paid advertisements, or even from some sort of membership structure, instead of suppressing the information that people want to share with those who follow them? That’s a rhetorical question.
What all this demonstrates (in my opinion) is Zuckerberg’s need to control how people consume information — and how they can be manipulated to remain (hooked) on the platform.
Facebook has changed the landscape for the sharing and consumption of information. I believe this needs to be analyzed and disrupted so we can return to something we used to have: a much more humane way of communicating and interacting than what we are experiencing now.