In Social Media We (Don’t) Trust
In a post-Cambridge Analytica world, can we still rely on Facebook and friends?
Loss of trust is a powerful thing. Once it’s gone, it’s nearly impossible to get back. Yet, the last few years have shown that social media platforms don’t prioritise users’ trust at all.
Social media, once a place to share life updates and cute animal videos, has become something more insidious, more toxic than any of us could have imagined.
And while Facebook is the main culprit – and by Facebook I also mean WhatsApp, Instagram and Messenger – companies such as Twitter and LinkedIn aren’t completely blameless.
This raises the question: can we even trust social media companies anymore?
Like millions of millennials, I grew up on social media – painstakingly ranking my Top 8 friends on MySpace, writing angsty blogs on LiveJournal and poking friends during the early days of Facebook. But the dark turn these platforms have taken in recent years has made many of us re-evaluate what it means to be online today.
And for me, it goes beyond my personal use, as I also train companies on how to use social media in strategic, effective ways. Yes, it’s still about getting those clicks and reaching those eyeballs. But how do we do that while respecting privacy and without abusing trust?
In nearly every session with clients, I’m asked some version of the trust question – and it’s a fair one. We’re all anxious about what’s to come on these platforms.
But I’m here to say there’s hope. To loosely borrow from John Lennon and Yoko Ono: Distrust in social media is over – if you want it.
Change has to come from all of us. Social media companies and regulators need to work together to prevent history from repeating itself. And as users, we need to demand better from them.
What Social Media Companies Can Do
It may not be the fun or sexy thing to do, but social media platforms need to continue to build tools and resources for users to better understand privacy and how their data is being protected – or not protected.
And I’m not talking about a hidden privacy policy filled with jargon and confusing legalese. These companies need to create clear, interactive resources that all users can quickly access and understand. They need to equip users with the information needed to make smart decisions about their data.
On top of that, social media companies need to own their mistakes and take concrete action to make sure they don’t happen again.
Should Facebook put so much time and effort into launching Libra and Facebook Dating when it still hasn’t fixed the viral spread of false information on its platforms? Probably not.
Instead, it should focus on something like regular automated reports that tell users what data has been captured and give them the option to change their privacy settings on the spot. It’s the least Facebook could do, considering versions of these features already exist, buried in less obvious corners of its apps.
Social media companies need to follow through on their promises to users before diverting their attention to keeping shareholders happy. In other words, maybe tone it down with the cryptocurrency until you tackle the real issues, Zuck.
What Regulators Can Do
The biggest issue with regulators, for me, is an obvious one: they have no idea what they’re doing.
Probably the most infamous moment of Zuckerberg’s 2018 testimony to the US Congress came when Senator Orrin Hatch asked how Facebook sustains a business model in which users don’t pay for the service – and Zuckerberg had to explain, “Senator, we run ads.”
If those in power can’t wrap their heads around the basic business models of these companies, how can we expect them to create effective legislation that can have a real impact?
Regulators need to educate themselves on how these platforms work, as well as gain a better understanding of what’s at stake if they don’t act. If we keep hauling the Mark Zuckerbergs and Jack Dorseys of the world into government hearings run by people with no grasp of what is actually going on, we’ll never find the solutions we need to effect change.
Tech literacy is the key to addressing the issues users face on social media today, but understanding alone isn’t enough. Regulators need to legally formalise the duties companies must carry out to ensure reasonable protection of users’ data. After all, initiatives like the GDPR only work if everyone actually understands and abides by them.
We need to make sure that prioritising users’ privacy is the industry standard, not just an empty promise that social media giants will break over and over again.
What We Can Do
As users of these platforms, we have to be better too.
Just like it’s important to be a positive, contributing citizen to the community around you, it’s equally important to be a positive, contributing netizen to the communities you are a part of online.
This means being more aware of the content you’re reading and sharing, reporting harmful or malicious posts and knowing how your data is being used to target you with ads and messages.
We owe it to ourselves and each other to treat our online communities the way we treat our local communities. Write letters to your lawmakers, sign petitions, make your voice heard, educate yourself on the issues – the faster we all recognise that data rights are human rights, the better off we become.