In her written testimony to the US Senate on October 4, Facebook whistleblower Frances Haugen states, “Right now, Facebook chooses what information billions of people see, shaping their perception of reality.”
A data scientist and former Facebook employee, Haugen leaked a trove of internal documents to The Wall Street Journal and to US regulators, alleging that the tech giant knew its products were fuelling hate and harming children’s mental health by constantly choosing “profit over safety”.
“Even those who don’t use Facebook are impacted by the radicalisation of people who do. A company with control over our deepest thoughts, feelings and behaviours needs real oversight.”
Haugen testified before the US Senate, detailing the harmful policies of the “morally bankrupt” company. The hearing was prompted by a damning Wall Street Journal report that accused Facebook of downplaying its own research on the negative impact of its Instagram app on teenagers’ mental health.
Facebook then quietly published its internal research, following which the US Senate grilled the multinational tech company in an hours-long Capitol Hill hearing early this month. Lawmakers and news outlets such as The Washington Post and Bloomberg have dubbed the event Facebook’s ‘Big Tobacco’ moment. Since then, several US news organisations have been publishing related stories, collectively called ‘The Facebook Papers’.
After damning reports surfaced that implicated the social media giant in sowing discord among its users and ignoring mental health red flags, it is time to re-examine Big Tech’s role in moderating content.
Haugen was part of Facebook’s civic integrity team but left after witnessing that, despite having the tools to act, the tech giant was prioritising profits and was unwilling to address crucial issues such as the dissemination of misinformation. She called for the company to be regulated. “Facebook…is subsidising, it is paying for its profits with our safety,” Haugen said.
Haugen also revealed that Facebook optimises for content that gets user engagement, measured in comments, likes and shares on social media. “Facebook makes more money when you consume more content,” she added. The social networking site is no stranger to manipulating emotions in order to increase engagement. In 2012, Facebook conducted a controversial human research study that tested the effects of manipulating newsfeeds based on emotions, altering the algorithms it uses to determine which status updates appeared in the newsfeeds of 689,003 users. “[T]he more anger that [users] get exposed to, the more they interact, the more they consume,” Haugen said.
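To make the mechanism concrete, the sketch below shows, in simplified Python, how a feed might be ranked purely by engagement signals. The Post fields and the weights are illustrative assumptions, not Facebook’s actual formula, which is proprietary and far more complex.

```python
# A minimal, illustrative sketch of engagement-based feed ranking.
# The Post fields and the weights below are hypothetical; Facebook's
# real ranking model is far more complex and not public.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

# Hypothetical weights: comments and shares signal stronger
# engagement than likes, so they count for more in the score.
WEIGHTS = {"likes": 1.0, "comments": 3.0, "shares": 5.0}

def engagement_score(post: Post) -> float:
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement posts surface first in the feed.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm news summary", likes=120, comments=4, shares=2),
    Post("outrage-bait hot take", likes=80, comments=60, shares=40),
])
print([p.text for p in feed])  # the provocative post ranks first
```

Because comments and shares are weighted more heavily than likes in this toy model, a provocative post that draws arguments and reshares outranks a calmer post with more likes, which is precisely the dynamic Haugen describes.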
According to her, the company changed its content policies ahead of the 2020 US election, implementing safeguards that reduced misinformation by giving political content a lower priority in the news feed. However, soon after the election it rolled those safeguards back and returned to the old algorithms that prioritised user engagement, a reversal Haugen links to the riot at the US Capitol. “Because they [Facebook] wanted that growth back after the election, they returned to their original defaults,” Haugen said. “I think that’s deeply problematic.”
Speaking to Eos, Muhammad Umair, a research and development engineer based in Sweden, explains, “Most social media apps, including Facebook newsfeed, use basic algorithmic recommendation systems. These recommendation systems are ingrained in almost everything digital these days.”
Websites usually use cookies to track their users’ activity and past behaviour. Recommender systems, a popular application of machine learning, use this data to predict a user’s preferences and interests. They are designed to help users avoid choice overload, but they also serve the provider’s goals, such as user engagement or increased sales.
“So for instance when you start browsing a certain product on an online shopping portal like Amazon, you start getting recommendations of similar products that are either substitutes or complements,” Umair explains. “When you are watching a certain show on Netflix, similar shows will pop up in your recommendations. On Instagram, if you are following a certain type of influencer, you will be recommended to follow similar people. In this way, apps make sure that users stay engaged.”
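A minimal sketch of the item-to-item technique Umair describes appears below, using cosine similarity over a made-up table of user-item interactions. The item names, the data and the scoring are hypothetical; production systems at companies like Amazon or Netflix are far more sophisticated.

```python
# A minimal sketch of item-to-item recommendation by cosine similarity.
# The interaction matrix and item names are invented for illustration.
import math

# Rows: users, columns: items. 1 = the user interacted with the item.
items = ["kettle", "teapot", "phone case", "tea cups"]
interactions = {
    "user_a": [1, 1, 0, 1],
    "user_b": [1, 1, 0, 0],
    "user_c": [0, 0, 1, 0],
    "user_d": [1, 0, 0, 1],
}

def item_vector(col: int) -> list[int]:
    # The column of users who interacted with this item.
    return [row[col] for row in interactions.values()]

def cosine(u: list[int], v: list[int]) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def similar_items(name: str) -> list[tuple[str, float]]:
    i = items.index(name)
    scores = [(items[j], cosine(item_vector(i), item_vector(j)))
              for j in range(len(items)) if j != i]
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Browsing the kettle surfaces complements chosen by similar users:
# the teapot and tea cups score highly, the phone case scores zero.
print(similar_items("kettle"))
```

The intuition is the one Umair gives: items favoured by overlapping sets of users score as similar, so a user who browses one is nudged towards the others, keeping them engaged.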
Noman Khalid, founding partner and chief data architect at Love For Data — a data science and decision management consultancy firm using artificial intelligence and statistical learning tools — says, “The entire social media is biased.” When you use search engines, your results are “customised” to what you want to see. “You and I will get different results when we search for the same thing,” says Khalid. Every website that wants to sell or “is interested in knowing you” collects your data. “How they store it, how they process it and how they monetise it, is all dependent on their own moral and ethical values,” he says.
The goal is to design an app or user interface that is hard for people to get away from. “Initially, Facebook was only about sharing or viewing statuses and photos of your life’s highlights,” says Umair. “Then Snapchat introduced stories, which allowed people to share snapshots of their day-to-day lives. This feature was then adopted by Instagram and Facebook, so now we are constantly sharing our lives,” he continues.
“One of the things I have noticed, even with my clients, is that they mention how people on Instagram have such good lives, the places they are visiting and the way they look so perfect,” says Ali Madeeh Hashmi, a psychiatrist and professor of psychiatry at King Edward Medical University. “Often they fail to realise that Facebook and Instagram are curated platforms. People never put out their worst photos, only their best ones.”
Hashmi adds that the one thing that nudges people towards depression is when they start comparing their worst days to other people’s best “Instagram days”. The constant comparison with others’ lives pushes them into a downward spiral.
“People usually tend to put their ‘best face’ online, giving the impression that everyone else is having fun and enjoying their life, naturally leading a user to reflect that they are, perhaps, ‘missing out’ — on what exactly? Hard to tell, impossible to decide,” says Hashmi.
Multiple studies have shown how excessive use of social media can cause or aggravate existing mental health issues, even leading to suicidal behaviour. According to Hashmi, teenage girls especially, due to a variety of hormonal and cultural influences, tend to be very sensitive about body-image issues. So, when they spend a lot of time on social media and are constantly exposed to filtered, carefully curated photos of celebrities, it chips away at their self-image.
“It really makes sense, especially for teenage girls who are already going through changes and then they start comparing themselves to others,” says Shahzor Hashim, a clinical psychologist. “It’s easy to put yourself down and compare yourself to people who are apparently perfect on social media.”
Hashmi concludes that while all of these tools have revolutionised communication, business, healthcare and even politics, “they have all been engineered specifically to use technology, including behavioural engineering, to make sure you spend as much time as possible on a website or app. The more time you spend on it, the better for the app-makers, since that allows them to then ‘sell’ your time to advertisers.”
So how can we check if we are subtly being guided and affected by the social media app we use the most? “If you delete a particular social media app from your phone,” says Khalid, “after a few weeks, your social media feeds will change as they will not be getting the data from your phone anymore. We need a lot more awareness on what these social media platforms entail, how they operate, and what it is costing us,” he says.
The writer is a clinical associate psychologist and freelance journalist. She can be reached at rabeea.saleem21@gmail.com
Published in Dawn, EOS, October 31st, 2021