Online creators are de facto therapists for millions. It’s complicated.


Creator of online content Rayne Fisher-Quann at her home in Toronto on Aug. 18, 2022. (Hao Nguyen for The Washington Post)

Faced with explosive demand and few safeguards, creators of mental health content are defining their own ethics


Issey Moloney signed up for therapy through Britain’s National Health Service when she was just 12 years old. She was on a waiting list for four years.

In the meantime, social media helped her feel less alone, says the now 17-year-old who lives in London. She connected with people online as the pandemic isolated her from real-life friends. Eventually, she started making her own content. Now, she has 5.9 million TikTok followers — about 85 percent of them young women between the ages of 14 and 18 — and a collection of videos about friends, relationships and mental health.

Some of her clips are general, such as a short ode to the relationship between mentally ill people and pasta, while others address real diagnoses, such as “signs you might have BPD,” or borderline personality disorder. Sometimes, people ask her to address particular conditions. She tries to research each one for at least a week, checking websites and message boards and interviewing people who have the diagnosis by direct message. She adds disclaimers: “Everyone deals with [panic attacks] differently and not all of them feel the same.”

She has no official training and often talks about feelings that are to some degree universal, such as anxiety and depression. Commenters occasionally accuse her of pathologizing just “being a teenager” or encouraging self-diagnosis.

In real life, mental health information and care are sparse. In the United States, 1 in 3 counties do not have a single licensed psychologist, according to the American Psychological Association, and Americans say cost is a top barrier to seeking mental health help. On the internet, however, mental health tips are everywhere: TikTok videos with #mentalhealth in the caption have earned more than 43.9 billion views, according to the analytics company Sprout Social, and mentions of mental health on social media are increasing year by year.

The growing popularity of the subject means that creators of mental health content are filling a health-care gap. But social media apps are not designed to prioritize accurate, helpful information, critics say, just whatever content draws the biggest reaction. Young people could see their deepest struggles become fodder for advertisers and self-promoters. With no road map even for licensed professionals, mental health creators are defining their own ethics.

“I don’t want to give anyone the wrong advice,” Moloney says. “I’ve met some [followers] who’ve just started crying and saying ‘thank you’ and stuff like that. Even though it seems small, to someone else, it can have a really big impact.”

As rates of depression and anxiety spiked during the pandemic and options for accessible care dwindled, creators shared an array of content including first-person accounts of life with mental illness and videos listing symptoms of bipolar disorder. In many cases, their follower counts ballooned.


Creators and viewers alike say the content is helpful. They also acknowledge that embracing it carries risks such as misinformation and harmful self-diagnosis. Some high-profile accounts have been criticized for sharing advice not backed by most professionals. Many creators sell courses and books or enter advertising partnerships, opening the door to conflicts of interest. Much online content simply tells listeners what they want to hear, creators say, and relatively rare conditions such as narcissistic personality disorder receive outsize attention, with commenters diagnosing their least-favorite people. And because of algorithms, people who show interest in this type of content see more of it.

Sometimes, creators find themselves dealing with a flood of messages from followers or struggling to control how audiences interpret their content.

“It’s definitely strange seeing myself drawn into a commodifiable object for people to define ‘mental illness’ by, and to a certain extent for me to be eaten up by the algorithm that encourages people to go down this pipeline,” said Rayne Fisher-Quann, who openly talks about her struggles with mental illness with her 225,000 followers on TikTok. “There absolutely is a concerted effort to really capitalize on mental illness and particularly on young women’s mental illness. It’s a very marketable commodity right now.”

Although professional organizations such as the American Counseling Association issue some social media guidelines, they tend to misunderstand or ignore the demands of the creator economy, therapists said. Nonprofessionals, meanwhile, can say almost anything with few consequences. Young people cannot always tell the difference between experts and hacks, creators say.

“Even if a therapist isn’t on social media, their clients are, and those clients are impacted by what they see on social media, and they’re bringing that directly into the session,” said Sadaf Siddiqi, an Instagram creator and licensed therapist.

Training is valuable. So is experience, creators say.

Many creators are not experts, and many say they’ve previously been failed by experts.

Fisher-Quann’s inbox is full of the types of questions you’d whisper to a best friend at midnight: Do these difficult feelings mean I have depression? Does having a queer sexual experience mean I’m gay?

If the question touches on something she’s experienced, she might respond. Other times, the messages go unanswered, said the 21-year-old writer and cultural critic. People occasionally message her to say they’re contemplating suicide, and she says she directs them toward professional resources. But it hurts to know they might not receive the real-world help they need, Fisher-Quann said.

“Because of that institutional failure, I don’t feel comfortable basically telling people to institutionalize themselves,” she said. “But I’m also very critical of capitalistic platforms where people present themselves as experts and offer advice that could ultimately be very myopic.”

Deciding who counts as an expert isn’t always straightforward. Klara Kernig, a creator with 159,000 followers on Instagram, describes herself in her biography as a “people-pleasing expert.” She earned that title through experience, she said.

After dropping out of her dream PhD program against her family’s wishes, she said, Kernig started learning about codependency, trauma and “people-pleasing” from books and the internet. Now she’s a lot healthier, she said, and makes her own mental health content, including “5 things we think are nice that are people-pleasing behaviors.”

“I don’t want to discredit therapists, but I also want to say there are other ways of educating people and of having that information,” she said. “Maybe I’ll even put something out there that’s wrong, and then I hope that my community and also the therapists there point that out to me in a loving way.”

Some creators take it upon themselves to challenge content that is not supported by research. Psychology professor Inna Kanevsky of San Diego Mesa College, who is a TikTok creator with an audience of 1.1 million, frequently rebuts what she sees as irresponsible claims in videos posted by other creators. Some of the subjects of her criticism have said Kanevsky talks down to them, invalidates their experiences or misinterprets their intentions.

“It’s funny because people will say, ‘You’re being passive-aggressive,’ ” Kanevsky said. “And I’m like, ‘No, I’m being aggressive-aggressive.’ If you posted nonsense, I’m going to tell you.”

Creators control content but not its interpretation

There’s an important difference between providing therapeutic advice and making relatable content, creators maintain. But those lines can blur quickly.

In addition to making posts for her 129,000 Instagram followers, Siddiqi treats clients over video call. They often send her posts from other mental health creators to discuss during their sessions, and she helps them to assess the information and decide whether it applies.

The posts lead to good conversations and deeper insights, Siddiqi said. But she worries about where the algorithm sends people afterward and whether audiences get enough time to reflect. It’s easy for people without real-life support to misinterpret mental health content or unfairly label themselves or others, she said.

The idea of people piecing together their own mental health journeys on a monetized, algorithm-influenced app can feel scary, but critics need to pump the brakes, said Dusty Chipura, who makes TikTok videos about attention-deficit/hyperactivity disorder (ADHD) and mental health. She isn’t too worried about self-diagnosis, because totally healthy people aren’t generally the ones scrolling for information about symptoms and treatments, she said. And because health-care professionals habitually discount people’s concerns, she said, many people with real disorders never get a formal diagnosis.

“You don’t need a diagnosis of ADHD to benefit from the tips and tricks and strategies,” Chipura said.

Audiences know to consider the context and to not accept as truth every word uttered by a creator, said Nedra Glover Tawwab, a licensed therapist and Instagram creator with 1.5 million followers. As with any marketplace, the onus is on consumers to decide whether they’re buying what a particular creator is selling, she said.

Who’s responsible for evaluating mental health content?

In the world of online mental health guidance, there’s little accountability for platforms or creators if something goes wrong.

Instagram in June launched a pilot called the Well-being Creator Collective, which it says provides funding and education to about 50 U.S. creators to help them produce “responsible” content on emotional well-being and self-image. The program is guided by a committee of outside experts, the company says.

Linda Charmaraman, senior research scientist and director of the Youth, Media & Well-being Research Lab at Wellesley Centers for Women, is on that committee and said that overall, participants seem to care deeply about using their platforms for good.

TikTok is “committed to fostering a supportive environment for people who choose to share their personal wellness journeys while also removing medical misinformation and other violations of our policies,” a spokeswoman said.

“We encourage individuals to seek professional medical advice if they are in need of support,” she said in a statement.

Ideally, social media apps should be one item in a collection of mental health resources, said Jodi Miller, a researcher at Johns Hopkins University School of Education who studies the relationships among young people, technology and stress.

“Young people need evidence-based sources of information outside the internet, from parents and schools,” Miller said.

Often, those resources are unavailable. So it’s up to creators to decide what mental health advice they put stock in, Fisher-Quann said. For her, condescending health-care providers and the warped incentives of social media platforms haven’t made that easy. But she thinks she can get better — and that her followers can, too.

“It all has to come from a place of self-awareness and desire to get better. Communities can be extremely helpful for that, but they can also be extremely harmful for that,” she said.

Linda Chong in San Francisco contributed to this report.


