Exclusive

Indigenous health officials raise concern over Facebook’s suicide prevention efforts

Facebook said it has “more to do” after federal officials focused on Indigenous health expressed concerns about the “intuitiveness and appropriateness” of a safety feature meant to support suicide prevention.

Facebook debuted the feature—which allows users to flag posts where friends appear to express suicidal thoughts—to Canadian users in June 2016. After reviewing a reported post and confirming it may indicate suicidal feelings, Facebook sends the user a notification that one of their friends has tried to help, and offers access to suicide prevention resources.

According to a memo prepared last year—obtained by The Logic through an access-to-information request—Indigenous Services Canada’s First Nations and Inuit Health Branch (FNIHB) first raised issues with the feature to Facebook in June 2017, noting internally that it “may instead increase stigma around suicide and mental health.”

Government officials followed up with Facebook in September using screenshots to describe the issues and held a conference call with the company on Sept. 21, 2017.

“We regularly meet with and ask for feedback from experts in the field,” said a Facebook spokesperson. “Usually, the improvements we make to our reporting flows and resources are made as a direct result of consultations like this.”

The spokesperson acknowledged “there is more to do here” regarding some of the issues flagged by FNIHB: “We’re committed to ongoing dialogue with our stakeholders and regularly meet with Indigenous communities in Canada to ask for feedback on our content policies, including suicide prevention.”

Facebook said its tools have been built in collaboration with suicide prevention experts, and tested by people who have considered self-injury or suicide in the past.

“The Government continues to work with Indigenous leaders and organizations to develop a long-term plan to address mental health crises being faced by Indigenous peoples, which includes suicide prevention,” wrote an Indigenous Services spokesperson in an email.

But some believe the company’s partnership with government on suicide prevention deserves greater scrutiny.

“Fundamentally, Facebook has not shown the maturity or the adaptability on a whole manner of social issues that are confronting the platform, let alone letting them be in the driver’s seat for something as complex and horrific as the Indigenous youth suicide crisis,” said Charlie Angus, the member of Parliament for Timmins—James Bay and the New Democratic Party critic for Indigenous youth.

Talking Point

Elements of Facebook’s suicide reporting feature “could contribute to stigma around suicide and mental health,” according to officials, and could fail to identify some people in need of help. Facebook said it is making changes in collaboration with experts and Indigenous communities.

Officials raised concerns in the memo about the number of steps a user would need to take to flag posts as distressing. They took issue with categorizing suicide alongside problematic activity—such as threatening behaviour, pornographic images and bullying—in the reporting process. They also said the reporting tool could fail to send helpful resources if Facebook did not deem the post distressing, a risk compounded by the possibility that Facebook’s methodology may not recognize culturally specific communication. Finally, the memo noted that the user who reported the post was given the option to unfriend or block the friend who wrote it.

“In its current application, there is a risk that the feature could contribute to stigma around suicide and mental health, and dissuade individuals from using the feature,” says the memo.
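
Read together, the memo describes a flow with a single gating judgment. Here is a minimal sketch of that flow as the memo describes it; the function, field names and resource list are illustrative assumptions, not Facebook’s actual implementation:

```python
# Hypothetical model of the reporting flow described in the FNIHB memo.
# All names here are illustrative; Facebook's real pipeline is not public.

SUPPORT_RESOURCES = ["First Nations and Inuit Hope for Wellness Help Line"]

def handle_reported_post(deemed_distressing: bool) -> dict:
    """Return what each party receives after a post is reported and screened."""
    outcome = {
        "resources_sent_to_poster": [],
        # The memo's fourth concern: the reporter is offered these actions
        # against the very friend they were trying to help.
        "options_shown_to_reporter": ["unfriend", "unfollow", "block"],
    }
    if deemed_distressing:
        outcome["resources_sent_to_poster"] = list(SUPPORT_RESOURCES)
    # The memo's third concern: if screening misses the signal -- for example,
    # culturally specific phrasing the methodology does not recognize -- the
    # branch above never runs and no resources reach the person in need.
    return outcome
```

The structural point is that everything downstream hinges on one “distressing” judgment, which is exactly where FNIHB located the risk of missing culturally specific communication.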

The Logic tested the reporting feature on August 13 and found that at least some of the issues outlined by FNIHB still existed. For example, Facebook’s platform provided a set of options, placing suicide and self-harm alongside subjects such as violence, nudity, harassment, hate speech, spam and fake news; it also presented the option to unfriend, unfollow or block the reported user.

“We have begun to update the language in the reporting flow so that users now have the option to select ‘Give Feedback on this Post’, with Suicide and Self-Injury as an immediate option choice,” said a Facebook spokesperson.

Tests of the reporting feature conducted by The Logic on August 13, 2018 showed that some of the issues outlined by FNIHB were still active.

Facebook also said that, based on feedback from experts including Health Canada, it is testing a streamlined reporting process that will use pattern recognition from posts previously reported for suicide. It said that this artificial intelligence approach will make the option to report a post about “suicide or self injury” more prominent for potentially concerning posts.
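
In machine-learning terms, the approach Facebook describes is a text classifier: posts previously reported for suicide or self-injury serve as positive training examples, and a new post that scores similarly gets the reporting option surfaced prominently. Here is a minimal sketch of that idea; the toy corpus, model choice and threshold are assumptions for illustration, since Facebook has not published its system:

```python
# Illustrative classifier in the spirit of the approach described: learn from
# posts previously reported for suicide/self-injury, then promote the report
# option when a new post scores above a threshold. Toy data, not Facebook's.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1 = previously reported for suicide/self-injury, 0 = other reported content.
posts = [
    "i can't do this anymore, nobody would even notice if i was gone",
    "everything hurts and i just want it to stop for good",
    "click this link for free followers, totally not spam",
    "this page keeps sharing fake news stories",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def should_promote_report_option(post_text: str, threshold: float = 0.5) -> bool:
    """Decide whether 'suicide or self injury' should appear as an immediate,
    prominent reporting option for this post."""
    p_distress = model.predict_proba([post_text])[0][1]  # P(resembles reported posts)
    return p_distress >= threshold
```

A model trained on past reports can only recognize distress expressed the way past reports expressed it, which is the same blind spot FNIHB flagged around culturally specific communication.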

Facebook said its goal “is to support and provide resources to someone in need in a more timely manner, so we’re making reporting easier, access to support resources more immediate for the reporter, and the display of resources more prominent for the person in need.”

“To me, that these things are being flagged internally at the department, that’s a good sign,” said Angus, who is one of the country’s leading advocates for First Nations children, and is also the vice-chair of the Standing Committee on Access to Information, Privacy and Ethics that recently studied Facebook’s role in the Cambridge Analytica data breach scandal.

The issue of youth suicide prevention is of particular importance in Canada, where suicide rates for First Nations youth are five to seven times higher than for non-Indigenous youth. Rates among Inuit youth in the country are among the highest in the world, at 11 times the national average.

Two preteen girls who died last year at Wapekeka First Nation, and whose deaths were suspected to be part of a suicide pact, made posts on social media suggesting they were struggling with their mental health.

While the government raised concerns about Facebook’s suicide prevention feature, it has otherwise remained eager to work with the tech company, noting it could provide cost savings. In a separate memo prepared for the Minister of Health, written after a May 24, 2017 meeting with Facebook, FNIHB wrote that working with the social media giant “could provide an effective and low cost way of enhancing suicide prevention efforts.”

(Until August 2017, FNIHB operated under Health Canada. That month, it was brought under the newly created Indigenous Services Canada.)

This raised additional flags for Angus, who said suicide prevention programs for Indigenous populations are chronically underfunded and that the government should not be looking for “low cost benefits to deal with a humanitarian crisis.”

According to data released by Indigenous Services Canada in November 2017, spending on youth suicide prevention by the department has been relatively flat for the past nine years: $10.9 million was spent in 2016-17, down from $12.4 million in 2008-09.

“Why don’t you fund what’s already underfunded that works, and then we can talk about expanding it through social media?” asked Angus. “There’s teams of people who know this stuff, front line workers who deal with the realities, who know the cultural issues, the social issues and how to speak to youth whether they’re from Nunavut or whether they’re from Six Nations. That’s what has to drive this process, not a cool new app.”

Angus noted that the Canada Suicide Prevention Service—the country’s only nationwide text message and chat helpline for suicide prevention—suspended operations in July because of a lack of funding. In its six months of operation, it received almost 8,000 contacts through text or online chat—far more than had ever been anticipated. Angus said the service “was set up to give young people immediate response through texting. It’s a lot easier to go through than the hurdles that the Facebook platform had set up.”

“We have heard from youth that generally they would prefer to talk with Elders in times of need,” said the Assembly of First Nations in a statement. “Any efforts aimed at life promotion and mental wellness need to be supported with investments in community-based mental wellness programs and services for First Nations youth.”

The government began collaborating with Facebook on the issue in the first half of 2017, after FNIHB and Health Canada reached out to discuss partnership opportunities on public health and suicide prevention. That led to Facebook incorporating the toll-free First Nations and Inuit Hope for Wellness Help Line into its suicide prevention tools.

Facebook announced the helpline integration at a June 2017 summit in Iqaluit, co-hosted with Nunavut Suicide Prevention Partners. The company promoted its suicide prevention tools at the event and at subsequent events, including a January 2018 forum in Ottawa with We Matter, a non-profit for Indigenous youth.

Kelvin Redvers, who co-founded We Matter with his sister Tunchai, said the organization’s interactions with the social network have been “overwhelmingly positive,” and that it has been receptive to feedback about its suicide prevention tools.

The Memo

Indigenous health officials at FNIHB raised concerns to Facebook multiple times that the safety feature may increase stigma around suicide and mental health and may discourage use.

 

Government officials listed four primary issues with the feature:

 

  • In order to ‘report’ a distressing post, users must flag the post as ‘inappropriate for Facebook’.
  • The user must go through seven different steps in order to ‘flag’ the post. These different steps require the user to make several non-intuitive decisions, and may inadvertently link suicide and self-harm with threatening behaviour, pornographic images, and bullying.
  • If Facebook finds that a message does not qualify as ‘distressing,’ the user does not receive links to resources. There is a risk that Facebook’s methodologies for identifying potential risks may not capture culturally specific communication.
  • Once the post has been screened by Facebook, the user who reported the post receives an option to ‘unfriend’, ‘unfollow’, or ‘block’ the user who had posted the distressing message.

 

Source: FNIHB memo, undated (2017), obtained through an access-to-information request.

“One of the main things that was mentioned from the youth who we interacted with… is that the original issue they had with the reporting tool was the steps to go through were a bit confusing,” he said. “That’s what I even found. There were conversations about that problem with Facebook; they talked with their head office.”

Redvers said that after the January forum, he and his sister were invited to Facebook’s offices in Palo Alto, Calif. to be part of a committee of global representatives that would advise Facebook’s safety team on their mental health programs.

“It was very genuine,” he said. “I never felt we were there as token representatives. They wanted to listen. They knew there were things that they could do better.”

After the trip, Redvers noticed that some changes had been made to the reporting feature, reducing the number of steps involved to make it less confusing. He has since used the suicide reporting feature, and “found the tool was quite effective.”

He said a good next step for improving the tool, in line with suggestions from others, would be further localization: involving people with knowledge of Indigenous communities and cultures in vetting flagged posts.

A global team works 24/7 to vet posts. Once a post is deemed to warrant offering resources, however, there are some localized help options: the First Nations and Inuit Hope for Wellness Help Line offers round-the-clock crisis intervention counselling services in French, English, Cree, Ojibwa and Inuktitut.

The use of social media as a platform for suicide prevention is relatively new, such that its “risks and benefits… are still not fully understood,” according to the memo prepared for the minister of health.

“While there is research showing effectiveness and promise, there is also research suggesting that social media may be contributing to suicide epidemic, through cyberbullying, suicide pacts, method instruction and ‘media contagion effect’ (emulation of successful suicides),” reads the memo. “Such unwanted outcomes over social media have been documented in Canada’s First Nations and Inuit communities in the very recent past (e.g., Wapekeka First Nation).”

A 2017 study of U.S. teens found those who used electronic devices, including smartphones, for at least five hours a day were 70 per cent more likely to have suicidal ideation or actions than those who reported one hour of daily use. Researchers pointed to cyberbullying and posts depicting “perfect” lives as two potential causes for distress.

“Mental health experts agree that one of the best ways to prevent suicide is for those in distress to hear from people who care about them,” said a Facebook spokesperson. “Facebook has a unique role—through friendships on the site—to play in connecting people in distress with people who can support them.”

Given the early-stage nature of the research, officials recommended in the memo that the government take a “measured and systematic approach” to new technologies in suicide prevention.

 
