Facebook’s new tools to block discrimination in housing, job and credit ads will not apply outside the United States, The Logic has learned.
Two weeks ago, the social media giant said it would block features allowing advertisers to discriminate based on age and gender. However, the changes will apply only in the United States, and tests conducted by The Logic show that Facebook is currently approving ads in Canada that appear to discriminate.
The Logic created six ads to test whether it could exclude audiences based on gender and age in housing and job posts. Each ad excluded groups of a specific age and gender. In addition, The Logic was able to post an ad with the headline “Apartment For Rent: Indigenous People Need Not Apply.” The same ad was approved in June 2018; at the time, Facebook said it violated the company’s anti-discrimination policy and should not have been approved. Nine months later, it was approved in three minutes.
Facebook agreed to bar housing, employment and credit ads that target audiences based on gender and age, as part of a settlement with several advocacy groups in the U.S. The changes will not apply in Canada, The Logic has learned. The Logic created six ads that violated the platform’s anti-discrimination policies, and that housing and labour lawyers said violated Canada’s human rights codes. Facebook approved all of the ads.
The U.S. Department of Housing and Urban Development (HUD) filed a lawsuit against Facebook on Thursday, alleging the company allowed advertisers to block minorities and other protected groups from viewing real estate postings.
The charges follow a settlement reached in March between Facebook and several advocacy groups in the U.S. over apparent discrimination in housing, credit and employment ads. Facebook agreed to pay US$5 million to settle five lawsuits and vowed to introduce new tools to bar advertisers from targeting audiences based on gender and age.
The new tools will not be available in Canada or globally—they’ll only affect advertisers targeting audiences in the U.S. “We will examine extending these requirements globally in the future,” Erin Taylor, communications manager at Facebook Canada, said in an email to The Logic.
Facebook approved The Logic’s test ads excluding women and families from looking for apartments to rent, as well as men and Indigenous people seeking housing. Ads excluding women and older employees were also approved.
One test ad for a rental apartment, for example, targeted women between the ages of 25 and 40 who spoke English, excluding all others from viewing the post. Another housing ad targeted men between the ages of 21 and 35 who spoke English only; the description for the ad specified that Indigenous tenants or those with children would not be granted a viewing. A job ad for a software engineer targeted English-speaking men, ages 21 to 35.
Five of the six ads were approved in under five minutes. One job ad that targeted English-speaking men was in review for more than three hours before Facebook approved it.
Taylor said the ads violated the company’s discriminatory advertising policy. “Discriminatory advertising has no place on Facebook and we are continually strengthening our policies to prohibit advertisers from using our ads products to discriminate against people,” she added. “We take abuse of our systems incredibly seriously.”
Taylor did not, however, directly respond to whether the platform’s automated and manual system for reviewing ads should have been able to flag the posts, noting that the system doesn’t catch all discrimination. “Advertisers are responsible for ensuring that their ads do not violate laws, including laws against discrimination,” said Taylor.
Facebook could be breaking laws in Canada by not changing its ad targeting tools, said Andrew Langille, a Toronto-based labour lawyer. “There’s prohibition against the exact same behaviour in Canada that’s been outlawed in the U.S.,” said Langille. “Facebook is potentially vulnerable to individual actions by people who feel they’ve been discriminated against and they’d be vulnerable to a class action. Also, they’re vulnerable to complaints from the Canadian Human Rights Commission and Human Rights Tribunals in the individual provinces.”
Taylor’s statement on behalf of Facebook challenged the idea that targeting based on gender and age in protected ad categories is discriminatory or illegal. She compared the practice to ads placed in magazines or TV shows intended to reach certain age or gender demographics. Unlike Facebook, however, magazine and television advertisers can’t expressly block certain demographics from viewing their ads. “It’s also important to note that individual ads may be part of broader-based recruitment efforts designed to reach all ages and all backgrounds,” Taylor said. “Examining an individual ad for discriminatory practices is not effective if that ad is part of a broader campaign or ad set.”
Facebook has been under pressure for years to ban discriminatory ads. U.S. journalism outlet ProPublica reported in October 2016 that Facebook gave advertisers the option to hide housing and employment ads from people interested in specific demographics through its “Ethnic Affinity tool.” Four months later, Facebook disabled the option and removed “thousands” of categories that targeted people based on factors such as race, ethnicity, sexual orientation and religion.
In June 2018, however, The Logic was able to create 12 test ads for housing and jobs that excluded audiences based on terms including “Indigenous peoples,” [sic] “metis,” [sic] “Anishinaabe” and “Cree.” Facebook disabled the screening options hours after being contacted by The Logic.
Two months later, the company said it would remove another 5,000 targeting options and that all U.S. advertisers would have to complete a non-discrimination certification to keep advertising on the platform. The certification system has started rolling out to advertisers in Canada, too.
Facebook said last year that it was doing more to catch discriminatory advertisements by hiring more people to review ads and improving its machine learning to detect posts that violate its anti-discrimination policies. It also introduced prompts for advertisers to review the policy before they create their posts. The Logic was prompted once to voluntarily review the policy over the course of creating six ads.
“Discrimination in housing remains a real problem and disproportionately impacts marginalized groups,” said Karen Segal, staff counsel at Toronto-based Women’s Legal Education and Action Fund. “This kind of discrimination makes it harder for women, who disproportionately bear child care responsibilities, to find a safe and affordable place to live,” said Segal, noting that parents, especially single mothers, have a harder time finding rental housing in Canada than people without children. “As Canadians are increasingly turning to online sources, including social media, to find services they need, it is even more important to ensure that sources like Facebook comply with their legal obligation to respect and protect human rights, including equality rights.”
HUD is also reviewing Google’s and Twitter’s housing ad policies, the Washington Post reported Thursday.
On Saturday, Facebook CEO Mark Zuckerberg wrote an op-ed in the Washington Post calling for governments to adopt new internet regulation in four areas: harmful content, election integrity, privacy and data portability. The letter called for the regulation of political advertisements; private advertisements were not mentioned.
In the past, Facebook has maintained that it is not liable for the content that third parties post to its platform. While Canada doesn’t currently have laws to protect platforms from third-party content, under the new United States-Mexico-Canada Agreement (USMCA), internet service providers won’t be responsible for users’ content, as long as the platform is strictly a passive forum for the ads.
However, some legal experts argue that when online platforms, including Facebook, use algorithms to help facilitate discrimination or other illegal activity, they should be held responsible. Langille agrees, noting, “Facebook would also bear liability in addition to the individual poster.”