Wednesday 8 February 2017

Want to post a discriminatory ad? Facebook may try to stop you automatically

Following its November promise to take action, Facebook on Wednesday unveiled a suite of rules and machine-learning tools aimed at curtailing discriminatory ad-targeting practices.

The most notable of these is a new automated toolset built to identify advertisements for "housing, employment or credit opportunities" and flag any that employ the site's "multicultural affinity targeting" system. In other words, if an ad of those types asks Facebook not to deliver it to African-American, Hispanic, or Asian-American viewers, the site will attempt to automatically block the ad and display a relevant notice. Advertisers can then either remove the cultural limitations from the ad or request a manual review for approval.

Should the automated toolset recognize this type of ad but not pick up on apparent cultural targeting, advertisers will instead be directed to a three-paragraph "certification" notice, which they will have to sign. Among other things, the notice requires advertisers to pledge that they "will not use Facebook advertising to improperly discriminate." The change coincides with Facebook updating the language about discriminatory practices on its advertiser-policy pages. In an October report, ProPublica exposed previous issues in the social network's advertising platform by buying and running discriminatory ads that flew in the face of the Fair Housing Act.
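Facebook has not published the internals of this toolset, but the two-step flow described above (block sensitive ads that exclude multicultural affinity groups; otherwise require the certification pledge) can be sketched roughly as follows. This is a minimal illustration under assumed names: the `Ad` fields, category strings, and `ReviewDecision` outcomes are hypothetical, not Facebook's actual API.

```python
# Hypothetical sketch of the review flow described in the article.
# Field names, category strings, and decision values are assumptions
# made for illustration only; they are not Facebook's real system.
from dataclasses import dataclass, field
from enum import Enum, auto


class ReviewDecision(Enum):
    APPROVE = auto()                # ad runs normally
    BLOCK_WITH_NOTICE = auto()      # blocked; advertiser edits or requests manual review
    REQUIRE_CERTIFICATION = auto()  # advertiser must sign the non-discrimination pledge


SENSITIVE_CATEGORIES = {"housing", "employment", "credit"}
MULTICULTURAL_AFFINITIES = {"African-American", "Hispanic", "Asian-American"}


@dataclass
class Ad:
    category: str                                  # e.g. "housing"
    excluded_affinities: set[str] = field(default_factory=set)


def review_ad(ad: Ad) -> ReviewDecision:
    """Apply the two-step flow the article describes."""
    if ad.category not in SENSITIVE_CATEGORIES:
        return ReviewDecision.APPROVE
    # Sensitive ad that excludes a multicultural affinity group: block it.
    if ad.excluded_affinities & MULTICULTURAL_AFFINITIES:
        return ReviewDecision.BLOCK_WITH_NOTICE
    # Sensitive ad with no apparent cultural targeting: require the pledge.
    return ReviewDecision.REQUIRE_CERTIFICATION
```

In this sketch, a housing ad that excludes "Hispanic" viewers would be blocked with a notice, while a housing ad with no affinity exclusions would be routed to the certification step instead.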
