About that Facebook trust ranking

To the complete horror and amusement of those watching the grand experiment Facebook is doing on everyone, this week we found out the company is assigning a reputation score to users that ranks their trustworthiness. The perversity of the situation was lost on no one. (And no, it's not the kind of perversity we like; this is Facebook, after all, the anathema to human sexual expression.)

The company told The Washington Post on Tuesday that its ranking system is a new automated tool to aid its fight against "fake news." This aspect of scoring people relies on how a person uses the "fake news" reporting tool.

As in, the people using the flagging system Facebook announced in a December 2016 post.

Facebook assured the public in April 2017 that, thanks to its fabulous new flagging system, "overall that false news has decreased on Facebook," but it provided no proof, because "it's hard for us to measure because we can't read everything that gets posted." Then, in September 2017, Facebook told the press the "disputed" system worked and that there was data to prove it; the problem was, Facebook's fact-checkers told the same outlets they had no idea whether it worked and that it might be making the problem worse. By December 2017, Facebook had killed the reporting tool's "disputed" flag, saying it would show people Related Articles instead.

[Image: Deleting the Facebook app from a smartphone]

As many predicted, it looks like users were being naughty with the tool. Shocking, I know.

Anyway, according to what Facebook told The Washington Post this week, "A user's trustworthiness score isn't meant to be an absolute indicator of a person's credibility ... nor is there a single unified reputation score that users are assigned."

Rather, the score is one measurement among thousands of new behavioral clues that Facebook now takes into account as it seeks to understand risk. Facebook is also monitoring which users have a propensity to flag content published by others as problematic and which publishers are considered trustworthy by users.

Trustworthy publishers indeed. Cool, so now they'll know that all Trump supporters think Breitbart and Fox are trustworthy news sources. But the devil is in the details — and the data. "It is unclear what other criteria Facebook measures to determine a user's score," WaPo wrote, "whether all users have a score and in what ways the scores are used."

After everyone freaked out about the report and about 10 million comparisons to China and Black Mirror were made across the media, Facebook went on damage control. "The idea that we have a centralized 'reputation' score for people that use Facebook is just plain wrong," a Facebook spokesperson told Gizmodo. "What we're actually doing: We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system."

Gizmodo related that a large number of "trustworthy" users flagging a post might push that post higher in the queue for review by fact-checkers, but according to a spokesperson, that's about it. Ah yes, the same fact-checkers who've been throwing up their hands in frustration and disgust, citing Facebook's own refusal to work with them. The ones who freaked out about the "disputed" flag not working.

[Image: Muslims rally against Facebook]

So Facebook is trying to better automate which news items are a priority for fact-checkers by attempting to undermine users who are trying to game the system — while still offloading the problem of being a disseminator of fake news onto users and fact-checkers, of course. The actual problem is that anyone trusts Facebook as a place to get their news in the first place.

It's a fun recursive loop if you're a nihilist.

So yeah: The company with the reputation for being the least trustworthy is rating the trustworthiness of its users. We'd be fools to think it hasn't been doing this all along in other areas. Some animals are more equal than others. The thing is, Facebook long ago decided who was more trustworthy: its real customers, its advertisers. It only pretended you'd be more trustworthy if you gave it your ID. I'll wager that Facebook's secret trust-ranking sauce lies in who you're connected to, what neighborhood you live in, and what kind of job you have (sorry, sex workers). That is, if it isn't secretly doing exactly that already, to users and to people's shadow profiles.

According to the article revealing the trust rankings, Facebook isn't being transparent about its reputation scoring because transparency would let users game it. And so Facebook tries to blunt the horror evoked by its admission of reputation rating by feigning amazement that people would game its systems. "I like to make the joke that, if people only reported things that were false, this job would be so easy!" Facebook's rep told The Washington Post. "People often report things that they just disagree with." And as if this weren't about catering to neo-fascist conservatives: NYU just revealed that Facebook's biggest political ad spenders are Trump campaign toadies.

Facebook's highest priority, it seems, is reputation repair and its own grudge against transparency. That's why Facebook has failed, dramatically, as the steward of community safety it pretends to be, and survives, grossly, as the world's top tool for doing harm by strong-arming people out of their data.

So, will it work? The new effort to fight fake news by reverse-gaming the bad actors who've been gaming Facebook since its inception?

I'm sure we'll never know.

Images: NurPhoto via Getty (protesters); Getty (phone)



via Engadget RSS Feed https://ift.tt/2BLxBKp
