Facebook’s phony claim that “you’re in charge.”

It simply isn't true that an algorithmic filter can be designed to remove the designers from the equation. That assertion melts on contact, and a New York Times reporter who receives such a claim from a Facebook engineer should somehow signal to us that he knows how bogus it is.

27 Oct 2014 12:04 pm

In today’s New York Times, media reporter Ravi Somaiya visits with Facebook to talk about the company’s growing influence over the news industry, especially with News Feed’s dominance on mobile devices. Greg Marra, a 26-year-old Facebook engineer who heads the team that writes the code for News Feed, was interviewed. Marra is “fast becoming one of the most influential people in the news business,” Somaiya writes.

Mr. Marra said he did not think too much about his impact on journalism.

“We try to explicitly view ourselves as not editors,” he said. “We don’t want to have editorial judgment over the content that’s in your feed. You’ve made your friends, you’ve connected to the pages that you want to connect to and you’re the best decider for the things that you care about.”

In Facebook’s work on its users’ news feeds, Mr. Marra said, “we’re saying, ‘We think that of all the stuff you’ve connected yourself to, this is the stuff you’d be most interested in reading.’ ”

It’s not us exercising judgment, it’s you. We’re not the editors, you are. If this is what Facebook is saying — and I think it’s a fair summary of Marra’s comments to the New York Times — the statement is a lie.

I say a lie, not just an untruth, because anyone who works day-to-day on the code for News Feed knows how much judgment goes into it. It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. It’s an assertion that melts on contact. No one smart enough to work at Facebook could believe it. And I’m not sure why it’s sitting there unchallenged in a New York Times story. For that doesn’t even rise to the level of “he said, she said.” It’s just: he said, poof!

Now, if Greg Marra and his team want to make the point that in perfecting their algorithm they’re not trying to pick the day’s most important stories and feature them in the News Feed, the way an old-fashioned front page or home page editor would, and so in that sense they are not really “editors” and don’t think in journalistic terms, fine, okay, that’s a defensible point. But don’t try to suggest that the power has thereby shifted to the users, and the designers are just channeling your choices. (If I’m the editor of my News Feed, where are my controls?)

A more plausible description would go something like this:

The algorithm isn’t picking stories the way a home page or front page editor would. It’s not mimicking the trained judgment of experienced journalists. Instead, it’s processing a great variety of signals from users and recommending stories based on Facebook’s overriding decision rule for the design of an editorial filter: maximizing time on site, minimizing the effort required to “get” a constant flow of personal and public news. The end-in-view isn’t an informed public or an entertained audience but a user base in constant contact with Facebook. As programmers we have to use our judgment, and a rigorous testing regime, to make that happen. We think it results in a satisfying experience.
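To see why even that modest description involves editorial judgment, consider what a toy version of such a filter has to contain. The sketch below is hypothetical; the signal names, weights, and objective are my assumptions, not Facebook’s code. The point is that someone has to decide which signals count and how much they count, and in this sketch those decisions are tuned toward time on site.

```python
# Hypothetical sketch of a feed-ranking step. Not Facebook's code: the signal
# names, weights, and objective are illustrative assumptions. Every one of
# them is a decision made by the people who design the filter.

from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    affinity: float         # how often this user interacts with the source
    predicted_dwell: float  # predicted seconds the user will spend on the post
    predicted_click: float  # predicted probability of a click or a like
    age_hours: float        # how old the post is

# The weights encode the decision rule. Here, signals that predict time on
# site dominate; freshness barely matters. None of this comes from the user.
WEIGHTS = {
    "affinity": 1.0,
    "predicted_dwell": 3.0,
    "predicted_click": 2.0,
    "age_penalty": 0.1,
}

def score(c: Candidate) -> float:
    """Collapse the signals into one number under the designers' objective."""
    return (WEIGHTS["affinity"] * c.affinity
            + WEIGHTS["predicted_dwell"] * c.predicted_dwell
            + WEIGHTS["predicted_click"] * c.predicted_click
            - WEIGHTS["age_penalty"] * c.age_hours)

def rank_feed(candidates: list[Candidate], limit: int = 20) -> list[Candidate]:
    """Return the posts the filter 'thinks you'd be most interested in.'"""
    return sorted(candidates, key=score, reverse=True)[:limit]
```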

That would be a more truthful way of putting it. But it doesn’t sound as good as “you’re in charge, treasured user.” And here is where journalists have to do their job better. It’s not just calling out BS statements like “you’re the best decider.” It’s recognizing that Facebook has chosen to go with “thin” legitimacy as its operating style, in contrast with “thicker” forms. (For more on this distinction go here.)

By “thin” I mean Facebook is operating within the law. The users are not completely powerless or kept wholly in the dark. They have to check the box on Facebook’s terms of service and that provides some cover. The company has pages like this one on data use that at least gesture toward some transparency. But as this summer’s controversy over the “mood manipulation” study showed, Facebook experiments on people without them knowing about it. That’s thin.

Jeff Hancock, the Cornell researcher who worked on the mood manipulation study, said last week that one of his big discoveries was that most users don’t grasp the basic fact that the Facebook algorithm is a filter. They think it’s just an “objective window into their social world.” That’s thin too. (See my post about Hancock and his choices, Why do they give us tenure?) The company doesn’t level with users about the intensity of its drive to maximize time on site. Thin.

Thick legitimacy is where informed consent, active choice and clear communication prevail between a platform and its public, the coders and the users. Facebook simply does not operate that way. Many would argue that it can’t operate with thick legitimacy and run a successful business at scale. Exactly! As I said, the business model incorporates “thin” legitimacy as the normal operating style. For better or for worse, that’s how Facebook works. Reporters should know that, and learn how to handle attempts by Facebook speakers to evade this basic fact, especially from “one of the most influential people in the news business.”

15 Comments

I did not understand Ello when I first heard about it. And then when they became a PBC, I wondered how they would become a workable, bankable property.

But I think privacy and transparency of operation and purpose really will become more and more important to a growing group of people, and therein lies the opportunity for active social networkers to engage in a healthier environment.

Jonathan Stray says:

You won’t be surprised to hear that I agree with you. The underlying message of my computational journalism course is that information filtering algorithms are (inherently, necessarily) editorial; you can’t create one without making value-based choices.

Here’s the framework we use to talk about the “filter design problem” in a more unified way:
https://www.scribd.com/doc/241383785/Algorithmic-Filtering-Computational-Journalism-week-4#page=52

Contrast this with a technical description of the problem, which asks for a piece of code with certain inputs and outputs: https://www.scribd.com/doc/241383785/Algorithmic-Filtering-Computational-Journalism-week-4#page=41

There is a huge gap between these two descriptions of the problem, in terms of language, culture, and concept. It’s actually extraordinarily hard to bridge the two:
http://www.niemanlab.org/2012/06/theres-no-such-thing-as-an-objective-filter/

But the language that Marra is using suggests to me that his team simply does not think in terms of the non-technical framework. He’s right that the classic human editor leaves a lot to be desired, but he hasn’t asked carefully what we’d want out of a news algorithm.

(If I’m the editor of my News Feed, where are my controls?)

Friend, Like, Comment, Unfollow, Hide are your controls.

Jonathan Stray says:

Yes, those are definitely controls. The problem is that they still leave the user wondering whether or not they’ll see any particular post, or type of post.

Perhaps the right way to think about this is transparency of filtering. Does the user know and understand why they do and do not see any particular item?

Current machine learning algos are really bad at this; in fact “explainable” ML is an active area of research.
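As a rough, hypothetical illustration of what transparency of filtering could look like (assuming a simple linear scorer with invented feature names, which real feed models are not), even reporting each feature’s contribution to one item’s score would start to answer “why am I seeing this?”:

```python
# Hypothetical sketch: explaining one ranking decision from a simple linear
# scorer. Feature names and weights are made up; real feed models are far
# more complex, which is why "explainable" ML remains an open problem.

weights = {"friend_affinity": 1.0, "predicted_dwell": 3.0, "popularity": 2.0}

def explain(item_features: dict[str, float]) -> list[tuple[str, float]]:
    """Return each feature's contribution to the item's score, largest first."""
    contributions = {name: weights.get(name, 0.0) * value
                     for name, value in item_features.items()}
    return sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)

# "Why am I seeing this post?"
for feature, contribution in explain(
        {"friend_affinity": 0.2, "predicted_dwell": 4.5, "popularity": 1.8}):
    print(f"{feature}: {contribution:+.1f}")
# Shows the post surfaced mostly because the model predicts you'll linger on
# it, not because it came from someone you interact with.
```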

I would say they are not controls. Not really.

They are user-facing inputs to the algorithm, and they have an effect on the outputs, but that happens through a black box process that converts gestures such as liking something into tendencies in the News Feed (less of, more of) without displaying enough of the logic to make that process intelligible.

If you said, “making lists is a control,” I might agree with that.
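To put the distinction in code: in the hypothetical sketch below (invented names and update rule, not anything Facebook has published), a Like is an input that nudges a hidden weight by an amount the user never sees, while a user-made list is closer to a control, because its effect is direct and checkable.

```python
# Hypothetical sketch of "inputs" vs. "controls." Not Facebook's logic;
# the weights and update rule are invented to make the distinction concrete.

source_weights = {"cousin_anna": 0.4, "local_news_page": 0.7}

def register_like(source: str, bump: float = 0.05) -> None:
    """An INPUT: a Like nudges a hidden weight by an amount the user never
    sees, and promises nothing about whether any particular post appears."""
    source_weights[source] = source_weights.get(source, 0.0) + bump

def list_feed(posts: list[dict], list_members: set[str]) -> list[dict]:
    """Closer to a CONTROL: only posts from the list's members appear,
    and the user can verify that outcome directly."""
    return [p for p in posts if p["source"] in list_members]
```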

Jonathan Stray says:

I like the distinction of “controls” vs. “inputs.”

From the help page: Controlling What You See in News Feed.

“Top Stories shows posts from friends, Pages and groups you interact with the most, as well as other popular stories, at the top of your News Feed.” (My italics.)

https://www.facebook.com/help/335291769884272/

The “other popular stories” part? That’s Facebook as editor, using its judgment (popular, so you will like it too, right?).

What I meant by a black box is that we aren’t entitled to know what the mix is between “sources you interact with most” (you, you, you with your interests) and “other popular stories” (us, us, us, with our time-on-site imperative) because that’s a trade secret.

These are not controls. They are user inputs. Facebook retains the control. Thus, you can set your News Feed to “most recent.” Facebook will receive that input and factor it in. But who retains control?

“Note: Your News Feed will eventually return to the Top Stories view.”

Exactly. Because that is one way Facebook expresses its maximize-time-on-site imperative.

Is there any news site that lets you pick what kind of news you do and do not want to see? Even just having a “fewer like this” button next to each story would help.

Circa does exactly that. You can filter out entire topics; soon you will be able to filter at an even more granular level. While Facebook allows you to follow a thing (which we will too, in a future version), Circa allows you to follow the evolution of a news story as it changes.

The difference is there is no black box with Circa: you choose what to follow and what you never want to appear. Simple as that.

(Celebrities & crimes, that is.)

Thanks Anthony.

Also, on desktop, on every story there’s a way to get a survey about the stuff you see in your news feed. Stop complaining and invest some time in it.

“On desktop” leaves about 50% of Fb visits out of your suggestion. An extra click to take a survey leaves about 94% of the visits out of your suggestion.

The point of Prof Rosen’s piece is not to complain, but rather to point out it’s a lie to say there is no editorial process going into story selection on Fb.

This might be interesting to you:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2246486
It studies the legal aspects of “speech engines,” i.e., what big tech companies become once they start ranking the content we see.

I use a little add-on app called Facebook Purity and edit out all the “news” Facebook thinks I want to see. Unless a friend posts something, I just don’t see it, and that’s the way I want it.

Facebook’s editorial signals can be cooked (journalists can be bought) with tech solutions (paid traffic, etc.). Facebook’s quality signals are all based on getting you to spend as much time on the site as possible so they can sell you advertising, just as Google wants you to click on the adverts it charges high CPC for, so those are given preference when you type via instant search.

Wallpaper advertising, top-down usability which is ad-centric, not user-centric.

And big PR budgets telling us it’s all about stuff like friendships and being better citizens!

Glyn.