Facebook: Wall Street Journal series 'contained deliberate mischaracterizations'

The Hill 19 September, 2021 - 03:53pm

Facebook fires back at damning Wall Street Journal reports that accuse the company of being 'riddled with flaws'

msnNOW 19 September, 2021 - 07:00pm

Facebook fired back at the Wall Street Journal following the newspaper's multi-part series outlining employee concerns about a litany of issues at the social media giant, from human trafficking conducted through the site to the company turning a blind eye to the platform's effects on teenagers' mental health.

"The Facebook Files," published last week, found Facebook employees know the social media giant is "riddled with flaws."

On Saturday, Facebook responded by slamming the series as full of "deliberate mischaracterizations" in a statement penned by Nick Clegg, the company's vice president of global affairs.

"At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company," Clegg wrote.

The Journal reviewed internal company research reports, online employee discussions, and drafts of presentations made to management, revealing that the platform ignored its impact on young women, maintained a system that protects elite users from being reprimanded for breaking content rules, and more. The investigation found a number of damning instances in which researchers identified and escalated information about the platform's negative effects, but the company did not immediately act.

The report also revealed that Facebook spent 2.8 million hours, or approximately 319 years, looking for false or misleading information on its platforms in the US in 2020. Content that slipped through included material promoting gang violence, human trafficking, and drug cartels, the Journal said.

In one instance, Apple threatened to kick Facebook off its App Store following an October 2019 BBC report detailing how human traffickers were using the platform to sell victims. The new Journal investigation found that Facebook knew about the trafficking concerns before receiving pressure from Apple. In an internal memo, one researcher wrote that throughout 2018 and the first half of 2019, a team looked into "how domestic servitude manifests on our platform across its entire life cycle: recruitment, facilitation, and exploitation."

"With any research, there will be ideas for improvement that are effective to pursue and ideas where the tradeoffs against other important considerations are worse than the proposed fix," Clegg wrote. "What would be really worrisome is if Facebook didn't do this sort of research in the first place."

Clegg concluded that the company "understands the significant responsibility" that comes with operating a platform that half of the people on the planet use.

He said Facebook takes that responsibility seriously, "but we fundamentally reject this mischaracterization of our work and impugning of the company's motives."

Facebook rebuffs reports that say Instagram is damaging the mental health of teens

Mint 18 September, 2021 - 07:08pm

Facebook Inc. pushed back on reports that the company was aware of the negative impact of its products, claiming that the allegations don’t tell the whole story.

The issues of content moderation, mental health risks and misinformation are complex and defy simple policy solutions, according to a statement from Nick Clegg, Facebook’s head of global affairs, posted Saturday. He said the series of articles published by the Wall Street Journal last week is based on incomplete information about difficult subjects. 

The Journal’s reporting ignited another round of outrage in Washington, especially focused on what Facebook knew about the mental health impact that its photo-sharing platform Instagram has on teen girls. Several lawmakers have pledged to investigate the company and called on Facebook to scrap plans for an Instagram product aimed at children. 

“Facebook understands the significant responsibility that comes with operating a global platform," Clegg said. “We take it seriously, and we don’t shy away from scrutiny and criticism." 

The articles detailed how Facebook’s content moderation system takes a light touch with millions of politicians and celebrities, even when they violate the platform’s user guidelines. The reporting also revealed how human traffickers, drug cartels and political leaders take advantage of the platform’s global reach and growth in developing countries. 

The Journal series cites leaked documents about Facebook’s own internal research. Clegg said those studies are designed to “hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media." He said the Journal’s claims are based on selective quotes and don’t show the whole picture of a company trying to improve its products. 

“I wish there were easy answers to these issues, and that choices we might make wouldn’t come with difficult trade-offs," Clegg said. “That is not the world we live in."

Facebook Decries Wall Street Journal Reporting on Internal Research

Axios 18 September, 2021 - 05:32pm

The company says recent Wall Street Journal reports 'have contained deliberate mischaracterizations' of its operations. But the Journal isn't the only publication publishing damning reports about the social network lately.

Facebook VP of Global Affairs Nick Clegg has officially responded to "The Facebook Files," a series from The Wall Street Journal based on internal Facebook documents, and saying he doesn't seem impressed would be an understatement.

So far, the Journal has reported on a program that held high-profile people to different standards than ordinary Facebook users; indications that Facebook is aware of the risks Instagram poses to teenage users; a change to Facebook's algorithms that backfired; the company's struggle to handle dangerous content; and the ways anti-vaxxers have abused the platform.

"These are serious and complex issues," Clegg said in a statement released on Saturday, "and it is absolutely legitimate for us to be held to account for how we deal with them. But these stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees."

This is the crux of Clegg's objection to the Journal's reporting:

"At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company. This impugns the motives and hard work of thousands of researchers, policy experts and engineers at Facebook who strive to improve the quality of our products, and to understand their wider (positive and negative) impact. It’s a claim which could only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer."

The Journal, for its part, said in the first report for "The Facebook Files" that its investigation was based on "an extensive array of internal Facebook communications" as well as "interviews with dozens of current and former employees." It also said at least some of these documents have been submitted to Congress and the Securities and Exchange Commission.

Clegg's response wasn't limited to accusations of cherry-picking. "Facebook understands the significant responsibility that comes with operating a global platform," he said. "We take it seriously, and we don’t shy away from scrutiny and criticism. But we fundamentally reject this mischaracterization of our work and impugning of the company’s motives."

Yet The New York Times reported that Facebook executives wanted to "selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves," just one month before the social network published its first "Widely Viewed Content Report," in a purported attempt to be more transparent about its platform.

That report did reveal that Facebook had mistakenly provided researchers just half of the data it had promised to give them regarding engagement with political content in the US. (Information gathered from other countries was reportedly unaffected.) The company acknowledged this error and said it would attempt to send the full data in the next few weeks.

Facebook has also been accused of being hostile to outside researchers in recent months, first with New York University's efforts to examine political advertisements on the social network, then with a nonprofit called AlgorithmWatch that said the company issued a "thinly veiled threat" related to its research into Instagram's recommendations algorithms.

The company originally claimed that its 2019 agreement with the Federal Trade Commission required it to interfere with NYU's research, but the FTC said that claim was false. The same commission, in a recent amendment to its antitrust complaint against Facebook, called the company "a monopolist that abused its excessive market power to eliminate threats to its dominance."

"What would be really worrisome is if Facebook didn’t do this sort of research in the first place," Clegg said today. "The reason we do it is to hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media. These are often complex problems where there are no easy answers—notwithstanding the wish to reduce them to an attention-grabbing newspaper headline."

Not that many newspapers can publish attention-grabbing headlines anymore: many smaller publications shut down after losing their advertising revenue to Google and Facebook, and larger publications laid off countless writers to focus on video content, a pivot that later backfired in no small part because Facebook had inflated ad metrics for its videos.

Facebook slams Wall Street Journal reports as ‘deliberate mischaracterisations’

The Guardian 18 September, 2021 - 03:34pm

The newspaper’s work contained “deliberate mischaracterisations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees”, the former British deputy prime minister, Nick Clegg, said.

Reporting by the newspaper on what Facebook knew about the mental health impacts on teenage girls of its photo-sharing platform Instagram has led to outrage and calls for increased regulation.

The paper also reported that Facebook content moderation goes easy on politicians and celebrities even when they violate user guidelines, and said human traffickers and drug cartels take advantage of Facebook’s reach and growth in developing countries.

In a statement posted on Facebook’s corporate website on Saturday under the title “What the Wall Street Journal Got Wrong”, Clegg said the paper had not presented the whole picture on the “most difficult issues we grapple with as a company – from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens”.

He also said the reporting was based on selective quotes from internal reports designed to “hold up a mirror to ourselves and ask the difficult questions about how people interact at scale with social media”.

“These are serious and complex issues and it is absolutely legitimate for us to be held to account for how we deal with them,” Clegg wrote.

“[But] at the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company.”

Clegg said Facebook “understands the significant responsibility that comes with operating a global platform. We take it seriously, and we don’t shy away from scrutiny and criticism.

“I wish there were easy answers to these issues, and that choices we might make wouldn’t come with difficult trade-offs … [but] that is not the world we live in.”
