20% of Americans believe government is injecting microchips in COVID-19 vaccines, survey finds

Health

fox8.com 19 July, 2021 - 03:46pm

The company doesn’t know some specifics about how falsehoods about Covid-19 and vaccines for the virus spread on its social network.

SAN FRANCISCO — At the start of the pandemic, a group of data scientists at Facebook held a meeting with executives to ask for resources to help measure the prevalence of misinformation about Covid-19 on the social network.

The data scientists said figuring out how many Facebook users saw false or misleading information would be complex, perhaps taking a year or more, according to two people who participated in the meeting. But they added that by putting some new hires on the project and reassigning some existing employees to it, the company could better understand how incorrect facts about the virus spread on the platform.

The executives never approved the resources, and the team was never told why, according to the people, who requested anonymity because they were not authorized to speak to reporters.

Now, more than a year later, Facebook has been caught in a firestorm about the very type of information that the data scientists were hoping to track.

The White House and other federal agencies have pressed the company to hand over data about how anti-vaccine narratives spread online, and have accused Facebook of withholding key information. President Biden on Friday accused the company of “killing people” by allowing false information to circulate widely. On Monday, he walked that back slightly, instead directing blame at people who originate falsehoods.

“Anyone listening to it is getting hurt by it,” Mr. Biden said. He said he hoped that instead of “taking it personally,” Facebook would “do something about the misinformation.”

The company has responded with statistics on how many posts containing misinformation it has removed, as well as how many Americans it has directed to factual information about the government’s pandemic response. In a blog post on Saturday, Facebook asked the Biden administration to stop “finger-pointing” and blaming Facebook after the administration missed its goal of vaccinating 70 percent of American adults by July 4.

“Facebook is not the reason this goal was missed,” Guy Rosen, Facebook’s vice president of integrity, said in the post.

But the pointed back-and-forth struck an uncomfortable chord for the company: It doesn’t actually know many specifics about how misinformation about the coronavirus and the vaccines to combat it has spread. That blind spot has reinforced concerns among misinformation researchers over Facebook’s selective release of data, and how aggressively — or not — the company has studied misinformation on its platform.

“The suggestion we haven’t put resources toward combating Covid misinformation and supporting the vaccine rollout is just not supported by the facts,” said Dani Lever, a Facebook spokeswoman. “With no standard definition for vaccine misinformation, and with both false and even true content (often shared by mainstream media outlets) potentially discouraging vaccine acceptance, we focus on the outcomes — measuring whether people who use Facebook are accepting of Covid-19 vaccines.”

Executives at Facebook, including its chief executive, Mark Zuckerberg, have said the company has been committed to removing Covid-19 misinformation since the start of the pandemic. The company said it had removed more than 18 million pieces of such misinformation in that time.

Experts who study disinformation said the number of pieces that Facebook removed was not as informative as how many were uploaded to the site, or in which groups and pages people were seeing the spread of misinformation.

“They need to open up the black box that is their content ranking and content amplification architecture. Take that black box and open it up for audit by independent researchers and government,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a nonprofit that aims to combat disinformation. “We don’t know how many Americans have been infected with misinformation.”

Mr. Ahmed’s group, using publicly available data from CrowdTangle, a Facebook-owned program, found that 12 people were responsible for 65 percent of the Covid-19 misinformation on Facebook. The White House, including Mr. Biden, has repeated that figure in the past week. Facebook says it disagrees with the characterization of the “disinformation dozen,” adding that some of their pages and accounts were removed, while others no longer post content that violates Facebook’s rules.

Renée DiResta, a disinformation researcher at Stanford’s Internet Observatory, called on Facebook to release more granular data, which would allow experts to understand how false claims about the vaccine were affecting specific communities within the country. The information, which is known as “prevalence data,” essentially looks at how widespread a narrative is, such as what percentage of people in a community on the service see it.

“The reason more granular prevalence data is needed is that false claims don’t spread among all audiences equally,” Ms. DiResta said. “In order to effectively counter specific false claims that communities are seeing, civil society organizations and researchers need a better sense of what is happening within those groups.”

Many employees within Facebook have made the same argument. Brian Boland, a former Facebook vice president in charge of partnerships strategy, told CNN on Sunday that he had argued while at the company that it should publicly share as much information as possible. When asked about the dispute with the White House over Covid misinformation, he said, “Facebook has that data.”

“They look at it,” Mr. Boland said. But he added: “Do they look at it the right way? Are they investing in the teams as fully as they should?”

Mr. Boland’s comments were widely repeated as evidence that Facebook has the requested data but is not sharing it. He did not respond to a request for comment from The New York Times, but one of the data scientists who pushed inside Facebook for deeper study of coronavirus misinformation said the problem was more about whether and how the company studied the data.

Technically, the person said, the company has data on all content that moves through its platforms. But measuring and tracking Covid misinformation first requires defining and labeling what qualifies as misinformation, something the person said the company had not dedicated resources toward.

Some at Facebook have suggested the government, or health officials, should be the ones who define misinformation. Only once that key baseline is set can data scientists begin to build out systems known as classifiers, which measure the spread of certain information.

Given the billions of individual pieces of content posted to Facebook daily, the undertaking of measuring, tracking and ultimately calculating the prevalence of misinformation would be a huge task, the person said.
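The article does not spell out what such a measurement would actually compute, but the basic idea of a "prevalence" metric is straightforward to sketch. The example below is a minimal illustration, not Facebook's method: it assumes posts have already been labeled as misinformation by some classifier or review process (the hard, never-resourced step the data scientists describe) and that per-post view counts are available.

```python
# Minimal sketch of a view-weighted prevalence metric. Everything here is
# hypothetical: the Post structure, the is_misinfo label (the output of a
# separate labeling/classification step), and the sample data.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    views: int
    is_misinfo: bool  # assumed to come from a separate labeling process

def prevalence(posts: list[Post]) -> float:
    """Share of all views that landed on content labeled as misinformation."""
    total_views = sum(p.views for p in posts)
    misinfo_views = sum(p.views for p in posts if p.is_misinfo)
    return misinfo_views / total_views if total_views else 0.0

sample = [
    Post("CDC data on vaccine safety", views=10_000, is_misinfo=False),
    Post("the vaccine contains a microchip", views=500, is_misinfo=True),
]
print(f"Estimated prevalence: {prevalence(sample):.2%}")  # ~4.76%
```

In practice the labeling step is the bottleneck: it would have to be applied, by people or classifiers, to a representative sample of the billions of daily posts before a number like this means anything.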

The meeting held at the start of the pandemic was not the only time Facebook had internal discussions about how to track misinformation.

Members of Facebook’s communications team raised the question of prevalence as well, telling executives last summer and fall that such data would be useful for rebutting journalists who used CrowdTangle to write articles about the spread of anti-vaccine misinformation, according to a Facebook employee involved in those discussions.

After the 2016 presidential election, Mr. Zuckerberg sought a similar statistic on how much “fake news” Americans had seen leading up to it, a member of Facebook’s communications team said. One week after the vote, Mr. Zuckerberg published a blog post saying false news had amounted to “less than 1 percent,” but the company did not clarify that estimate or give more details despite being pressed by reporters.

Months later, Adam Mosseri, a Facebook executive who was then the head of News Feed, said part of the problem was that “fake news means different things to different people.”

51 percent of unvaccinated individuals think the COVID-19 vaccine contains a microchip

Yahoo News 19 July, 2021 - 08:00pm

When respondents were asked how likely they thought it to be true that "the U.S. government is using [the vaccine] to microchip the population," 20 percent of U.S. adults said they thought it "definitely/probably true" and 14 percent weren't sure. 66 percent dismissed the claim as "definitely/probably false." Notably, when broken down by vaccination status, 51 percent of "vaccine rejectors" believed the microchip theory, compared with just 9 percent of those who are fully vaccinated.

More broadly, 85 percent of those who don't want to get vaccinated said they believed the "threat of the coronavirus was exaggerated for political reasons."

The Economist and YouGov surveyed 1,500 people between July 10 and 13, 2021. Results have a margin of error of approximately 3 percentage points. See more results at YouGov.
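For context, a roughly 3-point margin is consistent with the textbook margin-of-error formula for a sample of 1,500. The sketch below shows that calculation; the 95 percent confidence level and worst-case proportion are standard assumptions, not details reported by the pollsters.

```python
# Standard margin-of-error calculation for a simple random sample of 1,500.
# The pollsters' exact methodology (weighting, design effects) is not given
# in the article, which is why the published figure can differ slightly.
import math

n = 1500   # respondents, per the article
p = 0.5    # worst-case proportion, maximizes the error bound
z = 1.96   # z-score for a 95% confidence level (assumed)

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"Naive 95% margin of error: {margin_of_error:.1%}")  # ~2.5%
```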

Believers of the conspiracy theory say the microchips are being used to track people.

We have seen an 'infodemic' alongside the pandemic - a flood of viral misinformation that has impeded efforts to fight the virus.

Psaki said data suggests misinformation - including the conspiracy theory that microchips are in the vaccine - is what's behind the ongoing hesitancy.

President Joe Biden and his administration are trying to soften their expressions of frustration with social media platforms over the spread of coronavirus misinformation online.

While Tucker Carlson has described proof of vaccination measures as "medical Jim Crow," the network's parent corporation is rolling out its own.

Here is a roundup of some of the latest scientific studies on the novel coronavirus and efforts to find treatments and vaccines for COVID-19, the illness caused by the virus. No traces of mRNA vaccines end up in mothers' breast milk, a small study suggests. The COVID-19 vaccines from Pfizer/BioNTech and Moderna deliver a synthetic version of messenger RNA molecules, designed to instruct cells to build replicas of the coronavirus spike protein.

Debbie Wasserman Schultz accused Fox News and Florida Governor Ron DeSantis of spreading misinformation about COVID-19 and the vaccines.

President Joe Biden's chief medical adviser, Anthony Fauci, said he believes it is "reasonable" for students over the age of 2 to wear masks this coming school year.

Vaccinated Californians shouldn't have to suffer because of holdouts who are fueling the Delta variant surge.

Data still suggests that being vaccinated heavily reduces someone’s chances of being hospitalized or dying if infected.

Dr. Scott Gottlieb, former head of the Food and Drug Administration, is urging more people to get vaccinated.

The hosts of “The View” once again kicked off a new week discussing COVID-19 vaccination rates and why they believe the rates have been consistently declining. Though most of the panel argued that Fox News and other media outlets are to blame, host Meghan McCain defended the channel and its hosts. “I wish we could all come together on it, but I was watching ‘Fox & Friends’ this morning. Steve Doocy said get the vaccine,” McCain noted.

A new public health advisory asks you to consider: “Only 40% of Missourians are fully vaccinated... assume that 1 in 2 people in any crowd or gathering may be unvaccinated.”

The recent COVID outbreak among the New York Yankees baseball team underscores the fact that the coronavirus vaccines are not 100% effective but are still crucial to preventing hospitalization and death.

Alarming number of Americans think vaccines contain microchips to control people

The Hill 19 July, 2021 - 08:00pm

On Sunday, U.S. Surgeon General Vivek Murthy criticized social media companies for not doing enough to curb vaccine misinformation on their platforms.

Opinion | Covid-19, vaccine hesitancy and the misinformation conundrum

The Washington Post 19 July, 2021 - 08:00pm

Where are Americans getting these kooky ideas? Politicians and pundits have been quick to blame social media platforms.

That’s understandable. Misinformation has flourished on Facebook and other sites for many years. Unlike truths, lies are unconstrained by reality, which means they can be crafted to be maximally interesting, sexy, terrifying. In other words, they’re optimized to generate traffic, which happens to be good for tech companies’ bottom lines. “Fake news” — whether fashioned by enterprising Macedonian teenagers, malicious state actors, U.S. political groups, snake-oil salesmen or your standard-issue tinfoil-hatters — drove tons of engagement on these sites in the lead-up to the 2016 election and has continued to do so.

Whether out of principle or financial self-interest, tech executives initially said they weren’t in the business of taking down content simply because it was false. (This included, infamously, Holocaust-denial claims.) Intense blowback followed, along with pressure for tech companies to recognize how their tools were being exploited to undermine democracy, stoke violence and generally poison people’s brains; the firms have since ramped up fact-checking and content moderation.

During the pandemic, Facebook has removed “over 18 million instances of COVID-19 misinformation” and made less visible “more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners,” the company wrote in a blog post over the weekend. This was in response to President Biden’s comments Friday that social media platforms were “killing people” by allowing vaccine misinformation to flourish.

On the one hand, yes, social media companies absolutely still can and must do more to scrub misinformation from their platforms. Case in point: A recent report on the “Disinformation Dozen” estimated that 12 accounts are responsible for 65 percent of anti-vaccine content on Facebook and Twitter. Their claims include that vaccines have killed more people than covid and are a conspiracy to “wipe out” Black people. All 12 remain active on at least Facebook or Twitter.

But on the other hand: Actually doing more to stamp out this misinformation is challenging. Not because these firms lack the workers or technology to identify problematic content; the real obstacle is political.

Politicians of both parties hate Big Tech’s approach to content moderation and think it should change — but they propose changes in diametrically opposite directions.

Democrats are mad that the companies suppress too little speech, allowing conspiracy theories to proliferate. Republicans are mad that these companies are suppressing too much speech, since often it’s right-wing content that gets (rightly) flagged as fake. Absent some political consensus on which way these companies are at fault, or regulation that tells Facebook and other platforms what content is acceptable (a move that would likely face First Amendment challenges), the firms will always be nervous about censoring too aggressively.

The fact that a lot of this same disinformation is being disseminated on prime-time cable also makes it politically harder for tech companies to justify taking it down.

It’s not merely a few no-name Facebook accounts promoting anti-vaccine nonsense; it’s also the most influential media personalities on TV. Fox News host Tucker Carlson, for instance, recently gave an entire monologue linking the government’s coronavirus vaccination effort to historical forced sterilization campaigns. His show then posted the clip on Facebook, which flagged it with a generic note about how coronavirus vaccines have been tested for safety.

Should Carlson’s insinuations have been removed entirely? That’s risky. As conspiracy-theorizing becomes more mainstream, and gobbles up an entire political party and the media ecosystem that sustains it, policing those conspiracy theories and the conservative leaders who promote them appears more politically motivated. Not coincidentally, the White House has reserved its harshest criticism about anti-vaccine content for social media companies rather than conservative news organizations parroting similar messages. Already despised by both parties, Big Tech is a safer target.

Now, one could argue that these tech firms should step up and impose the moderation policies they think are right, political (and perhaps financial) fallout be damned. Perhaps these companies could more forcefully rebut Republicans’ claims of politically motivated censorship and “shadow-banning” by pointing out that right-wing content still dominates the most popular posts every day on Facebook.

But if even White House officials appear tentative about picking fights with the right-wing industrial complex, it’s not surprising that tech firms would follow suit.

20% of Americans believe microchips are inside COVID-19 vaccines - study

The Jerusalem Post 19 July, 2021 - 08:00pm

Who is spreading most of the COVID vaccine misinformation on social media

The Jerusalem Post 19 July, 2021 - 08:00pm

12 anti-vaxxers responsible for most disinformation online, study says

Business Insider 19 July, 2021 - 08:00pm

The CCDH analyzed 812,000 anti-vaccine posts shared on Facebook and Twitter between February 1 and March 16, 2021. It found that 65 percent of this content could be attributed to what is being dubbed the "disinformation dozen."

On Facebook alone, the CCDH found that those 12 people were responsible for 73 percent of the anti-vaccine content on the platform.

The disinformation dozen includes a bodybuilder, a wellness blogger, and a religious zealot, The Guardian reported.

Most notably, it also includes Robert F. Kennedy Jr., a nephew of former President John F. Kennedy and a prominent anti-vaxxer who has spread disinformation linking vaccines to autism and the COVID-19 shots to 5G phone technology.

His account was removed by Instagram, the CCDH said, but he remains active on Facebook and Twitter.

Fewer than half of the members of the disinformation dozen — Kennedy, Sherri Tenpenny, Rizza Islam, Sayer Ji, and Kelly Brogan — have had any of their social media accounts removed or partially removed, the study said.

The CCDH is now calling on Facebook, Instagram, Twitter, and YouTube to de-platform every member of the disinformation dozen with haste.

"The most effective and efficient way to stop the dissemination of harmful information is to de-platform the most highly visible repeat offenders, who we term the disinformation dozen," the study said. "This should also include the organizations these individuals control or fund, as well as any backup accounts they have established to evade removal."
