I am becoming aware of the effect that a lack of trust in the media, paired with a dearth of research skills, has had on people.
I’m thinking about the argument I got caught in yesterday – the subject of it doesn’t matter.
Often, pseudoscience and misinformation come packaged with a lot of very important-sounding words, and the jargon piles up to the point where fact-checking it seems like a lot of work. Which makes the ‘I encourage you to do your own research’ statements real obnoxious. If it’s phrased in a way that’s impossible to navigate, good luck.
It sucks, but you gotta.
If you don’t want to fact check individual words, that’s fine. That’s a lot to ask of someone who’s just trying to figure out whether something is true.
This is where we get into something called ‘lateral research.’ Instead of trying to untangle a claim word by word, you step sideways and check the credibility of whoever is making it and their source material.
This is your Snopes, your FactCheck.org, your Media Bias/Fact Check, your FollowTheMoney.org.
Knowing more context about what someone is saying will save you a lot of time and energy.
If you’re not sure about something, question it.
I feel like I’ve been throwing this around a LOT lately, but:
Practice SIFT (Stop, Investigate the source, Find better coverage, Trace to original context)! SIFT is based on lateral research and can be very helpful in these situations.
DON’T just share information without doing your due diligence.
whyyy the fuck does this not have more notes please rb this more often qwq
Well, I mean… probably because I posted it like an hour ago.
STOP
i have found this post and infographic and i want to share it – so before i hit reblog, i stop and check it first
INVESTIGATE THE SOURCE
zetabrarian’s blog says they are a socially progressive librarian monsterfucker, which a quick scroll through their blog seems to support. This makes them pretty cool but not necessarily the perfect source – anyone can say they are a librarian, and surely not every librarian is correct about processing information
FIND BETTER COVERAGE
if i go to a search engine (in this case google via firefox) i see that several universities, libraries from large municipalities (like Los Angeles), and the BBC all agree that this is a real method that experts in information fields recommend. I wouldn’t necessarily take any single one of these sources as 100% credible, but they are individually reasonably reliable, and taken together they indicate a high probability of factual information
TRACE TO ORIGINAL CONTEXT
A brief search reveals that the SIFT method was created by Mike Caulfield, a research scientist at the University of Washington’s Center for an Informed Public, where he studies the spread of online rumors and misinformation. That makes him an extremely good source on how to evaluate information on the internet. As the creator of the SIFT method, he has taught thousands of teachers and students how to verify claims and sources through his workshops.
The original source corroborates the above information, though there are a few notable differences. For example, under the “trace to original context” section in the Washington U. source (again, as close to the original as i could find), this step contains advice to check the date. This seems very good to include: in the fast-moving world of internet information, things become outdated or get updated very quickly, and yet first takes and outdated articles hang around and get shared for a long time.
EXTRA CREDIT
I personally find that it is important to outright search for the opposite information. For example, I put in a few searches like “Mike Caulfield discredited”, “Mike Caulfield wrong”, “SIFT method bad”, etc. I found nothing indicating this method has any problems. Interestingly, this did somehow turn up an article about news literacy on Medium, which was actually written by Mike Caulfield in April of 2017.
Bots and Russian trolls spread misinformation about vaccines on Twitter to sow division and distribute malicious content before and during the American presidential election, according to a new study. Scientists at George Washington University, in Washington DC, made the discovery while trying to improve social media communications for public health workers, researchers said. Instead, they found trolls and bots skewing online debate and upending consensus about vaccine safety. The study discovered several accounts, now known to belong to the same Russian trolls who interfered in the US election, as well as marketing and malware bots, tweeting about vaccines. Russian trolls played both sides, the researchers said, tweeting pro- and anti-vaccine content in a politically charged context. “These trolls seem to be using vaccination as a wedge issue, promoting discord in American society,” Mark Dredze, a team member and professor of computer science at Johns Hopkins, which was also involved in the study, said.
Chamath Palihapitiya, who joined Facebook in 2007 and became its vice president for user growth, said he feels “tremendous guilt” about the company he helped make. “I think we have created tools that are ripping apart the social fabric of how society works,” he told an audience at Stanford Graduate School of Business, before recommending people take a “hard break” from social media. Palihapitiya’s criticisms were aimed not only at Facebook, but the wider online ecosystem. “The short-term, dopamine-driven feedback loops we’ve created are destroying how society works,” he said, referring to online interactions driven by “hearts, likes, thumbs-up.” “No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
So how do we spot these accounts in the wild? Following are a number of traits we’ve found in our research. As you might expect, many accounts that are not bots or sockpuppets exhibit some of these traits. None of them are foolproof. But the more of these traits an account displays, the more likely it is to be a disinformation account. In our research, we’ve found it far more helpful to look for evidence of these traits in a large collection of tweets, rather than trying to come up with discrete lists of bots, sockpuppets, trolls, and regular users. It’s often these traits that are most dangerous, and it’s these traits that we can look out for when engaging information online ― and when sharing information ourselves. It is also worth highlighting that many of the traits exhibited by bots and sockpuppets are pulled directly from tactics used in online harassment.
The Disinformation Review collects examples of pro-Kremlin disinformation from all around Europe and beyond. Every week, it exposes the breadth of this campaign, showing the countries and languages targeted. We’re always looking for new partners to cooperate with us on this. The Disinformation Review is a collection of disinformation examples sent to the EEAS East StratCom Task Force from a network of over 400 journalists, civil society organisations, academics and public authorities in over 30 countries. The East StratCom Task Force provides an analysis of the trends emerging from the reports received. Opinions and judgements expressed here do not represent official EU positions.