Fake News May Be on the Rise Because People Are Becoming Less Likely to Fact-Check News Stories

A study on political microtargeting found that, on Facebook, people view an everyday post as being just as trustworthy as an advertisement sponsored by a real company. The study also found that consumers respond no differently to political ads after receiving education on advertising tactics. This education involves teaching viewers about tactics such as fake ads, manipulation, and, in particular, a fairly new strategy: political microtargeting.

Personal Advertisements Are Taking Over the Internet Whether People Like It or Not

Internet users may notice certain ads that seem very personal to their hobbies and interests; this practice is known as microtargeting. Microtargeting usually involves examining a specific person's background information and web history, which allows companies to cater ads to that person's likes. During election season, however, these ads are more likely to be aimed at pushing a specific political party's or candidate's agenda.

In the political world, advertisers may look at whether a person reads Fox News or NBC News. There are thousands of politics-related Facebook pages, so viewing what a person likes or frequently comments on can help advertisers get a sense of who that person may vote for or which policies they may support.

Some people did not want to be shown political advertisements tailored to their interests. Because companies may collect very personal information, such as race, religion, and party identification, some viewed the practice as an invasion of privacy. Yet there is no way to stop these companies from using information that is already out there.

People Are Very Trusting of News Stories, Which Could Explain the Rise of Fake News

As mentioned earlier, people find a regular Facebook post, which could be created by anyone and made to look like an ad, just as trustworthy as a post from a political company or campaign group. This finding is alarming because it suggests that people on the internet are extremely likely to believe almost anything they read.

Craig Silverman, the editor of BuzzFeed Canada, conducted a mini experiment. He compared the total number of engagements on the most popular fake election-related news stories with those on the most popular real election-related news stories. In this case, engagement meant likes, shares, and comments.

The experiment found that engagement with fake news stories was higher than engagement with real stories, especially as the election drew to a close. These fake stories included a report that the Pope had endorsed Donald Trump and another about the death of an FBI agent involved in the Hillary Clinton email investigation.

Silverman found this information extremely surprising. While Facebook founder Mark Zuckerberg claims that 99.9% of the information and news on Facebook is "solid," there is an abundance of fake news out there. Silverman stated that there is no clear way to tell whether fake news played a role in Donald Trump's win, but he believes it definitely had some sort of impact.

If training on advertising tactics and manipulation has virtually no effect on how consumers view ads, it may be hard to stop the growing popularity of fake news and fake news websites. Since advertising training proved unsuccessful, the only thing consumers can do to stop the spread of fake news is to fact-check what they read. With the "on-the-go" lifestyle that Americans live, however, researching the news has become a neglected practice.

Researchers Wanted to Find Out How People Perceived Political Ads and How Those Opinions Would Impact Their Likelihood of Believing or Sharing the Posts

The researchers in the microtargeting study hypothesized that, when people viewed political posts, a regular Facebook post would activate the lowest level of persuasion knowledge, a personalized Facebook ad a higher level, and a personalized Facebook ad paired with advertising training the highest level. Persuasion knowledge was defined as prior knowledge about advertising strategies.

Regular Facebook posts were anything a friend might have liked or reposted. Personalized Facebook ads were sponsored and paid for by companies or political groups, and these posts indicate somewhere that they are sponsored. Some research has found that seeing a sponsored label helps consumers realize a post is an ad, while other research refutes this claim.

The final group viewed personalized ads after a basic training session. The session taught participants about microtargeting and targeted advertising, that is, how companies single out specific people and customize ads to their likes and wants.

The second hypothesis predicted that when persuasion knowledge is activated, people will be less likely to view a post as trustworthy. This is because noticing a sponsorship label may lead people to conclude that the ad is meant to be persuasive, which makes them more critical of it.

This hypothesis also predicted that people who view personalized ads will be less likely to engage in electronic word of mouth, defined as sharing, posting, and commenting, because they will believe the ad is biased on account of being sponsored by a company.

Most People Were Likely to Believe but Not Share Political Posts When They Could Decipher the Post's Message

The study found no significant difference between groups in their ability to recognize whether a post was an ad. Even when people were taught about microtargeting, every group was mostly able to recognize the posts as advertisements, which suggests that advertising training may be unhelpful.

Participants in each condition viewed the posts as having virtually the same level of trustworthiness, and all conditions were unlikely to engage in electronic word of mouth. When asked whether they would like the Facebook post, most responded that they were unlikely to do so.

