Business Unusual

When you find out this is true, it will make you angry

You are more connected on social media now, and your friends and family are posting more than ever. No wonder you spend so much time looking through your social feeds.

You might have so little time that you can barely comment on half of it, let alone actually read it or determine whether what is claimed in headlines (like the one above) is even true. But if it looked interesting and sounded possible, you would probably share it and let others decide for themselves.

In 2013, Facebook estimated that there were about 1500 potential stories to show you on any given day. You can’t read that much, so it needed to work out how to show the ones that would have the most impact.

Facebook and Twitter have both run tests to determine what is more likely to get you to react and to what degree emotional contagion works on social media. No surprise that posts that elicit positive and negative emotions are more effective than neutral ones.

Those that make us angry are the most powerful.

Consider that for a moment: your social feed is geared to show you content you are most likely to respond to. That means strongly positive or negative stories have the best chance of appearing in your feed and remaining there, as you tend to interact with them more.

A post that is neutral not only gets less interaction, it very likely does not even get shown.

Does this explain why your news feed is filled with lots of cute cats, inspiring videos and reports of horrible injustice that would make you want to punch someone?

The pressure to retain the interest of consumers has forced media organisations to tweak their stories and story selection to compete with the highly engaging, if not always true, content online.

The focus on only showing stories that get a reaction means you see fewer posts that contradict your world view, unless they also make you angry.

This is how echo chambers develop, which for some could significantly alter how they perceive an issue. The degree to which political camps in the US could be divided into two very separate groups has been central to arguments about how social media affected the US elections.

The Wall Street Journal created an illustration of how your Facebook feed could become an echo chamber of only those who share your point of view.

Facebook CEO Mark Zuckerberg responded to the accusations.

The good news is that showing you the posts that you and others respond to strongly is not all bad; it is also a way to add power to messages.

The woman who could not stop laughing when she bought a Chewbacca mask is a good example. The video has been seen 162 million times. While it was not trying to promote anything, the mask became a very popular item afterwards.

Truthiness

Stephen Colbert created the word truthiness in 2005 to describe information that felt true without actually needing to be true. This year he used his alter ego "Stephen Colbert" to update it to Trumpiness, reflecting the news around Donald Trump's campaign. The video, posted in July, sets out how Trump would win the election by appealing to how a section of the US electorate felt about politics and their country. The original video is not available in South Africa.

After the “shock” election result, many who opposed Trump have looked for other ways to explain his win: echo chambers, Facebook’s news feed algorithm, fake news and even sabotage by the FBI and WikiLeaks.

Is it true?

It could feel true. It has lots of truthiness, but the result is probably better explained by how much the Clinton campaign underestimated his chances of attracting enough support.

There is money to be made from generating clicks to websites from social media. There is also a powerful way to get paid-for messages in front of specific audiences using both of the planet’s most influential platforms - Facebook and Google.

BuzzFeed reported that posting false political information had become a cottage industry in Macedonia.

In recognition of the issue, both Facebook and Google have decided to prevent those found to be posting false information from using paid posts to spread it.

It is a good start, but more can be done.

In fairness to them, they did not set out to become news sources, and even though, strictly speaking, they are not, that is how many people have come to see them.

If you search for “Donald Trump is a shape-shifting lizard”, that is exactly what you will find; if you search for “Donald Trump is a Democrat”, you will find that too.

Google will not tell you which, if either, is true, but we tend to assume that because so much of what we search for on Google is true, everything we search for on Google is likely to be true.

Images are particularly powerful, as even ones known to be fake can change your memory of an event.

People tend to be poor eyewitnesses. They can confuse what they think happened with what did happen, and people lie.

It is part of the reason journalists exist. But now everyone is an eyewitness and everyone can report with as much impact as a journalist. So what is missing? Verification.

All journalists are expected to verify anything they are told. The more significant the claim, the more rigorous the verification needed. The starting point is that any claimed fact must be confirmed by at least two independent sources.

It is a rule that everyone should follow; if verification cannot be found, you should not share the story.

Towards a solution

While the bar for journalists is much higher, everyone should do their part to stop fake news, and there are some easy steps that will at least reduce it.

A search for the claim should return multiple results from a variety of credible publications. If the story is widely reported but very similar in how it is reported, it suggests a limited number of angles. Media outlets try to find something to move a story forward; contradictions between reports would highlight the potential for factual inaccuracy.

There are sites set up to fact-check stories, such as Snopes. For South African stories there is Africa Check.

Check images with a reverse image search via Google Images or TinEye.
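To make that a habit, a small script can do the legwork of opening the search for you. This is only a sketch: it assumes TinEye's public search page accepts the image address as a "url" query parameter (the way the site currently links to searches), not a documented API, and the example image address is hypothetical.

import webbrowser
from urllib.parse import quote_plus

def reverse_image_search(image_url):
    # Build a TinEye search link for the image. The "url" parameter is an
    # assumption based on TinEye's public search page, not a documented API.
    search_url = "https://tineye.com/search?url=" + quote_plus(image_url)
    # Open the results page in the default browser so you can see where
    # else, and how long ago, the image has appeared.
    webbrowser.open(search_url)

# Hypothetical example: check a suspicious photo before sharing it.
reverse_image_search("https://example.com/suspicious-photo.jpg")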

Reports with emotive language tend to rely on how you will feel about the content rather than on the substance of the arguments and the facts and sources that support them.

Don't assume that sharing a piece that may or may not be true cannot contribute to the problem. You have an influence on those around you: when you share something, others will assume you know it to be true and will treat it the same way.
