In September 2006, Facebook introduced the news feed. Users said they did not like it, and 28,000 signed a petition. Facebook accepted the feedback, although the irony that the backlash spread so effectively partly because of how the news feed worked was probably lost on some of them.
The feed remains the mainstay of how most people use the service.
Now Facebook is looking to change it again.
In 2009, a new type of profile allowed businesses to create a page, which worked like a second website hosted on Facebook. News, offers and customer care could be managed on the page, and users who liked a page would see its updates appear in their news feeds.
Fast forward to 2016, and Facebook was managing almost 2 billion users and over 60 million pages. When you opened Facebook, there were about 2,000 posts that qualified to be shown in your feed, but no one could get through all that, so Facebook used algorithms to select what it believed were the most important, relevant and popular posts.
A post by a page was shown only to some of those who liked the page. This was its "organic" reach; if people engaged with the post, Facebook would show it to more people who liked the page. If you wanted more people to see it, you needed to boost the post and pay. You could pay to show the post to people who already liked your page or to those who did not.
Ad spend is Facebook’s primary source of revenue and, in 2016, it exceeded $8 billion.
The issues arose as pages looked to game the algorithm to get more people to share content; often, those stories were not true. Companies looking to earn revenue from ad impressions on their own sites would create Facebook pages with false content targeted at groups likely to react, click and share it, often as paid ads.
BuzzFeed profiled a group in Macedonia who profited from posting made-up stories designed to appeal to supporters of Donald Trump, and investigations are underway to determine whether other governments attempted to alter the outcome of the 2016 US elections in the same way.
Facebook has tried a variety of options to reduce the issue, but it appears its recent tests splitting the news feed into a “friends” feed and a “pages” feed proved instructive. It will not split the feed, but it will reduce the prominence of page posts that are unlikely to generate conversation. Facebook has also acknowledged that scrolling through posts you don’t interact with has a negative impact on users.
How to fix it
Dealing with false information is not easy, and you typically have two options: limit who can publish, or improve everyone’s journalism skills.
The second option is great but challenging, so the first has typically been the one chosen.
It may appear to be a new problem, unique to the digital age, but it actually has a long history. It was a significant issue in the 1920s, thanks to a maverick with a new platform who was looking to create a community in a way the established players had not considered.
Unfortunately, the maverick was also a fraud.
John R Brinkley and his goat gonad transplants
John Brinkley was a quack doctor in the US whose speciality was claiming to cure impotence with goat testicle transplants. Men, and sometimes women, paid a lot of money for his cure. He did well with print ads but started doing even better when he launched his own radio station.
Radio was new, and Brinkley began using it to promote his cure. He also offered others the chance to come on the air: country music singers, comedians, fortune tellers, or just about anyone. They all proved very popular. Brinkley even hosted a medical question show, where his answers to listeners' questions invariably recommended products that could only be bought from pharmacies with which he had a sales agreement.
His station was finally shut down, but that did not stop him. He moved to the Mexican border, built a new station on the Mexican side, and connected its transmitter to his home via telephone so he could continue broadcasting. The new transmitter was one of the most powerful on the planet, but he could afford it: his message reached across the country, and clients from all walks of life searching for a cure kept handing over their money.
An Act of Congress, passed just for him, banned the telephone broadcast option, so he recorded his shows and sent the recordings across the border to be broadcast instead.
It was the middle of the Great Depression in the US, yet he was earning more than any other doctor. The American Medical Association's journal published a piece calling him out as a complete fraud, not that doctors needed to be told. Its reach was a fraction of his, but his ego drove him to sue for libel. In court, the lawyers could test his claims, and they showed every one of them to be false.
His business collapsed and he was sued in turn; three years later he was broke and dead.
A documentary about Brinkley, Nuts!, was released in 2016.
There is no connection between Brinkley and Facebook, except that when disruptive new products and platforms are created, the users are not the best regulators. The creators could be very responsible, but it is not in their interests to limit their own opportunities.
Either the users need to become more aware of what is happening and how the new service works or someone independent needs to consider measures to ensure honesty and fairness.
Ideally, both measures are needed. Consider the benefits of being able to use a vehicle, and the potential dangers if drivers did not know the rules of the road or hold a licence proving their ability. It would be chaos.
It is not so strange, then, to imagine what happens when a service like Facebook allows anyone to publish anything for users who don’t know how to determine what is true or false.
Facebook operates as a media company in supplying news (almost 50% of its users say it is their principal source of news). It also derives income in the same way as media companies and offers products like those of media companies. That means it needs to meet the same obligations as a media company, despite having started as a social network and a platform for connecting people.
Facebook is not alone with this issue. The internet has allowed anyone to become a publisher, but the most successful new platforms now carry so many posts that even the admittedly tiny proportion of false and fraudulent ones can do significant harm. YouTube and Twitter face a similar challenge.
It is possible for all of us to get basic journalism training online, or for it to be taught more widely in schools, but until that happens, Facebook’s most significant problem with fake news is us.
This article first appeared on 702 : Facebook has a problem and it is us