
Political astroturfing and echo-chambers
In the previous post, we discussed the phenomenon of forming and belonging to echo-chambers. We observed two consequential phenomena accompanying echo-chambering on the Internet: polarization and segregation.
What are the most important problems for society that can arise from such grouping into echo-chambers and the antagonism among them?
Problem 1: Loss of reasoning ability
In 2015, together with a group of top-notch expert associates, I created and led a social experiment: we formed a group of human bots (astroturfers) tasked with being (a) extremely positive and (b) extremely negative about a hot Twitter #hashtag. The experiment showed that
our ability to notice fake news and astroturfing is almost completely lost when the source of information is in line with our beliefs.
Only 2% of the Twitter users conversing on the particular hashtag noticed that something was wrong with the enthusiasm shown by these new Twitter accounts (human bots, political astroturfers).
For everyone else, the sense of belonging to the same value structure was enough to openly connect with Twitter accounts that were completely fake.
[Find the study results here, and download the full report {Only in Serbian} ]
This is to say that we are extremely vulnerable when we are in a group of like-minded people; at the very least, we fail to recognize lies and fake news while inside our own echo-chamber.
Problem 2: Growing animosity towards those who think differently
Other findings, obtained while conducting public-opinion social media analytics on the internal Dialogue on Kosovo in Serbia, showed the formation of closed, tight groups that consume certain types of news. At the same time, this grouping reduced their exposure to cross-ideological information.
This isolation is further magnified by their social contacts and by individual choices about which news to click on.
As a result, the level of hate speech directed at the other ideology rises. The stronger the bonds within a group that shares one belief, the greater its hatred of the other group.
This led us to:
Problem 3: Segregated groups are very easy to manipulate
If an online community allows political bots to enter and gain strength in its arena, its collective intelligence, the most important driver of innovation and problem-solving, will be significantly damaged.
Bots and political astroturfing only make it worse.
It is easy to manipulate a closed group that shares an ideological belief.
Spin doctors adore such online communities! It only makes their job easier: it is enough for them to toss the group a bone a day to keep the antagonism growing.
That is not good news.
This not only yields a highly segregated online society, but a society ready to go to war over the most trivial topic, even fashion or cooking, simply because it does not fit their beliefs.
And what can be done about such a broken society is explained by the results of the political methodology of "divide et impera" (divide and rule).