
How much can social media influence elections?

Germany is holding parliamentary elections on February 23, 2025. For multi-billionaire and Trump confidant Elon Musk, it is clear that the AfD, a party classified as partly right-wing extremist by the German Federal Office for the Protection of the Constitution, must win those elections. Only the AfD can save Germany, Musk wrote on his platform X.

He also offered AfD leader Alice Weidel the opportunity to hold a discussion on the X platform. The AfD is considered the most active party on German social media, especially on the Chinese social media platform TikTok. Videos with AfD positions are viewed on TikTok by hundreds of thousands of people.

According to political and communication expert Johannes Hillje, each AfD video was viewed an average of more than 430,000 times in 2022 and 2023. For comparison: in second place are the videos of the conservative parliamentary group CDU/CSU – with an average of around 90,000 views.

Does social media favor right-wing parties?

No, says Andreas Jungherr, professor of political science and digital transformation at Otto-Friedrich University in Bamberg. The AfD's reach comes from experience rather than any built-in platform bias: "The AfD has been active on social media for a long time" and has learned what approach works there.

This is a clear advantage in terms of reach, but reach alone does not guarantee electoral success, says Jungherr. This was evident in the campaign of US presidential candidate Kamala Harris, who had great success on social media but, as we know, did not make it to the White House.

What impact does social media have on values and beliefs?

So-called “filter bubbles” arise in the online space because search results or content are personalized. Algorithms of online service providers determine what is shown to us on the Internet. On social media, an algorithm gives priority to content from well-known personalities or content that has been liked or commented on by many other users. On the other hand, the algorithm may no longer display certain content at all if it has been ignored frequently.
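The ranking behavior described above — boosting posts from well-known accounts and posts with many likes or comments, while dropping content a user has repeatedly ignored — can be sketched in a few lines. This is a hypothetical illustration only, not any platform's actual algorithm; the field names, weights, and the ignore threshold are all assumptions.

```python
# Hypothetical sketch of a feed-ranking algorithm: engagement and
# author popularity raise a post's score; content the user has
# repeatedly ignored is no longer displayed at all.

from dataclasses import dataclass

@dataclass
class Post:
    author_followers: int    # proxy for "well-known personality"
    likes: int
    comments: int
    times_ignored: int = 0   # how often this user scrolled past similar content

IGNORE_CUTOFF = 5  # assumed threshold: hide content ignored this often

def score(post: Post) -> float:
    """Higher score means the post appears earlier in the feed."""
    if post.times_ignored >= IGNORE_CUTOFF:
        return float("-inf")  # filtered out entirely
    engagement = post.likes + 2 * post.comments   # comments weighted higher
    popularity = post.author_followers ** 0.5     # diminishing returns
    return engagement + popularity

def rank_feed(posts: list[Post]) -> list[Post]:
    visible = [p for p in posts if score(p) != float("-inf")]
    return sorted(visible, key=score, reverse=True)
```

The personalization feedback loop the article describes follows from the last branch: once a topic crosses the ignore threshold, it vanishes from the feed, so the user's existing preferences are the only ones still reinforced.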

This creates a one-sided perspective: one's own worldview is reinforced, while the opinions and attitudes of others fade from view. Above all, such content reinforces the values and beliefs one already holds.

This is why media of all kinds have only a very small influence on voters' decisions, says Judith Möller, professor of communication sciences at the Leibniz Institute for Media Research.

She studies the effects of social media and says, “Voters’ decisions are related to many different factors. It depends on where and how you grew up, what personal experiences you had – especially in the last weeks before the election – or who else you talk to about elections and politics.” According to Möller, the same factors also influence which media we use and what effect they have.

Political movements and new parties can quickly become visible on social media. But even there they mainly reach their existing followers, and perhaps a few undecided voters. "It is difficult to convince people of something new. Through the media you can only convince those who are already convinced of something."

Fake news and hate speech

Dealing with fake news and disinformation will become even more problematic in the future. Fake news is likely to increase if, as Mark Zuckerberg has announced, Meta abandons professional fact-checking of news on its Facebook and Instagram platforms and blocks controversial content less and less frequently.

In this context, we can observe two effects, says Prof. Dr. Nicole Krämer, head of the Department of Social Psychology, Media and Communication at the University of Duisburg-Essen. On the one hand, surveys show that people do not want to fall prey to disinformation.

"The more important an issue is to someone's life, the more adept he or she is at seeking out information that is truly helpful, meaning information that is reliable and presents both sides."

But on the other hand, if false information fits with something already anchored in a person's mind, they may come to consider it plausible, "even if they initially think: there's no way this can be true," says Krämer.

Another mechanism is at work here: “the more often you hear, read, or see a false message, the more likely it is to remain in memory.” This means that false information sometimes takes root – despite the fact that people actually want to avoid it.

The amount of false information on social media could also increase because there are fewer and fewer different opinions there, says Judith Möller.

The reason for this is an increasingly toxic culture of conversation and discussion, characterized by insults and hate speech. "As a result, certain groups are excluded from discussions, and only those who can withstand this toxic culture of conversation continue to participate." /DW/