How social media was infected by Covid

When Covid-19 struck, more than global health was at stake. A wave of conspiracy theories and general skepticism towards authorities and science followed. To understand the underlying trends, Danish researchers are analyzing news and social media posts in a national research project.

“I aim to develop computationally assisted methods to identify trends in the discursive and informational landscape around topics concerning media dynamics, public health and science communication, crisis and risk messaging, as well as the emergence of mis- and dis-information,” says Rebekah Baglini, Associate Professor in Linguistics at Interacting Minds Centre, Aarhus University.

Analyzing news and social media user behavior across Scandinavia during Covid-19, Rebekah Baglini contributes to the national project HOPE (How Democracies Cope with Covid-19: A Data-Driven Approach). Like several other researchers in the project, she has turned to High-Performance Computing (HPC), also known as supercomputing, because of the large data volumes and complexity involved:

“My earlier work involved smaller language corpora and didn’t require HPC resources. However, as my projects grew in scale, involving large corpus creation, the relevance of supercomputing increased.”

A learning curve involved

Rebekah Baglini has accessed supercomputing through the cloud-based Interactive HPC platform operated by DeiC, the national research and education network (NREN) of Denmark.

“There has definitely been a learning curve involved in the transition from locally maintained clusters to the cloud-based Interactive HPC platform, particularly because it is a relatively new service without comprehensive documentation. My affiliation with the Center for Humanities Computing at Aarhus University has been a valuable resource, as there is a great deal of collective experience and knowledge to draw on in that community,” says Rebekah Baglini.

Her research involves the collection, processing, and annotation of large-scale media data from traditional and social media sources. In addition, she works to improve existing computational language models for multilingual natural language processing (NLP), with a particular focus on under-resourced languages.

Identifying emerging narratives

Further, Rebekah Baglini uses HPC resources in a new project entitled Causal Reasoning and Online Science Skepticism. A key component is training language models to identify and analyze emerging narratives that undermine or counteract verified messaging about scientific findings and public health recommendations.

NLP and computational linguistics are also integral to her teaching, enabling her to offer students practical exposure to working with extensive datasets and large language models, fostering hands-on learning opportunities:

“I recognize that not all projects require HPC. However, it is useful for researchers to gain training in the affordances of HPC, parallel computing, and large models so they know what is possible and can take on larger-scale projects or make use of state-of-the-art resources for data processing, modelling, and simulation,” Rebekah Baglini concludes.


The text is inspired by the article “Supercomputing drives deeper insight into linguistics and social media” by Marie Charllotte Søbye on the DeiC website, based on an original text published by the Center for Humanities Computing, Aarhus University, for DeiC Interactive HPC.

Featured image: Aarhus University facade. Photo credit: Ricochet64

Published: 05/2024

For more information please contact our contributor(s):