
Panic! At The Distro: A Study of Malware Prevention in Linux Distributions

Duc-Ly Vu, Trevor Dunlap, Paul Gibert, John Speed Meyers, and Santiago Torres-Arias

TL;DR: Most Linux distributions, according to interviews, do not proactively scan their repositories for malware. A new malicious Linux package benchmark dataset reveals one reason why: existing Linux package malware detection tools mostly generate false positives.


Attacks on the users of open-source software package repositories have been a growing concern. These attacks have typically targeted users of programming language-specific repositories like the Python Package Index or npm. And while these attacks are important, it’s also true that these repositories are arguably “uncurated”: anyone with an email address can upload code to them. It’s a free-for-all, by design.


But after the revelation of the XZ Utils backdoor, concern turned into a panic-inducing crisis. This was an attack on a project packaged in many Linux distributions, software repositories that are carefully “curated”: not just anybody can upload code to them. It was therefore troubling that one of the project’s maintainers was able to slip malicious code into the upstream project itself. Furthermore, the XZ Utils attack, had it gone undetected for longer, would have provided the attacker with, according to one observer, a “skeleton key” to the internet. XZ Utils was not just another attack on the open source software supply chain.


This attack therefore led us to ask two research questions: (1) What measures have maintainers at Linux distributions implemented or considered implementing to counter malware? (2) How effective are current malware detection tools at identifying malicious Linux packages?


To answer these questions, we conducted interviews with maintainers at several major Linux distributions and created a Linux package malware benchmark dataset. Here’s what we found:


Finding #1: Distribution maintainers, according to the interviews, have to date mostly focused on reproducible builds. The interviews identified Wolfi OS as the exception in its use of active malware scanning.


Finding #2: Using a new benchmark dataset, we evaluated the performance of six open-source malicious Linux package detectors. The evaluation found that the performance of existing open-source malware scanners is underwhelming. Most studied tools excel at producing false positives. Those that avoid high false positive rates often do so at the expense of a satisfactory true positive rate.


Our findings, which can be found on arXiv, are intended to spur debate on the role of active malware scanning within Linux distribution repositories and to provide a dataset for benchmarking existing and new malicious Linux package detection tools.


What are Linux Distros doing to prevent malware?


We interviewed seven members of five different Linux distributions: Alpine, Arch, Debian, Ubuntu, and Wolfi. There were two interviewees associated with Debian and two with Arch. Four of these five distributions are long-running and widely used. Wolfi, Chainguard’s relatively recent distribution, is the sole exception.


No. | Distro | Years of Experience
1 | Alpine | 20
2 | Arch | 8
3 | Arch | 7
4 | Debian | 20
5 | Debian | 25
6 | Ubuntu | 2
7 | Wolfi | 2


Here’s what we found:


Interest in reproducible builds and signing

The maintainers generally mentioned two types of pre-existing counter-malware activities. First, several maintainers mentioned the Reproducible Builds project and their own efforts, or their fellow maintainers’ efforts, to increase the number of reproducible packages within each distribution. Second, the maintainers also generally mentioned “cryptographic” approaches that involve ensuring that packages have been signed, though participants noted that signing packages is only a partial defense against an attacker bent on inserting malware.


Maintainers have not given serious consideration to the use of malware scanning tools.

Prior to the XZ Utils attack, maintainers had not given serious consideration to the use of malware scanning tools. Even in the aftermath of the XZ Utils attack, there is general skepticism towards the idea, primarily due to the perceived cost of commercial-grade scanners and concerns over who would bear responsibility for the overhead of reviewing alerts. Even maintainers who express some openness to such an approach remain uncertain about where to start.


Wolfi is the only distro performing proactive malware scanning. 

Only one Linux distribution, Chainguard’s Wolfi, has embraced proactive malware scanning. The security team associated with Wolfi has built and released an open-source tool called malcontent that is deployed in production within Wolfi. Malcontent is used to generate alerts for each new package update within Wolfi. The difference in alerts between each package version is analyzed; new high or critical alerts are flagged for review, blocking the package update until each alert is deemed benign.
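To make that gating logic concrete, here is a minimal Python sketch of a version-to-version alert diff. The alert format and thresholds are hypothetical stand-ins, not malcontent’s actual output schema or Wolfi’s production pipeline.

```python
# Minimal sketch of version-to-version alert gating. The Alert fields are a
# hypothetical stand-in, not malcontent's actual output schema.
from dataclasses import dataclass


@dataclass(frozen=True)
class Alert:
    rule: str      # e.g. a rule name such as "exec/reverse-shell"
    path: str      # file inside the package that triggered the rule
    severity: str  # "low", "medium", "high", or "critical"


def new_blocking_alerts(previous: set[Alert], current: set[Alert]) -> set[Alert]:
    """Return alerts present in the new package version but not the old one
    that are severe enough to hold the update for human review."""
    return {a for a in current - previous if a.severity in {"high", "critical"}}


if __name__ == "__main__":
    old_version = {Alert("net/dns-tunnel", "usr/bin/foo", "medium")}
    new_version = old_version | {Alert("exec/reverse-shell", "usr/bin/foo", "critical")}

    blocking = new_blocking_alerts(old_version, new_version)
    if blocking:
        # In a real pipeline this would fail the build and notify a reviewer;
        # the update stays blocked until each alert is deemed benign.
        for alert in sorted(blocking, key=lambda a: a.rule):
            print(f"BLOCKED: {alert.severity} {alert.rule} in {alert.path}")
    else:
        print("No new high/critical alerts; update may proceed.")
```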

 

How do existing Linux package malware scanners perform?


The research team created six distinct malware datasets in order to benchmark six open-source malicious Linux package detection tools: Bandit4mal, Malcontent, Oss-Detect-Backdoor (ODB), Packj, VirusTotal, and capslock. Closed-source tools were excluded from this analysis. The paper compares these tools on whether they are open source, perform malware detection, support multiple languages, make their detection rules available, and perform capability analysis.


To construct a comprehensive dataset, we curated a set of benign examples from the Wolfi OS ecosystem. The analysis assumes that each package already in Wolfi is benign. A total of 1,866 Wolfi projects were selected, spanning the programming languages Python, JavaScript, Ruby, and C. See the arXiv paper for in-depth methodological details.
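For intuition, the sketch below shows one way such a labeled corpus can be scanned and tallied into a confusion matrix. The scan_sample stub and directory layout are hypothetical placeholders, not the paper’s actual harness.

```python
# Sketch of a benchmarking harness: run one detector over labeled samples and
# tally its confusion matrix. scan_sample() and the directory layout are
# hypothetical placeholders; see the arXiv paper for the actual methodology.
from pathlib import Path


def scan_sample(path: Path) -> bool:
    """Placeholder for invoking the scanner under evaluation (e.g. via
    subprocess) and returning True if it flags the sample as malicious."""
    raise NotImplementedError("wire up the scanner under evaluation here")


def evaluate(benign_dir: Path, malicious_dir: Path) -> dict[str, int]:
    counts = {"tp": 0, "fp": 0, "tn": 0, "fn": 0}
    for sample in benign_dir.iterdir():      # packages assumed benign (e.g. Wolfi)
        counts["fp" if scan_sample(sample) else "tn"] += 1
    for sample in malicious_dir.iterdir():   # known-malicious samples
        counts["tp" if scan_sample(sample) else "fn"] += 1
    return counts
```

The false positive rate is then fp / (fp + tn) and the true positive rate tp / (tp + fn), the two quantities plotted in the ROC curves below.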


Dataset Name | Type | Description | # Samples
Dataset #1 | Source tarballs | Historical samples of open source source code malware | 30
Dataset #2 | APK | Historical examples of malicious Linux binaries | 30
Dataset #3 | Source tarballs | Synthetic examples of open source source code malware | 30
Dataset #4 | APK | Synthetic examples of open source Linux binaries | 30
Dataset #5 | APK | Synthetic examples of malicious Linux source code turned into APKs | 10
Dataset #6 | APK | Synthetic examples of Golang malware over time | 10
Wolfi Upstream | Source tarballs | Upstream repositories of Wolfi APKs | 1,866
Wolfi APKs | APK | Wolfi APKs | 1,652


Using these new benchmark datasets, the evaluation found that the performance of existing open-source malware scanners was underwhelming. Specifically, these scanners often produce a significant number of false positives and typically miss malware. Figure 1 provides detailed ROC curves for the different tools across the different datasets.


Figure 1: ROC curves for each dataset when running malware scanners with different thresholds

Figure 1 shows that, among the malware detection tools, VirusTotal proved the most reliable, balancing high accuracy in detecting both malicious and benign files. Bandit4mal showed strong sensitivity but produced many false positives. Malcontent displayed moderate accuracy with lower precision, often mislabeling benign files as malicious. ODB and Packj were sensitive but would need tuning to reduce errors.
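For readers who want to see what sits behind such a plot, here is a minimal sketch of the ROC computation: sweep a detection threshold over per-sample scores and record the resulting false and true positive rates. A single numeric score per sample is an assumption for illustration; the tools actually emit alerts of varying severity.

```python
# Sketch: build ROC points (false positive rate, true positive rate) by sweeping
# a detection threshold over per-sample scores. A single score per sample is an
# assumption for illustration, not how every scanner reports results.

def roc_points(scores: list[float], labels: list[bool]) -> list[tuple[float, float]]:
    """labels[i] is True if sample i is actually malicious."""
    positives = sum(labels)
    negatives = len(labels) - positives
    points = []
    for threshold in sorted(set(scores), reverse=True):
        flagged = [s >= threshold for s in scores]
        tp = sum(f and y for f, y in zip(flagged, labels))
        fp = sum(f and not y for f, y in zip(flagged, labels))
        points.append((fp / negatives if negatives else 0.0,
                       tp / positives if positives else 0.0))
    return points


if __name__ == "__main__":
    # Tiny worked example: two malicious and two benign samples.
    scores = [0.9, 0.4, 0.35, 0.1]
    labels = [True, True, False, False]
    for fpr, tpr in roc_points(scores, labels):
        print(f"FPR={fpr:.2f}  TPR={tpr:.2f}")
```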


Conclusions


Malware prevention is an increasingly significant concern for Linux distributions and their maintainers. Through interviews, we identified that most Linux distributions, with the exception of Wolfi, do not perform active malware scanning. Furthermore, by building a set of Linux package malware benchmark datasets and scanning them with existing malware scanners, we found that these tools exhibit high false positive rates and struggle to accurately identify malware. Without further improvements, these tools are unlikely to be widely adopted by other Linux distributions and are likely to frustrate the maintainers who attempt to use them.


Our ultimate hope is that the next time the person or group (or even machine) behind the XZ Utils attack attempts to backdoor a popular open-source project, these tools, improved versions of them, or entirely new tools will detect the attack so swiftly that it becomes a non-event, failing to garner attention in the news. Here’s to hoping.


If you’d like to learn more about our results or proposed future directions, please check out our paper on arXiv.
