A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.
The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.
Kaspersky discovered the potential supply chain compromise while investigating suspicious domains, finding that the campaign had been underway for over three years.
Now I need to know who the hell has installed Free Download Manager on Linux.
And via a website too. That’s like pushing a car. One of the main strengths of Linux is its open repositories, maintained by reputable sources and vetted by thousands of people. Packages are checksummed and therefore can’t be swapped out by malicious parties. Even the AUR is arguably a safer and more regulated source. And FDM is actually in there.
And via a website too
Everyone knows real admins do
curl https://raw.githubusercontent.com/something/or/other/install.sh | sudo bash
Instructions unclear, “command not found: 404”.
The same people who would have given that poor Nigerian prince their bank account details.
It’s still my favorite download manager on Windows. It often downloads files significantly faster than the download managers built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.
Do you know of a good download manager for Linux?
How much faster are we talking?
I’ve honestly never looked at my downloads and thought, huh, that should be quicker. Well, maybe in the ’90s.
Right? I’ve not thought about download speeds since the 2000s.
FDM does some clever things to boost download speeds. It splits up a download into different chunks and somehow downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).
It only makes a difference if the server is capping the speed per connection. If it’s not then it will not make a difference.
I guess many servers are capping speeds then. Makes sense, since I almost never see downloads actually take advantage of my gigabit internet speeds.
It’s interesting to me people still download things in that fashion. What are you downloading?
I occasionally download something from a web server, but not enough to care about using a download manager that might make it marginally faster. Most larger files I’m downloading are either TV shows and movies from torrents and usenet, or games on steam. All of which will easily saturate a 1Gbps connection.
I’m curious as to how it would achieve that.
It can’t split a file before it has the file. And all downloads are split up. They’re called packets.
Not saying it doesn’t do it, just wondering how.
It could make multiple requests to the server, asking each request to resume starting at a certain byte.
Interesting.
I feel I’ll save this rabbit hole for weekend and go and have a look at what they do.
The key thing to know is that a client can do an HTTP `HEAD` request to get just the `Content-Length` of the file, and then perform `GET` requests with the `Range` request header to fetch specific chunks of the file. This mechanism was introduced in HTTP/1.1 (byte serving).
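As a rough sketch of that idea (just the chunk arithmetic; the actual `HEAD`/`GET` calls are omitted, and `chunk_ranges` is a made-up helper name):

```python
def chunk_ranges(content_length: int, n_chunks: int) -> list[str]:
    """Split a file of content_length bytes into n_chunks
    'bytes=start-end' Range header values (Range ends are inclusive)."""
    base, rem = divmod(content_length, n_chunks)
    ranges = []
    start = 0
    for i in range(n_chunks):
        size = base + (1 if i < rem else 0)  # spread the remainder over the first chunks
        end = start + size - 1
        ranges.append(f"bytes={start}-{end}")
        start = end + 1
    return ranges

# A 10-byte file split three ways:
print(chunk_ranges(10, 3))  # ['bytes=0-3', 'bytes=4-6', 'bytes=7-9']
```

A downloader would then issue one GET per range (say, from several threads) and stitch the responses back together in order.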
Just grabbed a gig file - it would take about 8 minutes with a standard download in Firefox. Use a manager like axel and it will be 30 seconds. Then again, speed isn’t everything; it’s also nice to have auto-retry and completion.
I was just going to recommend this too: use axel, aria2, or even the ancient hget.
JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed source libraries), kGet, etc.
And JDownloader is the most useful one for downloading from file hosters.
axel. Use axel -n8 to make 8 connections/segments, which it will assemble when it’s done.
Even with wget, wget -c can resume some downloads.
Gotta admit, it was me. I’ve only used a computer for a short time.
I got my first laptop 3 years ago, and it broke after just 2 months. And anyway, with its AMD Athlon 64 it greatly struggled with a browser. So really I only started seriously using a computer at the start of 2021, when I got another, usable laptop. And that’s when I downloaded freedownloadmanager.deb. Thankfully, I didn’t get that redirect, so it was a legitimate file.
Oh, I know someone who adds the word “free” to various search terms like “free pdf reader” or “free flash player” (that happened a very long time ago). He’s also the kind of person I can imagine having a bunch of viruses and malware on his computer.
People not well versed in Linux.
You know, the non-techies, who the Linux community claims should know such things but obviously don’t.
Or what Free Download Manager even is.
I once did.
I’ve installed and used it, and still do.
My internet connection is not that reliable, and when I download big files that aren’t torrents (say >1000 MB) and the download is interrupted by an internet disconnect, Firefox often has trouble resuming it while FDM doesn’t.
FDM also lets me set download speed limits, which means I can still browse the internet while downloading.
It’s not my main tool for downloading stuff, but it has its uses.
The article mentions how to check for infection:
If you have installed the Linux version of the Free Download Manager between 2020 and 2022, you should check and see if the malicious version was installed.
To do this, look for the following files dropped by the malware, and if found, delete them:
/etc/cron.d/collect /var/tmp/crond /var/tmp/bs
You can also check the .deb file’s postinst script. If it looks like the one shown here, no bueno.
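For what it’s worth, both checks can be sketched in a few lines of shell; the dropped-file paths are the ones from the article, and freedownloadmanager.deb is just a placeholder filename:

```shell
#!/bin/sh
# 1) Look for the files the article says the malware drops.
found=0
for f in /etc/cron.d/collect /var/tmp/crond /var/tmp/bs; do
    if [ -e "$f" ]; then
        echo "FOUND: $f"
        found=1
    fi
done
if [ "$found" -eq 0 ]; then
    echo "none of the known dropped files are present"
fi

# 2) Inspect the maintainer scripts a .deb would run as root *before* installing.
pkg=./freedownloadmanager.deb
if [ -f "$pkg" ]; then
    dpkg-deb -I "$pkg" preinst postinst
fi
```

The same idea works for RPM with `rpm -qp --scripts package.rpm`.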
“Non-free download manager”
You always pay with something.
Free as in free speech, not as in free beer.
Free :: Freedom ≠ No money
no u
What is a free download manager and why would someone need one?
Back in the day when most stuff was on FTP and HTTP and your connection was crap and could drop at any time, you’d use a download manager to smooth things along. It could resume downloads when the connection dropped, it could keep a download going for days on end and resume as needed, and it could get around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks. In some ways it was very similar to how we use BitTorrent today.
It was also useful to keep a history of stuff you’d downloaded in case you needed it again, manage the associated files etc.
and it could get around the bandwidth limitations of the source site by using multiple parallel connections that pulled different file chunks
Also for files which had multiple different mirror sites you could download chunks from multiple mirrors concurrently which would allow you to max out your bandwidth even if individual mirrors were limiting download speeds.
It’s a download client that can pause/resume downloads, as well as use multiple connections to download files.
Like a BitTorrent?
I guess I just don’t download that much stuff.
Sucks having your connection drop and having to redownload the entire thing again. Managers are a fix.
BitTorrent basically works in chunks, and can download them nonlinearly. Downloading from a site the basic way gets the file from start to finish; a download manager lets you stop it and pick up where you left off, as long as the server you’re getting the file from is configured to allow it.
https://github.com/agalwood/Motrix
(Note: I don’t use that or any other download manager and haven’t since Windows 95, it’s linked as example only)
Back in the 2000s, browsers were really bad at downloading big things over slow connections since they couldn’t resume; a brief disconnect could destroy hours of progress. But I don’t think you need this anymore.
How is it possible that users noticed strange behavior (new cron jobs) and didn’t check the scripts launched by those jobs 😱
Linux popularity going up means the percentage of users who know what cron is goes down.
Is Disney finally making Tron sequel?
No it’s a disease that makes you poop a lot
No that’s Crohn’s, cron is a type of headwear for monarchs
No, I’m Jamaican, and BTW I use Arch, Mon
No, thats a crown.
A cron is a type of super virus that wants to destroy the entire net. An end to all things… Total crash. Only another virus superpowered by core energy can put a stop to it.
No, that’s a virus from the show ReBoot, which ran from 1994-2001.
Cron is just another word for hag.
No that’s a crone.
To cron is to sing in a romantic style of pop music.
They actually are, kind of. It’s called Tron: Ares and it’s been in production hell for some years, the most recent delay being due to the ongoing writers’ strike. Filming is expected to start after the strike is over, but personally my enthusiasm for the movie died after they announced Jared Leto as one of the cast.
If they were complaining about cronjobs being created (like the post says), then they must have known what cron is.
Idk if I’d check crontab regularly.
If I noticed strange cron jobs I would!
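If you ever do want a look, a quick sketch of the usual places scheduled jobs live on a typical Linux system (paths vary by distro; the malware in the article dropped a file into /etc/cron.d):

```shell
#!/bin/sh
# Print which of the common cron locations exist on this machine.
for d in /etc/crontab /etc/cron.d /etc/cron.daily /etc/cron.hourly /var/spool/cron; do
    if [ -e "$d" ]; then
        echo "check: $d"
    fi
done

# And the current user's own crontab, if any.
crontab -l 2>/dev/null || echo "no per-user crontab"
```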
Finally Linux is getting popular enough for people to make viruses for it. Yay? Insert Gru meme here.
Linux has had viruses for decades
Here is a list of some of the more prominent ones:
(I’m sorry. I’m also reading the same discussion over at the other post with Linux at the workplace.)
Is anything on that list relevant in the wild? That is, are those viruses ‘in theory’, or have they inflicted some damage and actually spread back then? I’m looking for some news articles or actual numbers.
Mmmh. You kinda deserve being infected if you do things like this. Every beginner tutorial specifically tells you not to download random stuff from the internet and ‘sudo’ install it. Every Wiki with helpful information has these boxes that tell you not to do it. I’m okay if you do it anyways. But don’t blame anyone else for the consequences. And don’t tell me you haven’t been warned.
Also I wonder about the impact this had. It went unnoticed for 3 years, so I can’t imagine it having affected many people. The text says it affected few people and didn’t have any real impact.
But supply chain attacks are real. Don’t get fooled. And don’t install random stuff. Install the download manager from your package repository instead.
I kind of disagree. Applications often require root permissions to install themselves, since regular users can’t access certain folders like /opt, etc.
Also, do you really think that people would actually read the source and then compile all their software themselves? Do you do the same?
Generally though I do agree: you’re probably fine installing software from your distro’s repos, but even that’s not bulletproof, and third-party repos aren’t uncommon either.
Yes. I do it the correct way. I use my favourite distro’s package manager to install software. This way it’s tested, a few people had a look at the changes, and sometimes a CI script automatically determines if the installer affects other parts of the system. I go to great lengths to avoid doing it any other way. (I’ve been using some flatpaks in recent times, though. But sometimes I also install it only for a separate user account. Mainly when it’s proprietary or niche.)
It is super rare that I install random stuff from the internet, or ‘curl’ and then pipe an installer script into a root shell. And when I do, I put in some effort to see if it’s okay. I think I had a quick glance at most of the install .sh scripts before continuing. So yes, I kinda do my best. And I isolate that stuff and don’t put it in the same container that does my email.
Most of the time you can avoid doing it the ‘stupid way’. And even programming package managers like ‘npm’, ‘cargo’, … have started to take supply chain attacks seriously.
malicious Debian package repository
*laughs in RPM*
This comment was presented by the fedora gang.
Right, but you could do the same with RPM. Not everyone is aware of this, but installing a package executes scripts with root access to your system.
Thanks Captain Obvious.
I had to essentially read the same thing four times before there was any new information in this post. Not sure if that’s a Jerboa thing or what, but probably could have been avoided.
Yeah I agree, sorry about that. I thought that the body-text field was mandatory to fill in, so I used the introductory paragraph from the article so as not to editorialize.
People who get forced to download proprietary software packed in a shitty .deb
What the hell is Free Download Manager?
That’s what you get from not using
curl
curl --proto '=https' --tlsv1.2 -sSf https://download-more-ram.sh | sh
PHEW thanks, I’m safe.
Glad I’m not using a deb-based distro.