this post was submitted on 12 Sep 2023
332 points (99.1% liked)

Linux

A reported Free Download Manager supply chain attack redirected Linux users to a malicious Debian package repository that installed information-stealing malware.

The malware used in this campaign establishes a reverse shell to a C2 server and installs a Bash stealer that collects user data and account credentials.

Kaspersky discovered the potential supply chain compromise case while investigating suspicious domains, finding that the campaign has been underway for over three years.

[–] TrustingZebra@lemmy.one 11 points 1 year ago (5 children)

It's still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.

Do you know of a good download manager for Linux?

[–] FredericChopin_@feddit.uk 14 points 1 year ago (3 children)

How much faster are we talking?

I’ve honestly never looked at my downloads and thought, huh, you should be quicker. Well, maybe in the 90’s.

Right? I've not thought about download speeds since the 2000's.

[–] TrustingZebra@lemmy.one 7 points 1 year ago (2 children)

FDM does some clever things to boost download speeds. It splits up a download into different chunks, and somehow downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).

[–] somedaysoon@lemmy.world 16 points 1 year ago (1 children)

It only makes a difference if the server is capping the speed per connection. If it's not, it won't make a difference.

[–] TrustingZebra@lemmy.one 3 points 1 year ago (1 children)

I guess many servers are capping speeds then. Makes sense, since I almost never see downloads actually take advantage of my Gigabit internet speeds.

[–] somedaysoon@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

It's interesting to me people still download things in that fashion. What are you downloading?

I occasionally download something from a web server, but not enough to care about using a download manager that might make it marginally faster. Most larger files I'm downloading are either TV shows and movies from torrents and usenet, or games on Steam. All of which will easily saturate a 1Gbps connection.

[–] FredericChopin_@feddit.uk 9 points 1 year ago (1 children)

I’m curious as to how it would achieve that?

It can’t split a file before it has the file. And all downloads are split up. They’re called packets.

Not saying it doesn’t do it, just wondering how.

[–] everett@lemmy.ml 13 points 1 year ago (1 children)

It could make multiple requests to the server, asking each request to resume starting at a certain byte.

[–] FredericChopin_@feddit.uk 5 points 1 year ago (1 children)

Interesting.

I feel I’ll save this rabbit hole for weekend and go and have a look at what they do.

[–] drspod@lemmy.ml 18 points 1 year ago (1 children)

The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of a file.

This mechanism was introduced in HTTP 1.1 (byte-serving).
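To make that concrete, here's a rough sketch of the trick in Python. This is not FDM's actual code; the `RangeHandler` toy server and all helper names are made up for illustration, and the payload is fake test data standing in for a real file:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Fake 16 KiB payload standing in for a real remote file.
PAYLOAD = bytes(range(256)) * 64

class RangeHandler(BaseHTTPRequestHandler):
    """Toy server supporting HEAD and byte-range GET (HTTP/1.1 byte-serving)."""

    def do_HEAD(self):
        # HEAD: headers only, no body -- this is how the client learns the size.
        self.send_response(200)
        self.send_header("Content-Length", str(len(PAYLOAD)))
        self.send_header("Accept-Ranges", "bytes")
        self.end_headers()

    def do_GET(self):
        # Parse a header like "Range: bytes=0-4095" and serve just that slice.
        rng = self.headers["Range"]
        start, end = (int(x) for x in rng.split("=")[1].split("-"))
        body = PAYLOAD[start:end + 1]
        self.send_response(206)  # 206 Partial Content
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def content_length(url):
    # HEAD request: fetch only the Content-Length, not the file itself.
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def fetch_range(url, start, end):
    # GET with a Range header fetches bytes start..end inclusive.
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def download_chunked(url, n_chunks=4):
    # Split the file into n_chunks byte ranges, fetch them concurrently,
    # then reassemble in order -- the FDM-style speed trick.
    size = content_length(url)
    bounds = [(i * size // n_chunks, (i + 1) * size // n_chunks - 1)
              for i in range(n_chunks)]
    parts = [None] * n_chunks

    def worker(i, start, end):
        parts[i] = fetch_range(url, start, end)

    threads = [threading.Thread(target=worker, args=(i, s, e))
               for i, (s, e) in enumerate(bounds)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return b"".join(parts)
```

Against a real server you'd point `download_chunked` at any URL whose server answers `Accept-Ranges: bytes`; servers that ignore Range just send the whole file back, which is why the trick doesn't always help.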

[–] FredericChopin_@feddit.uk 5 points 1 year ago

Huh.. that’s super interesting and thanks for sharing.

[–] westyvw@lemm.ee 6 points 1 year ago (1 children)

Just grabbed a gig file - it would take about 8 minutes with a standard download in Firefox. Use a manager or axel and it will be 30 seconds. Then again, speed isn't everything; it's also nice to be able to have auto retry and completion.

[–] Penguincoder@beehaw.org 1 points 1 year ago

I was just going to recommend this too; Use axel, aria2 or even ancient hget.

[–] Xirup@lemmy.dbzer0.com 8 points 1 year ago (1 children)

JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed source libraries), kGet, etc.

[–] flontlocs@lemmy.world 1 points 1 year ago

And JDownloader is the more useful one for easier download from file hosters.

[–] westyvw@lemm.ee 4 points 1 year ago (1 children)

axel. Use axel -n8 to make 8 connections/segments, which it will assemble when it is done.

[–] Xirup@lemmy.dbzer0.com 2 points 1 year ago

Even with wget, wget -c can resume some downloads.