AI model weights. Patches for MMOs (World of Warcraft famously used this to good effect).
I don't think they do it anymore, but Spotify started out with a P2P network on the backend.
A super smart way of bootstrapping such a thing without having to pay huge server costs upfront.
Took it out ten years ago. It was super smart, and there are still situations where it would be helpful, like when a new Taylor Swift album drops that takes the service offline.
I think a good chunk of the Internet Archive is available as torrents, at least the software collections and public domain media.
You can also download a torrent of the whole of Wikipedia, with and without images.
Do you know how big those two Wikipedia downloads are?
Not a direct answer to your question, but this is where I download my stuff from, and it also shows size.
https://library.kiwix.org/#lang=eng
Edit: Wikipedia is available there, the full thing is 109.89GB. I wonder how up-to-date it is.
As of last year, English Wikipedia, articles only, text only, was about 22GB compressed (text compresses pretty efficiently), according to the current version of this page:
As of 2 July 2023, the size of the current version of all articles compressed is about 22.14 GB without media
Some other sources describe the uncompressed offline copies as being around 50 GB, with another 100 GB or so for images.
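The ~22 GB compressed vs. ~50 GB uncompressed figures above reflect how redundant natural-language text is. A toy demonstration with Python's standard `bz2` module (note the sample here is far more repetitive than real articles, so its ratio comes out much higher than the roughly 2-3x seen on actual dumps):

```python
import bz2

# Highly redundant text compresses extremely well with a general-purpose
# compressor; the repeated sentence below is a deliberately extreme case.
sample = ("The free encyclopedia that anyone can edit. " * 2000).encode("utf-8")

compressed = bz2.compress(sample)
ratio = len(sample) / len(compressed)

print(f"original: {len(sample)} bytes, compressed: {len(compressed)} bytes")
print(f"compression ratio: {ratio:.1f}x")
```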
Wikimedia, which includes all the media types, has about 430 TB of media stored.
Not literal torrenting, but the protocols are very similar (since they are both P2P data sharing):
- Windows updates can be downloaded from other computers on your local network.
- Steam now tries downloading games from other computers you are logged in on. You can opt in to serve other accounts on your local network as well.
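What makes these LAN peer downloads safe is that the update server still publishes a hash for every chunk, so a chunk fetched from an untrusted peer can be verified locally before it's used. A minimal sketch of that check (the names and chunk contents here are illustrative, not Steam's or Windows' actual protocol):

```python
import hashlib

def chunk_ok(data: bytes, expected_sha256: str) -> bool:
    """Accept a peer-supplied chunk only if it matches the published hash."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

good_chunk = b"update payload v1.2.3"                # hypothetical chunk contents
published = hashlib.sha256(good_chunk).hexdigest()   # hash from the trusted server

print(chunk_ok(good_chunk, published))        # True: genuine copy from a peer
print(chunk_ok(b"tampered bytes", published)) # False: corrupted or malicious copy rejected
```

The trusted server only has to serve tiny hashes; the heavy payload can come from any peer.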
Downloading actual Linux ISOs with BitTorrent is so much faster than downloading them directly from the distro's mirror. I always use BitTorrent to download new Linux distros I'd like to try.
Also, I believe P2P protocols are still popular in Korea because ISPs there actually charge website operators for bandwidth delivered to Korean customers. Twitch pulled out of Korea because of this. I think their competitors there, e.g. AfreecaTV, use P2P for their streams.
This might be stretching the definition of "common" and "torrenting," but BitTorrent created BitTorrent Sync with similar tech for personal file synchronization. It was later rebranded Resilio and still exists today.
An open-source alternative that works in a similar fashion, SyncThing, also exists.
I would consider this to be one of the intended functions of torrent files. Torrents started as a faster way to share files peer to peer: if a few people had a large file on their machines, they could each upload part of it to someone who needs it, essentially multiplying their upload bandwidth. This became less popular as internet speeds increased, except for "illegal" stuff. I would definitely try one of these... if I had more than one computer.
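The mechanism described above can be sketched in a few lines: a file is split into fixed-size pieces, each piece gets its own hash in the torrent metadata, and different peers can then serve different pieces in parallel. The piece size and data below are made up for the demo (real torrents use pieces in the 16 KiB to 16 MiB range):

```python
import hashlib

PIECE_SIZE = 16  # tiny for demonstration; real torrents use much larger pieces

def split_into_pieces(data: bytes, piece_size: int = PIECE_SIZE):
    """Split a file into fixed-size pieces, as a .torrent does."""
    return [data[i:i + piece_size] for i in range(0, len(data), piece_size)]

def piece_hashes(pieces):
    # The .torrent metadata stores one hash per piece, so a downloader can
    # verify each piece independently no matter which peer supplied it.
    return [hashlib.sha1(p).hexdigest() for p in pieces]

data = b"a large file that several peers can serve piece by piece"
pieces = split_into_pieces(data)
hashes = piece_hashes(pieces)

print(f"{len(pieces)} pieces, {len(hashes)} hashes")
print("reassembly matches original:", b"".join(pieces) == data)
```

Because any peer can supply any verified piece, the original uploader's bandwidth stops being the bottleneck.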
A common use case for SyncThing is keeping a password file up to date between, say, your PC and your phone. It'll even work remotely, thanks to the presence of relays.
(The downsides include pretty heavy battery usage.)
PeerTube uses BitTorrent to stream video.
PeerTube uses WebTorrent to offload hosting of huge files.
Odysee uses something similar to do the same. (At least they claim to, but last time I took a dig at it it seemed to be hosted "regularly")
Spotify famously had their own p2p-thing going in their desktop apps in the early days. Saved them a pretty coin back when hosting was expensive.
Coming to a browser near you is IPFS.
Clonezilla uses bittorrent for one of its massive deployment modes. I work at a university, and whenever we have to deploy an OS image, the ten gigabit uplink between the storage server and the classroom switches always gets saturated in unicast/interactive mode. Using bittorrent mode gets around this issue because once a computer has downloaded a chunk of the image, it can seed it for the rest of the computers within the subnet. One massive limitation is that the target computer has to have enough storage space for both the downloaded image and the deployed OS too.
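That storage limitation is just arithmetic: in BitTorrent mode the target disk must temporarily hold both the downloaded image and the restored OS. A quick sanity check, with hypothetical sizes:

```python
def enough_space(disk_gb: float, image_gb: float, deployed_gb: float) -> bool:
    """In BitTorrent deployment mode, the disk must hold the downloaded
    image *and* the deployed OS at the same time."""
    return disk_gb >= image_gb + deployed_gb

# A 40 GB image that restores to a 90 GB install needs ~130 GB free:
print(enough_space(128, 40, 90))  # False: a 128 GB drive is too small
print(enough_space(256, 40, 90))  # True
```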
One funny use I discovered when I was cloning a lot of computers is that even on a closed lan, BT with local discovery was stupidly fast in distributing a big set of files across a pile of computers instead of rsync. Also, setting it up was much easier.
There's Syncthing and its proprietary counterpart Resilio that allow you to sync folders between machines and send individual files over P2P. Very neat software.
Any large file is going to be much quicker getting through BT as long as there are enough seeders. OS distros, patches, P2P files, 4K anything, etc.
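A back-of-envelope version of the "enough seeders" point: with many seeders, your download is capped by your own downlink rather than any single server's uplink. The numbers below are illustrative:

```python
def effective_rate_mbps(seeders: int, seeder_upload_mbps: float,
                        my_download_mbps: float) -> float:
    """Aggregate rate is capped by the swarm's total upload or my own downlink,
    whichever is smaller (ignoring protocol overhead and uneven peers)."""
    return min(seeders * seeder_upload_mbps, my_download_mbps)

# One mirror uploading at 20 Mbps vs. a swarm of 30 seeders at 20 Mbps each,
# on a 500 Mbps home connection:
print(effective_rate_mbps(1, 20, 500))   # 20.0  -> the single source is the bottleneck
print(effective_rate_mbps(30, 20, 500))  # 500.0 -> now my own line is the limit
```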
Accessing public domain content that's not hosted digitally otherwise.
IIRC Steam uses BitTorrent to help users download game assets. There's an option to switch it off, so it must still be going.
A podcast I listen to says they used to distribute episodes by BitTorrent, way back in like 2006, as a way to keep bandwidth costs down when they were new. I'm pretty sure they had stopped offering that option by the time I started listening in about 2008/9.
Transferring files to several other computers. I've done it in the past before I used KDE connect to transfer files rather than use ftp or just memory sticks. It would be useful at a LAN party to get several copies of the software distributed. (Kinda piracy but doesn't have to be if the game is free or everyone owns it legitimately).
I torrent old out of print books that I can't find anywhere else. The scans are usually pretty good. There was also a podcast I used to listen to called Caustic Soda. When they ended it, they released all of their episodes through torrenting so the fans could have them.
Software updates. I think Windows uses P2P, but I'm not sure if it's the torrent protocol.
I remember when it was relatively new and controversial: BBC's iPlayer hadn't been around very long, and they said they were going to start using BitTorrent tech for streaming. Guessing that never came to fruition, though.
I think Windows uses it for updates to make them faster, and many games do too.
If you just mean peer to peer, I feel like magnet links (often using BitTorrent) are still found for downloading large files from time to time (not just ISOs): things like open-source games and software, though if I'm being honest I can't think of a single one that still uses them. You used to find magnet links all over the open-source scene, but I guess with GitHub offering free hosting it's not so common anymore.
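Part of why magnet links worked so well for this: the link itself carries no file data, just an infohash (`xt`), an optional display name (`dn`), and tracker URLs (`tr`), so it can be pasted anywhere. A sketch of pulling those fields apart with Python's standard library (the infohash below is a made-up example, not a real torrent):

```python
from urllib.parse import urlsplit, parse_qs

# A magnet link is just a URI with query parameters; the infohash is enough
# for a client to locate peers and verify every downloaded piece.
magnet = ("magnet:?xt=urn:btih:c12fe1c06bba254a9dc9f519b335aa7c1367a88a"
          "&dn=example-distro.iso"
          "&tr=udp%3A%2F%2Ftracker.example.org%3A6969")

params = parse_qs(urlsplit(magnet).query)
infohash = params["xt"][0].removeprefix("urn:btih:")

print("infohash:", infohash)
print("name:    ", params["dn"][0])
print("trackers:", params["tr"])
```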
It's a really interesting question. I wonder what underlying economics and ideologies are at play in its decline. Economies of scale for large server farms? Desire for control of the content/copyright? The structure and shape of the network?
I guess it has some implications for stream versus download approaches to content?
If I recall, Spotify moved away from it just because the client/server model got way cheaper and the P2P model had some limitations for their future business plans. I remember them mentioning that offering a family plan was a challenge with their P2P architecture when people on the same network/account were using it at the same time.
It was probably also part of the move to smartphones. Spotify was just a desktop program for a long time and, while I'm not an expert, I would guess the P2P model made a lot more sense on desktop with a good connection than on early smartphones with flaky 2G/3G connections. They might have had to run a client/server model for iOS and/or Android anyway.
Very interesting, thank you. I guess then the centralised server must have some sort of economy of scale.
In my head, I'm comparing the network to the electricity grid with certain shapes of network making different technologies more or less feasible. I would guess the internet network is probably similar to the electricity grid in most places having fewer hubs and lines of high bandwidth rather than a more evenly distributed network. Maybe the analogy is bad though.
sharing fan edits