
I'm a retired Unix admin. It was my job from the early '90s until the mid '10s. I've kept somewhat current ever since by running various machines at home. So far I've managed to avoid using Docker at home even though I have a decent understanding of how it works - even after I stopped being a sysadmin in the mid '10s, I still worked for a technology company and did plenty of "interesting" reading and training.

It seems that more and more stuff that I want to run at home is being delivered as Docker-first and I have to really go out of my way to find a non-Docker install.

I'm thinking it's no longer a fad and I should invest some time getting comfortable with it?

[–] akash_rawal@lemmy.world -1 points 11 months ago (3 children)

As someone who has been operating Kubernetes on my home server for 2 years, I find containers much more maintainable than installing everything directly on the server.

I tried using docker-compose first to manage my services. It works well for 2-3 services, but as the number of services grew they started to interfere with each other; at that point I switched to Kubernetes.
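
To give an idea of the starting point, each service was its own small compose stack, roughly like this (service and image names here are placeholders, not my actual files):

```yaml
# redmine/docker-compose.yml - a minimal sketch, not my real file
services:
  redmine:
    image: redmine:latest
    ports:
      - "3000:3000"
    environment:
      REDMINE_DB_POSTGRES: db        # container DNS name of the database service
      REDMINE_DB_PASSWORD: changeme
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: changeme
```

Fine with 2-3 stacks like this; the pain started once several of them had to talk to each other.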

[–] Kialdadial@iusearchlinux.fyi 6 points 11 months ago (1 children)

If your compose files are conflicting, then you're likely not tailoring them to fit your server.

[–] akash_rawal@lemmy.world 1 points 11 months ago

I was writing my own compose files, but see my response to a sibling comment for the issue I had.

[–] GreatBlueHeron@lemmy.ca 2 points 11 months ago (1 children)

Wow - I thought docker was overkill for a home server and you've gone kubernetes! I guess if you use it for work and that's what you're comfortable with?

[–] akash_rawal@lemmy.world 1 points 11 months ago

Thank you... I had to learn Kubernetes for work; it was around 2 weeks of time investment, and then I figured out I could use it to fix my docker-compose pains at home.

If you run a lot of services, I can attest that Kubernetes is definitely not overkill; it is a good tool for managing complexity. I have 8 services on a single-node Kubernetes cluster, and I like how I can manage each service's configuration independently of the others and of the underlying infrastructure.
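
For example (the names here are just illustrative), each service gets its own namespace, and all of its manifests live there:

```yaml
# One namespace per service; that service's Deployment, Service,
# ConfigMaps and Secrets all go into its own namespace.
apiVersion: v1
kind: Namespace
metadata:
  name: gitlab
---
apiVersion: v1
kind: Namespace
metadata:
  name: redmine
---
apiVersion: v1
kind: Namespace
metadata:
  name: openldap
```

That way, editing or redeploying one service never touches the others.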

[–] null@slrpnk.net 2 points 11 months ago (1 children)

It works well for 2-3 services, but as the number of services grew they started to interfere with each other

Can you expand on that? I use docker-compose and have probably around 10 services on the same box. I don't foresee any limitations beyond hardware if I wanted to just keep adding more, but maybe I'm missing something.

[–] akash_rawal@lemmy.world 1 points 11 months ago* (last edited 11 months ago) (1 children)

If one service needs to connect to another, I have to add a shared network between them. In that case, the services essentially share a common DNS namespace. DNS resolution would routinely leak from one service to another and cause outages: e.g. if I connected the GitLab and Redmine containers to the OpenLDAP container, sometimes Redmine's nginx container would reach the GitLab container instead of the Redmine container, and the GitLab container would access Redmine's DB instead of its own.
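
Roughly, the setup looked like this (file paths, names and images are illustrative, not my exact files): two compose projects both attached to one shared external network so they could reach OpenLDAP, and because both projects used generic service names, those names became ambiguous on the shared network.

```yaml
# gitlab/docker-compose.yml (sketch)
services:
  web:
    image: gitlab/gitlab-ce:latest
    networks: [shared]
  db:
    image: postgres:16
    networks: [shared]
networks:
  shared:
    external: true
    name: ldap-net   # pre-created network where the OpenLDAP container also lives
```

```yaml
# redmine/docker-compose.yml (sketch)
services:
  web:               # same service name as in the GitLab stack...
    image: redmine:latest
    networks: [shared]
  db:                # ...so "web" and "db" resolve ambiguously on ldap-net
    image: postgres:16
    networks: [shared]
networks:
  shared:
    external: true
    name: ldap-net
```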

I maintained some workarounds: for example, starting GitLab after Redmine would work fine, but starting them the other way round would trigger the issue. Switching to Kubernetes and replacing the cross-service connections with network policies solved it for me.
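
For reference, the kind of policy I mean looks roughly like this (the namespaces, labels and LDAP port are assumptions for the example): each app lives in its own namespace, so names like "db" stay local to it, and cross-service traffic is only what's explicitly allowed.

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-ldap-clients
  namespace: openldap
spec:
  podSelector:
    matchLabels:
      app: openldap           # only the OpenLDAP pods accept this traffic
  policyTypes: [Ingress]
  ingress:
    - from:
        - namespaceSelector:
            matchLabels:
              kubernetes.io/metadata.name: gitlab
        - namespaceSelector:
            matchLabels:
              kubernetes.io/metadata.name: redmine
      ports:
        - protocol: TCP
          port: 389
```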

[–] FooBarrington@lemmy.world 1 points 11 months ago (1 children)

An easy fix for this is to create individual networks for connections. I.e. don't create one network with Gitlab, Redmine and OpenLDAP - do two, one with Gitlab and OpenLDAP, and one with Redmine and OpenLDAP.
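
Something along these lines (images and names are just for illustration, shown as one file for brevity; the same works with external networks across separate compose projects):

```yaml
services:
  openldap:
    image: osixia/openldap:1.5.0
    networks: [gitlab-ldap, redmine-ldap]
  gitlab:
    image: gitlab/gitlab-ce:latest
    networks: [gitlab-ldap]
  redmine:
    image: redmine:latest
    networks: [redmine-ldap]

networks:
  gitlab-ldap: {}
  redmine-ldap: {}
```

GitLab and Redmine can each reach OpenLDAP, but they never share a network with each other, so their DNS names can't collide.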

[–] akash_rawal@lemmy.world 1 points 11 months ago (1 children)

don't create one network with Gitlab, Redmine and OpenLDAP - do two, one with Gitlab and OpenLDAP, and one with Redmine and OpenLDAP.

This was the setup I had, but by now I'm already on Kubernetes with no intention of switching back.

[–] FooBarrington@lemmy.world 1 points 11 months ago

Very understandable :)