I have some questions about selfhosting
from AlmightyDoorman@kbin.earth to selfhosted@lemmy.world on 01 Jun 06:59
https://kbin.earth/m/selfhosted@lemmy.world/t/1412745
Hello, I want to get into the self-hosting game and have some questions about it. I hope this community can help me (I've already learned so much from reading posts here).
- I want to self-host a NAS and some home server applications like Immich, Home Assistant, Jellyfin, and parts of the *Arr stack. Would it be advisable to get a mini-PC and a separate NAS, or can I put all of this on the same appliance? I was thinking about getting a NAS from Terramaster and am not sure whether I need a separate mini-PC.
- If I go with the all-in-one solution, would TrueNAS Scale be enough? I saw some recommendations to get Proxmox and run TrueNAS Scale for the NAS management plus Proxmox containers for everything else.
- If I go with separate devices, how would I handle seeding? Since the mini-PC would run the *Arr suite with qBittorrent, could it download directly to the NAS and hardlink the files there, or would seeding require two copies, one on the mini-PC and another on the NAS?
I hope my questions are clear. I tried searching but did not find a satisfying answer, which is why I am asking here. Thanks in advance.
Usually, one device doing both compute and storage fits most people's use cases better. If you want multiple compute nodes, because you want to be able to reboot one for updates without taking down services, or you want to run real Kubernetes, then three compute nodes and one NAS for storage makes sense.
One of the issues with multiple devices is networking. Transferring totally legit files for the *Arr stack to and from the NAS can be a lot of data. Keeping it all in one system means those transfers run at SATA speeds instead of being capped by Ethernet (gigabit tops out around 110 MB/s in practice).
For the OP, one file with hardlinking is my goal, but I only use Usenet. I run anything that comes down through Tdarr to strip extra languages, normalize audio, and re-encode to H.265. If you do that with torrents, you will need to keep the original for seeding.
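In case it helps picture it, that transform boils down to something like this ffmpeg call (a rough sketch, not my actual Tdarr plugin chain; the filenames and CRF value are just examples):

```
# keep only English-tagged audio, re-encode video to H.265, normalize loudness
ffmpeg -i input.mkv \
  -map 0:v -map 0:a:m:language:eng \
  -c:v libx265 -crf 22 \
  -c:a aac -af loudnorm \
  output.mkv
```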
Meh, I only have gigabit and my content lives on an NFS share. It’s been fine for streaming and everything else.
Is it really that much data? I mean, I can't seed faster than my upload speed anyway, and that is a lot lower than 1 gigabit, so I don't quite see how the network would get saturated.
It really depends on your use case. I've gone through 9 TB of data in a month, and often have up to a dozen BR-quality movie requests at once, 35-65 GB each on average. If you're only doing one movie at a time and only doing torrent quality, you shouldn't have any issues.
Thank you very much, that helps a lot.
Feel free to ask questions if you have them. I am no expert, but I am willing to try to help if you get stuck.
It depends on what you want in terms of future-proofing (expansion capabilities). Home-user NASes usually come with a low-power CPU; a high-power CPU usually means an enterprise-grade NAS, which is really costly for a home user. Keeping compute separate makes a CPU upgrade easy, but now you have two boxes. If your Terramaster comes with a decent CPU, I don't see any problem.
TrueNAS Scale is a behemoth capable of almost anything. I would start with something leaner like OMV or Unraid. You really don't need its advanced enterprise features, and they will only add complexity to the setup.
If you use NFS to export folders from your NAS, the "computing box" will see them as local folders, so there is no need to keep two copies of the same file.
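A minimal sketch of what that looks like, assuming the NAS exports /srv/pool to a 192.168.1.0/24 LAN (names and paths are placeholders, adjust to your setup):

```
# on the NAS: /etc/exports
/srv/pool  192.168.1.0/24(rw,sync,no_subtree_check)

# on the computing box
sudo mount -t nfs nas.local:/srv/pool /mnt/pool
```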
Hope it helps
The problem with such advanced software is the overwhelming list of options and the lack of sane defaults.
Being faced with 10 different (and complex) ways to do something while you're still working out how to solve a problem isn't helpful; it adds more noise than anything else. And of course there are the minimum resources it needs ;)
For downloading, I suggest having the download folder and the main storage both under the same NFS export. Quite handy.
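Something like this layout (paths are just examples) is what makes it handy: both trees live on one filesystem, so the *Arrs can hardlink instead of copying:

```
/srv/pool/downloads/   # qBittorrent keeps seeding from here
/srv/pool/media/       # Sonarr/Radarr hardlink completed files into here
```

Split those across two exports or two disks and hardlinks stop working, so the apps fall back to copying.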
Ahh, I see your point. Okay, I will probably start with something easy then.
My suggestion is either to use one device with a decent enough GPU for transcoding to do everything, or a NAS purely for storage plus a separate PC with that GPU capability for Jellyfin/Plex + Immich. Home Assistant and the *Arrs aren't that resource-hungry. Also, go for Usenet instead of torrents. Good luck and happy sailing.
Yeah, I'm still debating Usenet vs. torrents, but torrents seem better for some old stuff. And I am a bit put off by the mention that some Usenet groups are kinda secretive and special and hard to get access to. I mean, it's kinda the same with private trackers, but at least they seem a bit more transparent about it.
You can just use both. I do.
Use both.
I use both, and I’ve found Usenet to be significantly better for old content. Not even close.
Ahh okay, nice to hear. I will think about using both.
That’s the best way to learn. Embed yourself in a community and passively learn. When you have enough of the vocabulary to ask an intelligent question, ask and let the community present solutions. Good job.
Others may disagree, but I think a sufficiently powerful NAS can absolutely handle automation backends and media servers. I know many people run such tools on Synology devices without issue (Synology, however, have become greedy assholes wrt requiring their own drives for compatibility) including me. I haven’t used Immich, but I see no reason that couldn’t run there as well. A dedicated mini-PC is overkill, but it would make things snappier if you’re flush with cash. I’m currently running an M2 Mac mini for my server needs and torrents because I can afford it and it can support 2000+ torrents at the same time without breaking a sweat.
I haven’t used TrueNAS and I’m unqualified to comment on this. I have run Proxmox, but not as a container. I’ll let others comment.
I haven’t used the Arr Suite. I just searched and I can’t imagine any reason why you couldn’t run the torrent client on a dedicated device and store data on the NAS. You don’t need to duplicate the files to do so. You might have to create automation of sorts to mount the NAS volume on the mini-PC at login or restore it if it gets interrupted, or you could just do so manually.
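If it helps, on a Linux mini-PC one common way to handle that "automation of sorts" is a systemd automount entry in /etc/fstab, roughly like this (the hostname and paths are placeholders):

```
# /etc/fstab on the mini-PC: mount the NAS share on first access, don't block boot if it's down
nas.local:/srv/pool  /mnt/nas  nfs  _netdev,nofail,x-systemd.automount,x-systemd.idle-timeout=600  0  0
```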
Welcome to selfhosting. It’s a fun hobby.
I would recommend DIYing your own NAS/server, if possible. How much storage are you needing? What’s your budget?
I will use the server for photos, videos, backups from my PC via Borg, and as a media server, so I wanted to start with approximately 6 TB. The budget without the hard drives is under 500€; I'm currently looking at a refurbished OptiPlex 3060 with an Intel i3-8100T as a mini-PC for server usage and then some used NAS system from Terramaster. Do you mean building a multi-bay NAS myself? I looked into it and it didn't seem much cheaper compared to buying used/refurbished.
Okay then I will recommend the new Beelink ME mini
www.bee-link.com/products/beelink-me-mini-n150
And also the “pocket NAS” from CWWK.net
Those are fine but will be less power efficient than the mobile CPUs.
It definitely is. Also, most NASes have a very weak CPU, because a "NAS" really doesn't need much of one.
Like you, I lurked the self-hosting communities until I made the dive myself. I bought a used HP EliteDesk 800 G3 Mini, not a particularly powerful machine, and installed Ubuntu Server. I started playing around with docker compose setups for various services and eventually committed to running Immich, qBittorrent, and Plex on it, along with hosting some dedicated servers for various games depending on what I felt like playing at the time. It all worked easily enough and I figured out things as I went along, such as domain names (DDNS), security hardening, and reverse proxies.
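Those compose setups were nothing fancy. A stripped-down sketch of the qBittorrent one looks roughly like this (the image tag, paths, and timezone are examples, not my exact file):

```
services:
  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/Berlin
    volumes:
      - ./qbittorrent/config:/config
      - /mnt/storage:/data        # same path the *Arr containers should mount
    ports:
      - "8080:8080"               # web UI
    restart: unless-stopped
```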
I picked it up around 100 euro, got a secondhand switch so I could have both my PC and it on the same line from the house router to my office.
I have two of them now so I can split game servers onto their own machine to save resources, and recently also picked up a 10 TB Seagate Expansion drive to use for media storage on the original one. Still lots to learn, but that's the fun!
Did you connect the expansion drive via USB, or how are you doing that? AFAIK the G3 only supports 2.5" HDDs, which aren't available at 10 TB, are they?
Via USB, yep! The Seagate Expansion is a 3.5" HDD, I believe. I spent a lot of time worrying about read/write speed and such, but in practice there are no issues streaming media from it. Even fully remote from my network there is no stuttering or other issues.
Hardware-wise I'd go AIO. A mini and a pair of mirrored USB drives is my setup. I have an off-site backup running: another mini + USB. Finally, I have an inherited laptop as a redundant network box/local backup/Immich compute. I have 5 households on my network, and aside from Immich spiking in resources (hence the laptop), I have overhead to spare.
An N100 mini (or N150, N200, whatever) is cheap enough and powerful enough for you to jump in and decide later whether you want to spend more. They're small, quiet, reasonable value for money, an easy Wife Acceptance Factor, and can age into a bunch of other devices if you decide self-hosting isn't for you. I'd make a retro console out of any spare mini.
This way, when spending £x00s on a server, you'll have some idea of what you actually need/want. The N100 can then age into a firewall/network box/local backup/etc. if/when you upgrade.
All that said, an AIO storage-compute box is where I'm headed. I now know I need a dedicated graphics card for the Immich demand. I now know I want a graphics card for generative AI hobby stuff. I know how much storage I need for everyone's photos and favorite entertainment, the permanent stuff. I know how much storage I need for stuff being churned, the temporary stuff. I now know I don't care about high availability/clusters. I now know… Finally, the 'Wife' has grown used to having a server in the house: it's a thing I have, and do, which she benefits from. So a bigger, more expensive, and probably louder box is an easier sell.
I am currently moving from a Synology DS920+ to using my Mac Mini M4 + DAS with hardware RAID. The Synology is great, but I want to get away from the proprietary RAID that it’s using. I was running Plex on it, as well as all my *Arr’s etc via docker, but Plex and all of these services on the Mac Mini run like greased lightning compared to the Synology. Sonarr/Radarr load instantly versus taking a minute to load the library on the Synology.
I think doing it this way, rather than an all-in-one device, is easier to maintain and upgrade. Run all the services in Docker so you can move them to any device you might upgrade or change to, and just have a big RAID array of drives. Plus, this way I can use Backblaze Personal and back up the entire 50 TB for like $99 a year :|
I was also thinking about getting a DAS, but found several opinions recommending against it because hardware RAID is more expensive and software RAID with a DAS is supposed to be a PITA.
I virtualized my NAS for the memes, but honestly you don't even need to do that. Just run Proxmox and use ZFS, then build whatever LXCs/VMs you want on top of it.
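A rough sketch of that, assuming two spare disks and example pool/dataset names (check your actual device IDs with `ls /dev/disk/by-id`):

```
# on the Proxmox host: mirrored ZFS pool plus a dataset for media
zpool create -o ashift=12 tank mirror /dev/disk/by-id/ata-DISK1 /dev/disk/by-id/ata-DISK2
zfs create -o compression=lz4 tank/media
# then bind-mount /tank/media into an LXC, or share it over NFS/SMB from a container
```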
Is that viable? I read that doing it with Proxmox can be problematic because it doesn't offer great NAS and RAID options out of the box, making it harder to maintain.
ZFS is objectively the best disk pool solution, where did you read that?
OpenMediaVault and CasaOS are good noob-friendly OSes for NAS purposes. Much faster and simpler to get running than some Proxmox or TrueNAS overkill solution.
You can also just install whatever Linux OS you like, plug in a screen, keyboard, and mouse, and do your setup that way, like any other computer!
For hardware, I recommend building a computer out of standard parts. For what you described, a small motherboard with an integrated Intel N100 CPU and a nice-looking case like the Jonsbo N2 will serve you well. This is very close to my current setup, using an older J5040 CPU, and it runs everything just fine with no effort (Jellyfin with light transcoding, *Arr stack, Usenet and torrent clients, Syncthing, SMB and NFS file sharing, and more).