• 3 Posts
  • 20 Comments
Joined 2 years ago
Cake day: June 20th, 2023


  • Interesting idea, but I feel anyone who wants this already has the tools available.
    Personally, I use Syncthing to synchronize a Media folder across my desktops, phones and tablets.
    Media/Music contains my active music collection, mostly Ogg conversions of the source FLAC files. I use .m3u/.m3u8 files as good old playlists, saved to the Media/Music root folder with relative paths. That lets players like AIMP on Windows play and edit those playlists, and players like GoneMAD on Android play them without any kind of active internet connection.
    There’s also Media/Audiobooks, Media/Comics, Media/Movies, etc… Yes, they’re subsets of the full collections on my NAS, but I’ve never seen that as a disadvantage.
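
    To illustrate the relative-path trick, here's a minimal sketch (all filenames and folder names are made up): a playlist saved at the Media/Music root that references tracks relative to that root, so every synced device resolves them from its own copy of the folder.

```shell
# Hypothetical example: a playlist at the Media/Music root, referencing
# tracks by relative path so any player resolves them from its own mount.
mkdir -p "Media/Music/Albums/Example Artist"
printf '%s\n' \
  '#EXTM3U' \
  'Albums/Example Artist/01 - Opening Track.ogg' \
  'Albums/Example Artist/02 - Second Track.ogg' \
  > Media/Music/roadtrip.m3u8
```

    Because the paths contain no drive letter or absolute prefix, the same file works on Windows, Linux, and Android alike.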


  • I would suggest looking at Syncthing. It’s not perfect by any stretch, but it works peer-to-peer, without any kind of central host, IP, or domain-name requirements. You simply install it on the client machines, and they work out how to talk to each other over any available networks.

    Beware changing the casing of your files or directories, though: Syncthing is entirely case-sensitive, which does not play nicely with Windows and its case-insensitive filesystems.

    One very nice feature is the Android client (https://f-droid.org/en/packages/com.github.catfriend1.syncthingandroid/), which supports full background syncing to the phone’s local storage. Great for syncing your photos, but also music: add some MP3s on your desktop computer, and by the time you’ve put on your jacket they’re on your phone, ready to listen to without any ‘service’ getting in your way.

    A more advanced tip: get a VPS somewhere in the cloud with cheap storage, and have Syncthing on it listen on port 443. That allows syncing from more restrictive corporate networks, which often block connections to Syncthing’s default port 22000, and it gives you a ‘cloud backup’ of your important files in one go.
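
    For the 443 trick, the change is a one-liner in Syncthing's configuration; a sketch of the relevant fragment, assuming the default config location (`~/.config/syncthing/config.xml` on Linux; check the Syncthing docs for your platform):

```xml
<!-- Sketch: keep the default listeners and add one on 443 so the VPS
     is reachable from networks that only allow HTTPS-looking traffic. -->
<options>
    <listenAddress>default</listenAddress>
    <listenAddress>tcp://0.0.0.0:443</listenAddress>
</options>
```

    Nothing else on the VPS should be bound to 443, of course, or Syncthing will fail to start its listener.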






  • If you’re already going to the trouble of setting up ZFS for the two NVMe disks, I would suggest setting up a separate pool on the HDD as well. It will save you from monitoring two different filesystem types and give you all the ZFS features: checksumming, compression, snapshots, etc… Do make sure your server has a decent chunk of memory though, as your VMs will be fighting the ARC for RAM…
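
    As a sketch of both halves of that advice (device name and pool name are placeholders, and the 4 GiB ARC cap is just an example; size it to your own RAM budget):

```shell
# Sketch, assuming the HDD shows up as /dev/sdX (adjust to the real device).
zpool create -o ashift=12 tank /dev/sdX
zfs set compression=lz4 tank    # cheap, and usually a net win on HDDs
zfs set atime=off tank

# Cap the ARC (here at 4 GiB = 4294967296 bytes) so it leaves RAM for the VMs.
echo 4294967296 > /sys/module/zfs/parameters/zfs_arc_max
```

    To make the ARC cap survive reboots, the same value goes in /etc/modprobe.d/zfs.conf as `options zfs zfs_arc_max=4294967296`.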


  • A Proxmox root device uses barely any space; mine usually sit around 12-14 GB used. Writes are also negligible, so DRAM-less SSDs are not a problem. I would suggest installing Debian first, so you can properly partition your root device before installing Proxmox (https://pve.proxmox.com/wiki/Install_Proxmox_VE_on_Debian_12_Bookworm). You’ll have much more control over your disks and networking than with Proxmox’s own installer. Start with a 64 GB root partition, and leave the rest of the drive empty for future use/SLC cache.
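
    A rough sketch of what that partitioning could look like, assuming a UEFI system and a single NVMe drive at /dev/nvme0n1 (sizes are suggestions, not gospel; the Debian installer’s manual partitioner achieves the same thing interactively):

```shell
# Sketch: 64 GB root, rest of the drive left untouched.
sgdisk --zap-all /dev/nvme0n1
sgdisk -n1:0:+1G  -t1:ef00 /dev/nvme0n1   # EFI system partition
sgdisk -n2:0:+64G -t2:8300 /dev/nvme0n1   # 64 GB root for Debian/Proxmox
# Leave the remainder unallocated for future use / SSD overprovisioning.
```

    Leaving the tail of the drive unpartitioned also gives the SSD controller more spare area to work with, which helps exactly the cheap DRAM-less models mentioned above.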

    Unless your VMs are somehow high-volume data writers, like a proof-of-space coin miner, I wouldn’t worry about it. Homelab setups rarely come anywhere near the write-endurance limits of modern SSDs.

    Your VMs are not going to write to the root device, so it won’t matter.

    You won’t notice the performance difference between filesystems on a rotating hard disk. Look for other useful features instead, like at-rest encryption and checksumming for bitrot protection.

    I would use a filesystem with checksumming rather than relying on any point-in-time check to monitor HDDs. Assume they will all fail eventually, because they will.
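
    With ZFS, for example, that continuous verification is a scheduled scrub (pool name is a placeholder):

```shell
# Sketch: a periodic scrub re-reads every block and verifies its checksum,
# surfacing bitrot long before a dying disk announces itself.
zpool scrub tank
zpool status tank    # shows scrub progress and any checksum errors found
```

    Most distros ship a monthly scrub timer out of the box; checking `zpool status` occasionally is all the monitoring that’s left.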


  • Ferawyn@lemmy.world to Selfhosted@lemmy.world · *Permanently Deleted* · edited 4 months ago
    Security and bugfixes, after one or two rounds of testing by early adopters/key users. Preferably through some form of automatic updates.

    New features and breaking changes, or anything that requires the end user to pay attention: no more than four times a year, I’d say, and via a non-automatic form of update. The hard part is getting the user’s attention on the changes, so they don’t just click Next and end up with a broken or insecure installation.
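
    On a Debian-based server, the first half of that policy maps neatly onto unattended-upgrades; a minimal sketch of the apt configuration, restricted to the security origin (the exact origin string varies per release, so compare against the shipped /etc/apt/apt.conf.d/50unattended-upgrades):

```
Unattended-Upgrade::Origins-Pattern {
        "origin=Debian,codename=${distro_codename}-security,label=Debian-Security";
};
Unattended-Upgrade::Automatic-Reboot "false";
```

    Security fixes then flow in automatically, while feature upgrades wait for an attended `apt full-upgrade`.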


  • Have a look at https://forwardemail.net/. It’s a service that handles accepting (and optionally sending) email on your domain and forwarding any received mail to other backend services, like a Gmail account. All you need to do is set some DNS records, like MX, and their servers handle everything. It works fine with domains hosted on Cloudflare, and has excellent how-tos to get everything set up and running.

    Edit: The great thing about this service, imho, is their guides. They don’t just have a static how-to; they template your information into the exact string you need to copy/paste into your DNS provider’s web interface. Want to encrypt your plaintext TXT records? There’s a button for that in the guide. Want to get around an ISP’s port 25 block? They have a guide for that. Want to set up proper Send-As from Gmail using their SMTP server? There’s a guide for that. :-)
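
    For a rough idea of the DNS shape involved (a sketch only: yourdomain.example is a placeholder, and the MX hostnames and TXT format should be taken from forwardemail.net’s own guide, which templates in the current values for you):

```
; zone fragment for yourdomain.example
@    MX   10   mx1.forwardemail.net.
@    MX   10   mx2.forwardemail.net.
@    TXT       "forward-email=you@gmail.com"
```

    That’s the whole integration: the MX records route inbound mail to their servers, and the TXT record tells them where to forward it.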



  • I have always preferred TightVNC over the various other VNC flavours. It does only one thing, but does it well, with minimal setup and network requirements.

    I have tried RustDesk recently, and the performance, when it worked, was nice. But I found it too complex to set up across more than a few machines, and ultimately unreliable: connections failing without any useful error message, an unresponsive relay, weird certificate errors, etc… It needs a couple more years to mature.

    I would suggest looking into WireGuard to wire your various networks and computers together. It works very well on most platforms. You can easily give laptops a road-warrior connection so they always phone home; then it doesn’t matter where they are.
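
    A road-warrior client config is pleasantly short; a sketch of /etc/wireguard/wg0.conf on the laptop (keys, addresses, and endpoint are all placeholders):

```
[Interface]
PrivateKey = <laptop-private-key>
Address = 10.10.0.2/32

[Peer]
PublicKey = <home-server-public-key>
Endpoint = vpn.example.com:51820
AllowedIPs = 10.10.0.0/24, 192.168.1.0/24
# Keeps NAT mappings open so the laptop really does always phone home:
PersistentKeepalive = 25
```

    Bring it up with `wg-quick up wg0`, and the VNC target is then always reachable at its tunnel address, no matter which network the laptop sits on.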



  • Various different ways for various different types of files.

    Anything important is shared between my desktop PCs, servers, and phone through Syncthing. Those Syncthing folders are also shared with two separate servers (in two separate locations) with hourly, daily, weekly, and monthly volume snapshotting. Think of your financial administration, work files, anything you produce or write, your main music collection, etc… It’s also a great way to keep your music in sync between your desktop PC and your phone.

    Servers have their configuration files (/etc, /var/log, /root, etc.) rsynced every 15 minutes to the same two backup servers, also onto snapshotted volumes. That way, should any one server burn down, I can rebuild it in a trivial amount of time. The same goes for user profiles, document directories, ProgramData, and anything non-synced on Windows PCs.
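
    A sketch of what such a 15-minute job might look like (hostnames and destination paths are placeholders; `-a` preserves permissions and timestamps, `-R` keeps the full source paths on the destination):

```shell
# Push the same trees to both backup servers; snapshots on the far end
# provide the history, so --delete keeps the live copy an exact mirror.
rsync -aR --delete /etc /var/log /root backup1.example.com:/backups/$(hostname)/
rsync -aR --delete /etc /var/log /root backup2.example.com:/backups/$(hostname)/
```

    Driven from cron, e.g. a `*/15 * * * *` entry, with SSH key authentication so it runs unattended.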

    Specific data sets, like database backups, repositories and such are also generally rsynced regularly, some to snapshotted volumes, some to regulars, depending on the size and volatility of the data.

    Bigger file shares, like movies, TV shows, etc… I don’t back up, but they’re stored on a distributed GlusterFS, so losing any one server doesn’t lose me everything just yet.
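
    For reference, a plain distributed Gluster volume (files spread across bricks rather than replicated, matching the trade-off above) is created roughly like this; hostnames and brick paths are placeholders:

```shell
# Sketch: each file lives on exactly one brick, so a dead server costs
# only the files that happened to land there.
gluster volume create media server1:/data/brick1 server2:/data/brick1
gluster volume start media
mount -t glusterfs server1:/media /mnt/media
```

    Adding `replica 2` to the create command would instead mirror every file across both servers, trading capacity for real redundancy.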

    Hardware will fail, sooner or later. You should see any one device as essentially disposable, and have anything of worth synced and archived automatically.