• 1 Post
  • 58 Comments
Joined 1 year ago
Cake day: July 22nd, 2023

  • I used to write to DVDs, but the failure rate was astronomical - like 50% after 5 years, some with physical separation of the silvering. Plus, by today's standards they're so small they're not worth using.

    I've gone through many iterations and currently my home setup is this:

    • I have several systems that make daily backups from various computers and save them onto a hard drive inside one of my servers.
    • That server has an external hard drive attached to it, powered through a wifi plug controlled by Home Assistant.
    • Once a month, a scheduled task wakes up that external HDD and copies the contents of the online backup directory onto it. It then turns it off again and emails me "Oi, minion. Backups complete, swap them out" (a rough sketch of that job is below, after this list). That takes five minutes.
    • Then I take the USB disk and put it in my safe, removing the oldest of the three (the classic grandfather-father-son rotation) from there and putting that back on the server for next time.
    • Once a year, I turn the oldest HDD into an "Annual backup", replacing it with a new one. That stops the disks expiring from old age at the same time, and annual backups aren't usually that valuable.
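
    A rough sketch of what that monthly job could look like - the webhook IDs, mount point and addresses here are illustrative, not my real ones:

    #!/bin/bash
    # Monthly cold-backup copy (sketch). Assumes Home Assistant webhooks toggle the wifi plug.
    set -e
    curl -s -X POST "http://homeassistant.local:8123/api/webhook/backup-plug-on"
    sleep 60                                             # give the drive time to power up
    mount /mnt/coldbackup
    rsync -a --delete /srv/backups/ /mnt/coldbackup/     # mirror the online backup directory
    umount /mnt/coldbackup
    curl -s -X POST "http://homeassistant.local:8123/api/webhook/backup-plug-off"
    echo "Oi, minion. Backups complete, swap them out" | mail -s "Monthly backup done" me@example.com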

    Having the HDDs in the safe means that total failure/ransomware costs me, at most, a month's worth of data. I can survive that. The safe is also fireproof and in a different building to the server.

    This sort of thing doesn't need high capacity HDDs either - USB drives and microSD cards are very capable now. If you're limited on physical space and don't mind slower write times (which is generally fine when it's automated), microSDs with clear labelling are just as good. You're not going to kill them through excessive writes for decades.

    I also have a bunch of other stuff that is not critical - media files, music. None of it is unique and it can all be replaced. All of it is backed up to a secondary "live" directory on the same PC - mostly in case of my own incompetence in deleting something I actually wanted. But none of it is essential - I think it's important to be clear about what you "must save" and what is "nice to save".

    The key thing is to sit back and work out a system that is right for you. And it always, ALWAYS should be as automated as you can make it - humans are lazy sods and easily justify not doing stuff. Computers are great at remembering to do repetitive tasks, so use that.

    Include checks to ensure the backed up data is both what you expected it to be, and recoverable - so include a calendar reminder to actually /read/ from a backup drive once or twice a year.
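
    If you want to automate part of that check too, a minimal sketch is to checksum the live backups and verify the offline copy against the list (paths are illustrative):

    # Build a checksum list from the source, then verify the offline copy against it
    cd /srv/backups && find . -type f -exec sha256sum {} + > /tmp/backup.sha256
    cd /mnt/coldbackup && sha256sum --quiet -c /tmp/backup.sha256 \
        || echo "Backup verification FAILED" | mail -s "Backup check failed" me@example.com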




  • I'm inclined to give Linux more benefit of the doubt than, say, Windows. That's because of the motives behind it.

    Microsoft have a very long history of making design choices in their software that users don't like, and quite often that's because it suits their interests more than their customers'. They are a commercial business that exists to benefit itself, after all. Same with Apple. Money spoils everything pure. You mention privacy, but that's just one more example of someone wanting to benefit financially from you - it's just in a less transparent and more open-ended way than paying them some cash.

    Linux, because that monetary incentive is far less, is usually designed simply "to be better". The developers are often primary users of the software. Sure - sometimes developers make choices that confuse users, but that over-arching driving business interest just isn't there.



  • Perl is already installed on most Linux machines and, unless you start delving into module usage, you won't need to install anything else.

    Python is more fashionable, but needs installing on the host and environments can get complicated. I don't think it scales as well as Perl, if that's a concern of yours.





  • I sympathise with your Dad - everyone's had updates go bad, and it's easy to fall back on the "don't fix what ain't broke" mantra. But doing so is being wilfully ignorant of basic computer security. And to be fair, Debian stable is one of the least troublesome things to just let automatically update.

    Debian and Ubuntu have the unattended-upgrades package which is designed to take a lot of the sting out of automatic updating. I'd recommend setting that up and you won't have to touch it again.
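
    For reference, getting it going on Debian/Ubuntu is roughly this (the shipped defaults are sensible):

    apt-get install -y unattended-upgrades
    dpkg-reconfigure -plow unattended-upgrades     # writes /etc/apt/apt.conf.d/20auto-upgrades
    unattended-upgrade --dry-run --debug           # show what it would do, without doing it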

    There's also the crontab way - "apt-get update && apt-get upgrade" at frequencies that suit you. (A check for reboot afterwards is a good idea).
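
    Roughly like this, with the reboot check bolted on (Debian/Ubuntu packages touch /var/run/reboot-required when they want one):

    0 4 * * SUN apt-get update && apt-get -y upgrade && [ -f /var/run/reboot-required ] && reboot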



  • robots.txt does not work. I don't think it ever has - it's an honour system with no penalty for ignoring it.

    I have a few low traffic sites hosted at home, and when a crawler takes an interest it can totally flood my connection. I'm using Cloudflare and being incredibly aggressive with my filtering, but so many bots are ignoring robots.txt, as well as lying about who they are with humanesque UAs, that it's having a real impact on my ability to serve the sites to humans.

    Over the past year it's got around ten times worse. I woke up this morning to find my connection at a crawl and on checking the logs, AmazonBot has been hitting one site 12000 times an hour, and that's one of the more well-behaved bots. But there's thousands and thousands of them.




  • How it's set up depends on your business needs. We have a few hundred machines, and how they're set up and managed is defined by a dozen or so groups. Base image to deploy, then ansible and config management to set up the roles.

    Users are generally authorised via AD using sssd. Some are in very specific groups which get normal user access and occasionally sudo privs for specific commands. Access is via SSH, RDP or physically at the machine.
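
    As an illustration of the "sudo privs for specific commands" part, a sudoers drop-in for one of those groups might look like this - the group and command are invented for the example:

    # Members of the AD group 'web-admins' may restart the web service and nothing else
    cat <<'EOF' > /etc/sudoers.d/web-admins
    %web-admins ALL=(root) NOPASSWD: /usr/bin/systemctl restart nginx
    EOF
    visudo -cf /etc/sudoers.d/web-admins     # always syntax-check sudoers drop-ins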

    Our sysadmins have local users with root privs, but most administration is done at scale using ansible or Uyuni.

    Like everything, least privilege is the best way. AD allows us to quickly control access if someone leaves or is compromised, but it could equally be done with any central LDAP system and groups.


  • digdilem@lemmy.ml to Linux@lemmy.ml · How to stagger automated upgrade? · edited · 2 months ago

    Small number of machines?

    Disable unattended-upgrades and use crontab to schedule this on the days of the week you want.

    Eg, Monday each week at 4am - pretty much every combination of dates and days is possible with crontab. 2nd Tuesday of the month? No problem (there's an example of that below).

    0 4 * * MON apt-get update && apt-get -y upgrade && reboot
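
    The 2nd-Tuesday case needs one small trick because cron's day-of-month and day-of-week fields are OR'd rather than AND'd: restrict the day-of-month range and test the weekday in the command (the % has to be escaped in a crontab):

    0 4 8-14 * * [ "$(date +\%u)" -eq 2 ] && apt-get update && apt-get -y upgrade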

    (You can also be more subtle by calling a script that does the above, and also does things like check whether a reboot is needed first)
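
    Something like this, as a sketch of that kind of script (assumes the Debian/Ubuntu reboot-required mechanism):

    #!/bin/bash
    # Cron-driven patching: update, upgrade non-interactively, reboot only if a package asks.
    set -e
    export DEBIAN_FRONTEND=noninteractive
    apt-get update
    apt-get -y -o Dpkg::Options::="--force-confold" upgrade
    if [ -f /var/run/reboot-required ]; then
        logger "patching: upgrade complete, reboot required - rebooting"
        /sbin/reboot
    else
        logger "patching: upgrade complete, no reboot needed"
    fi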

    Dozens, hundreds or thousands of machines? Use a scheduling automation system like Uyuni. That way you can put machines into System Groups and set a patching schedule per group. You can also define sets of machines, either ad-hoc or with System Groups, to do emergency patching - like that day's openssh critical vuln - by sending a remote command like the above to a batch at a time.

    All of that is pretty normal SME/Enterprise sysadminning, so there are some good tools. I like Uyuni, but others have their preferences.

    However - Crowdstrike on Linux operates much like CS on Windows - they will push out updates, and you have little or no control over when or what. They aren't unique in this - pretty much every AV needs to be able to push updates to clients when new malware is detected. But! In the case of Crowdstrike breaking EL 9.4 a few months ago, when it took exception to a new kernel and machines refused to boot, then yes, scheduled group patching would have minimised the damage. It did for us, but we only have CS installed on a handful of Linux machines.


  • digdilem@lemmy.ml to Linux@lemmy.ml · What's on your personal server? · edited · 2 months ago

    • HomeAssistant and a bunch of scripts and helpers.
    • A number of websites, some that I agreed to host for someone who was dying.
    • Jellyfin and a bunch of media
    • A lot of docker containers (Adguard, *arrs)
    • Zoneminder
    • Some routing and failover to provide this between my main server and a much smaller secondary (keepalived, haproxy, some of the docker containers)
    • Some development environments for my own stuff.
    • A personal diary that I wrote, which has kept track of personal stats for 15 years
    • Backup server for a couple of laptops and a desktop (plus automated backup archiving)

    Main server is an ML110 G9 running Debian. 48GB RAM. 2x 256GB SSD in RAID 1 as root. 4TB backup drive. 4TB CCTV drive. 4x 4TB RAID 10 data array. (Putting CCTV and backups on separate drives lowers overall iowait a lot.) 2nd server is a baby ThinkCentre: 2GB RAM, 1x 128GB SSD.

    Edit: Also Traccar, tracking family phones. Really nice bit of software, and entirely free and private. Replaced Life360, who have a dubious privacy history.

    Edit2: Syncthing - a recent addition to replace GDrive. Bunch of files shared between various desktops/laptops and phones.


  • (This is as much an answer to some of the comments already raised, as to the article - which like most such personal pieces has pros and cons.)

    As part of a previous job I used to host email for a small business - this was about 15 years ago. I ended up spending several hours to a day a week working on it: apologising to users, tracing and diagnosing missing sent email, and the endless, ENDLESS arms race against incoming spam (phishing was much less of a problem then). The trust from the company in our email operation was very poor and you'd regularly hear someone apologising to a customer because we hadn't contacted them, or answered their email. The truth is a lot was going astray, and staff were relying more on the phone than email because they knew it worked. You might guess from this that I'm terrible at running an email system, but I don't think I am. I started moving email around back in the late 80s when Fidonet was the thing, so I have some miles travelled. Tools have improved a bit since then, but so have those used by the bad guys.

    I still consider that one of the best things I did for that company was moving our email onto Gmail Business (which was free for us as a charity). Every single one of those problems went away immediately, and suddenly I had a lot more time to do more important stuff. I would never self-host email again, despite running several personal servers.

    Plenty of people say they self-host just fine, and great for you if that's so. But the truth is you won't always know if your outbound mail silently gets dropped, and you have a far higher chance of it arriving if it comes from a reputable source. There are a huge number of variables outside of your control (your ISP, your country, your region, your software - even the latency of your MX or DKIM responses factors into your reputation).

    You take the decision on whether the perceived privacy risks of using a third party outweigh the deliverability and filtering issues of self-hosting, but please don't say it's simple or reliable for everyone. If it's simple for you, you're either incredibly lucky or just not appreciating the problem.