Permanently Deleted

  • restic, and there are a bunch of third-party utilities to help with things, including multiple GUIs. Restic has built-in support for several cloud storage providers (including the most excellent Backblaze), and encrypts data so you can feel safe using them.

    What I like most about restic is that you can mount your backups and browse them like a filesystem; it lets you easily pull out a single file from a backup, or view different versions of one, without having to remember a path or do a full restore.
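
    A hedged sketch of that workflow (the repository path, mountpoint, and file names here are hypothetical, and `restic mount` requires FUSE):

    ```shell
    # Mount the repository as a browsable filesystem of snapshots:
    restic -r /srv/restic-repo mount /mnt/restic
    # ...then browse, e.g. /mnt/restic/snapshots/latest/home/you/notes.txt

    # Or pull a single file out of the latest snapshot without mounting:
    restic -r /srv/restic-repo restore latest --target /tmp/restored \
      --include /home/you/notes.txt
    ```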

  • TGHOST-V0@lemmy.ml
    hexbear
    11
    8 months ago

    Do you know rsync? If so, why isn't it good enough?

    It can even preserve UUIDs, file permissions, owners, that kind of thing.
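
    For instance, a minimal archive-mode sync (throwaway demo paths; `-a` is what preserves permissions, owners, timestamps, and symlinks):

    ```shell
    # Throwaway demo: -a (archive) keeps permissions, owners, and timestamps.
    src=$(mktemp -d); dst=$(mktemp -d)
    echo "hello" > "$src/notes.txt"
    chmod 640 "$src/notes.txt"
    rsync -av "$src/" "$dst/"       # trailing slash: copy the contents of src
    stat -c '%a' "$dst/notes.txt"   # prints 640
    ```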

      • TGHOST-V0@lemmy.ml
        1
        8 months ago

        After a quick read, these two links can help you get started, I guess:

        https://linuxconfig.org/rsync-command-examples
        https://devhints.io/rsync

      • TGHOST-V0@lemmy.ml
        1
        edit-2
        8 months ago

        For snapshots, depending on your filesystem, the built-in functionality for managing them can be a better option.

        I agree with dan's comment:
        Use rsync first to sync your home directory, or just your wallpapers.
        You will figure out how it works and which options are the right ones.

        Then write a script or a crontab entry for yourself, depending on what you want to save, how, and how often.

        Rsync can restore files and delete files. Basically it's a file manager XD
        For /etc/, for example, you may want to replace a file but keep a copy of the old one when the names collide. Not for your pictures folder, though.

        Learn it: read the rsync docs, use it on low-importance files, and you will work out what fits your needs :).

        Of course you can ask for specific help;
        I just don't want to tell you how to do something if I'm not sure it's right for your needs.

        Edit: maybe check out "tar" too.
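
        The replace-but-keep-a-copy behaviour described above maps to rsync's `--backup` flag, and the deleting behaviour to `--delete`. A throwaway runnable sketch:

        ```shell
        # --backup keeps the overwritten version under a suffix;
        # --delete removes destination files no longer in the source.
        src=$(mktemp -d); dst=$(mktemp -d)
        echo "new" > "$src/app.conf"
        echo "old" > "$dst/app.conf"
        rsync -a --delete --backup --suffix=.bak "$src/" "$dst/"
        cat "$dst/app.conf"        # new
        cat "$dst/app.conf.bak"    # old, kept because the names collided
        ```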

  • Maoo [none/use name]
    9
    8 months ago

    If it's a desktop/laptop, I recommend Pika, which is just a nice frontend and scheduler for borg backup. If it's a server, I recommend borgmatic.

    The nice thing about borg is that it does all of the things people usually want from backups but that are kind of frustrating to do with scripts:

    • Encryption, so backups are private and can be uploaded to cloud storage safely.
    • Compression, so they aren't too big.
    • Deduplicated snapshots, so they don't take up too much space.
    • Snapshots happen on a schedule.
    • A retention policy for how many snapshots to keep and at what interval (1 snapshot per year for the last 4 years and 1 per month for the last 12 months, for example).
    • Browsing through old snapshots to retrieve files.
    • Restoring from a snapshot.
    • Ignoring certain files, directories, and patterns.

    It is surprisingly difficult to get all of that in one solution, but the Borg-based tools do all of the above.
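
    As a hedged sketch of how those pieces look with plain borg (the repository path, archive name, and exclude pattern are hypothetical; Pika and borgmatic run the equivalent of these for you on a schedule):

    ```shell
    export BORG_REPO=/mnt/backup/main.borg
    borg init --encryption=repokey "$BORG_REPO"    # one-time setup, encrypted repo
    borg create --compression zstd ::'home-{now}' /home --exclude '*/.cache'
    borg prune --keep-monthly 12 --keep-yearly 4   # the retention example above
    borg mount ::home-2024-01-01 /mnt/borg         # browse an old snapshot
    ```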

  • @Kongar@lemmy.dbzer0.com
    3
    8 months ago

    I like separating backups and snapshots, as Timeshift recommends. Backups are better handled by a different process that copies your files to a remote location (PC failure, house fire, etc.). And backups are personal, so you've got to do what works for you; whatever makes them happen is good enough in my opinion ;)

    My setup (not perfect, but it works for me). I keep one snapshot only - but it is the entire drive including the home folder. It’s really close to a disk image minus the mount folders. This is done to a second local disk via rsync. The arch wiki entry on rsync has the full rsync command for this operation called out. I run this right before a system update.
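
    From memory, the Arch wiki command is roughly `rsync -aAXHv` with excludes for /dev, /proc, /sys, /tmp, /run, /mnt, /media, and /lost+found (check the wiki for the exact form). Here is a scaled-down, safely runnable illustration of the exclude mechanics on throwaway paths:

    ```shell
    # Miniature demo: clone a tree while excluding a pseudo-filesystem
    # directory's contents (the real command adds -AXH for ACLs,
    # extended attributes, and hard links).
    root=$(mktemp -d); snap=$(mktemp -d)
    mkdir -p "$root/etc" "$root/proc"
    echo "cfg" > "$root/etc/fstab"
    echo "junk" > "$root/proc/ignore-me"
    rsync -a --exclude='/proc/*' "$root/" "$snap/"
    ls "$snap/proc"    # empty: contents excluded, directory kept
    ```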

    Backups go to my NAS. Synology in my case. They have a cloud software package like iCloud, OneDrive, etc, except I run it on the NAS and I’m only limited on storage by what drives I throw into it. That software scoops up my user folders on all my PCs and I set it to keep the 10 latest versions.

    Then since my NAS is inside my house, I back the entire NAS up to an external hdd and sneaker net it to work and keep it in my office drawer. This protects me from fires and whatnot. I do this monthly. This is a completely manual process.

    Some people have accused me of insanity, but it's really not that hard. I don't worry about losing pictures of my kids, and it's aged well with my family (for example, my daughter doesn't worry about losing stuff while she's in college; if she writes a paper, 10 copies are kept here at home on the NAS automatically). And none of it was hard to set up, maybe just a bit pricey for the NAS (but it's got a lot of other super useful things going for it).

    So ya, I’d recommend letting timeshift do its thing for snapshots, and I’d rethink what you’re trying to do for backups. I strongly believe they are two different things.

  • Dataprolet@lemmy.dbzer0.com
    2
    edit-2
    8 months ago

    BackInTime or Borg.
    BackInTime should be easier to set up; Borg is more feature-rich and flexible. I have used both for local and remote backups for years. If you have any questions, feel free to ask.

  • @utopiah@lemmy.ml
    1
    edit-2
    8 months ago

    So... I'm not going to answer your question, feel free to ignore me.

    It's of course possible to do so, and the most obvious way is to use dd, since on Linux devices, including disks, are files. Consequently you can indeed "save" the whole system from the CLI.
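
    A tiny, safe illustration of the "disks are files" point, using a scratch file instead of a real device (the /dev/sda command at the end is a hypothetical example, not something to run blindly):

    ```shell
    # Create a 1 MiB stand-in "device" and clone it byte-for-byte with dd.
    img=$(mktemp); clone=$(mktemp)
    dd if=/dev/zero of="$img" bs=1M count=1 status=none
    dd if="$img" of="$clone" bs=1M status=none
    cmp "$img" "$clone" && echo "identical"
    # Against real hardware the same shape applies, e.g. (hypothetical):
    #   sudo dd if=/dev/sda of=/mnt/backup/sda.img bs=4M status=progress
    ```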

    That being said, I would argue it's a bit of a waste of time unless you have a very specific, and usually rare, use case, e.g. testing OSes themselves. Most likely, I imagine (and again, I'm not directly answering your question here, so please do feel free to fix my assumptions or ignore this entirely), you "just" want to "quickly" go from a "broken" state to one where you can "work" again.

    It might be because you are doing something "weird", e.g. tinkering with the OS itself, or because of a lack of "trust" in your current setup.

    Here my recommendation would instead be to have a "work" OS and then other partitions, or even virtual machines (not containers), dedicated to testing, because that's truly a great way to learn, BUT it shouldn't come at the risk of your data or your time.

    Finally, the bounding resources are the speed of your disk and your time to focus. I find that installing a "fresh" OS from a modern USB stick is fast, like two-coffees fast. I installed Ubuntu just yesterday, twice, so I'm rather confident about that claim.

    What is indeed slow is copying YOUR files, because they are large and numerous.

    So... finally, the "trick": do NOT copy your files despite reinstalling the system! Instead, have a dedicated /home partition so that if you reinstall the OS, your files are untouched. Yes, you might have to reinstall a couple of pieces of software, but if you keep track of them via e.g. your shell history file (which, BTW, will be preserved in that situation), you will be able to e.g. grep for your apt install commands and be back on track in minutes.

    TL;DR: a /home partition that is not deleted on OS reinstallation is often, IMHO, the most efficient way to go.
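
    A sketch of the grep-your-history trick (the history file here is a throwaway stand-in; the real path depends on your shell, e.g. ~/.bash_history for bash):

    ```shell
    # Build a fake history file for the demo:
    hist=$(mktemp)
    printf '%s\n' 'cd /tmp' 'sudo apt install git' 'sudo apt install vim git' > "$hist"
    # Extract the package names that followed "sudo apt install":
    sed -n 's/^sudo apt install //p' "$hist" | tr ' ' '\n' | sort -u
    # prints "git" and "vim"; after a reinstall you could pipe that
    # into:  xargs sudo apt install -y
    ```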

    • @utopiah@lemmy.ml
      2
      edit-2
      8 months ago

      PS: obviously all the backup tools others recommended are still useful. I personally use rdiff-backup to save important data on my NAS with SSDs over Ethernet. Once again it's all about speed, but only after you have identified what actually matters to you, and in the vast majority of cases the whole system ain't it.

  • Joe Breuer@lemmy.ml
    1
    8 months ago

    I recently came across ReaR and very much like it so far for my "fire and forget" whole system backups (working data I back up differently, typically something rsync-y).

  • Minty95@lemm.ee
    1
    8 months ago

    How about cron? If it's just for copying your files/data, it's super easy to set up and extremely quick. It doesn't do snapshots; it's just a simple 'copy my files to another place' on a schedule, but it works 👍
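
    For example, a hypothetical crontab entry (added with `crontab -e`) that copies a documents folder to a mounted backup drive every night at 02:30; the paths are placeholders:

    ```shell
    # m  h  dom mon dow  command
    30   2  *   *   *    rsync -a --delete "$HOME/Documents/" /mnt/backup/Documents/
    ```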