My dear lemmings,

I discovered Clonezilla a while ago and it's still my main tool to back up and restore the partitions I care about on my computers.

I can't help but wonder whether there are now better, more efficient alternatives, or whether it's still a solid choice. There's nothing wrong with it; I'm just curious about others' practices and habits, and whether newer tools or solutions are available.

Thank you for your feedback, and keep your drives safe!

  • WalrusDragonOnABike [they/them]@reddthat.com
    ·
    edit-2
    4 months ago

    Used it for cloning some laptops recently without much issue. Cloned one laptop's primary partition onto an SD card and then imaged the others no problem. The laptops were 256 GB capacity (but only like 30-60 GB used) and the SD card was 64 GB. Seemed pretty simple to me.

    There are a lot of options for those who want to do things like deploy over a network, but I haven't messed with them seriously. (I didn't have the ethernet cables to do it; I wasted a bit of time trying before realizing the machines weren't connected to a network. Maybe there's a way to connect via wifi, but I didn't see it.)

  • cmnybo@discuss.tchncs.de
    ·
    4 months ago

    I never really had a need for the features provided by Clonezilla. I've always just used dd since it's available on any Linux live disk. Unless I'm making an image for data recovery, I zero the free space and pipe the dd output through gzip to avoid wasting space.
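    The dd-plus-gzip approach described above can be sketched roughly like this. The demo below uses a regular file standing in for a block device so it's safe to run; the device name and mount point in the comments are placeholders you'd swap in for real use.

    ```shell
    # For a real partition you'd first zero the free space on the mounted
    # filesystem so gzip can squash it, e.g. (placeholder mount point):
    #   dd if=/dev/zero of=/mnt/target/zerofill bs=1M; rm /mnt/target/zerofill; sync

    # Demo: a 4 MB file of random data stands in for /dev/sdXN.
    SRC=/tmp/demo-disk.img
    dd if=/dev/urandom of="$SRC" bs=1M count=4 2>/dev/null

    # Image and compress in one pass (no intermediate raw image on disk).
    dd if="$SRC" bs=1M 2>/dev/null | gzip -c > /tmp/backup.img.gz

    # Restore by reversing the pipe.
    gunzip -c /tmp/backup.img.gz | dd of=/tmp/restored.img bs=1M 2>/dev/null

    # Verify the round trip.
    cmp "$SRC" /tmp/restored.img && echo "round trip OK"
    ```

    Against a real device you'd replace the demo file with `/dev/sdXN` on both ends — and triple-check the device name before the restore, since dd will happily overwrite the wrong disk.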

  • Gabu@lemmy.ml
    ·
    4 months ago

    The main thing about Clonezilla is that you can always rely on it working, no matter the system. The bad thing is that proprietary solutions have a lot more creature comforts.

  • makeasnek@lemmy.ml
    ·
    edit-2
    4 months ago

    The fact that Linux lacks a decent system-level backup tool with a GUI is kind of a mind-boggler for me. The closest thing I've found is Timeshift. File-level backups can't restore your whole system state, and users shouldn't be expected to remember or manually export their package lists and god knows what else. I have subsisted on file-only backups, but it's really not great as a solution. Disks fail, and when they do, you inevitably have to reinstall the entire OS. It's a mess. RAID1 could theoretically prevent this, but no distro makes it easy to boot from a RAID1 setup.

    Backing up the entire filesystem is not a technically complex thing: there are plenty of command-line tools for it, and some filesystems even support the concept natively via snapshots. But this has yet to be put into useful practice for end users.

  • krakenfury@lemmy.sdf.org
    ·
    4 months ago

    I'd recommend just scripting with rsync and running it with cron or whatever scheduling automation you like. Back up locally to an external drive, or orchestrate with a cloud provider's CLI tools for something like S3.

    There are some tools that probably assist with this, but there are very few moving parts when you roll your own. Clonezilla seems overkill and harder to automate, though I'll admit I'm not an expert with it.
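    A minimal version of that roll-your-own setup might look like this (the paths and the cron schedule below are just placeholder examples):

    ```shell
    #!/bin/sh
    # Demo paths; for real use, point SRC at $HOME and DEST at the external drive.
    SRC="/tmp/backup-demo/src/"
    DEST="/tmp/backup-demo/dst/"
    mkdir -p "$SRC" "$DEST"
    echo "important file" > "${SRC}notes.txt"

    # -a preserves permissions, ownership, and timestamps;
    # --delete makes the destination an exact mirror (deletions propagate too).
    rsync -a --delete "$SRC" "$DEST"

    # Schedule it with cron, e.g. nightly at 03:00, via a crontab entry like:
    #   0 3 * * * /usr/local/bin/home-backup.sh
    ```

    Note that `--delete` means a file removed (or ransomware-mangled) at the source disappears from the mirror on the next run, so a plain mirror is not a substitute for versioned backups.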

  • lps@lemmy.ml
    ·
    4 months ago

    Rescuezilla is nice. I believe it just puts a more user-friendly GUI on Clonezilla.

  • const_void@lemmy.ml
    ·
    4 months ago

    Also interested in this. Currently in need of an imaging solution that's less clunky to use than Clonezilla.

  • utopiah@lemmy.ml
    ·
    4 months ago

    Others have mentioned rsync, and I'd suggest rdiff-backup on top of that, though it does indeed work on files, not partitions or disks. That said, IMHO unless you're managing data centers and swapping entire physical disks by the bucket, you probably don't need to care about the disks themselves.

    If you genuinely have to frequently change not just data but entire systems, maybe looking at nix or cloud-init could help.
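    On the rdiff-backup suggestion above: rdiff-backup keeps a current mirror plus reverse increments automatically. A similar snapshot-style effect is possible with plain rsync's `--link-dest` option — each run produces a dated directory, and files unchanged since the previous run are hard links, so they cost almost no extra space. The paths and dates below are just a demo:

    ```shell
    #!/bin/sh
    # Snapshot-style file backups with rsync --link-dest.
    ROOT="/tmp/snap-demo"
    mkdir -p "$ROOT/src" "$ROOT/backups"
    echo "v1" > "$ROOT/src/doc.txt"

    # First snapshot: nothing earlier to link against, so it's a full copy.
    rsync -a "$ROOT/src/" "$ROOT/backups/2024-01-01/"

    # The file changes; the second snapshot copies changed files and
    # hard-links unchanged ones to the previous snapshot.
    echo "v2" > "$ROOT/src/doc.txt"
    rsync -a --link-dest="$ROOT/backups/2024-01-01" \
        "$ROOT/src/" "$ROOT/backups/2024-01-02/"
    ```

    Each dated directory then looks like a complete backup you can browse or restore from directly, which is much of what rdiff-backup gives you, minus its bookkeeping.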