And that’s all. I’m happy, since I was completely out of space before.

    • Joelk111@lemmy.world · 3 months ago

      I’m new to docker and all of my shit stopped working recently. Just wouldn’t load. Took about a half hour to find out that old images were taking up about 63GB on my 100GB boot partition, resulting in it being completely full.

      I added a command to my update scripts to prune images more than three months old, something like the line below.
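
      Presumably something like this (the exact filter value is an assumption; until takes a Go-style duration, and 2160h is roughly three months):

      docker image prune -a --filter "until=2160h"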

    • Eager Eagle@lemmy.world · 3 months ago

      Last year I had over 1TB freed by docker system prune on a dev VM. If you’re building images often, that’s a mandatory command to run once in a while.

  • Eskuero@lemmy.fromshado.ws · 3 months ago

    Clean all cached Arch Linux package downloads (including cached copies of currently installed packages)

    pacman -Scc

    Remove stopped containers, unused docker networks and images, and dangling build cache

    docker system prune --all

    Clean up untracked git files that might be covered by .gitignore, such as build and out directories (beware of losing data; use “n” instead of “f” for a dry run)

    git clean -xdf

    Do an aggressive pruning of objects in git (MIGHT BE VERY SLOW)

    git gc --aggressive --prune=now

    Remove old journal logs, keeping only the last seven days

    journalctl --vacuum-time=7d

    Remove pip cache

    pip cache purge

  • ouch@lemmy.world · 3 months ago

    Is there any disk usage tool that allows you to browse the tree while it’s still being calculated, prioritizing current directory?

  • eneff@discuss.tchncs.de · 3 months ago

    My / is a tmpfs.

    There is no state accumulating that I didn’t explicitly specify, precisely because I don’t want to deal with that kind of chore.
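
    In fstab terms the idea looks roughly like this (a sketch only; paths and sizes are made up, and real ephemeral-root setups usually wire this up in the initramfs or with declarative tooling):

    # / lives in RAM and comes up empty on every boot
    tmpfs      /         tmpfs  defaults,size=2G,mode=755  0 0
    # only explicitly chosen state survives, on a real disk
    /dev/sda2  /persist  ext4   defaults  0 0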

    • Chewy@discuss.tchncs.de · 3 months ago

      These tools are also useful for finding large files in your home directory. E.g. I’ve found a large number of Linux ISOs I didn’t need anymore.

      • eneff@discuss.tchncs.de · 3 months ago

        My user’s home directory is ephemeral as well, so this wouldn’t happen. Everything I didn’t declare to persist is deleted on reboot.

        What I do use tools like these for is verifying that my persistent storage paths are properly bind mounted and files end up in the correct filesystem.

        I use dust for this, specifically with the -x flag to not traverse multiple filesystems.
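
        For example (the target path is just an illustration):

        # -x stays on one filesystem, so anything large that shows up
        # under / here lives in RAM rather than on the persistent disk
        dust -x /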

  • klu9@lemmy.ca · 3 months ago

    Personally I’m loving diskonaut. “Graphical” representation but at, ahem, terminal velocity.
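
    If you want to give it a spin, point it at a directory and it draws the treemap as it scans:

    diskonaut /home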

      • Joe Cool@lemmy.ml · 3 months ago

        One of the things I dislike about Rust is the massive amount of disk space and time it takes to do a download, compile, and test run. 2GB of dependencies and build files for a 200K binary is a bit much.
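
        At least cargo can reclaim the space per project; this simply deletes the target directory, so the next build starts from scratch:

        cargo clean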

  • devfuuu@lemmy.world · 3 months ago

    The things that are always huge and killing my system space:

    • pacman cache
    • docker bullshit
    • flatpaks (see the cleanup command below)
    • journalctl files!
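
    For the flatpaks at least there’s a built-in cleanup for runtimes and extensions nothing depends on anymore (the others are covered by commands elsewhere in this thread):

    flatpak uninstall --unused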

    • MouldyCat@feddit.uk · 3 months ago

      In case you don’t already know about it, paccache (part of the pacman-contrib package) will let you easily remove old packages from the pacman cache.
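
      For example (by default paccache keeps the three most recent versions of each package; -u and -k0 additionally drop every cached version of uninstalled packages):

      paccache -r

      paccache -ruk0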

  • GustavoM@lemmy.world · 3 months ago

    This is why I’ve set up a ramdisk on ~/.cache and ~/Downloads: “free” automatic cleanup plus a tad more performance, because why not.
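
    A sketch of what that can look like in /etc/fstab (username, sizes and uid/gid are placeholders):

    tmpfs  /home/user/.cache     tmpfs  defaults,size=2G,uid=1000,gid=1000,mode=700  0 0
    tmpfs  /home/user/Downloads  tmpfs  defaults,size=4G,uid=1000,gid=1000,mode=755  0 0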

      • GustavoM@lemmy.world · 3 months ago

        I don’t think you’ll need to do that, unless you are planning to download files that are over 4 GB and/or you are using a potato with less than 1 GB of RAM.

        t. I’ve turned my entire RAM into a ramdisk, and the performance actually IMPROVED compared to not using a ramdisk at all.

        • eneff@discuss.tchncs.de · 3 months ago

          I don’t think they meant they’d be forced to clean up because their RAM would fill up, but because their stuff would be gone after a reboot if they didn’t move it.