
> I almost never use the human-readable file sizes.

Same here. I love the idea, and I keep trying to use human-readable sizes in every command that supports them. But it turns out they're much less scannable than numbers in a common unit. How long does it take you to see which of these files is biggest:

  13k  potatoes.txt
   7M  tomatoes.txt
  128  recipe_ideas.txt
   1G  hot_sauce_formula.txt
How about now:

       13093  potatoes.txt
     7182642  tomatoes.txt
         128  recipe_ideas.txt
  1023984672  hot_sauce_formula.txt
Human-readable numbers also break all sorts of useful things like sorting (unless you have some fancypants sort which understands them), calculating totals with awk, sedding them into an expression to evaluate with $(()), etc.
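With raw byte counts, the usual one-liners just work. Rough examples (GNU userland assumed; BSD/macOS stat spells the size flag -f %z instead of -c %s):

    # total bytes across regular files in the current directory
    ls -l | awk '/^-/ { total += $5 } END { print total }'

    # shell arithmetic on a single file's size
    echo $(( $(stat -c %s tomatoes.txt) / 1024 ))K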


Interesting point. Maybe we could have our cake and eat it too:

          ..  13k  potatoes.txt
        ....   7M  tomatoes.txt
              128  recipe_ideas.txt
    ........   1G  hot_sauce_formula.txt
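
Untested sketch of one way to generate that, reading the dots as a couple per 1024x step (the mockup above doubles them per unit instead, but same idea; tweak to taste):

    ls -l | awk 'NR > 1 {
      hr = $5; i = 0
      while (hr >= 1024) { hr /= 1024; i++ }    # reduce to < 1024
      dots = ""
      for (j = 0; j < i; j++) dots = dots ".."  # two dots per step
      split("B K M G T", u, " ")
      printf "%10s %4d%s  %s\n", dots, hr, u[i+1], $NF
    }'

(Breaks on filenames with spaces, like everything else that parses ls.)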

That's a clever hack; I like it.

I'm disgusted, yet intrigued.

> unless you have some fancypants sort which understands them

FWIW, GNU sort can sort by human-readable units using the -h option.
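
Which means things like

    du -sh * | sort -h

do the right thing: the output stays human-readable but still sorts smallest to largest.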


I really want to know what the deal with hot_sauce_formula being 1G is. That must be one hell of a hot sauce.

If you want to see the largest file, wouldn't you want to sort by size anyway?

You would, but you would have to run `man ls` to figure out how to do that. That's a showstopper.

Or...

    alias bff='ls -S | head -1' # biggest fscking file
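
With the files upthread:

    $ bff
    hot_sauce_formula.txt

(It parses ls, so exotic filenames may bite, and you'd want -A if dotfiles should count too.)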
