How do I randomize the lines in a file using standard tools on Red Hat Linux?

I don't have the shuf command, so I'm looking for something like a perl or awk one-liner that accomplishes the same task.


And a Perl one-liner you get!

perl -MList::Util -e 'print List::Util::shuffle <>'

It uses a module, but the module is part of the core Perl distribution. If that's not good enough, you may consider rolling your own.

I tried using this with the -i flag ("edit-in-place") to have it edit the file. The documentation suggests it should work, but it doesn't. It still displays the shuffled file to stdout, but this time it deletes the original. I suggest you don't use it.

Consider a shell script:

#!/bin/bash

if [[ $# -eq 0 ]]
then
    echo "Usage: $0 [file ...]"
    exit 1
fi

for i in "$@"
do
    perl -MList::Util -e 'print List::Util::shuffle <>' "$i" > "$i.new"
    # Compare byte counts; wc -c with a filename argument prints the name too,
    # so read from stdin to get just the number.
    if [[ $(wc -c < "$i") -eq $(wc -c < "$i.new") ]]
    then
        mv "$i.new" "$i"
    else
        echo "Error for file $i!"
    fi
done

Untested, but hopefully works.

cat yourfile.txt | while IFS= read -r f; do printf "%05d %s\n" "$RANDOM" "$f"; done | sort -n | cut -c7-

Read the file, prefix every line with a random number, sort on those random prefixes, then cut the prefixes off. This one-liner should work in any semi-modern shell.

EDIT: incorporated Richard Hansen's remarks.
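Since the question also asks for an awk one-liner: the same decorate-sort-undecorate idea can be written with awk's rand() (yourfile.txt is a placeholder name):

```shell
# Prefix each line with a random float, sort numerically, strip the prefix
awk 'BEGIN{srand()}{printf "%f\t%s\n", rand(), $0}' yourfile.txt | sort -n | cut -f2-
```

Note that srand() with no argument seeds from the current time, so two runs within the same second will produce the same order.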

Um, let's not forget

sort --random-sort

Related to Jim's answer:

My ~/.bashrc contains the following:

unsort ()
{
    LC_ALL=C sort -R "$@"
}

With GNU coreutils's sort, -R = --random-sort, which generates a random hash of each line and sorts by it. In some locales, older (buggy) versions of sort wouldn't actually use the randomized hash and would return normally sorted output, which is why I set LC_ALL=C.
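One caveat with this approach (assuming GNU sort): because -R sorts by a hash of each line rather than performing a true shuffle, identical lines always end up next to each other:

```shell
# sort -R hashes each line, so duplicate lines land adjacent in the output
printf 'a\nb\na\nb\n' | LC_ALL=C sort -R
```

If you need duplicate lines dispersed randomly, use shuf or one of the decorate-sort approaches instead.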


Related to Chris's answer:

perl -MList::Util=shuffle -e'print shuffle<>'

is a slightly shorter one-liner. (-Mmodule=a,b,c is shorthand for -e 'use module qw(a b c);'.)

The reason giving it a simple -i doesn't work for shuffling in-place is because Perl expects that the print happens in the same loop the file is being read, and print shuffle <> doesn't output until after all input files have been read and closed.

As a shorter workaround,

perl -MList::Util=shuffle -i -ne'BEGIN{undef$/}print shuffle split/^/m'

will shuffle files in-place. (-n means "wrap the code in a while (<>) {...} loop"; BEGIN{undef$/} makes Perl read a whole file at a time instead of a line at a time, and split/^/m is needed because $_ now holds an entire file rather than a single line.)
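A quick sanity check of the in-place variant (demo.txt is a throwaway file created just for the example):

```shell
printf 'one\ntwo\nthree\n' > demo.txt
# Shuffle the file in place; nothing is written to stdout this time
perl -MList::Util=shuffle -i -ne 'BEGIN{undef$/}print shuffle split/^/m' demo.txt
cat demo.txt    # same three lines, in a random order
rm demo.txt
```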

On OSX, grabbing latest from http://ftp.gnu.org/gnu/coreutils/ and something like

./configure
make
sudo make install

...should give you /usr/local/bin/sort --random-sort

without messing up /usr/bin/sort

Or get it from MacPorts:

$ sudo port install coreutils

and/or

$ /opt/local/libexec/gnubin/sort --random-sort

shuf is the best way.

sort -R is painfully slow. I just tried to shuffle a 5GB file. I gave up after 2.5 hours. Then shuf finished it in a minute.
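For reference, common shuf invocations (input.txt is a placeholder; per the GNU coreutils docs, shuf reads all input before opening the -o output file, so writing back to the same file is safe):

```shell
shuf input.txt > shuffled.txt    # shuffle into a new file
shuf -n 10 input.txt             # print 10 random lines (a random sample)
shuf input.txt -o input.txt      # shuffle the file in place (GNU shuf)
```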

When I install coreutils with Homebrew

brew install coreutils

shuf becomes available as gshuf.

A one-liner for Python:

python3 -c "import random, sys; lines = open(sys.argv[1]).readlines(); random.shuffle(lines); print(''.join(lines), end='')" myFile

And for printing just a single random line:

python3 -c "import random, sys; print(random.choice(open(sys.argv[1]).readlines()), end='')" myFile

But see this post for the drawbacks of Python's random.shuffle(): its underlying generator has a finite period, so it cannot produce every possible permutation of a list with more than 2080 elements.

Mac OS X with DarwinPorts:

sudo port install unsort
cat $file | unsort | ...

FreeBSD has its own random utility:

cat $file | random | ...

It's in /usr/games/random, so if you have not installed games, you are out of luck.

You could consider installing ports like textproc/rand or textproc/msort. These might well be available on Linux and/or Mac OS X, if portability is a concern.