Removing duplicate files

I need to delete duplicate copies of files in the current directory, and this is as far as I could get:

[code]

# -w 32 makes uniq compare only the 32-char md5 hash at the start of each line
find . -maxdepth 1 -type f -print0 | xargs -0 md5sum | sort |
uniq -w 32 -c | awk '$1 > 1 { print $1, $3 }'

[/code]

The output is:

[code]

2 ./q1
2 ./q4
3 ./q2

[/code]

So I can list the duplicate files along with the number of copies that exist, but I don't know how to delete the files so that only one copy of each remains. Please help.

Thanks for reading the post.
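One way to finish the job is to reuse the same `find | md5sum | sort` front end, but instead of counting, delete every file after the first that shares a checksum. This is a sketch, assuming GNU coreutils (`xargs -d`, `rm --`) and filenames without embedded newlines; the `/tmp/dedup_demo` scratch directory and the `q1`/`q2`/`q4` names are made up for demonstration, echoing the counts in the thread:

```shell
# Scratch directory with duplicates, mimicking the output above (2x q1, 3x q2, 1x q4)
mkdir -p /tmp/dedup_demo && cd /tmp/dedup_demo
printf 'alpha\n' > q1;  printf 'alpha\n' > q1.copy
printf 'beta\n'  > q2;  printf 'beta\n'  > q2.copy;  printf 'beta\n' > q2.copy2
printf 'gamma\n' > q4

# seen[$1]++ is 0 (false) the first time a hash appears, so the first file per
# hash is kept; later files with the same hash are printed and removed.
# substr($0, 35) skips the 32-char hash plus the two separator spaces, so
# filenames containing spaces survive intact.
find . -maxdepth 1 -type f -print0 | xargs -0 md5sum | sort |
awk 'seen[$1]++ { print substr($0, 35) }' |
xargs -d '\n' -r rm --

ls   # one copy of each distinct file remains
```

Which copy survives depends on `sort` order (here the shortest name per group). If installing a package is an option, the `fdupes` tool, where available, does the same keep-first deletion directly with `fdupes -dN .`.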