[Novalug] All files in a directory...

James Ewing Cottrell 3rd JECottrell3@Comcast.NET
Tue Jan 2 16:08:58 EST 2007


Kevin Dwyer wrote:

>On Thu, Dec 28, 2006 at 06:26:37AM -0500, Duane C. Mallory wrote:
>
>>I would like to take all text files in a directory and append the data 
>>in them into one file with a different name. I was hoping I could cheat 
>>and use "find" to round them up and then "cat" to append them into 
>>another file; unfortunately all this does is append the file names into 
>>the new file - not the data in the files.
>>
>>Does anyone have a down-and-dirty way of doing this?
>
>If you want to use find, you do something like:
>
>find path/to/files/ -type f -exec cat {} >> concatenatedfile \;
>
Too many things wrong here:

[1] never use "-exec". Use xargs.
[2] the >> needs to be quoted
[3] even if violating [1], use "find ... -exec cat {} \; > concatenatedfile"
[4] care must be taken not to let find or ls find concatenatedfile.
[5] what happens if concatenatedfile already exists? a minor problem
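Point [3] can be demonstrated with a small self-contained sketch (the mktemp scratch paths are illustrative, not from the original thread): a single redirection placed after the whole find command collects the files' data, not their names.

```shell
#!/bin/sh
# Sketch of point [3]: one redirection after the entire find command.
# The shell opens "$out" once; every cat spawned by -exec inherits that
# descriptor, so the files' contents end up concatenated in it.
tmp=$(mktemp -d)          # scratch directory with two sample files
out=$(mktemp)             # output kept OUTSIDE $tmp (see point [4])
printf 'alpha\n' > "$tmp/a.txt"
printf 'beta\n'  > "$tmp/b.txt"
find "$tmp" -type f -exec cat {} \; > "$out"
sort "$out"               # sorted, since find's order is unspecified
```

Sorting the result is only there to make the output deterministic; the concatenated file itself holds both lines in whatever order find visited them.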

Here are two ways to do it safely, assuming that the current contents of 
concatenatedfile are unimportant:

find /path -maxdepth 1 -type f ! -name concatenatedfile -print0 |
xargs -0 cat > concatenatedfile

ls -a | while IFS= read -r file
do
    case $file in
    (.|..|concatenatedfile) continue;;
    (*) test -f "$file" && cat "$file";;
    esac
done > concatenatedfile

Note that because the problem wasn't entirely specified (same or 
different directory? recurse or not?), the solutions above make some 
assumptions of their own.

>In addition, there are probably an uncountable number of ways to do this
>in various scripting languages, etc.
>
One final note: the problem mentions "text files" as opposed to binary 
files. Perl has the -T file-test operator, which is tailored for this.

>-kpd

JIM


