search files 
 search files

Hello,

I run 'ls -l | grep "^-" | grep -v grep' in a directory to find the files
...
The objective is to search for a regular expression in each of those files.

'ls -l' returns "Arg list too long" ... I thought there were too many files
in the directory, and that the solution was to increase the buffer cache in
the inode table ???

'find .' works well, but it also searches subdirectories, and I don't
know how to tell it to ignore case.

My question:
- How to do it with awk:
"Print the files from a specified directory whose names start with a
lowercase letter. Open each file and replace an expression with
uppercase(expression)"

For example: for file=bbb containing var=toto,
the result will be file=BBB, containing TOTO




Thu, 03 Oct 2002 03:00:00 GMT  
 search files

% - How to do it with awk:
% "Print the files from a specified directory whose names start with a
% lowercase letter. Open each file and replace an expression with
% uppercase(expression)"

I'd let the shell do the file name expansion. Given

 awk -f script /path/to/dir/[a-z]*

awk will read each file whose name starts with a lower-case letter.

The script could be something like this:
 # if the first line of a file && not the first file, close the output file
 # create the output file name on the first line of each file
 FNR == 1 { if (NR > 1) close(outfile); outfile = toupper(FILENAME) }

 # change the pattern and print the line to the output file
 { gsub(/expression/, "EXPRESSION"); print > outfile }
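
A variant, in case you want the output files to keep the directory part
of the path untouched (toupper() on FILENAME also uppercases any
lowercase letters in /path/to/dir itself) -- an untested sketch along
the same lines:

 # uppercase only the base name of the current input file
 FNR == 1 {
   if (NR > 1) close(outfile)
   n = split(FILENAME, parts, "/")
   parts[n] = toupper(parts[n])
   outfile = parts[1]
   for (i = 2; i <= n; i++) outfile = outfile "/" parts[i]
 }
 { gsub(/expression/, "EXPRESSION"); print > outfile }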

--

Patrick TJ McPhee
East York  Canada



Thu, 03 Oct 2002 03:00:00 GMT  
 search files

Quote:


>% - How to do it with awk:
>% "Print the files from a specified directory whose names start with a
>% lowercase letter. Open each file and replace an expression with
>% uppercase(expression)"

>I'd let the shell do the file name expansion. Given

> awk -f script /path/to/dir/[a-z]*

I think one of his problems was that there were too many files
in the directory to glob.

This belongs more in a shell newsgroup, as it's probably harder
than it's worth to do in awk (anyone is welcome to prove me wrong).

find CAN just find files in the current directory if you tell
it to (someone correct me if the following is wrong):

You might do something like:

typeset -u uc_file
typeset -l lc_file
find . -name '.' -o -type d -prune -o -print | while read file
do
 lc_file=$file
 [[ $file = "$lc_file" ]] || continue  # skip names containing uppercase letters
 uc_file=$lc_file
 sed 's/expression/EXPRESSION/g' "$file" > "$uc_file"
done
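
(With GNU find you could get much the same list of names with

 find . -maxdepth 1 -type f

but the -prune form above should work with any find.)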

HTH,
Douglas Wilson



Thu, 03 Oct 2002 03:00:00 GMT  
 search files

Quote:




>>% - How to do it with awk:
>>% "Print the files from a specified directory whose names start with a
>>% lowercase letter. Open each file and replace an expression with
>>% uppercase(expression)"

>>I'd let the shell do the file name expansion. Given

>> awk -f script /path/to/dir/[a-z]*

>I think one of his problems was that there were too many files
>in the directory to glob.

In that case, one should use xargs

man xargs
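
For example, instead of letting the shell expand /path/to/dir/[a-z]*,
something along these lines (untested, and assuming the file names
contain no spaces) avoids the argument-list limit while reusing the awk
script from the earlier post:

 cd /path/to/dir && ls | grep '^[a-z]' | xargs awk -f script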

Chuck Demas
Needham, Mass.

--
  Eat Healthy    |   _ _   | Nothing would be done at all,

  Die Anyway     |    v    | That no one could find fault with it.



Thu, 03 Oct 2002 03:00:00 GMT  
 search files

Quote:

> I run 'ls -l | grep "^-" | grep -v grep' in a directory to find the
> files
> ...
> The objective is to search for a regular expression in each of those
> files.

Let's see, you want to check out regular files; I guess that's what the
first grep tries to do anyway. But what is that "grep -v grep" doing
there? That's usually only used when processing ps output...

Quote:
> 'ls -l' returns "Arg list too long" ... I thought there were too many
> files in the directory, and that the solution was to increase the
> buffer cache in the inode table ???

That's a strange error message. I really can't figure why you get it.

Quote:
> 'find .' works well, but it also searches subdirectories, and I don't
> know how to tell it to ignore case.

You could maybe use "find . -type f -print". Case would never be an
issue, though it would still recurse down subdirectories, of course.

Quote:
> My question:
> - How to do it with awk:
> "Print the files from a specified directory whose names start with a
> lowercase letter. Open each file and replace an expression with
> uppercase(expression)"

> For example: for file=bbb containing var=toto,
> the result will be file=BBB, containing TOTO

It's not crystal clear to me what you want to do. But I'll try anyway.
An awk script something like the following:

--
#!/usr/bin/awk -f
FNR == 1 {
  ucFile = toupper(FILENAME)
}

match($0, /var=[A-Za-z_][A-Za-z_0-9]+/) != 0 {
  ucVar = toupper(substr($0, RSTART+4, RLENGTH-4))
  print "file=" ucFile ", contains " ucVar
}
--

Let's say you name the script "pvars". Then you can call it with
something like:

for f in *; do [ -f "$f" ] && print "$f"; done | xargs ./pvars
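
For instance, with a file bbb in the current directory containing the
line "var=toto", that pipeline should print something like:

 file=BBB, contains TOTO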

Now if you do have too many files in that directory, depending on what
system you are on, you'll have to get around it somehow. One way might
be to put that one-liner in a script of its own, let's say PVARS, and
call it like so:

ls | xargs ./PVARS

HTH,
/PEZ
--
-= Spam safe(?) e-mail address: pez at pezius.com =-

Sent via Deja.com http://www.deja.com/
Before you buy.



Fri, 04 Oct 2002 03:00:00 GMT  
 search files

   >Hello,
   >I run 'ls -l | grep "^-" | grep -v grep' in a directory to find the
   >files ....
The "| grep -v grep" is usually used in a pipe that begins with "ps".
With "ps | grep" the grep process itself shows up in the listing and
has to be filtered out, but with "ls | grep" there is nothing to filter
(unless the directory happens to contain a file named "grep").

   >The objective is to search for a regular expression in each of
   >those files.
   >'ls -l' returns "Arg list too long" ... I thought there were too
   >many files in the directory, and that the solution was to increase
   >the buffer cache in the inode table ???
The error message "Arg list too long" occurs when you use a wildcard
expression which expands into an argument list longer than your system
will accept for a single command. The problem isn't the buffer cache,
it's the argument-list limit, and the solution is to (a) use a shell or
system with a higher limit or (b) use ls without wildcards, and pipe
the output to a program such as grep, awk, or xargs.
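
For example (just a sketch -- "expression" stands for whatever regular
expression you are searching for):

 ls | xargs grep -il 'expression'

grep's -i makes the match case-insensitive and -l prints only the names
of the files that match.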

Net-Tamer V 1.08X - Test Drive



Fri, 04 Oct 2002 03:00:00 GMT  
 search files

Quote:

>>I think one of his problems was that there were too many files
>>in the directory to glob.

>In that case, one should use xargs

That's what I thought at first, but from my understanding of
the problem (which COULD be far from what was intended),
it didn't seem like the best path.

What I THINK the problem is, is to take a bunch of lower-case file
names in the current directory, search for a lower-case expression in
each file, replace the expression with upper case, and write the result
to a file whose name is the original name in upper case.

There probably could be a solution along the lines of:
ls | xargs awk '
 ...
 print >uc_filename
 ...
'
as long as you don't run out of filehandles (close them when
the filename changes), you're probably ok.
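
Fleshed out a little (untested, and "expression" is just a stand-in for
whatever is being replaced), that might look like:

 ls | xargs awk '
  FILENAME != fname {                    # starting a new input file
   if (fname != "") close(uc_filename)   # close the previous output file
   fname = FILENAME
   uc_filename = toupper(FILENAME)
  }
  { gsub(/expression/, "EXPRESSION"); print > uc_filename }
 '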

Cheers,
Douglas Wilson



Fri, 04 Oct 2002 03:00:00 GMT  
 