Search within files

May 17, 2010 at 05:47:54
Specs: Solaris
I have to search a particular folder which contains around 2000 XML files.
I need to identify the files where the value under "Product Value" contains "GSM", for example:

<item name="Product Value">
<value>GSM</value>
</item>

<item name="Product Value">
<value>GSM Networks</value>
</item>

I need to copy those file names to another file.

This is my script. When I print $line, it shows the entire line, but the grep statement is processing it word by word. Also, the variables FOUNDIT and FOUNDENT are not being set correctly. Can somebody help me out?

Code:

#!/bin/ksh
cd /files/gsm
logfile=/users/gsm.txt
found=0
for i in `ls`
do
    while read line
    do
        if [[ $found -eq 1 ]] ; then
            found=0
            FOUNDENT=`grep "GSM" ${line}`
            if [ -z ${FOUNDENT} ]; then
                echo $i | tee -a $logile
                break
            fi
        fi
        FOUNDIT=`grep "Product Value" ${line}`
        if [ ! -z ${FOUNDIT} ]; then
            found=0
        else
            found=1
        fi
    done < $i
done


#1
May 17, 2010 at 08:18:55
I think you have several logic problems:

First, to look at each file in a directory, one per line, try replacing your for loop over the output of ls with something like this:

ls -1 | while read i

Second, the way you are using grep in your script, the variable ${line} is expected to be a file name, not a line of text; that is why it seems to go word by word: the shell passes each word of $line to grep as a separate file name to search. You might try something like:

FOUNDIT=`echo "$line" | grep -c "GSM"`
if [[ $FOUNDIT -gt 0 ]]; then
    ...

Third, note in the example above that you can use grep's -c option to get a count instead of doing the null checks.
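Putting those three fixes together, here is one rough sketch of the whole script. I have wrapped it in a portable /bin/sh function so the directory and log file can be passed in; your original script hard-coded /files/gsm and /users/gsm.txt, so call it with those. Note I also flipped your -z test: you want to log the file when GSM IS found on the line after "Product Value", not when it is missing.

```shell
#!/bin/sh
# Sketch only: search each file in $1 for an <item name="Product Value">
# line followed by a line containing GSM; append matching file names to $2.
search_gsm() {
    dir=$1
    logfile=$2
    olddir=`pwd`
    cd "$dir" || return 1
    ls -1 | while read i
    do
        found=0
        while read line
        do
            # The previous line was the "Product Value" item,
            # so check this line for GSM
            if [ $found -eq 1 ]; then
                found=0
                if [ `echo "$line" | grep -c "GSM"` -gt 0 ]; then
                    echo "$i" | tee -a "$logfile"
                    break
                fi
            fi
            # Remember that the next line carries this item's value
            if [ `echo "$line" | grep -c "Product Value"` -gt 0 ]; then
                found=1
            fi
        done < "$i"
    done
    cd "$olddir"
}
```

Called as search_gsm /files/gsm /users/gsm.txt it appends each matching file name to the log, the same way your echo/tee line intended. Because grep "GSM" also matches "GSM Networks", both of your sample items are caught.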

