using ls instead of find to list files with f

May 7, 2011 at 08:43:17
Specs: Unix Solaris
I am using the following command in Unix: ls -lR | grep filename
I am able to locate the filename, but I want the full path to the filename and don't want to use find. It takes too long to locate the file with find.

Could someone please tell me what the appropriate syntax is, and whether I need to use awk or sed to get the full path to the file I am searching for using ls? I have spent the better part of one day trying to find an answer, and can't find any help.

Thank you!



#1
May 8, 2011 at 13:56:35
First, I don't mean to nitpick, but this command:

ls -lR | grep filename

will match any line containing filename as a substring, so a name like myfilename.old would match too. Unfortunately, the Solaris version of grep/egrep doesn't support the -w option. This command instead matches filename only when it is preceded by a space and anchored to the end of the line:

ls -lR |grep " filename$"
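To see the difference, try it against a small saved listing (the file names here are made up for illustration):

```shell
# Build a two-line sample listing; "notmyfilename" shows why anchoring matters
printf '%s\n' "-rw-r--r--   1 user  grp  0 May  7 12:00 notmyfilename" \
              "-rw-r--r--   1 user  grp  0 May  7 12:00 filename" > /tmp/listing.txt

grep "filename" /tmp/listing.txt    # matches both lines
grep " filename$" /tmp/listing.txt  # matches only the exact name
```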

Second, ls -lR is awkward to work with here because of the way it formats its output: it prints each directory as a header on one line and then the objects in that directory on the succeeding lines, so the file lines themselves carry no path.
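A small throwaway tree (made-up names) shows that layout:

```shell
# Build a two-level tree and list it recursively
mkdir -p /tmp/lsdemo/sub
touch /tmp/lsdemo/top.txt /tmp/lsdemo/sub/inner.txt
cd /tmp/lsdemo && ls -lR
# Output prints a header per directory (".:", then "./sub:"),
# each followed by that directory's entries; any script that wants
# full paths has to remember the most recent header line.
```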

The only way I see to do this is to pipe the ls -lR command into an awk script. My solution is below. Personally, I cannot see how this script would be faster than using the find command:


#!/bin/ksh

myfile="filename"
myloc=$(pwd)  # save current directory
ls -lR|nawk '
{

# save current directory if the line starts with .: (top of the tree)
if($0 ~ /^\.:/)
  {
  dirname="/"
  next
  }

# save current directory if it starts with ./
if($0 ~ /^\.\//)
   {
   gsub(":","/")
   gsub(/^\./,"")
   dirname=$1
   next
   }

# print the full path
if($NF == "'"$myfile"'")
   printf("%s%s%s\n", "'"$myloc"'", dirname, $NF)

} '
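For comparison, this is what the plain find equivalent looks like (filename is again a placeholder for the name you are searching for):

```shell
# find prints each match with its path relative to the starting point
find . -name "filename" -print

# Starting from $(pwd) instead of . yields absolute paths directly
find "$(pwd)" -name "filename" -print
```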

