Tutorial: How can I escape white space in a bash loop list?


I have a bash shell script that loops through all child directories (but not files) of a certain directory. The problem is that some of the directory names contain spaces.

Here are the contents of my test directory:

$ ls -F test
Baltimore/  Cherry Hill/  Edison/  New York City/  Philadelphia/  cities.txt

And the code that loops through the directories:

for f in `find test/* -type d`; do
  echo $f
done

Here's the output:

test/Baltimore
test/Cherry
Hill
test/Edison
test/New
York
City
test/Philadelphia

Cherry Hill and New York City are treated as 2 or 3 separate entries.

I tried quoting the filenames, like so:

for f in `find test/* -type d | sed -e 's/^/\"/' | sed -e 's/$/\"/'`; do
  echo $f
done

but to no avail.

There's got to be a simple way to do this.

The answers below are great. But to make this more complicated - I don't always want to use the directories listed in my test directory. Sometimes I want to pass in the directory names as command-line parameters instead.

I took Charles' suggestion of setting the IFS and came up with the following:

dirlist="${@}"
(
  [[ -z "$dirlist" ]] && dirlist=`find test -mindepth 1 -type d` && IFS=$'\n'
  for d in $dirlist; do
    echo $d
  done
)

and this works just fine unless there are spaces in the command line arguments (even if those arguments are quoted). For example, calling the script like this: test.sh "Cherry Hill" "New York City" produces the following output:

Cherry
Hill
New
York
City


First, don't do it that way. The best approach is to use find -exec properly:

# this is safe
find test -type d -exec echo '{}' +

The other safe approach is to use a NUL-terminated list, though this requires that your find support -print0:

# this is safe
while IFS= read -r -d '' n; do
  printf '%q\n' "$n"
done < <(find test -mindepth 1 -type d -print0)

You can also populate an array from find, and pass that array later:

# this is safe
declare -a myarray
while IFS= read -r -d '' n; do
  myarray+=( "$n" )
done < <(find test -mindepth 1 -type d -print0)
printf '%q\n' "${myarray[@]}" # printf is an example; use it however you want

If your find doesn't support -print0, your result is then unsafe -- the below will not behave as desired if files exist containing newlines in their names (which, yes, is legal):

# this is unsafe
while IFS= read -r n; do
  printf '%q\n' "$n"
done < <(find test -mindepth 1 -type d)

If one isn't going to use one of the above, a third approach (less efficient in terms of both time and memory usage, as it reads the entire output of the subprocess before doing word-splitting) is to use an IFS variable which doesn't contain the space character. Turn off globbing (set -f) to prevent strings containing glob characters such as [], * or ? from being expanded:

# this is unsafe (but less unsafe than it would be without the following precautions)
(
  IFS=$'\n' # split only on newlines
  set -f    # disable globbing
  for n in $(find test -mindepth 1 -type d); do
    printf '%q\n' "$n"
  done
)

Finally, for the command-line parameter case, you should be using arrays if your shell supports them (i.e. it's ksh, bash or zsh):

# this is safe
for d in "$@"; do
  printf '%s\n' "$d"
done

will maintain separation. Note that the quoting (and the use of $@ rather than $*) is important. Arrays can be populated in other ways as well, such as glob expressions:

# this is safe
entries=( test/* )
for d in "${entries[@]}"; do
  printf '%s\n' "$d"
done


find . -type d | while read file; do echo $file; done  

However, this doesn't work if the file name contains newlines. The above is the only solution I know of when you actually want to have the directory name in a variable. If you just want to execute some command, use xargs:

find . -type d -print0 | xargs -0 echo 'The directory is: '  


Here is a simple solution which handles tabs and/or whitespaces in the filename. If you have to deal with other strange characters in the filename like newlines, pick another answer.

The test directory

ls -F test
Baltimore/  Cherry Hill/  Edison/  New York City/  Philadelphia/  cities.txt

The code to go into the directories

find test -type d | while IFS= read -r f ; do
  echo "$f"
done

The filename must be quoted ("$f") if used as argument. Without quotes, the spaces act as argument separator and multiple arguments are given to the invoked command.

And the output:

test/Baltimore
test/Cherry Hill
test/Edison
test/New York City
test/Philadelphia
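To see why the quotes matter, here is a tiny standalone demo (the variable f here is hypothetical, not part of the answer's script):

```shell
# unquoted: the shell word-splits the value into three separate arguments
f="New York City"
printf '<%s>\n' $f     # three lines: <New> <York> <City>
printf '<%s>\n' "$f"   # one line: <New York City>
```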


This is exceedingly tricky in standard Unix, and most solutions run foul of newlines or some other character. However, if you are using the GNU tool set, then you can exploit the find option -print0 and use xargs with the corresponding option -0 (minus-zero). There are two characters that cannot appear in a simple filename; those are slash and NUL '\0'. Obviously, slash appears in pathnames, so the GNU solution of using a NUL '\0' to mark the end of the name is ingenious and fool-proof.
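A minimal sketch of that GNU pipeline, assuming GNU find and xargs and the test directory from the question:

```shell
# NUL-terminated names pass through the pipe intact, spaces and all
find test -mindepth 1 -type d -print0 | xargs -0 printf 'Directory: %s\n'
```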


Why not just put

IFS=$'\n'

in front of the for command? This changes the field separator from <Space><Tab><Newline> to just <Newline>.


I use

SAVEIFS=$IFS
IFS=$(echo -en "\n\b")
for f in $( find "$1" -type d ! -path "$1" )
do
  echo $f
done
IFS=$SAVEIFS

Wouldn't that be enough?
Idea taken from http://www.cyberciti.biz/tips/handling-filenames-with-spaces-in-bash.html


Don't store lists as strings; store them as arrays to avoid all this delimiter confusion. Here's an example script that'll either operate on all subdirectories of test, or the list supplied on its command line:

#!/bin/bash
if [ $# -eq 0 ]; then
        # if no args supplied, build a list of subdirs of test/
        dirlist=() # start with empty list
        for f in test/*; do # for each item in test/ ...
                if [ -d "$f" ]; then # if it's a subdir...
                        dirlist=("${dirlist[@]}" "$f") # add it to the list
                fi
        done
else
        # if args were supplied, copy the list of args into dirlist
        dirlist=("$@")
fi
# now loop through dirlist, operating on each one
for dir in "${dirlist[@]}"; do
        printf "Directory: %s\n" "$dir"
done

Now let's try this out on a test directory with a curve or two thrown in:

$ ls -F test
Baltimore/  Cherry Hill/  Edison/  New York City/  Philadelphia/
this is a dirname with quotes, lfs, escapes: "\''?'?\e\n\d/
this is a file, not a directory
$ ./test.sh
Directory: test/Baltimore
Directory: test/Cherry Hill
Directory: test/Edison
Directory: test/New York City
Directory: test/Philadelphia
Directory: test/this is a dirname with quotes, lfs, escapes: "\''
'
\e\n\d
$ ./test.sh "Cherry Hill" "New York City"
Directory: Cherry Hill
Directory: New York City


find . -print0|while read -d $'\0' file; do echo "$file"; done  


P.S. If it is only about spaces in the input, then some double quotes worked smoothly for me...

read artist
find "/mnt/2tb_USB_hard_disc/p_music/$artist" -type f -name '*.mp3' -exec mpg123 '{}' \;


You could change IFS (the internal field separator) temporarily, using:

OLD_IFS=$IFS     # Stores default IFS
IFS=$'\n'        # Set it to line break
for f in `find test/* -type d`; do
    echo $f
done

IFS=$OLD_IFS


To add to what Jonathan said: use the -print0 option for find in conjunction with xargs as follows:

find test/* -type d -print0 | xargs -0 command  

That will execute the command command with the proper arguments; directories with spaces in them will be properly quoted (i.e. they'll be passed in as one argument).
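A quick way to confirm that each name really arrives as exactly one argument (this check is my addition, not part of the answer):

```shell
# -n1 invokes printf once per directory; a name with spaces stays one argument
find test/* -type d -print0 | xargs -0 -n1 printf 'one arg: <%s>\n'
```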


#!/bin/bash

dirtys=()

for folder in *
do
    if [ -d "$folder" ]; then
        dirtys=("${dirtys[@]}" "$folder")
    fi
done

for dir in "${dirtys[@]}"
do
    for file in "$dir"/*.mov
    do
        #dir_e=`echo "$dir" | sed 's/[[:space:]]/\\\ /g'`  # This line would replace each space with '\ '
        out=`echo "$file" | sed 's/\(.*\)\/\(.*\)/\2/'`    # These two lines could be written as one using multiple sed commands.
        out=`echo "$out" | sed 's/[[:space:]]/_/g'`
        #echo "ffmpeg -i $out_e -sameq -vcodec msmpeg4v2 -acodec pcm_u8 $dir_e/${out/%mov/avi}"
        ffmpeg -i "$file" -sameq -vcodec msmpeg4v2 -acodec pcm_u8 "$dir"/"${out/%mov/avi}"
    done
done

The above code converts .mov files to .avi. The .mov files are in different folders, and the folder names contain white space too. The script converts each .mov file to an .avi file in the same folder. I hope it helps.


[sony@localhost shell_tutorial]$ ls
Chapter 01 - Introduction  Chapter 02 - Your First Shell Script
[sony@localhost shell_tutorial]$ cd Chapter\ 01\ -\ Introduction/
[sony@localhost Chapter 01 - Introduction]$ ls
0101 - About this Course.mov   0102 - Course Structure.mov
[sony@localhost Chapter 01 - Introduction]$ ./above_script
 ... successfully executed.
[sony@localhost Chapter 01 - Introduction]$ ls
0101_-_About_this_Course.avi  0102_-_Course_Structure.avi  0101 - About this Course.mov  0102 - Course Structure.mov
[sony@localhost Chapter 01 - Introduction]$

CHEERS!



I had to deal with whitespace in pathnames too. What I finally did was use recursion and for item in /path/*:

function recursedir {
    local item
    for item in "${1%/}"/*
    do
        if [ -d "$item" ]
        then
            recursedir "$item"
        else
            command
        fi
    done
}


Convert the file list into a Bash array. This uses Matt McClure's approach for returning an array from a Bash function: http://notes-matthewlmcclure.blogspot.com/2009/12/return-array-from-bash-function-v-2.html The result is a way to convert any multi-line input to a Bash array.

#!/bin/bash

# This is the command where we want to convert the output to an array.
# Output is: fileSize fileNameIncludingPath
multiLineCommand="find . -mindepth 1 -printf '%s %p\\n'"

# This eval converts the multi-line output of multiLineCommand to a
# Bash array. To convert stdin, remove: < <(eval "$multiLineCommand" )
eval "declare -a myArray=`( arr=(); while read -r line; do arr[${#arr[@]}]="$line"; done; declare -p arr | sed -e 's/^declare -a arr=//' ) < <(eval "$multiLineCommand" )`"

for f in "${myArray[@]}"
do
   echo "Element: $f"
done

This approach appears to work even when bad characters are present, and is a general way to convert any input to a Bash array. The disadvantage is if the input is long you could exceed Bash's command line size limits, or use up large amounts of memory.

Approaches where the loop that eventually works on the list also has the list piped in have the disadvantage that reading stdin is not easy (such as asking the user for input), and the loop runs in a new process, so you may be wondering why variables you set inside the loop are not available after the loop finishes.

I also dislike setting IFS; it can mess up other code.
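For what it's worth, a newer bash (4.4 or later) can build the array without the eval or IFS tricks; a minimal sketch, using -print0 as the safe delimiter:

```shell
# mapfile (a.k.a. readarray) with a NUL delimiter; -t drops the trailing delimiter
mapfile -d '' -t myArray < <(find . -mindepth 1 -type d -print0)
printf 'Element: %s\n' "${myArray[@]}"
```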


I just found out there are some similarities between my question and yours. Apparently, if you want to pass arguments into commands:

test.sh "Cherry Hill" "New York City"  

to print them out in order

for SOME_ARG in "$@"
do
    echo "$SOME_ARG";
done;

Notice that $@ is surrounded by double quotes; some notes here
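A small illustration of the difference (set -- here just simulates the script's positional parameters):

```shell
set -- "Cherry Hill" "New York City"
printf '[%s]\n' "$@"   # two lines, argument boundaries preserved
printf '[%s]\n' "$*"   # one line, arguments joined into one word
```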


I needed the same concept to compress several directories or files sequentially from a certain folder. I solved it using awk to parse the list from ls and to avoid the problem of blank space in the names.

source="/xxx/xxx"
dest="/yyy/yyy"

n_max=`ls . | wc -l`

echo "Loop over items..."
i=1
while [ $i -le $n_max ]; do
  item=`ls . | awk 'NR=='$i''`
  echo "File selected for compression: $item"
  tar -cvzf "$dest"/"$item".tar.gz "$item"
  i=$(( i + 1 ))
done
echo "Done!!!"

what do you think?
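Since you ask: parsing the output of ls becomes fragile once names contain odd characters. A sketch of the same loop driven by a glob instead (dest as in the script above, assumed to already exist):

```shell
dest="/yyy/yyy"
# the glob expands to one word per entry, spaces intact, with no ls parsing
for item in *; do
  echo "File selected for compression: $item"
  tar -cvzf "$dest"/"$item".tar.gz "$item"
done
```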


find Downloads -type f | while read file; do printf "%q\n" "$file"; done  


Well, I see too many complicated answers. I don't want to pass around the output of the find utility or to write a loop, because find has an -exec option for this.

My problem was that I wanted to move all files with a .dbf extension to the current folder, and some of them contained white space.

I tackled it like this:

 find . -name \*.dbf -print0 -exec mv '{}'  . ';'  

Looks much simpler to me.


For me this works, and it is pretty much "clean":

for f in "$(find ./test -type d)" ; do
  echo "$f"
done


Just had a simple variant problem... converting files of type .flv to .mp3 (yawn).

find . -name '*.flv' | while read file; do ffmpeg -i "$file" -acodec copy "${file}.mp3"; done

Recursively find all the Macintosh user Flash files and turn them into audio (copy, no transcode)... it's like the while loop above; note that using read instead of just 'for file in' will handle the spaces.
