What's the Max Number of Directories a Directory Can Have?

Perm url with updates: http://xahlee.org/UnixResource_dir/writ/unix_max_num_dir.html

Xah Lee, 2011-03-18

Randal L Schwartz (famous Perl coder) wrote a nice blog about using Perl to delete a dir with a huge number of files. The number of files is so large that the unix commands “ls”, “rm”, etc, are not responsive. The blog is: Perl to the rescue: case study of deleting a large directory (2011-03-17), by Randal L Schwartz. @ Source blogs.perl.org.

Basically, the unix tools try to gather the entire file list before doing anything. The Perl code instead deletes each entry as it reads it, never waiting for the complete list. Here's his code:

perl -e 'chdir "BADnew" or die; opendir D, "."; while ($n = readdir D) { unlink $n }'
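The same streaming behavior can be sketched with find, whose -delete removes each entry as the directory is scanned, rather than after building a full listing (a sketch, assuming the same BADnew directory; note that -maxdepth and -delete are GNU/BSD find extensions, not POSIX):

```shell
# Remove each plain file directly under BADnew as find encounters it,
# without first materializing the whole directory listing in memory.
find BADnew -maxdepth 1 -type f -delete
```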

It's an interesting article. This reminds me: in 2002, we had a huge dir on our e-commerce app, to the point that it maxed out the max allowed. (i don't remember the details, but it was not inodes.) I dug up this old post of mine.

Newsgroups: comp.unix.solaris
From: xah@xahlee.org (Xah Lee)
Date: 19 Apr 2002 17:59:02 -0700
Local: Fri, Apr 19 2002 5:59 pm
Subject: max number of directories inside a dir

what is the maximum number of directories one can create under a directory?

On our production box on NetApp with UFS we have a directory that has about 38000 first-level subdirectories. When this is tarred up and transferred to my personal dev box (Ultra 5), and i tried to untar it, i get an error like "too many links" after about 3 hours of untar.

I wrote a perl script that creates directories just to see what's the maximum, and it turns out to be 32765. my disk has sufficient space and inodes. (see below my sig)

can anyone tell me, what's the factor that controls the number of directories one can create immediately below a given dir?

and whether this file system config can be dynamically updated, or do i have to create a new file system? (or if this is entirely something else)


dir making perl script

use strict;
my $path = '/www/super_bucket/massive_files/';
chdir $path or die "cannot chdir $path: $!";
# use perl's mkdir builtin; die reports the count at which the limit is hit
for (1..40000) { mkdir $_ or die "mkdir $_ failed: $!"; }

shell session that shows the maxed out dir

[xah@hypatia ~][Fri Apr 19,17:46:32]
mkdir /www/super_bucket/massive_dirs/t
mkdir: cannot make directory `/www/super_bucket/massive_dirs/t': Too many links

[xah@hypatia ~][Fri Apr 19,17:46:47]
rmdir /www/super_bucket/massive_dirs/1

[xah@hypatia ~][Fri Apr 19,17:46:55]
mkdir /www/super_bucket/massive_dirs/t

[xah@hypatia ~][Fri Apr 19,17:46:58]
ls -d /www/super_bucket/massive_dirs/t

[xah@hypatia ~][Fri Apr 19,17:47:35]
mkdir /www/super_bucket/massive_dirs/1
mkdir: cannot make directory `/www/super_bucket/massive_dirs/1': Too many links

[xah@hypatia ~][Fri Apr 19,17:48:21]
rmdir /www/super_bucket/massive_dirs/t

[xah@hypatia ~][Fri Apr 19,17:48:29]
mkdir /www/super_bucket/massive_dirs/1

[xah@hypatia ~][Fri Apr 19,17:48:35]
/usr/bin/df -k
Filesystem            kbytes    used   avail capacity  Mounted on
/dev/dsk/c0t0d0s0    15457218 8802041 6500605    58%    /
/proc                      0       0       0     0%    /proc
fd                         0       0       0     0%    /dev/fd
mnttab                     0       0       0     0%    /etc/mnttab
swap                  843096      16  843080     1%    /var/run
swap                  843784     704  843080     1%    /tmp
/dev/dsk/c0t0d0s7    3117942  538955 2516629    18%    /export/home

[xah@hypatia ~][Fri Apr 19,17:50:00]
/usr/bin/df -F ufs -o i
Filesystem             iused   ifree  %iused  Mounted on
/dev/dsk/c0t0d0s0     305639 1648281    16%   /
/dev/dsk/c0t0d0s7      27314  368974     7%   /export/home

[xah@hypatia ~][Fri Apr 19,17:50:27]

Source groups.google.com
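In hindsight, the factor is the parent directory's hard-link count. On Solaris UFS, LINK_MAX is 32767. A directory's link count is 2 (its own name in its parent, plus its own “.” entry), plus one “..” link from each immediate subdirectory, so the most subdirectories possible is 32767 − 2 = 32765, exactly the number the script found. This limit is part of the file system itself, not a tunable setting, so it can't be raised without a different file system. A quick sketch of the link-count bookkeeping (the demo path here is made up for illustration):

```shell
# A directory's link count is 2 + (number of immediate subdirectories),
# since every child's ".." entry is a hard link back to its parent.
mkdir -p demo/a demo/b demo/c
ls -ld demo          # second column is the link count: 2 + 3 on classic unix file systems
getconf LINK_MAX .   # the hard-link ceiling of the current file system
```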