I tried to use RecursiveDirectoryIterator to dump all files (and their properties, such as size/is_link/is_dir/mtime/perms/owner/group) from a large directory (~400,000 files), filtering for some specific wanted files/folders.
Using RecursiveDirectoryIterator and SplFileInfo, the dump took about 50 seconds, but it worked.
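Roughly, the SPL version looked like this (a minimal sketch, not the original script; the root path and the filter criterion are placeholders):

```php
<?php
$root = '/data/large-dir'; // placeholder path

$dirIterator = new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS);
$iterator    = new RecursiveIteratorIterator($dirIterator, RecursiveIteratorIterator::SELF_FIRST);

$dump = [];
foreach ($iterator as $path => $info) {      // $info is an SplFileInfo
    // Filter out unwanted files/folders (criterion is hypothetical).
    if (strpos($path, '.git') !== false) {
        continue;
    }
    $dump[$path] = [
        'size'    => $info->getSize(),
        'is_link' => $info->isLink(),
        'is_dir'  => $info->isDir(),
        'mtime'   => $info->getMTime(),
        'perms'   => $info->getPerms(),
        'owner'   => $info->getOwner(),
        'group'   => $info->getGroup(),
    ];
}
```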
However, to improve performance, I decided to write another version of the same script using only the plain file functions, such as readdir, filesize, filemtime, etc., and handling the recursion myself (if (is_dir($path)) doRecursivity($path);).
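The plain-function version was built along these lines (again a minimal sketch; the function name, paths, and filter are hypothetical):

```php
<?php
function dumpDir(string $dir, array &$dump): void
{
    $handle = opendir($dir);
    if ($handle === false) {
        return;
    }
    while (($entry = readdir($handle)) !== false) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . DIRECTORY_SEPARATOR . $entry;

        // Filter out unwanted files/folders here (criteria are hypothetical).

        $dump[$path] = [
            'size'    => filesize($path),
            'is_link' => is_link($path),
            'is_dir'  => is_dir($path),
            'mtime'   => filemtime($path),
            'perms'   => fileperms($path),
            'owner'   => fileowner($path),
            'group'   => filegroup($path),
        ];

        // Recurse into subdirectories manually (skipping symlinked dirs).
        if (is_dir($path) && !is_link($path)) {
            dumpDir($path, $dump);
        }
    }
    closedir($handle);
}

$dump = [];
dumpDir('/data/large-dir', $dump); // placeholder path
```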
After running it, the script went from ~50s to only ~20s to complete (on Linux CentOS 7, SSD, ~300 IOPS).
Strangely, on Windows 7 with a SATA3 drive (exactly the same files, mirrored), the time only went from ~63s to ~57s.
I believe this overhead comes from the OO approach of SPL, which runs a lot of extra code to perform the same tasks with more safety checks, whereas the plain file functions are closer to thin aliases for the corresponding C functions and are therefore much faster.
So, if you're dealing with a large number of files, RecursiveDirectoryIterator is probably not the way to go.