How to recursively find and list the latest modified files in a directory with subdirectories and times?


To find all files whose status was last changed within the past N minutes:

find -cmin -N

for example:

find -cmin -5

  • Operating system: Linux

  • Filesystem type: ext3

  • Preferred solution: bash (script/oneliner), ruby, python

I have several directories with several subdirectories and files in them. I need to make a list of all these directories, constructed so that every first-level directory is listed next to the date and time of the latest created/modified file within it.

To clarify, if I touch a file or modify its contents a few subdirectory levels down, that timestamp should be displayed next to the first-level directory name. Say I have a directory structured like this:


and I modify the contents of the file example.txt, I need that time displayed next to the first-level directory alfa in human readable form, not epoch. I've tried some things using find, xargs, sort and the likes but I can't get around the problem that the filesystem timestamp of 'alfa' doesn't change when I create/modify files a few levels down.
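For reference, the behaviour being asked for can be sketched with GNU find's -printf (a sketch only; latest_per_dir is a made-up name, GNU find and coreutils assumed):

```shell
# For each first-level directory under $1, print the directory name and the
# human-readable mtime of the newest file anywhere beneath it.
# %T+ gives a lexicographically sortable timestamp (YYYY-MM-DD+HH:MM:SS...).
latest_per_dir() {
    local d
    for d in "$1"/*/; do
        printf '%s\t%s\n' "${d%/}" \
            "$(find "$d" -type f -printf '%T+ %p\n' | sort -r | head -n 1 | cut -d' ' -f1)"
    done
}
```

`latest_per_dir .` prints each first-level directory next to a timestamp like `2021-06-01+12:00:00.0000000000`, regardless of how deep the newest file sits.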

GNU find (see man find) has a -printf parameter for displaying the file's epoch mtime and relative path name.

redhat> find . -type f -printf '%T@ %P\n' | sort -n | awk '{print $2}'

I shortened halo's awesome answer to this one-liner:

stat --printf="%y %n\n" $(ls -tr $(find * -type f))

Updated: if there are spaces in filenames, you can use this modification:

OFS="$IFS";IFS=$'\n';stat --printf="%y %n\n" $(ls -tr $(find . -type f));IFS="$OFS";

I'm showing this for latest access time, you can easily modify this to do latest mod time.

There are two ways to do this:

1) If you want to avoid a global sort, which can be expensive if you have tens of millions of files, then you can do the following (position yourself in the root of the directory where you want your search to start):

linux> touch -d @0 /tmp/a
linux> find . -type f -exec tcsh -f -c 'test `stat --printf="%X" {}` -gt `stat --printf="%X" /tmp/a`' \; -exec tcsh -f -c 'touch -a -r {} /tmp/a' \; -print

The above method prints filenames with progressively newer access time and the last file it prints is the file with the latest access time. You can obviously get the latest access time using a "tail -1".
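The same single-pass idea can be had without the tcsh subshells by keeping a running maximum in awk (a sketch; latest_atime is a made-up name, GNU find assumed; it breaks on filenames containing newlines):

```shell
# One pass over find's output: remember the largest access time seen so far.
# No global sort and no per-file subprocesses, so it stays linear even over
# millions of files.
latest_atime() {
    find "${1:-.}" -type f -printf '%A@ %p\n' \
        | awk '$1 > t { t = $1; f = $0 } END { print f }'
}
```

`latest_atime .` prints one line: the epoch access time and path of the most recently accessed file. Swap %A@ for %T@ to get the latest modification time instead.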

2) You can have find recursively print the name and access time of all files in your subdirectory, then sort by access time and tail the biggest entry:

linux> \find . -type f -exec stat --printf="%X  %n\n" {} \; | \sort -n | tail -1

And there you have it...

Try this one:

find $1 -type f -exec stat --format '%Y :%y %n' "{}" \; | sort -nr | cut -d: -f2- | head

Execute it with the path to the directory where it should start scanning recursively (it supports filenames with spaces).

If there are lots of files it may take a while before it returns anything. Performance can be improved if we use xargs instead:

find $1 -type f -print0 | xargs -0 stat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head

which is a bit faster.

You may give the -printf option of find a try:

%Ak File's last access time in the format specified by k, which is either '@' or a directive for the C 'strftime' function. The possible values for k are listed below; some of them might not be available on all systems, due to differences in 'strftime' between systems.

Here is one version that works with filenames that may contain spaces, newlines, glob characters as well:

find . -type f -printf "%T@ %p\0" | sort -zk1nr
  • find ... -printf prints the file modification time (epoch value) followed by a space and the \0-terminated filename.
  • sort -zk1nr reads NUL-terminated records and sorts them reverse numerically.

As the question is tagged Linux, I am assuming GNU utils are available.

You can pipe the above into:

xargs -0 printf "%s\n"

to print modification time and filenames sorted by modification time (most recent first) terminated by newlines.
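Putting the two stages together (this is just the two commands above joined; GNU find, sort and xargs assumed):

```shell
# NUL-safe end to end: epoch mtime and path, newest first, one per line.
# Survives spaces, newlines and glob characters in filenames.
find . -type f -printf '%T@ %p\0' | sort -zk1nr | xargs -0 printf '%s\n'
```

Append `| head -n 1` to keep only the most recently modified file.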

Try this

stat --format %y $(ls -t $(find alfa/ -type f) | head -n 1)

It uses find to gather all files from the directory, ls to list them sorted by modification date, head for selecting the 1st file and finally stat to show the time in a nice format.

At this time it is not safe for files with whitespace or other special characters in their names. Write a comment if it doesn't meet your needs yet.

Quick bash function:

# findLatestModifiedFiles(directory, [max=10, [format="%Td %Tb %TY, %TT"]])
function findLatestModifiedFiles() {
    local d="${1:-.}"
    local m="${2:-10}"
    local f="${3:-%Td %Tb %TY, %TT}"

    find "$d" -type f -printf "%T@ :$f %p\n" | sort -nr | cut -d: -f2- | head -n"$m"
}

Find the latest modified file in a directory:

findLatestModifiedFiles "/home/jason/" 1

You can also specify your own date/time format as the third argument.
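For example, with an ISO-style format string (the function is repeated here so the snippet stands alone; the path and format are just illustrations):

```shell
# findLatestModifiedFiles as above: %T@ gives a sortable epoch key, which is
# stripped again by cut after sorting; the third argument is any set of
# find -printf time directives.
function findLatestModifiedFiles() {
    local d="${1:-.}"
    local m="${2:-10}"
    local f="${3:-%Td %Tb %TY, %TT}"
    find "$d" -type f -printf "%T@ :$f %p\n" | sort -nr | cut -d: -f2- | head -n"$m"
}

findLatestModifiedFiles . 5 '%TY-%Tm-%Td %TH:%TM'
```

This lists the five newest files under the current directory, each prefixed with a timestamp like 2019-03-07 08:30.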

The following returns a string with the timestamp and the name of the file that has the most recent timestamp:

find $Directory -type f -printf "%TY-%Tm-%Td-%TH-%TM-%TS %p\n" | sed -r 's/([[:digit:]]{2})\.([[:digit:]]{2,})/\1-\2/' | sort --field-separator='-' -nrk1 -nrk2 -nrk3 -nrk4 -nrk5 -nrk6 -nrk7 | head -n 1

Resulting in output of the form: <yy-mm-dd-hh-mm-ss.nanosec> <filename>

Ignoring hidden files — with nice & fast time stamp

Handles spaces in filenames well — not that you should use those!

$ find . -type f -not -path '*/\.*' -printf '%TY.%Tm.%Td %THh%TM %Ta %p\n' |sort -nr |head -n 10

2017.01.28 07h00 Sat ./recent
2017.01.21 10h49 Sat ./hgb
2017.01.16 07h44 Mon ./swx
2017.01.10 18h24 Tue ./update-stations
2017.01.09 10h38 Mon ./stations.json


Both the perl and Python solutions in this post helped me solve this problem on Mac OS X:

Quoting from the post:


find . -type f -print |
perl -l -ne '
    $_{$_} = -M;  # store file age (mtime - now)
    END {
        $, = "\n";  # separate the printed names with newlines
        print sort {$_{$b} <=> $_{$a}} keys %_;  # print by decreasing age
    }'


find . -type f -print |
python3 -c 'import os, sys
times = {}
for f in sys.stdin.read().splitlines():
    times[f] = os.stat(f).st_mtime
for f in sorted(times, key=times.get):
    print(f)'

This could be done with a recursive function in bash too.

Let F be a function that displays the time of a file, in a format that must be lexicographically sortable, yyyy-mm-dd etc. (OS-dependent?):

F(){ stat --format %y "$1";}                # Linux
F(){ ls -E "$1"|awk '{print$6" "$7}';}      # SunOS: maybe this could be done easier

Let R be the recursive function that runs through directories:

R(){ local f;for f in "$1"/*;do [ -d "$f" ]&&R "$f"||F "$f";done;}

And finally:

for f in *;do [ -d "$f" ]&&echo `R "$f"|sort|tail -1`" $f";done

For file search:

find / -xdev -name settings.xml   --> whole computer
find ./ -xdev -name settings.xml  --> current directory & its subdirectories

For files with a given extension:

find . -type f -name "*.iso"
