Hello,
I am trying to watch for changes in a massive folder structure of photos. What I want to achieve: when a new photo is added to one of the ~1000+ subfolders, an action is triggered that finds the oldest photo in that folder and copies its IPTC/EXIF data onto the new photo via exiftool.
What I have done so far:
- made a rule that goes through all the folders, as described in the manual's section on subdirectories
- made a rule that is activated only once, to match the already existing photos, so that I can use "Date Last Matched is blank" in the next rule
- made a rule that matches "Date Last Matched is blank" plus jpg images, and then runs this embedded shell script:
#get directory of the new photo (Hazel passes it as $1)
DIR=$(dirname "$1")
cd "$DIR" || exit 1
#get oldest jpg: -t sorts newest first, -r reverses to oldest first
#(note: parsing ls output breaks on filenames containing newlines)
OLDEST=$(ls -tr -- *.jpg *.JPG 2>/dev/null | head -n 1)
#do nothing if no jpg was found, or if the new photo is itself the oldest
[ -z "$OLDEST" ] && exit 0
[ "$OLDEST" = "$(basename "$1")" ] && exit 0
#copy all tags from the oldest photo into the new one
exiftool -TagsFromFile "$OLDEST" "-all:all>all:all" "$1"
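Since the script parses ls output, filenames with spaces or other special characters can trip it up. As a more defensive sketch (the `oldest_jpg` helper name is my own, not anything Hazel provides), the oldest JPEG can be found with a plain glob loop and the `-ot` ("older than") file test, which is supported by bash and macOS /bin/sh, though not strictly POSIX:

```shell
#!/bin/sh
# Hypothetical helper: print the oldest *.jpg / *.JPG in the current
# directory without parsing ls; handles spaces in filenames.
oldest_jpg() {
  oldest=
  for f in ./*.jpg ./*.JPG; do
    [ -f "$f" ] || continue              # skip unexpanded globs
    if [ -z "$oldest" ] || [ "$f" -ot "$oldest" ]; then
      oldest=$f                          # keep the older of the two
    fi
  done
  printf '%s\n' "$oldest"
}
```

Its result could then feed exiftool the same way, e.g. `exiftool -TagsFromFile "$(oldest_jpg)" "-all:all>all:all" "$1"`.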
So the goal is simply to drag photos into the folders where they belong, and the rules are triggered to tag them consistently with the other photos in that folder, and eventually to do a few more things afterwards.
Now my questions:
- While what I have constructed kind of works, is there a better way to tackle this?
- Is this ever going to be efficient enough for a creative workflow? I have not tested it at scale yet. The goal is to automate as much as possible, so eventually I will add uploading of these photos via cyberduck-cli, as well as making a copy of them elsewhere.
- If Hazel is not efficient enough, what else could I try?
Thanks!