watch massive folder structure - can it be done efficiently?


Hello,
I am trying to watch for changes in a massive folder structure consisting of photos.

What I am trying to achieve: when a new photo is added to one of the ~1000+ subfolders, an action is triggered that finds the oldest photo in that folder and copies its IPTC/EXIF data onto the new photo via exiftool.

What I have done so far:
- made a rule to go through all folders, as described in the subdirectories manual
- made a rule which is only activated once, to match all already existing photos, so that I can use Date Last Matched = blank in the next rule
- made a rule which matches Date Last Matched = blank and JPEG images, and runs an embedded shell script:


#get directory of the added photo
DIR=$(dirname "${1}")
cd "$DIR" || exit 1

#get oldest jpg (NR==2 skips the "total" line printed by ls -l;
#note: parsing ls output breaks on filenames containing spaces)
OLDEST=$(ls -ltr *.{jpg,JPG} | grep -v '^d' | awk 'NR==2 {print $NF; exit}')

#copy tags from the oldest photo onto the new one
exiftool -TagsFromFile "$OLDEST" "-all:all>all:all" "${1}"
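As an aside, the "oldest jpg" step can be done without parsing `ls`, which is fragile when filenames contain spaces. A minimal sketch, assuming a hypothetical helper name `oldest_jpg` and GNU `stat` (on macOS the `stat -f '%m'` fallback is used instead):

```shell
# Hypothetical helper: pick the oldest .jpg/.JPG in a directory by
# comparing modification times directly, instead of parsing `ls` output.
oldest_jpg() {
  local dir="$1" oldest="" oldest_mtime=""
  local f mtime
  for f in "$dir"/*.jpg "$dir"/*.JPG; do
    [ -e "$f" ] || continue
    # GNU stat first; fall back to BSD/macOS stat syntax.
    mtime=$(stat -c '%Y' "$f" 2>/dev/null || stat -f '%m' "$f")
    if [ -z "$oldest_mtime" ] || [ "$mtime" -lt "$oldest_mtime" ]; then
      oldest="$f"
      oldest_mtime="$mtime"
    fi
  done
  printf '%s\n' "$oldest"
}
```

The result could then be passed to exiftool as before: `exiftool -TagsFromFile "$(oldest_jpg "$DIR")" ... "${1}"`.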

So the goal is simply to drag photos into the folders they belong in, and actions are triggered to tag them according to the other photos in that folder, with eventually some more steps afterwards.

Now my questions:
- While what I have constructed kind of works, is there a better way to tackle this?
- Is this ever going to be efficient enough to be used in a creative workflow? I have not tested it at scale yet. The goal is to automate as much as possible, so eventually I will add uploading of these photos via cyberduck-cli as well as making a copy of the photos elsewhere.
- If Hazel is not efficient enough, what else could I try?
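One alternative worth mentioning for very large trees is a dedicated filesystem watcher such as the fswatch CLI, which streams events for a whole directory tree to a handler script. A hedged sketch (the `handle_event` function and archive path are illustrative, not part of any existing setup):

```shell
# Illustrative handler: react only to JPEG paths from the event stream.
handle_event() {
  local path="$1"
  case "$path" in
    *.jpg|*.JPG) printf 'process %s\n' "$path" ;;
    *) : ;;  # ignore events for non-JPEG files
  esac
}

# Long-running watcher (requires fswatch, e.g. `brew install fswatch`);
# -r watches recursively, -0 delimits paths with NUL so spaces are safe:
#   fswatch -0 -r /path/to/archive | while IFS= read -r -d '' f; do
#     handle_event "$f"
#   done
```

This sidesteps per-file rule matching entirely, at the cost of writing and running the watcher yourself.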

thanks
roxneft
 
Posts: 3
Joined: Thu Feb 07, 2019 8:30 am

Is it possible to have Hazel file it for you? If so, instead, use a staging folder where you drop photos. Hazel can then move it to the right subfolder, tag and import it for you. Doing it that way, you don't have to have Hazel monitor tons of files that probably won't ever be processed again.
Mr_Noodle
Site Admin
 
Posts: 11195
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Mr_Noodle wrote:Is it possible to have Hazel file it for you? If so, instead, use a staging folder where you drop photos. Hazel can then move it to the right subfolder, tag and import it for you. Doing it that way, you don't have to have Hazel monitor tons of files that probably won't ever be processed again.


Well, in theory yes, but that's an extra step.
Suppose the photos in question have something to do with ice cream and bicycles, and are ready to be added to the archive, but missing the desired metadata.
The workflow now is to search the folders with these tags via Spotlight, open the right folders and paste the files in. Since the photos already sitting in these folders have the right metadata, it is copied out of one of them into the new photos.
One could move these folders to a "processingFolder" and let the magic happen there...
But as I see it, it is not possible to tag and move them using a staging folder: the folders where the different photos have to land are too different, with different metadata in every case...
roxneft

I have now developed my script a little further.

Now I think I still have to tackle a corner case:

  • "Date Last Matched" seems to match on the filename only, not the full path plus filename.
  • When I add a photo whose filename has already been matched before in another folder, the rule which updates the metadata is not executed, because the name is already matched, even though the file was added to a different folder.
  • If I rename the photo, it is recognized by the rule and the metadata gets replaced.
  • So I would have to maintain an index of all photo filenames, and when a photo is added whose filename was already used, rename it first.

  • Now how would I tackle this?
  • I guess I would have to catch that with my own SQLite database to be efficient enough? Or use Spotlight's database somehow?
  • I would have to add a shell script to the conditions which compares the basename of the added photo with all files in the database, and if there is a match, rename the file before proceeding...
Or is there a simpler way to watch for new files that does not need Date Last Matched?
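The filename-index idea above could be sketched with the sqlite3 CLI. Everything here is an assumption for illustration: the database location, the table name, and the `seen_before` helper are hypothetical, and the naive quoting would need hardening for filenames containing single quotes.

```shell
# Hypothetical index location - adjust to taste.
DB="${HOME}/.photo_index.sqlite"

init_index() {
  sqlite3 "$DB" 'CREATE TABLE IF NOT EXISTS photos (name TEXT PRIMARY KEY);'
}

# Returns 0 if the basename was seen before, 1 if it is new.
# Records new names as a side effect.
# NOTE: naive SQL quoting - a name containing a single quote would break this.
seen_before() {
  local name="$1"
  if [ -n "$(sqlite3 "$DB" "SELECT 1 FROM photos WHERE name = '$name';")" ]; then
    return 0
  fi
  sqlite3 "$DB" "INSERT INTO photos (name) VALUES ('$name');"
  return 1
}
```

A rule's script could then call `seen_before "$(basename "${1}")"` and rename the file (e.g. by appending a counter) whenever it returns 0, before the metadata-copying step runs.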

That's the script I use now:

Conditions:
Code: Select all
Date Last Matched ->  is Blank

Code: Select all
Kind ->  is -> Jpg Image


Code: Select all
#get directory of the added photo
DIR=$(dirname "${1}")
cd "$DIR" || exit 1

#count all jpgs
COUNT=$(ls -1q *.{jpg,JPG} 2>/dev/null | wc -l)
#echo "$COUNT" > "$DIR/COUNT.txt"

#only execute when more than one jpg is in the folder
MIN=1
if [ "$COUNT" -gt "$MIN" ]; then

  #get oldest jpg (NR==2 skips the "total" line printed by ls -l)
  OLDEST=$(ls -ltr *.{jpg,JPG} | grep -v '^d' | awk 'NR==2 {print $NF; exit}')


  #copy tags from oldest Photo to all new Photos
  exiftool -overwrite_original -TagsFromFile "$OLDEST" -EXIF:ImageDescription -EXIF:Artist -EXIF:Copyright -XMP:Country -XMP:State -XMP:AuthorsPosition -XMP:Rights -XMP:Creator -XMP:Subject -XMP:Description -XMP:Title -XMP:CreatorAddress -XMP:CreatorCity -XMP:CreatorRegion  -XMP:CreatorPostalCode -XMP:CreatorCountry -XMP:CreatorWorkTelephone -XMP:CreatorWorkEmail -XMP:ImageCreatorName -XMP:CopyrightOwnerName -XMP:LicensorName -IPTC:ObjectName -IPTC:Keywords -IPTC:By-line -IPTC:By-lineTitle -IPTC:Province-State  -IPTC:Country-PrimaryLocationName -IPTC:Headline -IPTC:CopyrightNotice -IPTC:Caption-Abstract  "${1}"
fi
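The `ls -1q | wc -l` count can also miscount when filenames contain newlines. A small sketch of counting with a shell glob instead (the `count_jpgs` name is just for illustration):

```shell
# Count .jpg/.JPG files in the current directory without parsing ls.
count_jpgs() {
  local n=0 f
  for f in *.jpg *.JPG; do
    # An unmatched glob stays literal, so check the file actually exists.
    [ -e "$f" ] && n=$((n + 1))
  done
  printf '%s\n' "$n"
}
```

Then `COUNT=$(count_jpgs)` replaces the `ls | wc -l` line.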
roxneft

roxneft wrote:But as I see it, it is not possible to tag and move them using a staging folder, the folders where the different photos have to land are too different, with also different metadata in every case...


I'm not sure I'm following you on this. Could you explain why this wouldn't work?

roxneft wrote:"Date Last Matched" seems to match the filename only, not the path plus filename.


Not sure what you mean by this. Each file is distinct, though if you did a move or copy of a file, it's possible some of its metadata travelled with it.

Can you take a step back and outline a concrete example of a file going through this workflow?
Mr_Noodle
Site Admin

