Using HAZEL to run an Automator Workflow


Moderator: Mr_Noodle

Using HAZEL to run an Automator Workflow Sun Jun 07, 2020 8:14 am • by Paul1762
I am trying to use Hazel to run an Automator workflow app, but when I try to point the rule at the appropriate Automator script, all the scripts are greyed out.

Do I need to convert the workflow? I have just tried running the Run Shell Script directly with Hazel, without success.

Ideas?

Thanks Paul
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Mon Jun 08, 2020 10:03 am • by Mr_Noodle
What's the extension on the file?
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Thu Jun 11, 2020 6:44 am • by Paul1762
.app
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Thu Jun 11, 2020 10:34 am • by Mr_Noodle
Try saving the workflow as a .workflow file instead.
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Thu Jun 11, 2020 12:54 pm • by Paul1762
Changing the app to a .workflow allowed me to select the workflow in Hazel... however, after running it, Hazel did process the folder but did not make the changes I expected. When run independently, the workflow works as expected.
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Fri Jun 12, 2020 10:45 am • by Mr_Noodle
What did it do differently? Can you post the workflow?
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Fri Jun 12, 2020 2:40 pm • by Paul1762
It does not run through and rename the images...

#!/bin/zsh

# associative array keyed by SHA-1 checksum; each value holds the original
# filename plus any duplicate filenames, "|"-delimited
typeset -gA dx=()
typeset -ga exifdata=()
typeset -giZ2 dupecnt=0
typeset -giZ3 nseq=1
typeset -g RPT="$HOME/Desktop/dupe_rpt.txt"

function rpt_writer () {
    # print the SHA key, the original file, and any duplicates (in any child dir)
    for k in ${(@k)dx}; do
        print ${(k)dx[$k]} >> ${RPT}
        print ${${(s:|:)dx[$k]}[1]} >> ${RPT}
        printf '\t%s\n' ${${(s:|:)dx[$k]}[2,-1]} >> ${RPT}
        printf '\n' >> ${RPT}
    done
    return
}

: <<'COMMENT'
Recursively processes child folders in the specified parent directory. Extracts
image data using ExifTool or, where absent, fabricates that data to provide a
standard TEMPLATE for file renaming. Encountered duplicates add a 2-digit
suffix to the TEMPLATE, which forms the renaming OUTFMT string. The report
writer audits duplicate images.
COMMENT

# parent folder passed into the script on the command line
STARTDIR="${1:a}"

# case-insensitive globbing
setopt NOCASE_GLOB

for f in ${STARTDIR}/**/*.(raf|cr2|dng|jpg|jpeg|tif|tiff)(.N); do
    # ideally, place these three results into the array if the image supports them
    exifdata=( $(/usr/local/bin/exiftool -s3 -d "%Y%m%d%H%M%S" -DateTimeOriginal \
        -EXIF:FocalPlaneXResolution \
        -EXIF:FocalPlaneYResolution "${f}") )

    # check the number of items in the array; there will never be exactly 2
    case ${#exifdata[@]} in
        0)
            # could not get DateTimeOriginal, so wing it with the creation (Birth) date
            CD="$(stat -t '%Y%m%d%H%M%S' -f '%SB' "${f}")"
            exifdata=(${CD} 0000 0000)
            ;;
        1)
            # got DateTimeOriginal but no FocalPlaneResolution data
            exifdata+=(0000 0000)
            ;;
        3)
            # if FocalPlaneResolution is floating point, round to the nearest integer
            [[ ${exifdata[2]} == *.* ]] && exifdata[2]=$(printf "%.0f" ${exifdata[2]})
            [[ ${exifdata[3]} == *.* ]] && exifdata[3]=$(printf "%.0f" ${exifdata[3]})
            ;;
        *)
            # should not get here, as there will never be just two items
            continue
            ;;
    esac

    # yyyymmddHHMMSS fx fy seq
    # 20200530221158|6532|6532|_001
    # the (j) flag concatenates the three array elements into one string
    # (sequence element deleted from below to stop sequencing)
    TPLATE=${f:a:h}/${(j::)exifdata}_${nseq}

    # reset the assoc array and duplicates counter when the child folder changes
    # (disabled: curdir/prevdir are not maintained in this version)
    # if [[ $curdir != $prevdir ]]; then
    #     dupecnt=0
    # fi

    # get the 40-character SHA-1 checksum of the current file
    dkey=$(shasum -a 1 "${f}" 2> /dev/null | cut -d ' ' -f1)

    # check whether the current file is a duplicate by key lookup;
    # use not (!) because the non-duplicate branch is the common case
    if [[ ! ${dx[(k)$dkey]} ]]; then
        # non-duplicate processing
        OUTFMT="${TPLATE}.${f:e}"
        # push a new key/value pair: SHA key plus original filename with delimiter
        dx+=( $dkey "|${OUTFMT} => ${f}" )
    else
        # duplicate file processing
        # count of duplicate filenames already recorded, excluding the original file
        dupecnt=$(( ${(ws:|:)#dx[(k)$dkey]} - 1 ))
        (( dupecnt++ ))
        OUTFMT="${TPLATE}-${dupecnt}.${f:e}"
        # add this filename to the key's list of duplicate names
        dx[$dkey]+="|${OUTFMT} => ${f}"
    fi

    # rename the image
    mv "${f:a}" "${OUTFMT}"
done

rpt_writer

# release associative array storage
dx=()
exit
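
For reference, a minimal example of running this script on its own from Terminal, assuming it has been saved to a standalone file (the script path and folder below are only illustrative); the parent folder is passed as the first argument and becomes STARTDIR:

# hypothetical script location; the folder argument fills $1 / STARTDIR
zsh ~/Scripts/rename_images.sh ~/Pictures/Temp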
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Mon Jun 15, 2020 10:17 am • by Mr_Noodle
That looks more like a shellscript. Is there a reason you are using Automator here?
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Mon Jun 15, 2020 11:28 am • by Paul1762
There is no reason, except that I use Automator to ask for the folder to indicate where to search...

Paul
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Tue Jun 16, 2020 10:36 am • by Mr_Noodle
Hazel passes in the file that matches the rule. Is there a reason why you need Automator to additionally prompt?
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Tue Jun 16, 2020 11:42 am • by Paul1762
I used Automator to get this working as a proof of concept, and now I know the script works... I would like to run it with Hazel so that I can add other tasks in...

How should I run it in Hazel as part of a workflow?

Paul
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Wed Jun 17, 2020 10:27 am • by Mr_Noodle
Since it's a shellscript, use the "Run shellscript" action. You'll need to take into account that the file matching the rule is passed in as the first argument.
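
A minimal sketch of what the embedded script body could look like, assuming the Hazel rule matches the parent folder and the renaming script posted above is saved to a standalone file (the path is hypothetical); Hazel hands the matched item to the embedded script as its first argument, $1:

# embedded script in Hazel's shell script action: the matched file or folder
# arrives as the first argument, $1; forward it to the renaming script,
# which expects the parent folder as its own first argument
/bin/zsh "$HOME/Scripts/rename_images.sh" "$1"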
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Wed Jun 17, 2020 11:27 am • by Paul1762
Thanks very much for your post...

Can you expand on passing in the first argument please...

Paul
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm

Re: Using HAZEL to run an Automator Workflow Thu Jun 18, 2020 10:34 am • by Mr_Noodle
Check the help. How you access the argument differs depending on the scripting language you use, but the script editor should tell you which one.
Mr_Noodle
Site Admin
 
Posts: 11247
Joined: Sun Sep 03, 2006 1:30 am
Location: New York City

Re: Using HAZEL to run an Automator Workflow Sat Jul 04, 2020 9:34 am • by Paul1762
I have been working on trying to get this to run; it seems I have complicated what I want to achieve.

So what I thought is that I will use exiftool to correct the file extensions and also rename the files.

I can run the following command lines through Terminal; however, I would like to create a Hazel workflow.

To rename a file:
exiftool '-FileName<DateTimeOriginal' -d "%Y%m%d_%H.%M.%S%%-c.%%e"

Rebuild and fix corrupted EXIF metadata:
exiftool -all= -tagsfromfile @ -all:all -unsafe -icc_profile -F

Correct the extension:
exiftool -ext '*' '-filename<%f.$fileTypeExtension'

I have created a Hazel workflow with three rules, one for each of the above; however, I get errors when I try this out.

2020-07-04 14:28:36.892 hazelworker[8521] [Error] Shell script failed: Error processing shell script on file /Users/Paul/Pictures/Temp/test1.dng.
2020-07-04 14:28:36.893 hazelworker[8521] Shellscript exited with non-successful status code: 1
2020-07-04 14:28:41.118 hazelworker[8521] Caught signal: 15

what am I doing incorrectly...

Many Regards Paul
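
For comparison, a minimal sketch of how the first of these commands might be wrapped in a Hazel shell script action, assuming the matched file arrives as "$1" as described earlier in the thread. Note that each exiftool command needs a target file or folder appended; whether that omission is the actual cause of the status 1 error here is only an assumption:

# Hazel embedded shell script: rename the matched image from DateTimeOriginal;
# exiftool needs the target appended, and inside Hazel that is the matched file, "$1"
/usr/local/bin/exiftool '-FileName<DateTimeOriginal' -d "%Y%m%d_%H.%M.%S%%-c.%%e" "$1"

The other two commands would likewise need "$1" appended as their target.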
Paul1762
 
Posts: 49
Joined: Sat Nov 08, 2014 5:32 pm
