Well, as hinted above, this happens only once, so there will be no incremental runs to speed things up. It has to happen because projects come back from backup and have to be re-linked to a different share. This is on our main production storage, so it's not exactly a case of "fire up a Python instance", nor is it under version control.
Since this only happens once, it's acceptable for it to take a while; it needs manual intervention after the scan before the next step is triggered. Still, I want it to be as fast as possible.
I made it a LOT faster by inverting the rules, as hinted above. I would prefer that to be more dynamic than my current implementation.
Currently I grab all global folders that can contain the scenes, then I find out how many scenes (and subsequently shots) there are, and then parse only the subfolders that can contain max files. I would feel better if this weren't as hardcoded as it currently is. Does anyone have a recipe for translating a filter string along the lines of
Basically, it should iterate over each scene AND each shot, but it should stay open to different kinds of setups.
I think what I really want to ask is: is there a general framework for setting up folder rules to be parsed? I guess I just don't know the name for what I want, hehe.