I am currently writing a tool that gathers all .max files in a given directory, including subfolders. The current approach works nicely, and after some tweaking it is already quite a bit faster, but it still doesn't really scale. Here's what I have currently:
    import fnmatch
    import os

    def gatherMaxFiles(self, path):
        EXCLUDES = ["SOUND", ]
        files = []
        for root, subFolders, filenames in os.walk(path):
            # Remove excluded folders in place so os.walk skips them
            for exclude in EXCLUDES:
                if exclude in subFolders:
                    subFolders.remove(exclude)
            for filename in fnmatch.filter(filenames, '*.max'):
                files.append(os.path.join(root, filename))
        return files
EXCLUDES is a list of subfolder names to skip entirely (parsing the render folder for max files is pointless).
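For reference, the per-exclude `remove` loop can be collapsed into a single slice assignment, which keeps the in-place pruning that os.walk relies on and checks each directory name against a set instead of a list. A minimal standalone sketch (function name and set-based EXCLUDES are my variations, not the original code):

```python
import os

EXCLUDES = {"SOUND"}  # set membership test is O(1) per directory name

def gather_max_files(path):
    files = []
    for root, subFolders, filenames in os.walk(path):
        # Slice assignment mutates the existing list object, so
        # os.walk actually skips the pruned directories.
        subFolders[:] = [d for d in subFolders if d not in EXCLUDES]
        files.extend(os.path.join(root, f)
                     for f in filenames if f.lower().endswith(".max"))
    return files
```

Note that `subFolders[:] = ...` is essential: rebinding the name with `subFolders = ...` would leave os.walk's own list untouched and the excluded folders would still be visited.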
Given the size of the folders I have to parse, this still seems awfully slow (up to half an hour).
Any hints on how to speed that up? Go native?