Distributed build systems for batch processing

Hi All,

Has anyone played with distributed build systems (SCons, IncrediBuild, SN-DBS, etc.) for batch processing of assets? Did you get anything going in a production environment?

In my test case I have been trying to get a bunch of mocap conversion jobs to run in Maya standalone (mayapy) across an IncrediBuild grid. I got everything up and running; the only problem is that the remote processes crash when initializing the maya.standalone module. The jobs complete successfully on the host computer, but not on any remote machines.

Any advice or guidance welcome,

Keir

We use Incredibuild XGE for distributed data processing. Specifically for “crunching” our intermediate export formats into game target files. Flat-out awesome.

I have no experience hooking up Maya or Max to Incredibuild, however.

When Bosse was first exploring the Incredicrunch concept, he, Adam, and I talked about using IncrediBuild and MaxScript to parallelize batch processing of Max files. He said it was possible, and probably not hard to set up. We never got time on our schedules to investigate any further, though.

Is Maya standalone what is used for network rendering? I know that when you bring Max up in renderfarm mode it only has the capability to render scenes. That’s the tradeoff for being able to install one copy of Max on N renderfarm machines and not need a license for each installation. Maya might have the same limitation.

Maya standalone is a bit more than a render interface.

mayapy.exe is a full blown python runtime. You can use it just as you would any python install.

maya.standalone is a module you can import into the python runtime. When it is initialized it starts up a full Maya session, just without the UI. You can do anything that the GUI version of Maya can do (except draw UIs).
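For anyone who hasn’t used it, the bootstrap is tiny. Here is a minimal sketch; the guard function name is my own, and it only does anything useful when run under mayapy (under plain Python the import simply fails):

```python
def init_maya_standalone():
    """Start a UI-less Maya session; returns False outside mayapy."""
    try:
        import maya.standalone
    except ImportError:
        return False  # plain Python interpreter, not mayapy
    # Documented entry point for batch-mode Maya.
    maya.standalone.initialize(name="python")
    return True
```

After initialize() returns you can import maya.cmds and script against the scene exactly as you would in the GUI.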

A lot of our tools are set up so that you can use them in the GUI to operate on the current scene, or you can run them in batch mode and pass the tool a list of files to process. This is great for exporting assets, cleaning up bloated scenes, and, in my example above, processing mocap data.
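The dual-mode pattern is simple to structure. A minimal sketch (function names are hypothetical, and the per-scene work is stubbed out):

```python
import sys

def process_scene(path):
    """Placeholder for the real per-scene work (export, cleanup, ...)."""
    return "processed " + path

def main(argv):
    if argv:
        # Batch mode: the caller passed a list of scene files.
        return [process_scene(p) for p in argv]
    # GUI mode: operate on whatever scene is currently open (stubbed).
    return [process_scene("<current scene>")]

if __name__ == "__main__":
    print(main(sys.argv[1:]))
```

The same process_scene() body services both entry points, so the tool behaves identically in the GUI and on the farm.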

The mocap processor is pretty cool. It reads the input files or directories, builds a list of jobs, and sets up a job queue. It then spawns four new Maya instances (mayapy.exe, one per core) which request work from the job queue. This setup meant our animators could process the data a lot quicker. I was hoping to take it to the next level with IncrediBuild, but so far Maya isn’t playing nice.
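The queue-and-workers shape looks roughly like this. This is a hedged sketch: it uses threads in one process for illustration, whereas the real setup spawns separate mayapy.exe processes pulling from a shared queue, and convert() stands in for the actual mocap conversion:

```python
import queue
import threading

def convert(job):
    """Placeholder for the per-file mocap conversion."""
    return "converted " + job

def run_jobs(jobs, workers=4):
    """Drain a job queue with N workers, one per core in our setup."""
    q = queue.Queue()
    for j in jobs:
        q.put(j)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                job = q.get_nowait()
            except queue.Empty:
                return  # queue drained, worker exits
            r = convert(job)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results)
```

Because workers pull rather than being assigned fixed slices, a slow file doesn’t stall the whole batch.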

I use scons to process 3dsmax models into an intermediate format.

I am only using it in a “distributed” way in the sense that all of my users use a shared cache, so that no one has to convert the same asset twice.

SCons has builders. Builders map an input file to an output file, backed by a function that does the conversion. You can write custom builders.
The function I use to back my Max2Intermediate builder creates a MaxScript with two inputs.
The first input is the original 3dsmax file. The second is the desired output intermediate file.
It then executes Max with the MaxScript as an argument and waits until Max completes. It then returns control back to scons.
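A rough sketch of what such a builder action could look like. The MaxScript body, executable name, and command-line flags here are my assumptions for illustration, not JonnyRo’s actual code:

```python
import subprocess

# Illustrative MaxScript: load the source scene, export the
# intermediate, then quit so control returns to the caller.
MAXSCRIPT_TEMPLATE = """\
loadMaxFile @"%(src)s"
exportFile @"%(dst)s" #noPrompt
quitMax #noPrompt
"""

def make_maxscript(src, dst):
    """Render the two-input MaxScript for one conversion job."""
    return MAXSCRIPT_TEMPLATE % {"src": src, "dst": dst}

def max_to_intermediate(src, dst, script_path="convert.ms"):
    """Write the script and run Max on it; blocks until Max exits."""
    with open(script_path, "w") as f:
        f.write(make_maxscript(src, dst))
    subprocess.check_call(["3dsmax.exe", "-U", "MAXScript", script_path])
```

Since check_call() blocks, scons only marks the target up to date once Max has actually finished writing the intermediate file.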

Because scons can cache anything that has a defined input, output, and black box transfer function, it ends up saving a lot of time. For further benefit the cache can be shared by everyone on the network. Artists, engineers, etc.
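Wiring that up in an SConstruct is only a few lines. A sketch, where the action function, suffixes, and cache path are all assumptions on my part:

```python
# SConstruct fragment: register the custom builder and point every
# user at one shared network cache, so an asset converted once is
# fetched from the cache rather than rebuilt.
env = Environment()
bld = Builder(action=max_to_intermediate_action,  # hypothetical action
              suffix=".int", src_suffix=".max")
env.Append(BUILDERS={"Max2Intermediate": bld})

env.Max2Intermediate("assets/hero.int", "assets/hero.max")

# Shared, network-visible cache of build results.
CacheDir("//fileserver/scons_cache")
```

CacheDir is stock SCons, so the sharing comes almost for free once the builder’s inputs and outputs are declared honestly.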

Hey JonnyRo,
That’s an idea I hadn’t considered; it would be nice to get cached results.

Have you tried running your Max scons scripts in a distributed way across multiple systems?

Quick update to this thread,
I have had some success with my task using a custom version of SN-DBS.
As far as I know it is only available to registered Sony developers, but if you are interested (and have your Sony pass) there is a forum thread on scedev all about it.

Keir