Distribution techniques for external Python tools

Hi Guys, hopefully I am posting this in the correct area of the forum, forgive me if it is incorrect.

First, the question: What are some ways to distribute a Python tool to other artists (who are using Windows)?

I have been tasked with making some external batch tools (currently needed on Windows only). The studio is using Max (I am a Maya/MEL guy), and they would like to have some tools for moving files, and the like OUTSIDE of Max.

Because Python is one of the easier languages to pick up in a short time, has cross-platform options, and comes with plenty of time-saving modules, I have chosen to use Python for the external tools. Using my first tool as an example: open a file dialog, pick a source directory, open a 2nd file dialog, pick a destination directory, then have the script copy all *.max files from source to dest.
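For reference, here's roughly what that first tool boils down to (a sketch; I've used tkinter from the standard library here instead of wxPython, just to keep the example dependency-free; the module is named tkFileDialog on Python 2):

```
# copy_max_files.py -- rough sketch of the tool described above
import glob
import os
import shutil
from tkinter import Tk, filedialog

root = Tk()
root.withdraw()  # hide the empty root window tkinter creates

# Two directory pickers: source, then destination.
src = filedialog.askdirectory(title='Pick the SOURCE directory')
dst = filedialog.askdirectory(title='Pick the DESTINATION directory')

if src and dst:
    # Copy every .max file across, preserving timestamps.
    for path in glob.glob(os.path.join(src, '*.max')):
        shutil.copy2(path, dst)
```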

Currently, my understanding is that I need to install Python (and any modules like wxPython that I use in the process) on every artist’s computer, which sounds lame to me. I have also read there are some tools which will package my script and modules into an exe (so far, I have only tried packing up a print “Hello World” script into an exe, which failed miserably, so I am not all that hopeful about that yet).

For those of you who distribute python scripts to your artists, do you have a good way of doing this?

Thanks again for your help and info guys!
Ken

Everyone is going to do it a little differently, but as you mentioned, you really have 3 options: copy all modules to artist machines, copy all modules to a network share, or bundle it all into an exe.

The first will be the fastest and most reliable way. It does require some work on the artist’s end, as they have to acquire the files in the first place. You could use something like git and just push changes to artist machines, or maybe set up a file watcher on their machines as well.

The second is easiest for the TA, as you only have to push files to one place and artists get what they need. The problem there is network reliability and download speed. This may or may not be an issue for you.

I wouldn’t recommend the third, as you’ll have a lot of redundant files. If you are having to bundle wx with every single tool, the files will get big for no reason. And as you said, it’s certainly not as intuitive to get going. I’m also not sure of the debugging hurdles it would place on your code.

Our studio uses the first and we have the artists sync their perforce workspace when new tools are available.

You’re not alone in being irritated; this is Python’s biggest weakness.

Py2Exe, which you tried, is pretty standard for doing what you want in Windows land; unfortunately it’s not a simple one-click affair. If you’re having trouble, you might want to read up on the unhappy state of python distribution tools and also look further at how setup.py files are structured.
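For what it’s worth, a bare-bones py2exe setup.py looks something like this (the script name here is just a placeholder):

```
# setup.py -- minimal py2exe config; run with:  python setup.py py2exe
from distutils.core import setup
import py2exe  # importing registers the 'py2exe' command with distutils

setup(
    windows=['copy_max_files.py'],  # GUI exe; use console=[...] for CLI tools
    options={'py2exe': {
        'bundle_files': 1,   # pull dependencies into the exe where possible
        'compressed': True,  # compress the bundled library archive
    }},
    zipfile=None,  # fold library.zip into the exe instead of shipping it loose
)
```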

There’s not necessarily one right answer here, but there are some important parameters to think about:

  1. Do you care about having a working python interpreter on each target machine? If you do, you’ll probably have to make sure everybody gets a standard install and write some tools to make sure there’s a common setup (file locations, versions, etc). If not, you can go for an exe based distribution where each tool is shipped as an exe with an internal python interpreter and libraries.

A side aspect of this question is whether you care about isolation – i.e., if you distribute one tool on Monday and a different one on Tuesday with a few changes in your core library, do you want them to be reading off the same python files, or not? Isolation has pros and cons – the brutally short version is that it’s wasteful of disk space, but it tends to ensure that old stuff keeps working as it did before (which you may or may not like). Exe-based distribution leans toward more isolation, although you can work around that if you have to.

<DISCLAIMER> I generally have a strong prejudice against maintaining a full python install unless the target users are also programmers. Python expects a certain level of manual installation maintenance by somebody with a programmer-type personality. It’s easy for people to get into trouble by trying to follow instructions off the net and suddenly having a different set of libraries than everybody else in the building. It’s also possible for people to install tools off the internet that install their own pythons, and that can really screw you up if you get a different version of a particular library in ‘your’ install. I prefer closed distributions like exes. The only exception is for Maya users, who have an isolated install already, which I don’t mind leveraging since I know it’s there and usually pristine.

For this reason I’m not going in to the “standard” python way to distribute loose files, which is the combination of distutils and easy_install. If I was doing IT for a lab full of NASA guys, maybe. Otherwise, blecch.
</DISCLAIMER>
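Going back to the “common setup” point in #1: the kind of sanity-check tool I mean can be as dumb as this (a sketch; the expected version is made up):

```
# check_install.py -- trivial sketch of an install checker
import sys

EXPECTED = (2, 7)  # whatever version the studio standardizes on

def check_python():
    """Fail loudly if this machine's python isn't the studio standard."""
    found = tuple(sys.version_info[:2])
    if found != EXPECTED:
        raise EnvironmentError(
            'Expected Python %d.%d, found %d.%d; re-run the studio installer'
            % (EXPECTED + found))

if __name__ == '__main__':
    check_python()
```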

  2. What’s your UI language? That will dictate the details of any .exe based distribution. wx or PyQt applications have their own procedures, but basically you’ll be compiling an exe with included dlls for the GUI and your python code as a resource. You’ll need to create an installer to handle all the dependencies, which is a sad fact of life in windows land; both wx and Qt have options for doing this.

The killer here is DLL dependencies – you’ll need to make sure your users get the right version of stuff you don’t even touch, like the Visual C++ redistributable that wx depends on. Make sure to do your testing on a clean machine; the DLL thing is madness inducing because, as a developer, you frequently have things installed that your users don’t. Try using Dependency Walker when things get ugly.

A somewhat less evil alternative, if you don’t need lots of specialty cPython modules, is IronPython. You can build a C# gui app with an embedded IronPython interpreter and have windows native UI plus the .Net libraries to play with. The downside is that IPy is a ‘reverse engineered’ python – it’s not binary compatible, so some modules with C cores don’t work (lxml and P4Python come to mind, although they’ve finally started supporting ZIP files). The plus side of going with IPy is that you could deploy the apps with ClickOnce, which will handle automatically updating your users to the latest version. If you go this route you have a dependency on the .Net framework, which is a pain, but it’s a pretty simple install from Microsoft, it works on all windows 7 machines, and you can have your ClickOnce installer include it for you.

An outside option, if you want to go for web based distribution, would be Jython. If you control the server, you could run some of the UI server side; otherwise you could try using the outdated Java Web Start path to give your users simple one-click download and run of tools. You’d be shipping python code zipped up into JAR files if you go this route, and you would have a dependency on Java (which most people will have). You’d also get the Java UI libraries (Swing or AWT) for free.

  3. In all cases, I’m a big fan of keeping my own code in .zip files of pycs. One of the biggest sources of bugs in an open installation is conflict between files; say, for example, you refactor a module into a package but don’t delete the old pyc file from your original module file. You’ll continue to use your old code and won’t even know it until you hit a bug. Zipped pycs make sure that the state of the code you’re sending out is predictable and there won’t be any phantom pycs lying around. Plus they are less likely to suffer from users adding their own stuff into your sandbox.
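The build step for that can be as small as this (a sketch; ‘studio_tools’ is a placeholder package name):

```
# make_dist.py -- compile a package and ship only zipped pycs (a sketch)
import compileall
import os
import zipfile

SRC = 'studio_tools'       # the package of .py files you maintain
OUT = 'studio_tools.zip'   # the only thing the artists ever receive

# legacy=True (Python 3) writes each .pyc next to its .py, which is the
# layout zipimport expects; drop the flag on Python 2, where it's the default.
compileall.compile_dir(SRC, force=True, quiet=1, legacy=True)

with zipfile.ZipFile(OUT, 'w', zipfile.ZIP_DEFLATED) as zf:
    for dirpath, _dirs, files in os.walk(SRC):
        for name in files:
            if name.endswith('.pyc'):
                zf.write(os.path.join(dirpath, name))

# Artists put studio_tools.zip on sys.path and import as usual.
```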

  4. If you go with source control as your distribution system, you probably want to look into having a two-stage setup. You don’t want to confuse the source control you use for your own work with the one the users will be getting things from – if every checkin you make goes right to the users, you are asking for trouble. You can’t control when they do their gets, so they could easily get half of a multi-part checkin and go blooey. At the very least, you should have a procedure for branching or move-renaming finished scripts over from your work location to the distribution location.

Personally, I prefer to publish stuff to a distribution system and use source control for source control. A real build setup with automated tests can save you a world of headaches. I remember one company where the entire art staff was idled for three hours because somebody checked in a MaxScript in Unicode instead of ASCII – everybody got it via sync, everyone’s max blew up, and it took the auto-sync system down with it, so all of the changes had to be synced manually for 100+ artists.


Thanks everyone so far for your input!

So, it sounds like Python might not be the best choice for distributing tools to the team… is there something either of you might suggest as an alternative? I don’t know a whole lot about DOS and such, but I know those are good for windows-native batch processes; I just don’t know much about interfacing with them (i.e. opening a file dialog, or otherwise keeping the artists away from a command prompt). Given the simple things I need to do at the moment, I’m wondering if maybe I should go another route?

Oh, it stinks, but most alternatives are worse. BAT files are stone age tech, very hard to maintain or debug, and there’s no good GUI option. There are more sophisticated stepchildren like Windows PowerShell, but they’re still pretty clunky.

C# is a good fast development language – it’s wordier and less flexible than python but less icky than C++ and has lots of ‘batteries included’, especially for Windows only use – good libraries for XML, web, etc. Java is sort of the same (though it’s even more verbose). Both of them have built in GUI options. So you could do a lot with either of those, as long as you’re up for doing a full programming language.

However, the unbeatable thing about python is that you get a lot of code reuse between tools that run inside max/maya and those outside; it’s worth suffering some of the annoyances to avoid rewriting every stupid little utility twice. Do make sure to keep your DCC-specific stuff separate from general purpose stuff so you don’t accidentally try to call a Max function inside a file formatting tool or something.
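Just to make that split concrete, a hypothetical layout (not a prescription):

```
# A hypothetical package layout that keeps the boundary obvious:
#
#   studio_tools/
#       core/        # pure python: file ops, naming rules, config -- safe anywhere
#       maya/        # anything that imports maya.cmds -- loaded only inside Maya
#       max/         # Max-side bridges -- loaded only inside Max
#       standalone/  # the external batch tools, which import only core
```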

Thanks Theodox. When I first envisioned a solution for this, C# was the alternative, but in the short time that I have to learn it, it seems as though Python is slightly easier for me. The team is using Unity3d, which supports C# and UnityScript (a slight bastardization of JavaScript), so I had considered killing two birds with C#, but they are not yet expecting me to work with Unity in that capacity. I also have my own web design business, where I have put a lot of time into JavaScript and PHP, and since Unity does allow for UnityScript, I opted to just learn Python for tools (plus, Python is very common amongst all the ‘serious’ technical artists out there, so it seemed a safe use of educational time in case I need it at the next gig) :wink:

I had one more question regarding your option #2: putting the modules and such on a shared server. Did you mean to put the modules on a shared server and somehow point the users’ ‘environment’ to also include that shared spot? Or do you just mean put it on a shared server so people can grab the latest version and bring it locally?

Thanks again for your time on this!

I think that was Maxx’s #2.

I’ve never tried running multiple users out of a single share, and I don’t know if it’s a great idea unless you set it up carefully. If you’re distributing PY files, users will need write access to the share to run out of it (python will try to write pyc files next to the source), which will be slow and also allow them to accidentally muck with the shared tools. If you put only PYCs, it might work, but you’d probably have random slowdowns depending on your users’ access patterns.

You could, however, easily include a self-syncing stub file which always checks a read-only share for the latest files (again, I much prefer zips to loose files). That’s what I do now, although I’m going to move that to an HTTP file server instead so I can support in-house and out-of-house with the same infrastructure.
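The stub itself can be tiny. Something like this (a sketch; the share path, local path, and ‘studio_tools’ package name are all hypothetical):

```
# stub.py -- self-syncing launcher sketch
import os
import shutil
import sys

SHARE = r'\\server\tools\studio_tools.zip'  # read-only network share
LOCAL = os.path.join(os.environ['APPDATA'], 'studio_tools.zip')

# Pull a fresh copy only when the share has something newer (copy2 preserves
# the timestamp, so the comparison stays honest).
if (not os.path.exists(LOCAL)
        or os.path.getmtime(SHARE) > os.path.getmtime(LOCAL)):
    shutil.copy2(SHARE, LOCAL)

# Run off the local copy so a network hiccup can't kill a working session.
sys.path.insert(0, LOCAL)
import studio_tools
studio_tools.main()
```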

Hmmm, the point about a “next gig” really makes me lean toward python. But in your current situation, C# may be best. It’s pretty easy to pick up, and UI is a dream to work with. Either way, you’ll have similar distribution problems, so go with your gut.

As for the network share: we had all our modules on a network share, and when max started up, it would add the share to the environment path. This was cool, as I could make changes, dump them on the share, and everyone was up to date. However, with great power comes great responsibility :slight_smile: One bad copy and the whole team is effed. Also, as Theodox pointed out, if the network gets slow or goes down, everyone is screwed. Honestly, I wouldn’t do it again, but that’s just me.

We had the idea to just install git on everyone’s machine and set up a repo that we could push to. That way you could even have certain artists test changes and, when everyone was satisfied, push to the rest of the team.

Theodox had some good points: never release .py and .pyc together; choose one or the other, or bad things WILL happen. Oh, and to get a standard python install, you can do it with a batch file and the python .msi installer. Artists won’t even know you’re doing it, and they don’t have to make decisions about where things get installed.
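The batch file is basically one line (a sketch; the msi path and target dir are whatever you standardize on):

```
@echo off
rem install_python.bat -- silent, identical install on every machine
rem /qn = no UI; TARGETDIR pins the install path; ALLUSERS=1 = per-machine
msiexec /i \\server\installers\python-2.7.3.msi /qn TARGETDIR=C:\Python27 ALLUSERS=1
```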

If you already know JS, and your users already have it, you might consider doing some editor scripting in Unity. No real advice to offer here, but it’s possible from the little I know.

As an outside option there’s also Boo, which looks like python but is a compilable .Net language like C#. You could write code in Boo to use in exes, and I think it can also run inside the Unity Editor. <DISCLAIMER> Saw that in the docs. No first-hand knowledge! </DISCLAIMER>

I’ll second Theo’s suggestion of Unity editor scripting, if you’ve got the access (and it doesn’t necessarily have to be a completely external, standalone tool). It can occasionally be a bit fiddly, and documentation isn’t stellar (it’s generally good, but a bit inconsistent), but it’s quite powerful, whether that’s writing tools, or asset post processors to handle a lot of import automation (setting up collision hulls, materials, etc on import, for example). You also have local file system access (via System.IO), and more or less the full .NET suite (I think Unity’s using 2.0?).

Good stuff everyone!

I definitely like the idea of having a second repo and pushing to that, as well as the batch install of python as another attempt to keep everyone in sync.

I also agree with the Unity stuff, although I have been warned that not all artists (maybe even none?) will be handling Unity (they said mostly designers and the engineers will be using it; no idea who would be adding/tweaking things once it goes in, though), and because of the per-seat cost, I figure I will wait until I see a definitive need for in-Unity tools.

Just had a classic example of distribution fail today that illustrates why this has to be done carefully.

The engineers took down the build server, so my usual method for distribution (the server compiles, tests, and packages my stuff into a zip file, then serves it up to users via their stub files) was offline. One of my artists needed a rush bug fix, so I made the changes, checked in the code, and then activated the ‘debug mode’ I use for testing – an environment variable switch that makes the stub file load from the python files in perforce instead of the zips.

But, lo and behold – he’d done this once before, so his local copy of the python directory was full of pycs. One of those pycs was write-locked, and it also happened to be a module that had since been refactored into a package. So he couldn’t start his maya, because python always tried to run the out-of-date pyc instead of the up-to-date code in the package’s __init__.py.

Luckily this conversation was on my mind so I immediately nuked his directory and re-synced and it worked. But, it’s a perfect example of why you gotta eat your distribution vegetables.

As a brief aside, I’ve found .pyc files to be more trouble than they’re worth. Long ago I told Python not to generate them, and I have been grateful for that on several occasions. Their load-speed benefit is just insignificant with most tools. If anyone feels compelled to keep .pyc generation on, I’d at least recommend you keep them out of source control and let the user machines create them as needed. Far fewer files in your depot, and fewer potential headaches.

Here’s how to suppress bytecode generation.
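For anyone who doesn’t want to chase the link, the standard knobs are:

```
# Three equivalent ways to stop CPython writing .pyc files:
#   1. environment variable:   PYTHONDONTWRITEBYTECODE=1
#   2. interpreter flag:       python -B your_tool.py
#   3. at runtime, before any imports you care about:
import sys
sys.dont_write_bytecode = True
```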

That said, I like running compileall on everything and shipping only pycs, since that gives at least a minimal guarantee that every module is syntactically correct and all the imports are valid.