Best Way to Share Your Scripts?

Being extremely new to scripting, I’m wondering what the best practices are for sharing your scripts in Maya. At work I’ve been saving the script, then the others open the script in their editor and save it to the shelf. Is there a better way? I know you can have them put it in their scripts folder, then just import scriptName and save that to their shelf. But I’m curious if there’s a more efficient way, so I don’t have to train them all to add shelf buttons and whatnot. It’s not a huge deal, but any help or insight on how you go about sharing your stuff at work would be greatly appreciated.

Cheers

Check everything into revision (source) control and have people sync tools/scripts before they launch Maya. Write a simple batch/Python script that sets up Maya’s scripting environment variables, paths, etc., then have it launch Maya.
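For example, a minimal launcher sketch in Python might look like this (the install path, tools folder, and variable values are just placeholders for your own setup):

```python
# launch_maya.py - minimal sketch of a wrapper that points Maya at a synced
# tools folder before starting it. All paths below are placeholders.
import os
import subprocess

TOOLS_ROOT = r"C:\p4\tools\maya"   # wherever the synced scripts/plugins live
MAYA_EXE = r"C:\Program Files\Autodesk\Maya2015\bin\maya.exe"

env = os.environ.copy()
# Prepend the shared folders so they win over anything user-local.
for var, sub in [("MAYA_SCRIPT_PATH", "scripts"),
                 ("PYTHONPATH", "python"),
                 ("MAYA_PLUG_IN_PATH", "plug-ins")]:
    env[var] = os.path.join(TOOLS_ROOT, sub) + os.pathsep + env.get(var, "")

subprocess.Popen([MAYA_EXE], env=env)
```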

I’d have a different take on that.

Source control is a tool for developers, not a distribution mechanism for users. If every checkin changes what the users get, you run a lot of risks: if somebody syncs up and starts Maya while you’re checking in files one at a time, they can easily get irreproducible bugs. If this happens to users often, they’ll start figuring out how not to get the latest tools - which makes support a nightmare. As a dev you get leery of checking in work because every checkin might disrupt your users, so you end up doing big, infrequent checkins… which makes it hard to see and understand your own changes later.

Plus you get tempted to fix bugs on the fly: “Oh, let me just check in a fix for that - sync up in 5 minutes” - which sounds great until it creates ripple bugs someplace else, in code that might have relied on the original bug. So for all those reasons it’s really a good idea to have a barrier between your working environment and the regular user tools. You don’t want to force users to be guinea pigs when all they want to do is get work done.

At a minimum, you can maintain two different copies of the scripts under source control: the one the TAs work on, and a safe, tested, and fairly stable one the artists use. They don’t even need to know the system is there - they just sync a folder that’s well tested and only changes when you’re ready to roll out. You can maintain the stable copy using p4 integrate (or just file copies): when you’ve tested the work branch thoroughly, copy it over to the stable branch in one go so everybody on the user side gets a complete package. If you go this route you can even do it on a schedule: say, every Tuesday morning you update the tools, and everybody knows to be on the watch for new bugs. This is a minimal amount of work for you, and they get to work efficiently most of the time.
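If you want to script that promotion step, a rough sketch with p4 (the depot paths here are invented; substitute your own branch layout) could be as simple as:

```python
# promote_tools.py - rough sketch of promoting the dev branch to the stable
# branch with p4 integrate. Depot paths are placeholders for your own layout.
import subprocess

DEV = "//depot/tools/dev/..."
STABLE = "//depot/tools/stable/..."

subprocess.check_call(["p4", "integrate", DEV, STABLE])
subprocess.check_call(["p4", "resolve", "-am"])   # auto-accept the merges
subprocess.check_call(["p4", "submit", "-d", "Weekly tools rollout: dev -> stable"])
```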

This thread goes into a lot more detail on the topic.

Theodox is spot on.

At LH we have a deployment script which runs at Maya startup and syncs to specific revisions of scripts. As a tool developer, I/we can then specify release versions as well as test versions of a tool. This way a user never has to manually sync to our tool directory, and we know that they’re all using the specific versions of tools we expect, while we still have the flexibility to target a few people with our tool alterations before rolling the changes out across the entire team.
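Stripped way down, the idea is something like this (the manifest format, paths, and file names here are just placeholders for illustration, not our actual setup):

```python
# sync_tools.py - sketch of syncing tools to pinned revisions at Maya startup.
# The manifest format and paths are made up for illustration only.
import json
import subprocess

MANIFEST = r"\\server\tools\release_manifest.json"
# e.g. {"//depot/tools/anim/...": 1042, "//depot/tools/rigging/...": 1038}

with open(MANIFEST) as f:
    pinned = json.load(f)

for depot_path, changelist in pinned.items():
    # Sync each tool area to the changelist the tool developers have blessed.
    subprocess.call(["p4", "sync", "%s@%s" % (depot_path, changelist)])
```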

I’m using a DVCS, so I never hit the problem Theodox is describing. I can do local check-ins all the time without disrupting anyone else, and my build server is set to only build on a specific tag.

Switching from SVN to hg (mercurial) is the best thing I’ve done lately.

What we do at work is have separate repositories: one set is for developers, so each developer has a local repo plus a main one for the dev team. Once we have a stable release we need to deploy, it gets pushed to the actual repository every user pulls from, so everything that goes to the final user is super controlled but really easy to use. Maya is again started with a custom .bat or Linux bash script, and here we go.

We have a menu bar that is separated into categories for each discipline.

We have, mainly because of some Max crap, a program that takes a p4 workspace and deploys any changes to a network folder on ‘build’ - I specify when a build takes place. Each set of builds is versioned and stored on a network drive. The user is notified about this, also when I specify, and can easily ‘get’ the latest version or roll back to a previous one.
When the user ‘gets’ the latest version, it is really just a mapping of the ‘build’ network folders to Max appdata folders. In Maya you don’t need to do this, as you can set up script paths, in Python or MEL, to point to external folders… as I said… Max crap.
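In Maya that redirection can be as small as a couple of lines in userSetup.py (the network path here is a placeholder):

```python
# userSetup.py - sketch of pointing Maya at external tool folders.
# The network path is a placeholder for wherever the deployed build lives.
import os
import sys

TOOLS = r"\\server\maya_tools"

# Python tools: extend sys.path so 'import toolName' just works.
sys.path.append(os.path.join(TOOLS, "python"))

# MEL scripts are easiest to pick up by having MAYA_SCRIPT_PATH include the
# folder before Maya starts (e.g. via Maya.env or a launcher script).
```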

I like this setup [except the Max crap of course ;)] as it allows me to work in p4 with my scripts, keeping nice incremental backups of all that stuff, and it also gives a level of separation from the user deployment revisions. I personally like using version control for any of my work.

I have worked in a setup where anything I checked into p4 was what the user would get immediately on p4 sync (in Maya). Essentially we set up each user’s Maya.env to point to a bunch of p4 folders that contained our scripts. Once I updated a tool/script I emailed everyone, and all they had to do was sync the ‘Tools’ folder [and restart Maya].
I didn’t hit any show-stopping issues doing it this way, though it was pretty regular that someone would forget to sync and something wouldn’t work… “Have you synced to the tools?” was a common fix in this setup.
I did have to be pretty careful about what and how I submitted stuff. I was also, pretty much, the only one submitting stuff. If you were looking at setting up a robust pipeline for a larger team, this wouldn’t be the way to go. But it was fast and simple, and as long as you’re smart and aware of the potential issues you can do it that way… it’s just not the ideal setup.

Happy hunting dude.

Is this for Maya, Max, or something else?

Got a system set up this week for deploying scripts, using the same P4 idea. Though having read some of these comments, I should probably make a dev build and a release build folder - though I’m the only one making tools, so I do pretty rigorous testing before submitting. Artists access the tools through a dynamic menu created from the directory structure. Also, I miss Python haha :frowning:
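For anyone doing the same thing in Maya, a bare-bones version of a directory-driven menu could look something like this (the tool folder path and the main() entry-point convention are just assumptions, not any particular studio’s setup):

```python
# build_tools_menu.py - sketch of building a Maya menu from a folder of
# Python tool scripts. Folder layout and entry-point convention are invented.
import os
import sys
from functools import partial

import maya.cmds as cmds
import maya.mel as mel

TOOLS_DIR = r"\\server\maya_tools\tools"   # one .py file per tool (placeholder)

def _run_tool(module_name, *args):
    """Import the tool module and call its main() entry point (assumed convention)."""
    if TOOLS_DIR not in sys.path:
        sys.path.append(TOOLS_DIR)
    module = __import__(module_name)
    module.main()

def build_menu():
    """Rebuild a 'Tools' menu on Maya's main window from the folder contents."""
    main_window = mel.eval('$tmp = $gMainWindow')
    if cmds.menu("artistToolsMenu", exists=True):
        cmds.deleteUI("artistToolsMenu")
    cmds.menu("artistToolsMenu", label="Tools", parent=main_window, tearOff=True)
    for fname in sorted(os.listdir(TOOLS_DIR)):
        name, ext = os.path.splitext(fname)
        if ext != ".py" or name.startswith("_"):
            continue
        cmds.menuItem(label=name, parent="artistToolsMenu",
                      command=partial(_run_tool, name))
```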

We’ve been using a very simple but effective setup here for distributing our “pure” maxscript tools. Users set up a startup folder in their Customize/System Paths/3rd Party, and it sets them up with a tools menu in their Max client. The scripts are sourced and run from a network location. I work locally, and can switch my client to work from a local version-controlled development copy with a single click. I then push my local versions to the network when I deem them bug-free. Users can see the changes I make in real time without restarting their 3ds Max clients; they just re-run the tool from their menus. If I add a new tool to the menu, it’ll just appear the next time they restart Max.

This system might benefit from another layer between dev and release for QA to poke around with, but honestly, the only people that are going to be able to put these tools through their paces beyond “this opens and doesn’t error immediately” are the artists and animators using them in production.

The exception to this is tools that require non-script components, specifically DLLs, like our exporter. They don’t share nicely from a network location, so they need to have local copies on each client. These get installed from an internal webpage. The exporter complains and refuses to export when a user’s local exporter version is out of date.
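That kind of check can be as simple as comparing a local version stamp against the one on the server - something like this sketch (file locations and format are invented placeholders, and it could live in whatever language your exporter wrapper uses):

```python
# version_check.py - sketch of the "refuse to export if out of date" idea.
# File locations and version file format are invented for illustration.
LOCAL_VERSION_FILE = r"C:\tools\exporter\version.txt"
SERVER_VERSION_FILE = r"\\server\tools\exporter\version.txt"

def exporter_is_current():
    try:
        local = open(LOCAL_VERSION_FILE).read().strip()
        server = open(SERVER_VERSION_FILE).read().strip()
    except IOError:
        return False   # can't verify, so treat the local install as out of date
    return local == server

if not exporter_is_current():
    raise RuntimeError("Exporter is out of date - grab the latest from the tools page.")
```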

BTW: for folks using net shares as a distribution system, make sure you know the user capacity of the server that’s sharing the scripts. If you run off a generic shared folder, rather than a ‘real’ server that is set up for large numbers of users, your users sometimes get bumped (I think the limit is 20 concurrent connections for Windows 7). This causes problems particularly if you use drive remapping to make the shared folder look like a local drive. Check with your IT guys before going down that route :slight_smile:

We use a server / network shared drive that everyone connects to.
Every Maya install has its own Maya.env that is located on the server and needs to be copied to the user’s Maya prefs location (this is a one-time-only copy; after this, you’re done).
This env file contains everything Maya needs to set the relevant paths (the proper userSetup.mel/py, shelves, site-packages, etc.).
We have split up the Maya folders on that server into global plugin/script folders that are version independent, and versioned folders for each Maya version that are version dependent.
Same goes for Fusion (we use a masterprefs file); all it needs is to run a bat file that sets two system environment variables, and that’s it.
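A stripped-down example of the kind of Maya.env I mean (the server paths and folder names here are placeholders, not our real layout) would be along these lines:

```
MAYA_SCRIPT_PATH = \\server\maya\2015\scripts
PYTHONPATH = \\server\maya\shared\site-packages
MAYA_PLUG_IN_PATH = \\server\maya\2015\plug-ins
MAYA_SHELF_PATH = \\server\maya\2015\shelves
XBMLANGPATH = \\server\maya\shared\icons
```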

Code is maintained via SVN, and release versions are copied to the server that everyone connects to. (The only problem you’ll run into is when Windows DLLs need to be replaced but are in use by workstations; this is manageable in a small team but might become a problem in larger Windows environments.)

As for servers, we have switched everything to CentOS. I have a minimal USB install with a custom shell script that installs and sets up the rest with little user intervention (packages, LDAP, SMB, NMB, iptables, etc.), so we can literally have a new, fully functional file server running in under 30 minutes.


That sounds like a pretty great idea, especially since it is so easy to build single-purpose distributions like that.