
Thread: distribute multi-file script to others easily

  1. #1




    Hello everyone,
    I made a couple of tools recently and found I was using the same functions over and over in each one, so I broke those pieces out into a small external library that all the tools reference. Now the tools are done and need to be distributed, but I was hoping not to have to ship the library alongside each tool. Is there some way to compile a script's dependencies along with the script itself? Or maybe there's a completely different solution? Any tips would be appreciated.

  2. #2
    string

    It might be worth looking at Maya Modules. They allow you to create encapsulated packages which make it very easy to distribute.

    The docs (from Maya 2014 onwards) are quite good on Modules now. The idea is that you have a folder which contains a scripts folder, a plug-ins folder, an icons folder, etc. You then have a single .mod file which describes the path (as a minimum).
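    For reference, a minimal .mod file looks something like this (the module name, version, and path here are made up for the example; Maya picks up the scripts, plug-ins, and icons subfolders of the given path automatically):

    ```
    + my_shared_lib 1.0 C:/tools/my_shared_lib
    ```

    Drop that .mod file into one of the directories on MAYA_MODULE_PATH and the module gets discovered at startup.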

    In terms of steering that towards your goal, you could have a distributable library module and then a set of further modules that you deploy which utilise it. There are some complexities you have to be aware of, though, such as ensuring the library stays up to date with the toolset. Backwards compatibility is critical when distributing shared libraries, so if you're not already doing so I'd strongly consider adding some TDD or BDD to your workflow to ensure you're never going to critically block anyone on an older version of the library. Provided you always retain backwards compatibility, you could also build a mechanism into the library to self-update via HTTP whenever a tool initialises it expecting a later version.
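    A simple way to catch the "tool expects a newer library" case is a version guard the tool runs at startup. This is just a sketch; the `__version__`-style string and the function names are hypothetical, not part of any existing library:

    ```python
    # Minimal version guard a tool could run when it initialises
    # the shared library. Assumes the library exposes a version
    # string like "1.2.0" (name and scheme are hypothetical).

    def parse_version(version):
        """Turn '1.2.3' into a comparable tuple (1, 2, 3)."""
        return tuple(int(part) for part in version.split("."))

    def check_library(required, installed):
        """Return True if the installed library meets the tool's minimum."""
        return parse_version(installed) >= parse_version(required)

    # A tool built against 1.2 still works with a newer 1.3 library,
    # but a tool needing 1.4 can detect the mismatch and react
    # (warn the user, or trigger a self-update).
    print(check_library("1.2.0", "1.3.1"))  # True
    print(check_library("1.4.0", "1.3.1"))  # False
    ```
    
    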

    If you want to avoid all that, you could put yourself through a form of 'build/packaging' process whereby you programmatically cycle through all the library dependencies and copy them into a subdirectory of your tool module, renaming them as you go to prevent conflicts. For example:

    Your development code base:
    shared_lib
    ....some_mod_a
    ....some_mod_b

    some_tool
    ....some_tool_file
    ....shared_lib_n
    ........some_mod_a
    ........some_mod_b

    In this scenario you have 'sucked' a copy of your library (shared_lib, renamed to shared_lib_n) inside your tool package, and you would then need to auto-refactor the imports to match. This can definitely work, and given clean code the refactoring can be done relatively reliably, but its success hangs on the consistency of your code, the strength of the packaging process (which you'd need to write), and your level of testability, so that you can ensure all key functionality remains intact in the final package before release.
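    The copy-and-rename step above can be sketched roughly as follows. This is a hypothetical, simplified packager (the folder names match the example layout; a real one would need to handle aliased imports, string references, and so on):

    ```python
    import re
    import shutil
    from pathlib import Path

    def package_tool(tool_dir, lib_dir, suffix="_n"):
        """Copy a shared library into a tool package and rewrite imports.

        Sketch only: copies `lib_dir` into `tool_dir` under a renamed
        folder (e.g. shared_lib -> shared_lib_n), then rewrites plain
        'import shared_lib' / 'from shared_lib ...' statements in every
        .py file under the tool. Real code would need more robust
        refactoring than a regex.
        """
        tool_dir, lib_dir = Path(tool_dir), Path(lib_dir)
        old_name = lib_dir.name
        new_name = old_name + suffix

        # Suck a renamed copy of the library into the tool package
        shutil.copytree(lib_dir, tool_dir / new_name)

        # Rewrite 'import shared_lib' and 'from shared_lib ...' statements
        pattern = re.compile(r"\b(from|import)\s+" + re.escape(old_name) + r"\b")
        for py_file in tool_dir.rglob("*.py"):
            text = py_file.read_text()
            py_file.write_text(
                pattern.sub(lambda m: m.group(1) + " " + new_name, text)
            )
    ```

    After running this against the example layout, `some_tool_file`'s `from shared_lib import some_mod_a` becomes `from shared_lib_n import some_mod_a`, and the tool no longer depends on the externally installed library.
    
    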

  3. #3

    Default

    Awesome, exactly what I was looking for! I decided to go the Modules route. Thanks so much for the words of caution as well; I'll keep them in mind going forward for sure.

  4. #4
    Technical Artist

    There are a couple of in-depth threads on this topic here; I'll try to dig them up.

  5. #5
    Theodox

    You can also zip all of the files and distribute the zip file. If the zip is on any Python path, that will work as well. More here: http://techartsurvival.blogspot.com/...m-egg-man.html
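    This works because Python's import system can import straight out of a zip file on `sys.path` (via the built-in zipimport support). A self-contained demo, building a throwaway zip so the example runs anywhere (the library and function names are made up):

    ```python
    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny example zip containing a one-module package
    tmp = tempfile.mkdtemp()
    zip_path = os.path.join(tmp, "shared_lib.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("shared_lib/__init__.py", "")
        zf.writestr("shared_lib/util.py", "def greet():\n    return 'hello'\n")

    # Any zip file on sys.path is importable, just like a directory
    sys.path.insert(0, zip_path)
    from shared_lib import util

    print(util.greet())  # hello
    ```

    So for distribution you could zip the library once and have each tool's installer drop that one zip somewhere on the Python path, rather than shipping loose files.
    
    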
