Allzpark | Application launcher for the 21st century

Application launcher and environment management
for 21st century games and digital post-production,
built with bleeding-rez

Windows | Linux | macOS

Hi all,

I’d like to present my latest project that I’ve put together over the past couple of months.

It’s an application launcher, for when you need control over what software and which versions of software belong to a given project. It builds on the self-hosted package manager and environment management framework bleeding-rez, providing both a visual and textual interface for launching software in a reproducible way.

Also keep an eye on its integration with Avalon, the fully featured and open source VFX and games pipeline, which builds on, amongst other things, Pyblish.

Have a look at the website for details, and feel free to share your experiences or ask questions here or on GitHub, and tell your friends! :slight_smile:


This looks pretty cool. Quick question: what would be your approach to deploying the launcher for artists?

There are a few ways, depending on your environment.

The simplest way, considering artists won’t necessarily want to author their own packages, is to put Python and Allzpark at a shared network location.

s:\python\python.exe -m pip install allzpark
s:\python\python.exe -m allzpark
# booting up..

This will enable them to take advantage of the packages authored by TDs and developers like anyone else. You could make a shortcut/script on their desktops and use pythonw (on Windows) to avoid typing and the additional console window.

For TDs, authoring packages requires access to the rez executable, for which you’ll want each machine equipped with Python and a pip install bleeding-rez, such that they can call…

cd my_package
rez build --install
# building..
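For reference, here’s a minimal sketch of what such a package.py could contain; the name, version and python subdirectory are made up for illustration, not taken from any real package:

```python
# package.py - a minimal, hypothetical package a TD might author
# and install with `rez build --install`
name = "my_package"
version = "1.0.0"

# No compile step; Rez copies the payload as-is
build_command = False

def commands():
    # Expose the package's python/ directory once resolved
    env.PYTHONPATH.append("{root}/python")
```

(`env` and `{root}` are provided by Rez when it evaluates the file, so this is a package definition rather than a standalone script.)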

Once you’re comfortable with this approach, but want to manage multiple versions of Allzpark, then I would recommend each machine (including artists’) be equipped with Python and Rez, and that Allzpark itself be a Rez package.

rez env pipz -- install allzpark
rez env python-3 allzpark -- python -m allzpark
# booting up..

Did that answer your question?

Thanks for your reply. So if I put Python and Allzpark at a shared network location, that helps the artists install Allzpark. But what about controlling that all artists have the same profile, or when a TD updates some package of a profile? I think I need to get my hands on it to understand it better.

Yes, those things are explained in the Getting Started tutorial on the site, but in a nutshell: packages are all shared, similar to how Python packages are shared. Where Python uses PYTHONPATH to look for packages, Allzpark (i.e. Rez) uses REZ_PACKAGES_PATH, which contains directories of Rez packages.

So you can store your packages anywhere, and add them to the path prior to launching Allzpark.

$ export REZ_PACKAGES_PATH=/path/to/packages
$ ls /path/to/packages
$ allzpark
# booting up..

Profiles, applications and libraries are all just packages. And like PYTHONPATH, you can add multiple directories for more control over organisation.
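As a sketch of that, here’s how you might compose REZ_PACKAGES_PATH from multiple directories before launching Allzpark; the paths are made up:

```python
import os

# Hypothetical example: two package repositories, joined with the
# platform's path separator, just like PYTHONPATH
roots = [
    "/path/to/studio/packages",   # shared, studio-wide packages
    "/path/to/project/packages",  # project-specific packages
]
os.environ["REZ_PACKAGES_PATH"] = os.pathsep.join(roots)
```

Directories earlier in the list take precedence when the same package exists in more than one of them.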

So how does it work when different profiles use different packages? Do the packages need to be pre-installed by the user?

An example of my question:
Say the Alita profile uses Maya 2018 and the kingkong profile uses Maya 2019. Does the user need to have both Maya 2018 and 2019 installed on their machine (on Windows)? Or is there some deploy mechanism whereby the user doesn’t need to install anything, and whatever comes with the profile will load the correct version of the software for them?

It depends.

In an ideal world, each Rez package is self-contained. Each contains both the metadata, like the version and requirements on other packages, and the payload, like the whole Maya install.

That’s how it works for the vast majority of packages, and is the recommended workflow for Rez in general. However, because Maya is quite large, what most people do is let Maya reside locally, as a regular install, and have the package reference that install. The advantage is that you save space, network traffic and ultimately time spent loading Maya off a network; at the expense of having to install Maya locally on each machine, and of having this package not be self-contained. Not being self-contained means you can technically resolve a profile using this Maya package, but whether it works depends on the local environment.

So ideally you would want every package self-contained, and some studios do self-contain even Maya and larger packages (depending on how much space you have, how quick your network is, and how complicated it is to deploy software like Maya locally using e.g. Ansible/Chef/Puppet). But if you can’t (like most), then you would install Maya as per usual and make a package similar to the one you’ll find in the Allzpark Demo.
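For illustration, a reference-style Maya package definition might look roughly like this sketch; the install path is an assumption (the default Windows location), and the real example lives in the Allzpark Demo:

```python
# package.py - hypothetical "reference" Maya package; it carries no
# payload and instead points at a local, pre-existing Maya install
name = "maya"
version = "2018.0.0"
build_command = False

def commands():
    # Assumes Maya was installed at the default Windows location;
    # a resolve succeeds anywhere, but launching only works on
    # machines where this path actually exists
    env.PATH.prepend(r"C:\Program Files\Autodesk\Maya2018\bin")
```

This is the trade-off described above made concrete: the package is tiny and fast to resolve, but no longer self-contained.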

The profile for each project is independent of either option, and you could change your mind at any point without affecting the profile (instead affecting network traffic etc. as mentioned). It would look something like…


# profiles/alita/package.py
name = "alita"
requires = [
    "maya-2018",
]

# profiles/kingkong/package.py
name = "kingkong"
requires = [
    "maya-2019",
]
If you do decide to self-contain Maya (and other DCCs), then you could also employ the Allzpark localisation feature (see Localisation on the landing page) to synchronise your networked Maya package with a local copy for local performance. Best of both worlds.

Thanks Marcuso, so the localisation feature will try to synchronise the networked package with a local copy? If we update the package it will sync to it, and once it’s synced it will just run locally?

Yes, but it’s nothing fancy.

Because a package is already self-contained, localisation merely copies it to a local path, one that is also on your REZ_PACKAGES_PATH.

The Python equivalent would be something like…

pip install six --target \\some\networked\path
set PYTHONPATH=c:\local\path;\\some\networked\path
python -c "import six"

At this point, Python would find and import the networked six, but if you “localised” it…

robocopy \\some\networked\path\six c:\local\path\six
python -c "import six"

Then it would find the local copy, due to it being ahead of the networked version on your PYTHONPATH. Allzpark localisation works just like this, with the addition that Rez packages are versioned. So when you release a new version of any package, every user of said package picks up the latest version (as expected), but would need to localise again if they want that version local too.
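The precedence rule itself is plain search-path behaviour, which you can demonstrate in pure Python; this sketch (module name made up) mimics a localised copy shadowing the networked one:

```python
import os
import sys
import tempfile

# Stand-ins for a networked and a local package repository
network = tempfile.mkdtemp(prefix="network_")
local = tempfile.mkdtemp(prefix="local_")

# The same module lives in both, but each reports its origin
for root, origin in ((network, "network"), (local, "local")):
    with open(os.path.join(root, "six_like.py"), "w") as f:
        f.write("ORIGIN = %r\n" % origin)

# Put the local path ahead of the networked one, like after localising
sys.path[:0] = [local, network]

import six_like
print(six_like.ORIGIN)  # the local copy wins
```

Swap the order of the two paths and the networked copy would win instead; that ordering is all localisation relies on.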