haha, yes, having this on the workstations would be awesome. From what I’ve seen it isn’t possible yet to take advantage of containers for deploying tools to artists. Currently containers run Linux inside and sit within a VM. Windows Server 2016 offers “Windows containers”, which run Windows inside and don’t need a VM either, i.e. the containers run right on top of the host system, and interaction with it should be very straightforward. But in practice, right now, you have to work with Linux containers running in a VM if you use Windows. I think Windows 10 now allows Hyper-V instead of Docker’s default VirtualBox, but that doesn’t seem like much of an advantage.
However, I would definitely watch what happens with Server 2016. Containerizing Windows applications sounds very powerful, especially since Docker images can build on one another: e.g. you have one core Python image and all your tools derive from it, yet each tool runs in its own container and will never “pollute” your core Python image. Within a container you can also control which parts of the file system are persistent and which aren’t. Especially for a studio like mine, where we have up to 20 projects at a time, the ability to have containers which don’t “pollute” the base Windows installation sounds great.
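To sketch what that layering could look like: one Dockerfile defines the shared core Python image, and each tool’s Dockerfile builds `FROM` it, so a tool can install whatever it wants without touching the base. All image and path names here are made up for illustration.

```dockerfile
# core/Dockerfile — the shared core Python image (hypothetical names)
FROM python:3.6
COPY core/ /opt/studio/core
ENV PYTHONPATH=/opt/studio/core

# tool-a/Dockerfile — one tool, layered on top of the core image;
# anything installed here lives only in this tool's image
FROM studio/core-python:latest
COPY tool_a/ /opt/studio/tool_a
CMD ["python", "/opt/studio/tool_a/main.py"]
```

The persistent vs. non-persistent split maps to Docker volumes: anything written inside the container is thrown away with it, while a mounted volume (e.g. `docker run -v /mnt/projects:/projects studio/tool-a`) survives across runs.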
But I’m not thinking that far yet. My immediate concern is really server-based software: server-based pipeline components for asset management, asset storage, databases, tracking, server-based bake/render jobs, etc. We try to distribute the servers because internet connectivity can be an issue for us, but updating even 4 studios with new builds, and testing them, can be quite time-consuming.
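For that server side, the update workflow could come down to each studio pulling a tested, tagged image from a shared registry instead of installing a build by hand. A rough sketch, where the registry URL, image name, and port are all assumptions:

```shell
# On the build machine: build, tag, and push a tested release
docker build -t registry.example.com/pipeline/asset-server:1.4.2 .
docker push registry.example.com/pipeline/asset-server:1.4.2

# At each of the 4 studios: pull that exact tag and swap the container
docker pull registry.example.com/pipeline/asset-server:1.4.2
docker stop asset-server && docker rm asset-server
docker run -d --name asset-server -p 8080:8080 \
    registry.example.com/pipeline/asset-server:1.4.2
```

Since the pull only transfers image layers that changed, a small update shouldn’t strain a weak internet link the way copying a full build would.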