A lot depends on your security environment and needs -- and on your level of trust with the outsourcers. A high-trust relationship, say between two studios in the same company located in different cities, is easy; a low-trust situation (a new outsourcer, an individual freelancer, or a studio operating in hack-friendly countries) requires a lot of planning.
Typically you want to ship two pieces: a simple shim that is installed directly and hardly ever changes, and then the main tools harness that you version-check and download as needed. It's a nice courtesy to include an uninstaller, or an off switch -- outsourcers often work on multiple projects, so it should be easy for them to switch between 'you mode' and some other mode, and to know which mode they are in.
The shim is the shortest, dumbest bit of code you can write: it checks whether the installed tools are out of date, downloads fresh ones if they are, and then runs the main toolset's startup routine.
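A minimal sketch of such a shim, in python, assuming the server publishes a plain-text version file next to a single harness archive. Every URL, path, and module name here is a hypothetical placeholder -- the point is how little logic the shim needs.

```python
import os
import sys
import urllib.request
import zipfile

# Hypothetical server layout and install location.
TOOLS_URL = "http://tools.example.com/harness"
LOCAL_DIR = os.path.expanduser("~/studio_tools")


def needs_update(local, remote):
    # Any mismatch (including no local install at all) means re-download.
    return local is None or local != remote


def remote_version():
    # Assumes the server publishes a plain-text version file.
    with urllib.request.urlopen(TOOLS_URL + "/version.txt", timeout=10) as resp:
        return resp.read().decode().strip()


def local_version():
    try:
        with open(os.path.join(LOCAL_DIR, "version.txt")) as f:
            return f.read().strip()
    except OSError:
        return None  # nothing installed yet


def update_if_stale():
    if needs_update(local_version(), remote_version()):
        os.makedirs(LOCAL_DIR, exist_ok=True)
        archive = os.path.join(LOCAL_DIR, "harness.zip")
        urllib.request.urlretrieve(TOOLS_URL + "/harness.zip", archive)
        with zipfile.ZipFile(archive) as z:
            z.extractall(LOCAL_DIR)


def main():
    update_if_stale()
    sys.path.insert(0, LOCAL_DIR)
    import harness_startup  # hypothetical entry module shipped in the archive
    harness_startup.run()


if __name__ == "__main__":
    main()
```

That's the whole job: compare versions, fetch, hand off. Anything smarter than this belongs in the harness, where it can be updated remotely.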
The main harness is the actual toolset. It should be distributed as a unit (not loose files!) to make sure that all the pieces are at the same version -- for remote distribution, problems like a locally locked file that won't go away or a file that perforce won't clobber are nightmares to find and fix. A single archive is much easier to validate. Any dependencies (like external dlls) should be included here as resources -- the harness should extract them and place them where they need to go on startup. It's a very good idea to have some kind of automatic testing that makes sure you can run with nothing except the shim and the distribution -- it's really easy to miss a dependency and very hard to debug remotely.
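One way the harness can place its bundled dependencies on startup is to hash-compare before copying, so it doesn't fight locked files it doesn't need to touch. This is a sketch under the assumption that dependencies ship inside the extracted archive; the function names are mine, not part of any library.

```python
import hashlib
import os
import shutil


def file_digest(path):
    """SHA-256 of a file, used to detect an already-current copy."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def install_dependency(bundled, target):
    """Copy a bundled dll/resource into place unless an identical
    copy is already there. Returns True if a copy actually happened."""
    if os.path.exists(target) and file_digest(target) == file_digest(bundled):
        return False  # already current, don't touch (or fight a lock over) it
    os.makedirs(os.path.dirname(target) or ".", exist_ok=True)
    shutil.copy2(bundled, target)
    return True
```

The automatic test mentioned above is then just: on a clean machine (or clean temp directory), run the shim, let it install, and assert the toolset starts -- no pre-existing files allowed.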
Distributing the harness can be done online with HTTP or FTP, or using source control. In max you can do the downloads with dotnet; in python it's a batteries-included built-in. Network solutions are nice because they don't require much user intervention and don't give users the option to opt out. One important side effect, though: this ups the burden on you to make sure that all distributions are tested and work out of the box (autoupdate + crash-on-startup = disaster). Nowadays it's easy to get a cloud server from something like Amazon to host the tools distribution; that keeps your internal dev environment isolated (no need to poke a hole in your firewall for tools requests) and it also guarantees uptime -- probably better uptime than doing it in house. However, you need to have IT and management that are comfortable with tools sitting on the internet; personally it's fine with me, since I know how easy it is to open up mxs or python files on somebody's local machine for prying -- but many companies will flip out at the thought of proprietary tools out on the net.
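Since autoupdate + crash-on-startup = disaster, the download itself should never be allowed to take the tools down: a failed fetch should fall back to whatever copy is already on disk. A sketch of that guard, with hypothetical URLs and paths:

```python
import logging
import urllib.error
import urllib.request

log = logging.getLogger("tools_update")


def fetch_or_fallback(url, cached_path, timeout=10):
    """Try to download the latest harness archive. On any network
    failure, log it and fall back to the cached copy on disk --
    stale tools beat no tools."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = resp.read()
        with open(cached_path, "wb") as f:
            f.write(data)
        return cached_path, "updated"
    except (urllib.error.URLError, OSError) as err:
        log.warning("update failed (%s); using cached tools", err)
        return cached_path, "cached"
```

The status flag is worth surfacing to the user somewhere quiet (a title bar, a log) so "I've been on stale tools for a week" is at least discoverable.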
Plain old source control syncing works as well, but since it is opt-in you'll have to expect some large percentage of users not to bother. Auto-sync guards against that -- but it also means that perforce information (server address, user, and password) will be in the code and accessible to prying eyes. It also means you have to put source control code into your 'shim' -- which means it has to work with no perforce dlls yet, so expect ugly command line mongering if you go that route. Naturally, it won't work unless the users have a correctly configured source control client installed. And, of course, it only works on people with access to your depot. Many companies will never let that happen.
As far as security goes: you should scrub the tools for any info you don't want out in the wild -- if your tools include, say, an email error handler that sends from a company address, you should assume that the account is publicly visible; neither maxscript nor python can really guarantee your secrets will be kept. You can minimize 'internet' worries by providing your outsourcers with a local server that does net distribution on a LAN -- then you're only responsible for remotely updating that one box and you can assume the tools are all at the same version at that location. It's almost as good as net distribution and it keeps your stuff off the internet entirely. It's still true, though, that you have to assume every line of tool code is publicly available to anybody who knows the basics of mxs or python. Sort of like online games: it's safest to assume the client is in the hands of the enemy.
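The scrubbing step is worth automating as a pre-ship check: scan every file going into the distribution for strings that should never leave the building. This is a sketch, not a security tool -- the patterns (and the example domains) are hypothetical placeholders for your own internal hostnames, accounts, and credentials.

```python
import re

# Hypothetical examples -- substitute your real internal addresses.
BANNED_PATTERNS = [
    re.compile(r"[\w.+-]+@yourstudio\.example", re.I),  # internal email accounts
    re.compile(r"p4\.internal\.example"),                # depot server address
    re.compile(r"password\s*=", re.I),                   # embedded credentials
]


def scrub_report(source_text):
    """Return every banned string found in one file's text.
    An empty list means this file is (probably) safe to ship."""
    hits = []
    for pattern in BANNED_PATTERNS:
        hits.extend(pattern.findall(source_text))
    return hits
```

Run it over the whole archive as part of the packaging step and fail the build on any hit -- much cheaper than discovering an internal server address in the wild.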
Lastly, there's always the email-a-file solution (again, this would be the 'lump' distribution of the main toolset). Your outsourcers may not have internet connections at their desks (this is common in China). It's not very nice because (a) it depends on human beings and (b) those human beings are somewhere across the world and (c) you can't get them fired if they screw it up. If you do something like this, at the very least consider including an audit tool that lets you know what versions are being used on what machines -- then you can at least send emails saying 'half of the artists are using an exporter that's six months out of date'. This isn't much help if your outsourcers have no internet access, alas, but you can include some of this logging information in your deliverables to get at least some visibility into what they're using. Good metadata in the pipeline is also a must here.
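The audit record itself is tiny -- version, machine, user, timestamp -- and the same payload works both ways: phone it home over HTTP when there's a connection, or stamp it into each deliverable's metadata when there isn't. A sketch, with field names of my own invention:

```python
import getpass
import platform
from datetime import datetime, timezone


def audit_record(tool_version):
    """Who ran which tool version on what machine, and when."""
    try:
        user = getpass.getuser()
    except Exception:
        user = "unknown"
    return {
        "tool_version": tool_version,
        "machine": platform.node(),
        "user": user,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }


def stamp_deliverable(metadata, tool_version):
    """Embed the audit record in a deliverable's metadata, so version
    info travels with the asset even with no internet at the desks."""
    metadata = dict(metadata)
    metadata["audit"] = audit_record(tool_version)
    return metadata
```

With the stamp in place, a quick scan of incoming deliverables tells you which exporter versions actually produced them -- no cooperation from the remote site required.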
Last but not least: make sure that branching tools for different shops isn't going to kill you. Ideally you would keep everybody on the same code line, but it's highly likely that something will go wrong ("there's no version of max 2012 in Hungarian!", "our exporter won't work in non-English character sets", "they're already mapping something else to the X: drive") that requires a slightly different code path between outsourcers and the main studio. It's better to do it in a real branch rather than in dozens of IF/THEN statements littered around your code.
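Short of a real branch -- which is the right answer for genuinely divergent code -- one middle ground for small, data-only differences is a single table of per-site overrides, so the special cases live in one place instead of as scattered IF/THENs. The site names and settings below are made-up examples:

```python
# Shared defaults for every site.
DEFAULTS = {"export_drive": "X:", "locale": "en-US"}

# All the per-site exceptions, in one auditable place.
SITE_OVERRIDES = {
    "budapest": {"locale": "hu-HU"},
    "shanghai": {"export_drive": "Y:"},  # X: already mapped to something else
}


def site_config(site):
    """Merge one site's overrides onto the shared defaults."""
    config = dict(DEFAULTS)
    config.update(SITE_OVERRIDES.get(site, {}))
    return config
```

This keeps one code line for everybody; the moment a site needs different *logic* rather than different *data*, that's the signal to cut the real branch.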