Folder structure and File naming conventions

Rob, I take it from that example you steer clear of the default max project structures for exports.

Just out of interest, how many people actually find the default project structures defined in max/maya/xsi useful?

We have a system that works with the project structure defined by max (with some userpath tweaks), as there are a lot of good internal properties to make scripting life easier, but it would be nice to be able to define our own structure without Max getting all whiny about default paths.

I don’t know maxscript (been messing with python/mel mostly since I’m in maya land) but I think I see what you’re talking about. So you’re embedding the export paths into the file, forcing the exports to go to their rightful place? And you puppeteer the directories from the sidelines, so you get the ability to change the entire file structure at any time without having to poke people?
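In Python terms (since that’s where my head is), I imagine something like the sketch below: a made-up central mapping that the exporters would ask for paths, not anything Rob actually described.

```python
# Made-up sketch: a central config maps logical asset types to export folders,
# so scenes only store a key and the pipeline decides where files really go.
EXPORT_ROOTS = {
    "character": "//depot/game/art/characters/export",
    "prop":      "//depot/game/art/props/export",
}

def resolve_export_path(asset_type, asset_name):
    """Look up the export folder for an asset from the central mapping."""
    return "{}/{}.mesh".format(EXPORT_ROOTS[asset_type], asset_name)

print(resolve_export_path("character", "hero_male"))
# //depot/game/art/characters/export/hero_male.mesh
```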

-sorry if I come off dumb on this, still learning.

To the previous post, I find the Maya default projects more necessary than useful, because the texture paths constantly break and need to be updated. I’ve been thinking about writing something that will collect the assets and relink them; I know After Effects has that option, weird that Maya doesn’t. I don’t remember if it’s the same issue with Max.

[QUOTE=Erilaz;710]Rob, I take it from that example you steer clear of the default max project structures for exports.

Just out of interest, how many people actually find the default project structures defined in max/maya/xsi useful?

We have a system that works with the project structure defined by max (with some userpath tweaks), as there are a lot of good internal properties to make scripting life easier, but it would be nice to be able to define our own structure without Max getting all whiny about default paths.[/QUOTE]

I don’t. Actually, I turn off the project folders option for artists because it creates so many problems with our art tools (file locations, etc.).

Light

The main issue I have with the default structure is the textures aren’t in a folder below the scenes. That makes breakage very easy, and navigation a real pain.

The only thing I use Maya’s project feature for is to set the folder to the root of the project’s content directory structure, so I don’t have to navigate through 20 folders to get my files! :slight_smile:

Unfortunately, due to the inherited nature of our game, many of the assets are in disarray or employ 2, 3, and sometimes 4 different conventions. It’s quite the quagmire :slight_smile:

I use a combination of appdata and MXS-generated custom attributes to organize the disparate data. I’ve also written a small collection of tools for displaying the paths of all textures in a Max file (since they are all located in different folders), loading them, checking them in/out of Perforce, etc. It’s an… interesting environment, and essentially I’ve tried as hard as I can to abstract away the disorganization of the data so that a usable workflow can be enjoyed. Python has also been very helpful on this front, mostly for crawling through the entire depot to find asset references in game data (text files), mass find/replaces, or just plain making sense of it all.
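For the Python depot-crawling part, this is roughly the shape of it; a minimal sketch where the folder, file extensions and reference pattern are all made up for illustration:

```python
# Rough sketch: walk a folder of game data text files and collect every
# asset reference, so you can see where things really live and what uses them.
import os
import re

ASSET_PATTERN = re.compile(r'[\w/\\.-]+\.(?:dds|tga|max)', re.IGNORECASE)

def find_asset_references(root):
    """Map each referenced asset path to the set of data files that mention it."""
    references = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(('.txt', '.xml', '.mtl')):
                continue
            full = os.path.join(dirpath, name)
            with open(full, 'r', errors='ignore') as f:
                for match in ASSET_PATTERN.findall(f.read()):
                    references.setdefault(match.lower(), set()).add(full)
    return references

refs = find_asset_references(r'C:/depot/gamedata')
for asset, files in sorted(refs.items()):
    print(asset, '<-', len(files), 'file(s)')
```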

One thing I would like to do is make a markup language for our assets.

Some files are very dirty. I would like the artist to fill out a form in our toolset as part of the sign-off process where he selects the main character mesh and all items exported… some way to allow us to automatically go through and re-export characters.

On Crysis, we were asked to flip all characters 180 degrees with a few months left in the project.

This was a nightmare because:

  • Many early assets used physique
  • Flipping a char means re-applying skin/physique to reset ‘initial skeleton positions’ saved in the modifier
  • Some artists saved a compiled asset as a name different than the mesh in the max file
  • Only ~80% of assets had LODs
  • Some had 1 LOD, some 2, some 3
  • Some used multi res at different settings, but only one modifier, which meant you had to change the mod, export, change it again, export, etc…
  • Some compiled meshes were no longer in the folder with the max file

There were many other issues. I would like to avoid this in the future; I am thinking the only answer is:

  1. trying to automate things and run checks at file export and save (see the sketch after this list)
  2. ruling with an iron fist
  3. making rules and having people sign something saying they agree.
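
For point 1, a minimal sketch of the kind of export-time gate I mean (Python for illustration, against a hypothetical scene API; the real version would hook Max’s save/export callbacks):

```python
# Minimal sketch of an export-time sanity gate; the scene-query helpers here
# are hypothetical, the point is just to refuse dirty exports before they land.
def validate_character(scene):
    problems = []
    if not scene.has_skin_or_physique():
        problems.append("no skin/physique modifier on the mesh")
    if scene.lod_count() == 0:
        problems.append("no LODs found")
    if scene.export_name() != scene.mesh_name():
        problems.append("export name does not match the mesh name")
    return problems

def export_character(scene):
    problems = validate_character(scene)
    if problems:
        # Block the export and tell the artist exactly what to fix.
        raise RuntimeError("Export blocked:\n  " + "\n  ".join(problems))
    scene.export()
```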

BTW, I said we were ‘asked’; my answer was ‘no’. We had upwards of 300 CHRs (compiled character meshes) in game.

We’ve had all sorts of problems with this in the past, so now we build the rules into the scene exporters.

The tools throw errors and prevent export if textures/references aren’t in the proper places. It’s temporarily annoying for the artists, but they soon get used to it.

It also provides lots of candidates for the “Clockwork Rat ‘o’ Shame”[SUP]tm[/SUP]
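
Conceptually, the location rule in the exporters boils down to something like this; a minimal sketch in Python, with the texture root and the way you collect the paths as stand-ins:

```python
# Sketch of the location rule: every texture must live under the project's
# texture root, otherwise the export stops and lists the offenders.
import os

PROJECT_TEXTURE_ROOT = r"X:\game\art\textures"   # placeholder root

def textures_outside_project(texture_paths):
    """Return the texture paths that are not under the allowed root."""
    root = os.path.normcase(os.path.abspath(PROJECT_TEXTURE_ROOT))
    bad = []
    for path in texture_paths:
        full = os.path.normcase(os.path.abspath(path))
        if not full.startswith(root + os.sep):
            bad.append(path)
    return bad

def pre_export_check(texture_paths):
    bad = textures_outside_project(texture_paths)
    if bad:
        raise RuntimeError("These textures are outside the project:\n" +
                           "\n".join(bad))
```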

Our tools are set up around the naming conventions… an artist may ignore the naming conventions, but soon finds his or her project not being able to utilize the tools…

One project doing it manually is all they ever need to experience and they learn… ha ha…
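
For what it’s worth, the naming check itself can be tiny; a minimal sketch, and the convention shown here is invented purely for illustration:

```python
# Sketch: the naming convention lives in one regular expression and the tools
# just ask it. The convention shown here is invented purely for illustration.
import re

TEXTURE_RULE = re.compile(r'^(CHR|PRP|ENV)_[a-z0-9_]+_(diff|spec|norm)\.tga$')

def name_is_valid(filename):
    return bool(TEXTURE_RULE.match(filename))

assert name_is_valid("CHR_hero_male_diff.tga")
assert not name_is_valid("HeroMaleFinal.tga")
```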

Occasionally we’ll review the structure, about once a year or so, with the TD’s and animators to trim or add what’s needed.

Erilaz: “I take it from that example you steer clear of the default max project structures for exports.”
My thoughts… hell yeah… turn the whole system off, which of course is not in the UI anywhere I know of, but can be disabled in the ini…

Chris.Evans:
My only thought would be step 3 before step 2: remember, we’re all “employees” and we’re there to do the man’s bidding… if that bidding is naming your files correctly, then they had better do it or else… but having group consent is important, and the structure design will be stronger with input from your top producers… and having tools that make naming it “right” easier than naming it wrong is a HUGE help…

Drea:
“Clockwork Rat ‘o’ Shame?” I can only imagine what THAT is…



Though I’m ashamed to admit it, this is actually something I quite perversely enjoy laying out at the start of a project (I even have an Excel sheet that dynamically creates file naming conventions from data stored in a separate sheet, so I can easily adjust the entire project list in the early stages if need be). Of the two folder structures described in the initial post, having begun in the industry on a team that used the bottom-up method, I would never wish it upon anyone, as it results in many folders containing the same layout of subfolders. So it’s top-down all the way for me, because it keeps not only all project assets grouped, but all similar file types (and therefore the different disciplines within a project) working within their own folders. Meaning that even if others don’t run a tight ship, the animation folder never becomes a mess.

Within the top-down structure, it makes sense to make each delineation based on groupings of diminishing importance, with the project at the very top and the actual action group in question at the very bottom, for example:

//game_x/animation/ingame/male/combat/walk/… created from
//project/discipline/animtype/skeleton/set/subset/…

For file naming conventions, I’ve grown the habit of the filename reflecting at least part of the folder structure for two reasons:

  1. It allows files to be easily located by those outside the animation team when tracking down bugs etc.
  2. Actions will be grouped alphabetically even in a completely flat structure, as often happens once they are re-grouped inside the game engine.

An example of this would be:

//game_x/animation/ingame/male/combat/walk/M_CBT_WalkForward.anim

Clearly, if the entire folder structure were added to the filename it would grow very long indeed, so I use capitalised abbreviations, as there should be a manageable amount for the team to learn since they only describe the skeleton and action set. But I really HATE abbreviations on actual action descriptors (e.g. WalkForward becoming WkFwd); these should be clear for everyone to understand at a glance, as there will potentially be thousands per project.
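
As a rough sketch of how the filename can be generated straight from the folder structure (the abbreviation tables and function here are just illustrative):

```python
# Sketch: build the filename from the skeleton/set part of the folder path
# via a small abbreviation table, while the action descriptor stays verbose.
SKELETON_ABBREV = {"male": "M", "female": "F"}
SET_ABBREV = {"combat": "CBT", "locomotion": "LOC"}

def build_anim_name(skeleton, action_set, action):
    """e.g. ('male', 'combat', 'WalkForward') -> 'M_CBT_WalkForward.anim'"""
    return "{}_{}_{}.anim".format(SKELETON_ABBREV[skeleton],
                                  SET_ABBREV[action_set],
                                  action)

print(build_anim_name("male", "combat", "WalkForward"))
# M_CBT_WalkForward.anim
```

Keeping the abbreviations in one table also means the tools and the spreadsheet can share the same source of truth.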

Lastly, regarding separating raw/working assets such as Maya files from exported files used in the engine, I’ve worked on projects that keep them in separate folders (both with identical top-down structures) and projects that keep everything together, and both work equally well as long as the latter sticks to the same rigid naming conventions for the multiple file types that exist in the same folder.

I’d like to talk about asset reuse as well.

We consistently reuse elements for different projects, but we’ve recently been having the debate over cross-linking versus duplication of assets in the project folder structure.

[ul]
[li]If you cross link a file between projects you know where it originally came from and it saves space, but you have the added danger of losing links when you archive or change something.
[/li][li]If you fully copy an asset into the new project you have an independent copy, but it can take up space and issues can arise from duplicate filenames or changing assets later on.
[/li][/ul]

The space issue is almost a moot point these days as large drives are cheap and easy to expand, but the other points are valid.

Say for example I have a 250 frame sequence of a logo:
//project01/artwork/animation/renders/logo_render/logo_render_(1-255).tga

is it better to copy that whole sequence into another project for reuse:
//project02/artwork/animation/renders/logo_render/logo_render_(1-255).tga

or link back to the logo from project01 from project02?

Generally I would prefer to copy the whole thing to avoid complication, but it can get messy if you start to see the same files across multiple projects.

Good topic to bring up! In my opinion, if it is a separate project it should have a separate asset library.

In the past I have decided to just link to the previous project’s location, but then down the road I find that I need to change something about it and make it unique to the new project, and by that point it is very difficult. I think it’s better now to just plan for that and duplicate the asset at the beginning.

It creates a problem, though, if both projects are still in production and I need changes made to the asset in one location to be reflected in the other location for the other project. You can get around this by setting up some batch files in each location and manually running one when needed to sync the files to the other location. If manual syncing isn’t reliable enough, because of the number of people working on the project or something, there is a great little app called Robocopy that you can configure to monitor the locations and mirror them, either whenever changes have been made or on a time interval (only if changes have been made). Mirroring is dangerous, though, of course, so it needs to be done with extreme caution.
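
For reference, the mirroring call itself is tiny. Here is a minimal sketch of wrapping Robocopy (the paths are placeholders); note that /MIR deletes anything at the destination that is not in the source, which is exactly the dangerous part:

```python
# Sketch: mirror one project's shared render folder into another with Robocopy.
# /MIR deletes anything at the destination that is not in the source: be careful.
import subprocess

SOURCE = r"\\server\projects\project01\artwork\animation\renders\logo_render"
DEST   = r"\\server\projects\project02\artwork\animation\renders\logo_render"

subprocess.call([
    "robocopy", SOURCE, DEST,
    "/MIR",    # mirror: copy new/changed files, delete orphans at the destination
    "/R:2",    # retry a failed copy twice
    "/W:5",    # wait 5 seconds between retries
])
```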

Syncing I guess is one way to do it, but as you suggest, there is the issue of accidentally syncing something you didn’t want updated!

The other option we’ve been discussing is to set up files like an OOP structure, where you reuse and expand assets from a specific common resource folder.

For example:
//common/artwork/animation/renders/logo_render/logo_render(1-255).tga
//Project_001/artwork/etc…
//Project_002/artwork/etc…

Again, the danger of cross-linking still applies, but it’s much more centralised, and only one file path needs to be changed if assets are relocated.

I’d opt for a copy, but have a system in place that can tell that the files are branched off another project, and which one. If you have that, it would be simple enough to keep the copies up to date if needed. It could be as simple as a project code in the filename, if current naming conventions allow. When an asset needs to change separately from the original project, the code is dropped and it is no longer considered an instance of the original file.

[QUOTE=Mathieson;1220]
It creates a problem, though, if both projects are still in production and I need changes made to the asset in one location to be reflected in the other location for the other project. You can get around this by setting up some batch files in each location and manually running one when needed to sync the files to the other location. If manual syncing isn’t reliable enough, because of the number of people working on the project or something, there is a great little app called Robocopy that you can configure to monitor the locations and mirror them, either whenever changes have been made or on a time interval (only if changes have been made). Mirroring is dangerous, though, of course, so it needs to be done with extreme caution.[/QUOTE]

Would branching via your source control potentially solve this issue? I guess it depends on the source control system in use, but for something like Perforce it “should” be relatively easy. Create a branch of the original assets for the new project and then integrate the changes as they occur in the old… Maybe trigger the integrate through a simple monitoring app, or just go the easy route and write a batch file that runs on Windows Scheduler as often as would be prudent?
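
As a rough sketch of what that scheduled integrate might look like (depot paths are placeholders, and you would want a human looking at anything that doesn’t resolve cleanly rather than blindly submitting):

```python
# Sketch: propagate changes from the original project's assets into the branched
# copy; meant to run from Windows Task Scheduler. Depot paths are placeholders.
import subprocess

SRC = "//depot/project01/artwork/logo_render/..."
DST = "//depot/project02/artwork/logo_render/..."

# Pull the latest changes from the old location into the branch.
subprocess.call(["p4", "integrate", SRC, DST])

# Accept merges that resolve cleanly; anything else is left for a human.
subprocess.call(["p4", "resolve", "-am"])

# Submit whatever was integrated (a real script would check the output first).
subprocess.call(["p4", "submit", "-d", "Auto-integrate shared assets"])
```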

On all of my projects, we duplicated assets like crazy early in the project in order to get “playable but inaccurate” assets into the game for first-playable. This often caused headaches later, for the reasons others have mentioned… So for the latest project, which conveniently started with very few re-used assets, I decided to go with relative paths for everything: export locations, custom material & shader data, etc. A little easy MaxScript string parsing prunes the paths to specific ‘anchor’ directories in the hierarchy that are always consistent. This allowed me to move and copy stuff willy-nilly wherever it needed to go without stuff becoming a tangled mess.
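
In Python terms (the MaxScript version is the same idea), the pruning is roughly this; a minimal sketch with a made-up anchor list:

```python
# Sketch: prune an absolute path down to a path relative to a known 'anchor'
# folder, so the stored data survives being moved or copied between roots.
ANCHORS = ("art", "export", "textures")   # folder names assumed to always exist

def prune_to_anchor(path, anchors=ANCHORS):
    """'D:/proj/art/chars/hero/hero.max' -> 'art/chars/hero/hero.max'"""
    parts = path.replace("\\", "/").split("/")
    for i, part in enumerate(parts):
        if part.lower() in anchors:
            return "/".join(parts[i:])
    return path   # no anchor found, leave the path untouched

print(prune_to_anchor(r"D:\projects\game\art\chars\hero\hero.max"))
# art/chars/hero/hero.max
```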

It seems there is a mixture of different types of project being talked about here. Some people are involved in film and others in games. I think there is a big difference myself.

Tied into this, I’d love to know people’s opinions of absolute and relative paths in relation to their folder structures. Maya’s relative paths always seemed a little buggy to me, so I like the idea of absolute paths and setting up Perforce clientspecs to sync to consistent folders for everyone on the project, even though it has its own downsides. There’s nothing more annoying than going to someone else’s computer to help with a problem and realizing that they sync to a different root, so textures don’t show up or something like that.

You could have everyone use a mapped drive created with the subst command and then use absolute paths. This way work can be kept on different drives on different machines but still all work correctly.
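
For anyone who hasn’t used it, the mapping is a single command. A minimal sketch of setting it up (drive letter and folder are just examples):

```python
# Sketch: map the local work folder to a fixed drive letter so absolute
# X:\... paths resolve the same on every machine. Letter and folder are examples.
import subprocess

DRIVE = "X:"
WORK_ROOT = r"D:\work\game_x"   # wherever this particular machine keeps the project

# Equivalent to running:  subst X: D:\work\game_x
subprocess.call(["subst", DRIVE, WORK_ROOT])
```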

As always, it’s all about making sure everyone is doing the same thing.

[QUOTE=mikiex;1408]You could have everyone use a mapped drive created with the subst command and then use absolute paths. This way work can be kept on different drives on different machines but still all work correctly.

As always, it’s all about making sure everyone is doing the same thing.[/QUOTE]

That’s pretty much what we do.

Every machine has a physical drive called y:. We have code and resources workspaces on perforce that sync to the root of y:. Then the tools create two mapped drives (r: and s:) that point to the two folders in y:.

Branches make their own folders in y: and the tools redirect the drive mapping when you switch branch, so whichever branch you’re in, the paths are the same. One problem is that drag and drop from P4V opens from y:, so we have a callback in Max that detects the y: path and reopens the file from r:.
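
The callback itself is basically just a path swap. A rough sketch of the idea, with the ‘resources’ folder name as a guess at the layout and the actual Max callback wiring omitted:

```python
# Sketch of the remap the file-open callback performs: if a file arrives via
# the raw y: sync path, reopen it through the branch-mapped r: drive instead.
import os

def remap_to_branch_drive(path, raw_root=r"y:\resources", mapped_drive="r:"):
    """Swap the raw sync location for the branch-mapped drive."""
    if os.path.normcase(path).startswith(os.path.normcase(raw_root)):
        return mapped_drive + path[len(raw_root):]
    return path

print(remap_to_branch_drive(r"y:\resources\chars\hero.max"))
# r:\chars\hero.max
```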

Looking back at what I wrote, it seems quite complex, but it actually works seamlessly in practice and we don’t have the problems with inconsistent paths between machines that we used to get.

#39 Drea:
Damn, using subst for branches seems like a simply brilliant idea. We’ve only ever been able to branch code, not “source data”, since for code it’s relatively trivial to change a few clientspec lines. But getting packages like 3dsmax or Photoshop to figure out branches was an adventure we never mastered.

But just syncing the same root (we have all our Perforce depots mapped to a local W:\ drive) and then substing from there just seems brilliant. I’ll need to try the idea and steal it immediately if it works over here… :wink:

Forcing every client to use the same root folder should just be mandatory. There will always be some software X that refuses to play nice with relative paths, so forcing everyone to use the same drive on Windows just works… Hard drives are pretty cheap, so just get IT to buy a separate drive for everyone. That way, even if your depot grows to gigantic sizes, you can always just swap in a bigger drive and get back to work in no time… Well, I think most computers get outdated before running out of hard drive space.

SamiV.