Integrity checking of geometry

I’m in the middle of overhauling our Maya exporter to speed it up, and I thought I’d ask you guys what you do about checking your data. Our exporter does a huge number of integrity checks on geometry to prevent bad geometry from ever getting out of Maya.

We check things like:

  • meshes without shaders
  • corrupt shaders
  • zero size meshes
  • zero size faces
  • non-uniform or zero scaling on transforms/joints
  • meshes fully skinned to 1 bone
  • weighting < 0.1 to a bone
  • verts influenced by > 4 bones

and many more things specific to our pipeline. This is all done as a pre-export pass, and if the geometry fails in any way, it doesn’t get exported. This prevents a huge number of problems in game and helps keep the art fast and efficient. (A rough sketch of what a couple of these checks look like is below.)
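For illustration only, a few of those checks might look something like this in maya.cmds Python. The function name, thresholds and exact criteria here are invented for the example, not the actual exporter code:

```
import maya.cmds as cmds

def find_problem_nodes(eps=1e-4):
    problems = []
    for shape in cmds.ls(type='mesh', noIntermediate=True, long=True) or []:
        # meshes without shaders: nothing connected to a shadingEngine
        if not cmds.listConnections(shape, type='shadingEngine'):
            problems.append((shape, 'no shader assigned'))

        # zero-size meshes: degenerate world-space bounding box
        bb = cmds.exactWorldBoundingBox(shape)
        if all(abs(bb[i + 3] - bb[i]) < eps for i in range(3)):
            problems.append((shape, 'zero size mesh'))

        # zero or non-uniform scaling on the transform above the shape
        # (strict equality is fine for a sketch; real code would use a tolerance)
        parent = cmds.listRelatives(shape, parent=True, fullPath=True)[0]
        sx, sy, sz = cmds.getAttr(parent + '.scale')[0]
        if min(abs(sx), abs(sy), abs(sz)) < eps or not (sx == sy == sz):
            problems.append((parent, 'zero or non-uniform scale'))
    return problems

# the pre-export pass would run this first and refuse to export if anything fails
for node, reason in find_problem_nodes():
    print('FAILED: %s (%s)' % (node, reason))
```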

Do you guys do similar things?

For those of you using Collada, do you modify the Collada exporter to check data, or do your pipeline tools do the error checking for you?

We’ve traditionally done many of those same checks at export time: lots of outright error checking, as well as general sanity checks for things that may not be errors as such, but could create more subtle problems game-side.

I’m not a big fan of putting error checking like this in compiled plugins since they’re harder to iterate, especially for typical tech artists. The list of checks is also a pretty fluid thing over the course of a project, so I prefer having them live in a script rather than a C/C++ plugin.

We use the ColladaMax compiled plugin on our current project, but it’s managed (wrapped, essentially) by a Python-based exporter script. While we’ve made a couple bug fixes and minor feature additions to the ColladaMax plugin, the UI, error checking and general heavy lifting happen in Python. I really like the arrangement.
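The overall shape is roughly the sketch below: the Python layer owns a list of check callables and only hands off to the compiled exporter once everything passes. Every name here is invented for illustration; the real ColladaMax call and reporting UI are obviously more involved.

```
CHECKS = []   # callables taking the selection, returning (node, message) issues

def register_check(fn):
    CHECKS.append(fn)
    return fn

def report_to_artist(issues):
    # placeholder for the real reporting UI; just print for the sketch
    for node, message in issues:
        print('%s: %s' % (node, message))

def run_compiled_exporter(selection, output_path):
    # placeholder for the call into the compiled plugin
    print('exporting %d nodes to %s' % (len(selection), output_path))

def export(selection, output_path):
    issues = []
    for check in CHECKS:
        issues.extend(check(selection))
    if issues:
        report_to_artist(issues)
        return False
    run_compiled_exporter(selection, output_path)
    return True
```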

I’ll concur with Adam on the checks needing to be fluid throughout the course of a project…

Case in point:

In addition to many of the items already on your list, for a short time we had a rash of orphaned verts that artists were trying to export: verts that were part of a mesh but not part of any edge, face, etc. These verts usually arose while artists were cutting apart building and vehicle models for our destruction system, turning a few exported meshes into dozens per export. Max makes it somewhat non-trivial to spot those orphan verts, especially in a large cloud of points, so I just started checking post-export…
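The post-export check itself is trivial once the data is out of Max; something like the sketch below, assuming the exported data exposes a vertex count and per-face vertex indices (the function and field names are made up for the example):

```
def find_orphan_verts(vertex_count, faces):
    """faces: iterable of per-face vertex index tuples."""
    referenced = set()
    for face in faces:
        referenced.update(face)
    # any vertex index never referenced by a face is an orphan
    return [i for i in range(vertex_count) if i not in referenced]

# e.g. a mesh with 5 verts where only 0-3 are used by the single quad:
print(find_orphan_verts(5, [(0, 1, 2, 3)]))   # -> [4]
```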

edit:
One more thing we checked progressively more aggressively is texture sizes. While probably unique to our process at Ensemble, we rarely put hard caps on texture sizes and the number of sub-materials (within reason!) in order to allow some creative exploration by the artists. Unfortunately, Microsoft wanted the game playable without a hard drive. Because we were an RTS, the nature of the game allowed the user to jump to any part of the map at any given time; that meant we had to load every single object, texture and shader into memory at the start of each game. That put some pretty severe memory restrictions on a lot of things, and texture sizes were both fortunately and unfortunately the most ‘malleable’ throughout the dev cycle.

So at the outset of the project, restrictions were light and creative freedom reigned. As the project neared completion, a lot fewer benevolent processes remained. :wink:

Just because error checking is done in a plugin doesn’t mean it’s not fluid. Rolling out a new .mll is no different to a new script. Artists simply get latest.

Over the course of a project we still add/alter any integrity checks we do. I mainly choose to use a C++ plugin due to the greater speed of execution. MEL is just too slow, and the plugin has been a major part of the pipeline since well before Python was introduced to Maya.

I’m the only TA here so I don’t have to worry about the C++ skills of other TAs. However, as we grow I can see more TAs being needed, and the use of C++ may become more of a problem.

As for textures, we generally develop for DS, PS2, Wii, and 360 all on the one project, often sharing geometry across several consoles. For this reason the resolution of the textures directly referenced in Maya isn’t a concern. We have platform-specific texture folders that the game loads from. Currently the game build process does the sanity check on those textures. We should check the textures in those folders directly from Maya so the artists are aware of the current texture memory usage per platform… Good idea! :slight_smile:
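As a first pass, that per-platform report could be as simple as the sketch below, using on-disk file size as a crude stand-in for in-memory footprint. The folder layout and paths are assumptions for the example; real numbers would need the platform texture formats taken into account.

```
import os

PLATFORM_DIRS = {
    'DS':  '//textures/ds',
    'PS2': '//textures/ps2',
    'Wii': '//textures/wii',
    '360': '//textures/x360',
}

def texture_usage_report(platform_dirs=PLATFORM_DIRS):
    for platform, root in sorted(platform_dirs.items()):
        total = 0
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                total += os.path.getsize(os.path.join(dirpath, name))
        # rough per-platform total the artists could see from inside Maya
        print('%s: %.1f MB in %s' % (platform, total / (1024.0 * 1024.0), root))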

I think the idea of forced sanity checks on each export can become tiresome for the artist. I certainly agree such checks need to take place, but they could be run on a per-situation basis… Lard, that could be why I’ve used MEL in such situations and never found it to be slow or a hindrance.

If an artist or programmer can see there are numerous art-based errors in a level then it calls for a sanity check, but when I’m just moving objects around and re-exporting there’s no real need for such a check to be forced: re-checking texture paths, quads/n-gons etc.

We do a similar (but smaller) set of checks, but ours aren’t done at export time. The exporter spits out pretty much everything to an intermediate file (including textures as they appear in Max), which is then parsed by one or several of a set of data pullers. It’s those that do any checking and clean-up. With that setup, data that’s not valid for one type of object isn’t flagged for another type where it’s fine. Our last game used instanced geometry for things like bushes and rocks. We made extensive use of non-uniform scaling to vary those, so non-uniform scaling is fine for that puller on that data type. It isn’t valid for bones however, so we might check for that in the skinned character puller (I don’t think we do, but we could).
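One way to picture the “checks live with the puller, not the exporter” arrangement is the sketch below, where each puller declares its own rules so something like non-uniform scale is only flagged where it matters. All the names and the node representation are illustrative, not the actual pullers.

```
def no_nonuniform_scale(node):
    sx, sy, sz = node['scale']
    return sx == sy == sz

PULLER_RULES = {
    'instanced_geometry': [],                     # non-uniform scale is fine here
    'skinned_character':  [no_nonuniform_scale],  # but not on bones
}

def check_node(puller_name, node):
    # names of the rules this node fails for this puller
    return [rule.__name__ for rule in PULLER_RULES[puller_name]
            if not rule(node)]

# a scaled bush instance passes, the same transform on a bone is flagged
bush = {'scale': (1.0, 2.0, 1.0)}
print(check_node('instanced_geometry', bush))   # -> []
print(check_node('skinned_character', bush))    # -> ['no_nonuniform_scale']
```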

As far as possible we either fix up or handle unexpected data too, and allow the artist to use it if we can. Rather than throwing an error when the user hasn’t bothered to assign a texture yet, it just puts a tiny white one on there. If the user has assigned a checker texture to check tiling, it uses that rather than expecting to find a bitmap in the platform-specific folder. If the user hasn’t made the platform-specific version of a texture yet, it pulls it from the original Max-exported one. If the user has left a live stack that has a patch modifier on a spline, then it exports the geometry as it appears in Max, and so on. Anything that’s undesirable but not actually illegal doesn’t stop the export or conversion, but it gets logged to the console and to a file for later checking. This means you can be pretty sloppy in the early iterations of work when you’re throwing stuff around trying out ideas, which speeds up early development of scenes, and they’re very easy to check on target at a very early stage. It also means you can export a load of things that you know aren’t going to be optimal without the exporter throwing an error every time.
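The texture side of that boils down to a fallback chain plus a warning log, roughly like the sketch below (paths and helper names are invented for illustration):

```
import logging
import os

log = logging.getLogger('exporter')

def resolve_texture(name, platform_dir, intermediate_dir,
                    placeholder='white_1x1.tga'):
    # 1. the platform-specific texture, if the artist has made it
    platform_path = os.path.join(platform_dir, name)
    if os.path.exists(platform_path):
        return platform_path
    # 2. fall back to the original exported-from-Max version, with a warning
    intermediate_path = os.path.join(intermediate_dir, name)
    if os.path.exists(intermediate_path):
        log.warning('%s: no platform texture, using intermediate version', name)
        return intermediate_path
    # 3. last resort: a tiny placeholder so the export still goes through
    log.warning('%s: no texture found, using placeholder', name)
    return placeholder
```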

I like this approach more than say the Nintendo exporter that throws a fit if you’ve got objects hidden in the scene that you didn’t want exporting and it can’t find the textures on them.

But it is easier to abuse. Although it logs errors and warnings, it doesn’t force you to do anything about them or read the log. More than once I’ve ignored the warnings that were telling me a platform specific texture wasn’t in the expected folder so it was going to use the high res, high colour version from the intermediate file, then wondered why the PS2 version slowed to a crawl.

We also need to include more error trapping. There are a few things that currently break because they aren’t trapped properly. A closed spline, for instance, can be interpreted as a mesh and therefore isn’t exported as a spline, so a tool expecting to find specific splines will crash. We haven’t found a nice solution to that one, as we often use lofted splines as geometry (for scenery in racing games, for instance) and I like to keep them live to allow easy editing. Currently we can set the visibility flag to invisible on a closed spline and it will get exported as a spline, and visible if we want geometry. But it defaults to the geometry state, so the above error happens quite often. This setup hasn’t been used for the most recent game and may not be used again, so we haven’t bothered to fix it.

Our situation is not unlike Robin’s. We throw warnings to the exporter log, and have very few actual errors that halt the export. Because many of our artists use a button that both exports and previews on target, we would rather they get to see their item on target while they fix the technical mistakes in the export, so that they can address multiple issues at once and generally get less frustrated.

Our checks are also defined as a list of callbacks into C++. This was critical for us because traditionally our scenes were probably an order of magnitude larger than most scenes should ever be. Speed was too critical. We’re talking 15-45 minute load times and, if we hadn’t been in C++, export times to match. The scenes are a lot more sane these days, and we could probably move to a script-based export check. My personal favorite would be to make the export test harness agnostic to such things, so we can fluidly write checks, push them down to C++ if we need the speed, and still keep the ability to turn certain checks on and off for the project. That, in my opinion, would be a very beautiful solution.

For speed, we pass a reference to the list of valid objects to export around. Each check is free to pare down the list to reduce work for the next check. This worked very well for us, as we placed the aggressive checks at the beginning (isExportableType?) and the very fine ones that take a bit more time (hasZeroAreaFace?) at the end, where there are fewer nodes left to operate on. What might be fun, though, is writing a parallel exporter: do all your checks, then collide the lists of valid exportables at the end. Funny thing is, for sane scenes, that might not actually give you anything back. That, I think, is an experiment for another time though.
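In sketch form the paring-down idea looks something like this: checks run coarse to fine, and each one returns the (possibly smaller) list the next check operates on. The example checks and node representation are stand-ins, not the actual C++.

```
def is_exportable_type(nodes):
    # aggressive, cheap check first: throw out anything we never export
    return [n for n in nodes if n.get('type') in ('mesh', 'bone')]

def has_no_zero_area_faces(nodes):
    # finer, more expensive check runs last, on whatever is left
    return [n for n in nodes if all(a > 1e-6 for a in n.get('face_areas', []))]

CHECKS = [is_exportable_type, has_no_zero_area_faces]   # coarse -> fine

def run_checks(nodes):
    for check in CHECKS:
        nodes = check(nodes)   # later checks see fewer nodes
    return nodes               # whatever survives gets exported
```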

Cheers! -Lith

I set up a system like that when I was back at Electronic Arts – the exporter would run a SmokeTest framework as a pre-export step, and each project could drop modular little MEL snippets for the smoke tests into their tools folder. Different teams had different specific problems.

I had a few standard buttons on each smoke test – like “Find”, which would select anything that failed that particular test, and “Try to Fix” for things that were fixable by script. When an artist selected a failed test, a little description would show too, explaining what went wrong, why it usually happened, and what the artist might do to fix it.
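In structure, each test boiled down to a description plus a “find” and an optional “fix”. The sketch below shows the idea in Python rather than the MEL snippets we actually used, and the class, the helper names and the example test are all invented for illustration:

```
import maya.cmds as cmds

class SmokeTest(object):
    def __init__(self, name, description, find, fix=None):
        self.name, self.description = name, description
        self.find, self.fix = find, fix   # fix backs the "Try to Fix" button

def find_empty_groups():
    # transforms with no children at all
    return [t for t in cmds.ls(type='transform', long=True)
            if not cmds.listRelatives(t, children=True)]

TESTS = [
    SmokeTest('Empty groups',
              'Leftover empty transforms clutter the export.',
              find=find_empty_groups,
              fix=lambda nodes: cmds.delete(nodes)),
]

def run_smoke_tests():
    for test in TESTS:
        failed = test.find()
        if failed:
            print('%s: %d offenders (%s)'
                  % (test.name, len(failed), test.description))
            cmds.select(failed)   # the "Find" button behaviour
```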

We had little tests for all sorts of stuff – I think one of the most used was actually cleaning up after the old MJPolyTools edge and loop splitting scripts that everybody used back then. The garbage would totally litter a scene, but not using the script would slow the artists down, and cleaning it up pre-export was pretty easy.

Here’s a screenshot of its early stages. Pretty simple, but the amount of time it saved was awesome.
http://www.alexhogan.com/folio/scripts/smokeTestBig.gif

Thanks guys, it’s great to get an insight into how others work. It sounds like our system is quite similar to yours, Alex. I do have errors and warnings in the sanity checks and let the export complete with a lot of the less serious problems.

Adding a ‘Fix’ button on the problem report is something I’ve been meaning to implement for quite a while. That’ll make a lot of guys happy :slight_smile:
