[Animation] Looking at different ways to reduce keys



We’re looking at ways to reduce the number of keys. Currently I’m using a process that compares each key’s value against the last key’s, and if the difference is less than a threshold (0.05, which can be raised to remove more keys) the key is removed. We’re also only able to work with linear keys, no Bézier curves. It’s currently written with the C++ Maya API, so I’ve converted it to Python for you:

    # Simplified excerpt -- 'curves' is the anim curve being processed,
    # 'indexValues' holds its key indices, and 'lastValue' starts at the
    # first key's value (the surrounding setup isn't shown here).
    for value in indexValues:
        if value != 0:
            keyValueChange = cmds.keyframe(curves, index=(value, value), query=True, valueChange=True)[0]
            self.keyInfo['total'] += 1
            valueChangedFromLastKey = abs(keyValueChange - lastValue)
            if valueChangedFromLastKey <= 0.05:
                self.keyInfo['deleted'] += 1
            lastValue = keyValueChange  # the next key is compared against this one

This cleans out roughly half of our animation size. It was a quick fix, though, and I’m looking for a more automated approach, since we still need to manually confirm the result looks good.

I’ve stumbled across a few papers, but most of them cover curve calculations and how to set up Bézier curves. Does anyone have suggestions on how I could make a better algorithm to remove unnecessary keys?


I think I’d try something more like the approach in this paper. It basically works like this:

  1. Start with keys at the beginning and end of the animation. You’ve got a line.
  2. Walk along the curve and check each existing key for its distance to the line. Insert one point into your line at the location of the largest distance to the line.
  3. Now repeat the process for the two sections: find the distance for the keys in the two segments to their respective lines. Find the biggest outlier and give it a key.
  4. Repeat the process, iteratively adding points one at a time at the key farthest from the existing approximated line.
  5. Keep track of the total error after each step. When it falls below an acceptable tolerance, stop adding points.

This should be a much more adaptive method than using just a fixed delta.


Here’s a quickie implementation of the adaptive resample in Python:

    def resample_keys(kv, thresh):
        """Recursively split the key dict until each span's straight-line
        approximation stays within `thresh` total error."""
        start = float(min(kv.keys()))
        end = float(max(kv.keys()))
        startv = float(kv[start])
        endv = float(kv[end])
        total_error = 0
        offender = -1
        outlier = -1
        for k, v in kv.items():
            # linear interpolation between the span's endpoints
            offset = (k - start) / (end - start)
            sample = (offset * endv) + ((1 - offset) * startv)
            delta = abs(v - sample)
            total_error += delta
            if delta > outlier:
                outlier = delta
                offender = k
        if total_error < thresh or len(kv) == 2:
            return [{start: startv, end: endv}]
        # split at the worst key and recurse into both halves
        s1 = {kk: vv for kk, vv in kv.items() if kk <= offender}
        s2 = {kk: vv for kk, vv in kv.items() if kk >= offender}
        return resample_keys(s1, thresh) + resample_keys(s2, thresh)

    def rejoin_keys(kvs):
        result = {}
        for item in kvs:
            result.update(item)
        return result

    def decimate(keys, tolerance):
        return rejoin_keys(resample_keys(keys, tolerance))

    # feed in a dictionary of key:value pairs and a maximum error per key span...
    decimate({0: 0, 1: 10, 2: 1, 3: 1, 4: 1, 5: 1, 6: 2, 7: 2, 8: 1, 9: 3, 10: 4}, 3)
    # Result: {0.0: 0.0, 1.0: 10.0, 2.0: 1.0, 8.0: 1.0, 10.0: 4.0}


I actually came here to post almost this exact question, but figured I’d piggy-back off this thread for now. :slight_smile: We’re looking at solving a similar problem, but with a slightly different use case. We have some baked keyframe data we want to reduce, but rather than generically reducing the keys based on a tolerance, we have specific keys we want to retain throughout the timeline (the resulting “retained” keyframes would have their tangents adjusted so that they match the baked curve).

In our initial tests, we’re trying to use curve-fitting algorithms that take in points along the curve and return control points for each Bézier segment. The data coming back is close, but the level of error increases as the original tangents become more extreme. Here are some of the resources I’ve used so far:

I also came across an old post by Morgan Loomis using the rebuildCurve command to optimize a curve built from baked keyframe data, then pass that back into the animation curve. I modified it to optimize the curve down to only 4 points, which I then use as the points for each Bézier segment (start point, end point, and two positions used for the Bézier handle positions). The results with this are actually pretty close to the original, but again with increasing error as the tangents become more extreme.
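To make the curve-fitting side concrete, here’s a minimal sketch of the kind of fit we’re testing: a single cubic Bézier segment with its endpoints pinned to the first and last baked samples, chord-length parameterization, and a least-squares solve for the two inner control points. The names are illustrative, not our production code, and there’s no handling of degenerate spans:

    def chord_length_params(points):
        """Map each (x, y) sample to a u in [0, 1] by cumulative chord length."""
        dists = [0.0]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            dists.append(dists[-1] + ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
        total = dists[-1]
        return [d / total for d in dists]

    def fit_cubic_bezier(points):
        """Return control points (P0, P1, P2, P3) minimizing squared error
        to `points`, with P0 and P3 pinned to the end samples."""
        p0, p3 = points[0], points[-1]
        us = chord_length_params(points)
        # Bernstein weights for the two free control points
        b1 = [3 * u * (1 - u) ** 2 for u in us]
        b2 = [3 * u ** 2 * (1 - u) for u in us]
        # Normal equations for the 2x2 least-squares system
        c11 = sum(w * w for w in b1)
        c12 = sum(w1 * w2 for w1, w2 in zip(b1, b2))
        c22 = sum(w * w for w in b2)
        det = c11 * c22 - c12 * c12
        p1, p2 = [], []
        for axis in (0, 1):
            # residual after subtracting the pinned-endpoint terms
            d = [p[axis] - (1 - u) ** 3 * p0[axis] - u ** 3 * p3[axis]
                 for p, u in zip(points, us)]
            x1 = sum(w * r for w, r in zip(b1, d))
            x2 = sum(w * r for w, r in zip(b2, d))
            p1.append((c22 * x1 - c12 * x2) / det)
            p2.append((c11 * x2 - c12 * x1) / det)
        return p0, tuple(p1), tuple(p2), p3

For baked animation curves you’d run something like this once per span between two retained keys, then turn P1 and P2 into tangent handles; it has the same weakness we’ve been seeing, in that one segment per span can’t absorb very extreme tangents.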


I’m curious to see if anyone else has tackled a similar problem using any curve fitting algorithms.