Blendshape delta scale offsets?

I posted on CGTalk, but I thought I’d cover my bases here.

So I have two heads with the same topology, modeled into different likenesses, with matching UVs. One has blendShapes, and I want to transfer the deltas over to the new head. I’ve got it mostly working: I use Transfer Attributes in UV space from the head with blendShapes to the head without, then apply a blendShape from a duplicate of head #2 to put it back into its original state.

Where it breaks down is, for example, the eye blinks. On the first head, the lids close and hit their marks. On head #2, with the transferred attributes and corrective blendShape, the eye blink doesn’t hit the correct shape. I know it has something to do with the delta offsets in relation to the two meshes, but I can’t seem to wrap my head around it, so to speak. I feel I need to calculate some sort of scale factor from corresponding points in order for them to hit the same shapes.

Any suggestions?
-Sean

Since blendShape deltas are stored as a point (x, y, z, 1.0) relative to each vert, subtle sculpted changes in proportion will be much harder to calculate than a global scale. You would have to measure landmarks that match the general direction of the shape and scale up the blendShape deltas, or, easier yet, scale up the target weight. In your blink scenario, you can measure top lid to bottom lid on both models and use that ratio as your scale factor. The blinks won’t be the only problem like this; they’re just the most noticeable.
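The landmark-ratio idea above can be sketched in plain Python. This is a minimal sketch, not Maya code: the vertex indices and point lists are hypothetical stand-ins for data you would read out of Maya (e.g. with `cmds.xform` or the API), and only a uniform per-target scale is applied.

```python
# Sketch: scale blendShape deltas by a landmark-distance ratio.
# Points are (x, y, z) tuples indexed by vertex number; the top/bottom
# lid indices are hypothetical landmarks you would pick on both heads.
import math

def landmark_scale_factor(src_points, dst_points, top_idx, bottom_idx):
    """Ratio of the lid gap on the destination head to the source head."""
    src_gap = math.dist(src_points[top_idx], src_points[bottom_idx])
    dst_gap = math.dist(dst_points[top_idx], dst_points[bottom_idx])
    return dst_gap / src_gap

def scale_deltas(deltas, factor):
    """Uniformly scale each per-vertex (dx, dy, dz) delta by the factor."""
    return [(dx * factor, dy * factor, dz * factor) for dx, dy, dz in deltas]
```

As noted above, instead of rewriting the deltas you could simply drive the target weight past 1.0 by the same factor (turn up the blendShape node's max value), which avoids touching the target geometry at all.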

That makes sense. Thanks for the suggestion.
-Sean

Can you maybe adjust the weights of the blend shapes in Component Editor so that the eyelid verts that aren’t hitting their mark can reach the desired position? Or am I not understanding the problem correctly?

[QUOTE=snolan;27771]I posted on CGTalk, but I thought I’d cover my bases here.

So I have two heads with the same topology, modeled into different likenesses, with matching UVs. One has blendShapes, and I want to transfer the deltas over to the new head. I’ve got it mostly working: I use Transfer Attributes in UV space from the head with blendShapes to the head without, then apply a blendShape from a duplicate of head #2 to put it back into its original state.

Where it breaks down is, for example, the eye blinks. On the first head, the lids close and hit their marks. On head #2, with the transferred attributes and corrective blendShape, the eye blink doesn’t hit the correct shape. I know it has something to do with the delta offsets in relation to the two meshes, but I can’t seem to wrap my head around it, so to speak. I feel I need to calculate some sort of scale factor from corresponding points in order for them to hit the same shapes.

Any suggestions?
-Sean[/QUOTE]

It’s not just for blinks. It would need to be done for every target. It’s just most noticeable when blinking. I haven’t quite implemented what I need but I think I have a better idea of what needs to happen to make this work.

You could do it with two blendShape nodes. The first blendShape node puts your base head into the new unique shape; the second node runs in parallel with your blink deltas. Rotate the eye socket too far off axis and it starts to break down, but translation and scale should work okay.
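The parallel-node setup amounts to composing two layers of deltas as vector sums: the likeness deltas reshape the base, and the blink deltas from head #1 are added on top, scaled by the target weight. A minimal pure-Python sketch of that composition, with hypothetical data (in Maya itself you would build this with two blendShape nodes, the second created in parallel mode):

```python
# Sketch: two "blendShape layers" composed as per-vertex vector sums.
# Layer 1 turns the base head into the new likeness; layer 2 adds the
# original blink deltas in parallel, scaled by the blink target weight.

def apply_layers(base_points, likeness_deltas, blink_deltas, blink_weight):
    """final = base + likeness_delta + blink_weight * blink_delta, per vertex."""
    out = []
    for p, ld, bd in zip(base_points, likeness_deltas, blink_deltas):
        out.append(tuple(p[i] + ld[i] + blink_weight * bd[i] for i in range(3)))
    return out
```

Because the blink deltas are just translated along with the reshaped base, this holds up when the new head differs by translation and scale, but starts to break down once features like the eye socket are rotated far off their original axis, as noted above.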

At the very least you could use it to generate new targets for the second head shape.