Rotating points in space to use as morph target

Hello,

I have one arm mesh set in a particular position, and another arm mesh that shares the same topology but has been sculpted differently and positioned/rotated elsewhere in 3D space.

What I want to do is subtract the translation/rotation of the second mesh so that it rests on top of the first mesh and the two can be used as blend shapes, so that blending just reveals the detail of the new sculpt and doesn't move the mesh in space.

I thought I could find the average vert position at either end of both pieces of geo, take the vector along each piece, then find the angle between them and rotate the points into alignment somehow. But I wanted to know if there is a way of doing this purely with maths rather than by hand in 3D: storing the newly aligned points in an array and using them as a morph target.
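I guess what I mean is a best-fit rigid transform, something like the Kabsch/Procrustes method: since the meshes share topology the vertices already correspond one to one, so you can solve for the single rotation/translation that maps the flexed scan onto the neutral one and keep the transformed points as the target. A rough numpy sketch of what I'm imagining, with made-up data standing in for the real vertex arrays:

```python
import numpy as np

def best_fit_rigid(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst points.

    src, dst: (N, 3) arrays of corresponding vertices -- same topology, so
    vertex i of one mesh matches vertex i of the other.  Uses the Kabsch /
    Procrustes method (SVD of the cross-covariance matrix).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Stand-in data: in practice these would be the vertex arrays of the two scans,
# pulled out of the meshes in the same vertex order.
rng = np.random.default_rng(0)
neutral_pts = rng.standard_normal((500, 3))
true_R, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(true_R) < 0:                  # make sure it's a pure rotation
    true_R[:, 0] *= -1
flexed_pts = neutral_pts @ true_R.T + np.array([5.0, 1.0, -2.0])

R, t = best_fit_rigid(flexed_pts, neutral_pts)
aligned_pts = flexed_pts @ R.T + t             # flexed detail resting on the neutral arm
print(np.abs(aligned_pts - neutral_pts).max()) # ~0 for this synthetic test
```

Rebuilding a mesh (or morph target) from aligned_pts would then give the detail without the movement in space.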

Does this make any sense?

Thanks,
Sam

Are they both static meshes? How did you deform the arm into the raised position? Generally you want to do that using a deformer so you can invert the raised-arm sculpt through the deformer chain (by using something like this: http://www.chadvernon.com/blog/resources/cvshapeinverter)

Thanks capper, that's exactly the sort of thing I'm after. The problem, though, is that I have two 3D scans of an arm: one mesh has the arm in an open neutral position, and the second has the arm bent into a muscle-man pose. I retopologised them so they have identical topology.

What I want to do is bend a skinned character arm up and have it automatically blend into the flexed shape. But first I need to align the upper half of the flexed arm to the upper half of the neutral arm, then do the same for the lower half. That way, once it's assigned as a blend shape, only the surface deformation lives in the target and the actual bending of the arm stays under the control of the joints.
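Assuming I can get an aligned version of the flexed sculpt, something like this is what I'm picturing in Maya, with the target added as a pre-deformation (front-of-chain) blend shape so it only carries the surface detail and the skinCluster still does the actual bend (all node names below are made up):

```python
from maya import cmds

# Add the aligned flexed sculpt as a blend shape that evaluates before skinning.
corrective = cmds.blendShape('flexedAligned', 'neutralArm',
                             frontOfChain=True, name='flexShape')[0]

# Bind the neutral mesh to the arm joints as usual.
skin = cmds.skinCluster('shoulder_jnt', 'elbow_jnt', 'wrist_jnt', 'neutralArm',
                        toSelectedBones=True, name='armSkin')[0]

# Turning the target on now reveals the sculpted detail without moving the arm.
cmds.setAttr(corrective + '.flexedAligned', 1.0)
```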

I know this will probably be tricky, but it will be worth it to get the realistic muscle detail of the body-scanned flexed muscles.

Sam

I’m assuming that if you’re creating correctives then the mesh has an operational deformation rig? I’ve handled this problem in the past by deforming the neutral mesh into the position of the scan, and then binding the scan to the rig, copying weighting, and returning to neutral. At my last job we started off doing this manually but eventually wrote an automated process that performed this and determined an optimal pose that matched the scan.
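Roughly, the bind-and-copy step looks like this in Maya (node and joint names are placeholders):

```python
from maya import cmds

# Pose the rig so the bound neutral mesh roughly matches the scan first, then
# bind the scan mesh to the same joints.
joints = ['shoulder_jnt', 'elbow_jnt', 'wrist_jnt']
scan_skin = cmds.skinCluster(joints + ['flexedScan'], toSelectedBones=True,
                             name='flexedScan_skin')[0]

# Copy the weighting across from the bound neutral mesh.
cmds.copySkinWeights(sourceSkin='neutralArm_skin', destinationSkin=scan_skin,
                     surfaceAssociation='closestPoint',
                     influenceAssociation='oneToOne', noMirror=True)

# Then return the rig to its neutral pose; the scan follows it back and can be
# extracted there as the corrective shape.
```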

Well, I'm glad you said that, so it's not an insane thing to do. I've just got these 3D scans from the scan store. They're just two meshes I have retopologised, and I was hoping I could salvage the deformation detail somehow.

I tried splitting the bent arm at the elbow and wrist and awkwardly rotating the pieces into the neutral scan pose. Then I connected it as a morph target and bound the mesh to the arm joints. I hooked the Y translate of the IK handle up as a driven key so it blends to the muscly arm as the arm bends. It looks like it has the potential to work if I spend enough time aligning the meshes. But I will probably need some intermediate morph targets, because when the arm is halfway bent the geo around the elbow bunches up weirdly (where the bicep and forearm would have been pressing together in the original bent-arm scan).
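The driven-key hookup is roughly this (node names and driver values made up), and I guess an in-between target would just be keyed to peak at the halfway driver value:

```python
from maya import cmds

driver = 'arm_ikHandle.translateY'     # what I'm driving from at the moment
blend = 'flexShape.flexedAligned'      # the aligned flexed target's weight

# Straight arm: target off.  Fully bent arm: target fully on.
cmds.setDrivenKeyframe(blend, currentDriver=driver, driverValue=0.0, value=0.0)
cmds.setDrivenKeyframe(blend, currentDriver=driver, driverValue=10.0, value=1.0)

# A half-bent in-between target could be keyed the same way, peaking mid-range
# and fading out at the ends.
half = 'flexShape.halfBent'            # hypothetical in-between target
for dv, v in [(0.0, 0.0), (5.0, 1.0), (10.0, 0.0)]:
    cmds.setDrivenKeyframe(half, currentDriver=driver, driverValue=dv, value=v)
```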

I was hoping there was a magical program to do this, but I will just try to do it manually and make it work.

Thanks a lot for the info. If you know any special secrets, let me know.

Sam