• Runtimes / Unity
  • Dynamically adjust hands position and rotation

Hi! I'd like to know the best way to adjust the hands' position and rotation based on what they are grabbing. I have two sprites representing the left hand and the right hand.

Important notes: the animations of the grabbed objects will be manually controlled by the user according to the mouse position; the animation can also be interrupted at any time and must "freeze" until resumed. I achieve this by setting the time scale to zero. So far, so good. The problem is managing multiple animations across multiple track entries.
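A minimal sketch of that freeze-and-scrub setup in spine-unity (the animation name and the mouse-to-time mapping below are hypothetical placeholders, not taken from the project above):

```csharp
using Spine.Unity;
using UnityEngine;

public class GrabScrubExample : MonoBehaviour {
    public SkeletonAnimation skeletonAnimation; // assign in the Inspector
    public string grabAnimation = "grab-lever"; // hypothetical animation name

    Spine.TrackEntry entry;

    void Start () {
        // Start the grab animation on track 0 and freeze it immediately.
        entry = skeletonAnimation.AnimationState.SetAnimation(0, grabAnimation, false);
        entry.TimeScale = 0f; // frozen until resumed or driven manually
    }

    void Update () {
        if (entry == null) return;
        // Scrub the animation time from the horizontal mouse position (0..1 across the screen).
        float t = Mathf.Clamp01(Input.mousePosition.x / Screen.width);
        entry.TrackTime = t * entry.Animation.Duration;
    }
}
```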

I have tried keying the hand bone within the object's animation, but that isn't the right way, since the hand will stay there, visible, unless another animation occurs and overrides its bone transform. Once that animation ends, the hand returns to the previous animation state.

I'm wondering if there is a smart way to use transform constraints at runtime, maybe. Suggestions?


    To dynamically adjust the hands' position and rotation based on what they are grabbing, you can use transform constraints in Spine. Create transform constraints for the hand bones, then at runtime change the target of each constraint or adjust its influence (mix) depending on the object the hand is interacting with, to achieve the desired hand positioning and rotation.

    Additionally, you can create multiple constraints for the same bone and toggle them on and off during the animation to achieve different effects. This lets you keep control over the hand positions even when animations are interrupted or paused: the constraints ensure the hands follow the objects they are supposed to grab, and you can adjust them dynamically based on user input or other runtime conditions.
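    A minimal sketch of adjusting a transform constraint's influence at runtime (the constraint name is a hypothetical placeholder; the mix property names shown are from Spine 4.x, older runtimes use RotateMix/TranslateMix instead):

    ```csharp
    using Spine;
    using Spine.Unity;
    using UnityEngine;

    public class HandConstraintMix : MonoBehaviour {
        public SkeletonAnimation skeletonAnimation;       // assign in the Inspector
        public string constraintName = "left-hand-grab";  // hypothetical constraint name

        TransformConstraint constraint;

        void Start () {
            constraint = skeletonAnimation.Skeleton.FindTransformConstraint(constraintName);
        }

        // Blend the hand toward (1) or away from (0) the constraint's target bone.
        public void SetGrabInfluence (float influence) {
            if (constraint == null) return;
            constraint.MixRotate = influence; // Spine 4.x property names
            constraint.MixX = influence;
            constraint.MixY = influence;
        }
    }
    ```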

    Gomi91 Spinebot's answer is not wrong, but I think you would be better off using IK constraints rather than transform constraints in your case. With IK constraints, you can easily make your character react to the environment, as shown on the Spine demo page below:
    https://esotericsoftware.com/spine-demos#Inverse-kinematics
    You can set IK constraints on your skeleton's hands in the Spine Editor, and then change the position of the target bone for those IK constraints at runtime as needed.
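    A minimal sketch of driving an IK target bone from a world-space object each frame, using spine-unity's UpdateLocal callback so the override is applied after the animation (bone and object names are hypothetical placeholders; skeleton flipping and scale are ignored for brevity):

    ```csharp
    using Spine;
    using Spine.Unity;
    using UnityEngine;

    public class HandIkTarget : MonoBehaviour {
        public SkeletonAnimation skeletonAnimation;       // assign in the Inspector
        public string targetBoneName = "hand-ik-target";  // hypothetical IK target bone
        public Transform grabbedObject;                   // world-space point the hand should reach

        Bone targetBone;

        void Start () {
            targetBone = skeletonAnimation.Skeleton.FindBone(targetBoneName);
            // Apply the override after the animation is applied, before world transforms are computed.
            skeletonAnimation.UpdateLocal += OverrideTarget;
        }

        void OverrideTarget (ISkeletonAnimation animated) {
            if (targetBone == null || grabbedObject == null) return;
            // World position -> skeleton space -> the target bone's parent space.
            Vector3 local = skeletonAnimation.transform.InverseTransformPoint(grabbedObject.position);
            float x = local.x, y = local.y;
            if (targetBone.Parent != null) {
                // Note: uses the parent's world transform from the previous update.
                targetBone.Parent.WorldToLocal(local.x, local.y, out x, out y);
            }
            targetBone.X = x;
            targetBone.Y = y;
        }
    }
    ```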


      Misaki Don't know if this is an important detail, but the hands are bones of their own; they aren't attached to arms or anything. There isn't a body, let's say.
      I'd also like to ask how to access the skeleton constraints at runtime, since I cannot find any reference in the documentation.


        Gomi91 IK constraints work even when there is only one bone, so they should still be useful on your rig.

        Skeleton has methods to find constraints. For example, you can find an IK constraint using findIkConstraint(): https://esotericsoftware.com/spine-api-reference#Skeleton-findIkConstraint

        Then you can change the properties of the returned IkConstraint: https://esotericsoftware.com/spine-api-reference#IkConstraint
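        In spine-unity that might look like the following sketch (constraint and bone names are hypothetical placeholders):

        ```csharp
        using Spine;
        using Spine.Unity;
        using UnityEngine;

        public class HandIkConstraintControl : MonoBehaviour {
            public SkeletonAnimation skeletonAnimation;       // assign in the Inspector
            public string ikConstraintName = "left-hand-ik";  // hypothetical constraint name

            IkConstraint ik;

            void Start () {
                ik = skeletonAnimation.Skeleton.FindIkConstraint(ikConstraintName);
            }

            // Blend the IK influence in (1) or out (0), e.g. when grabbing or releasing.
            public void SetIkMix (float mix) {
                if (ik != null) ik.Mix = mix;
            }

            // Retarget the constraint to a different bone at runtime.
            public void SetIkTarget (string boneName) {
                if (ik == null) return;
                Bone bone = skeletonAnimation.Skeleton.FindBone(boneName);
                if (bone != null) ik.Target = bone;
            }
        }
        ```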

        If you can't find an explanation in the spine-unity runtime documentation, you may find it in the general Spine Runtimes documentation.


          Misaki Thanks! I'm gonna do some tests and I'll let you know what I find.