Added a checkbox for enabling hand tracking. It is located under the "AI Settings" menu > "AI Tracking Modes" section > "Hand Tracking".
Added a "Debug Bounding Boxes" checkbox under the "Canvas Settings" menu. The axis-aligned bounding box and the bounding polygon have separate color and opacity settings. Bounding polygons should help visualize collision shapes.
Fixed a lag bug caused by toggling the camera/video on and off. The software was activating body tracking too frequently in a short span of time. Mitigated by throttling the activation from a microsecond to a millisecond scale.
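The mitigation can be sketched as a simple throttle (a minimal illustration, not the project's actual code; `activate` stands in for whatever starts body tracking):

```javascript
// Throttle rapid activation requests: calls arriving within
// `minIntervalMs` of the last accepted call are ignored.
// `now` is injectable so the logic can be tested without real timers.
function makeThrottledActivation(activate, minIntervalMs, now = () => Date.now()) {
  let lastRun = -Infinity;
  return function () {
    const t = now();
    if (t - lastRun < minIntervalMs) return false; // too soon: skip
    lastRun = t;
    activate();
    return true;
  };
}
```

With this in place, a burst of camera on/off toggles within a few milliseconds collapses into a single activation instead of repeatedly restarting the tracker.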
Fixed a "Showing Landmarks" canvas bug where the landmarks were not displayed. The canvas was being erased faster than it could draw because of the 2.0.3 lag bug. Erasing is now synchronized to happen only when the next frame is ready to draw.
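One way to express that synchronization (names here are illustrative, not from the actual code): the canvas is only cleared when a fresh set of landmarks is available, so every erase is immediately followed by a draw.

```javascript
// Gate canvas clearing on frame readiness: clear only when new
// landmarks are pending, so the canvas is never left blank.
class LandmarkRenderer {
  constructor(clearCanvas, drawLandmarks) {
    this.clearCanvas = clearCanvas;
    this.drawLandmarks = drawLandmarks;
    this.pending = null; // latest landmarks waiting to be drawn
  }
  // Called whenever the tracker produces new landmarks.
  submit(landmarks) {
    this.pending = landmarks;
  }
  // Called once per display frame (e.g. from requestAnimationFrame).
  renderFrame() {
    if (this.pending === null) return false; // nothing new: keep last frame visible
    this.clearCanvas();
    this.drawLandmarks(this.pending);
    this.pending = null;
    return true;
  }
}
```

If the tracker lags, the previously drawn landmarks simply stay on screen instead of being erased prematurely.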
Added hand gesture tracking for closed fist, point up, victory sign, "I love you" sign, thumbs up, and thumbs down. An open hand is assumed to be the default hand position.
Optimized hand-gesture handling to set the Spine animation only once per gesture change, preventing the same gesture animation from being set repeatedly.
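That optimization amounts to a change guard; a sketch (the function and gesture names are made up for illustration, and `setAnimation` stands in for the Spine runtime call):

```javascript
// Only trigger setAnimation when the recognized gesture actually
// changes; repeated frames reporting the same gesture do nothing.
function makeGestureAnimator(setAnimation) {
  let current = 'open'; // open hand is the assumed default
  return function (gesture) {
    if (gesture === current) return false; // same gesture: skip
    current = gesture;
    setAnimation(gesture);
    return true;
  };
}
```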
Added background thread support for the body tracking AI on supported modern web browsers. The background thread starts up the AI immediately without hindering the main thread, so users should experience a shorter wait, or none at all, when starting body tracking for the first time.
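A hedged sketch of how such a setup commonly looks in web apps (not the project's actual code; `tracking-worker.js` and the message shape are hypothetical): detect Web Worker support, and if available kick off the model load in a background thread.

```javascript
// Returns true when the environment exposes the Web Worker API.
// Takes the global object as a parameter so it is testable outside a browser.
function supportsWorkers(globalObj) {
  return typeof globalObj !== 'undefined' && typeof globalObj.Worker === 'function';
}

// Start loading the tracking model, in a worker when possible.
function startTracking(globalObj) {
  if (supportsWorkers(globalObj)) {
    const worker = new globalObj.Worker('tracking-worker.js');
    worker.postMessage({ type: 'load-model' }); // model loads off the main thread
    return worker;
  }
  return null; // fall back to loading on the main thread
}
```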
I am currently working on including a physics engine and experimenting with various ideas. The physics constraints have to be anchored to the Spine bones, otherwise the physics bodies won't follow the Spine skeleton. There isn't any graphical user interface yet, so that would have to be included before the release. Physics collision adds its own complexity and problems, so I would have to add it at a later date.
Misaki Yes, several physics constraints could be used together for various purposes. In the video, the ball is using a gravity force that points upward.
The web application will switch between playing the eye-blinking animation track and setting the track time based on tracking data. This should result in smoother blinking.
Added a "blinking speed" property to the model's "Single Value Properties" to control the playback speed of the blink. By default it is "2", which is about half a second of blinking animation (1 second / 2 playback speed). Note that the actual duration may be shorter because the animation track stops at the current eye-tracking state. Stopping there prevents a jump from eyes wide open to the eye-tracking state after the animation track finishes.
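The duration arithmetic above can be made explicit (a sketch assuming a 1-second blink animation, as described; the function name is illustrative):

```javascript
// Full blink duration in seconds for a given playback speed.
// With the default speed of 2 and a 1-second animation,
// a complete blink takes 0.5 seconds.
function blinkDuration(animationSeconds, blinkingSpeed) {
  return animationSeconds / blinkingSpeed;
}
```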
Hey @SilverStraw I've been following this thread for a while now. Amazing stuff, and very exciting to potentially take some market share away from Live2D (one of the most backward programs I've ever seen). Anyway, I've done some of my own tests for the vtuber prototype and I've run into some problems. I'm not sure if I'm the cause or if it's an issue on your end. I'm using the latest 2.0.8 prototype.
I took very simple square shape graphics and meshed them into the face, eyes, and mouth, and I believe I did all the animation types that are necessary. Unfortunately the end result is not the same as my Spine editor file. The mesh goes crazy and stretches and pulls in directions that are in no way related to the file. Are there mesh limitations for the prototype, or is it a bug that can be fixed?
If this can be fixed, or if I can be informed on how to do this correctly without breaking it, I'm confident I can make a very cool vtuber model with a combination of bones and mesh.
Lastly I just want to say thank you for building this prototype because the potential for this to be amazing is very high.
SilverStraw Yes! Are the physics constraints not ready for prime time yet?