Hi there!
I'm currently working on a system that adds objects (at runtime if possible) to our character, which is an animated skinned mesh. The idea is to go beyond the simple method of attaching an object to a bone: I would like to "combine" the object with the skinned mesh, so that every vertex of the object follows the closest vertex on the skin.
So far my first idea is quite simple, but not fully runtime:

- First, for every vertex of the object, find the closest vertex on the skinned mesh (in the idle/T pose).
- For each object vertex, store the index of that closest skin vertex, the initial vector between the two, and the normal of the skin vertex.
- Then at runtime, move each object vertex to its skin vertex's position plus an offset, calculated from the rotation between the saved normal and the saved vector, re-applied to the current normal of the skin vertex.
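The runtime step above can be sketched like this. This is a minimal Python sketch of the math only, not our actual Unity code (in Unity the rotation would come from `Quaternion.FromToRotation`); vectors are plain 3-tuples:

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def rotate_towards(v, frm, to):
    """Rotate v by the rotation that maps unit vector `frm` onto unit
    vector `to` (Rodrigues' formula); the pure-math equivalent of
    Quaternion.FromToRotation(frm, to) * v."""
    c = max(-1.0, min(1.0, dot(frm, to)))
    axis = cross(frm, to)
    s = math.sqrt(dot(axis, axis))
    if s < 1e-8:                      # normals already aligned (or opposite)
        return v
    axis = tuple(x / s for x in axis)
    av = cross(axis, v)
    ad = dot(axis, v)
    return tuple(v[i]*c + av[i]*s + axis[i]*ad*(1.0 - c) for i in range(3))

def runtime_vertex(skin_pos, saved_normal, current_normal, saved_offset):
    """Current skin vertex position plus the saved offset, re-oriented so
    it follows the skin normal as the mesh animates."""
    offset = rotate_towards(saved_offset, saved_normal, current_normal)
    return tuple(skin_pos[i] + offset[i] for i in range(3))
```

For example, an object vertex baked 0.1 units along the T-pose normal stays 0.1 units along the current animated normal.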
This method works well, but has two downsides. The first one, which can be reduced, is the bake time: currently I'm searching for the closest point on the skinned mesh in a very "dumb", brute-force way, but I already know some code that will help fix that (I'm looking at you, k-d trees). The other downside is that I have to call BakeMesh on the skin every Update in order to get the real skin vertex positions and normals.
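For reference, the k-d tree idea looks roughly like this. A self-contained Python sketch (vertices as 3-tuples); in the actual project this lookup would run once at bake time in C#:

```python
import random

def build_kdtree(points, depth=0):
    """Recursively split points on alternating axes; each node is a tuple
    (point, left_subtree, right_subtree)."""
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build_kdtree(points[:mid], depth + 1),
            build_kdtree(points[mid + 1:], depth + 1))

def sq_dist(a, b):
    return sum((a[i] - b[i]) ** 2 for i in range(3))

def nearest(node, target, depth=0, best=None):
    """Depth-first search that only descends into the far half-space when
    the splitting plane is closer than the current best candidate."""
    if node is None:
        return best
    point, left, right = node
    if best is None or sq_dist(target, point) < sq_dist(target, best):
        best = point
    axis = depth % 3
    diff = target[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, target, depth + 1, best)
    if diff ** 2 < sq_dist(target, best):  # plane may hide a closer point
        best = nearest(far, target, depth + 1, best)
    return best
```

This turns the per-object-vertex search from O(n) over all skin vertices into O(log n) on average.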
**So my first question is:** can you calculate the real vertex positions of a currently animated skinned mesh individually, and if so, is it faster than calling BakeMesh (say, ~80 vertices computed manually versus 4k baked with BakeMesh)?
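For context on that question: computing one skinned vertex by hand is just linear blend skinning, i.e. a weighted sum of the vertex transformed by each influencing bone's skinning matrix (which in Unity would be `bone.localToWorldMatrix * mesh.bindposes[boneIndex]`). A hedged Python sketch of the math, with matrices as row-major nested lists:

```python
def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a position (w = 1)."""
    x, y, z = v
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(3))

def skin_vertex(rest_pos, influences):
    """Linear blend skinning for one vertex. `influences` is a list of
    (weight, skinning_matrix) pairs, standing in for Unity's BoneWeight
    data plus the per-bone skinning matrices."""
    out = [0.0, 0.0, 0.0]
    for weight, m in influences:
        p = mat_vec(m, rest_pos)
        for i in range(3):
            out[i] += weight * p[i]
    return tuple(out)
```

With at most four bone influences per vertex, 80 vertices is only a few hundred matrix-vector multiplies per frame, so intuitively it could beat baking 4k vertices, but I haven't measured it against BakeMesh.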
**My second question is:** do you know if it would be possible to generate a new skinned mesh from data already existing on our character and use it to move the object? I've searched quite a lot and only found plugins doing something like this (like Mesh Baker), but never at runtime (which might be acceptable in our project).
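One way this might work (an assumption on my part, not something I've tested): transfer bone weights to the object by copying the BoneWeight of each object vertex's closest skin vertex, then append the object's vertices, triangles, and weights to a combined mesh. A sketch of just the weight-transfer step, with weights represented as lists of (bone_index, weight) pairs:

```python
def transfer_bone_weights(object_vertices, skin_vertices, skin_weights):
    """For each object vertex, copy the bone weights of its closest skin
    vertex. `skin_weights[i]` stands in for Unity's BoneWeight struct,
    here a list of (bone_index, weight) pairs for skin vertex i."""
    def sq_dist(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3))
    result = []
    for v in object_vertices:
        closest = min(range(len(skin_vertices)),
                      key=lambda i: sq_dist(v, skin_vertices[i]))
        result.append(skin_weights[closest])
    return result
```

The object would then deform through the regular GPU skinning path instead of a per-frame CPU update, which is the appeal of this approach.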
Here's my [current][1] result, using a chili bowl on our character's elbow :D
Thanks!

[1]: https://streamable.com/p1l0q