Alpha Presentation
David and I had our alpha presentation recently, and judging by our lecturer's response and critique, I believe we did an excellent job.
David's environment is ahead of schedule; aside from a few possible improvements our lecturer suggested, it is complete. My animations were in a 'key frame' state, i.e. they had no in-between frames but presented the strong key poses that followed the timing of the storyboard and cinematic. The first few seconds of the character coming to life were already fleshed out with in-betweens, and I had intended to do that for the whole cinematic in time for alpha; however, issues with the rig and texture outputs put me behind my personal schedule (issues I will detail later in this entry). That said, by the definition of the alpha stage my animations were alpha-level: nothing would now change except the completion of the animations, and they already communicated what the final cinematic would look like.
Overall Update
Currently the textures for the character and sword are finalised, with the possibility of adding an emission texture later for the eyes and perhaps some of the cracks. The animation for one of the characters (there are four statues in total) is fully blocked out with key frames in place and partially fleshed out; only in-between frames remain for that one. The other three will have similar animation, but varied in how they wake up and deliver their battle cries, so that each reads as a separate entity even though they look identical.
Leading up to the alpha presentation I had a few issues that meant we were not ready as early as we thought.
I had completed the textures for the character using polypainting in ZBrush and was ready to export them; however, without realising it, I had used the wrong base mesh, one with incorrect UVs, which meant that when I tried exporting the texture map from ZBrush parts of the UVs were missing or incorrectly placed. I had two base meshes for the character: one was the actual low poly mesh that would be used in engine with the rig, and the other was a low poly version altered to prepare it for ZBrush (triangles converted to quads and extra edge loops added to even out the poly distribution so it would subdivide without stretching). I had somehow imported the wrong one into ZBrush.
ZBrush allows you to reimport the low poly mesh you started with, so that you can update things such as UVs while keeping all of the sculpting done at the higher subdivision levels. I tried this, but because I had altered the mesh in preparation for ZBrush (extra edge loops and so on), ZBrush treated it as a completely different mesh; when it reapplied the sculpted subdivision levels it softened edges and destroyed the work I had put in. For a long time I couldn't find a solution and thought I would have to start over, meaning the days spent on the high poly sculpt and polypainting would be wasted. Thankfully I was in a second-year lecture at the time, and the lecturer (Leavon Archer) is highly knowledgeable in ZBrush and knew a solution. Since I was polypainting in ZBrush, Leavon explained that the base mesh's UVs don't matter: the high poly subtools (individual meshes) can be exported as .obj files, which carry the colour information on a per-vertex basis. xNormal, which I had been using for normal and AO map baking, can also bake diffuse texture maps from that per-vertex colour information. This solved my problem and saved me a huge amount of time redoing all that work in ZBrush.
I took this diffuse texture and applied the ambient occlusion and cavity maps to it in Photoshop. The AO adds shadow and light detail, and the cavity map makes the smaller surface details pop in the diffuse, which AO alone can struggle with. I also removed the solid black shadows visible at the bottom left of the AO bake, using a dodge and burn layer in Photoshop to bring that area up to the same light level as the rest of the texture. A rough sketch of this compositing follows the image below.
Diffuse texture derived from baking the high poly vertex colour information onto the low poly mesh, with AO and cavity maps applied
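For anyone curious what that Photoshop layer stack amounts to numerically, here is a minimal Python/Pillow sketch of the same compositing: the AO multiplied over the diffuse, and the cavity map used as a gentler darkening pass. The file names are placeholders, and the blend strengths in Photoshop were tuned by eye rather than fixed values like these.

```python
# Rough equivalent of the Photoshop layer stack: AO multiplied over the
# diffuse, cavity used to darken the crevices. File names are placeholders.
import numpy as np
from PIL import Image

def load(path):
    # Load a texture as floats in the 0-1 range.
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0

diffuse = load("character_diffuse.png")
ao      = load("character_ao.png")
cavity  = load("character_cavity.png")

# 'Multiply' blend: the AO darkens the diffuse wherever the bake found occlusion.
result = diffuse * ao

# Cavity pass: remap the cavity map to the 0.5-1 range so crevices are darkened
# by at most half, which is gentler than a straight multiply.
result *= 0.5 + 0.5 * cavity

Image.fromarray((np.clip(result, 0, 1) * 255).astype(np.uint8)).save("character_diffuse_composited.png")
```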
In the run-up to alpha I re-baked the ambient occlusion map for the character, as I realised that batch baking all the individual meshes at once in xNormal places them in the same 3D space, so occlusion was being picked up from meshes that sat close together. I therefore redid the bakes on an individual, per-mesh basis (a sketch of that loop follows the comparison images). The differences can be seen below:
AO batch bake
AO re-bake - individually baked
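The fix itself is just a matter of running one bake per mesh instead of one combined bake. The sketch below shows the shape of that loop; bake_ao is a stand-in for the real baker (I actually ran these bakes through xNormal's interface), and the subtool names are invented for illustration.

```python
# Shape of the per-mesh AO baking loop I moved to. bake_ao() is a stand-in
# for the real baker; the subtool names are placeholders.
from pathlib import Path

MESHES = ["body", "head", "arms", "sword"]  # placeholder subtool names

def bake_ao(high_poly: Path, low_poly: Path, output: Path) -> None:
    # Placeholder: bake AO from high_poly onto low_poly's UV layout and save
    # it to output. In practice this step happened inside xNormal.
    print(f"bake {high_poly} -> {low_poly} : {output}")

for name in MESHES:
    # Each mesh is baked in isolation, so neighbouring meshes can no longer
    # cast occlusion onto it the way the single combined scene did.
    bake_ao(Path(f"{name}_high.obj"), Path(f"{name}_low.obj"), Path(f"{name}_ao.png"))
```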
Sword diffuse with AO and cavity maps applied in Photoshop, at 512 x 512 texture size.
The second issue, which still persists, is that I cannot use the master controller on the rig. I would normally translate the master controller through space and then apply walking animations to the legs timed to its speed; it's a great way to block out timing and spacing before layering in the walk itself. However, because this rig was built for game use (where the character animates in place and is moved through world space by the engine), the master controller applies no translation or rotation to the skeleton root. I therefore have to leave the master controller where it is and move each controller individually, in time with the rest. It's not a big issue, but it means I cannot 'zero out' the controllers to start fresh, because that would snap them back to their default positions at the master controller. It makes this rig inefficient for cinematic animation, and in future I will build cinematic rigs whose master controllers do apply translation to the skeleton (a rough sketch of what I mean is below).
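As a note to my future self, this is roughly the setup I mean, sketched in Maya Python. The control and joint names are placeholders rather than anything on the current rig.

```python
# Minimal Maya sketch of a cinematic-style master control: translating or
# rotating it actually carries the skeleton, unlike the game rig where the
# root stays at the origin. Names ("master_CTRL", "root_JNT") are placeholders.
import maya.cmds as cmds

master = "master_CTRL"
root_joint = "root_JNT"

if not cmds.objExists(master):
    # A simple NURBS circle at the origin to act as the master control.
    master = cmds.circle(name="master_CTRL", normal=(0, 1, 0), radius=10)[0]

# Parent-constrain the skeleton root to the master so blocking out timing and
# spacing on the master moves the whole character with it.
cmds.parentConstraint(master, root_joint, maintainOffset=True)
```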
The final issue is that I had to change the type of animation I wanted to do because the rig wouldn't let me control the sword the way I intended. It doesn't allow me to leave the sword in place while continuing to animate the hand that was holding it as a separate movement. I found the technique that solves this, rigging 'space switching', but for an unknown reason I cannot get it to work with my rig. Space switching lets the sword's transform be controlled by two or more different points (e.g. the hand and the floor) through interchangeable parent constraints whose weights decide which one is currently in charge (a sketch of the general setup is below). Since I couldn't get it working and it isn't imperative for this rig, I weighed up my options and decided not to waste time chasing the cause; instead I changed the animation slightly so that the character no longer picks up the sword and starts with it already in its hand.
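For reference, this is the general shape of the space-switching setup I was trying to build, sketched in Maya Python with placeholder node names; a production version would also match the sword's transform at the switch frame to avoid popping.

```python
# General shape of a space switch: the sword follows either the hand or a
# world-space locator, driven by animated constraint weights.
# All node names are placeholders, not the names on my actual rig.
import maya.cmds as cmds

sword_ctrl = "sword_CTRL"
hand_ctrl  = "hand_R_CTRL"
world_loc  = cmds.spaceLocator(name="sword_world_SPACE")[0]

# One parent constraint with two targets; each target gets its own weight attribute.
constraint = cmds.parentConstraint(hand_ctrl, world_loc, sword_ctrl,
                                    maintainOffset=True)[0]
weights = cmds.parentConstraint(constraint, query=True, weightAliasList=True)

def switch_to(target_index, frame):
    # Key the constraint weights so only one target drives the sword from this frame.
    for i, alias in enumerate(weights):
        cmds.setKeyframe(f"{constraint}.{alias}", time=frame,
                         value=1.0 if i == target_index else 0.0)

# e.g. the sword stays in the hand until frame 120, then holds its place in
# the world (as if left on the floor) from frame 121 onwards.
switch_to(0, 120)   # hand space
switch_to(1, 121)   # world space
```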