Category Archives: AI and Animation

Behaviour tree

There are many different behaviour trees out there. Some have been shown to us by teachers, others are online write-ups by people working on major titles, but by the looks of it they all do basically the same thing. My AI will need a behaviour tree, so the first thing I’ll be doing is making a list of all the things each player will need to do. I’ll start with the bare minimum for combat (without capture the flag). The basic flow of the AI according to the brief is as follows:

  1. The AI stands still for a random count (“Idle mode”)
  2. After the count is up, the AI picks a random nav point and creates a path with A*
  3. The AI then walks the path, goes back to idle mode, and the cycle repeats
  4. If the AI sees an enemy during any of these steps, it switches to attack mode
  5. In attack mode the AI walks directly towards the enemy, shooting his gun
  6. If the AI kills the enemy, he switches back to idle mode
  7. If the AI dies, he switches to Die mode
  8. In Die mode, the death animation plays, then the AI respawns and the process repeats

Broken down into smaller actions, the tree would look something like the following:

  • Guard (selector)
    • ChooseRandWait
    • StandGuard -> complete
  • Hunt
    • CreateRandomPath
    • WalkPath (check for enemies -> Attacking) -> complete
  • Attacking
    • FacePoint
    • WalkPath
  • Die
    • Die
    • Dead

I may refine these even more but this is the basic tree I’ll be creating.

For these to work, they need to be arranged so the tree can choose between them in order of importance. The order would be:

  • Is dead? Do the die sequence (first priority)
  • Can see an enemy? Do the attack sequence
  • Has guard time left? Do idle
  • Otherwise roam (pathfind mode)

To make this tree, I will first make a main “BehaviourNode” class. This will be what all nodes derive from.

From this, I’ll make a “CompositeBehaviourNode” class. This will be used as either a sequence or a selector, and which one it is will be assigned when the object is created, in its constructor.

Finally, I’ll make an “ActionBehaviourNode” class that all my main actions will be derived from.

The reason I’ve decided on this approach, especially keeping the selector and sequence in one class, is for simplicity’s sake. It means less code and fewer files lying around, and I’m less likely to forget which is which.

The nodes will all have a method called “Execute”. This function takes in deltaTime as its measurement of time. It will also need to access the world somehow; I plan to do this by passing the Player into the constructor of each BehaviourNode. Normally the best practice for this sort of thing would be to use a “Puppet” type of object and have the player derive from that, but in the case of this simple game, where there’s only 1 type of actor, I’m going to do it this way.
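
To make that a bit more concrete, here’s a rough first sketch of how I picture those three classes fitting together. It’s only a guess at this stage, not my final code, and everything beyond the three class names above (the Player pointer, the enum, the method bodies) is a placeholder:

#include <vector>

class Player; // the actor the tree drives around

// Base class every node derives from.
class BehaviourNode
{
public:
    BehaviourNode(Player* player) : m_player(player) {}
    virtual ~BehaviourNode() {}
    // returns true when the node succeeds/completes this frame
    virtual bool Execute(float deltaTime) = 0;
protected:
    Player* m_player; // how each node reaches the world for now
};

// Either a sequence or a selector; which one is decided in the constructor.
class CompositeBehaviourNode : public BehaviourNode
{
public:
    enum Type { SEQUENCE, SELECTOR };

    CompositeBehaviourNode(Player* player, Type type)
        : BehaviourNode(player), m_type(type) {}

    void AddChild(BehaviourNode* child) { m_children.push_back(child); }

    virtual bool Execute(float deltaTime)
    {
        for (BehaviourNode* child : m_children)
        {
            bool result = child->Execute(deltaTime);
            if (m_type == SEQUENCE && !result) return false; // a sequence stops at the first failure
            if (m_type == SELECTOR && result)  return true;  // a selector stops at the first success
        }
        return m_type == SEQUENCE; // sequence: everything passed; selector: everything failed
    }

private:
    Type m_type;
    std::vector<BehaviourNode*> m_children;
};

// All of my main actions (StandGuard, WalkPath, Die, ...) will derive from this.
class ActionBehaviourNode : public BehaviourNode
{
public:
    ActionBehaviourNode(Player* player) : BehaviourNode(player) {}
};

The root of the tree would then just be a selector with the Die, Attacking, Guard and Hunt branches added in that priority order, so the first branch that succeeds each frame wins.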

I strongly believe the best way to learn how to code is to jump in and do it, and then make mistakes so you can write it better again. That’s what school is for. I’m not going to be able to make the perfect behaviour tree no matter how hard I try.

Below is an attempt at a diagram of the structure of the tree if we implemented the capture the flag mode:

[Diagram: behaviour tree structure with capture the flag included]

And here’s the class diagrams:

[Class diagrams]

I’ll also need to create some sort of Blackboard class to communicate with each leaf node, and also so the player knows when to die. I’ll probably use this for decorations too, as I don’t plan on writing any decorator classes until I see a fantastic advantage in doing so. For now I just want a working tree so I understand the basic concept.
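
Something as simple as this might do for a first pass at the blackboard. All the fields here are guesses at what I’ll end up needing:

class Player; // forward declaration, as above

// Shared state between the tree's leaf nodes and the player.
struct Blackboard
{
    bool    canSeeEnemy   = false;   // set by the line-of-sight check
    bool    isDead        = false;   // lets the tree pick the Die branch
    float   guardTimeLeft = 0.0f;    // counts down while in Idle mode
    Player* currentTarget = nullptr; // who to shoot at in attack mode
};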

So on with the code!

Changed AI Assignment

What? Yeah. I changed it a bit. After completing the network project I found new things. People change, Sarah, and so did my code. I’m leaving you.

Unfortunately, the code is mostly in int main(), and the entire project probably should have been bundled into an Application object that had a Renderer to control things like drawing Marv. Because I didn’t do this, the main.cpp file ends up having many globals.

The good news is, the globals aren’t very numerous.

The other good news is, I know what I’m doing now.

The other bad news is, main.cpp is long.

Some people like it long.

So far since this change:

We have Marv on the screen, much like the Network assignment. Except now Marv is smart (sort of). Well, not yet. I still need to do pathfinding and a behaviour tree. Do I know how to do one? No. Will I figure it out? Yes. Because they look awesome.

I also fixed a lot of rendering bugs. The AI assignment had weird things happening to the normals on the walls of the castle. Speaking of the castle walls, I changed the FBX image loading code to take an image name instead of a pointer to the scene, so I can stick all the images belonging to an FBX model into its own folder inside the images folder. This looks nicer. And back to the images: I found some wall and grass textures on the internet, so now the FBX of the game world doesn’t fail when trying to load those 2 images.

On next week’s episode: we will create a behaviour tree and an A* path finder thing.

Let’s make Marv do things

Scaling

Today I updated the scaling part of my vertex shader because it seemed wrong. And it was wrong. Before:

mat4 newmodel = Model;
newmodel[3][3] = Scale;
vWorldPos = newmodel * PosWithBone / Scale;
// output position
gl_Position = Projection * View * newmodel * PosWithBone;

After

// for scaling
PosWithBone.x *= Scale;
PosWithBone.y *= Scale;
PosWithBone.z *= Scale;
vWorldPos = Model * PosWithBone / Scale;
// output position
gl_Position = Projection * View * Model * PosWithBone;

vWorldPos is set so the pixel shader knows where each fragment is in world space, without me having to send that as a uniform.
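
For completeness, the C++ side only has to hand the scale to the shader as a uniform. A minimal sketch, assuming the uniform really is called “Scale” as in the shader above (modelScale is just a placeholder name):

// hand the per-model scale to the vertex shader ("Scale" matches the shader above)
GLint scaleUniform = glGetUniformLocation(g_ShaderID, "Scale");
glUniform1f(scaleUniform, modelScale); // modelScale: whatever scale this model should be drawn at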

Flags

I plan on adding flags for capture the flag. I spent a bit of time on TurboSquid finding some free flags, but only one was free in FBX format and it didn’t import properly, so I decided to go with the dragon FBX, despite it having an extremely high poly count. I don’t believe this will be a problem, but if frame rates slow down I’ll just use another random object like the soulspear.

Lights and scenes

The lights were originally just structs, so I moved them out into their own header and class files for better code standards.

So far here’s what we have:

[Screenshot: progress so far]

Playing with fog. Using the z buffer to darken the fragments depending on distance from the camera.

Got a nice specular effect using a normal map created in CrazyBump.

Marv ready to be animated.


Testing fog colours. Here we have red fog.

Using different textures on the level. The lava texture looks a bit too bright.

Here we see what Marv’s animation looked like before I removed the 4th bone transform from the indices. I’m going to assume that last one is for the transform of the gun Marv is supposed to be holding.

Using emissive textures to light up areas that are not lit. I couldn’t figure out the formula though, so I removed it.

Go home Marv, you are drunk.

Trying with only 1 bone index. Marv hated that.


Marv: The movie

That’s right boys and girls, today I got Marv to animate. (Well, Monday).

Horaaah.

I am very proud of this because it was really hard to figure out, and some parts I still don’t fully understand, but I’m going to explain how I did it here and maybe one day someone will read it and think I’m amazing. Which they are doing right now. Yes, I’m talking about you, the reader.

The first thing I had to get working, building off yesterday’s progress, was getting the bone indices and weights into the shader. This was done by adding a few more vertex attribute arrays and pointers, and then listing them as inputs where the shader is loaded:

InitFBXSceneResources()

// bind arrays needed for animation / normals / texturing etc
glEnableVertexAttribArray(0); // pos
glEnableVertexAttribArray(1); // normal
glEnableVertexAttribArray(2); // tangent
glEnableVertexAttribArray(3); // binormal
glEnableVertexAttribArray(4); // indices for bones
glEnableVertexAttribArray(5); // weights for bones
glEnableVertexAttribArray(6); // uv

glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::PositionOffset);
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::NormalOffset);
glVertexAttribPointer(2, 4, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::TangentOffset);
glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::BiNormalOffset);
glVertexAttribPointer(4, 2, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::IndicesOffset);
glVertexAttribPointer(5, 2, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::WeightsOffset);
glVertexAttribPointer(6, 2, GL_FLOAT, GL_FALSE, sizeof(FBXVertex), (char*)FBXVertex::UVOffset);

Init()

// load shader
const char* aszInputs[] = {
    "Position",
    "Normal",
    "Tangent",
    "BiNormal",
    "Indices",
    "Weights",
    "UV",
};
const char* aszOutputs[] = {
    "outColour"
};
g_ShaderID = LoadShader(
    7, aszInputs,
    1, aszOutputs,
    "./shaders/animation_vertex.glsl",
    "./shaders/normalmap_pixel.glsl"
);
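
The other half of the job is getting the bone matrices themselves up to the vertex shader every frame. I’m not pasting my exact code, but it boils down to a single uniform upload along these lines (the “Bones” uniform name, boneCount and boneMatrices are placeholders, not the framework’s real names):

// upload the skeleton's bone matrices so the vertex shader can skin each vertex
GLint bonesUniform = glGetUniformLocation(g_ShaderID, "Bones");
glUniformMatrix4fv(bonesUniform, boneCount, GL_FALSE, (const float*)boneMatrices);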

Below is an image showing the structure of the FBX model and the animation part of it.
[Image: structure of the FBX model and its animation data]


Programming for AI assignment

Today I focused on getting the environment for my AI and Animation demo done, with lights and models imported. Because I had most of what I needed from assignment 1 and the tutorials, this was easy. The Marv model we get to use for character animation doesn’t have an actual material, so I applied a plain metal texture to him and tinted him red.

The texture for the level, however, looks really crappy. I sent a “tileAmount” uniform to the pixel shader to tell it how much to tile the texture across the level, but it didn’t help.

The shader also has a fog setting, and a normal map texture sent to it.

[Screenshot: the level so far]

I don’t know what the deal is with the pixelation that looks like noise. I guess I need to set up some sort of LOD or mipmapping so the textures blend a bit better…

[Screenshot: closeup of the wall behind Marv]


That is a closeup of the wall behind Marv. It looks like white noise.

So, after browsing the net, I found that OpenGL can generate mipmaps automatically (instead of you creating them all yourself): http://www.swiftless.com/tutorials/opengl/mipmap_generation.html. This could be a feature to add in this AI assignment along with the frustum culling.
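
For my own reference, with OpenGL 3+ it’s just a couple of calls when the texture is created. This is a sketch of the standard approach, not necessarily the exact one the tutorial above uses (textureID is a placeholder):

// generate the full mip chain for an already-loaded texture
glBindTexture(GL_TEXTURE_2D, textureID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D);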

The next steps were to try and find how FBX models animate. This stage was difficult as I had no idea how this worked and required a lot of digging around.

After digging around I managed to figure out:

  • FBX models have a “skeleton” made of “bones”. “Bones” are basically just 4×4 matrices. This skeleton can be accessed with GetSkeletonByIndex( id )
  • FBX models also have an “animation” which contains “tracks”. These “tracks” have “keyframes”. Each keyframe has a “rotation”, “translation” and “scale”. Animations can be accessed with GetAnimationByIndex( id )
  • Animations contain a “TotalTime” that can be used for looping (a rough sketch of how I picture this data is below the list)
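
To keep that straight in my head, here’s roughly how I picture the data described above. The type and member names are my own, not the framework’s:

#include <vector>
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// My mental model of the FBX animation data (all names are placeholders, not the framework's).
struct Keyframe  { glm::quat rotation; glm::vec3 translation; glm::vec3 scale; };
struct Track     { std::vector<Keyframe> keyframes; };            // one track per bone
struct Animation { std::vector<Track> tracks; float totalTime; }; // totalTime used to loop
struct Skeleton  { std::vector<glm::mat4> bones; };               // a "bone" is basically a 4x4 matrix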

What I couldn’t figure out is how the actual indices are affected by each bone. I mean, the bones are in there, but which indices do they affect? This would need to be held in some sort of array, since a bone could affect multiple indices. Then after I figure THAT out, I need to somehow get them into the shader. An array maybe? I’ll do this tomorrow.

AI Assignment – Plan

This assignment needs to have animation and AI. What does the A stand for? Artificial. What does the I stand for? Intelligence. What does animation stand for? Movement over time. Let’s do it.

Intro

This assignment is about bringing our static, boring meshes to life and making them walk around like people things. I’ll be creating a sort of AI demonstration with animations: a small combat situation where teams of 4 red and blue Marvs will fight to the death and perhaps capture a flag. The demo will need:

  • The level, including a light. The level needs textures on it.
  • At least 4 AI controlled characters with animations played at certain times for:
    • Running
    • Idle
    • Death
    • Attack
  • Animations should be blended between each other
  • A* Path finding
  • Behavior trees
  • Collision avoidance
  • Frustum culling
  • Line of sight visibility check

Combat

For the combat system, I plan on having the men run around with a fairly short line of sight. I’ll be using the physics level provided to us, which is fairly open, so if the line of sight were infinite everyone would be getting shot from all over the place.

The weapons will be plasma guns that shoot a shiny light (no actual “lights” though; a low-poly sphere will be used for demonstration purposes). Basically the characters will run around and, if they see an enemy, shoot at it. The plasma ball will need to be aimed a little bit in front of the opponent so it actually hits, which means the AI will need to do a quick measurement and guess where the enemy will be by the time the bullet arrives. The measurement will be something like taking the bullet’s speed and the enemy’s direction of movement, and working out where the two paths cross. There’s a rough sketch of this below.

[Diagram: lead-time calculation]
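
Here’s a first-pass sketch of that lead calculation, mostly as a note to future me. It’s a simple first-order approximation (it ignores the target moving during the extra travel time), and all the names are made up for the example:

#include <glm/glm.hpp>

// Guess where to aim so the plasma ball meets the target (first-order approximation).
glm::vec3 LeadTarget(const glm::vec3& shooterPos, const glm::vec3& targetPos,
                     const glm::vec3& targetVelocity, float bulletSpeed)
{
    // time the bullet needs to cover the current distance to the target
    float timeToReach = glm::length(targetPos - shooterPos) / bulletSpeed;

    // assume the target keeps walking in a straight line for that long
    return targetPos + targetVelocity * timeToReach;
}

If that misses too often, running the same calculation a second time using the predicted position should tighten the aim up.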

I’ll also add in a “capture the flag” mode. This will simply mean that either nobody, or 1 of the 4 players on each team, is set to “flag master” and is the one who goes to get the flag. There won’t be any “protect the flag” behaviour or anything, because I think the characters will be in battle a fair bit anyway.

When a player dies, he will respawn at the flag at his base. To get a point, a player must kill an enemy player, or bring the enemy flag back to his base, which awards 10 points. The game will run on a timer of 2 minutes (or whatever time works best), and the winner is announced on the screen after that. The game will then restart and continue forever.

Graphics and culling

The graphics will be as simple as possible, just enough to show off the AI and animation. The Marvs and the level will both be textured with a single material, which will include a normal map. This will be a very plain texture so it can be tiled easily and not look rubbish due to detail being stretched or uneven on the models.

I’ll be implementing a frustum culling system that makes sure only things on the screen are rendered, and it will also need to count and display how many objects are drawn. The way I’ll do this is to simply check whether each player’s bounding box is inside the camera’s view. This requires 2 maths things: working out the volume of the camera’s frustum, and the size of each bounding box (which could be set manually to save time). Those 2 volumes then need to be checked for intersection, which should be nice and difficult. A sketch of one way to do the test is below.
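
Here’s a rough sketch of the kind of test I have in mind: pull the 6 frustum planes out of the combined projection-view matrix (the standard Gribb/Hartmann trick) and test each axis-aligned bounding box against them. The names are placeholders and it isn’t wired into my actual maths classes yet:

#include <glm/glm.hpp>

struct AABB { glm::vec3 min, max; }; // axis-aligned bounding box, set manually per player

// Extract the 6 frustum planes (xyz = normal, w = distance) from projection * view.
void ExtractFrustumPlanes(const glm::mat4& projView, glm::vec4 planes[6])
{
    // glm matrices are column-major, so row i of the matrix is (m[0][i], m[1][i], m[2][i], m[3][i])
    glm::vec4 row3(projView[0][3], projView[1][3], projView[2][3], projView[3][3]);
    for (int i = 0; i < 3; ++i)
    {
        glm::vec4 row(projView[0][i], projView[1][i], projView[2][i], projView[3][i]);
        planes[i * 2 + 0] = row3 + row; // left / bottom / near
        planes[i * 2 + 1] = row3 - row; // right / top / far
    }
    for (int i = 0; i < 6; ++i)
        planes[i] /= glm::length(glm::vec3(planes[i])); // normalise each plane
}

// Returns true if the box is at least partly inside the frustum.
bool AABBInFrustum(const AABB& box, const glm::vec4 planes[6])
{
    for (int i = 0; i < 6; ++i)
    {
        // pick the corner of the box furthest along the plane normal
        glm::vec3 p(planes[i].x >= 0 ? box.max.x : box.min.x,
                    planes[i].y >= 0 ? box.max.y : box.min.y,
                    planes[i].z >= 0 ? box.max.z : box.min.z);
        if (glm::dot(glm::vec3(planes[i]), p) + planes[i].w < 0)
            return false; // completely outside this plane
    }
    return true;
}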

I’ll be using the physics level provided to us which is fairly open, so there won’t be need for portal culling.

Animation

Animation will need to be done by first figuring out how the FBX loader reads animation sequences and tracks in the FBX files for Marv. This shouldn’t be too difficult.

Once the vertex shader has been set up properly, the animations will just need to be played, with the pose swapped depending on which one needs to be played. I could keep all the animations on the video card only once and share them between the players instead of copying them per player.

AI

Probably the most difficult part of this assignment. The main AI will be fairly straightforward: a state manager for each player. On top of that, the AI will need a behaviour tree that controls those states, made of steps within steps that have to be executed to complete a node before the next task can be performed. Sounds complex.

Then there’s A*. In theory this isn’t hard, but I also need the characters to figure out where the floor is, which probably means using a navigation mesh (or at least a set of nav points) so they know where they can walk. This will be pretty hard to program. A rough sketch of the search itself is below.
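
To get my head around the search itself, here’s a bare-bones A* over a graph of nav points. It assumes each node knows its position and its neighbours, and every name in it is a placeholder rather than my actual code:

#include <vector>
#include <algorithm>
#include <cfloat>
#include <glm/glm.hpp>

struct NavNode
{
    glm::vec3 position;
    std::vector<NavNode*> neighbours;
    // working values for the search
    float gScore = FLT_MAX;        // cost from the start to this node
    float fScore = FLT_MAX;        // gScore + straight-line distance to the goal
    NavNode* cameFrom = nullptr;
    bool closed = false;
};

// Returns the path from start to goal (inclusive), or an empty vector if none exists.
std::vector<NavNode*> FindPath(std::vector<NavNode*>& allNodes, NavNode* start, NavNode* goal)
{
    for (NavNode* n : allNodes) { n->gScore = FLT_MAX; n->fScore = FLT_MAX; n->cameFrom = nullptr; n->closed = false; }

    std::vector<NavNode*> open;
    start->gScore = 0.0f;
    start->fScore = glm::distance(start->position, goal->position);
    open.push_back(start);

    while (!open.empty())
    {
        // grab the open node with the lowest fScore (a priority queue would be faster)
        auto it = std::min_element(open.begin(), open.end(),
            [](NavNode* a, NavNode* b) { return a->fScore < b->fScore; });
        NavNode* current = *it;
        open.erase(it);

        if (current == goal)
        {
            // walk the cameFrom chain backwards to build the path
            std::vector<NavNode*> path;
            for (NavNode* n = goal; n != nullptr; n = n->cameFrom)
                path.push_back(n);
            std::reverse(path.begin(), path.end());
            return path;
        }

        current->closed = true;
        for (NavNode* next : current->neighbours)
        {
            if (next->closed) continue;
            float tentative = current->gScore + glm::distance(current->position, next->position);
            if (tentative < next->gScore)
            {
                next->cameFrom = current;
                next->gScore = tentative;
                next->fScore = tentative + glm::distance(next->position, goal->position);
                if (std::find(open.begin(), open.end(), next) == open.end())
                    open.push_back(next);
            }
        }
    }
    return {}; // no path found
}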

One thing I could add is a “Commander” class that controls all the AI entities and tells them what to do as a team. I don’t think I’ll really need this, because the AI should be able to play pretty well by themselves the way I’ve planned it.