Async Meaning

Here is a good explanation from a post on Stack Overflow.

When you execute something synchronously, you wait for it to finish before moving on to another task. When you execute something asynchronously, you can move on to another task before it finishes.

And here is an example I wrote on the JavaScript async function demo page.

function FuncContainAsyncCall()
{
  function ResolveAfter2Seconds() {
    return new Promise(resolve => {
      setTimeout(() => {
        resolve('resolved');
      }, 2000);
    });
  }
 
  async function AsyncCall() {
    console.log('calling');
    var result = await ResolveAfter2Seconds();
    console.log(result);
    // expected output: "resolved"
  }
 
  AsyncCall();
  console.log("async won't block the flow!");
}
 
FuncContainAsyncCall();
 
> "calling"
> "async won't block the flow!"
> "resolved"

We can see from the console log that the async function doesn't block the current execution flow, so execution can continue even if the async function hasn't finished yet.
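For comparison, the same flow can be sketched in C++ (an illustrative analogue, not part of the original demo), where std::async starts the slow work and future::get() plays the role of await:

```cpp
#include <chrono>
#include <future>
#include <string>
#include <thread>
#include <vector>

// Mimics ResolveAfter2Seconds from the JavaScript demo: the value
// only becomes available after a delay.
std::string ResolveAfter2Seconds() {
    std::this_thread::sleep_for(std::chrono::seconds(2));
    return "resolved";
}

// Collects the "console output" in order, so the ordering is easy to check.
std::vector<std::string> RunDemo() {
    std::vector<std::string> log;
    log.push_back("calling");
    // Launch the slow task on another thread; this call does not block.
    std::future<std::string> result =
        std::async(std::launch::async, ResolveAfter2Seconds);
    // The current thread keeps going while the task runs.
    log.push_back("async won't block the flow!");
    // get() blocks until the value is ready, like `await`.
    log.push_back(result.get());
    return log;
}
```

Calling RunDemo() records the message after the async call before the task's result, the same ordering as the JavaScript console output above.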

Behaviour Tree In Unreal Engine 4

In this post, I will be talking about the Behaviour Tree in Unreal Engine 4 based on my own experience. Keep in mind that I am using Unreal Engine 4.15, though the version should not significantly affect the topics covered here.

I would appreciate confirmation or corrections from more experienced developers. Thanks in advance.

1. Tick

This post explains the tick behaviour of the Behaviour Tree in UE4 very clearly. (I will add my own understanding in the future when I have time. Maybe in Chinese as well.)

One way UE4's Behaviour Tree tick differs from other BT implementations is that each tick only ticks the currently active nodes, such as Services or Tasks with an Event Receive Tick event, rather than traversing the whole node tree from the root on every tick.
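The difference can be illustrated with a toy model (plain C++, not actual engine code): the tree keeps track of which nodes are active, for example running Services or Tasks, and each tick touches only those, never the whole tree.

```cpp
#include <string>
#include <vector>

// Toy model of UE4-style Behaviour Tree ticking (illustrative only,
// not engine source).
struct ToyNode {
    std::string Name;
    int TickCount = 0;
    void Tick(float /*DeltaSeconds*/) { ++TickCount; }
};

struct ToyBehaviourTree {
    std::vector<ToyNode> AllNodes;       // every node in the tree
    std::vector<int> ActiveNodeIndices;  // e.g. running Services/Tasks

    // UE4-style: only the currently active nodes receive the tick,
    // instead of re-traversing the whole tree from the root.
    void Tick(float DeltaSeconds) {
        for (int Index : ActiveNodeIndices) {
            AllNodes[Index].Tick(DeltaSeconds);
        }
    }
};
```

With three nodes and only one marked active, ten ticks leave the inactive nodes completely untouched, which is the behaviour described above.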

2. Abort

When you click a Condition (decorator), all the nodes covered by the green (Self) or blue (Lower Priority) highlight are the ones that will be aborted. (If those tasks contain an Event Receive Abort AI event, it will be called.)

3. In Behaviour Tree tasks, there are three types of events: Event Receive Execute AI, Event Receive Tick AI, and Event Receive Abort AI. What do they mean, and when do they get fired?

Event Receive Execute AI is called once when the Task starts.

Event Receive Tick AI is called at a specific tick rate. (You can check Delta Seconds to see the actual interval.)

Event Receive Abort AI is only called when the task is aborted (as mentioned above).
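The lifecycle of these three events can be sketched with a toy task runner (an illustrative model, not UE4 source): Execute fires once at the start, Tick fires repeatedly while the task runs, and Abort fires only if the task is interrupted.

```cpp
#include <string>
#include <vector>

// Toy model of a Behaviour Tree task's event lifecycle
// (illustrative only, not engine code).
struct ToyTask {
    std::vector<std::string> Events;

    void ReceiveExecute() { Events.push_back("Execute"); }  // fired once at start
    void ReceiveTick(float) { Events.push_back("Tick"); }   // fired every tick while running
    void ReceiveAbort() { Events.push_back("Abort"); }      // fired only if interrupted
};

// Run a task for a number of ticks; abort it if bInterrupted is set.
void RunTask(ToyTask& Task, int NumTicks, bool bInterrupted) {
    Task.ReceiveExecute();            // Event Receive Execute AI
    for (int i = 0; i < NumTicks; ++i) {
        Task.ReceiveTick(0.016f);     // Event Receive Tick AI
    }
    if (bInterrupted) {
        Task.ReceiveAbort();          // Event Receive Abort AI
    }
}
```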

A Good Way To Find Out Which Blueprint Called A Native C++ Function

Sometimes we write BlueprintCallable functions in C++ and want to know which Blueprint called them.

1) Put a breakpoint in the native C++ function and wait for it to be hit.

2) From this post, there is a command we can run from the Immediate Window in Visual Studio:

{,,UE4Editor-Core}::PrintScriptCallstack(false)

The parameter indicates whether to empty the Blueprint call stack; I suggest passing false so we can print it again later.

To clear the Immediate Window, you can use:

>cls

The Definitions Of Different FAttachmentTransformRules For AttachToComponent Method in UE4

So, in Unreal Engine 4.15.3, the AttachTo method, which was used to attach one component to another, is deprecated.

We are supposed to use another one called AttachToComponent.

However, the FAttachmentTransformRules parameter of this method isn't explained clearly in Epic's docs or in the engine source code.

Based on a reply in this topic and my own experience, their meanings are:

-> KeepRelativeTransform: the component is attached using a relative offset given by its current relative transform. This is the one you normally want to use, but it depends on the task.

-> KeepWorldTransform: the component keeps its world transform when attached; that is, it is attached exactly where it is in world space, and its relative transform is recalculated with respect to the component it is being attached to. So if the component you are attaching is at world position FVector(0.f, 0.f, 100.f), it will be attached and keep that world position.

-> SnapToTargetIncludingScale: snaps the component to the new parent, resetting its relative location, rotation, and scale so the component matches the target's transform exactly. (This one is usually used to attach a weapon to a character mesh.)

-> SnapToTargetNotIncludingScale: does the same as the above, but does not snap the scale; the component keeps its world scale.
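Ignoring rotation and scale, the location part of these rules can be modeled with a simplified sketch (illustrative only, not the engine's implementation):

```cpp
// Minimal stand-in for FVector, translation only.
struct Vec3 { float X = 0, Y = 0, Z = 0; };

Vec3 operator+(Vec3 A, Vec3 B) { return {A.X + B.X, A.Y + B.Y, A.Z + B.Z}; }
Vec3 operator-(Vec3 A, Vec3 B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }

// Simplified model of an attachable component (not UE4 source).
struct ToyComponent {
    Vec3 RelativeLocation;  // location relative to the parent
    Vec3 WorldLocation;     // location in world space
};

// KeepRelativeTransform: the existing relative offset is kept,
// so the new world location is parent + relative.
void AttachKeepRelative(ToyComponent& Child, Vec3 ParentWorld) {
    Child.WorldLocation = ParentWorld + Child.RelativeLocation;
}

// KeepWorldTransform: the world location is kept, so the relative
// offset is recomputed against the new parent.
void AttachKeepWorld(ToyComponent& Child, Vec3 ParentWorld) {
    Child.RelativeLocation = Child.WorldLocation - ParentWorld;
}

// SnapToTarget: the relative offset is reset, so the component
// lands exactly on the parent (or its socket).
void AttachSnapToTarget(ToyComponent& Child, Vec3 ParentWorld) {
    Child.RelativeLocation = {0, 0, 0};
    Child.WorldLocation = ParentWorld;
}
```

For example, a component at world Z = 100 attached with the KeepWorldTransform-style rule to a parent at Z = 40 keeps its world location and ends up with a relative Z of 60.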

Differences between "Orient Rotation to Movement" and "Use Controller Desired Rotation" in Unreal Engine 4

These are two options in the CharacterMovementComponent in UE4 (My current version is 4.15.3).

You probably won’t see much difference for AI-controlled characters, but for your player character the difference is this:

“Orient Rotation to Movement”: Your character will turn to face the direction of travel. No matter which way the camera is facing, your character will always face the direction in which he or she is moving.

“Use Controller Desired Rotation”: Your character will orient to the direction of the controller rotation. In most games, this is visually represented by the direction of the camera so your character’s back is always to the camera and he or she rotates to match when the camera is swung. Of course, you can decouple your camera from following the control rotation as well so a more accurate description is that the character will face whichever direction the “right stick” or “Mouse X” is pointing.

Again, you don't control the AI, so both of these options look identical to an observer. I think "Orient Rotation to Movement" looks more natural, but there's really no big difference, visual or otherwise, for the AI.

The idea is from a post in the Unreal Engine Forums. Here is the link.

Difference between “Orient Rotation to Movement” and “Use Controller Desired Rotation”

Unreal Engine 4 AIController::SetFocus

From this link

AIController::SetFocus is used to keep the AI-controlled pawn rotated towards a given actor or point in space. There are a few priorities of focus points, allowing easy creation of, for example, scripted overrides:

1. The AI is moving, and path following automatically sets the intermediate destination as the focal point with low priority (EAIFocusPriority::Move). The AI's pawn faces forward as it moves around.

2. The AI receives a target and sets it as the focal point with higher priority (EAIFocusPriority::Gameplay). The pawn starts facing toward the target as it keeps moving.

3. The level script forces the AI to look at a specific point, setting it with an even higher priority (EAIFocusPriority::LastFocusPriority + 1, usually defined by the project as something more readable…). The pawn rotates toward this point.

4. Movement stops; the pawn still looks at whatever the script told it to.

5. The script clears the forced focus point; the pawn rotates towards the most important remaining focal point, which is now the target.

6. The target is cleared; the pawn keeps its current rotation, since there are no focal points left to look at.
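The priority behaviour in the steps above can be sketched as a small model (illustrative only, not engine source): the focal point with the highest set priority wins, and clearing one falls back to the next.

```cpp
#include <map>
#include <string>

// Toy model of AIController focus priorities (illustrative sketch,
// not UE4 source). Higher priority values win, mirroring
// EAIFocusPriority::Move < EAIFocusPriority::Gameplay < script overrides.
struct ToyFocusManager {
    std::map<int, std::string> FocusByPriority;  // priority -> focal point

    void SetFocus(int Priority, const std::string& FocalPoint) {
        FocusByPriority[Priority] = FocalPoint;
    }
    void ClearFocus(int Priority) { FocusByPriority.erase(Priority); }

    // The pawn faces the focal point with the highest priority;
    // if none is set, it keeps its current rotation (empty result).
    std::string CurrentFocus() const {
        if (FocusByPriority.empty()) return "";
        return FocusByPriority.rbegin()->second;  // highest key wins
    }
};
```

Setting Move, then Gameplay, then a script priority reproduces steps 1 to 3; clearing them in reverse order reproduces steps 5 and 6.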

Set Up Navmesh Invokers In Unreal Engine 4.15

Sometimes our map is very big, so to reduce the cost we don't want to generate a navmesh for the whole map.

We can use navmesh invokers to generate a navmesh only around the NPCs, so they can still move around.

First, we need to change Project Settings->Engine->Navigation Mesh->Runtime Generation to Dynamic.

Second, we go to Project Settings->Navigation System and check Generate Navigation Only Around Navigation Invokers.

Then we need to put a navigation volume into the level and make sure it encapsulates the area where we want the navmesh to be generated.

Finally, we open our NPC actor and add a Navigation Invoker component to it.

Then we register the navigation invoker in Event Begin Play.