Hi,
The angle between vector (1, 0) and vector (0, 1) should be 90 degrees, as shown below:

But why, when I log BABYLON.Angle.BetweenTwoPoints(new BABYLON.Vector2(1, 0), new BABYLON.Vector2(0, 1)).degrees(), does it give me 135?
BABYLON.Angle.BetweenTwoPoints(a, b) gets the angle between the x axis and the vector b - a, which in your case is (-1, 1), and that is indeed 135°.
In your case, you should take the dot product to calculate the angle:
dot(a, b) = norm(a) * norm(b) * cos(θ), where θ is the angle between a and b
Dot is the static method BABYLON.Vector2.Dot, and the norm is the vector's length() method.
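For example, a minimal sketch of that calculation (the variable names here are just for illustration):

const a = new BABYLON.Vector2(1, 0);
const b = new BABYLON.Vector2(0, 1);

// theta = acos(dot(a, b) / (|a| * |b|)); assumes neither vector is zero-length
const cosTheta = BABYLON.Vector2.Dot(a, b) / (a.length() * b.length());
const thetaDegrees = BABYLON.Angle.FromRadians(Math.acos(cosTheta)).degrees();

console.log(thetaDegrees); // 90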
Ah I see, thanks a lot!
Why does it come out as (-1, 1) in this case? Can you explain more? Do we have any playground for this?
In the example, B = (0, 1) and A = (1, 0), so B - A = (0 - 1, 1 - 0) = (-1, 1).
The name BetweenTwoPoints is a bit of a misnomer, since this function
const angle = BABYLON.Angle.BetweenTwoPoints(a, b);
finds the gradient angle of the Vector2 b - a.
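If it helps, here is a small sketch (my own example values, not from a playground) showing that the helper matches atan2 of the difference vector:

const a = new BABYLON.Vector2(1, 0);
const b = new BABYLON.Vector2(0, 1);

const diff = b.subtract(a); // (-1, 1)
const viaHelper = BABYLON.Angle.BetweenTwoPoints(a, b).degrees();
const viaAtan2 = BABYLON.Angle.FromRadians(Math.atan2(diff.y, diff.x)).degrees();

console.log(viaHelper, viaAtan2); // both 135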