Hi,

The angle between vector (1, 0) and vector (0, 1) should be 90 degrees.

But when I log `BABYLON.Angle.BetweenTwoPoints(new BABYLON.Vector2(1, 0), new BABYLON.Vector2(0, 1)).degrees()`, it gives me 135. Why?


`BABYLON.Angle.BetweenTwoPoints(a, b)` returns the angle between the x axis and the `b - a` vector, which in your case is `(-1, 1)`, and that is indeed 135°.
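A minimal sketch of that behavior, using plain `{x, y}` objects and `Math.atan2` rather than the actual Babylon.js implementation:

```javascript
// Sketch: what BetweenTwoPoints measures — the angle of (b - a)
// against the positive x axis. Plain objects stand in for Vector2.
function betweenTwoPointsDegrees(a, b) {
  return (Math.atan2(b.y - a.y, b.x - a.x) * 180) / Math.PI;
}

// b - a = (0 - 1, 1 - 0) = (-1, 1), which points up-left:
console.log(betweenTwoPointsDegrees({ x: 1, y: 0 }, { x: 0, y: 1 })); // 135
```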

In your case, you should use the dot product to calculate the angle between the two vectors:

`dot(a, b) = norm(a) * norm(b) * cos(θ)`

`Dot` is a static method on `BABYLON.Vector2`, and `norm` is the vector's `length()` method.
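Putting that together, a small sketch of the dot-product approach (plain `{x, y}` objects here; with Babylon.js you would use `BABYLON.Vector2.Dot(a, b)` and `a.length()` instead of the inline arithmetic):

```javascript
// Angle between two vectors via dot(a, b) = |a| * |b| * cos(θ),
// solved for θ and converted to degrees.
function angleBetweenDegrees(a, b) {
  const dot = a.x * b.x + a.y * b.y;   // BABYLON.Vector2.Dot(a, b)
  const lenA = Math.hypot(a.x, a.y);   // a.length()
  const lenB = Math.hypot(b.x, b.y);   // b.length()
  return (Math.acos(dot / (lenA * lenB)) * 180) / Math.PI;
}

console.log(angleBetweenDegrees({ x: 1, y: 0 }, { x: 0, y: 1 })); // 90
```

Note that `acos` only returns values in [0°, 180°], so this gives the unsigned angle between the vectors.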


Ah I see, thanks a lot!