Why doesn't the default value of LayerMask use a binary representation, e.g. 0b0001?

Why doesn't the default value of LayerMask use a binary representation, e.g. 0b0001? The default value is currently 0x0fffffff in hexadecimal; is there any advantage to this?

I feel that binary is easier to understand and set than hexadecimal. Although they are theoretically the same, the default value of 0x0fffffff makes it difficult to use binary unless I set the LayerMask for each object myself.
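For example, here is a quick sketch of what I mean (the class and field names are just for illustration; binary literals need C# 7+):

```csharp
using UnityEngine;

public class MaskExample : MonoBehaviour
{
    // A single-layer mask reads nicely as a binary literal (layer 0 only).
    public LayerMask singleLayer = 0b0000_0001;

    // The current default is much shorter written in hex than in binary.
    public LayerMask currentDefault = 0x0fffffff;

    void Start()
    {
        // Building a mask from layer names avoids counting bits entirely;
        // "Default" and "Water" are Unity's built-in layer names.
        int byName = LayerMask.GetMask("Default", "Water");
        Debug.Log(byName);
    }
}
```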

The problem is that 0x0fffffff is really long when written out in binary, so it is mostly about the size of the literal when you write it in your code.
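Just to show the size difference, here is a minimal sketch in plain C#, assuming the 0x0fffffff default mentioned above:

```csharp
// The same value written both ways; the two literals compare equal.
const int MaskHex = 0x0fffffff;                                // 10 characters
const int MaskBin = 0b0000_1111_1111_1111_1111_1111_1111_1111; // 41 characters
System.Console.WriteLine(MaskHex == MaskBin);                  // prints True
```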

I understand that, but why is the default value 0x0fffffff when it would be easier to write if it were 0x00000001 (no need to count the zeros)? Is there some hidden benefit to this?

There is no particular reason :slight_smile: we did not give it a lot of thought :X
