Why doesn’t the default value of LayerMask use a binary representation such as 0b0001? The current default is 0x0fffffff in hexadecimal; is there any advantage to this?
I feel that binary is easier to understand and to set than hexadecimal. Although the two notations are theoretically equivalent, the default value of 0x0fffffff makes it awkward to work in binary unless I set the LayerMask for each object myself.