The x^y notation is popular outside of programming. However, the ^ operator is a bitwise xor, so in a programming context x^y means “all the bits which are set in x or y, but not both”.
| X | Y | X&Y | X\|Y | X^Y |
|---|---|-----|------|-----|
| 0 | 0 | 0   | 0    | 0   |
| 0 | 1 | 0   | 1    | 1   |
| 1 | 0 | 0   | 1    | 1   |
| 1 | 1 | 1   | 1    | 0   |
So I use x**y instead, because it doesn’t have that ambiguity.
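A quick sketch of the difference in Python, where ^ is bitwise XOR and ** is exponentiation:

```python
x, y = 2, 3

# Bitwise XOR: 0b10 ^ 0b11 == 0b01
print(x ^ y)   # → 1

# Exponentiation: 2 * 2 * 2
print(x ** y)  # → 8
```

Writing 2^3 expecting 8 and getting 1 is exactly the ambiguity the ** notation avoids.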