#include <bitset>
#include <iostream>
int main() {
    std::bitset<8> a = 10101010;
    std::bitset<8> b = 11111111;
    std::cout << (a ^ b);
}
When I run the above code, the result is:
11010101
Expected output is:
01010101
Am I doing something wrong?
>Solution :
You initialize a and b with decimal int literals: 10101010 is the decimal number ten million one hundred one thousand ten, not a bit pattern. A std::bitset<8> keeps only the low 8 bits of the value it is given, which is why you see 11010101 instead of the pattern you typed.
To write binary literals you need the 0b prefix (available since C++14):
#include <bitset>
#include <iostream>
int main() {
    std::bitset<8> a = 0b10101010;
    std::bitset<8> b = 0b11111111;
    std::cout << (a ^ b);
}
Output:
01010101
More info about such literals: Integer literal.