(I’m a new programmer… shhhhh)
When looking at other people’s code, specifically code with bitwise operations, I see people using hexadecimal instead of decimal.
( _data & 0x80) versus ( _data & 128)
In the programs I’ve run, it doesn’t make a difference. Is that it? Does it just boil down to preference, or am I missing something?
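Here’s a throwaway C example of what I mean (the variable name and test value are just made up to show both forms give the same result):

```c
#include <stdio.h>

int main(void) {
    unsigned char _data = 0xC3;  /* arbitrary test value */

    /* Both masks are the same number: 0x80 == 128 */
    unsigned char hex_mask = _data & 0x80;
    unsigned char dec_mask = _data & 128;

    /* Prints the same value twice: "hex form: 128, decimal form: 128" */
    printf("hex form: %u, decimal form: %u\n", hex_mask, dec_mask);
    return 0;
}
```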