The “2+2” example is interesting because obviously you can break it up into “2”, “+” and “2”. But people say that “2+2=4” is unquestionable and deny there is hidden complexity there that could be analyzed. Those same people have no idea how addition is implemented in a computer (using smaller parts – multiple basic logic operators). But they figure that, for human beings, it’s just one step: you just somehow know the answer. That’s after you learn it, of course – babies don’t know arithmetic. And kids generally don’t learn it until after they learn the numbers and learn to count. But still, it seems like a foundational axiom to people. If anything’s infallible, their understanding of that is!

If it’s one step in humans, what does one step mean? One neuron firing? Only one neuron involved? Realistically, there are at least thousands of neurons involved – millions wouldn’t be surprising. Addition is broken down into multiple smaller parts or steps in our brain, as it is in a silicon computer.
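To illustrate the point about computers, here is a minimal sketch of addition built out of basic logic operators – a ripple-carry adder, which is one common way (not the only way) that hardware does it. Even “2+2” decomposes into many tiny gate-level steps:

```python
def full_adder(a, b, carry_in):
    # A full adder uses only basic logic operators:
    # XOR produces the sum bit, AND/OR combine the carries.
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add(x, y, bits=8):
    # Chain full adders bit by bit; each bit's carry
    # "ripples" into the next position.
    carry = 0
    result = 0
    for i in range(bits):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add(2, 2))  # 4 – reached through a chain of tiny logic steps
```

So the “one step” of 2+2 is, in a computer, dozens of XOR, AND and OR operations strung together.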
Different people may think of “2+2=4” in different ways. Feynman gave an example of this in an interview: