1 + 1
1 + 1 = 2 because that's how we've defined it, in the same way that the word "cat" refers to small, furry, feline creatures because that's how we've defined it.
Math is just a model. We define 1 + 1 to equal 2 because it's useful. Because addition is defined the way it is, it can represent, for example, what happens when I have an apple and someone gives me another apple: 1 apple + 1 apple = 2 apples. If we defined it differently, apples would still behave the same way; we just wouldn't be able to use addition to talk about apple transactions.
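To make "it follows from the definition" concrete, here's a sketch using the standard Peano-style definition of addition, where S(n) means the successor of n (the next number after n), 1 is defined as S(0), and 2 as S(1). The defining rules are a + 0 = a and a + S(b) = S(a + b), so:

1 + 1 = 1 + S(0) = S(1 + 0) = S(1) = 2

The result isn't a discovery about apples; it's a consequence of how "+", "1", and "2" were defined in the first place.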
There are also plenty of real-world situations that look like addition but aren't well modeled by it. For example, when two clouds merge you get one cloud, but that doesn't mean 1 + 1 = 1; it just means that merging clouds isn't the kind of combining that "+" is talking about.
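If you really wanted a formal operation that behaves like cloud-merging, you could define one. As an illustration (my own notation, not anything standard), let a ⊕ b = max(a, b); then

1 ⊕ 1 = max(1, 1) = 1

which is a perfectly consistent definition. It just isn't addition, so it says nothing about whether 1 + 1 = 2.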