So, I have an answer that's a bit different from the others, and it may give you a slightly different perspective.
When I was studying math in university (at the time Waterloo had the ridiculous requirement that Comp Sci majors had to be in Honours Math), one of the things that I liked the most was the discovery that math was both mutable and extensible. You can take things that you "know", like how to add, and you can apply that knowledge to ever-expanding sets.
Now, you actually learned this yourself, back in grade school. First, you learned to count. And then you learned to add. And then you learned subtraction and division. And then (this is the cool part) you learned it all over again for fractions. And for real numbers. And percentages.
It's not that you invalidated the whole idea that 1+1=2, you just realized that you could also calculate (1/2) + (1/2) + (1/2) + (1/2) and also get 2! Same little "+" operator, but doing something different.
Now, I want to return to the original question, and ask "What if 1+1 was not actually 2?" Would that make mathematics "fundamentally wrong"?
Well, now things start to get interesting. Not only is this not "impossible", it's actually incredibly interesting. Let's see what we can do...
First, let's imagine a mathematics where addition works like this:
1+0=1
0+1=1
So far, totally normal. Now watch this:
1+1=0
"Whoa.... what???" I hear you saying. "That looks a little wonky."
It is a bit wonky, but bear with me. We've got some more:
1-0=1
1-1=0
0-1=1
0-0=0
1*0=0
0*1=0
1*1=1
1/1=1
0/1=0
1/0=undefined
0/0=undefined
Now, you have to follow along here, because this is where things get fun. Believe it or not, we have a mathematical system here that works perfectly well! Yes, it's a bit weird, for example 1+1+1=1, and 1+1+1+1=0, but actually, you could go on to form all sorts of mathematical equations, and they would all work out just fine:
A + B + C = A + C + B
A + 0 = A
A - A= 0
A * (B + C) = (A * B) + (A * C)
Except that they are performed in a universe where 1+1=0.
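If you want to play along at home, here is a small sketch of that arithmetic in Python (my own illustration, not part of the original question, and the function names are just ones I made up). Addition and subtraction both turn out to be the exclusive-or (XOR) of the two values, and multiplication is AND. The loop at the bottom spot-checks every identity above for every possible choice of A, B, and C:

```python
# Arithmetic on the set {0, 1} where 1+1=0.
# Addition and subtraction are both XOR; multiplication is AND.

def add(a, b):
    return a ^ b          # 1+0=1, 0+1=1, 1+1=0, 0+0=0

def sub(a, b):
    return a ^ b          # 1-0=1, 1-1=0, 0-1=1, 0-0=0

def mul(a, b):
    return a & b          # 1*1=1, everything else is 0

def div(a, b):
    if b == 0:
        raise ZeroDivisionError("1/0 and 0/0 are undefined")
    return a              # 1/1=1, 0/1=0

# Check each identity from the list above, for every choice of A, B, C:
for A in (0, 1):
    for B in (0, 1):
        for C in (0, 1):
            assert add(add(A, B), C) == add(add(A, C), B)          # A + B + C = A + C + B
            assert add(A, 0) == A                                  # A + 0 = A
            assert sub(A, A) == 0                                  # A - A = 0
            assert mul(A, add(B, C)) == add(mul(A, B), mul(A, C))  # A * (B + C) = (A * B) + (A * C)

print("Every identity holds in a universe where 1+1=0.")
```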
OK, so this might leave you a bit nonplussed. Especially since a minute ago I promised that this would be "incredibly interesting", but unless you are a math geek I've not really delivered.
So, now some more background. A hundred years ago, when mathematicians started playing with this idea, that is to say the operations that could be performed on the set consisting of {"1" and "0"}, this was an uninteresting backwater of set theory, more or less ignored by the "real" mathematicians.
Until about 60 years ago, when the first rudimentary computers came along. And people discovered that it was very, very difficult to store voltages like "5" or "37", but really easy to store voltages like "on" and "off". And suddenly, operations on the set of {"1" and "0"} became really, really important. Especially if we changed our little addition definition in one small way:
How about when we add 1+1, we still get 0, but we carry a 1 to the next digit? So we get something like this:
1+1=10
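To see that this one small change really is enough, here is another little Python sketch (again just my illustration, with made-up names). It adds whole numbers one binary digit at a time: XOR gives each digit of the sum, exactly as in our 1+1=0 table, and AND decides when a 1 gets carried to the next digit.

```python
# Addition-with-carry built from the {0, 1} operations above.
# Per digit: the sum is XOR (our "+" where 1+1=0), the carry comes from AND.

def add_binary(x, y):
    """Add two non-negative integers one binary digit at a time."""
    result, carry, position = 0, 0, 0
    while x or y or carry:
        a, b = x & 1, y & 1                  # the current digit of each number
        digit = a ^ b ^ carry                # 1+1=0 for this digit...
        carry = (a & b) ^ (carry & (a ^ b))  # ...but carry a 1 to the next one
        result |= digit << position
        x, y, position = x >> 1, y >> 1, position + 1
    return result

print(bin(add_binary(0b1, 0b1)))   # 0b10, i.e. 1+1=10
print(add_binary(13, 29))          # 42, the same answer ordinary addition gives
```

That loop is, give or take, what the adder circuit inside the machine in front of you is doing in hardware.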
And suddenly, this obscure bit of math, heretofore thought useful only in an invented universe, becomes the basis of all information represented digitally, right up to and including the very image you are looking at right now! With that small extension, we can build a machine to do what previously took a human.
And that, my friends, is why math is useful: because it allows us to build and model universes, even when the existence of those universes is unknowable. And if and when we meet a higher being, you can bet your bottom dollar that she will also have developed a system for building and modeling universes.