I’ve studied WW2 and read a gazillion books about it over the years. One of the most fascinating things is reading actual accounts of and interviews with German soldiers. It doesn’t seem possible, but even AFTER the war was over, years after the truth had come out… many (if not most) still thought they were justified in their beliefs and actions. I always marvel at how effective Nazi propaganda was.
D-Day? Those were American & British invaders, and the Nazis were just trying to defend “united Europe” from those murdering scum. (Conveniently forgetting that Germany had bombed, raped & killed those countries into submission. Nobody joined Hitler’s “united Europe” voluntarily.) The war in the East? Why, it was those nasty Ivans trying to overrun the Fatherland. (Never mind that Hitler launched a sneak attack against the Soviet Union first, and the German atrocities there were mind-boggling.)
Concentration camps and mass murder? Fake news, never happened. Roosevelt & Churchill were the warmongers who had thrown the world into chaos and destruction. Hitler? Well, he had tried at every turn to avoid conflict, but the “international Jewry” that controlled the USA and the Western world had plotted the whole thing, and was thus responsible for the deaths of millions.
I mean, it’s impossible to even type all that with a straight face, but that’s what they thought. So to answer the question… No, most German soldiers did not realize they were on the wrong side of history. Or if they did realize it, they were never willing to admit it. Most died believing in their cause.