Germany Declares War on the United States in December 1941, During Adolf Hitler's Regime

Adolf Hitler declares war on the United States, bringing America, which had been neutral, into the European conflict. The bombing of Pearl Harbor surprised even Germany. Although Hitler had made an oral agreement with his Axis partner Japan that Germany would join a war against […]
