In World War II, we Americans did not go to war with Germany on behalf of Great Britain, not when Britain declared war on Hitler’s Germany and not when it was then defeated in France. We went to war with Germany only when Hitler declared war on us, four days after Pearl Harbor in December 1941.
