English
Etymology
From informal use describing the British-, French-, and American-occupied zones of Germany following World War II, under the influence of the existing German Westdeutschland and Westdeutscher, which had been used to describe those areas of German-speaking Europe since the 17th and 18th centuries.[1]
Proper noun
West Germany
- (historical, 1949–1990) A former country in Central Europe, distinguished from the German Democratic Republic, commonly known as East Germany. Official name: Federal Republic of Germany.
- (since 1990) The areas of reunified Germany that formed the Federal Republic during that period, distinguished from the former East German areas.
- (historical, uncommon, 1945–1949) Collectively, the British-, French-, and American-occupied zones of Germany, distinguished from the Soviet-occupied zone.
References
- ^ Oxford English Dictionary, 3rd ed. "West German, adj. and n." Oxford University Press (Oxford), 2012.