West Germany
English
Proper noun
- (historical, 1949–1990) The Federal Republic of Germany, distinguished from the German Democratic Republic ("East Germany").
- (since 1990) The territory of the former Federal Republic of Germany as it existed before reunification, distinguished from the former East German areas.
- (historical, uncommon, 1945–1949) A collective name for the British-, French-, and American-occupied zones of Germany, distinguished from the Soviet-occupied zone.