Meaning of West Germany

1. A republic in north-central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990.



