Definition of Namibia
Noun: Namibia nu'mi-bee-u
- A republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa
- Republic of Namibia, South West Africa