What does Namibia mean?
• NAMIBIA (noun)

The noun NAMIBIA has 1 sense:

1. a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high Namibian plateau of South Africa

Familiarity information: NAMIBIA used as a noun is very rare.