West Africa

proper noun

  • The western part of the African continent, especially the countries bounded by and including Mauritania, Mali, and Niger in the north and Gabon in the south.