COMMENT Nelson Otieno 13 May 2022
“Why Africa” Search Results
Search results are enabled by algorithms that collect, organize, and categorize data into contexts to produce desired results. With this in mind, I conducted a quick (and admittedly unscientific) analysis by searching for the phrase ‘Why Africa’ on the Yahoo search engine. Yahoo provided six results: two drawn from my own search history and four generic suggestions, which read as follows: ‘Why Africa is poor’, ‘Why Africa has less Covid 19’, ‘Why Africa is underdeveloped’, and ‘Why Africa is called a dark continent’. Suffice it to note that 75% of the generic results about Africa are laced with a negative tinge. While these framings are not entirely untrue, their prioritization points to an underlying push factor whose consequence is to paint a significantly negative image of the African continent.
The prioritization of my search results was drawn from over 2 billion websites on the internet.[1] The algorithms, products of complex Artificial Intelligence (AI) technologies, make complex decisions about which links to suggest and display as the most important on a subject. Such results may be with us for a long time, considering that AI technologies are projected to become even more entrenched in business and everyday activities in 2022 and beyond,[2] owing to their ability to utilize the large volumes of data collected from individuals.[3]
Problem of White Domination
Researchers are key users of online content who increasingly interact with AI algorithms developed to enhance their access to, and experience of, products and services. Currently, they can resort to a growing list of search engines, including Google Search, Bing, Yahoo, Google Scholar, Dogpile, Webopedia, and Internet Archive search. These search engines are managed by companies that are mostly based in “developed” States and whose control and operation are dominated by whiteness.[4] This management and control do not appear problematic at first sight. In light of the results on my Yahoo search, however, a larger question that I must raise is whether the nature of the relationship between the providers of these search engines and researchers in the Global South could give rise to concerns about racism.
In her 2018 book ‘Algorithms of Oppression: How Search Engines Reinforce Racism’, Safiya Noble opines that racist concerns underpin the development of these algorithms. To the Associate Professor, new technologies like AI embody social practices that are shaped by racial dynamics. Her rationale for this linkage, which I find compelling, is twofold. First, if the staff, management, and control of coding for AI technologies are white-dominated, there is a high propensity to benchmark against the ideal of ‘whiteness’. In other words, the challenge of white domination in traditional research is carried over into online content management. Her second rationale is that algorithms are developed by human beings. These engineers carry their biases into the code, including their views of wealth, power, democracy, and other white ideals.
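Noble’s second rationale can be made concrete with a deliberately simplified sketch. The code below is hypothetical and does not reflect any real search engine’s implementation; the signals, weights, and sample titles are all invented for illustration. The point is that whoever chooses the weights decides which signals count as ‘relevance’, and a popularity-heavy weighting will keep surfacing whatever users already click on most, reinforcing existing patterns rather than questioning them.

```python
# A toy illustration (not any real search engine's code) of how a
# developer's hand-picked weights shape which results rank first.
# All signals, weights, and sample data below are hypothetical.

def score(result, weights):
    """Combine ranking signals into a single relevance score.

    The choice of `weights` is an editorial judgment made by a human
    engineer -- that judgment is baked into the algorithm's output.
    """
    return sum(weights[signal] * value
               for signal, value in result["signals"].items())

# Hypothetical candidate completions for the query "Why Africa ...".
results = [
    {"title": "Why Africa is poor",
     "signals": {"click_rate": 0.9, "link_count": 0.8, "recency": 0.3}},
    {"title": "Why Africa leads in mobile money innovation",
     "signals": {"click_rate": 0.4, "link_count": 0.3, "recency": 0.9}},
]

# A designer who weights past popularity heavily will surface whatever
# users already click on -- reproducing existing (possibly biased) patterns.
popularity_weights = {"click_rate": 0.6, "link_count": 0.3, "recency": 0.1}

ranked = sorted(results,
                key=lambda r: score(r, popularity_weights),
                reverse=True)
for r in ranked:
    print(r["title"])
```

Under these invented weights, the already-popular negative result scores higher and is displayed first; a different choice of weights (say, favouring recency) would invert the ordering. Nothing in the mathematics is ‘neutral’: the ordering follows directly from human design choices.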
Considering Noble’s recent affirmations and earlier ones,[5] African researchers and consumers of online content must be aware of the colonial and racist ideals that continue to underpin the assumptions built into search engines’ algorithms. Though Noble has recommended that more focus be placed on justice when coding, implementing such a proposal may not be easy. I say so because of the existing power imbalances between Western internet service providers and service users in the Global South. In fact, so powerful are the service providers that the African Union has recently recognized that its member States have limited ability to influence the behaviour of these foreign service providers.[6]
Conclusion
Several multinational companies which provide internet services have invested in building strong, fair, and accurate algorithms. That, however, is not enough to stamp out white supremacy in personal data processing. It is therefore not enough for researchers of African issues merely to be aware of these racist concerns. They must also take deliberate steps to overcome racist online prioritization and to seek out decolonial perspectives that algorithms may otherwise not prioritize. Relevance must be pegged on parameters beyond just the quick, free, and accessible search engine if meaningful contributions are to be made in academic discourse.
[1] https://www.seoquake.com/blog/how-search-engine-algorithms-work/
[2] https://www.toolbox.com/tech/artificial-intelligence/interviews/top-ai-technology-trends-2022/
[3] https://www.degruyter.com/document/doi/10.18574/9781479833641/html?lang=en
[4] https://www.theguardian.com/technology/2022/mar/18/google-black-employees-lawsuit-racial-bias
[5] https://thevisualcommunicationguy.com/2015/10/01/are-search-engines-racist-surprising-google-images-results/
[6] https://www.internetsociety.org/wp-content/uploads/2018/05/AUCPrivacyGuidelines_2018508_EN.pdf