Race being often ignored in studies

    The first issue contributing to race being often ignored in studies is the source problem. Because of racial bias in favor of whites, the achievements of other races received less recognition. According to De la Peña (2010), "in order to write the histories on race and technology that are missing, historians must ask about what is missing from the record and archives" (p. 926). Another factor contributing to the absence of race from research is the greater emphasis placed on gender inclusion. Demands to promote women's participation in technology often failed to account for existing ethnic disparities.

    Race is also erased from algorithmic computations because of developers' prejudices. Algorithmic computations are not objective; rather, they depend on the developers' preconceptions, and as such they reveal those preconceptions and systemic disparities. Marijan (2018) found that "new algorithms are further embedding biases about the poor and putting these vulnerable populations in an ever more precarious position" (p. 6). These biases stem from the views of the people who design these systems.

    Carolyn de la Peña believes that the answer to racial prejudice in the history of technology is for historians to be more willing to look beyond the accessible archives. The source problem results from earlier historians' failure to account for minorities. The proposed approach can adequately recognize the significant contributions of individuals of various ethnicities throughout the history of technology. However, its effectiveness may be reduced when historians overreach in their research or misinterpret the available evidence.

    The secrecy surrounding algorithmic computations is a significant impediment to addressing race within them. Algorithmic computations are kept highly confidential, as are the data sets used to generate their conclusions. The public's lack of access to this information may harm society's most vulnerable members.