Limitations of Predictive Analytics (2)



Bias in Data and Algorithms

Predictive analytics models can display bias, favouring certain outcomes or certain groups in the results they produce. When a forecasting model generates biased output, the cause usually lies in the data used for training: it represents past events unevenly or encodes existing stereotypes.

For example, a hiring algorithm trained on data that favours male candidates will produce selections that discriminate against female candidates, even when their skills are comparable, as the sketch below illustrates. Human bias enters the data through persistent behaviour patterns and the continuation of past practices, which also leave data collection incomplete. Both the bias itself and the partial representation of earlier periods remain embedded in the data repository.
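The following is a minimal sketch of this effect using synthetic data; the column names, numbers, and the use of scikit-learn are illustrative assumptions, not a description of any real hiring system. Skill is distributed identically for both groups, yet a model trained on historically biased decisions reproduces the gap.

```python
# Minimal sketch: a model trained on historically biased hiring decisions
# reproduces that bias even though skill is distributed identically.
# Synthetic, hypothetical data; values are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

is_male = rng.integers(0, 2, n)           # 1 = male, 0 = female
skill = rng.normal(50, 10, n)             # same skill distribution for both groups

# Historical labels: past recruiters favoured male candidates,
# so the recorded "hired" outcome depends on gender, not only skill.
hired = ((skill + 10 * is_male + rng.normal(0, 5, n)) > 57).astype(int)

# Train on the biased history, with gender included as a feature.
X = np.column_stack([skill, is_male])
model = LogisticRegression().fit(X, hired)

# Score new candidates with identical skill but different gender.
new_skill = np.full(1000, 55.0)
male_rate = model.predict(np.column_stack([new_skill, np.ones(1000)])).mean()
female_rate = model.predict(np.column_stack([new_skill, np.zeros(1000)])).mean()
print(f"selection rate, male candidates:   {male_rate:.2f}")
print(f"selection rate, female candidates: {female_rate:.2f}")
# The gap comes entirely from the biased training labels.
```

The point of the sketch is that the model does nothing "wrong" in a statistical sense; it simply learns the discriminatory pattern present in the historical labels and carries it forward.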

Model accuracy also drops significantly when researchers target urban settings exclusively and leave rural areas out of their data collection; a sketch of this sampling effect follows below. Biased decision-making in hiring, the treatment of the accused, and financial matters systematically excludes some citizens while favouring others.
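A minimal sketch of that sampling bias, again with synthetic and purely hypothetical data: a model fitted only on "urban" records performs poorly on "rural" records whose underlying relationship differs.

```python
# Minimal sketch of sampling bias: a model fitted only on urban records
# loses accuracy on rural records it never saw during training.
# Synthetic, hypothetical data; the coefficients are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Urban sample: price driven mainly by floor area.
urban_area = rng.uniform(40, 120, 2000)
urban_price = 3.0 * urban_area + rng.normal(0, 10, 2000)

# Rural sample: a different price structure, excluded from training.
rural_area = rng.uniform(40, 120, 500)
rural_price = 1.5 * rural_area + 60 + rng.normal(0, 10, 500)

model = LinearRegression().fit(urban_area.reshape(-1, 1), urban_price)

# R^2 near 1 on the data the model was trained on; far lower (here negative,
# i.e. worse than predicting the mean) on the excluded rural population.
print("R^2 on urban data:", round(model.score(urban_area.reshape(-1, 1), urban_price), 2))
print("R^2 on rural data:", round(model.score(rural_area.reshape(-1, 1), rural_price), 2))
```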

