Social exclusion as a side effect of machine learning mechanisms
https://doi.org/10.26425/2658-347X-2022-5-4-23-30
Abstract
The development of neural network technologies leads to their integration into decision-making processes within such important social institutions as healthcare, education and employment. This raises the question of the correctness of decisions made by artificial intelligence and of their consequences. The aim of this work is to examine how social exclusion, inequality and discrimination in society originate and are replicated as a result of neurotraining, understood here as the principles by which any neural network is trained. Social exclusion, and the resulting discrimination in decisions made by artificial intelligence, is treated as a consequence of the principles of big data processing. The authors review the work of foreign and Russian researchers on the role of artificial intelligence in reinforcing the existing social order, as well as on the problems of processing and interpreting the data on which computer systems are trained. Real-world cases are given in which the specifics of the data and of its processing have led to increased inequality and exclusion. A conclusion about the sources of social exclusion and stigmatization in society is drawn from the similarity between the functioning of natural and artificial neural networks. The authors suggest that it is the principles of neurotraining in a “natural” society that not only lead to discrimination at the macro level, but also provoke strong negative reactions towards members of excluded groups, such as interethnic hatred, homophobia and sexism. The question is raised of whether “natural” society can be studied by comparison with an “artificial” one.
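To make the mechanism described in the abstract concrete, the following minimal sketch (our own illustration, not taken from the article; the data, the “skill” and “proxy” features and the group split are all hypothetical) shows how a classifier trained on historically biased labels reproduces group-level exclusion even when the protected attribute itself is never given to the model.

```python
# Minimal sketch (hypothetical data): label bias in training data is
# reproduced by the trained model's decisions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# "group" is a protected attribute (0 = majority, 1 = excluded minority);
# "skill" is the trait that should determine the outcome, equal across groups.
group = rng.integers(0, 2, size=n)
skill = rng.normal(0, 1, size=n)

# Historical labels: the same skill threshold, but the minority group was
# systematically under-approved in the past (label bias).
past_approved = (skill + rng.normal(0, 0.5, size=n) - 0.8 * group) > 0

# The model never sees "group" directly, only a proxy feature correlated
# with it -- one common way exclusion leaks into training data.
proxy = group + rng.normal(0, 0.3, size=n)
X = np.column_stack([skill, proxy])

model = LogisticRegression().fit(X, past_approved)
pred = model.predict(X)

for g in (0, 1):
    rate = pred[group == g].mean()
    print(f"group {g}: predicted approval rate = {rate:.2f}")
# Although skill is distributed identically in both groups, the learned model
# approves the minority group less often: the historical exclusion is
# replicated rather than removed.
```

This sketch only illustrates the general principle the abstract refers to; the article itself discusses the phenomenon sociologically rather than computationally.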
Keywords
About the Authors
A. G. Tertyshnikova
Russian Federation
Anastasiya G. Tertyshnikova, Cand. Sci. (Sociol.), Senior Lecturer at the Sociology Department
Moscow
U. O. Pavlova
Russian Federation
Ul’yana O. Pavlova, Student
Moscow
M. V. Cimbal
Russian Federation
Mariya V. Cimbal, Student
Moscow
References
1. Cruz T.M. (2020), “Perils of data-driven equity: safety-net care and big data’s elusive grasp on health inequality”, Big Data & Society, vol. 7, no. 1, https://doi.org/10.1177/2053951720928097
2. Dobrinskaya D.E., Martynenko T.S. (2020), “Is digital equality possible? (on the book “The Digital Divide” by J. van Dijk)”, Sociological research, no. 10, pp. 158–164, https://doi.org/10.31857/S013216250009459-7
3. Goffman E. (1963), Stigma: Notes on the Management of Spoiled Identity, trans. from Eng. Dobryakova M.S., Prentice Hall, New York, US (in Russian).
4. Ivanova N.A. (2011), “The concepts of ‘Habitus’ and ‘Habitualization’ in the context of sociological theories”, Tomsk State University Journal of Philosophy, Sociology and Political Science, no. 1 (13), pp. 115–129.
5. McMillan Cottom T. (2020), “Where platform capitalism and racial capitalism meet: The sociology of race and racism in the digital society”, Sociology of Race and Ethnicity, vol. 6, no. 4, pp. 441–449, https://doi.org/10.1177/2332649220949473
6. Noble S.U. (2018), “Algorithms of oppression: How search engines reinforce racism”, Ethnic and Racial Studies, vol. 43, no. 3, pp. 592–594, https://doi.org/10.1080/01419870.2019.1635260
7. Rodgers G., Gore Ch., Figueiredo J. (1994), Social Exclusion: Rhetoric, Reality, Responses, International Institute for Labour Studies, United Nations Development Programme, Geneva, Switzerland.
8. Vasenkov D.V. (2007), “Methods of training artificial neural networks”, Computer Tools in Education, no. 1, pp. 20–29.
9. Wajcman J. (2017), “Automation: is it really different this time?”, The British journal of sociology, vol. 68, no. 1, pp. 119–127, https://doi.org/10.1111/1468-4446.12239
10. Yarskaya-Smirnova E.R. (1997), Sociocultural Analysis of Atypicality, Saratov State Technical University Publishing House, Saratov, Russia (in Russian).
For citations:
Tertyshnikova A.G., Pavlova U.O., Cimbal M.V. Social exclusion as a side effect of machine learning mechanisms. Digital Sociology. 2022;5(4):23-30. (In Russ.) https://doi.org/10.26425/2658-347X-2022-5-4-23-30