No one will have missed the flood of fierce, impassioned accusations against Amazon and its recruitment algorithm, deemed sexist toward women. Amazon recently developed and tested an algorithm designed to select, from among candidates' CVs, the top profiles to recruit into the company. Built on ten years of predominantly male hiring, the algorithm developed a bias that led it to underestimate the value of any female profile subsequently put before it. This algorithmic gender bias ultimately forced Amazon to halt its tests. The large and growing number of articles, some of them inflammatory, devoted to this story contrasts with the impoverishment of the debate on the use of decision-making algorithms. This intolerant dynamic toward the data giant risks pushing it to cover up its mistakes and play the "not seen, not caught" card. To fight the risk of algorithmic totalitarianism, we must at all costs defend Amazon and its recruitment algorithm!
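To make that mechanism concrete, here is a minimal, purely illustrative sketch of how a model trained on historically skewed hiring decisions ends up reproducing that skew. It is not Amazon's system: the data, keywords and scoring rule below are invented for illustration only.

```python
# Toy sketch (hypothetical data, not Amazon's system): a keyword scorer
# "trained" on invented, historically skewed hiring decisions.
from collections import Counter

# Invented past decisions: each CV is a set of keywords plus a hired flag.
# The history is skewed: CVs containing the gender-correlated keyword
# "womens_team_captain" were rarely hired, regardless of qualifications.
history = [
    ({"python", "ml", "mens_team_captain"}, True),
    ({"python", "statistics", "mens_team_captain"}, True),
    ({"java", "ml"}, True),
    ({"python", "ml", "womens_team_captain"}, False),
    ({"python", "statistics", "womens_team_captain"}, False),
    ({"java", "statistics"}, False),
]

# "Training": estimate, for each keyword, the past hire rate of CVs containing it.
hired_counts, total_counts = Counter(), Counter()
for keywords, hired in history:
    for kw in keywords:
        total_counts[kw] += 1
        hired_counts[kw] += hired

def score(keywords):
    """Average historical hire rate of the CV's known keywords."""
    rates = [hired_counts[kw] / total_counts[kw] for kw in keywords if kw in total_counts]
    return sum(rates) / len(rates) if rates else 0.0

# Two CVs with identical skills, differing only in a gender-correlated keyword:
print(score({"python", "ml", "mens_team_captain"}))    # higher score
print(score({"python", "ml", "womens_team_captain"}))  # lower score
```

The two CVs receive different scores solely because one contains a keyword that, in the invented history, correlates with gender and with past rejections: the bias lives in the training data, not in any explicitly sexist rule.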
Yes, Amazon made a mistake with serious consequences. But no, we should not blindly throw stones at Amazon, and even less at the algorithm's designers. Algorithmic biases exist and hide in lines of code, in criteria, in algorithms, or in the choice of learning strategies, and this in all software! Yet we only become aware of them once they are exposed in broad daylight, and that is where the shoe pinches. Amazon, like all the GAFAM, is its own Achilles' heel, with its high visibility and a reputational risk that leaves it vulnerable in case of error. That said, Amazon is entirely responsible for its actions, but healthy transparency about its activities cannot emerge in a climate that is revolutionary and dangerous on the part of citizens who remain novices in these matters.
Taking the passion out of the debate
We risk entering a "not seen, not caught" society, which could encourage the data giants, taking advantage of the immateriality of algorithms, to sweep the body under the carpet more easily. But it also risks disempowering individuals, the users of these tools, who, for lack of knowledge and technological culture, will not seek to understand the origins of possible errors or the ways to avoid them. Indeed, the vast majority of existing regulations rest on the obligations of the tech giants but never mention the duties of the users of these digital tools.
Make no mistake: without playing the victim on Amazon's behalf, it is essential to defend the freedom to think, to act, and to be wrong, so as to allow society to move forward by capitalizing on the mistakes of the past. Let us take the passion out of this debate; let us make sure that no carpet ever covers a body, and that legal and natural persons take their responsibilities through an understandable, two-way reading of technological issues and mechanisms. Let us make sure that these people can freely share their mistakes for the benefit of all the players on the technological and social chessboard.
* Aurélie Jean holds a doctorate in science and is a digital entrepreneur. This French specialist in algorithms spent five years conducting research at the Massachusetts Institute of Technology.