by Mara Tanelli et al.

How gender biases and stereotypes manifest in corporate language and how technology can support the analysis and mitigation of their effects

Recent technological advances make it possible to collect an extraordinary amount of data on how individuals behave, react, express themselves, move, and interact with each other. These data can be stored and processed with ‘intelligent’ algorithms to extract knowledge and hidden information and, often, to achieve a deeper understanding of the context. This technology-driven perspective is of great interest in the social sciences, as technology supports the analysis of large data sets and the development of tools that can bring the results to a broad audience.

In this article, we discuss an example of such cross-feeding between economic science, linguistics and machine learning to analyse the impact of gender-biased language in corporate documents, which has been the object of a joint research project between the Politecnico di Milano and NTT-Data, with the linguistic support of Prof. Cristina Mariotti of the Università degli Studi di Pavia.

To understand how language affects the perpetuation and spread of gender biases and stereotypes, it is worth noting that the phenomenon goes well beyond the corporate world. For example, Tenenbaum and colleagues [1] have shown that scientific discourse between parents and children exhibits inherent gender-related differences from early childhood, with repercussions on the children’s later educational choices. In particular, they found that parents were more likely to believe that science was less interesting and more difficult for daughters than for sons, and that fathers tended to use more cognitively demanding speech with sons than with daughters when performing science-related tasks.

All the literature investigating the dichotomy between the so-called ‘agentic’ and ‘communal’ traits, a hallmark of gender bias in behavioural psychology, stems from these early signs (see Abele [2]). These studies conceptualise both the gender stereotype and the gender self-concept in terms of the distinction between more ‘masculine’ and more ‘feminine’ traits: the former are agentic-instrumental traits (e.g., active, decisive), the latter communal-expressive ones (e.g., caring, emotional).

These language traits, and their attribution to women and men, can also have an impact on individuals’ careers. Madera and colleagues [3] show that differences in the agentic and communal characteristics used in letters of recommendation to describe male and female candidates for academic positions can influence selection decisions. These results are particularly important, as letters of recommendation continue to be heavily weighted and commonly used as selection tools. In line with this, a stream of research analysing job postings shows that their language has a strong impact on the hiring process at all levels [4].

A quantitative assessment of (conscious and unconscious) gender biases in documents and communications can be of great help in raising awareness of these important issues. To build a language model that can spot the absence of gender neutrality and/or highlight discrimination, two major approaches can be used: a semantic one and a pragmatic one. The former is based on: a) looking for target words and exploiting the recognition of agentic-instrumental vs communal-expressive traits; and b) looking for keywords (e.g., man/woman, male/female), both in isolation and in compound phrases, to detect the presence of non-neutral expressions (e.g., gentlemen’s agreement). The second and more complex approach involves reasoning in a pragmatic way, i.e., exploiting reference and co-reference associations [6]. These may unveil unneeded overextensions of the masculine forms to generic referents [5].
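As a minimal illustration, the semantic approach can be sketched as a dictionary-based scan of a document. The trait lexicons and flagged phrases below are hypothetical placeholders for illustration only, not the linguistically validated word lists a real system (or the project described here) would use:

```python
import re
from collections import Counter

# Hypothetical trait lexicons and non-neutral phrases, for illustration only;
# a real system would rely on linguistically validated word lists.
AGENTIC = {"active", "decisive", "assertive", "ambitious", "competitive"}
COMMUNAL = {"caring", "emotional", "supportive", "nurturing", "cooperative"}
NON_NEUTRAL_PHRASES = {"gentlemen's agreement", "chairman", "manpower"}

def scan_document(text):
    """Count agentic vs. communal trait words and flag non-neutral phrases."""
    lower = text.lower()
    tokens = Counter(re.findall(r"[a-z']+", lower))
    return {
        "agentic": sum(tokens[w] for w in AGENTIC),
        "communal": sum(tokens[w] for w in COMMUNAL),
        "flagged": sorted(p for p in NON_NEUTRAL_PHRASES if p in lower),
    }

report = scan_document(
    "We seek a decisive, competitive leader; a gentlemen's agreement "
    "sealed the deal. She is caring and supportive."
)
```

On the sample sentence above, the scan counts two agentic terms, two communal terms, and flags the non-neutral phrase. The pragmatic approach, by contrast, would require a co-reference resolution layer on top of such keyword matching.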

Thanks to data analytics tools, one can in principle automate these linguistic analyses to assess the presence of gender stereotypes through text mining [5]. Once the desired features are extracted from the data, and given deep knowledge of the context, one can develop specific scoring methods that succinctly evaluate the company’s attitude towards gender equality and inclusion as a whole.
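To make the idea of a scoring method concrete, here is a toy sketch that turns extracted counts into a single neutrality score. The formula and weights are invented for illustration; they are not the KPIs of the tool described below:

```python
def neutrality_score(agentic, communal, flagged_phrases):
    """Toy gender-neutrality score in [0, 1] (1 = balanced and clean).

    Combines the imbalance between agentic and communal trait mentions
    with a penalty for each flagged non-neutral phrase. The weights are
    illustrative only.
    """
    total = agentic + communal
    balance = (1.0 - abs(agentic - communal) / total) if total else 1.0
    penalty = min(1.0, 0.2 * len(flagged_phrases))
    return round(balance * (1.0 - penalty), 3)

score = neutrality_score(agentic=8, communal=2, flagged_phrases=["chairman"])
```

A document with balanced trait vocabulary and no flagged phrases scores 1.0, whereas the example above, heavily skewed towards agentic terms and containing one flagged phrase, scores 0.32. In practice, such scores only become meaningful when interpreted against the company’s context.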

A first attempt to build such a system has given rise to the GeNTLE (Gender Neutrality Tool for Language Exploration) project, in which a prototype tool was developed. This tool takes corporate documents as input and, through an inner layer of linguistics-informed automatic text analysis, outputs a specific scoring for significant gender-related KPIs, while creating the opportunity to ‘deep dive’ into the original documents to interpret and understand the aspects highlighted by the software analysis.

We believe that this automatic analysis tool can help companies become more aware of their true corporate culture with regard to gender equality, as reflected in their written production, which is a powerful instrument in shaping the firm’s identity on these matters.

The tool, while able to provide useful recommendations on how to manage any gender-related communication bias hidden in company documents, is intended neither to pass direct judgement on the resulting scores nor to enact the needed mitigation actions automatically. Those activities must be performed by dedicated specialists working with the firm’s executives. This creates a virtuous synergy between technology-enabled analysis and the human-centric stage of discussion, analysis and reasoning, which is indispensable for initiating the cultural process that will eradicate the deepest roots of gender bias and stereotypes.

REFERENCES

[1] Tenenbaum, H. R., et al. (2003) “Parent–Child Conversations About Science: The Socialization of Gender Inequities?”, Developmental Psychology.

[2] Abele, A. E. (2003) “The Dynamics of Masculine-Agentic and Feminine-Communal Traits: Findings from a Prospective Study”, Journal of Personality and Social Psychology.

[3] Madera, J. M., et al. (2009) “Gender and Letters of Recommendation for Academia: Agentic and Communal Differences”, Journal of Applied Psychology.

[4] Gaucher, D., Friesen, J., and Kay, A. C. (2011) “Evidence That Gendered Wording in Job Advertisements Exists and Sustains Gender Inequality”, Journal of Personality and Social Psychology.

[5] Ndobo, A. (2013) “Discourse and attitudes on occupational aspirations and the issue of gender equality: What are the effects of perceived gender asymmetry and prescribed gender role?”, European Review of Applied Psychology.

[6] Soon, W. M., et al. (2001) “A Machine Learning Approach to Coreference Resolution of Noun Phrases”, Computational Linguistics (MIT Press).

AUTHOR BOX DATA (NAME/SURNAME/YEAR OF BIRTH/DEGREE/JOB TITLE)

Mara Tanelli, 1978, Ph.D. in Information Technology, Professor of Automatic Control at Politecnico di Milano

Cristina Rossi-Lamastra, 1973, Ph.D. in Economics and Management of Innovation, Professor of Business and Industrial Economics at Politecnico di Milano

Cristina Mariotti, 1971, Ph.D. in Linguistics, Associate Professor of Linguistics at Università degli Studi di Pavia

Raffaele Mancuso, 1991, M.Sc. in Management Engineering, Ph.D. student at Politecnico di Milano

Alessandro Santi, 1964, M.Sc. in Electronics Engineering, Director of Consulting at NTT-Data Italia
