How People Influence Large Language Models: AI and Gender Bias

A test to see if ChatGPT would generate a woman when prompted to generate an image of a nurse. Image credit: Kirsten Davis

BY KIRSTEN DAVIS

The New York City Council is considering ways to eliminate gender bias in artificial intelligence technology used by city agencies. On December 8, 2025, Council Member Farah N. Louis introduced a bill that would require the Department of Information Technology and Telecommunications (DOITT) to conduct a gendered impact assessment of the AI algorithms city agencies use every two years. Under the bill, DOITT would have to evaluate those systems and ensure that any AI technology the city uses does not contribute to gender disparities in the workforce.

For many AI experts, the bill addresses a problem they have been researching for years, ever since AI tools became available to the public: AI can carry bias, and left unchecked, that bias can put women at a disadvantage in the workplace. Artificial intelligence was developed and introduced to the public as problem-solving software designed to make everyday life easier. While some argue it has delivered on that promise, critics contend that AI is also reinforcing gender bias.

The bias ranges from casting women in stereotypical roles to ranking their resumes below men's. AI learns from what it is taught and from what is documented on the internet; the world itself holds gender bias, and artificial intelligence picks it up.

Much of the gender bias women face from AI technology comes from hiring algorithms. These AI-powered tools filter through resumes during the hiring process and select the candidates they deem best suited for the job. On a smaller scale, when prompted to generate images, AI tools sort men and women into stereotyped occupations, such as nurses and construction workers.

“Most of the leaders in leadership positions are men, right? So, you just don’t have enough examples of women who’ve succeeded,” said Swathi Dhamodaran, a founding member of The Neural.AI, a network of engineering and technology professionals that examines the possibilities of AI. “And therefore,” Dhamodaran said, “what is AI based on? Training data, the bulk of which is successful, but men in this instance. That’s why hiring algorithms have gender biases.” 

In 2018, Amazon pulled an AI-powered recruitment tool after determining that it favored men's resumes over women's. The tool had learned to pick out terms that indicated an applicant was a woman and even pushed graduates of two all-women's colleges to the bottom of the list. Facial recognition systems likewise struggle to identify women, especially women of color. False matches can be dangerous, leading to wrongful arrests, placement on watch lists, and unwarranted police encounters, among other harms.
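To see how that can happen mechanically, consider a deliberately simplified sketch in Python using scikit-learn. The resumes, labels, and weights below are invented for illustration; this is not Amazon's actual system. The point is only that a classifier trained on historically skewed hiring decisions learns a negative weight for a word like "women's," even though the word says nothing about qualifications.

```python
# Toy illustration (not Amazon's actual system): a resume classifier trained on
# historically skewed hiring data learns to penalize words correlated with women.
# Every resume snippet and label here is invented for demonstration purposes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training set: past hires (label 1) skew male, so terms like
# "women's" appear mostly in rejected resumes (label 0).
resumes = [
    "captain of men's rugby team, software engineer, python",      # hired
    "software engineer, python, led men's coding club",            # hired
    "software engineer, java, hackathon winner",                   # hired
    "software engineer, python, captain of women's chess club",    # rejected
    "graduate of women's college, software engineer, java",        # rejected
    "software engineer, python, women's tech society organizer",   # rejected
]
labels = [1, 1, 1, 0, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, labels)

# Inspect the learned weight for the token "women": it comes out negative,
# meaning the model downgrades any resume containing it, even though the
# word says nothing about job qualifications.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print("weight for 'women':", round(weights["women"], 3))
```

Real recruitment systems are far larger, but the failure mode is the same: the model has no notion of fairness, only of patterns in the historical data it was given.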

Sarah Wyer is a PhD researcher who specializes in bias in AI and has been recognized with an award as a top woman in tech. Wyer first noticed gender bias within Google and, as an experiment, began feeding gendered prompts to the GPT-2 model she was working with to see how it would respond.

“Some of the outputs that we were getting were terrifying,” Wyer said. “It was elevating the status of men and sexualizing the status of women. Then I did it with different versions of GPT-3, which is where ChatGPT came from. It was all very negative and very derogatory against women.” 

Wyer’s research found that models like ChatGPT are easily influenced: when gender biases appear in the data they are trained on, the models absorb those beliefs. “The data choices that we make when we’re creating these large language models (LLMs) are important because our values can get embedded within there,” she said.
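A small, self-contained sketch in Python shows one way those values get "embedded." The toy corpus and the simple co-occurrence counting below are stand-ins for real LLM training data and methods, not Wyer's experiment: if the training text mostly writes about nurses as women, any representation built from word co-occurrence will place "nurse" closer to "she" than to "he."

```python
# Minimal sketch of how associations in training text become embedded in a model.
# The corpus below is a tiny, invented stand-in for web-scale training data in
# which nurses are usually written about as women and engineers as men.
import numpy as np
from itertools import combinations

corpus = [
    "she is a nurse", "she works as a nurse", "he visited the nurse",
    "he is an engineer", "he works as an engineer", "she met the engineer",
    "she is a nurse", "he is an engineer",
]

# Build co-occurrence vectors: each word is represented by how often it
# appears in the same sentence as every other word.
vocab = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(vocab)}
vectors = np.zeros((len(vocab), len(vocab)))
for line in corpus:
    for a, b in combinations(line.split(), 2):
        vectors[index[a], index[b]] += 1
        vectors[index[b], index[a]] += 1

def similarity(w1, w2):
    """Cosine similarity between two word vectors."""
    v1, v2 = vectors[index[w1]], vectors[index[w2]]
    return v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)

# The skewed corpus makes "nurse" measurably closer to "she" than to "he".
print("nurse ~ she:", round(similarity("nurse", "she"), 2))
print("nurse ~ he: ", round(similarity("nurse", "he"), 2))
```

Modern LLMs learn far richer representations than these raw counts, but the lesson Wyer points to is the same: whatever associations dominate the training data end up inside the model.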

Algorithms can be retrained to break this habit, but doing so is difficult when the tools keep reinforcing biases they have already learned. The NYC Council bill, if enacted, would be a step toward keeping these tools from steering users toward gender inequity.

“There is so much we as women can do,” Dhamodaran said. “Focus on the impact that can be made rather than the barrier that’s getting in the way of you being the only woman in the boardroom.”