Transmedia artist employs AI to challenge its inherent racial, gender and age biases

Amidst the whirlwind of technological advancements, there is an artist whose canvas extends far beyond the conventional brush and paint. Transmedia artist Stephanie Dinkins is leveraging her art to imbue artificial intelligence (AI) with empathy and care for communities of color.

Her approach to AI is not simply to question the biases ingrained within these systems but to reprogram them by nurturing them and injecting generosity. Dinkins sees this as essential to transforming AI into a learning tool that unlocks new possibilities for effective change in society.

“When we treat people differently, medically or legally, depending on the color of their skin, it’s a problem,” Dinkins said. “And I think that problem gets deeper as we look at artificial intelligence and embed in our code the way that we see and treat each other.”

The concept behind one of her first projects, “Not the Only One,” was to share an oral memoir of three generations of women in her family through a trained chatbot that would generate logical responses. Dinkins quickly found that it did not always do so.

“The chatbot sometimes answered logically, other times not logically at all. Sometimes it was poetic, but often it responded with the ethos of my family,” Dinkins said. “I see ‘Not the Only One’ as proof that building these kinds of particularities into a system is possible.”

Most of her projects are ongoing experiments, as Dinkins continues to push the edges of AI, exploring how a system generates narratives when it is trained on the grounding morals and cultures of marginalized communities. The work offers a guide to understanding the values embedded in AI and its impact on our society.
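To make the idea concrete, here is a minimal sketch, not Dinkins’s actual system, of a toy chatbot that answers by returning the closest excerpt from a small set of hypothetical family-interview transcripts. The excerpts, scoring, and function names are illustrative assumptions only.

```python
# A minimal sketch, not Dinkins's actual system: a toy retrieval chatbot that
# answers by returning the closest excerpt from a small set of hypothetical
# oral-history transcripts. The excerpts and scoring are illustrative
# assumptions, not the data behind "Not the Only One."
import re

# Hypothetical excerpts standing in for family interview transcripts.
MEMOIR_EXCERPTS = [
    "My grandmother always said hard times teach you who your people are.",
    "We kept a garden behind the house; Sundays were for family and greens.",
    "My mother believed education was the one thing nobody could take away.",
]

def words(text: str) -> set:
    """Lowercase a string and return its set of words."""
    return set(re.findall(r"[a-z']+", text.lower()))

def respond(question: str) -> str:
    """Return the excerpt sharing the most words with the question."""
    return max(MEMOIR_EXCERPTS, key=lambda e: len(words(question) & words(e)))

if __name__ == "__main__":
    # With little or no overlap, the reply can read as a non sequitur or as
    # oddly "poetic," echoing how such systems answer logically one moment
    # and not at all logically the next.
    print(respond("What did your mother believe about education?"))
```

Even this toy version suggests why such a system can speak in the voice of its source material one moment and wander off topic the next: it can only ever return what its small archive contains.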

Dinkins views the development of AI as a give-and-take: developers provide the technology, and we as a society have an obligation to push for the issues within these systems to be addressed.

“I hope that people start to see that AI isn’t as scary as we’ve been told it is and we rarely tell the counter story of the benefits of AI,” Dinkins said. “As we see this give-and-take and that we can work with the system, we’ll get better results.”

It’s important to note that while AI itself cannot form a bias, a discernible bias can materialize through the data sets a model is trained on and the people who develop it. A significant contributor to these ingrained biases in AI models is the lack of diversity in the technology industry, which is made up predominantly of white, cisgender, heterosexual (cishet) men. In the development and training of AI, the perspectives and experiences of those outside this demographic are inadequately considered.
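One way that skew becomes visible before any training happens is simply to count how different groups are represented in the data. The sketch below uses made-up records and group labels purely for illustration; it is not drawn from any real training corpus.

```python
# A minimal sketch of auditing representation in training data before any
# model is built. The records and group labels below are made up for
# illustration; they are not drawn from any real corpus.
from collections import Counter

# Hypothetical labeled training records.
records = [
    {"text": "sample A", "group": "white cishet men"},
    {"text": "sample B", "group": "white cishet men"},
    {"text": "sample C", "group": "white cishet men"},
    {"text": "sample D", "group": "Black women"},
    {"text": "sample E", "group": "Latina women"},
]

counts = Counter(r["group"] for r in records)
total = len(records)

# A model trained on data like this sees some communities far more often than
# others, which is one route by which a "discernible bias" materializes.
for group, n in counts.most_common():
    print(f"{group}: {n} of {total} records ({n / total:.0%})")
```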

“What I think about now is what it means to sanitize history in the name of bias in a system. If we are trying to take bias out with a sledgehammer, we are doing ourselves a disservice,” Dinkins said. “I think we have to be more surgical about how we clean up and remove biases within the system so that the history is not lost. Even if those histories are horrifying, they are important for us to remember and important stories for the system to have.”

Dinkins believes that there needs to be real representation in AI to cover “our blind spots,” and that doesn’t mean only one person of color but rather a truly diverse array of perspectives. Whether people are trying to influence and educate the system at a basic level, such as prompting a generative AI model, or at a highly advanced level, like coding, Dinkins believes everyone should at least be trying to engage with this technology.

Time magazine has named Dinkins one of the 100 Most Influential People in AI.

“As a social practice artist, I can open conversations about the technological world, have the latitude to think forward these ideas, and ultimately engage with the world around me,” Dinkins said.

Jenna Zaza is a news intern at WSHU for the fall of 2023.