Remarkable Accepted to The Forum’s Early AI Adoption Lab as a Decolonial Technologist
As the leader of a tech company, it’s my responsibility to stay on top of technology and provide clients with cutting-edge solutions – while staying grounded in our decolonial values.
So naturally, I’m thrilled to share that I’ve been accepted into The Forum’s Early AI Adoption Lab, a program designed to help businesses like mine learn how to responsibly navigate AI use.
Over six weeks, I will join a group of female and non-binary founders to explore how AI can be integrated into our day-to-day operations. Women are still vastly underrepresented in tech, and AI is no exception. It’s exciting to be part of a peer group that’s committed to learning and growing together in this space.
AI’s role in the tech world is growing rapidly. In many ways, it’s opening doors. It’s pushing innovation faster than we’ve ever seen before. As someone whose job revolves around innovation, I feel a need to look into AI and learn whether – and how – it can benefit our clients. I also have to be mindful of AI’s potential to cause harm.
AI can be a game changer, but it’s not without flaws. The information it provides isn’t always accurate. It can often provide biased information, because it’s been trained on historical examples that reflect past prejudice or implicit bias – in other words, it’s trained on material that includes harmful disinformation, stereotypes, and oversimplification.
Decolonial AI attempts to address this issue. For those of us working with Indigenous clients, it’s a major concern – and a reminder to never take AI at face value. This 2024 article about how Indigenous data stewardship stands against extractivist AI discusses these concerns in detail.
A core thing AI is missing is humanity. When working with Indigenous clients especially, we need to be culturally aware and willing to learn constantly. Those are inherently human behaviours. I know that AI can never replace the work I do, but it may become a useful tool.
Opinions on AI are as diverse as the technology itself. There’s opportunity for Indigenous communities to play a role in how AI is shaped. Here’s a story from 2023 about how Indigenous people can help shape AI systems to be respectful of Indigenous knowledge and values. At the same time, there’s a risk of irresponsible AI use causing harm. I look forward to The Forum’s Early AI Adoption Lab, with hope that it’ll help me navigate the complex world of AI. There’s a lot I don’t know yet – but what I do know is that I’m committed to approaching this technology with a decolonial and critical lens.
If you have thoughts on decolonial AI practices or ways to use this technology responsibly, I’d love to hear from you. You can find ways to reach me online at laurelannestark.com. I’ll also be in Vancouver in November to present at the Our Children Our Way conference – here’s more info on that.
Yours in solidarity,
Laurel Anne Stark