🤖 AI Summary
This study addresses the risks of cultural misrepresentation and bias posed by generative AI in low-resource and Indigenous educational contexts. Through four rounds of participatory co-design workshops with 22 educators from Hawaiian public schools, the research reconceptualizes AI auditing as a community-centered, collaborative process grounded in Hawaiian cultural values and knowledge systems. The project proposes design principles for auditing tools that emphasize genealogical tracing of knowledge, cultural sensitivity, and the provenance of Indigenous knowledge, challenging conventional individualistic auditing paradigms. By centering local epistemologies and relational accountability, this work offers a pathway for AI governance in multicultural settings, particularly those involving historically marginalized communities.
📝 Abstract
Although generative AI is being deployed into classrooms with promises of aiding teachers, educators caution that these tools can have unintended pedagogical repercussions, including cultural misrepresentation and bias. These concerns are heightened in low-resource language and Indigenous education settings, where AI systems frequently underperform. We investigate these challenges in Hawaiʻi, where public schools operate under a statewide mandate to integrate Hawaiian language and culture into education. Through four co-design workshops with 22 public school educators, we surfaced concerns about using generative AI in educational settings, particularly around cultural misrepresentation, and corresponding designs for auditing tools that address these issues. We find that educators envision tools grounded in specific Hawaiian cultural values and practices, such as tracing the genealogy of knowledge in source materials. Building on these insights, we conceptualize AI auditing as a community-oriented process rather than the work of isolated individuals, and discuss implications for designing auditing tools.