Exploring AI in Education

*Photo Description: A bold graphic of a glowing head featuring the words “AI in Education” and “Learning Experience,” highlighting the impact of artificial intelligence on modern learning.*

As part of our coursework on understanding artificial intelligence and its role in modern learning environments, I recently completed an exercise from the YouTube lecture titled “Advantages and Disadvantages of AI in Education” by T. Hailey (2025). This insightful presentation was paired with readings such as “Does AI Have a Bias Problem?” by A. Greene-Santos (2024) and “Explained: Generative AI’s Environmental Impact” by A. Zewe (2025). The exercise I focused on required us to reflect on AI bias and its implications in educational settings.

The task involved critically analyzing how AI tools (like ChatGPT) might perpetuate or combat systemic inequities in the classroom. We were encouraged to explore examples of algorithmic bias, consider who designs AI tools, and think about how students from diverse backgrounds might experience AI differently.

While completing this reflection, I became more aware of how embedded bias can be in the data AI uses. For instance, Greene-Santos (2024) discusses how AI systems often inherit prejudices from their training data, which can lead to skewed outputs, particularly in tools used for grading, predictive analytics, or content generation. This becomes an issue when educators or institutions rely too heavily on AI without questioning its sources or limitations.

In reflecting on this, I realized how crucial it is for educators to remain actively engaged in any AI-integrated learning process. Human oversight, cultural awareness, and continuous evaluation of AI outputs are essential to ensuring fairness. While AI offers exciting possibilities for individualized learning, accessibility, and efficiency (Hailey, 2025), these benefits mean little if the systems reinforce the very disparities they aim to reduce.

Engaging with this exercise helped me think beyond the convenience of AI and start examining its real-world implications. I’ve used AI tools in my academic and professional work, but this experience prompted me to think more critically about the ethics behind them. I now see AI not as a neutral tool, but one that mirrors the values and flaws of its creators and training data.

As a librarian, I feel a renewed responsibility to not only use AI thoughtfully but to teach students how to engage with it critically. Asking questions like “Whose voices are missing in this dataset?” or “Who benefits from this tool?” should be common practice in classrooms that adopt AI.

This exercise deepened my understanding of both the promise and the pitfalls of AI in education. It’s not enough to adopt new technologies; we must also interrogate them. As we move forward, equity must remain at the center of any discussion about digital learning tools. AI can be a powerful force for personalized education, but only if it is built and used with fairness and transparency in mind.


References
Greene-Santos, A. (2024, February 22). Does AI have a bias problem? NEA. https://www.nea.org/nea-today/all-news-articles/does-ai-have-bias-problem

Hailey, T. (2025, June 11). Advantages and disadvantages of AI in education. Schools That Lead. https://www.schoolsthatlead.org/blog/ai-in-education-pros-cons

ReadFast. (2024, February). AI in education learning experience [Digital graphic]. https://readfast.in/storage/2024/02/do-whatever-6.jpg

Zewe, A. (2025, January 17). Explained: Generative AI’s environmental impact. MIT News. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117

Comments

  1. Diamond, I love your statement, "I now see AI not as a neutral tool, but one that mirrors the values and flaws of its creators and training data." After reading so much about AI this week and trying to formulate my own opinions about it and its place in education, my perspective has shifted some. I see some benefits but also have some concerns. I agree that not blindly adopting new technologies, but instead "interrogating" them should be a common practice in classrooms and schools.

  2. Hi Diamond,
    It is definitely important to remember that AI can be biased. I think people often assume that technology is neutral because they think of it as a machine, and machines can’t be biased. However, AI is developed and trained by humans, so biases can be passed to the systems. When using AI, it is important to keep this in mind, and it is an important lesson to teach students. In Mollick’s (2024) lecture, I thought it was interesting to learn how he uses AI and encourages his students to use programs like ChatGPT, but with certain guidelines. In the education field, I think a lot of people still think AI resources like ChatGPT are just cheating tools. However, as Mollick (2024) expressed in his lecture, there are lots of meaningful ways you can use AI, even in an educational setting, but it is always important to evaluate the information it generates.
  3. Hello Diamond,
    Your statement "It’s not enough to adopt new technologies; we must also interrogate them" is a very powerful one about the implementation of AI in classrooms. When it comes to professional development, sessions are typically about an hour long; when it comes to AI, that is hardly enough time to become proficient, so there needs to be relatively extensive practice on the platform. And as everyone knows, devoted planning time is a precious commodity that isn't always available to many teachers. Teachers would need to practice both how to use AI for their own purposes and how to teach responsible use to students. Unfortunately, with other duties that take up a teacher's planning time, they may not have opportunities to learn how to teach this responsible use. That can eventually lead to students who jump into using AI without the supports to use it responsibly. So you are absolutely correct: there is still work to be done beyond just adopting this technology into the classroom.
  4. Diamond,
    You have nailed this blog (not saying you didn't do well on the others). Human oversight is critical for any platform we as educators use. Cultural awareness should be first and foremost to ensure that no one is left behind. Continuous evaluation of AI outputs is essential to ensuring fairness. After this class I will be encouraging the students at HES to use ChatGPT with certain protocols/rules/procedures in place to ensure they don't become totally dependent on AI. At the end of the day, students should be digging deep to use their skills, thoughts, and knowledge to produce their work.
