Finding the middle ground with AI and education

(Image created with Adobe Firefly in Adobe Express)

    When the caveman discovered fire, there was an immediate sense of wonder at all the incredible things this new technology could achieve: it could keep you warm and cook your food. But after bringing the fire indoors, the caveman discovered that this new invention could also burn down your house and cause harm. 

    The same can be said for Artificial Intelligence (AI), as for any new technology or advancement. While you’re reading this sentence, AI programs are painting cosmic portraits, responding to emails, preparing tax returns, and recording songs. They’re writing pitch decks, debugging code, sketching architectural blueprints, and providing health advice (Chow and Perrigo, 2023). 

    However, AI itself is not new. AI systems are already used to price medicine and houses, assemble cars, and determine what ads we see on social media. But generative AI, a category of system that can be prompted to create wholly novel content, is much newer (Chow and Perrigo, 2023). 

    How popular is AI?  In January 2023, ChatGPT reached 100 million monthly users, a faster rate of adoption than Instagram or TikTok (Chow and Perrigo, 2023). Other companies were quick to follow with their own Generative AI programs. In fact, if you are currently watching the Paris 2024 Olympics, you are constantly being exposed to ads featuring Google's new Generative AI program, Gemini. 

    The idea of a program like ChatGPT being able to write an entire essay has English teachers shaking and quaking in their shoes. In 2023, as OpenAI.com, the website of the company that produced ChatGPT, became one of the 50 most visited websites in the world, some of the nation’s largest school districts, from New York City to Los Angeles, banned ChatGPT's use in the classroom while they worked to formulate policies around it (Waxman, 2023). 

    Meanwhile, teachers desperate to figure out how to harness the tech for good congregated in Facebook groups like “chatGPT for teachers” (about 300,000 members) and “The AI Classroom” (more than 20,000 members) (Waxman, 2023). 

    Some teachers are embracing this new technology, while others don't want to use it at all. Some school districts have banned all AI websites from their servers, while others have written policies to guide its use. This year, in my students' staff manual for our media class, I wrote my first AI policy. As a teacher whose district currently has all AI sites banned, I am not dumb. I know my students are using Generative AI. Heck, I use it. However, I feel that it's my job, as their teacher, to teach them how to use this new technology responsibly. 



Ethical Concerns

       Last month, the U.S. Senate overwhelmingly passed (by a vote of 91-3) legislation designed to protect children from dangerous online content. If the child safety bill becomes law, companies would be required to mitigate harm to children, including bullying and violence, the promotion of suicide, eating disorders, substance abuse, sexual exploitation, and advertisements for illegal products such as narcotics, tobacco, or alcohol. To do that, social media platforms would have to give minors options to protect their information, disable addictive product features, and opt out of personalized algorithmic recommendations. They would also be required to limit other users from communicating with children and limit features that “increase, sustain, or extend the use” of the platform, such as autoplay for videos or platform rewards (Jalonick and Ortutay, 2024). 

    Common Sense Media, a national nonprofit organization dedicated to improving media and technology for educators, children, and parents, is a huge advocate for children's online safety. Here is their breakdown of what the bill will do: 

    First, kids will no longer be driven to unwanted content that promotes dangerous behaviors, like eating disorders and narcotic drug use.

    Second, kids will stop seeing unwanted, individually targeted ads designed to push specific products on them.

    Third, social media and gaming sites will need a parent's permission if the child is under 13, or a teen's permission, before the companies can collect kids' personal data.

    Fourth, kids will be able to turn off features like algorithmic feeds, autoplay, and endless scrolling that are designed to keep them glued to their devices.

    Finally, young users will still be able to enjoy, learn from, and engage with friends and communities on the platforms they love.

(Grosshans, 2024)

    Why is this bill coming about? AI presents three major areas of ethical concern for society: privacy and surveillance, bias and discrimination, and perhaps the deepest, most difficult philosophical question of the era, the role of human judgment (Pazzanese, 2020). 

    The same can be said for teachers who want to try to use AI in the classroom. Teachers need to think about privacy, surveillance and interaction, autonomy, and bias and discrimination (Michigan State University, 2021). 

    “Artificial intelligence can manipulate us in ways we don’t always think about,” said Christine Greenhow, co-author and a faculty member in Educational Psychology and Educational Technology at MSU (Michigan State University, 2021).

    Randi Weingarten, President of the American Federation of Teachers, a major teachers union, believes the panic about AI is not unlike those caused by the Internet and graphing calculators when they were first introduced, arguing ChatGPT “is to English and to writing like the calculator is to math.” In this view, teachers face two options: show their students how to use ChatGPT in a responsible way, or expect the students to abuse it (Waxman, 2023).


(YouTube: Dr. John Spencer)

Middle Ground

   As Generative AI quickly expands, many people, including educators, fall into two traps:

        1. Techno-Futurism - an uncritical embrace of AI as something that will transform education forever.

        2. Lock It and Block It - dominated by fear, schools block AI from all students.

(Spencer, 2023)

    Fortunately, Dr. John Spencer's video offers a middle-ground solution called Vintage Innovation, a shift away from the flashy and new and toward the different and better (Spencer, 2023). It's not a reactionary rejection of technology; it's the overlap of the tried and true and the never tried, a mashup of cutting-edge technology and old-school tools (Spencer, 2023).

    A Walton Family Foundation survey published July 18 found 73% of teacher respondents had heard of ChatGPT, and 33% had used it to help come up with “creative ideas for classes” (Waxman, 2023). Some of those creative ideas include having ChatGPT generate raps about vectors and trigonometry that students then perform in a classroom competition, or having it generate materials for students at different reading levels (Waxman, 2023). I used Google Gemini to create an IEP plan to help a student who has a speech impediment and is autistic learn news writing. 
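    For teachers comfortable with a little scripting, here is a minimal sketch of what that reading-level differentiation could look like in code. It assumes the openai Python package (v1.x) and an API key in the environment; the passage, grade levels, and model name are placeholders I chose for illustration, not anything drawn from the survey.

```python
# A minimal sketch: generate leveled rewrites of one passage.
# Assumes the `openai` package (v1.x) and OPENAI_API_KEY set in the environment.
# The passage, grade levels, and model name are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

passage = (
    "Photosynthesis is the process by which plants convert sunlight, water, "
    "and carbon dioxide into chemical energy and oxygen."
)

for grade in (3, 6, 9):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You rewrite science passages for K-12 students."},
            {"role": "user", "content": f"Rewrite this passage at a grade {grade} reading level:\n\n{passage}"},
        ],
    )
    print(f"--- Grade {grade} ---")
    print(response.choices[0].message.content)
```

    Of course, the teacher still has to read and vet whatever comes back before it goes in front of students.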

    No, AI doesn't always get things correct. In fact, some teachers are having students fact-check essays the program generates in response to their prompts, hoping to simultaneously test students’ knowledge of the topic and show them the problems with relying on AI to do nuanced work (Waxman, 2023).

    I like to play with text-to-image AI programs, like Adobe's Firefly, with my own children. Below is a picture my daughter generated. She wanted a picture of one of her favorite book characters, but she couldn't just type the character's name; Firefly wouldn't accept it due to copyright. My daughter, age 7, had to describe the character. What color is her hair? What is she wearing? Is there a pattern on the dress? Is the dog sitting or standing next to her? As we did this together, she decided to change the hair color, the dress, and even the dog, making her own, original image. As I learned in a professional development workshop earlier this summer, when it comes to AI, if you put junk in, you get junk out. However, if you are specific and detailed, you can get some amazing things. 
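    To show the junk-in, junk-out idea in code rather than in a chat box, here is a small sketch that contrasts a vague prompt with a detailed one. Firefly itself is used through a web interface, so the generate_image function below is purely hypothetical; the point is the difference between the two prompts, not the tool.

```python
# The generate_image helper is hypothetical -- Firefly is used through a web
# interface -- and exists only to show how prompt detail drives the result.
def generate_image(prompt: str) -> str:
    """Pretend to send `prompt` to a text-to-image service and return a file path."""
    return f"rendered/{abs(hash(prompt))}.png"

# Junk in: a vague prompt leaves every creative decision to the model.
vague_prompt = "a girl and a dog"

# Specific and detailed: the writer, not the model, makes the creative choices.
detailed_prompt = (
    "a seven-year-old girl with curly red hair, wearing a yellow dress with a "
    "white polka-dot pattern, standing in a sunny garden while a small brown "
    "dog sits beside her, storybook illustration style"
)

print(generate_image(vague_prompt))
print(generate_image(detailed_prompt))
```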


(Image created by Adobe Firefly)

    Our students need to develop the soft skills that machines lack, like collaboration and empathy. In a world of constant change, our students need to be divergent thinkers. In an era of automation, our students might just need Lo-Fi tools. In a world of AI, our students need to think philosophically. In a sea of instant information, our students need to slow down to be critical thinkers and curators (Spencer, 2023).

    K-12 education should thus prioritize teaching critical thinking, problem solving, and teamwork across subject areas. Teaching students to become analytical thinkers, problem solvers, and good team members will allow them to remain competitive in the job market even as the nature of work changes (Levesque, 2018). 

    In her article, Levesque makes several recommendations with respect to education policy to help students and workers adapt to changes in the workforce given advances in AI:

    1. State standards and curricula should incorporate 21st century skills across subject areas. 

    2. Federal legislation and policy should explore and support workforce development partnerships. 

    3. Support displaced workers and other “non-traditional students” in their search for new career pathways.

(Levesque, 2018)

    Teachers are doing their students a disservice if they don't at least introduce them to Generative AI and show them how to use it responsibly. As Spencer states at the end of his video, with Vintage Innovation we can avoid the two traps by asking: What does it mean to use AI wisely? How do you think critically about AI as a tool? (Spencer, 2023). 


(Image created by Adobe Firefly)

The Future

    Will AI replace teachers? That is a fear many in education have. There are those who fear it will promote educational inequities, further dividing classrooms into students whose families can afford the high-speed internet connection that eases access to ChatGPT and students whose families cannot. There are also worries about biases in the data AI uses to craft its answers to users’ prompts. And it will be no small challenge for teachers to figure out how to use the technology to develop students' critical thinking skills without sacrificing the meaningful connections that can come from human-to-human teaching, an even more urgent challenge when it comes to students who mentally checked out during the abrupt shift from in-person instruction to virtual school during the pandemic (Waxman, 2023). 

    Personally, I don't think so. It's up to each teacher to decide how they will use it in their classroom. When it comes to getting knowledge to stick, there may be no substitute for human relationships. To many teachers, even those getting ready to welcome ChatGPT into the classroom when the doors reopen this year, that's reason enough not to fear the extent of the disruption on the horizon (Waxman, 2023).

    There is no substitute for hugs, high-fives, celebrations, and tears on rough days when you are with your students. No robot or computer can change that. 


Future Reference: Why I’m Banning Student AI Use This Year

  

References

Chow, A. R., & Perrigo, B. (2023, February 16). The AI Arms Race Is Changing Everything. Time. https://time.com/6255952/ai-impact-chatgpt-microsoft-google/

Grosshans, H. (2024, July 26). Top 5 Myths About Kids' Online Safety Legislation. Common Sense Media. Retrieved August 4, 2024, from https://www.commonsensemedia.org/kids-action/articles/top-5-myths-about-kids-online-safety-legislation

Jalonick, M. C., & Ortutay, B. (2024, July 30). Senate passes bill to protect kids online and make tech companies accountable for harmful content. Associated Press. Retrieved August 4, 2024, from https://apnews.com/article/senate-child-online-safety-vote-f27c329679feb2d74787fc3887aa710f

[John Spencer]. (2023, January 9). How Will Schools Respond to the AI Revolution? [Video]. YouTube. https://youtu.be/KgygRCdHbmc?si=DdNyIA8JLRYeCm9S

Levesque, E. M. (2018, October 18). The role of AI in education and the changing US workforce. Brookings. Retrieved August 4, 2024, from https://www.brookings.edu/articles/the-role-of-ai-in-education-and-the-changing-u-s-workforce/

Michigan State University (2021, November 3). Exploring the ethics of artificial intelligence in K-12 education. Michigan State University College of Education. Retrieved August 4, 2024, from https://education.msu.edu/news/2021/exploring-the-ethics-of-artificial-intelligence-in-k-12-education/

Pazzanese, C. (2020, October 26). Great promise but potential for peril. The Harvard Gazette. Retrieved August 4, 2024, from https://news.harvard.edu/gazette/story/2020/10/ethical-concerns-mount-as-ai-takes-bigger-decision-making-role/

Waxman, O. B. (2023, August 8). The Creative Ways Teachers Are Using ChatGPT in the Classroom. Time, (Time Special Edition Artificial Intelligence: A New Age of Possibilities). https://time.com/6300950/ai-schools-chatgpt-teachers/
