Ed3 Weekly Issue #81: Do Your Research
Exploring what the facts say about AI and cheating
Hello web3 and education friends,
This week's resources cover:
What Schools Are Asking About AI
'The Simpsons' Made an NFT Episode and Fans Missed the Point
How Microsoft's AI teaching assistant helps generate classroom materials
This week’s deep dive:
What do AI chatbots really mean for students and cheating?
If you haven’t subscribed yet, click here, or use the button below to join hundreds of others learning about the intersection of web3 and education.
Gone are the days when education needed to be confined to the four walls of a classroom and the limited knowledge of textbooks. Today, we are seeing a shift in how knowledge is imparted and received, with AI co-pilots streamlining lesson planning and connecting teachers to a wealth of resources. These AI assistants are not just digital tools but collaborative partners reshaping the efficiency and personalization of education.
As we delve into this new world, we still find ourselves grappling with the ethical and practical dimensions of AI's integration. How do we cultivate "AI Readiness" among the youth, especially when age restrictions limit direct interaction with AI's full potential? Our reflection on these matters must go hand in hand with vigilant attention to academic integrity in an age where AI can both assist and undermine the educational journey.
Cultural narratives, like those woven into the fabric of "The Simpsons," remind us that our perceptions of technological advancements such as NFTs are ever-changing. These stories reflect our collective consciousness about the future's promise and the fickleness of trends—echoing the volatile dynamics that educators and students may face in the digitized marketplaces of knowledge.
New research emerging out of Stanford University offers a lens to view the constancy of student conduct amidst technological upheaval. It suggests that the answer to the challenges of cheating isn't in prohibitive measures but in fostering a culture of meaningful engagement and AI literacy—skills that will be essential as educational tools and ethical guidelines evolve alongside AI and Web3.
In this week’s resources, we will weave together a narrative of transformation, a glimpse into a future where education transcends its traditional limitations and becomes a dynamic, interactive, and individualized experience.
What Do AI Chatbots Really Mean for Students and Cheating?
Photo: Shutterstock
This article explores the work of Stanford researchers Denise Pope and Victor Lee, who have found that the advent of AI chatbots like ChatGPT has not led to an increase in student cheating. Their research shows that cheating rates have remained steady and that access to technology is not a significant factor.
Pope, of the Challenge Success program at Stanford, and Lee emphasize that cheating is often symptomatic of systemic educational pressures and not just a question of opportunity. They suggest that AI tools like ChatGPT could be beneficial for learning if used appropriately.
To mitigate the misuse of such AI, they recommend integrating AI literacy into education, rather than investing in unreliable AI-detection software or imposing ineffective bans. The researchers advocate for strategies that engage students meaningfully, thereby reducing the inclination to cheat.
Thank you for stopping by for another issue of my web3🤝education newsletter. If you’re on LinkedIn, you can check out a version of this newsletter on my LinkedIn page and give me a follow. You can also find all my work on my website or follow me on the X platform.