
Ethical AI in the Classroom: Designing Assignments for the AI-Savvy Student

  • Writer: alfordemilya
  • Mar 4
  • 5 min read

Just a few years ago, "AI" mostly referred to predictive models that forecast likely outcomes from huge datasets. Generative AI (gen AI) changed all that. In just a few years, university students went from almost no experience with AI to overwhelmingly adopting gen AI for help with everything from writing papers to solving equations.


Universities are scrambling to keep up, and some educators are calling for the complete abolition of AI across entire departments. The debate presents a conundrum for students, many of whom worry about being accused of academic dishonesty but are unsure where the boundaries of AI-assisted learning lie. And for educators across campus, AI poses a different problem: How can students build the AI skills they’ll need after graduation without simply “asking ChatGPT” to do their work for them?


As Samantha Levine, a fourth-year marketing student at Ohio State’s Fisher College of Business, tells Novela about her future job prospects, “I think intertwining AI will be the most valuable skill in marketing.” Yet many students find themselves learning independently, through trial and error.


Teaching AI ethics in the classroom goes beyond discussions of academic integrity (though those discussions are crucial). Teaching human oversight and intentional AI usage, and openly discussing the ethical challenges AI presents, have become vital parts of higher education, especially for marketing educators, who need to equip students with the technological know-how to succeed in the twenty-first century.


What is AI ethics education?


Across campuses and workplaces, the rise of accessible, widespread AI technology has created demand for the skills to use AI effectively for better business outcomes. But the rapid adoption of new technology has raised ethical questions around privacy, ownership, and bias, along with a fierce debate on college campuses about gen AI and academic integrity. For educators, AI can be a valuable tool for helping students access and process information, as well as for organization and course planning, though many in academia still favor a blanket ban on AI in the classroom.


For marketing professors, campus crackdowns on the use of AI for completing coursework present unique challenges: marketing students will need strong AI skills in the workplace. Some universities, including MIT, Stanford, and Purdue, have addressed the problem of equipping students with AI skills without encouraging academic dishonesty by adopting university-wide policies for responsible AI use and even requiring students to complete coursework focused on AI ethics.


But for most educators, teaching AI ethics is still optional, and many future-minded educators choose to tackle the questions around using AI in the classroom using in-class activities and assignments.


What strategies should educators adopt to teach AI ethics?


Today’s marketing students need to understand how to engineer AI prompts for many areas of marketing, including copywriting, content strategy, and analytics. But designing assignments around existing tools, like ChatGPT, with no oversight can lead to conflicts within the university regarding the use of AI to complete coursework.


For example, some students, like Elizaveta Mikhaylova, an MSc Strategic Marketing student at Imperial College, use AI constantly in daily life, though they have little formal training in using AI in the workplace. “I use AI for almost everything–planning, calculations, and research ranging from academics to planning trip itineraries,” Mikhaylova says. But when it comes to using AI as a marketer, she says she wants practical training: the “ability to use AI effectively, think critically, and analyse patterns to think creatively.”


To teach students the real-world, practical skills they’ll need to use AI in the workplace, marketing educators should consider designing lesson plans that ask students to interrogate how they use AI in their daily lives, consider the ethical implications of using AI professionally, and develop critical thinking skills around responsible AI use. A recent study found that the biggest knowledge gaps students faced around AI were prompt engineering, bias awareness, and AI output management. Marketing professors are in a unique position to help students hone these skills so they can use AI ethically in their coursework and after graduation.


A recent paper out of Carnegie Mellon, for example, suggests that computer science professors design lessons around the AI Incident Database in order to raise awareness among students about responsible AI use and strengthen “governance and accountability mechanisms” in learners. To do so, educators first invite discussion about students’ current AI use and habits, ask students to analyze the repeated failures of AI systems, and then conduct a post-activity questionnaire to assess the impact of the assignment.


Marketing educators can borrow from this strategy: host in-class discussions about AI use, assign activities that let students use AI for common marketing tasks–like copywriting or content strategy–and then ask students to analyze the efficacy of their results, along with any potential ethical issues, such as plagiarism, diminished trust in a brand, and audience bias inherent in the systems. Studying backlash to AI-generated content, such as the response to Coca-Cola’s holiday ads in 2024 and 2025, can help students understand the importance of a human-in-the-loop strategy for using gen AI in the workplace and on campus.


How can simulations help students use AI responsibly?


Across fields like computer science, engineering, and business, role-play and games have emerged as some of the most effective ways for educators to reinforce the importance of ethics in AI usage while preparing students to use AI in the workplace. Simulations create a safe, imaginary environment in which students can explore AI prompt engineering, apply AI-powered decision-making to marketing strategy, and see outcomes in real time. Because simulations provide a controlled environment and allow educators to oversee students’ AI usage, they let instructors teach practical AI skills while complying with academic integrity policies.


For example, Novela’s AI Marketing Simulation asks students to draft ethical AI frameworks around marketing campaigns, discuss responsible content creation guidelines and the potential for bias, and then design AI-assisted content within those frameworks. Students test that content on simulated audiences and measure its impact, gaining a better understanding of transparency, algorithmic bias, and the pitfalls of privacy issues–all within an environment designed by marketing professors that incorporates the actual ethical dilemmas marketers face while respecting academic integrity policies.


For instance, after completing Novela’s AI Marketing Simulation, Omar Rodriguez, a marketing management MBA student at Emory, feels better equipped to handle the human-in-the-loop challenges AI presents. "The fact that the simulation is designed to emulate a real digital marketing environment and decisions, including the ability to create content, not just allocate resources, makes it extremely valuable and unique," Rodriguez says.


Why should marketing educators incorporate AI ethics into their courses?


Integrating ethics into AI marketing education goes beyond simple right-versus-wrong lessons. The real aim is building "ethical muscle memory"—when students practice ethical decision-making in realistic scenarios like the Novela AI Marketing Simulation, ethical thinking shifts from deliberate to instinctive.


Ready to integrate ethical AI marketing into your curriculum? Request a demo of Novela's AI Marketing Simulation today and see how our ethics curriculum can transform your students' understanding of responsible AI usage.



