Sections:
Teaching and Learning in the Context of AI
Syllabus Statements
Citing AI as a Source
Additional Resources
As generative artificial intelligence (gen-AI) has become ubiquitous, we’re all navigating how to guide students in its responsible use. Faculty take many different approaches to AI in the classroom, from fully embracing it as co-author to prohibiting all use, and everything in between. The following resource gives an overview of important things to consider when developing your response to AI in your teaching and learning activities.
Teaching and Learning in the Context of AI
As with any teaching tool, generative AI delivers both benefits and challenges. The latter include issues of privacy, racism, and sexism (after all, much of its training data is sourced from the internet), as well as misinformation, fake citations, and even malicious content. Decades of research on academic integrity (apart from the use of AI) direct us away from an exclusively punitive approach and toward pedagogical practices that engage students in discernment about how to appropriately use any available technologies, based on what academic integrity looks like in your own discipline. Some things to consider as you’re thinking about what role AI may play in your classroom:
AI tools like ChatGPT, Gemini, and Claude are large language models (LLMs) that generate responses by predicting text based on patterns learned during training. Because they are trained on large collections of text and other data, they can reflect human errors and biases—and can sound confident when they are wrong. GenAI chat tools are used to complete writing tasks, expedite data analysis and spreadsheet work, draft lab reports, and complete coding assignments quickly. While they can produce output for almost any text-based task in seconds, genAI outputs often lack the sophistication, detail, critical perspective, and specificity to course content and contexts that we expect in excellent student writing.
For example, AI can explain theoretical concepts well, but it is not able to generate deeper analyses of, or personal reflections about, that theory. Oftentimes, AI-generated writing is recognizable by its repetitive phrases and stylistic constructions and its lack of specific context. It’s worth talking with students about what genAI can and cannot do: students are still learning about the affordances and limits of these technologies. So are we. So is everyone.
When you are inclined to worry that the world of knowledge (and teaching and learning) is coming to an end, think about the use of new AI tools as an extension of already existing digital pedagogical practices. Digital pedagogy can look different depending on the instructor and their discipline: lectures may be paired with online research conducted by students to create a digital artifact. Faculty may use the “Designer” feature on PowerPoint to make their presentations more visually appealing without a lot of work. Faculty may allow or even encourage students to use Grammarly and other similar tools to check their writing. Faculty may also use such tools themselves.
With the abundance of available digital tools, faculty can set boundaries and expectations by talking to students about their use of generative AI. How do you expect students to use these tools (if at all)? Why? Can you imagine using generative AI tools to support student learning? Many faculty here and elsewhere are addressing these issues in syllabus statements and assignment prompts. As with any technological tool, talk to your students. Don’t assume they know what’s expected or appropriate.
Whenever you consider incorporating a new technology or assignment into your course, lead with the question: what am I hoping that students will be able to do or know as a result of this assignment? This course? Note both what your learning goals are and what is less important to your particular course. Discerning that difference can help you articulate for your students where AI assistance might be more or less appropriate for your assignments. Then, frameworks like Transparent Assignment Design (TILT) and Backwards Design can help you develop assignments and courses that clearly and transparently support student learning.
ChatGPT and similar AI tools can increase access to learning resources for students with diverse learning needs. Captioning, audio description, text-to-speech, and speech-to-text are examples of generative AI tools that support inclusive learning. Additionally, students who struggle with writing may benefit from using genAI to help them develop their ideas or improve their writing. Collaboratively writing with AI, for example, may provide an initial structure for students to strengthen their positions, enhance and specify their arguments, integrate their own voice and perspective, or edit for clarity. Generative AI may help students reduce stress and spend more time engaging with course content to deepen their learning. But although generative AI may offer important support for student learning, it cannot provide personalized support, mentorship, or the relational aspect of teaching and learning, which is fundamental to what we all do at SCU.
Overall, what uses of AI you allow in your classes is your decision as an instructor; whatever you decide, it’s important to be clear with your students about your expectations, and to help them understand your rationale for your policies and practices around AI. Below are some strategies and considerations for creating a transparent culture of AI use in your classroom.
Syllabus Statements
Many faculty members have already begun incorporating syllabus statements about using AI critically and ethically. The authors of Teaching with AI provide a useful template for developing your own course policy by thinking through key questions for your own course:
- When is AI use permitted or forbidden? Why? Is brainstorming with AI cheating? How might AI enhance or inhibit learning in this class?
- If AI is allowed, must students share their AI prompts with you as part of assignment submission?
- How should AI use be credited?
- A warning about the limits of AI.
- Transparency regarding your planned usage of AI detection tools and how that information will be used.
- Clear rules about students’ ultimate accountability for work. (Bowen and Watson 139-140)
Thoughtful syllabus statements help students to understand your rationale for allowing or prohibiting certain uses of AI in your course. Doing so allows them to learn more about what AI does well, and what limits it has. Such a statement is also best used as the beginning of a conversation with your students about these issues, in which you might further explore the benefits and limitations of AI technologies for your subject area, perhaps developing your own class compact for its use (you can learn more about strategies for discussing syllabus statements on our Talking about Syllabus Statements DRT page).
Also note that assignments that allow for some level of AI use will need further specification of the appropriate conditions and processes for that use. These specifications will depend on your own learning goals for the course and assignment, and your own disciplinary values. Examples from Teaching with AI include:
- You can use AI to assist you in creating ideas, outlines, characters, themes, and arguments, but not text. Use AI like a collaborator or tutor: ask for feedback or ways to improve, but all of the text needs to be yours.
- You can use AI to generate short sections of up to X words at once. You must ask the AI to create at least Y versions of any text and should then iterate from there: find ways to change and improve the AI-generated text. You should keep and submit your prompts and/or supply a pdf of the session transcript.
- You can use AI-generated text if you type it into the document yourself. AI text is never quite as good as you think at first; you will need to edit and adapt it (adapted from Cummings, 2023) (Bowen and Watson 142-43).
Citing AI as a Source
Whatever your policy is, if you allow the use of generative AI in your course, it is essential to clarify how your students can transparently cite their use of AI. Many AI tools also allow chat histories to be shared, and you might consider asking students to share their chat links along with their attributions so you can review and verify. We’ve generated a few ideas about potential student uses of AI and examples of how they might cite AI appropriately.
The examples below were generated interactively with ChatGPT 4.0 and subsequently edited for clarity. You may also be interested in teaching students how to formally cite AI in different styles, and guidelines for APA, MLA, and Chicago styles are included below.
Potential uses: summarizing articles, finding sources, or explaining complex topics.
Example attribution: "I used ChatGPT (OpenAI, 2024) to help identify potential research articles related to urban heat island mitigation, which I then reviewed and incorporated into my literature review."
Potential uses: identifying patterns in datasets.
Example attribution: "AI tools such as ChatGPT (OpenAI, 2024) assisted with generating initial R code to analyze the YRBUILT variable in the AHS_NF_23 dataset, which I modified for accuracy."
Potential uses: catching errors in grammar, punctuation, and style.
Example attribution: "Grammarly (2024) was used to review the final draft of this paper for grammatical accuracy and clarity in the writing style."
Potential uses: providing new and creative ideas for research projects, presentations, or assignments.
Example attribution: "ChatGPT (OpenAI, 2024) provided brainstorming suggestions for potential project directions, including the idea of exploring climate-sensitive mortality outcomes."
Potential uses: creating customized study plans based on a student’s learning style or areas of difficulty.
Example attribution: "AI-generated flashcards (2024) helped me prepare for the midterm by focusing on key terms and concepts from the course readings."
Potential uses: summarizing or organizing notes from lectures or reading materials.
Example attribution: "I utilized Otter.ai (2024) to transcribe and summarize class discussions, which helped inform the reflections presented in this ethnographic analysis."
Additional Resources
Check out SCU’s Markkula Center’s Hackworth Fellows student-generated discussion and guidelines for the ethical use of AI.
Classroom Policies for Using AI Generative Tools (Georgetown University)
The original version of this resource was created by Lisa Chang, Eric Haynie, and C.J. Gabbe (2024) and was updated by Amy Lueck and Eric Haynie (2025) for the Center for Teaching Excellence and Faculty Development. Last updated 12/16/2025.