Sections:
Teaching and Learning in the Context of AI
Syllabus Statements
Citing AI as a Source
Additional Resources
As generative artificial intelligence (gen-AI) has become ubiquitous, we’re all navigating how to guide students in its responsible use. Faculty take many different approaches to AI in the classroom, from fully embracing it as co-author to prohibiting all use, and everything in between.
Teaching and Learning in the Context of AI
As with any teaching tool, generative AI delivers both benefits and challenges. The latter include issues of privacy; bias, including racism and sexism (after all, much of its training data is sourced from the internet); misinformation; fabricated citations; and even malicious content. Some things to consider as you’re thinking about what role AI may play in your classroom:
AI tools like ChatGPT, Gemini, and Claude are large language models (LLMs) that generate responses by predicting text based on patterns learned during training. Because they are trained on large collections of text and other data, they can reflect human errors and biases—and can sound confident when they are wrong. GenAI chat tools are used to complete writing tasks, expedite data and spreadsheet work, draft lab reports, and finish coding assignments quickly. While they can complete many text-based tasks in seconds, genAI outputs often lack the sophistication, detail, critical perspective, and specificity to course content and contexts that we expect in excellent student writing.
For example, AI can explain theoretical concepts well, but it is not able to generate deeper analyses of and personal reflections about that theory. Oftentimes, AI-generated writing is recognizable because of its use of repetitive phrases or stylistic constructions, and a lack of specific contexts. It’s worth talking with students about what genAI can and cannot do: students are still learning about the affordances and limits of these technologies. So are we. So is everyone.
When you are inclined to worry that the world of knowledge (and teaching and learning) is coming to an end, think about the use of new AI tools as an extension of already existing digital pedagogical practices. Digital pedagogy can look different depending on the instructor and their discipline: lectures may be paired with online research conducted by students to create a digital artifact. Faculty may use the “Designer” feature on PowerPoint to make their presentations more visually appealing without a lot of work. Faculty may allow or even encourage students to use Grammarly and other similar tools to check their writing. Faculty may also use such tools themselves.
With the abundance of available digital tools, faculty can set boundaries and expectations by talking to students about their use of generative AI. How do you expect students to use these tools (if at all)? Why? Can you imagine using generative AI tools to support student learning? Many faculty here and elsewhere are addressing these issues in syllabus statements and assignment prompts. As with any technological tool, talk to your students. Don’t assume they know what’s expected or appropriate.
Whenever you consider incorporating a new technology or assignment into your course, lead with the question: what am I hoping that students will be able to do or know as a result of this assignment? This course? Note both what your learning goals are and what is less important to your particular course. Discerning that difference can help you articulate for your students where AI assistance might be more or less appropriate for your assignments. Then, frameworks like Transparent Assignment Design (TILT) and Backwards Design can help you develop assignments and courses that clearly and transparently support student learning.
ChatGPT and similar AI tools can increase access to learning resources for students with diverse learning needs. Captioning, audio description, text-to-speech, and speech-to-text are examples of generative AI tools that support inclusive learning. Additionally, students who struggle with writing may benefit from using genAI to help them develop their ideas or improve their writing. Collaboratively writing with AI, for example, may provide an initial structure for students to strengthen their positions, enhance and specify their arguments, integrate their own voice and perspective, or edit for clarity. Generative AI may help students reduce stress and spend more time engaging with course content to deepen their learning. Although generative AI may offer valuable assistance with student learning, it cannot offer personalized support, mentorship, and the relational aspect of teaching and learning, which is fundamental to what we all do at SCU.
Understand the ethical and data privacy concerns inherent to AI use, and be prepared to help your students understand these concerns, if you invite them to use AI in their assignments.
The Markkula Center has created a valuable Guide to AI Ethics Literacy that is a resource for both faculty and students seeking to understand and critically evaluate AI use from an ethical, human-centered perspective.
Overall, what uses of AI you allow in your classes is your decision as an instructor; whatever you decide, it’s important to be clear with your students about your expectations, and to help them understand your rationale for your policies and practices around AI. Below are some strategies and considerations for creating a transparent culture of AI use in your classroom.
Addressing AI in Syllabus Statements
Many faculty members have already begun incorporating syllabus statements that address AI. Sample statements are widely available, including some great resources developed locally by colleagues in the School of Engineering (see AI^2).
Building on the critical questions above, we encourage faculty to develop (and frequently update) AI policies for their courses and assignments that reflect their learning goals, disciplinary values, and any ethical/data security concerns.
The authors of Teaching with AI provide a useful template for developing your own course policy by thinking through key questions for your own course:
- When is AI use permitted or forbidden? Why? Is brainstorming with AI cheating? How might AI enhance or inhibit learning in this class?
- If AI is allowed, must students share their AI prompts with you as part of assignment submission?
- How should AI use be credited?
- A warning about the limits of AI.
- Transparency regarding your planned usage of AI detection tools and how that information will be used.
- Clear rules about students’ ultimate accountability for work. (Bowen and Watson 139-140)
Thoughtful syllabus statements help students understand your rationale for allowing or prohibiting uses of AI in your course. Such a statement is also best used as the beginning of a conversation with your students about these issues, in which you might further explore the benefits and limitations of AI technologies for your subject area, perhaps developing your own class compact for its use or prohibition (you can learn more about strategies for discussing all syllabus statements on our Talking about Syllabus Statements DRT page).
Also note that assignments that allow for some level of AI use will need further specification of appropriate conditions and processes for AI use and credit at the assignment level as well. These specifications will depend on your own learning goals for the course and assignment, and your own disciplinary values.
A framework for articulating different levels of AI use in your course and assignments is below, with some examples from Teaching with AI included for illustration:
- AI Assigned: Students are required to use AI on one or more components of their assignment, whether in generating a text or data set to critique or in contributing to their own project output.
- AI Encouraged: “You can use AI to assist you in creating ideas, outlines, characters, themes, arguments but not text. Use AI like a collaborator or tutor: ask for feedback or ways to improve, but all of the text needs to be yours.” (Bowen and Watson 142)
- AI Limited: “You can use AI to generate short sections of up to X words at once. You must ask the AI to create at least Y versions of any text and should then iterate from there: find ways to change and improve the AI-generated text. You should keep and submit your prompts and/or supply a pdf of the session transcript.” (Bowen and Watson 142-43)
- AI Prohibited: Students are not permitted to use AI on any stage of ideation, outlining, drafting, editing, or refining text or images. (You may want to use this prohibition sparingly, as this will require you to consider how you will structure assignments that make this both meaningful and practical; be mindful of the integration of AI tools in common platforms students are already using to research and draft, and provide clear and specific guidance about your expectations in relation to such tools as well).
Such policies are in line with decades of research on academic integrity (apart from the use of AI), which directs us away from an exclusively punitive approach and toward pedagogical practices that engage students in discernment about how to appropriately use any available technologies, based on what academic integrity looks like in your own discipline.
Citing AI as a Source
Whatever your policy, if you allow the use of generative AI in your course, it is essential to clarify how your students can transparently cite their use of AI. Many AI tools also allow chat histories to be shared, and you might consider asking students to share their chat links along with their attributions so you can review and verify. We’ve generated a few ideas about potential student uses of AI and examples of how they might cite AI appropriately.
The examples below were generated interactively with ChatGPT 4.0 and subsequently edited for clarity. You may also be interested in teaching students how to formally cite AI in different styles, and guidelines for APA, MLA, and Chicago styles are included below.
- Potential uses: summarizing articles, finding sources, or explaining complex topics.
  Example attribution: "I used ChatGPT (OpenAI, 2024) to help identify potential research articles related to urban heat island mitigation, which I then reviewed and incorporated into my literature review."
- Potential uses: identifying patterns in datasets.
  Example attribution: "AI tools such as ChatGPT (OpenAI, 2024) assisted with generating initial R code to analyze the YRBUILT variable in the AHS_NF_23 dataset, which I modified for accuracy."
- Potential uses: catching errors in grammar, punctuation, and style.
  Example attribution: "Grammarly (2024) was used to review the final draft of this paper for grammatical accuracy and clarity in the writing style."
- Potential uses: providing new and creative ideas for research projects, presentations, or assignments.
  Example attribution: "ChatGPT (OpenAI, 2024) provided brainstorming suggestions for potential project directions, including the idea of exploring climate-sensitive mortality outcomes."
- Potential uses: creating customized study plans based on a student’s learning style or areas of difficulty.
  Example attribution: "AI-generated flashcards (2024) helped me prepare for the midterm by focusing on key terms and concepts from the course readings."
- Potential uses: summarizing or organizing notes from lectures or reading materials.
  Example attribution: "I utilized Otter.ai (2024) to transcribe and summarize class discussions, which helped inform the reflections presented in this ethnographic analysis."
Additional Resources
Generative AI for Instructors course
Library Guide on Generative AI
The original version of this resource was created by Lisa Chang, Eric Haynie, and C.J. Gabbe (2024) and was updated by Amy Lueck and Eric Haynie (2025) for the Center for Teaching Excellence and Faculty Development. Last updated 01/05/2026.