Frequently Asked Questions (FAQ)
What is ChatGPT and Generative AI?
Generative artificial intelligence (AI) refers to a category of AI technology capable of generating new content by analyzing patterns and data gathered from extensive collections of sample materials. ChatGPT is a prominent example of such generative AI tools, which users engage with through a chat interface. Developed by OpenAI and launched on November 30, 2022, ChatGPT stands out for its capacity to allow users to shape and guide a conversation according to their preferred length, style, format, level of detail, and language.
Other AI-powered text generation tools include Google Bard, Bing Chat, and Claude. Moreover, various generative AI tools produce other types of output, including images (e.g., DALL-E or Midjourney), code (e.g., GitHub Copilot), data analysis or visualization (e.g., ChatGPT Code Interpreter), and search engine results (e.g., Perplexity).
When is it safe to use ChatGPT?
The flowchart devised by Aleksandr Tiulkanov (AI and Data Policy Lawyer, January 2023) provides a general guideline on the circumstances in which ChatGPT may be safe to use.
These are only general considerations, and instructors should have their own course policies and guidelines on the use of AI tools in their classes.
What is the School of Engineering’s policy on the usage of AI tools?
Please refer to the statement on the landing page of the AI^2 website for the policy. This website is also intended as a resource to guide and inform faculty as they decide whether and how to employ AI tools in their classrooms.
How do AI tools affect our teaching?
Generative AI tools can produce responses to written assignments such as essays, problem sets, etc. We should assume that students are already proficient in these tools and have integrated them into their daily workflow.
This means we need to rethink handing out assignments that are easily fed into AI tools, and to increase our interactions with students about how they arrived at their solutions and choices rather than relying on the result alone.
Does a student’s use of ChatGPT violate SCU’s Honor Code?
This depends on how faculty decide whether or not to allow the use of AI tools in their classroom. It is beneficial to clarify in the class syllabus what the expectations are for students' submitted work, which provides clear transparency and expectations between faculty and students.
Using AI tools is akin to asking a knowledgeable friend for answers or guidance. The answers and guidance given will become more accurate as AI progresses and develops.
Is there any way to prevent the use of AI tools in a classroom?
If a user is resourceful, there are too many methods of circumvention. There is no bulletproof method to prevent the use of AI tools in a classroom without severely restricting computing usage.
Are there methods to identify work submitted using AI tools?
There isn’t a tool that will guarantee AI detection with 100% accuracy. We have compiled a list of AI detection tools that you can use (some are subscription-based).
Some suggestions that can help you assess whether a student has submitted AI-generated work:
- While tools like ChatGPT will not generate the same result every time, you are encouraged to test your assignments against AI tools so that you understand the type of output they generate.
- Using two or more different AI detection tools may help with your assessment of the student’s work.
- Engage with the student’s work by having the student explain how they arrived at their answers or choices in producing the assignment.
How can I get a ChatGPT account?
Go to chat.openai.com and register for a free account. You can find many quick-start guides online.
What are the privacy and security concerns with ChatGPT?
As with all internet-based tools, there should be no expectation of privacy. Services like ChatGPT do collect personal information such as an email address and a telephone number. Confidential information should not be uploaded to AI tools.
Link to OpenAI’s privacy policy page: https://openai.com/policies/privacy-policy
Is it mandatory to include a syllabus statement on the use of AI-related tools?
It is strongly encouraged that all instructors clearly communicate their policies on this emerging matter to students in their course materials. Absent an explicit statement from a course instructor, use of or consultation with generative AI shall be treated analogously to assistance from another person (see a general perspective from the SoE).
Are there syllabus statements on student use of AI-related tools?
Yes, sample syllabus statements, both general and discipline-specific, are provided in our repository here. We have statements for different usage scenarios of AI tools.
How should students cite AI tools if they are permitted for an assignment?
AI tools extract and generate unique content from internet sources. As such, citing AI tools is different from citing traditional material. The American Psychological Association (APA), the Modern Language Association (MLA), and the Chicago Manual of Style have all provided recommendations in this area.
Where can I learn more about ChatGPT and generative AI tools in education?
Here are a few resource links to help. There are many emerging articles online to further your knowledge:
- Resources for exploring ChatGPT and higher education
- ChatGPT through an Education Lens
- ChatGPT & Education
- AI Text Generators and Teaching Writing: Starting Points for Inquiry
- AI Readings | Academic & Collaborative Technologies (ACT) (utoronto.ca)