Generative AI Recommendations for Faculty

SUNY Cortland Recommendations for Faculty Regarding Generative AI (GenAI), Large Language Models (LLMs), and Student Writing

As faculty consider the use of Generative AI (AI that generates new writing based on prompts) and Large Language Models (prediction models trained on large bodies of existing text) in their courses, it is important to balance the potential of an emerging technology against the implications for students’ learning and the value of writing as a mode of thinking.

While we can describe how Generative AI works (for plain-language descriptions, see the SUNY Fact2 Report, p. 5), the exact workings of the technology are proprietary. The corpora used to train GenAI are assembled without the consent of the authors whose texts are included, while other texts are excluded. Because GenAI output depends on prompts and iterations of use, “attribution” is not really attribution in the academic sense, but more like disclosure: use of GenAI and details about such use can be disclosed, but not “re-found” or verified.

There are also other known ethical implications. Students who misuse GenAI can hinder their own intellectual development. GenAI use carries a significant environmental cost, with estimates of one liter of water per GPT use, in addition to other energy costs (SUNY Fact2 Report, p. 7). Subscription fees for GenAI software, which only some students can afford to pay, continue to perpetuate inequity and digital divides.

Based on this knowledge, we recommend faculty consider the following:

  1. Decide whether, and if so how, they want to use GenAI with student writing and other student work. This decision should follow from a faculty member’s understanding of their course, discipline, student learning outcomes (SLOs), and personal ideologies/values.
  2. After choosing a position on GenAI, adopt a syllabus statement that communicates course policies about GenAI to students.
  3. Follow the same course policies on GenAI use in their own teaching (e.g., when creating lesson plans, unit outlines, and other course materials).
  4. Understand that AI detectors are still evolving and have limits. The university has a license for Turnitin’s AI detection tool. If faculty members decide to use AI detection, it should serve as a starting point for conversation with students rather than as a definitive metric. No AI detection tool is 100% accurate, and all can produce false positives.

In addition to developing clear course policies, faculty may also consider the following:

  1. Help students understand how GenAI works and how it differs from other AI-assisted writing tools (such as autocomplete, suggested text, and grammar checkers).
  2. Provide materials, resources, and/or classroom experiences that help students to understand the implications of AI for their writing and their education, as well as other labor, ethical, and environmental issues.
  3. Model appropriate levels of use and forms of disclosure. This could involve discipline-specific citation formats adapted for GenAI, an author’s statement explaining use, or some other evolving genre.