Recommendations and Guidelines

AI, machine learning, and generative technologies

“Artificial intelligence” is a catch-all term that encompasses a wide range of machine learning technologies that use large data sets – collections of information – to make predictions or draw conclusions.

“Generative AI” refers to the class of tools in which the AI doesn’t make decisions or predictions but instead appears to create – or generate! – something like an image, a paragraph, a video, or a sound file.

Professional Development

Teaching, Learning, and AI Technologies is a self-enrol workshop space maintained by the CTLT.  It contains recordings of workshops and is updated regularly with discussions, resources, and professional development opportunities related to the topic.

Is it cheating if my students use generative technologies?

This will depend on the parameters of the assignment and the learning objectives of the course.  Some instructors may wish to engage with these technologies in their course activities, either throughout the course or within specific assignments.  Others may wish to prohibit their use.

It is important to discuss the potential uses of these technologies with your students and clearly communicate where use is acceptable or unacceptable.

What can I do to encourage students not to use generative technologies in my courses?

Some instructors are concerned about students using generative technologies to, for example, write essays or other written assessments for them. There are strategies for designing assessments that are more resistant to generative technologies:

  1. Clearly outline expectations.  Whether you allow or prohibit the use of these platforms, clearly explain your reasoning in your course outline and discuss these expectations during class.
  2. Evaluate students on process, not only on the final product. You might want to collect outlines or research proposals for evaluation and place less weight on a final paper assignment.
  3. Include components of self-reflection (including reflection on prior learning or the student’s own life or work contexts) in assessments.
  4. Reflect on your desired learning outcomes for your assessments. Consider whether an essay or paper assignment, as an evaluative form, reflects the learning objectives of your class. Could you explore project-based learning or an “unessay” instead?
  5. Ask students to complete certain work during class time. For example, use pre- and post-class polls to capture student reflections on the material learned. This strategy can also help students prepare for activities that build on that material, such as group work or discussions.

It’s also important to keep open lines of communication with students about these tools.

Consider exploring the limitations of these technologies with your students by, for example, asking a tool to create a bibliography for an assignment and then checking whether the sources it provides are reliable or have been fabricated.

Is there a technology that can “catch” the use of these tools?

In essence, no. The existing “AI detectors” should not be used by UNBC instructors: they have very high false positive rates and have been shown to be insufficiently accurate to support an accusation of student misconduct.  These detection technologies also raise alarming concerns for diversity, equity, and inclusion.  Given the speed with which these technologies develop and change, seeking a technological solution means entering an arms race that we cannot win. Be wary of claims made by technology companies in unsolicited emails and marketing campaigns.

A better approach is to revise our pedagogies with strategies that make for more meaningful learning.

See the page on Limitations of AI Detectors for more information.

What do I do if I suspect a student has used unauthorized AI assistance in an assignment?

Have a conversation with the student to learn about their process and their knowledge of the assignment in question. McMaster University’s AI Misconduct Conversation Guide can help you structure this conversation and assess whether it is likely that the student used unauthorized AI assistance.

Update your syllabus. Include a policy on student AI use.

  • The Teaching, Learning, and AI Technologies workshop space (self-enrol, maintained by the CTLT) includes a collection of suggested syllabus language you may choose to use in your course outlines.
  • This crowdsourced collection shares the range of policies being used by post-secondary educators and can help you develop your own approach for navigating generative technologies.

Talk with students about academic integrity.

Ensure you discuss your expectations regarding these technologies with your students. Consider updating your academic integrity statement to be more student-centered (see the Zinn 2021 template). Discuss why academic integrity is essential to their learning process.

Be transparent about assignments.

Reconsider your approach to grading.

Shift from extrinsic to intrinsic motivation.

  • Students are more likely to cheat when “the class reinforces extrinsic (i.e., grades), not intrinsic (i.e. learning), goals.” (UC San Diego, 2020, para. 6).
  • Consider how you might increase intrinsic motivation by giving students autonomy, independence, freedom, opportunities to learn through play, and/or activities that pique their interest based on their experiences and cultures.
  • Learn more about motivational theories in education from Dr. Jackie Gerstein.

Use these technologies as educational tools.

  • Before you ask students to use any of these tools for an assignment, please ensure you understand the potential privacy impacts of the platform.  The Teaching, Learning, and AI Technologies workshop space (self-enrol, maintained by the CTLT) outlines the privacy considerations that apply within the BC post-secondary context under FIPPA regulations.
  • Instructors cannot require students to sign up for software tools that have not gone through a UNBC Privacy Impact Assessment, which confirms that a tool’s systems comply with BC information privacy laws.
  • Demonstrate the proper educational use of GenAI tools to students and model critical thinking about the outputs they generate, including attention to bias, copyright issues, the quality of information sources, and accuracy.
  • Engage students in critiquing and improving generative outputs:
    • Pre-service teachers might critique how a generated lesson plan integrates technologies using the Triple E Rubric or examine whether it features learning activities that support diversity, equity, accessibility, and inclusivity.
    • Computer science students might identify potential ways to revise generated code to reduce errors and improve output.
    • Students might analyze how generated text impacts different audiences.
  • Help students build their information literacy skills:
    • Ask students to conduct an Internet search to see if they can find the original sources of text used to generate output.
    • Have students generate prompts and compare and contrast the outputs.


Additional Resources

This resource is adapted from ChatGPT & Education, AI in Education, and Assessment Design in an Era of Generative AI.

Our gratitude to Brenna Clarke Gray (Thompson Rivers University) for hosting her AI in Education resource on the open web so we could benefit from it.

Our gratitude to Torrey Trust (College of Education, University of Massachusetts Amherst) for sharing her resource under the CC BY NC 4.0 license so we could freely use, remix, and share it.

License


This work (An Instructor's Guide to Teaching & Learning With Technology @UNBC by UNBC CTLT) is free of known copyright restrictions.
