
October 2025
Canvas Updates
YouTube advertisements in videos embedded in Canvas
Starting October 31, 2025, YouTube may introduce advertisements on videos embedded directly within Canvas courses. In anticipation of this change, we enabled a warning overlay that reads: "This video may display YouTube ads." Viewers must click through the warning to access the content. The overlay displays each time the page is reloaded.
The overlay appears on YouTube content embedded by pasting a YouTube embed code into the Rich Content Editor or embedded using Instructure's YouTube integration. The overlay does not appear on YouTube content embedded using Kaltura (the preferred method of sharing YouTube videos in courses). We will continue to monitor this evolving issue and provide updates as we learn more.
How to confirm all grades in the Canvas Gradebook are visible to students
Canvas recently released a new Gradebook setting that allows instructors to double-check that all grades are visible to students. This setting is intended for instructors who change the grade posting policy to something other than the default, which comes in handy when grading large projects and papers that take a few weeks to complete. To quickly check that all grades are visible to students, follow these steps:
- Select the gear icon in the top right corner of the Gradebook.
- Select the View Options tab across the top of the menu.
- Check the box for View hidden grades indicator.
When you scan the Gradebook, an orange dot will appear in the top left corner of the cell for any grade that is still hidden from students.
Summary of feedback from instructor focus groups about AI in higher education
In April 2025, Academic Technology Support Services (ATSS), in collaboration with academic technology professionals across the University of Minnesota, conducted a series of focus groups to understand instructor concerns and address the complexities of integrating generative AI into higher education. Focus group participants included instructors from multiple UMN campuses.
The goals were to gauge instructors' feelings about the value and applicability of AI tools in and beyond the classroom, and to identify where common assumptions about generative AI break down across different disciplines. Read Extra Points: "UMN instructors' perspectives on generative AI: April 2025 focus groups results" for a summary of the key findings.
Fall UMN virtual Digital Accessibility Summit
The Office for Digital Accessibility (ODA) invites you to join us on Thursday, November 6, 2025, 9 a.m.-4 p.m. for the Fall UMN virtual Digital Accessibility Summit. Participate with your colleagues to learn about and discuss digital accessibility topics.
Presentations will include:
- Building a More Accessible Future Together: A Session on Our Digital Accessibility Progress from the ODA
- Panel Discussion: Accessibility Strategy for Documents
- Media Captioning
- Creating Accessible Google Sites
- Vendor Demos:
- Prep/Continual Engine (PDF remediation)
- Equatio (STEM course content accessibility tool)
Registration is open for this event. You'll be able to attend as many or as few sessions as you’d like. All sessions will be recorded, and recordings will be shared with registrants following the event. The event is free and open to all UMN faculty, staff, and students.
The importance of audio descriptions when creating accessible videos
Audio description is the practice of narrating key visual elements of a video to ensure that learners who are blind or have low vision can fully engage with educational content. When creating instructional videos for your courses, accessibility should be a central consideration. This includes ensuring that key visual content is accessible to learners with low vision through the use of audio descriptions. Read Extra Points: “Designing Accessible Instructional Videos: The Role of Audio Description” for more information about using audio description to ensure accessibility in your instructional videos.
AI and Academic Integrity: Issues, Opportunities, and Resources
Monday, October 13; 11 a.m.-noon
This systemwide Zoom workshop will address core topics related to academic integrity in the era of GenAI, providing a frame for the debate over GenAI that recognizes instructors and students as learners and explorers.
Navigating the GenAI Landscape: A UMN forum for exploration and discovery
Friday, October 24, 9 a.m.-2 p.m.
Register today for this University-wide online forum that will foster discourse and promote the responsible adoption of AI tools across the University. The program will feature a keynote address, opportunities to attend panel discussions, hands-on workshops, and high-energy, “lightning round” 15-minute presentations.
Check out more events from all Teaching Support partners.
Spotlight
Resources and conversations around GenAI and academic integrity
The emergence of Generative Artificial Intelligence (GenAI) has fundamentally reshaped the academic landscape across higher education, prompting a candid discussion among instructors about its pedagogical and ethical implications. One particular area of concern centers on maintaining academic integrity in an environment where AI tools are increasingly available for use in students’ work processes.
In last spring’s UMN Instructor GenAI Focus Groups, instructors shared concerns that students may outsource their learning to this technology. Instructors, and even students in the UMN Student GenAI Focus Groups, are concerned that the use of GenAI will undermine the development of essential foundational academic skills like critical thinking, independent reasoning, and original idea generation.
A significant challenge identified in both the student and instructor focus groups is the inconsistency and ambiguity surrounding appropriate use of GenAI in learning. When instructors fail to provide clear guidelines, students are left to make their own assumptions about what constitutes acceptable use, leading to potential academic dishonesty issues.
The concerns are further complicated when instructors try to use GenAI detection tools to vet student work for originality. The University does not centrally support or recommend these tools due to their documented risks, including unreliability, the potential for false positives, and bias against non-native English writers. There are also significant data privacy concerns with submitting student work to unvetted third-party services, a practice that may infringe on student rights and could allow students’ work to be used to train future AI models. This combination of risks means instructors cannot reliably use GenAI detection tools to monitor use, forcing an alternate approach.
All of the changes and new considerations precipitated by GenAI can feel overwhelming, but no one needs to face these challenges alone. Experts and professionals in all areas from around the University have created resources and are facilitating ongoing conversations for instructors to foster a culture of intellectual honesty.
Teaching Support, a collaborative network connecting instructors to resources across the University, offers specific guidance for instructors navigating the evolving GenAI context with clarity and purpose. These resources provide some basic principles for turning course policy into action:
- Transparency is key: The foundational strategy is for instructors to establish clear expectations by crafting a concise GenAI syllabus statement that explicitly defines allowed and prohibited uses, serving as a starting point for ongoing conversations.
- Focus on process over product: Instructors can break down large assignments and require the submission of "artifacts of learning," such as outlines, drafts, reflections, and in-class writing, which makes the student’s work process more visible and verifiable.
- Design for originality and context: Assignments should be made personal and context-specific by requiring students to use unique project data, local case studies, community-engaged work, or interviews. This demands application and synthesis that GenAI tools struggle to replicate convincingly.
- Teach AI literacy as a skill: UMN faculty focus groups highlighted the growing importance of prompt engineering and planning along with verifying and critiquing AI output as key skills for today’s learners. Instructors can integrate responsible GenAI use by allowing and encouraging it for planning tasks, like outlining an assignment, while maintaining that the final product is solely the student's own work.
This conversation is ongoing and still in early stages of application and discovery. Experts from around the University continue to collaborate to keep instructors informed about ways to foster a community of academic excellence and integrity.
One such conversation is coming up next week. Register for AI and Academic Integrity: Issues, Opportunities, and Resources, taking place Monday, October 13, 2025, from 11 a.m.-noon. This workshop (hosted via Zoom) will address core topics related to academic integrity in the era of GenAI. Facilitators from across the University in student conduct, academic technology, and teaching and learning roles will share approaches to and resources for addressing academic dishonesty, attribution, and documentation, and will discuss AI and its implications for student academic integrity.
Additional Resources
- Request a teaching with technology consultation at [email protected]
- ATSS YouTube Channel
- Subscribe to the Teaching with Technology Newsletter
- Extra Points