AI Teaching & Learning Hub

As technology continues to evolve, the academic landscape is transforming. A significant advancement is the rise of generative artificial intelligence (AI), which includes powerful language models and tools developed by various companies and organizations. These AI systems use advanced algorithms to generate human-like text, images, and other content.

While generative AI offers numerous benefits for research and education, it is important to cultivate a culture of academic integrity to ensure the responsible use of this technology. At Eastern Florida State College, we aim to reinforce academic integrity standards and offer faculty the following recommendations.

Start Here: AI Foundations

Learn about generative AI, the types of tools you may encounter, what AI can and cannot do, and a simple framework to guide thoughtful use in teaching and learning.

What is Generative AI?
Generative artificial intelligence refers to tools that create new content, such as text, images, audio, code, or video, based on patterns learned from large datasets. These tools do not “understand” information the way humans do; instead, they predict what is likely to come next in a sentence, an image, or another type of output.
 
In higher education, generative AI tools often help with:
 
  • Drafting or revising text
  • Summarizing readings or data
  • Creating examples, practice questions, or outlines
  • Generating images, charts, or slide ideas
Used intentionally and within college policies, generative AI can support teaching and learning.

Types of AI Tools
AI is not a single tool. Different tools serve different purposes, and each carries different risks and benefits.
 

1. Text Generators

Tools that create written responses to prompts. They can draft explanations, outlines, examples, or code.
Key point: They may sound confident but can be inaccurate.
 

2. AI Agents

Tools that perform multi-step tasks or workflows (for example: “summarize these documents and draft an email”).
Key point: Always verify what data an agent can access or store.
 

3. AI Detectors

Tools that claim to identify whether writing was created by AI.
Key point: These tools are unreliable and should not be used for academic integrity decisions.
 

4. AI Browsers and Search Tools

Browsers that combine internet search with AI-generated summaries.
Key point: AI summaries can omit context or introduce errors.
 

5. Multimodal Tools

Tools that can process multiple types of inputs, such as text, images, audio, or video, and create new output.
Key point: Be mindful of copyright, accessibility, and representation when generating media.

What AI Can & Cannot Do
Understanding AI’s strengths and limits helps faculty make informed decisions about how to use it.
 

What AI Can Do Well

  • Generate drafts, outlines, and variations of content
  • Rephrase or summarize information
  • Provide examples and practice questions
  • Support brainstorming
  • Automate routine instructional tasks

What AI Cannot Reliably Do

  • Guarantee accuracy or verify facts
  • Understand context or nuance
  • Make ethical or instructional decisions
  • Replace disciplinary expertise
  • Detect with certainty whether writing came from AI
AI works best as a tool that supports human expertise, not as a replacement for it.

Common Myths & Misconceptions

Myth: “AI is always correct.”
Reality: AI can produce convincing but incorrect or fabricated information.

Myth: “AI can reliably detect AI writing.”
Reality: AI detectors are unreliable and prone to false positives.

Myth: “If AI helped, the work has no educational value.”
Reality: With the right assignment design, AI can support learning without replacing it.

Myth: “Using AI is automatically cheating.”
Reality: Appropriate use depends on the course policy and purpose of the assignment.

Myth: “AI will replace instructors.”
Reality: AI changes how instructors work, but not the need for expertise, mentorship, and human judgment.

EFSC Guidelines

This section outlines the college-wide expectations for responsible, ethical, and compliant use of artificial intelligence in teaching and learning at EFSC. These guidelines support faculty in designing clear course policies, protecting student data, and using AI tools within institutional and legal requirements.

Understanding Classroom AI Use Policies

Instructors may take different approaches to the use of artificial intelligence in their courses. To provide clarity and consistency for students, course AI policies generally fall into three broad categories. These categories describe the level of AI use that is permitted in a course and outline student responsibilities related to that use. Students should always refer to their course syllabus for specific expectations.

Open Use

The most flexible category is open use, in which instructors allow or encourage students to use AI tools throughout the learning process.

A syllabus statement may say: “Students are encouraged to use AI tools to support learning. AI-generated content should be critically evaluated and revised to reflect the student’s own understanding and voice.”

In these courses, students may use AI in a variety of ways, while remaining responsible for demonstrating comprehension, critical thinking, and original engagement with course material.

Conditional Use

The conditional use category allows students to use AI tools under clearly defined guidelines. Instructors may permit AI for specific purposes such as brainstorming, outlining, or basic grammar support, while requiring that the final submission reflect the student’s own work. Some policies may also require students to disclose or cite their use of AI.

A typical syllabus statement might note: “Students may use AI tools for idea generation or outlining; however, all final work must be student-authored, and any use of AI must be cited.”

Students are responsible for understanding and following the limits outlined in the course syllabus.

Prohibited Use

The most restrictive category is prohibited use. In this approach, instructors do not permit students to use AI tools at any point in the learning or assignment process.

A syllabus statement might read: “All assignments must be completed without the use of artificial intelligence tools. Submitting AI-generated content constitutes a violation of the academic honesty policy.”

In courses with a prohibited-use policy, students are expected to complete all coursework independently without the assistance of AI tools.

Classroom AI Policy Examples
Instructors are encouraged to set clear, discipline-appropriate expectations for their courses. Examples include:
 
  • AI may be used for brainstorming but not writing final drafts.
  • AI may be used only for ungraded practice activities.
  • AI-generated content must be cited or disclosed according to course guidelines.
  • AI is allowed for accessibility purposes (such as text-to-speech), even when writing assistance is restricted.
  • AI use is not allowed on quizzes, exams, or in-class assessments.
Providing examples of permitted and prohibited uses helps students avoid misunderstandings.

Definitions: Educational Records, Metadata, and Identifying Information

Understanding what counts as protected student information helps faculty make informed decisions.

Educational Record

Any record that contains information directly related to a student and is maintained by the college, including assignments, exam responses, submissions in the LMS, grading information, and instructor feedback.

Identifying Information

Details that can reasonably link the work to a specific student, such as the student’s name, ID number, email, or any unique personal detail included within the assignment.

Metadata

Background information automatically attached to student work in the LMS, such as timestamps, submission history, file ownership, and system-generated identifiers. Even if a faculty member copies and pastes text from a student’s submission, the original work is still part of a protected educational record because it is tied to the student through the LMS.

Faculty must handle all three categories with care, especially when considering whether an AI tool is appropriate for use.

AI Pedagogy & Course Design

AI is changing how students learn, create, and solve problems. This section provides short, practical guidance for designing learning experiences that use AI thoughtfully while maintaining academic rigor and student ownership.

How AI Changes Learning

AI can help students generate ideas, summarize information, practice concepts, and revise writing. Because of this, some traditional assignments may no longer measure the skills they were designed to assess. Effective course design now requires clearer intentionality: what do students need to learn, and when is AI a support rather than a shortcut?

Designing AI-Adaptive Learning Experiences
AI-adaptive learning design means creating assignments that work well in a world where AI exists. Instead of banning AI outright or allowing unlimited use, instructors can clarify:
 
  • When AI is allowed
  • How it should be used
  • What skills students must still demonstrate independently
Assignments should encourage both productive use of AI and authentic demonstrations of learning.

Integrating AI to Support Key Skills

Critical Thinking

Use AI to provide examples, explanations, or alternative viewpoints, then have students evaluate accuracy, bias, or relevance.

Creativity

Allow AI for brainstorming or generating possibilities, while students refine, select, and justify their final choices.

Problem-Solving

Students can use AI to outline steps or explore options, but they must explain their reasoning or compare multiple approaches.

Writing and Revision

AI can help with early drafting or editing, while students remain responsible for content accuracy, organization, citations, and disciplinary voice.

Balancing AI Assistance with Student Ownership
AI should enhance, not replace, student thinking. A balanced approach:
 
  • Defines what parts of an assignment must be completed by the student
  • Uses reflection, process work, or short oral explanations to confirm understanding
  • Encourages transparency about AI use
  • Keeps the focus on learning outcomes, not tool dependency
Thoughtful course design ensures that AI becomes a support for learning rather than a substitute for it.

Assessment Redesign for Authenticity

AI has changed what traditional assignments can measure. Authentic assessment focuses on higher-order thinking, personal reasoning, and real-world application: tasks that require student judgment, not just AI output.

View the "Can AI Do This?" Decision Tree for guidance when reviewing an assignment.

EFSC faculty can find more information through events in the GROW with AI initiative, as well as AI-related workshops and webinars listed on the professional development calendar.