Introducing AI in Academia (4 of 4): Using AI to Enhance Learning
- Allen Karsina

- Jan 16, 2025
- 10 min read
Updated: Jan 19, 2025
Recommendations for Integrating AI into the Classroom
This is the fourth and final introductory post in a series of blog posts written by the Artificial Intelligence Faculty Learning Community Group (AI FLC) at Regis College. Although the posts are primarily intended for the Regis community, the conversation is open to everyone. This post was written by Allen Karsina.

Welcome back! In the previous three posts of this introductory series, we introduced and explored several topics related to integrating AI into education.
The Main Topics of our Introductory Series
What is AI?
How is AI currently being used in education?
What are the benefits and opportunities of integrating AI into education?
What are the challenges and risks of integrating AI into education?
What are the best practices for integrating AI into education?
Now we are ready to look at the final question and suggest some recommendations for integrating AI into the classroom. These recommendations come from research, discussions, common sense, and even AI sources such as ChatGPT.

Recommendations
The following recommendations are primarily aimed at educators who are introducing AI tools into their classrooms, although these recommendations apply to students as well. Each recommendation is discussed briefly below.
Five Recommendations for Faculty and Students
Ensure any and all use of AI aligns with policy.
Emphasize the ethical and transparent use of AI.
Provide clear guidelines for using AI.
Integrate AI into the classroom purposefully, collaboratively, and scientifically.
Design assignments to maximize the benefits and minimize the risks of AI.
Two Questions Students Should Always Ask
Always ask the following two questions when using AI tools:
Am I learning?
Am I following the class/school policies?
Align with Policy
Educators must ensure that they use AI tools in alignment with their institution's policies, and students should adhere to the guidelines established by their schools and instructors.
While this may seem straightforward, the rapid proliferation of new AI tools has created challenges in designing policies that balance innovative teaching practices with the need to maintain academic integrity. As a result, existing policies may not address all potential uses of AI tools, may contain ambiguities, or may evolve as we gain more experience with integrating AI into the classroom and as new technologies emerge.
Given this evolving landscape, educators must invest time in clarifying AI-related policies with their students. For instance, Regis College has included specific language in its syllabus template to address AI usage. Including explicit guidelines in your syllabus—and in each assignment—is an essential step toward fostering ethical and effective integration of AI into educational practices.
Emphasize the ethical and transparent use of AI
Educators should engage in open and honest discussions with their students about the use of AI. At a minimum, these conversations should address academic honesty, transparency, and the specific classroom guidelines for AI use. However, given the rapid evolution of AI technologies, it is also beneficial to include the rationale behind these policies and explore broader ethical concerns, such as over-reliance, bias, and misinformation.
Additionally, fostering a two-way dialogue by listening to students’ concerns, ideas, and suggestions can provide valuable insights and help shape the integration of AI into assignments in a way that is both innovative and practical. This collaborative approach not only clarifies expectations but also empowers students to use AI responsibly and effectively.
Provide clear guidelines for using AI
Provide clear and detailed instructions regarding the use of AI tools in your course. These should include both general guidelines and specific directions tailored to each assignment.
For each assignment:
Identify Permitted AI Tools:
Specify which AI tools students are allowed (or required) to use, such as ChatGPT, Perplexity, or others relevant to the assignment.
Clearly outline the purpose of using these tools and how they should contribute to the completion of the task.
Define Appropriate Use:
Detail how students should use AI tools. For example, should they use AI for brainstorming, generating outlines, or fact-checking?
Provide explicit examples of permissible uses and prohibited uses to eliminate ambiguity.
Cite AI Usage:
Specify how students must cite their use of AI tools. Include examples, such as:
"Generated initial outline using ChatGPT on [date]. Revisions made manually."
"Used Perplexity to summarize key points from the assigned article. Summaries cross-checked with original text."
Interaction Logs:
Indicate whether students should keep a log of their interactions with AI tools and submit it with their assignment. For example, include screenshots or transcripts of queries and responses, annotated to show how the AI outputs were used or revised.
Rationale for Guidelines:
Provide a clear explanation of why using AI tools in the specified manner enhances learning. Emphasize that AI should supplement, not replace, critical thinking, creativity, and subject mastery.
For example: "Using AI to generate an initial draft allows you to focus on refining ideas and improving clarity, which are essential skills. However, relying on AI to write entire sections bypasses critical learning objectives and may lead to over-reliance."
Ethical and Academic Integrity Considerations:
Stress the importance of transparency and honesty in using AI tools. Highlight that failure to follow these guidelines could lead to academic integrity violations.
Integrate AI into the classroom purposefully, collaboratively, and scientifically
As educators, we design our courses with clear goals and objectives. The integration of AI into the classroom should follow the same principle. Clearly articulate the purpose of using AI in each assignment, ensuring students understand why AI is included—or excluded—and how it aligns with the learning objectives. AI should only be incorporated when it serves a specific educational purpose.
Given the rapid evolution of AI tools, collaboration with students is essential for identifying effective uses and avoiding potential misuses. Regularly seeking input and feedback from students on how AI is being applied in assignments will provide valuable insights and help refine its integration. This collaborative approach also encourages students to think critically about the ethical and practical implications of AI in their learning.
Lastly, objectively evaluate the role of AI in your classroom. Reflect on its effectiveness:
How did it enhance student learning and engagement?
What challenges or limitations arose?
What changes are necessary to improve its implementation?
By iterating based on feedback and evaluation, educators can ensure AI remains a tool that supports meaningful learning outcomes while addressing its potential risks.
Design assignments to maximize the benefits and minimize the risks of AI
Integrating AI into the classroom requires rethinking certain assignments, particularly those that could currently be completed almost entirely by AI. While alternatives such as oral assessments or in-class handwritten papers offer some solutions, they also come with their own practical and pedagogical challenges.
A more effective approach is to redesign assignments in ways that leverage AI’s benefits while mitigating its risks. Below are some recommendations to guide this process:
Provide Clear Instructions for AI Use. Specify exactly how students should use AI tools for each assignment, including permitted tools and their intended purpose (e.g., brainstorming, drafting, or fact-checking).
Encourage Critical Analysis of AI Outputs. Require students to evaluate AI-generated content critically, identifying its strengths, weaknesses, and areas requiring revision. Ensure they document and cite their use of AI.
Utilize AI Strategically. Incorporate AI for tasks like gathering factual information, generating suggestions, or providing initial feedback. Use it as a starting point, not a final solution.
Reflect on AI Use. Include a reflection component where students assess how AI influenced their work, the challenges they encountered, and what they learned from the process.
Seek Student Feedback. Regularly ask students for input on how AI tools are being integrated into assignments. Use this feedback to refine and improve instructional practices.
Revise Rubrics to Reward the Process. Shift grading emphasis from the end product to the learning process. For example, in a research paper assignment, allocate points for tasks such as researching, evaluating sources, developing outlines, and drafting. These tasks can incorporate specific AI tools, fostering engagement and skill development throughout the process.
Additional Recommendations
Continuously evaluate assignments to ensure they are aligned with learning objectives and make thoughtful adjustments as AI tools evolve.
Use assignments to teach ethical and responsible AI use, including the importance of transparency, accuracy, and originality.
Student Recommendations
AI tools offer significant opportunities for learning and productivity, but they also come with notable risks, particularly over-reliance and academic dishonesty. While these risks can be managed, it is essential for students to remain vigilant and proactive in monitoring their use of AI.
Monitoring for Over-Reliance
Ask yourself: "Am I learning?"
If the answer is "no" or "not very much," it’s a sign that you may be over-relying on AI. Adjust how you are using AI and explore alternative study strategies, such as engaging more deeply with course materials, participating in discussions, or seeking guidance from your instructor. AI should supplement your learning, not replace it.
Monitoring for Academic Integrity
Ask yourself: "Am I following the class/school policies?"
The answer to this question should always be "yes." If it’s "no," stop what you are doing immediately and adjust your approach to align with the policies.
If the answer is "I'm not sure," clarify expectations by consulting your professor or reviewing the class/school guidelines. It’s better to seek clarification early than to risk violating academic integrity standards.
Why These Questions Matter
These self-checks empower students to use AI responsibly while ensuring that it enhances, rather than undermines, their learning experience. Regularly reflecting on these questions can help you develop stronger study habits and maintain academic honesty.

Additional Recommendations
The recommendations above were designed to be brief (this is a blog post, not a workshop, after all). If you are an educator or student and have additional recommendations you think we should add, please let us know!
One final recommendation is illustrated below.
Interactive Activity. Ask Copilot or ChatGPT for best practices for integrating AI tools into your classroom. You will get more specific recommendations if you identify specific AI tools and specific assignments or uses. See the example below.
Prompt: I would like my students to be able to use generative artificial intelligence tools such as ChatGPT, Copilot, NotebookLM, and so on, when they complete study guides. However, I am concerned that they might become over-reliant on these tools and even plagiarize using these tools. What are some best practices for integrating these tools into the use of study guides?
ChatGPT 4o Response (with minor formatting edits):
1. Establish Clear Expectations
Define Appropriate Use: Provide guidelines about when and how students can use AI tools, such as for brainstorming, rephrasing difficult concepts, or generating practice questions, but not for copying verbatim answers.
Explain Misuse: Highlight the risks of over-reliance, such as reduced critical thinking, and clarify what constitutes plagiarism when using AI tools.
2. Focus on Active Learning
AI as a Supplement: Encourage students to use AI-generated responses as a starting point rather than a final answer. For example, they could:
Critically evaluate the accuracy of AI-provided answers.
Rewrite AI-generated content in their own words, adding context or examples.
Discuss AI responses in group settings to promote deeper understanding.
Comparison Tasks: Assign activities where students compare their manual responses with AI-generated ones, analyzing strengths and weaknesses in both.
3. Transparency and Documentation
AI Logs: Require students to document their use of AI tools. For example:
Record prompts they used.
Submit both AI-generated outputs and their refined versions.
Attribution: Ask students to credit the AI tool in their work, noting its role (e.g., “ChatGPT was used to suggest examples, which were further analyzed and expanded upon”).
4. Scaffold AI Integration
Stepwise Use: Introduce AI tools gradually. For instance:
Begin with structured activities, like using AI to generate practice questions or summaries.
Progress to open-ended tasks where students combine AI assistance with their knowledge.
Refinement Exercises: Encourage students to refine AI-generated responses by:
Identifying errors or gaps.
Adding original insights or aligning responses with course concepts.
5. Encourage Verification and Critical Thinking
Fact-Checking: Require students to verify AI-generated answers using course materials or scholarly sources.
Critique AI Limitations: Assign tasks where students critique AI-generated content, focusing on its strengths, weaknesses, and alignment with Radical Behaviorism principles.
6. Design Plagiarism-Resistant Study Guides
Personalized Prompts: Create study guide questions that require students to integrate course materials, personal insights, or examples from class discussions, making it harder to directly copy AI outputs.
Process-Oriented Questions: Focus on “how” and “why” questions that require reasoning and explanation rather than simple facts.
7. Incorporate Reflection
AI Use Reflection: Include a reflection section in the study guide where students:
Explain how AI supported their learning.
Discuss any challenges they faced in using AI responsibly.
Reflect on their process for ensuring originality and understanding.
Metacognitive Questions: Add prompts like, “What did you learn from using AI for this task?” or “How did AI enhance your understanding of this concept?”
8. Regularly Monitor and Provide Feedback
Spot Checks: Periodically review AI-generated and student-submitted content for originality.
Feedback on Use: Provide constructive feedback on how students are using AI tools, highlighting effective strategies and areas for improvement.
Sample Study Guide Prompt Incorporating AI:
Original Question: “How can Radical Behaviorism be applied to address modern social issues?”
AI Usage Instruction: “Use an AI tool to generate a list of potential applications. Then, select one and write a detailed explanation, citing course readings to support your analysis. Include a 3-5 sentence reflection on how the AI tool helped you explore this topic.”
By combining clear guidelines, structured integration, and opportunities for reflection, you can empower students to use AI tools as effective learning aids while fostering their critical thinking and ethical responsibility.
News and Announcements
Interested in learning more about AI?
Regis faculty are encouraged to self-enroll in the "Using AI in your Teaching" course created by Dr. Diana Perdue. Faculty can access the course here (don't forget to click the "enroll me" button).
Interested in contributing a post or joining our Regis College Artificial Intelligence Faculty Learning Community?
Contact Allen Karsina at allen.karsina@regiscollege.edu
References
The sources loaded into the notebook in NotebookLM at the time this post was written are included below. Although I did not reference them directly in this post, they provided background information and are included here for anyone interested in reading more on the topic.
Baker, R. S., & Hawn, A. (2020). Algorithmic bias in education. Retrieved December 25, 2024, from https://learninganalytics.upenn.edu/ryanbaker/AlgorithmicBiasInEducation_rsb3.7.pdf
Bobula, M. (2024). Generative Artificial Intelligence (AI) in higher education: a comprehensive review of challenges, opportunities, and implications. Journal of Learning Development in Higher Education, 30, 1-27.
Chan, C. K. Y., & Colloton, T. (2024). Generative AI in higher education: The ChatGPT effect. Routledge.
Giarmoleo, F. V., Ferrero, I., Rocchi, M., & Pellegrini, M. M. (2024). What ethics can say on artificial intelligence: Insights from a systematic literature review. Business and Society Review, 129(2), 258–292. https://doi.org/10.1111/basr.12336
Jennings, A. M., & Cox, D. J. (2024). Starting the conversation around the ethical use of artificial intelligence in applied behavior analysis. Behavior Analysis in Practice, 17(1), 107–122. https://doi.org/10.1007/s40617-023-00868-z
Kim, S. L., Cheong, M., Srivastava, A., Yoo, Y., & Yun, S. (2020). Knowledge sharing and creative behavior: The interaction effects of knowledge sharing and regulatory focus on creative behavior. Human Performance, 34(1), 49–66. https://doi.org/10.1080/08959285.2020.1852240
Lee, S. S., & Moore, R. L. (2024). Harnessing Generative AI (GenAI) for Automated Feedback in Higher Education: A Systematic Review. Online Learning Journal, 28(3), 85-104. https://doi.org/10.24059/olj.v28i3.4593
Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors, 52(3), 381-410. https://doi.org/10.1177/0018720810376055
Resnik, D. B., & Hosseini, M. (2024). The ethics of using artificial intelligence in scientific research: New guidance needed for a new tool. AI and Ethics. https://doi.org/10.1007/s43681-024-00493-8
Urmeneta, A., & Romero, M. (2024). Creative Applications of Artificial Intelligence in Education. Palgrave Macmillan.
West, D., Luzeckyj, A., Toohey, D., Vanderlelie, J., & Searle, B. (2020). Do academics and university administrators really know better? The ethics of positioning student perspectives in learning analytics. Australasian Journal of Educational Technology, 36(2), 60–70. https://doi.org/10.14742/ajet.4653
Yang, S. J. H. (2021). Guest Editorial: precision education - a new challenge for AI in education. Educational Technology & Society, 24(1), 105–108.
Acknowledgements
ChatGPT 4o was used to suggest improvements for the text for some portions of this post.
This post was written by the author with the exception of the excerpt from ChatGPT 4o.
The images were generated by the author using DALL·E 3 in ChatGPT.
The audio was generated using NotebookLM by uploading a draft of the blog and using the "audio overview" function.


