Impact Assessment of an Activity on AI Tool Utilization Among CHM151 Students

Submitted by Allen Reyes on
Duration
What is the Purpose of the Assessment?
  • Evaluate the effectiveness of an in-class Canvas module designed to educate college students about AI tools, including their operation, benefits, risks, and practical applications.
  • Determine whether the activity enhances students' ability to recognize inaccurate AI outputs and encourages creative applications of AI.
Describe the necessity for this assessment

This assessment is essential for understanding students' baseline knowledge of AI and measuring changes post-intervention. Faculty and organizational leaders are increasingly focused on detecting and limiting AI misuse. This activity, however, aims to highlight AI’s value as a learning tool. Through this assessment, I seek to guide students in using AI constructively, enhancing their learning experience rather than fostering reliance for dishonest purposes.

Describe how the practice will be implemented

To begin, students completed a pre-activity survey that assessed their baseline knowledge, confidence, and attitudes toward AI tools. The activity was implemented as an in-class learning module hosted on Canvas, designed to give students hands-on experience using AI tools. The module began with an introductory overview of AI, covering its operation, potential benefits, risks, and ethical considerations. Students engaged in guided exercises that included exploring various AI tools, analyzing their outputs, and identifying inaccuracies. To promote practical application, the activity incorporated tasks in which students used AI tools for academic purposes, such as generating research ideas, solving problems, or summarizing information. Structured discussions and reflection prompts encouraged students to critically evaluate the reliability and usefulness of AI-generated content. Finally, students completed a post-activity survey to assess the module's impact on their learning experience, knowledge, confidence, and attitudes.

Interpret, compare, and describe the results

Pre-activity survey results:

The pre-activity survey results illustrate a range of familiarity, perceptions, and expectations among CHM151 students regarding AI tools. Quantitatively, 40% of respondents indicated prior use of AI tools, primarily ChatGPT, while the remaining 60% reported no experience. Confidence levels averaged between 2 and 3 on a 5-point scale, suggesting that although many students possess some awareness of AI tools, their confidence in using these tools effectively for academic purposes remains relatively low. These findings reveal a need for foundational instruction in practical AI applications.

Qualitatively, students expressed a mix of curiosity and caution about AI. Approximately 80% of respondents expressed a desire to understand AI’s benefits, ethical use, and methods to ensure accuracy, while around 20% voiced doubts about its relevance for certain academic tasks. Commonly anticipated challenges included generating effective AI prompts (30%), ensuring reliability of AI-generated information (25%), and avoiding unintentional plagiarism (20%). Overall, 70% of students believed AI could positively impact their learning experience, though about 30% expressed skepticism. These responses underscore the module’s potential role in bridging gaps in both skill and understanding, guiding students toward responsible, student-centered use of AI while addressing common misconceptions and ethical concerns.

Post-activity survey results:

The post-activity survey results demonstrate a marked improvement in both confidence and understanding of AI tools among CHM151 students. Confidence levels increased, with 80% of respondents rating their confidence at 4 on a 5-point scale and 20% at a 5, a clear gain over pre-activity scores. Additionally, 100% of students indicated an improved understanding of AI tools and expressed belief in AI's positive impact on their learning experience, suggesting that the activity met its goal of fostering a constructive perception of AI in academic contexts.

Qualitatively, students highlighted key takeaways, such as gaining awareness of a broader range of AI tools, understanding AI’s potential for clarifying and re-explaining information, and recognizing the importance of verifying AI-provided sources. Approximately 60% reported a shift in attitude, noting a reduction in negative perceptions or newfound appreciation for AI's utility in studying. Nearly all students (90%) indicated that they overcame anticipated challenges from the pre-survey, such as concerns about reliability and effective prompt usage. Additionally, 100% of students expressed an intention to continue using AI tools in their college work, with some requesting further guidance on areas such as citation practices and advanced study techniques. These results suggest that the activity was highly effective, equipping students with both practical skills and a positive framework for responsible AI use in learning.


Comparative Results: 

The comparison between the pre-activity and post-activity survey results reveals a substantial positive shift in CHM151 students' confidence, understanding, and attitude toward AI tools, indicating that the activity met its goals effectively.

Confidence and Familiarity

In the pre-survey, only 40% of students had previously used AI tools, and confidence levels in using AI for academic purposes averaged around 2 to 3 on a 5-point scale, indicating limited familiarity and low-to-moderate confidence. After completing the activity, 80% of students rated their confidence at a level of 4, with the remaining 20% rating it as a 5. This represents a substantial increase, with all students now feeling highly confident in their ability to use AI tools effectively. This shift in quantitative data underscores the activity's success in meeting its purpose of enhancing students' comfort and competence in AI tool usage.

Understanding of AI Tools

Qualitatively, students’ understanding of AI’s applications and limitations improved notably. Before the activity, students expressed a mix of curiosity and concern, with around 60% identifying anticipated challenges, such as ensuring accuracy, avoiding plagiarism, and crafting effective prompts. In the post-survey, 100% of respondents reported an improved understanding of AI, with many mentioning new insights, such as recognizing the diversity of AI tools, learning to verify sources, and discovering ways AI could clarify complex topics. This shift aligns closely with the goal of helping students recognize AI's constructive academic uses and its limitations, as well as fostering a learning-centered rather than shortcut-focused approach.

Attitude and Perception

In terms of attitude, around 30% of students in the pre-survey expressed skepticism about AI’s relevance to academic learning, while in the post-survey, 60% indicated a positive change in attitude. These students reported reduced skepticism and an increased openness to AI as a helpful learning tool. Additionally, 100% of students now believe that AI can positively impact their learning experience, up from 70% in the pre-survey. This data reflects the activity’s success in shifting perceptions toward viewing AI as a beneficial, supportive resource.

Anticipated Challenges and Future Engagement

Before the activity, students identified challenges such as formulating effective prompts and ensuring reliable AI outputs. Afterward, 90% reported overcoming these challenges, with many now interested in exploring more advanced applications of AI, such as citation practices and study techniques. Furthermore, all students expressed an intent to continue using AI tools in their college work, showing a strong commitment to applying what they learned beyond this activity.

Conclusion

Overall, the data confirm that the activity met its objectives: it markedly increased confidence, enhanced understanding, positively influenced attitudes, and equipped students with skills to overcome initial challenges. Quantitatively, improvements in confidence (with an average post-activity rating of 4.2 compared to pre-activity averages of 2 to 3) and qualitative shifts in perception strongly support the case that the activity successfully introduced students to AI as a valuable and responsible learning tool in their academic journey.
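As a transparency note, the 4.2 post-activity average follows directly from the reported rating distribution (80% of respondents at 4, 20% at 5). A minimal sketch of that weighted-average calculation, assuming those two shares are the complete distribution:

```python
# Weighted average of post-activity confidence ratings.
# Shares are taken from the reported survey results: 80% rated 4, 20% rated 5.
ratings = {4: 0.80, 5: 0.20}  # rating on a 5-point scale -> share of respondents

average = sum(rating * share for rating, share in ratings.items())
print(round(average, 2))  # 4.2, matching the reported post-activity average
```

The same calculation generalizes to any future survey administration by updating the `ratings` dictionary with the observed shares.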

After analyzing and reflecting on the outcomes, what are the next steps?

Expand the AI Module to Other Courses with a Focus on Integrity and Skill Development:

Building on the success of this activity, a logical next step would be to expand the module to other courses to help students across disciplines effectively and ethically incorporate AI into their academic work. By offering the module in various classes, students would gain context-specific skills that align AI use with their unique fields, enhancing their learning while maintaining academic integrity. This module could introduce foundational AI competencies tailored to each course’s subject matter, enabling students to see AI’s practical and ethical role within specific contexts, such as STEM, humanities, and social sciences.

Integrating this AI module into additional courses could include embedded assignments that require students to use AI tools as part of their research or problem-solving. Assessing their use of AI in a structured environment—along with guided reflections on accuracy, reliability, and responsible usage—would strengthen their ability to critically evaluate AI outputs. This reflective approach would underscore AI as a tool to enhance understanding and reinforce the principle of learning with integrity.

Provide Continuous Resources and Support:

To further reinforce responsible and effective AI use, creating a suite of ongoing resources accessible to students across these courses would be essential. This could include an FAQ, a best practices guide, and tips on verifying AI-generated information, all designed to support ethical, efficient, and accurate AI use. Regular updates to these resources would help students navigate evolving AI tools, ensuring they stay informed on the latest best practices for using AI constructively and responsibly.

Abstract

This study assesses the impact of an educational activity designed to enhance AI tool utilization among CHM151 students, with a focus on promoting ethical, effective use in academic work. Delivered as an in-class Canvas module, the activity aimed to educate students on AI operation, benefits, risks, and practical applications. The module also sought to improve students’ ability to identify inaccurate AI outputs and foster creative, responsible AI applications. Pre-activity survey results showed that 40% of students had prior AI experience, primarily with ChatGPT, while 60% had no prior exposure. Confidence levels ranged from 2 to 3 on a 5-point scale, and students expressed both curiosity and caution, noting anticipated challenges such as prompt generation and avoiding plagiarism. Post-activity findings indicated substantial improvement, with 80% of students rating their confidence at 4 and the remaining 20% at 5, showing that the activity significantly increased comfort and competence. Additionally, 100% of students reported improved understanding and intent to continue using AI tools, with 90% overcoming anticipated challenges. Attitudinal shifts showed a reduction in skepticism, as students increasingly viewed AI as a valuable learning aid. Given these outcomes, the next steps include expanding the module to other courses to provide context-specific AI skills across disciplines. Continuous support through resources, such as an FAQ and best practices guide, will ensure ongoing responsible, constructive AI use in academic settings. These findings underscore the module's efficacy in fostering both confidence and integrity in AI engagement, supporting student-centered learning in a technology-driven academic landscape.

Division/Department
Completed Full Cycle
Yes
Course Number
CHM151
Program Learning Outcomes/Course Level Outcomes