The purpose of this assessment was to discern how artificial intelligence (AI) affects student learning, as demonstrated by students' ability to engage with AI to craft an academic goal that aligns with the specific, measurable, attainable, realistic, and time-bound (SMART) goal criteria.
With the rapid development and expansion of generative AI chatbots in recent years, there is a need to prepare students to use these tools ethically and effectively. Within the FYE101 course, we have also recognized that students struggle to set SMART goals, usually lacking one or more criteria in the SMART framework. Thus, there was an opportunity for students to use AI tools to improve their SMART goals.
In the FYE101 course, students are asked to create a short-term academic SMART goal, identify the benefits and costs of their goal, brainstorm specific objectives and obstacles to achieving their goal, and consider how they might overcome those obstacles and realize the rewards.
To integrate AI into this assignment, students were asked to create a SMART goal, select an AI tool, use AI to refine their SMART goal, and submit a reflection on their experience using AI. The assignment was evaluated on (a) the first draft of the SMART goal, (b) the thoughtfulness of the questions they asked the AI tool, (c) their identification of errors in the AI's responses, (d) their integration of personal anecdotes into the goal, (e) the final goal incorporating the AI's suggestions, (f) a 3-4 sentence reflection, and (g) submitted evidence of their AI conversation.
Comparing these scores to students' scores on SMART goals in sections I taught in the fall 2023 semester demonstrates the promise of helping students use AI technology. Looking specifically at the rubric criterion of how well students' goals met the SMART framework: (1) in fall 2023, students' SMART goals averaged 81.04%; (2) in fall 2024, students' post-AI SMART goals averaged 91.82%. These results suggest that AI's immediate feedback may help students strengthen their SMART goals and complement what they learn in the classroom.
While the quantitative data suggest promise for this technology in our classroom, the qualitative feedback from students offered insight into their learning process. Students generally found AI tools beneficial for refining and personalizing their SMART goals, but they also recognized the importance of using the tool as a guide rather than a crutch. They appreciated AI's ability to organize their goals but remained aware of its limitations in providing deeply personalized advice. Further research across courses and sections could deepen our understanding of the benefits and limitations of these tools.
These results are not without their limitations; three come to mind. First, the post-AI scores would intuitively be higher simply because students used AI to strengthen their initial SMART goals. However, analysis of students' conversations with their AI tool of choice shows thoughtful questions about the tactics and strategies they might use to accomplish their goals, suggesting engagement beyond mere polishing.
Second, skeptics might argue that AI did not change what students learned about goal setting but instead let them shortcut their own critical thinking. However, a content analysis of students' reflections showed that many found AI helped them make their goals more specific and clear faster than the traditional back-and-forth with instructors. For example, they appreciated how ChatGPT and Microsoft Copilot suggested ways to enhance their original SMART goals, especially by adding measurable steps and clear deadlines. Students often mentioned that AI encouraged them to break tasks into smaller, more manageable components, making their goals more actionable. How might we as instructors, then, leverage AI tools as a form of embedded support in our classrooms?
Lastly, these data could be limited by my (the instructor's) bias toward the potential value of AI tools and how that bias might have shaped the development of the rubric. To mitigate this, I consulted colleagues while developing the rubric to ensure it measured the important criteria. In the future, using multiple raters to calibrate the rubric and establish strong interrater reliability would be valuable.
This assessment aimed to evaluate how artificial intelligence (AI) affects student learning by enhancing students' ability to craft specific, measurable, attainable, realistic, and time-bound (SMART) goals. Students in FYE101 often struggle with creating SMART goals, and this course assignment had students use AI to support the refinement of their drafted goals. Students were tasked with developing an academic SMART goal, engaging with an AI tool of their choice to improve it, and reflecting on their experiences with these tools. Results suggest that AI contributed to improvement, with post-AI goals averaging 91.82%, compared to 81.04% in a previous semester without AI use. A content analysis of students' reflections demonstrated that they found AI useful for refining their goals, but they recognized its limitations and stressed the importance of using it as a guide, not a substitute for critical thinking. These findings are not without their limitations, and there is a need for further exploration of how to harness AI to support student learning. Moving forward, revising the assignment, expanding the scope of AI in FYE101, and developing an AI module are recommended steps to deepen our understanding of AI's role in academic and personal growth.