Artificial intelligence (AI) has become woven into our lives. The recent rapid growth in AI capabilities makes it increasingly difficult to deny AI’s influence in the math classroom (NCTM, 2024). Proficiency with technological tools, including knowing how and when to use a particular tool, is vital in supporting reasoning and sense-making (AMTE, 2017), making it imperative for mathematics teachers to have a solid understanding of the capabilities and limitations of available AI tools. The National Council of Teachers of Mathematics (NCTM, 2024) advocates that, as AI advances, teachers supplement their knowledge of instruction and assessment with the ability to integrate AI tools.
To extend preservice teachers’ (PSTs) understanding of AI, ChatGPT was used following a discussion on pattern recognition and extension in geometry. The purpose of this paper is to explain how PSTs worked cooperatively to solve a geometry task, utilized ChatGPT to recognize and extend a pattern, and identified implications for teaching. First, the use of chatbots in education will be examined. Next, the task and student work will be described. Finally, the PSTs’ reactions to ChatGPT will be discussed.
Background
One method of integrating AI into instruction is the AI-supported paradigm, in which the learner is a collaborator and the interaction between the student and AI is bidirectional: the learner provides additional information and the AI revises its responses (Ouyang et al., 2022). Chatbots that utilize artificial intelligence (such as ChatGPT) provide this back-and-forth social dynamic, using the learner’s input to shape the response (Hwang & Chang, 2021; Ouyang et al., 2022). Although the development of chatbots dates back to the 1960s (Pierce, 2024), current iterations show a new landscape of possibilities (Hwang & Chang, 2021).
Using ChatGPT provides an opportunity for student-directed dialogue, which plays a vital role in developing mathematical understanding. Based on recent research with culturally and linguistically diverse students (Çakmak, 2022), there is reason to believe that conversations with a chatbot can lessen anxiety.
The Task
This task occurred in an undergraduate math education course with PSTs seeking elementary or middle-grades certification. Groups discussed this scenario: Someone created 3D models of prisms. She explained that one had exactly 20 edges. How would you respond?
All agreed that this model was impossible based on their previous work creating models, noting that the total number of edges was always triple the number of edges of one base. Groups were able to choose any chatbot; interestingly, all chose ChatGPT, which at the time was the free version, ChatGPT 3. Groups queried the chatbot, asking if a prism could have exactly 20 edges, and conversations continued until a correct answer was reached or the responses became repetitive. Initial responses stated it was possible (usually neglecting to count the edges of one of the bases). The PSTs asked clarifying questions to prompt ChatGPT to make corrections. For example, PSTs named a specific prism and asked for its total number of edges, such as: “How many edges does a triangular prism have?” They would then continue to build the pattern by asking about the next-sized prism (triangular prism, square prism, pentagonal prism, etc.). When a PST stated that 20 was not part of the pattern, ChatGPT apologized and cited calculation issues. Some initial responses included directions for constructing a pyramid instead of a prism, stated non-whole numbers for the number of edges, or omitted some edges.
The groups receiving the most misinformation began their conversations with a direct question such as “Can a prism have 20 edges?” Conversely, groups that focused on establishing the pattern first had more reliable responses. For example, these groups began by asking ChatGPT to calculate the number of edges of a triangular prism. They continued by increasing the number of edges on the base. Responses reflected the pattern of a prism’s total edges, which is triple the number of edges of the base. After the pattern was established, groups asked if it was possible to have a total of 20 edges, to which ChatGPT replied that it was not possible.
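The pattern the groups established can be verified directly: a prism over an n-sided base has n edges on each of its two bases plus n lateral edges, for a total of 3n. The following short script (a minimal sketch for illustration; it was not part of the PSTs’ task, and the function name is my own) confirms why 20 edges is impossible:

```python
def prism_edges(base_sides: int) -> int:
    """Total edges of a prism: n on each base plus n lateral edges = 3n."""
    if base_sides < 3:
        raise ValueError("a polygonal base needs at least 3 sides")
    return 3 * base_sides

# The pattern the groups built with ChatGPT, one prism at a time:
print(prism_edges(3))  # triangular prism -> 9
print(prism_edges(4))  # square prism (cuboid) -> 12
print(prism_edges(5))  # pentagonal prism -> 15

# 20 is not a multiple of 3, so no prism has exactly 20 edges.
print(any(prism_edges(n) == 20 for n in range(3, 21)))  # False
```

Because every total is a multiple of 3, the question “Can a prism have 20 edges?” reduces to whether 20 is divisible by 3, which it is not.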
Methods
This task was completed in the spring of 2024 with 18 undergraduate education students. The PSTs first established, without the use of technology, the pattern that a prism’s total number of edges is triple the number of edges of its base. Then they asked a chatbot if it was possible for a prism to have 20 edges and saved their conversation with the chatbot. Each PST submitted their conversation and documented their experience in a short paper explaining their process and reactions. Finally, a whole-class discussion was held focusing on their perceptions of the AI-generated responses. The discussion was guided by two initial questions. First, explain the “thinking” reflected in the AI-generated response. Second, explain how you framed your questions to arrive at a correct response, or explain why you were not able to guide the chatbot to a correct answer. It was during this discussion that the PSTs unexpectedly made the connection to working with struggling students.
PSTs’ Reactions
During and after the assignment, the PSTs reported being intrigued by ChatGPT’s responses. As the groups shared their approaches and compared responses, they agreed that ChatGPT typically began a response with some correct background information, including the definition of a prism and a description of its characteristics. They did not anticipate answers that substituted pyramids for prisms or contained non-whole or negative numbers. They reported that during their conversations they sometimes felt motivated by the sense of “breaking” the chatbot. When questioned, they elaborated that it is easy to assume the chatbot always gives correct information; this was a reminder of the need to be critical in their thinking and evaluation.
Some groups reported they felt like they were talking to a struggling student. One group of PSTs described the chatbot as “acting like a confused, unsure student needing a series of questions to arrive at a correct answer.” These groups tailored their questioning to include prompts pushing ChatGPT to recognize errors. Prompts included reminding the chatbot that they were focusing on a prism and referring to specific prisms.
After this task, groups chose to explore other topics with ChatGPT. Topics included questions about pyramids, Euler’s formula, concavity, and symmetry.
PST Reflections
The discussions in class and written reflections were rich and interesting. Fifteen of the eighteen PSTs reported familiarity with using ChatGPT for brainstorming and writing tasks. Ten of the eighteen PSTs reported using AI tools but never for math. All reported using or being aware of PhotoMath, but several were unsure if PhotoMath was considered AI.
Overall, the PSTs were surprised that, after establishing the pattern, ChatGPT was not able to think in reverse when they supplied the shape of the polygonal base and asked the chatbot to calculate the total number of edges in the prism. One PST expressed surprise “because all these AI things are supposed to be correct and give good reasoning.” Another shared, “I honestly am surprised that the AI just kept giving me wrong answers and was trying to justify them.” They did not expect ChatGPT to offer justifications for its erroneous responses. The PSTs explained that if they had not known the correct answer before using ChatGPT, they would have believed the errors because they perceived confidence and authority in the responses. One student stated, “I truly thought that AI would have the programming to answer mathematical questions correctly.” Another “was surprised to see how easy it was for the AI to get a question wrong. This makes it very clear that AI cannot be relied on completely.” Some PSTs doubted the mathematical abilities of ChatGPT, but they were also curious about how ChatGPT differs from PhotoMath and how it arrives at answers. They admitted that they had not previously thought about how different forms of AI work.
An interesting observation made by the PSTs was their perception of the conversation. This task was not designed to practice asking leading questions or using prompts to identify errors or arrive at correct answers. Yet that did occur. The PSTs concluded that by examining the word choice in their questions and prompts, they influenced the responses. One PST summarized the thoughts of their group, explaining, “AI did correct itself once I asked it questions about its mistake.” They recognized that they were practicing questioning skills they could use with students. The conversations and written reflections demonstrated unanticipated growth as the PSTs assumed the mindset of a teacher, raising questions regarding the appropriate use of chatbots and AI in the classroom, including how and when to plan for their use. The PSTs moved the conversation from focusing on a specific math task to how to best meet the needs of students in a responsible way.
Conclusion
This experience was rewarding as a teacher educator, and the PSTs described this as their favorite and one of the most intriguing tasks. They exhibited elevated motivation and exceeded expectations. They questioned the mathematical abilities of ChatGPT and developed a curiosity about how math AI tools vary. They also raised questions about the appropriate use of chatbots and AI in the classroom.
The most satisfying part was witnessing an obvious shift from the perspective of a student to the perspective of a teacher in the discussion. Their experience with this task went beyond their development of mathematical understanding. Their conversations and reflections showed meta-cognitive processes as they described the impact of their word choices on the responses received. They also made a clear connection about how this experience helped them consider how they interact with students who are having difficulties. As recent research (Çakmak, 2022) has examined how chatbots can be beneficial for culturally and linguistically diverse students to practice their developing language skills, I look forward to seeing how this can be expanded for students to practice their mathematical communication and for PSTs to practice their questioning skills.
References
Association of Mathematics Teacher Educators. (2017). Standards for preparing teachers of mathematics. https://amte.net/standards
Association of Mathematics Teacher Educators. (2022). Position of the Association of Mathematics Teacher Educators on technology. https://amte.net/sites/amte.net/files/AMTE%20Technology%20Statement%20Oct%202022_0.pdf
Çakmak, F. (2022). Chatbot-human interaction and its effects on EFL students’ L2 speaking performance and anxiety. Novitas-ROYAL (Research on Youth and Language), 16(2), 113–131.
Hwang, G.-J., & Chang, C.-Y. (2021). A review of opportunities and challenges of chatbots in education. Interactive Learning Environments, 31(7), 4099–4112. https://doi.org/10.1080/10494820.2021.1952615
National Council of Teachers of Mathematics. (2024). Artificial intelligence and mathematics teaching. National Council of Teachers of Mathematics.
Ouyang, F., Jiao, P., McLaren, B. M., & Alavi, A. H. (Eds.). (2022). Artificial intelligence in stem education: The paradigmatic shifts in research, education, and technology. Taylor & Francis Group.
Pierce, D. (2024, February 28). From Eliza to ChatGPT: Why people spent 60 years building chatbots. The Verge. https://www.theverge.com/24054603/chatbot-chatgpt-eliza-history-ai-assis...