
Evidence Set Three (TPA 5)

Introduction

 

Evidence Set Three is taken from TPA 5, which covers the assessment of student learning and the reporting of feedback. This evidence set examines the 3D shapes unit from the previous evidence set (Evidence Set One (TPA 3)). During this unit, diagnostic, formative, and summative assessments were undertaken to establish where students' understanding of 3D shapes sat, to evaluate their learning and growth throughout the unit, and to consolidate this understanding through a practical summative task. Throughout the evaluation of the assessment process, I studied two students with learning needs, one high performing and one in need of specific support, relating their results to previously recorded mathematics results to measure their growth and understanding in this learning area.

 

The CARES model frames Evidence Set Three (TPA 5) and draws reference to APST Focus Areas 3.6, 5.1, 5.2, 5.3, 5.4, 5.5, and 7.3.

Context

Evidence Set Three (TPA 5) took place in the same classroom as the prior evidence sets, with students participating in a mathematics unit on 3D shapes. Before creating this unit, I studied the students' PAT-M results from the prior year, noting their strengths and weaknesses, adding these to their individual learner profiles, and noting the needs of the two case study students used as examples of the effectiveness of my teaching in this learning area. These two students had quite different backgrounds: one was a high-performing male student from a farming family in South Australia's Mid-North region, and the other an immigrant student from a working-class family with ESL differentiation needs. The needs of both students were kept in mind throughout the unit, and their results were analysed and used to demonstrate its effectiveness.

 

For this unit, I wanted the assessment process not to echo students' prior negative experiences with assessment in mathematics, namely traditional tests and NAPLAN-style standardised tests. As such, the assessment process for this unit involved gathering formative assessment in the moment, offering multiple methods for students to demonstrate their understanding, and creating a summative experience that gave students an enjoyable way to demonstrate their understanding both at the abstract level and in practice.

 

Below are the annotated artefacts from Evidence Set Three (TPA 5) on the interpretation of student data and the evaluation of teaching strategies conducted regarding student assessment.

Context Artefacts

image.png

*Artefact 3.1: Learner Profiles and PAT-M results of two case study students within the TPA tasks, used to craft the assessment strategies for the altered learning program. (Adjusted for student privacy). [APST 3.6]

image.png

Through this section of Evidence Set Three (TPA 5) there are two main areas of discussion: first, the application of feedback in the classroom, and second, the assessment of student learning. Giving feedback within the classroom followed a systematic approach, using a research-supported three-step process (Hillman and Stalets, 2019):

 

  1. Comment on the strength of a demonstrated skill

  2. Connect the skill to the learning target/standard

  3. Clearly communicate the next step to the learner.

 

This framework streamlined the feedback process, letting students know what they were doing well, where they were at, and where they had to go next within a learning area. This method was mostly successful, but upon reflection it required some differentiation and alteration for specific students who did not always respond well to steps two or three of the process.

 

The assessment process came in three parts: the diagnostic, formative, and summative sections of the unit. Students were given a diagnostic test at the onset of the prior unit, which probed their understanding of 2D and 3D shapes. Students were asked to complete the test to the best of their ability, and I gradually put hints and answers up on the board as the test went on, noting who completed what at which stage. This gave me an understanding of where students were in the learning area, and where I needed to take them moving forward. Students were then regularly formatively assessed using game-like activities, teacher questioning, and teacher observations, which were consolidated into a rolling three-point rubric divided into ‘Above and Beyond,’ ‘On Target,’ and ‘Not There Yet,’ based on Van de Walle et al. (2019).

This led to the final assessment item, a practical assessment in which students designed and built a ‘robot’ using their understanding of 3D shapes. The hook was that the robot prototypes would go to the Australian Government for potential construction. Students used printed 3D shape nets, paper, and other crafting materials to make their robots, and some students were given specific scaffolding to ensure they met the standards set for the assessment of learning. Students were given opportunities to demonstrate their understanding through the planning of their robot, through articulating the features of 3D shapes at the conclusion of the planning process, and through the robot itself, which demonstrated each student’s understanding of 3D shapes.

 

Below are the annotated artefacts from Evidence Set Three (TPA 5) that demonstrate the assessment decisions made on student learning, and the feedback given to students on their learning throughout the unit.

Action

Action Artefacts

image.png
image.png
image.png

*Artefact 3.2: Written research-based explanations of assessment strategies within TPA 5. [APST 5.1]

“[Student Two], in your design I can see you considered and fulfilled the prompts of the task well, and on the planning sheet you have indicated which 3D shape is which. Now [Student Two], we need to describe the properties of each 3D shape to demonstrate that we know which 3D shape is which.” (Greenslade to Student Two, 2024)

image.png

*Artefact 3.3: Research-based framework of feedback utilised during the TPA 5 and a written example of the feedback in use, following the model, whilst also using the feedback to infer consistent and comparable judgements on student work. (Adjusted for student privacy) [APST 5.2, 5.3]

Results

Throughout Evidence Set Three (TPA 5), student work was assessed, moderated, given feedback on, and reported upon. This was done by comparing students’ growth across the unit from their initial diagnostic results to the rolling formative assessment rubric. Focus was given to the two case study students within the assessment process, evaluating their progress against their PAT-M tests from the previous year. All students were assessed against the four-point assessment rubric for mathematics (Van de Walle et al., 2019), as outlined below:

 

  • Excellent: The student shows clear understanding, can fluently communicate concepts in multiple representations, and shows evidence of using ideas unprompted.

  • Proficient: The student understands concepts, can communicate concepts in multiple representations and uses designated models.

  • Marginal: The student understands concepts with some support, communicates concepts in limited representations, and uses scaffolded models.

  • Needs Work: The student does not understand concepts, can only communicate concepts in highly supported representations, and can only use scaffolded models with major teacher support.

 

The results were positive, with all students demonstrating growth within the learning area. Students were engaged with the summative activity, and a general sense of fun was evident across the lessons. Both case study students showed clear improvement, each demonstrating a clear understanding of 3D shapes and their properties. The student with ESL differentiation needs showed remarkable improvement in verbally articulating her understanding of 3D shapes and their properties, a massive win for her and for me. I was glad to share in her enthusiasm during the assessment, which I believe contributed to the learning.

 

I found the assessment process difficult but rewarding. Moderating work with my mentor teacher was valuable, and through this experience I came to understand how to use a wide base of evidence to build a clear picture of what students know and can demonstrate in a learning area. Having gathered a wide set of evidence proved invaluable for forming a clear picture of each student’s understanding and placing them within an appropriate grade band, using the four-point rubric mentioned above. Each student was given a report on their results and was spoken to individually about their work, as well as presented with a certificate of achievement from the Australian Government’s Robotics Application Division (R.A.D., the fictional hook for the assessment) to take home.

 

Below are the annotated Artefacts from Evidence Set Three (TPA 5) demonstrating the interpretation of student data, reporting of student achievement, making consistent and comparable judgements, and giving feedback to students on their achievement.

Results Artefacts

image.png
image.png

*Artefact 3.4: Within this artefact, the comparison and interpretation of student data is evident, as well as the reporting of student achievement using the rubric on CANVAS for parents and caregivers to access. (Adjusted for student privacy). [APST 5.3, 5.4, 5.5]

image.png
image.png
image.png
image.png

*Artefact 3.5: The written comparisons and judgements made on the assessments of the two case study students within the TPA tasks. [APST 5.3, 5.4]

image.png
image.png

*Artefact 3.6: Formative and summative assessment products that were assessed as part of TPA 5. (Adjusted for student privacy). [APST 5.3, 5.4]

Evaluating Evidence Set Three (TPA 5), there are some clear successes and limitations to my practice in assessing and reporting student work. The successes are evident in the sections above: the clear sequencing of assessment, the wide collection of data, and students’ clear understanding of what was expected of them throughout the unit, built through systematic and clear feedback. This allowed students to succeed in the learning and aided me in the marking and moderating process. I then reflected on this to better understand where my own limitations lay, and where I needed to develop professionally to improve this part of my practice.

 

The main area of improvement identified through this assessment process is the involvement of parents and caregivers. While some attempts were made to engage them with the results of the assessment process, through the reporting of grades and the take-home elements of the certificate and robot, building a community space within the classroom that reaches beyond the school will require me to develop stronger methods of connecting parents and caregivers with the learning, and in this unit that element was clearly lacking.

 

Below are the annotated artefacts from Evidence Set Three (TPA 5) on the successes and limitations of this learning experience, namely the integration of parents and caregivers into the learning process, and the successes of assessing and reporting student work.

Evaluation

Evaluation Artefacts

*Artefact 3.7: Reflection made on TPA practice regarding parents and caregivers’ involvement, and how to further build upon this standard in future practice. [APST 7.3]

“Within the final part of this TPA requirement, parents were shown images of the results of the mathematics unit, both their student’s grade and product, through CANVAS. While this is a fine strategy for now, future practice should inquire as to how further community involvement could bring parents and caregivers into this learning space. Perhaps the planning process could become a homework element, where students build an aid for a parent or caregiver’s job? All food for thought.” (Greenslade, 2024)

Standards


 

Within Evidence Set Three (TPA 5), the necessary APST focus areas are addressed, and examples of their implementation are evident within the above CARES model. Below is a list of each focus area and how it is addressed through Evidence Set Three (TPA 5).

 

3.6. Evaluate and improve teaching programs

5.1. Assess student learning

5.2. Provide feedback to students on their learning

5.3. Make consistent and comparable judgements

5.4. Interpret student data

5.5. Report on student achievement

7.3. Engage with the parents/carers
