2.8: Data Analysis
Candidates model and facilitate the effective use of digital tools and resources to systematically collect and analyze student achievement data, interpret results, communicate findings, and implement appropriate interventions to improve instructional practice and maximize student learning. (ISTE 2h)
Artifact: ITEC 7305 Data Overview / Screencast of Data Overview
This artifact collates three years of MYP Year 2 (7th Grade) performance data from the Educational Records Bureau (ERB) testing. The data were placed into Excel spreadsheets to create a series of graphs illustrating performance across the testing areas and to better understand student performance. The interpreted results and findings were communicated through a PowerPoint presentation and a screencast, with a view to generating discussion on improving instructional practice for the next testing cohort and putting strategies in place to maximize student learning outcomes and performance on this test.
This artifact demonstrates mastery of working with “big data”: looking critically at student achievement in one particular testing area in a large international school. Systematic collection of the data was assisted by collaboration with the Head of Curriculum and Professional Development, who furnished me with the data for the three years that Atlanta International School had been testing with this instrument. I also took into consideration one year of 5th Grade ERB data, since that cohort would have results available for 2013, making it possible to track the progress of the teaching and learning interventions put in place in light of the 2011 data. I arranged the data into composite bar charts for ease of interpretation and clear presentation. Reading comprehension, mathematics, and quantitative reasoning, for example, emerged as areas for focused improvement in the 5th Grade in 2011. Targeted strategies were therefore put in place during the students’ 6th Grade Middle School experience, and two of the three areas (mathematics and quantitative reasoning) improved over that period. However, reading comprehension continued to decline, signaling that the interventions did not achieve the desired results; in fact, students’ reading levels declined across the board. These findings were communicated in the PowerPoint and the screencast, and suggestions and recommendations for possible interventions were offered at the conclusion of the presentation.
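The composite-bar comparison described above amounts to checking each testing area's direction of change across the three years. A minimal sketch of that trend check in Python (the scores below are invented for illustration only; the actual ERB data lived in the Excel spreadsheets and are not reproduced here):

```python
# Hypothetical ERB-style scaled scores per testing area, one value per year
# (2011, 2012, 2013). These numbers are illustrative, not the real data.
scores = {
    "Reading Comprehension": [331, 327, 322],
    "Mathematics": [340, 344, 349],
    "Quantitative Reasoning": [335, 336, 341],
}

def flag_trends(data):
    """Label each testing area as 'improved' or 'declined' over the window."""
    flags = {}
    for area, yearly in data.items():
        change = yearly[-1] - yearly[0]  # net change, last year minus first
        flags[area] = "improved" if change > 0 else "declined"
    return flags

print(flag_trends(scores))
```

With these illustrative numbers, the check flags reading comprehension as the declining area, mirroring the pattern the composite bars made visible.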
Having learned from these data as a classroom teacher, I adjusted my teaching for students in the 8th Grade (Year 3 of the MYP programme), supporting a range of reading levels with differentiated non-fiction texts for disciplinary reading. These interventions have continued in my instructional technologist role: I advise faculty to use a diversity of reading levels in their differentiated instruction and have accommodated assistive technology (in the form of readers and digital annotation tools) for students. The artifact became a valuable instructional tool for me personally, as the data helped me identify where to focus teaching and learning to produce the improvement these testing outcomes called for. In the future, should the opportunity arise again, I would share my findings with the wider faculty at a Professional Development session, as these big data are not currently analyzed (to my knowledge) in a whole-school way to inform teaching and learning at Atlanta International School. I believe that my response to this method of data collection and analysis supported school and student improvement in one small and very siloed subject discipline, 8th Grade (MYP Year 3) Geography, in the 2013–2014 school year. Notably, I taught forty of the ninety students from the original test cohort (although three of those students were new to the 8th Grade) and observed that, following a semester of deliberately differentiated non-fiction readings, MYP criterion-referenced levels pertaining to reading comprehension did improve.