Guiding Questions for MSL Systems:
Data Use and Assessments

Assessment Literacy
An MSL system and its associated processes should ideally help increase assessment and data literacy among educators. A lack of assessment and data literacy can lead to decreased buy-in and inconsistent implementation of your MSL system.

Do you have a plan for increasing assessment and data literacy among your educators?

  • Have you considered the current levels of assessment and data literacy within your district?
  • What plans do you have in place to ensure that limited assessment literacy does not impact educators’ ability and willingness to engage with the MSL system?
  • If your district is using SLOs, do both evaluators/coaches and those being evaluated have the assessment and data literacy necessary to set appropriate targets and evaluate evidence of student learning? If not, how does your district plan to address this need?
  • Does your MSL system enable teachers to implement cycles of inquiry and data-driven instructional practices in meaningful ways? Does participating in the MSL system give teachers an experience that genuinely strengthens their overall assessment and data literacy?
Pre- and Post-Assessments
Measures that are based on growth between a pre-assessment (an assessment given at the start of the year, semester, or unit to measure student proficiency in specific content and skills) and a post-assessment (another assessment given at the end of the year, semester, or unit measuring proficiency in the same content and skills) present special measurement challenges.

Has your district considered the additional difficulties in measuring growth from a pre-assessment to a post-assessment?

  • How does your district plan to calculate growth between the pre- and post-assessment? (Two common approaches are sketched after this list.)
    • In making this determination, have you consulted best practices for measuring student learning over time?
    • If you plan to calculate growth centrally, have you considered the analytical capacity of your district and potential data limitations (e.g., the number of students, or n-size)?
  • It is often inappropriate to use the same instrument as both the pre- and the post-assessment, because most students cannot be expected to know content they have not yet been taught; an identical pre-assessment therefore reveals little about students’ actual starting points. In these circumstances, what is your district’s plan to support educators in developing robust pre-assessments that measure what students should be expected to know at the beginning of instruction?
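The question above about calculating growth can be answered in more than one way. The sketch below is a minimal illustration, not a recommended method: it contrasts a raw gain score with the percent of the remaining gap closed, assuming a hypothetical 0-100 percent-correct scale. The function names and example scores are invented for illustration.

```python
# Illustrative sketch only: two common ways to summarize growth between a
# pre-assessment and a post-assessment. Scores are assumed to be on a
# hypothetical 0-100 percent-correct scale.

def raw_gain(pre: float, post: float) -> float:
    """Simple gain score: post-assessment score minus pre-assessment score."""
    return post - pre

def gap_closed(pre: float, post: float, max_score: float = 100.0) -> float:
    """Percent of the gap between the pre-assessment score and the maximum
    possible score that the student closed."""
    gap = max_score - pre
    if gap <= 0:  # student was already at or above the maximum on the pre-assessment
        return 0.0
    return 100.0 * (post - pre) / gap

# Example: a student moves from 40 to 70 on a 100-point assessment.
print(raw_gain(40, 70))    # -> 30 (points gained)
print(gap_closed(40, 70))  # -> 50.0 (percent of the remaining gap closed)
```

Note that these two summaries can rank the same students differently, which is one reason to consult best practices and to consider analytical capacity and n-size before settling on a district-wide calculation.
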
SLO Process and Supports
Many districts are using Student Learning Objectives (SLOs), where teachers set meaningful goals for learning over time based on the baseline performance of students and identify assessments to measure those goals (teachers may do so individually or use school- or district-determined assessments and targets). SLOs are attractive to districts because they are applicable across grades and content areas, although they require transparency and training/coaching to be implemented in a high-quality way.

Has your district designed the SLO system to ensure credibility with educators?

  • How do your SLO system and processes reflect district priorities and values?
  • How has your district connected the various components of the SLO process to everyday instructional practice?
  • What training and coaching strategies is your district using to ensure that all educators have consistent experiences in implementing and being evaluated on SLOs?
  • Does your district have a process through which teachers share their SLOs with one another to increase transparency and help ensure comparable levels of rigor?
  • Does your district collect sufficient SLO data to evaluate the effectiveness of the SLO process overall?
  • How has your district ensured that the SLO process provides educators with meaningful and actionable feedback on instruction?
  • How are you gathering feedback from educators on the overall process?

How has your district ensured that educators are prepared to select meaningful measures and targets?

  • Has your district ensured that you have a balanced assessment system, aligned to standards and adopted curricula?
  • Does your district have a plan to help educators select rigorous yet attainable targets for their SLOs? (See the Target Setting and Assessment Literacy sections for more information.)
    • What coaching will be available to educators on an ongoing basis?
    • Does your district have a process to vet and evaluate the defensibility and rigor of SLO targets once they have been set? If not, is there a plan in place to develop such a process? (A simple screening check is sketched below.)
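As a purely illustrative companion to the vetting question above, the sketch below shows the kind of simple screening checks a district might run on submitted SLO targets before a coaching conversation. All teacher identifiers, scores, and checks are hypothetical; automated screens of this kind can support, but not replace, evaluator judgment about rigor.

```python
# Illustrative sketch: simple screening checks a district might run on
# submitted SLO targets before a vetting or coaching conversation.
# All identifiers, scores, and thresholds are hypothetical.

slos = [
    {"teacher": "T-101", "baseline": 42, "target": 65, "max_score": 100},
    {"teacher": "T-102", "baseline": 58, "target": 55, "max_score": 100},   # target below baseline
    {"teacher": "T-103", "baseline": 70, "target": 105, "max_score": 100},  # target above the maximum
]

for slo in slos:
    flags = []
    if slo["target"] <= slo["baseline"]:
        flags.append("target does not exceed baseline")
    if slo["target"] > slo["max_score"]:
        flags.append("target exceeds the assessment's maximum score")
    status = "; ".join(flags) if flags else "passes initial screen"
    print(f'{slo["teacher"]}: {status}')
```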

What systems does your district have in place to make the SLO process as easy as possible for teachers and principals to implement?

  • Have you considered distributive leadership models, in which responsibility for training and support does not rest solely with one group (e.g., principals)?
  • What data collection tools have you provided to teachers and principals to help them better understand and manage the data involved in their SLOs?
  • Does your district provide SLO templates to facilitate the creation of high-quality SLOs? (A minimal sketch of what such a template might capture follows this list.)
  • Does your district provide examples to help teachers develop high-quality SLOs?
  • Have you considered how to align your district’s SLO process with other data-driven instructional practices, such as professional learning communities, data teams, or other processes through which educators analyze student work?
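As a hedged illustration of the templates mentioned above, the sketch below defines a minimal record with fields commonly found in SLO templates (learning goal, baseline evidence, target, assessments, interval of instruction). The field names and example values are hypothetical and do not represent a prescribed CDE format.

```python
# Illustrative sketch: a minimal data structure an SLO template might capture.
# Field names and example values are hypothetical, not a prescribed format.
from dataclasses import dataclass, field

@dataclass
class SLORecord:
    teacher: str
    grade_and_content: str
    learning_goal: str      # what students should know or be able to do
    baseline_summary: str   # evidence of students' starting points
    target: str             # rigorous yet attainable expectation for growth
    assessments: list[str] = field(default_factory=list)  # measures of the goal
    interval: str = "Full academic year"

example = SLORecord(
    teacher="T-101",
    grade_and_content="Grade 7 mathematics",
    learning_goal="Solve multi-step problems involving ratios and proportional relationships.",
    baseline_summary="District interim assessment: class median of 38% correct on ratio items.",
    target="At least 75% of students score proficient or higher on the end-of-course ratio assessment.",
    assessments=["District interim assessment", "End-of-course assessment"],
)
print(example.teacher, "|", example.grade_and_content, "|", example.interval)
```
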
State Summative Assessments
Although the S.B. 10-191 rules require the use of state summative assessment data when available, districts must make several decisions about how those data are used. In its report Using Student Growth Percentiles for Educator Evaluations at the Teacher Level, the National Center for the Improvement of Educational Assessment offers technical guidance on the use of student growth percentiles (SGPs) in educator evaluations. (An executive summary is also available.)

How has your district addressed the technical issues associated with using state summative assessments in educator evaluation systems?

  • Has your district consulted guidance from CDE regarding the timing and use of state summative assessments in educator evaluations?
  • What are your plans for transitioning to new statewide assessments (e.g., PARCC or SAT)?
  • Has your district considered whether your measures capture growth or achievement? By focusing only on CMAS growth, are you inadvertently excluding:
    • Other state assessments (e.g., social studies or science, ACT, DLM, ACCESS)?
    • Grades (e.g., K through third grade)?
    • Students (e.g., new or transfer students from out of state, students missing prior year(s) of data, students who do not take state assessments)?
    • How is your MSL system accounting for these students and assessments? (An illustrative roster-level sketch follows this list.)
  • If you are relying on achievement measures, have you considered possible unintended consequences from the use of achievement or status measures (e.g., impacting the recruitment and retention of teachers in low-achieving schools)? (See the Unintended Consequences section for more information.)
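As an illustrative companion to the roster questions above, the sketch below aggregates hypothetical student growth percentiles into a median growth percentile for a single teacher and shows how students without prior-year scores simply drop out of the calculation. The roster data are invented, and this is only one possible aggregation approach, not a required method.

```python
# Illustrative sketch: aggregating student growth percentiles (SGPs) into a
# median growth percentile (MGP) for one teacher's roster. Students without a
# prior-year score have no SGP and fall out of the calculation entirely.
from statistics import median

# Hypothetical roster: (student id, SGP or None when no prior-year score exists)
roster = [
    ("s01", 62), ("s02", 35), ("s03", None),  # s03 transferred in from out of state
    ("s04", 48), ("s05", 71), ("s06", None),  # s06 is missing prior-year data
]

sgps = [sgp for _, sgp in roster if sgp is not None]
mgp = median(sgps)
excluded = len(roster) - len(sgps)

print(f"MGP based on {len(sgps)} of {len(roster)} students: {mgp}")
print(f"Students excluded for lack of prior-year data: {excluded}")
```
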
Use of SPF or DPF Data
Many districts use data from the School Performance Framework (SPF) or District Performance Framework (DPF) in their MSL systems because they are readily available and easily satisfy the legislative requirements to include Colorado state summative assessment results and Colorado growth data, when available.

Has your district consulted CDE’s guidance document, Using Colorado School/District Performance Frameworks in an Educator’s Body of Evidence for Evaluation?

  • Have you considered the multiple ways to use the data from the SPF or DPF, as outlined in the guidance from CDE (e.g., a straight rating or the change in percent of points earned; the latter is sketched after this list)? Some options include:
    • Teachers selecting specific areas of the SPF that align to their content area.
    • Schools selecting areas that align to the school or district UIP.
  • In selecting your measures related to the SPF or DPF, have you considered whether each measure captures student growth (e.g., student growth percentiles, or SGPs) or student achievement (e.g., the percent of students scoring proficient or advanced)? While either choice is defensible, it is important to be purposeful about this decision and to weigh the potential unintended consequences.
  • Have you considered whether other collective measures, such as UIP goals or team-based SLOs, are more aligned to the goals and values of your MSL system?
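One of the options named above, the change in percent of points earned, is straightforward arithmetic. The sketch below uses invented point totals to show how a school’s year-over-year change might be computed from SPF point data; the values are hypothetical, and the frameworks themselves define the eligible points.

```python
# Illustrative sketch: computing the change in percent of points earned on a
# School Performance Framework (SPF) from one year to the next.
# The point totals below are invented for illustration only.

def percent_of_points(earned: float, eligible: float) -> float:
    """Percent of eligible framework points that a school earned."""
    return 100.0 * earned / eligible

prior_year = percent_of_points(earned=41.2, eligible=68.0)    # hypothetical totals
current_year = percent_of_points(earned=47.6, eligible=68.0)  # hypothetical totals

change = current_year - prior_year
print(f"Prior year: {prior_year:.1f}% of points earned")
print(f"Current year: {current_year:.1f}% of points earned")
print(f"Change: {change:+.1f} percentage points")
```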
 
