Background on State Assessment System
HB14‐1202 Standards
and Assessments Task Force
Joyce Zurkowski, Executive Director of Assessment
Advance Organizer
 Assessment Continuum
Formative versus interim versus summative
 Legal requirements for assessments
 State
 Federal
 Transition to Colorado Measures of Academic Success
 Test Development Process
 Current Plans for 2014‐2015
Assessment Continuum
Formative, interim and summative assessment
State role and local role
Legal Requirements:
State Summative Content Assessments
Every student enrolled in a public school shall be required to take the assessments administered at the grade level in which the student is enrolled:
 English language arts ‐ grades 3‐11
 Mathematics ‐ grades 3‐8 and three high school assessments
 Science ‐ once in elementary, once in middle school and once in high school
 Social studies ‐ once in elementary, once in middle school and once in high school
 Standardized, curriculum‐based, achievement, college entrance examination ‐ grade 11
Legal Requirements:
Additional State Assessment Requirements
 Participation as a governing board member in a multi‐state assessment consortium and use of assessments produced by the consortium
 English language proficiency assessments
 READ Act – diagnostic and interim assessments, grades K‐3
 School readiness – kindergarten (2015‐16)
Legal Requirements:
Federally Required Assessments
 Content assessments: annual
 Reading/language arts and mathematics ‐ all students in grades 3‐8 and once in grades 10‐12
 Science ‐ all students once in grades 3‐5, 6‐8 and 10‐12
 English language proficiency assessments: annual
 All students who are identified as English language learners
 National Assessment of Educational Progress (NAEP): every other year
 Reading and math ‐ sample of schools in 4th and 8th grade
Legal Requirements:
Federal Requirements for Content
Assessments
 Be the same academic assessments used to measure the achievement of all children;
 Be aligned with the State's academic content and achievement standards;
 Be of technical quality and be consistent with relevant, nationally recognized professional and technical standards;
 Involve multiple measures (i.e., item types), including measures that assess higher‐order thinking skills and understanding; and
 Provide for accommodations for students with disabilities and English language learners
Transition to Colorado Measures of Academic Success
(CMAS)
Transition to CMAS
2012‐2013 (TCAP):
 TCAP Reading and Writing (Grades 3‐10)
 TCAP Mathematics (Grades 3‐10)
 TCAP Science (Grades 5, 8 and 10)
2013‐2014 (TCAP/CMAS):
 TCAP Reading and Writing (Grades 3‐10)
 TCAP Mathematics (Grades 3‐10)
 New: CMAS Science (Grades 5 and 8)
 New: CMAS Social Studies (Grades 4 and 7)
2014‐2015 (CMAS):
 New: CMAS English Language Arts (Grades 3‐11)
 New: CMAS Mathematics (Grades 3‐8 and three high school assessments)
 New: CMAS Science (Grades 5, 8 and 12)
 New: CMAS Social Studies (Grades 4, 7 and 12)
Test Development Process (Science and Social Studies)
Standards Development (2009)
Standards Adoption (Dec. 2009)
Stakeholder and Subcommittee Recommendations (Fall 2010)
Assessment System Attributes Adopted (Nov./Dec. 2010)
Request for Proposal (RFP) Process (Jan. ‐ May 2012)
Contract Process (June ‐ Dec. 2012)
Test Development Process (Science and Social Studies)
Assessment Framework Development (2012‐2013)
Item Type and Distribution Targets (2012‐2013)
Item Writing for First Administration (Fall 2012 ‐ Spring 2013)
Item Review for First Administration (Winter ‐ Summer 2013)
(Content and Bias/Sensitivity)
Field Test Form and Ancillary Development (Winter ‐ Summer 2013)
Test Development Process (Science and Social Studies)
Field Test for First Administration (Spring and Fall 2013)
Rangefinding for Field Test (Summer and Fall 2013)
Item Scoring of Field Test (Summer and Fall 2013)
Data Review of Field Test (Summer and Fall 2013)
Forms and Ancillary Development (Winter 2014)
First Administration (Spring 2014)
[Chart: Spring 2014 Elementary and Middle School ‐ Number of Daily Testers, plotted by school day across three testing weeks (Mon 4/14 through Fri 5/2); vertical axis scaled from 10,000 to 100,000 daily testers.]
The Partnership for Assessment of
Readiness for College and Careers (PARCC)
 Multi‐state consortium (AR, CO, DC, IL, LA, MD, MA, MS, NJ, NM, NY, OH and RI) allowing for cross state comparisons
 Developing English language arts assessments in grades 3‐11 and mathematics assessments in grades 3‐8, plus a three‐assessment series for high school
 Participation of higher education
 Students receiving a college and career ready determination will be eligible for credit bearing courses.
 Additional assessment resources
 Mid‐year performance‐based assessment
 Speaking and listening
 Formative/diagnostic in grades K‐8
Current Plans for 2014-2015
PARCC Assessment Design
 Performance‐Based Assessment (administered after approximately 75% of the year)
 • Extended tasks
 • Applications of concepts and skills
 End‐of‐Year Assessment (administered after approximately 90% of the year)
 • Innovative, computer‐based items
Current Plans for 2014-2015
Secondary Math Options
 New high school mathematics assessment system has room for increased flexibility:
 High school mathematics standards are not grade specific
Grade | Assessment Options
8 (3 options) | 8th grade assessment, or Algebra I/Integrated I
9 (4 options) | Algebra I/Integrated I, or Geometry/Integrated II
10 (6 options) | Algebra I/Integrated I, Geometry/Integrated II, or Algebra II/Integrated III
11 (4 options) | Geometry/Integrated II, or Algebra II/Integrated III
12 (2 options) | Algebra II/Integrated III
Current Plans for 2014-2015
Testing Time
 Windows versus used days
 Student versus school
Current Plans for 2014-2015
Calendar
Assessment | Grade | Tentative Windows
CMAS and CoAlt: Science and Social Studies | 12 | 11/03/14 to 11/21/14
PARCC Performance‐Based Assessment (PBA) | 3‐8, High School | 3/09/15 to 4/03/15
CMAS and CoAlt: Science and Social Studies | 4 and 7 (Social Studies); 5 and 8 (Science) | 4/13/15 to 5/01/15
CoAlt: English language arts and math | 3‐11 | 4/13/15 to 5/15/15
PARCC End‐of‐Year Assessment (EOY) | 3‐8, High School | 4/27/15 to 5/22/15
ACCESS for ELLs® (Reading, Writing, Speaking and Listening) | K‐12 | 1/12/15 to 2/13/15
CO ACT | Grade 11 | Initial Test Date: Tues., April 28, 2015; Make‐up Test Date: Tues., May 12, 2015
Estimated Total Testing Time (minutes): TCAP versus PARCC (typical student)
Grade | Content Area | TCAP Sessions | TCAP Time (min) | PARCC Sessions | PARCC Time (min)
3 | Reading/Writing (ELA) | 4 | 240 | 5 | 280
3 | Mathematics | 2 | 130 | 4 | 210
3 | Total | | 370 | | 490
4‐5 | Reading/Writing (ELA) | 6 | 360 | 5 | 320
4‐5 | Mathematics | 3 | 195 | 4 | 210
4‐5 | Total | | 555 | | 530
6‐8 | Reading/Writing (ELA) | 6 | 360 | 5 | 350
6‐8 | Mathematics | 3 | 195 | 4 | 220
6‐8 | Total | | 555 | | 570
9‐10 | Reading/Writing (ELA) | 6 | 360 | 5 | 350
9‐10 | Mathematics | 3 | 195 | 4 | 250
9‐10 | Total | | 555 | | 600
11 | Reading/Writing (ELA) | not administered | 0 | 5 | 350
11 | Mathematics | not administered | 0 | 4 | 270
11 | Total | | 0 | | 620
Current Plans for 2014-2015
District Capacity
 We know that elementary and middle schools can assess online; however, depending on the number of devices:
 Multiple groups for each grade level
 Technology directed to assessment
 Current PARCC Design:
 5 sections per student for the performance‐based assessment
 4 sections per student for the end‐of‐year assessment
 Option: Offer mathematics assessments as paper‐based assessments
 Reduces the number of sections by about 45% for PARCC assessments
CMAS Resources
Science and Social Studies
 Assessment Frameworks
http://www.cde.state.co.us/assessment/newassess‐sum
 Sample Items and Practice Testing Environments
www.pearsonaccess.com/co (available under the ePAT tab)
English Language Arts and Mathematics
 Model Content Frameworks
http://www.parcconline.org/parcc-model-content-frameworks
 Sample Items
http://www.parcconline.org/samples/item-task-prototypes
 Practice Items
http://practice.parcc.testnav.com/#
For More Information
CDE Assessment Home page:
 http://www.cde.state.co.us/assessment
New assessment page:
 http://www.cde.state.co.us/assessment/newassess
Joyce Zurkowski, Executive Director, Assessment
[email protected]
Colorado Measures of
Academic Success
Fact Sheet
www.cde.state.co.us/assessment/index.asp
The Transition to New State Assessments
Background on New State Assessments: Colorado
Measures of Academic Success
Colorado assessments are changing in order to accurately assess student
mastery of the new Colorado Academic Standards. With the standards
being more focused, coherent and rigorous, assessments must adapt to
align with them. The Transitional Colorado Assessment Program (TCAP)
is being phased out and replaced by the Colorado Measures of Academic
Success, the state’s new English language arts, mathematics, science and
social studies assessments.
PARCC-Developed English Language Arts and Mathematics Assessments
Colorado is a governing member of a multi-state assessment consortium
called the Partnership for Assessment of Readiness for College and
Careers (PARCC). Involvement in this consortium allows the Colorado
Department of Education staff, along with staff from the Colorado
Department of Higher Education and Colorado educators, to collaborate
with individuals from across the U.S. to develop assessments for English
language arts and mathematics.
Beginning in the 2014-2015 school year, these new computer-based
assessments will be administered in grades 3-11 for English language arts
and in grades 3-8 with three high school assessments for mathematics.
Assessments in each content area will be administered in two
components: a performance-based assessment administered after
approximately 75 percent of the school year and an end-of-year
assessment administered after 90 percent of the school year.
Colorado-Developed Science and Social Studies Assessments
New state science and social studies assessments measuring the Colorado
Academic Standards were administered online in Colorado for the first
time in the 2013-14 school year. These assessments are being developed
collaboratively by the Colorado Department of Education, assessment
contractor Pearson, and Colorado educators. Elementary (4th grade social
studies and 5th grade science) and middle school (7th grade social
studies and 8th grade science) assessments were administered in the
spring of 2014. High school (12th grade science and social studies)
assessments will be administered in the fall of 2014.
Colorado’s State
Assessment Timeline
1997-2011
Colorado Student Assessment
Program (CSAP)
 Measured student learning of the
Colorado Model Content
Standards (mathematics, reading,
writing and science)
2012-2013
Transitional Colorado Assessment
Program (TCAP)
 Measures a blend of Colorado
Model Content Standards and the
new Colorado Academic
Standards
 Allows school districts to
transition their instruction from
the old standards to the new ones
2014
TCAP & Colorado Measures of
Academic Success
 TCAP continues for reading,
writing and math
 First year of new Colorado-developed social studies and
science assessments (part of the
state’s new Colorado Measures of
Academic Success)
2015
Colorado Measures of Academic
Success
 First year of new PARCC-developed English language arts
and mathematics assessments
 Second year of Colorado-developed social studies and
science assessments
MARCH 2013
COLORADO MEASURES OF ACADEMIC SUCCESS
Frequently Asked Questions
Why do we have state assessments?
As part of a balanced assessment system, state assessments provide valuable information to students,
families, schools, districts, the state, and taxpayers. A balanced assessment system is one that contains
formative assessments (quick checks for learning conducted by teachers throughout their class), interim
assessments (more formal progress monitoring often conducted several times throughout the year to
see how students are progressing), and summative assessments (end-of-course/unit and end-of-year
assessments to find out what students know and can do).
The state assessments are summative assessments. While formative, interim, and classroom-based
summative assessments inform classroom instruction on a regular basis, state summative assessments
are designed to be point-in-time snapshots of what students know and can do in core content areas.
The assessment results are used by: parents/families to gauge student progress over time; teachers to
inform and improve classroom practice and student placement; and by schools/districts to inform
program, school, and district improvement. The results are also the backbone of the state’s
accountability system. They are used to populate the annual school and district performance
frameworks which result in accreditation ratings for each school and district. The results are also used
for teacher evaluation purposes as one of the multiple measures required under S.B. 10-191. Changes
in the assessment system have significant implications for the state’s accountability and educator
evaluation systems.
Are states required to administer state-wide assessments and what are the minimum requirements?
States that accept federal funds for such purposes as supporting the education of children in poverty,
English language learners, and students with disabilities are required to administer state-wide
assessments to all students. Currently, Colorado receives approximately $326 million in federal funds
for these and related purposes. The minimum required assessments are:
 Grades 3 through 8 for English language arts and mathematics
 At least once in high school for English language arts and mathematics
 At least once in elementary school, once in middle school, and once in high school for science
For the assessments noted above, states must give the same assessments to all students and at least 95%
of the students must participate in the tests. There are also some required assessments specific to
certain populations of students (e.g., language proficiency assessments for English language learners).
How will Colorado’s new assessment system compare to the federal minimum?
Colorado’s new statewide summative assessment system, as outlined in state statute, includes the
following assessments:
 Grades 3 through 11 for English language arts (above the federal minimum for high school)
 Grades 3 through 8 and three times in high school for math (above the federal minimum for
high school)
 Once in elementary, middle, and high school for science (at federal minimum)
 Once in elementary, middle, and high school for social studies (no federal requirement)
 ACT in 11th grade (no federal requirement)
A key rationale from education practitioners and policymakers for having additional assessments at the
high school level was based on the desire to have an early signal of whether students were on track to
graduate college/career ready (the 9th grade assessments) and to then have an assessment that gave a
closer approximation to their readiness (the 10th grade assessment) followed by the measure of college
and career readiness accepted by colleges and universities, the 11th grade assessment. Social studies
was added through feedback on the importance of this content area from practitioners, policymakers,
and the State Board of Education.
Are we adding to the number of state-administered assessments?
The new state assessments are comparable to the previous TCAP system with the statutory additions
of: social studies in grades 4, 7, and 12; and 11th grade testing in English language arts and math. The
addition of 11th grade (also known as the culminating) PARCC assessments allows students to use the
results of those assessments for higher education course placement purposes (and, for Colorado
institutions, for admissions purposes).
How much time will the tests take?
The testing time for the typical student in TCAP versus CMAS is comparable. Testing times are largely
the same with the exception of 3rd and 11th grade. Overall, the estimated amount of testing time on
CMAS is expected to be less than 1.5% of typical students' total instructional time. That said,
administering tests online can take more time from a scheduling standpoint than paper/pencil did, as
schools need to cycle students through their computer labs. The fewer the devices, the longer the
window needed to complete the testing. This is why we were involved in field testing (and why
we started with our own online Colorado social studies and science tests – administered this past
spring) to learn what the challenges are from a scheduling/administration standpoint so that we can
learn strategies that minimize overall impact on schools.
What is our involvement with PARCC?
Colorado is an active member of PARCC. Over 50 educators, from both higher education and K-12, are
involved in a range of assessment design and review committees. In addition, Commissioner
Hammond serves as a member of the Executive Committee, the key decision making body on PARCC.
Over 100 districts are currently participating in field testing the PARCC assessments in preparation for
spring 2015 administration.
What states/jurisdictions are actively participating in PARCC?
The following jurisdictions participate in PARCC: AZ, AR, CO, Washington D.C., IL, LA, MD, MA,
MS, NY, NM, NJ, OH, and RI.
How is PARCC engaging higher education?
In total, 640 colleges and universities have committed to participate in PARCC. These colleges and
universities, including many flagship universities and most of the largest state systems, have pledged
to participate in the development of the new college-ready assessments in mathematics and English
Language Arts/Literacy and have signed on to ultimately use these tests as college placement tools.
What are the benefits of PARCC participation?
 PARCC received $180 million to develop a high quality test that measures higher order thinking.
No single state would be able to devote those resources to develop its own test of comparable
quality.
 In our highly mobile society, students and their families need to know that if they move from
Colorado to another state, they will not miss a beat in their learning progression. They need to be
able to have performance measures that allow them to compare how they are performing with their
peers in other states. The PARCC tests will provide a basis for measuring student progress in
attaining the standards and their performance relative to their Colorado peers and PARCC peers.
 Colorado’s new state higher education admissions and remediation policies allow institutions to
use PARCC scores for both course placement and admissions purposes. The multi-state aspect of
PARCC provides another basis for comparison of Colorado students to their peers as colleges
consider their college readiness.
 The comparability of the test results and the basis of the test on the Common Core State Standards
(which Colorado adopted along with 44 other states) make it far superior to a state-developed test.
Are we testing too much?
In addition to the state-required tests, districts administer a range of local assessments, from teacher-developed assessments to district-purchased assessments. In many cases, these assessments account
for 50-70% of the overall testing occurring. The question of the appropriate balance of state and district
assessments in our learning system is a good one. As we move to new tests, the change fosters a
healthy dialog about why we are testing, what role the tests play, how the tests connect with local
assessments, etc. These are the right questions to be asking. CDE has engaged WestEd, a nonprofit
education research agency, to conduct a multi-phase study of these very issues about how/where the
state can relieve the testing burden (e.g., phasing in the movement to online tests, making some of our
non-federally required tests optional or done on a sampling basis, etc.), while still maintaining the
important role of shared assessments that provide accountability for our system.
Where can I learn more?
 CDE Assessment website: www.cde.state.co.us/Assessment
 CDE Standards website: www.cde.state.co.us/standardsandinstruction
 Additional CDE fact sheets: www.cde.state.co.us/communications/factsheetsandfaqs
The Colorado Department of Education
CONNECTING . . . rigorous academic standards . . . meaningful assessments . . .
engaging learning options . . . excellent educators . . . for STUDENT SUCCESS
Statewide Assessment Timeline

Assessments added:
 Spring 1997: Reading (3rd grade)
 Spring 1998: Reading and Writing (4th grade)
 Spring 1999: Math (5th grade); Reading and Writing (7th grade)
 Spring 2000: Math (8th grade)
 Spring 2001 and Spring 2002: Reading and Writing phased in for the remaining grades (all of grades 3‐10 by 2002); Math phased in for the remaining grades (grades 5‐10 by 2002); ACT added in 11th grade (2001)
 Spring 2006: Math (grades 3 and 4)
 Spring 2008: Science (grades 5, 8 and 10)
 Spring 2013: Early Literacy (grades K‐3)
 2013‐14: Social Studies (grades 4, 7 and 12); high school Science tested in 12th grade instead of 10th (12th grade tested in fall of 2014)
 2014‐15: English language arts to replace Reading and Writing (grades 3‐10, adding grade 11); Math added one additional time in high school
 2015‐16: School Readiness Assessment (kindergarten)

State or Federal Policy:
 1997: Colorado statute required the State Board of Education to adopt state assessments aligned with the state model content standards in the priority areas of reading, writing, math and science. These assessments were phased in over time.
 The Elementary and Secondary Education Act (also referred to as the No Child Left Behind Act) was revised in 2001 to require a reading and math assessment in each of grades 3‐8 and once during grades 10‐12, by 2005‐06. It also required a science assessment to be administered once in elementary, middle, and high school (to be phased in by 2007‐08).
 Colorado statute requires that reading, writing and math be assessed more than once in high school. An assessment is to be given in 9th grade to provide an early signal of whether students are on track to graduate college/career ready, a second assessment is provided in 10th grade, and the ACT is provided in 11th grade as a final measure of readiness that would be accepted by colleges and universities.
 In 2008, Colorado adopted Colorado's Achievement Plan for Kids (CAP4K), which set forth a process of developing new standards and an aligned assessment system. The standards were set for full implementation in 2013‐14, with new assessments to follow.
 CO statute (READ Act) requires a reading competency test for grades K‐3, beginning in 2013.
 CO statute directs CDE to participate in a multi‐state assessment consortium for the development of English language arts and math assessments.
 CO statute requires a social studies assessment once in elementary school, once in middle school, and once in high school. CO statute adds 11th grade English language arts and one additional math assessment in high school beginning 2014‐15.

Assessment Timeline Summary: From 1997 to 2002, the state phased in implementation of the CSAP assessments, adding reading, writing, and math at various grade levels over that timeframe. By 2002, all students in grades 3‐10 were being assessed in reading and writing, and all students in grades 5‐10 were being assessed in math. ACT for 11th graders was added in 2001. In 2001, the Elementary and Secondary Education Act (ESEA) was adopted requiring reading and math in grades 3‐8 and once in grades 10‐12. In order to adhere to the new ESEA requirements, Colorado added math for 3rd and 4th graders in 2006. Colorado already met and exceeded the federal requirements for high school, valuing the early indicator of 9th grade assessments. Pursuant to the ESEA reauthorization, Colorado added science in grades 5, 8, and 10 in 2008. Colorado's assessment program has been stable since 2008. As the state has transitioned to its new standards under Colorado's Achievement Plan for Kids, the state added social studies assessments in grades 4, 7, and 12, to reflect that this is an important content area to emphasize and for which to hold students accountable. The state also changed the high school grade in which the new science assessment is administered from 10th to 12th. Beginning in the spring of 2015, assessments in English language arts for 11th graders will be added and the high school mathematics tests will change to three end‐of‐course assessments. Beginning in 2013‐14, in grades K‐3, specific district‐administered reading assessments are required pursuant to the READ Act. In addition, districts are phasing in implementation of required school readiness assessments in kindergarten by 2015‐16.

Colorado Standardized Assessment Grid

Federal Requirements (FR) for Content Assessments*: English and math assessments annually in grades 3‐5 and 6‐8 and once in high school; science assessment once in elementary school, once in middle school, and once in high school.

Grade | 2014: Colorado meets FR with | 2014: Colorado adds | 2015: Colorado meets FR with | 2015: Colorado adds
K | (no FR) | READ Act Assessment | (no FR) | READ Act Assessment and School Readiness Assessment
1st | (no FR) | READ Act Assessment | (no FR) | READ Act Assessment
2nd | (no FR) | READ Act Assessment | (no FR) | READ Act Assessment
3rd | TCAP English, TCAP Math | READ Act Assessment | PARCC English, PARCC Math | READ Act Assessment
4th | TCAP English, TCAP Math | CMAS Social Studies | PARCC English, PARCC Math | CMAS Social Studies
5th | TCAP English, TCAP Math, CMAS Science | | PARCC English, PARCC Math, CMAS Science |
6th | TCAP English, TCAP Math | | PARCC English, PARCC Math |
7th | TCAP English, TCAP Math | CMAS Social Studies | PARCC English, PARCC Math | CMAS Social Studies
8th | TCAP English, TCAP Math, CMAS Science | | PARCC English, PARCC Math, CMAS Science |
9th | TCAP English, TCAP Math | | PARCC English, PARCC Math** |
10th | TCAP English, TCAP Math | | PARCC English, PARCC Math** |
11th | | ACT | PARCC English, PARCC Math** | ACT
12th | CMAS Science (fall 2014) | CMAS Social Studies (fall 2014) | CMAS Science (fall 2015) | CMAS Social Studies (fall 2015)

Note: Federal requirements for English and math are once in high school; Colorado assesses twice in high school in 2014 and three times in 2015, with the ACT added in 11th grade.

*In addition to content assessments, there is a federal requirement for an English language proficiency assessment for all English learners enrolled in K‐12. Colorado uses ACCESS.
**PARCC has three high school mathematics assessments. Students may start the sequence in 8th or 9th grade. They may finish the sequence in 10th through 12th grade. Students may take the same assessment more than once before moving on to the next.
The above chart represents the typical student's expected test schedule.

TCAP (Transitional Colorado Assessment Program): TCAP was designed as an interim assessment to use as Colorado transitioned from the old content standards to the new Colorado Academic Standards.
CMAS (Colorado Measures of Academic Success): CMAS is the new overarching state assessment program in Colorado, which includes the PARCC‐developed English language arts and math assessments and the Colorado‐developed science and social studies assessments, as required in S.B. 08‐212, Colorado's Achievement Plan for Kids (CAP4K), and related statutory changes.
CoAlt (Colorado Alternate Assessment): CoAlt is the new overarching state assessment program in Colorado for students with significant cognitive disabilities. As required by federal law, for each CMAS assessment there is a CoAlt assessment.
PARCC (Partnership for Assessment of Readiness for College & Career): PARCC is a multi‐state assessment consortium that is developing shared tests in English language arts and math. By participating in the consortium, Colorado does not have any costs related to the design of the tests and benefits from the ability to have cross‐state comparisons of student performance results.
ACT: Nationally recognized college entrance exam required by state statute.
READ Act Assessments: Schools may use any of 10 approved literacy assessments (7 English language and 3 Spanish language assessments to choose from).
School Readiness Assessments: Required of all incoming kindergarten students beginning in 2015‐16 per CAP4K.

Statutory Expectations & Desired Attributes of Colorado's Statewide Assessment System
The following document provides a summary of the statutory expectations and desired attributes that have been
articulated over time and have shaped Colorado’s statewide assessment system.
Assessment Attributes Adopted by the State Board of Education and Colorado Commission
on Higher Education
The Preschool to Postsecondary Education Alignment Act (S.B. 08-212 or CAP4K) was passed in 2008 and intended to
align Colorado’s educational system from preschool to college by establishing a seamless system of standards,
expectations, and assessments. From fall of 2009 through spring of 2010, the Assessment Stakeholders Committee met
to consider the state’s revised content standards and make recommendations for revisions of Colorado’s assessment
system. The committee consisted of 35 members, including superintendents, chief academic officers, university
professors, community members, business representatives, assessment directors, representatives from the early
childhood community, military representatives and local board members, and involved the work of subcommittees
focusing on summative, formative and interim, postsecondary and workforce readiness, school readiness, and special
populations assessments. After recommendations for Colorado’s new assessment system were drafted, the Colorado
Department of Education and Colorado Department of Higher Education oversaw a Regional Town Hall Assessment
Revision Tour, holding 10 meetings across the state and gathering feedback on the draft from over 200 Coloradoans. In
November 2010, the State Board of Education (SBE) and the Colorado Commission on Higher Education unanimously
adopted an agreement concerning the architecture of a statewide assessment system and in December 2010 the State
Board voted to approve the following attributes of the system.
• SBE acknowledges that summative assessments aligned to the Postsecondary and Workforce Readiness Colorado Academic Standards (including both content knowledge and skills) will be part of the new assessment system.
o The state summative assessments will at least measure Math and English Language Arts, including the
application of knowledge and skills, in grades 3-11.
o Science and Social Studies will be measured, at least once in elementary, middle, and high school.
o To take advantage of new interactive item types and facilitate the timely return of the results, the
summative assessments will be on-line to the extent it fits instructional best practice and is fiscally
possible or feasible.
• SBE agrees that formative assessment will be part of the new assessment system; therefore, the state will support and provide guidance for local education providers in formative assessment.
o CDE will support districts in providing ongoing professional development activities in the use of these practices and the interpretation of their results. Support should include differentiated approaches based on local needs: collaboratively developing learning progressions, models/exemplars, and videos as resources allow.
o CDE will support the creation of district or regional consortia to promote professional dialogue on
assessment throughout the state as resources allow.
• SBE agrees that interim assessments should be a part of the new assessment system and that the state will support and provide guidance for local education providers with the uses of interim assessments.
o Districts will determine timing, frequency, and design/selection of interim assessments.
o CDE will offer exemplary, voluntary interim assessment tools aligned to the state-tested subjects with
the goal of providing interim assessments aligned to all standards as resources allow.
o CDE will provide a vetting process and rubrics to assist districts in purchasing or designing rigorous and standards-focused interim assessments for all grades and all content areas as resources allow.

Statutory Expectations and Desired Attributes of Colorado’s Statewide Assessment System
• SBE agrees that school readiness assessments will include primarily a mix of state-approved formative and interim assessment tools.
o CDE recommends that the Colorado Basic Literacy Act (CBLA) be updated to align with the
Postsecondary and Workforce Readiness (PWR) Colorado Academic Standards and that numeracy be
added to reflect the same instructional values of progress monitoring and early intervention where
needed.
o The state will offer districts a menu of approved school readiness assessment tools.
o In preschool through second grade, school readiness assessments will rely on formative assessment practices and interim assessments.
o In grades 1-2, mastery of the PWR Colorado Academic Standards will be measured.
o Districts are encouraged to introduce developmentally appropriate end-of-year assessments.
• SBE acknowledges that the English Language Learners assessments will include a screener and a proficiency test aligned to the Colorado English Language Proficiency (ELP) standards.
o The proficiency assessment window will begin after the October count and will close prior to the
beginning of any other state annual summative academic content testing.
o The proficiency assessment will be technically sound and administration practices will be consistent with
the expectations of other high stakes state assessments (e.g., tests will be secure and students will not
be exposed to the same form multiple times).
• SBE agrees that postsecondary and workforce readiness measures will be a priority in the new assessment system. The system will capture bodies of evidence for a student’s academic progress, planning, preparation, and readiness competencies, which will not necessarily be used for state accountability purposes.
o A dashboard will provide a visual display for P-20 students to use as ongoing documentation of their postsecondary and workforce readiness progress, including ICAP (Individual Career and Academic Plans) documentation, student work exemplars, employer recommendations, and other relevant materials.
o Results can be used for high school, as well as college and career eligibility and guidance. The body of
evidence displayed on the dashboard can be used for graduation considerations or endorsed high school
diplomas.
Relevant Statements from Education Accountability Act Legislative Declaration
In 2009, the General Assembly passed the Education Accountability Act (S.B. 09-163), outlining the role that assessment
and growth data are meant to play in the district and school accountability system. The Legislative Declaration for the
Act provides an overview of the desired goals of the legislation and includes language explaining the significance of
academic achievement and growth data.
• Section 22-11-102(1), C.R.S.: “The General Assembly hereby finds that an effective system of statewide education accountability is one that:
o (a) focuses the attention of educators, parents, students, and other members of the community on
maximizing every student’s progress toward postsecondary and workforce readiness and postgraduation success;
Prepared by CDE Staff, September 2014
o (b) reports information concerning performance at the state level, school district or Institute level, and individual public schools level that is perceived by educators, parents, and students as fair, balanced, cumulative, credible and useful;
o (c) provides more academic performance information, and fewer labels, to move from a punitive accountability system to one that is positive and focused on learning and achieving high levels of academic performance; and
o (d) holds the state, school districts, the Institute, and individual public schools accountable for performance on the same set of indicators and measures statewide, ensures that those indicators and measures are aligned through a single accountability system, to the extent possible, that objectively evaluates the performance of the thorough and uniform statewide system of public education for all groups of students at the state, school district or Institute, and individual public school levels, and, as appropriate, rewards success and provides support for improvement at each level.”
• Section 22-11-102(2), C.R.S.: “The general assembly further finds that an effective education accountability system will be built around implementation of the Colorado growth model that:
o (a) Uses a common measure to describe in plain language how much academic growth each student
needs to make and how much growth the student actually makes in one year toward proficiency on
state content standards;
o (b) Incorporates a complete history of each student's assessment scores in calculating the student's
needed and achieved academic growth;
o (c) Focuses the attention of educators, parents, and students on maximizing students' academic growth
and achievement over time and reveals where, and among which groups of students, the strongest
academic growth is occurring and where it is not;
o (d) Assists the state in closing the achievement gaps that plague the public education system by
spotlighting the gaps in students' academic growth rates and ensuring that educators have the data
necessary to assist the neediest students in making more than a year's academic growth in a year's time
so that these students can catch up to the academic performance levels of their peers; and
o (e) Provides data that are recognized by educators, parents, students, the higher education community,
the business community, and other stakeholders as fair, balanced, objective, and transparent to support
individual school, school district, institute, state, and federal education accountability purposes.”
Values and Expectations Embedded in Education Accountability Act and Assessment
Attributes
Below is a staff-developed synthesis of the values and expectations for Colorado’s assessment system and growth model
that are expressed in both the Education Accountability Act and the assessment attributes adopted by the State Board.
• Performance on statewide assessments will signal mastery of the Colorado Academic Standards at grade level, including both knowledge and application of skills.
o Results from the final high school summative assessment will provide an indication of a student’s
postsecondary and workforce readiness. This readiness may have implications for high school
graduation guidelines, higher education admissions policies and criteria for endorsed diplomas.
• Statewide assessments and growth data will measure progress toward postsecondary and workforce readiness.
o Assessments will be used to help prevent the chronic need for postsecondary remediation.
o Assessment and growth data will focus attention on maximizing students’ academic growth and achievement over time and reveal where, and among which groups of students, the strongest academic growth is occurring and where it is not.
• Statewide assessments and growth data will inform instruction.
o When fiscally possible and fitting to best instructional practices, assessments will be given online to
accommodate the timely return of results.
o The state will support districts in providing voluntary formative assessment practices and interim benchmark assessments.
o Assessment and growth data will lead to a deeper understanding of each child so that instruction and
support can be differentiated to best meet individual needs.
• Statewide assessments and growth data will provide meaningful results.
o Summative assessments will be given as late in the school year as possible to allow for more instruction.
o Assessment and growth data will be a part of a system of accountability that is perceived by educators,
parents, and students as fair, balanced, cumulative and useful.
o Assessment and growth data will assist in holding districts and schools accountable on the same set of
indicators and measures statewide to objectively evaluate the performance of a thorough and uniform
statewide system of public education for all groups of students.
Learning about Federal Minimum Assessment Requirements: Questions to Investigate

Background
Policy Option: Federal Minimum Assessment Requirements
CDE has received questions regarding the implications of Colorado moving from its current state assessments to the minimum federal assessment requirements (English language arts and mathematics in grades 3-8 and once in grades 10-12; science once at the elementary, middle, and high school levels). Because Colorado is at the federal minimum in grades 3-8 for English language arts and math and for science, the impact of moving to the federal minimums is largely experienced at the high school level.* In response, CDE has begun investigating the impact of not having consecutive-grade high school assessments, and thus no high school growth information, on the School and District Performance Frameworks. Additionally, many have suggested learning from other states about possibilities for accountability and growth when only the minimum federally required assessments are administered. As a result, CDE has generated the following questions to be investigated.

*Note: Colorado requires social studies assessments in grades 4, 7, and 12. This is above the federal minimum. For the purposes of this investigation, the focus is on learning more about other state assessment systems for English language arts and math and the impact on growth data of assessing once in these content areas in high school.

Questions to Investigate
•  What is the range of assessment/content coverage in states?
  o What states assess at fewer grades/content areas than Colorado?
  o What states assess only at the federal minimum requirements?
  o What states assess at more grades/contents than Colorado?
  o How have states changed their assessment/content coverage over time?
    ! What were the reasons why the states added or subtracted state assessments?
    ! What were the results of these changes?
•  Of those states that assess only the minimally required grades/content areas (or levels lower than Colorado), do any include growth indicators in their school and district accountability system?
  o Do any use student growth percentiles (Colorado’s Growth Model) as their growth metric?
  o What other ways do states calculate growth?
    ! Are there methods that allow for baselining adequate growth expectations so that, after 2 years of data from all students, we could determine whether adequate growth was met when either not all students are assessed or not all grades are assessed annually?
    ! Do any states use end-of-course exams at the high school level for calculating growth? If yes, then how?
  o Do any other states use growth as a component of high school accountability?
    ! If not, how does high school accountability differ from elementary and middle school accountability? Are there concerns around these differences in those states?
    ! If yes, how is growth calculated at the high school level? How much is it weighted in the calculations? If applicable, how is adequate growth calculated in those states?
  o Colorado’s adequate growth definition requires proficiency within 3 years or by 10th grade. If there is no 10th grade assessment, what options are there for calculating adequate growth?
    ! Do any states switch from standards-based assessments to college-ready assessments in high school?
    ! Do any states measure adequate growth to college-ready assessments? For example, instead of a goal of proficiency by 10th grade, create a goal of a minimum score on CO ACT.
•  The Smarter Balanced Assessment Consortium (SBAC), the counterpart to PARCC, is not creating assessments in consecutive high school grades. How are states in that consortium planning to include/not include growth in high school accountability?
•  What is the role of local assessments in other states? Are other states feeling the same burden?

Questions for the United States Department of Education
•  Can we use 9th grade instead of 10th grade for the federal high school assessment requirement?
•  Are any states approved to use local, vendor-created assessments (NWEA MAPS, for example) for federal requirements and accountability? If no states are currently approved, is this allowable? If yes, what would be the process for getting local assessments approved?
•  Are there options to assess random samples of students (i.e., not all kids, all years)?

September 17, 2014

Federal Minimum Assessments
Multi-State Review
Elena Diaz-Bilello and Alyssa Pearson, November 2014

Objectives
!  Provide an overview of the review completed by The Center for Assessment regarding how other states assess and hold schools accountable at the high school level.
!  Specifically, we will share information about:
  !  Assessment coverage in other states
  !  Growth methodologies in other states
  !  High school accountability indicators and weights
!  Discuss some of the trade-offs for different options
Multi-State Review
Alaska**, Colorado, Connecticut, Delaware*, Georgia**, Hawaii**, Idaho*, Indiana, Kansas, Louisiana*, Massachusetts*, Maine*, Michigan*, Mississippi*, Montana*, Nebraska*, Nevada*, New Hampshire**, New Jersey, North Carolina*, North Dakota**, Oregon, Pennsylvania**, Rhode Island**, South Carolina*, Utah**, Vermont*, Virginia, Washington, West Virginia, Wisconsin*, Wyoming
* Indicates Technical Advisory Committee (TAC)/Expert Panel (EP) membership
** Indicates managing the TAC/EP

What are these 32 states doing?
!  Some assess additional grade/content assessments beyond federal requirements (14 states)
!  More test at federal minimums, with only one grade/content area assessed at high school (18 states)

Assessment Changes in these States
!  Reasons that some of the states reviewed reduced or made changes to assessment systems:
  ! Budgetary reasons
  ! Reduced testing time
!  Reasons for increasing assessments or coverage:
  ! Desire to ensure accountability for 4 core content areas: social studies, science, English language arts, and math
  ! Desire to evaluate student achievement for more grades in high school
  ! Long tradition of testing
  ! Educator evaluations

With fewer assessments, how do you measure growth?
!  Over half of the states in this review that assess at the federal minimum do not currently evaluate growth in high school accountability (10 out of 18)
!  For those (8) that do:
  ! Colorado Growth Model (4 states): use the last middle school grade (e.g., grade 8) as a prior for the high school test.
  ! The other four each use a different growth approach: value-added measures or VAMs (PA), gain-scores (Nebraska), value-tables (Delaware), and growth to proficiency (North Dakota)

With fewer assessments, what about adequate growth?
!  Most states in this review that use the Colorado Growth Model at the federal minimums do not use Adequate Growth Percentiles in high school (with the exception of Oregon)
  ! Adequate Growth at the end grade doesn’t add any information
  ! Loss of “check-point” data after the middle school test
!  Value-Added Measure (VAM) states evaluate adequacy based on whether the observed growth met or exceeded the predicted growth
!  Growth to proficiency models evaluate adequacy based on whether a school is meeting incremental targets set each year

With fewer assessments, what does high school accountability look like?
!  Many states reviewed, assessing at the federal minimums, do not currently evaluate growth in high school accountability (10)
!  Many use “career and college readiness” metrics, including:
  ! Graduation rates
  ! Dropout rates
  ! % of students meeting AP benchmarks
  ! % of students with dual enrollment
  ! ACT/SAT performance
  ! Attendance

Weight of Growth Indicator
For those states reviewed that meet the federal minimums and use growth, how much is the growth indicator weighted?

STATE           Weight Given in Accountability Framework
Hawaii          15%
Massachusetts   25%
Nevada          10% (10 points out of a possible 100)
Nebraska        Weighting to be determined
Oregon          30% (20% to overall; 10% to subgroup growth)
Pennsylvania    40%
Colorado        50% (35% to overall; 15% to disaggregated growth)

Considerations
!  Value of growth at the high school level
  ! median growth percentiles (normative growth, where a student is compared to other students sharing similar starting points)
  ! adequate growth percentiles (criterion-referenced, where a student is compared against a standard)
!  Validity of measuring growth between non-consecutive grades (i.e., 8th grade to 11th grade)
  ! School and district accountability
  ! Educator evaluations
!  Other indicators available for high school accountability

Federal Minimum Requirements and Implications for Accountability
A Presentation for the Colorado Department of Education: Responses for the Assessment Taskforce
NOVEMBER 6, 2014

States considered: 32 States
Alaska**, Colorado, Connecticut, Delaware*, Georgia**, Hawaii**, Idaho*, Indiana, Kansas, Louisiana*, Massachusetts*, Maine*, Michigan*, Mississippi*, Montana*, Nebraska*, Nevada*, New Hampshire**, New Jersey, North Carolina*, North Dakota**, Oregon, Pennsylvania**, Rhode Island**, South Carolina*, Utah**, Vermont*, Virginia, Washington, West Virginia, Wisconsin*, Wyoming
*Indicates Technical Advisory Committee (TAC)/Expert Panel (EP) membership
**Indicates managing the TAC/EP
Getting a sense of the testing landscape
•  Three major design typologies for state testing programs:
  1.  Administers end-of-course (EOC) tests for students in high schools.
  2.  Administers to grades (e.g., K-2 or more than 1 required high school grade) or content areas (e.g., social studies) outside of federal minimum requirements.
  3.  Keeps to federal minimum requirements.
•  Many of the states reside in the third category. Most states in this category have fewer tests administered than CO because ELA tests comprise both reading and writing sections (1 test), or they have writing assessed in fewer grades and only assess one grade in high school.
•  The modal number of assessments administered in high schools in most states is one grade (meeting the federal minimum requirements).

Type 1: EOC Tests
•  In 2011, 28 states were using EOCs for accountability purposes (Domaleski, 2011)
•  Examples of EOC states we work with that compute growth in high schools: Utah, Georgia, Washington, Louisiana, North Carolina, Pennsylvania
•  Most of these places allow for having more than 1 EOC test in a given content area and grade

For Type 1 states using growth
Taskforce Question: How is growth calculated for EOCs?
Response:
•  In order to calculate growth using EOCs, one needs to create norm groups (students taking a particular test trajectory or pathway) and run the growth analyses on those students. When there are multiple course/test-taking patterns, then one runs multiple analyses with all the distinct norm groups.

Example: Two norm groups established for Math EOCs when multiple pathways are present
•  Norm group 1: Grade 8 Math, then Algebra 1, then Geometry
•  Norm group 2: Grade 7 Math, then Algebra I, then Algebra II

Type 2: Outside of Federal Minimums
•  Colorado is in this category with social studies and three grades at high school
•  Examples of other states that test outside of content areas or grades required by minimums and also use some of those assessments/grades for accountability: Alaska, Wisconsin, South Carolina, West Virginia, Idaho, Michigan, and North Carolina
•  Overlap between Type 1 and Type 2. Examples of these states: Utah, Louisiana, Virginia, South Carolina, North Carolina, and Washington

Type 3: Sticks to Federal Minimums
•  Many states are in this category and have fewer tests administered than CO since CO exceeds federal minimums
•  Most of these states only have 1 high school grade assessed: Grade 10 or 11
•  Two states that use EOCs to fulfill federal minimums for accountability: Pennsylvania and Indiana
•  Examples of states in this category that do not currently evaluate growth in high school for accountability: Wyoming, Maine, Connecticut, Rhode Island, Vermont, Indiana, New Jersey, and Kansas.

Summary of States by Typology and Growth Model Used
Type 1
1.  Georgia – CGM
2.  Indiana – CGM
3.  Louisiana – VAM, but not for school/district accountability
4.  Mississippi – moved this year to value table from a VAM
5.  North Carolina – EVAAS
6.  South Carolina – EVAAS
7.  Utah – CGM
8.  Pennsylvania – EVAAS
9.  Washington – CGM
10.  Virginia – Uses CGM but not for school accountability

Type 2 and Growth Model
1.  Alaska – Value table
2.  Colorado – CGM
3.  Georgia – CGM
4.  Idaho – CGM
5.  Louisiana – VAM, but not for school/district accountability
6.  Michigan – Performance level change
7.  Mississippi – moved this year to value table from a VAM
8.  North Carolina – EVAAS
9.  South Carolina – EVAAS
10.  Utah – CGM
11.  Virginia – Uses CGM but not for school accountability
12.  Washington – CGM
13.  Wisconsin – CGM, VAM (VARC), and Gain Scores, but only report CGM for school accountability
14.  West Virginia – CGM
Type 3 and Growth Model
1.  Connecticut – Performance level change
2.  Delaware – Value table
3.  Hawaii – CGM
4.  Indiana – CGM
5.  Kansas – CGM
6.  Massachusetts – CGM
7.  Maine – CGM
8.  Montana – no growth model used for accountability
9.  Nebraska – Gain Scores
10.  Nevada – CGM
11.  New Hampshire – CGM
12.  New Jersey – CGM
13.  North Dakota – Growth to standard
14.  Oregon – CGM
15.  Pennsylvania – EVAAS
16.  Rhode Island – CGM
17.  Vermont – no growth model used for accountability
18.  Wyoming – CGM

Taskforce Question: Any states switched from standards-based to “college ready assessments” in high schools?
•  Very few, and those that have switched or will switch are using the ACT (includes WorkKeys and Aspire) or SAT in high school
•  Examples of these states: Wyoming (grade 11), Maine (grade 11), Michigan, Wisconsin, and Alabama.
•  All of these states are NOT currently computing growth with these assessments
Important note relative to “college ready assessments”: most states are using standards-based tests that are intended to be aligned to college-ready standards. In fact, any state that has a waiver has made this claim via principle 1.

Taskforce Question: State changes to assessment/content coverage over time?
•  Reasons that some states reduce or make changes to assessment:
  -  Budgetary reasons
  -  Reduced testing time
•  Reasons for increasing assessments or coverage:
  -  Desire to ensure accountability for 4 core content areas: social studies, science, English language arts, math
  -  Desire to evaluate student achievement for more grades in high school
  -  Long tradition of testing
  -  Educator evaluations

Taskforce Question: Role of local assessments in other states?
•  In a few places (e.g., Kentucky and New Hampshire), performance tasks will play a more prominent role in becoming integrated with school accountability.
•  In most places, local assessments are being used to support the evaluation of student outcomes for educator evaluation systems.
•  In some states (e.g., Washington, North Carolina, and Rhode Island), local assessments (i.e., schools decide which assessments to use) are included as part of the school improvement planning process to measure outcomes and progress made in low-performing schools.

Taskforce Question: Which states are dissatisfied or facing broad public discontent with current testing programs?
Most states in general are experiencing increased pressure from stakeholder groups to minimize testing burden. Places we work with where the Department of Education is exploring ways to reduce test burden:
  -  Colorado
  -  Connecticut
  -  Missouri
  -  Vermont

Exploring Federal Minimums: Addressing Taskforce Questions Specific to High Schools and Growth

How to compute AGPs without grade 10?
Taskforce question: If there is no 10th grade [state] assessment to measure to, how else could adequate growth be calculated?
•  Response: Colorado only needs to define an “end point” and there needs to be a path of tests leading to that end point (e.g., ACT in grade 11). Even if the tests aren’t in each grade, we can map out a trajectory indicating (in)adequate growth.
Additional considerations for high school AGPs in CO
•  Target growth in high school quickly loses relevance as we reach the terminal time point. We can’t create targets that extend beyond the last measurement, so 3-year targets become 2-year targets, and 2-year targets become 1-year targets.
•  If the ACT is used, CO can explore looking at growth to proficiency in 11th grade

Compute AGPs with a grade gap between high school and the last middle school grade?
Taskforce Question: Say we assess grades 8-11 in 2015 and 2016. Then, could we calculate adequate growth targets based on those 2 years of data, if in 2017 we only assess grades 8 and 11 (or 8 and 10)?
•  Response: Yes. However, in later years, if students only get tested in grades 8 and 11, you won’t be able to make a determination of “adequate” or “not adequate”. Once you lose the intermediate “check points”, you no longer get to say “on track”/“not on track”. That is, we would know where we want a student to be in grade 9, but since you aren’t checking up on students any longer in grade 9, we can’t say whether a student is on track or not.
•  In the same scenario (assess grades 8 and 11), target growth when intermediate grades are removed is just reduced to a question of whether the student was proficient in grade 11 or not.
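The trajectory logic in the response above can be sketched in code. This is a hypothetical illustration only, not Colorado's actual adequate-growth-percentile calculation (which is percentile-based rather than a straight line): the end-point score of 720, the grade range, and the observed scores are all invented.

```python
# Hypothetical sketch of mapping a growth trajectory to an "end point" when not
# every grade is tested: equal annual targets are interpolated from a grade-8
# score to an invented end-point score in grade 11. Where no assessment exists
# (grades 9-10 here), on-track status cannot be determined, mirroring the loss
# of intermediate "check points" described above.

def trajectory(start_score, start_grade, end_score, end_grade):
    """Equal annual score targets for each grade after start_grade up to end_grade."""
    step = (end_score - start_score) / (end_grade - start_grade)
    return {g: start_score + step * (g - start_grade)
            for g in range(start_grade + 1, end_grade + 1)}

targets = trajectory(start_score=600, start_grade=8, end_score=720, end_grade=11)

# Under the federal minimum, grades 9 and 10 are untested, so only grade 11
# has an observed score (invented here).
observed = {11: 705}

for grade, target in sorted(targets.items()):
    if grade in observed:
        status = "on track" if observed[grade] >= target else "not on track"
    else:
        status = "no assessment, cannot determine"
    print(grade, target, status)
```

With every grade tested, each yearly checkpoint would yield an on-track/not-on-track call; with only the end point tested, the question collapses to whether the final score met the end-point target, exactly as the response notes.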
Response: Common growth approaches used with Type 3 states are VAMs (PA), gain-­‐scores (Nebraska), value tables (Delaware), growth to proficiency (North Dakota) and CGM (see list). Note: For those using the CGM, they use the last middle grade (e.g., grade 8) as a prior for the high school test (e.g., Hawaii, Massachuse]s, Oregon). Page 20 For Type 3 states with growth in HS Task Force Ques.on: How much is growth weighted in the accountability calcula.ons for HS? Response: Varies greatly by state for example: STATE Weight Given in Accountability Framework Hawaii 15% Massachuse]s 25% Nevada 10% (10 points out of a possible 100) Nebraska Weigh.ng to be determined Oregon 30% (20% to overall; 10% to subgroup growth) Pennsylvania 40% Page 21 7 11/10/14 For Type 3 states with growth in HS Taskforce Ques.on: How is adequate growth considered? Response: -­‐  Most CGM Type 3 states do not use AGPs in HS ( just use the norma.ve MGPs or use AGPs only in elementary and middle). Excep.on: Oregon -­‐  VAM states evaluate adequacy based on whether the observed growth met or exceeded the predicted growth Page 22 Smarter Balanced States Taskforce Ques.on: How is growth being included for accountability for high school arer the transi.on for these states? Response: Many states going through re-­‐design work now. Some ideas include (as well as possibly using Smarter Balanced interim tests as priors for grade 11): Kansas – use end of course tests as possible priors Hawaii – consider ACT suite of tests as possible priors Page 23 8 www.cde.state.co.us
Value-added models (VAM)

Uses regression-based methods that can be as simple as accounting for year to year changes in students’ test
scores to highly complex forms that account for nested structures, differences in student characteristics, and
persisting effects of previous teachers and schools. Causal attribution is intentional in this approach since value-added inferences are being made.

States that use VAMs make strong causal attributions by framing results as a teacher's or school's direct
contribution to the performance achieved.
- EVAAS is a proprietary VAM developed by William Sanders
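As a rough, hypothetical illustration of the value-added idea (far simpler than EVAAS or any state's operational model), the sketch below predicts each student's current score from the prior-year score with an ordinary least-squares line and averages the residuals by school; all schools and scores are invented.

```python
# Minimal value-added sketch (illustration only, not any state's actual model):
# value added = observed current score minus the score predicted from the
# prior-year score, averaged by school. All data are invented.

def fit_line(xs, ys):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# (school, prior_score, current_score)
students = [
    ("A", 500, 540), ("A", 520, 565), ("A", 480, 520),
    ("B", 500, 520), ("B", 520, 535), ("B", 480, 505),
]

a, b = fit_line([s[1] for s in students], [s[2] for s in students])

value_added = {}
for school, prior, current in students:
    # residual: how far the observed score sits above or below the prediction
    value_added.setdefault(school, []).append(current - (a + b * prior))

for school, residuals in sorted(value_added.items()):
    print(school, round(sum(residuals) / len(residuals), 1))  # A 10.8, B -10.8
```

Operational VAMs add the nested structures, student characteristics, and persistence terms described above; the causal framing comes from treating that averaged residual as the school's "contribution."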
Colorado Growth Model (CGM)

Also known as the Student Growth Percentile (SGP) Model. SGPs are calculated using regression-based methods
that describe a student's current achievement (conditioned on prior scores) relative to peers sharing the same
academic score history.

Considered to be descriptive in nature since the focal area of interest for using this model is: How much
growth/progress was achieved?
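The descriptive idea can be sketched as follows. This is a deliberately simplified, hypothetical version: the operational model uses quantile regression over large statewide cohorts and full score histories, while this toy forms peer groups from a coarse 10-point bin on a single prior score; all student records are invented.

```python
# Toy student-growth-percentile sketch (not the actual Colorado Growth Model):
# rank each student's current score against peers with a similar prior score.

from collections import defaultdict

def percentile_rank(value, peers):
    """Percent of peer scores strictly below value (0-100)."""
    return 100.0 * sum(1 for p in peers if p < value) / len(peers)

# (student, prior_score, current_score) -- invented records
records = [
    ("s1", 498, 540), ("s2", 501, 510), ("s3", 503, 525),
    ("s4", 495, 500), ("s5", 552, 560), ("s6", 556, 590),
]

# Peer ("norm") groups: students whose prior score falls in the same 10-point bin.
groups = defaultdict(list)
for _sid, prior, current in records:
    groups[prior // 10].append(current)

sgp = {sid: percentile_rank(current, groups[prior // 10])
       for sid, prior, current in records}
```

With real statewide data the peer groups are large and percentiles range smoothly from 1 to 99; here each group has only two members, so values land on 0 or 50. The same grouping idea underlies the EOC "norm group" analysis described earlier, where peers are defined by a shared course/test pathway.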
Value-Tables
Also known as a “categorical model”, this approach ascribes values (points) to changes in proficiency level between
time point 1 (typically the prior year) and the most recent year. Point values are typically structured to reward
maintaining proficiency or achieving higher levels of proficiency, and moving from lower proficiency levels to
proficient and above.
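A value table reduces to a two-way lookup. The level names and point values below are invented for illustration; each state that uses this model defines its own table.

```python
# Hypothetical value table ("categorical model"): points awarded for the change
# in proficiency level from the prior year (rows) to the current year (columns).

LEVELS = ["unsatisfactory", "partially proficient", "proficient", "advanced"]

POINTS = [  # rows: prior level; columns: current level (order follows LEVELS)
    [0, 50, 100, 100],  # from unsatisfactory
    [0, 25, 100, 100],  # from partially proficient
    [0,  0, 100, 100],  # from proficient
    [0,  0,  75, 100],  # from advanced
]

def value_table_points(prior, current):
    return POINTS[LEVELS.index(prior)][LEVELS.index(current)]

print(value_table_points("partially proficient", "proficient"))  # 100
print(value_table_points("advanced", "proficient"))              # 75
```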
Simple Gains
Describes change in performance between time point 1 and time point 2. At the group level, the average gain (e.g.,
average scale score change) achieved between 2 time points is computed.
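In code, the simple-gains model is just a subtraction per student and an average per group; the scores below are invented.

```python
# Simple gains: individual gain = time-2 score minus time-1 score; the group
# result is the average of those gains. All scores are invented.

scores = {"s1": (500, 540), "s2": (520, 530), "s3": (480, 470)}

gains = {sid: t2 - t1 for sid, (t1, t2) in scores.items()}
average_gain = sum(gains.values()) / len(gains)
print(gains, round(average_gain, 2))
```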
Growth to Standard (Proficiency)
Determines whether students are on track to meet state proficiency standards by a specified grade in the future.
Common method: the model works backwards from the current score to a future grade and then divides the total
distance needed to reach proficiency by annual increments.
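The common method just described can be sketched directly; the cut score, grades, and observed score below are invented for illustration.

```python
# Growth-to-standard sketch: divide the distance from the current score to the
# proficiency cut by the years remaining, yielding equal annual targets; a
# student is "on track" if the observed score meets the year's target.
# Cut score, grades, and the observed score of 465 are invented.

def annual_targets(current_score, current_grade, cut_score, target_grade):
    years = target_grade - current_grade
    step = (cut_score - current_score) / years
    return [current_score + step * i for i in range(1, years + 1)]

targets = annual_targets(current_score=440, current_grade=5,
                         cut_score=500, target_grade=8)
print(targets)            # equal yearly checkpoints ending at the cut score
print(465 >= targets[0])  # next-year on-track check with an invented score
```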
Quick Reference on Federal and State Required Assessments
(Federal statute: ESEA, Title I, Part A; state statute: Colorado Revised Statutes)

English Language Arts (ELA)
  Federal: 3rd – 8th grade and not less than once during 10th – 12th grade, §1111(b)(3)(C)(vii)
  State: 2013-14 only: 3rd – 10th grade reading and writing, §22-7-409(1); 2014-15 and after: 3rd – 11th grade English language arts, §22-7-409(1)(h)

Math
  Federal: 3rd – 8th grade and not less than once during 10th – 12th grade, §1111(b)(3)(C)(vii)
  State: 2013 only: 3rd – 10th grade, §22-7-409(1); 2014-15 and after: 3rd – 8th grade and 3 times in high school, §22-7-409(1)(i)

Science
  Federal: not less than once during 3rd – 5th grade, not less than once during 6th – 9th grade, and not less than once during 10th – 12th grade, §1111(b)(3)(C)(v)
  State: once in elementary, once in middle school, and once in high school, in specific grades identified by CDE, §22-7-409(1)(j)

Social Studies
  State: once in elementary, once in middle school, and once in high school, in specific grades identified by CDE, §22-7-409(1)(k)

CoAlt*
  Federal: §1111(b)(3)(C)(ix)
  State: §22-7-409(1.2)(d)(I)(B)

Spanish Reading
  Federal: §1111(b)(3)(C)(ix)
  State: 3rd and 4th grade, §22-7-409(3.5)(a)

Spanish Writing
  Federal: §1111(b)(3)(C)(ix)
  State: 3rd and 4th grade, §22-7-409(3.5)(a) and (b)

ACT
  State: 11th grade, §22-7-409(1.5)(a)

ACCESS for ELLs
  Federal: §1111(b)(7)
  State: §22-24-105

School Readiness
  State: kindergarten, §22-7-1004(2)(a) and §22-7-1014(2)(a)

Early Literacy (READ)
  State: kindergarten – 3rd grade, §22-7-1205(1)(a) and §22-7-1209(1)(b)

* Alternate assessments for students with significant cognitive disabilities
Student Participation: The Elementary and Secondary Education Act requires the participation of “all students,” including those who receive
reasonable adaptations and accommodations, in the statewide assessments. See Title I, Part A, §1111(b)(3)(C)(ix)(I). A school or local education agency
may only demonstrate adequate yearly progress (“AYP”) if not less than 95% of each student demographic group takes the statewide assessments. See
Title I Regulations, 34 CFR §200.20(c)(1)(i). Pursuant to rules of the Colorado State Board of Education, the rates of student participation in these
statewide assessments must also be considered in the assignment of district accreditation ratings and school plan types. See 1 CCR 301-1, sections 5.02
and 10.01(A).
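The 95% participation requirement described above reduces to a simple rate comparison per demographic group. This sketch is illustrative; the function name and example counts are hypothetical:

```python
def meets_participation_requirement(tested: int, enrolled: int,
                                    threshold: float = 0.95) -> bool:
    """True if a group's participation rate meets the 95% requirement.

    `tested` and `enrolled` are counts for one demographic group;
    the threshold default reflects the 95% figure cited above.
    """
    if enrolled == 0:
        return True  # no students in the group to test
    return tested / enrolled >= threshold

# Example: 190 of 200 students tested is exactly 95%, so the
# requirement is met; 189 of 200 (94.5%) is not.
```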
Where can I learn more?
Specific information about statewide assessments can be found here: http://www.cde.state.co.us/assessment
Assessment Continuum

Formative Assessment and Instruction
Purpose: To inform instruction
Content Focus: Focused; the student now and tomorrow (classroom curriculum or program)
Frequency: Daily
Formality: Informal; teacher created, administered, and scored
Technical Rigor: Less rigorous
Types of Scores: Perhaps none; rubric, number correct, check for understanding
Level of Scores: Classroom and individual
User of Scores: Teacher and student
Stakes: Low stakes (informational only)

Interim Assessment
Purpose: To determine progress
Content Focus: Broad; use the focused content to demonstrate understanding of broader concepts (curriculum or program, next "quarter")
Frequency: Periodic (e.g., fall, winter, spring)
Formality: Moderately formal; teacher and/or external publisher created, administered, and scored
Technical Rigor: More rigorous
Types of Scores: Percent correct, scale scores, subscores, norm-referenced scores
Level of Scores: School and individual
User of Scores: Teacher, student, parent, and school administrators
Stakes: Moderate stakes (grades)

Summative Assessment
Purpose: To measure mastery
Content Focus: Broadest; culmination of learning by applying concepts and content to demonstrate mastery (school/district curriculum or program, next year)
Frequency: End of instruction
Formality: Formal; teacher and/or external publisher created, administered, and scored
Technical Rigor: Most rigorous
Types of Scores: Scale scores, subscores, norm-referenced scores
Level of Scores: State, district, school, and individual
User of Scores: Teacher, student, parent, school/district administrators, and public
Stakes: High stakes (grades and accountability)

DRAFT
Draft Date: June 14, 2011
Office of Standards and Assessments
Colorado
CMAS and CoAlt
Science and Social Studies
Interpretive Guide to
Assessment Reports
A Guide for Parents and Educators
2014
Table of Contents
Section A: General Information for Parents and Educators ................................... A-1
Background ........................................................................................................................ A-1
Colorado Measures of Academic Success (CMAS) .......................................................... A-1
Confidentiality and Reporting of Results ............................................................................ A-2
Purpose of this Guide ........................................................................................................ A-2
Section B: A Parent and Educator Guide to Understanding the Colorado
Measures of Academic Success (CMAS) Student Performance Report .............. B-1
Program Overview ............................................................................................................. B-1
Types of Scores on the CMAS Student Performance Report ........................................... B-1
Scale Scores .........................................................................................................B-2
Performance Levels ..............................................................................................B-2
Percent Correct ......................................................................................................B-3
Sample CMAS Student Performance Report .................................................................... B-3
Overall Assessment Scores .................................................................................B-4
Subscale Performance ...........................................................................B-4
Performance by Prepared Graduate Competencies (PGCs) and Grade Level
Expectations (GLEs) ..............................................................................................B-6
Performance by Item Type .....................................................................B-7
Performance Level Descriptions.............................................................................B-8
Section C: A Parent and Educator Guide to Understanding the Colorado Alternate
Assessment (CoAlt) Student Performance Report................................................. C-1
Program Overview .............................................................................................................C-1
Types of Scores on the CoAlt Student Performance Report .............................................C-1
Scale Scores ........................................................................................................ C-2
Performance Levels ............................................................................................. C-2
Percent of Points Earned ...................................................................................... C-2
Sample CoAlt Student Performance Report ......................................................................C-3
General Information ............................................................................................. C-3
Overall Assessment Scores ................................................................................ C-3
Content Standard Performance ........................................................................... C-4
Performance Level Descriptions .......................................................................... C-5
Section D: An Educator Guide to Understanding the Colorado Measures of
Academic Success School and District Reports .................................................... D-1
Program Overview .............................................................................................................D-1
Purpose and Use of CMAS Results ...................................................................................D-1
CMAS School and District Reports ....................................................................................D-2
Types of Scores on the CMAS School and District Reports .............................................D-2
Scale Scores ........................................................................................................ D-2
Performance Levels ............................................................................................. D-2
Percent Correct ..................................................................................................... D-3
Appropriate Score Comparisons and Uses ..........................................................................D-4
CMAS Performance Level Summary Report .....................................................................D-6
General Information ............................................................................................. D-6
Performance Level Distribution Data.................................................................... D-6
CMAS Content Standards Roster ......................................................................................D-9
General Information ............................................................................................. D-9
Performance Level and Content Standards Information ...................................... D-9
Prepared Graduate Competencies and Grade Level Expectations Performance ......... D-10
CMAS Item Analysis Report ............................................................................................D-14
General Information ........................................................................................... D-14
Item Analysis Information ................................................................................... D-14
Item Map Information ......................................................................................... D-15
Section E: An Educator Guide to Understanding the Colorado Assessments
(CoAlt) Science and Social Studies School and District Reports ..........................E-1
Program Overview ............................................................................................................. E-1
Purpose and Use of CoAlt Results .................................................................................... E-1
CoAlt School and District Reports ..................................................................................... E-2
Types of Scores on the CoAlt School and District Reports ............................................... E-2
Scale Scores .........................................................................................................E-2
Performance Levels ..............................................................................................E-3
Percent of Points Earned .......................................................................................E-3
Appropriate Score Comparisons and Uses ................................................................................ E-3
CoAlt Performance Level Summary Report ........................................................................ E-5
General Information ..............................................................................................E-5
Performance Level Distribution Data.....................................................................E-5
CoAlt Content Standards Roster ......................................................................................... E-8
General Information ..............................................................................................E-8
Performance Level and Content Standards Information .......................................E-8
Section F: Appendices............................................................................................... F-1
Appendix A: CMAS Performance Level Descriptors ......................................................... F-2
Appendix B: CoAlt Performance Level Descriptors ........................................................... F-7
Appendix C: Prepared Graduate Competency and Grade Level Expectation Ordering . F-12
Section A
General Information for Parents and Educators
Background
The Colorado Academic Standards (CAS), created in Colorado, were adopted by the State
in science and social studies in December 2009. The CAS outline the concepts and
skills, including 21st century skills, that students need in order to be successful in the
current grade as well as to make academic progress from year to year. The ultimate
goal of the CAS is for all Colorado students to graduate college and career ready.
School districts were required to fully implement the CAS in the 2013-14 school year.
In partnership with Colorado educators and Pearson, Inc., the Colorado Department of
Education developed new assessments to evaluate student mastery of the CAS in
science and social studies. These assessments were given for the first time in the
spring of 2014 to students in grades 4 and 7 in social studies, and to students in grades
5 and 8 in science.
Two assessments address the CAS: the Colorado Measures of Academic Success
(CMAS) and the Colorado Alternate Assessment (CoAlt). The vast majority of students
participate in CMAS while a very small percentage of students with significant cognitive
disabilities participate in CoAlt.
Colorado Measures of Academic Success (CMAS)
CMAS is Colorado’s standards-based assessment designed to measure the Colorado
Academic Standards (CAS) in the content areas of science and social studies. The
purpose of the CMAS assessments is to indicate the degree to which students have
mastered the CAS in science and social studies at the end of the tested grade level.
CMAS results are intended to provide one measure of a student’s academic progress
relative to the CAS.
CMAS for science and social studies are Colorado’s first statewide computer-based
assessments. The online assessments allow for new item types that were not possible
under the prior paper-based system, such as science simulations. Online presentation
also fosters increased student engagement. The assessments were designed to provide
high level content area information (i.e., a science score or social studies score), as well
as standard-specific scores. For example, social studies assessment reports include an
overall social studies score as well as subscores in the areas of history, geography,
economics and civics. Districts and schools may use scores at this level to compare
performance from year to year to monitor the effectiveness of their programs.
Colorado Alternate Assessment (CoAlt)
CoAlt is the standards-based assessment designed specifically for students with
significant cognitive disabilities who are unable to participate in CMAS, even with
accommodations. Students must meet eligibility requirements to take CoAlt. CoAlt
assesses the performance expectations of the CAS for students with significant
cognitive disabilities as expressed in the Extended Evidence Outcomes (EEOs) of the
Colorado Academic Standards (CAS). The primary purpose of the assessment
program is to determine the level at which Colorado students with significant cognitive
disabilities meet the EEOs of the CAS in the content areas of science and social studies
at the end of the tested grade level.
CoAlt assessments are administered in a one-on-one setting between teachers and
students. Teachers use CoAlt scoring rubrics to evaluate student responses before
submitting performance results.
Confidentiality and Reporting of Results
The results of individual student performance on CMAS and CoAlt are confidential and
may be released only in accordance with the Family Educational Rights and Privacy Act
of 1974 (20 U.S.C. Section 1232g). Aggregated student performance data representing
16 or more students is made available to the public. Aggregated reports do not contain
the names of individual students or teachers.
Purpose of this Guide
This guide provides information on the reports provided for CMAS and CoAlt at the
individual student, school and district levels. Sections B and C can be shared with
parents. They display and explain elements of the CMAS and CoAlt science and social
studies student performance reports to assist parents in understanding their child’s test
results. Sections D and E display and explain elements of the CMAS and CoAlt science
and social studies school and district reports.
Section B
A Parent and Educator Guide to Understanding
the Colorado Measures of Academic Success
(CMAS) Student Performance Report
Program Overview
Colorado Measures of Academic Success (CMAS) are Colorado’s standards-based
assessments designed to measure the Colorado Academic Standards (CAS) in the
content areas of science and social studies. The CAS contain the concepts and skills
students need to learn in order to be successful in the current grade and to make
academic progress from year to year. CMAS science assessments are given to
students in grades 5 and 8 while CMAS social studies assessments are given in grades
4 and 7 each spring. The purpose of the CMAS is to indicate the degree to which
students have mastered the CAS in science and social studies at the end of the
tested grade level. CMAS results are intended to provide one measure of a
student’s academic progress relative to the CAS.
CMAS for science and social studies are Colorado’s first statewide computer-based
assessments. The online nature of the assessments allows for new item types that were
not possible under the prior paper-based system, such as science simulations. Online
assessments also foster increased student engagement. The assessments were
designed to provide high level content area information (i.e., a science score or social
studies score) as well as standard specific scores (i.e., score for history, geography,
economics and civics).
A Student Performance Report is created for each student who takes a CMAS
assessment so that parents can understand their child’s command over the Colorado
Academic Standards in the assessed grade level and content area. This section of the
guide explains the elements of the Student Performance Report.
Types of Scores on the CMAS Student Performance Report
To understand each part of the Student Performance Report, it is important to become
familiar with the types of assessment scores that are included on the report. At varying
levels, student performance is described by scale scores, performance levels and
percent correct. State, district and school level information is also provided in relevant
sections of the Student Performance Report to help parents understand how their
child’s performance compares to other students.
Scale Scores
When the points a student earns on an assessment are placed on a common scale, the
student’s score becomes a scale score. Scale scores adjust for slight differences in
difficulty on versions of the assessment that can vary slightly from student to student
within a year (referred to as forms of the assessment) or between school years (referred
to as administrations). Scale scores allow for comparisons of assessment scores, within
a particular grade and subject area, across administrations. As an example, a student
who receives a score of 475 on one form of the 7th grade social studies assessment is
expected to score a 475 on any form of the assessment. Scale scores maintain their
meaning and can be compared across years. A student who scores 650 on 8th grade
science in 2015 will demonstrate the same level of mastery of concepts and skills as an
8th grade science student who scores 650 in 2014. For CMAS science and social
studies, the scale scores cannot be used to compare student performance across
grades (e.g., grade 4 to grade 7) or subject areas (e.g., science to social studies).
Scale scores for the CMAS science and social studies assessments range from 300 to
900. Scale scores are reported for the overall test; for the content standards and Scientific
Inquiry/Nature of Science (referred to as “reporting categories”); and by item type.
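Conceptually, each form of a test has its own raw-to-scale conversion so that equally able students receive the same scale score regardless of which form they took. A minimal sketch with invented conversion tables (real tables come from psychometric equating and are not published in this guide):

```python
# Hypothetical raw-to-scale conversion tables for two forms of the
# same assessment. The raw scores and scale values are invented
# purely to illustrate why scale scores are comparable across forms.
FORM_A = {30: 470, 31: 475, 32: 481}
FORM_B = {29: 470, 30: 475, 31: 480}  # Form B is slightly harder

def scale_score(form_table: dict, raw_score: int) -> int:
    """Convert a raw score on one form to the common 300-900 scale."""
    return form_table[raw_score]

# A raw 31 on Form A and a raw 30 on the harder Form B both map to
# 475: students of equal ability earn the same scale score.
```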
Performance Levels
Scale scores are used to determine a student’s performance level for the overall
assessment. Each performance level describes a range of scores at the overall
assessment level (i.e., science or social studies). The range of scores in each
performance level was recommended by a group of Colorado educators and adopted
and skills that students are expected to demonstrate at each of the levels. The grade
level concepts and skills related to each performance level are listed on Page 4 of the
Student Performance Report. The four cross-grade and content area performance
levels are Distinguished Command, Strong Command, Moderate Command, and
Limited Command. Performance Level Descriptors for each grade level and content
area are included in Appendix A of this document.
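Mapping an overall scale score to a performance level is a lookup against the lowest score needed for each level. The cut scores below are hypothetical placeholders; the actual cuts were recommended by Colorado educators and adopted by the State Board, and they differ by grade and content area:

```python
# Hypothetical cut scores on the 300-900 scale, highest level first.
# Each entry is (lowest score needed, performance level).
CUTS = [
    (745, "Distinguished Command"),
    (650, "Strong Command"),
    (540, "Moderate Command"),
    (300, "Limited Command"),
]

def performance_level(scale_score: int) -> str:
    """Map an overall scale score to its performance level."""
    for lowest_score, level in CUTS:
        if scale_score >= lowest_score:
            return level
    raise ValueError("scale score below the reporting range")
```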
Students in the top two performance levels, Distinguished Command and Strong
Command, are considered on track to being college and career ready in the content
area of science or social studies. The Moderate Command and Limited Command
performance levels indicate that students in these levels may need academic
support to successfully engage in further studies in the content area. It is also important
for parents and educators to discuss whether the
school as a whole has made a complete transition to the new standards. Schools that
have not fully transitioned to the new standards may have a number of students who
score at the lowest two levels. A focus on the school level science or social studies
program may be appropriate.
Percent Correct
Percent correct refers to the number of points a student earned out of the total number
of points possible within a reporting category. The percent correct indicator can only be
used to compare performance of the individual student to the average district and
average state performance on the specific set of items being considered. Some groups
of items may be more difficult than other sets of items; so unlike the scale score, the
percent correct indicator cannot be compared across groups of items or across school
years. Percent correct scores are provided for Prepared Graduate Competencies
(PGCs) and Grade Level Expectations (GLEs). PGCs and GLEs are described more
fully later in this guide.
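The computation behind a percent correct score is simple division, which also shows why small point totals make the bar swing widely. The numbers in the example are illustrative:

```python
def percent_correct(points_earned: float, points_possible: float) -> float:
    """Percent of possible points earned within one reporting category
    (e.g., a single PGC or GLE)."""
    return 100.0 * points_earned / points_possible

# With only 4 points possible on a GLE, a single item moves the
# result by 25 points: 3 of 4 is 75.0, while 2 of 4 is 50.0.
```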
Sample CMAS Student Performance Report
A sample grade 8 Student Performance Report is displayed at the end of this section on
pages B-9 to B-12. Each page of the sample report is included individually. The sample
report includes the same type of information that is included on all of the science and
social studies reports. The information below describes each part of the report. To learn
more about each part of the Student Performance Report, match the white letters in
black circles from the sample report to the information included with the corresponding
letters on the following pages.
General Information (Refer to Page 1 of the Student Performance Report)
A. Identification Information
The top of the Student Performance Report lists your child’s name, part of the
state student identification number (SASID), birthdate, school and district.
B. Test Date
The season and year your child took the assessment is indicated.
C. Subject Area
The subject area of your child’s assessment is identified (either science or social
studies).
D. Grade Level
The grade level of your child’s assessment is indicated.
Overall Assessment Scores (Refer to Page 1 of the Student Performance
Report)
E. Explanation of Overall Performance
A brief explanation of the overall assessment results is given to help you
understand the information provided in the box below the explanation.
F. Your Child’s Overall Scale Score and Performance Level
Your child’s overall scale score (the number between 300 and 900) and
performance level (Distinguished Command, Strong Command, Moderate
Command or Limited Command) are provided. The scale score and performance
level included in this part of the report represent your child’s overall performance
on the assessment in the content area (science or social studies). Grade level
and content area specific performance level descriptors, which present the concepts
and skills students are typically able to demonstrate at each level, may be found
on the last page of the report.
G. Graphical Representation of Overall Performance: Scale Score and
Performance Level by Student, School, and District
Your child’s scale score is indicated by a large diamond on the graph. The
arrows to the left and right of the diamond indicate the range of scores your child
would likely receive if the assessment was taken multiple times.
The average scale scores at the school, district, and state levels are identified to
the left of the graph and are indicated by smaller diamonds on the graph. By
comparing the location of the diamonds, you can see how your child performed in
comparison to the average student in their school, district, or the state. If your
child’s score diamond is to the right of the school, district or state average
diamond, then your child performed better than that group’s average. If your
child’s diamond is to the left of the school, district or state diamond, then on
average, that group performed better than your child.
The dotted lines on the graph show the lowest scores needed to achieve
Moderate Command, Strong Command, and Distinguished Command
performance levels. The scale scores representing each of those scores are
indicated on the bottom of the graph.
Subscale Performance (Refer to Page 1 of the Student Performance
Report)
H. Explanation of Subscale Performance
In this part of the report, your child’s performance is presented by individual
reporting categories. Information to help you understand the graphical
representation in this section is included.
I. Reporting Category Descriptions
Reporting categories include the standards for social studies (history, geography,
economics and civics) and for science (physical science, life science, and earth
systems science). Science also includes Scientific Investigation and the Nature
of Science as a reporting category. Descriptions of the reporting categories from
the Colorado Academic Standards (CAS) are included in this section of the
report.
J. Subscale Scores
Subscale scores indicate how your child performed in each reporting category.
Like the overall science and social studies scale scores, subscale scores range
from 300 to 900 and can be compared across school years. Average subscale
scores are also provided for your child’s school and district.
K. Graphical Representation of Subscale Performance by Student, School,
and District
The graphical representation of subscale performance shows how your child
performed in each reporting category. Your child’s performance is represented by
a large diamond on the graph. The arrows around your child’s diamond show the
range of scores that your child would likely receive if the assessment was taken
multiple times.
The graphical representation also shows how your child performed in comparison
to other students in your child’s school, district, and the state. Performance of
students in the school and district are represented by smaller diamonds. If your
child’s score diamond is to the right of the school or district average diamond,
then your child’s scale score was higher than the school or district average scale
score. If your child’s diamond is to the left, then your child’s scale score was
lower than the school or district average.
The shaded areas of the graph represent the performance of about 70% of
students in the state. If your child’s score diamond is to the right of the shaded
area, your child’s performance is considered relatively strong in that area in
comparison to other students in the state. If your child’s score diamond is to the
left of the shaded area, your child’s performance is considered relatively weak in
that area in comparison to other students in the state. These categories are
based on the state performance for the current year and can change from year to
year.
L. Document Process Number
The document number located in the bottom-right corner of the report is a unique
number that is assigned to your child’s record by the testing contractor.
Performance by Prepared Graduate Competencies (PGCs) and Grade Level
Expectations (GLEs) (Refer to Page 2 of the Student Performance Report)
M. Explanation
Prepared Graduate Competencies (PGCs) and Grade Level Expectations (GLEs)
are important parts of the Colorado Academic Standards. PGCs represent the
concepts and skills students need to master in order to be college and career
ready by the time of graduation. GLEs are grade-specific expectations that
indicate that students are making progress toward the PGCs. This section of the
report describes performance with percent correct for PGCs and GLEs.
N. Graph Key
The graph key includes the explanatory text for the bars in the Percent Correct
graph: student’s performance, district average and state average.
O. Standard, PGC, and GLE
Descriptions of the Prepared Graduate Competencies (PGCs) and Grade Level
Expectations (GLEs) that were included on the assessment are listed under each
standard.
P. Points possible
This number shows the total points possible for each Prepared Graduate
Competency (PGC) and Grade Level Expectation (GLE) on the assessment.
Q. Graphical Representation of Percent Correct
The graph shows the percentage of items that were answered correctly out of the
total number of items for each Prepared Graduate Competency (PGC) and
Grade Level Expectation (GLE). When looking at the shaded bars in the graph,
you can compare your child’s performance to the average district and state
performance. Keep in mind that there are relatively few points associated with
each PGC or GLE. A student’s bar can look much longer or much shorter based
on a single correct or incorrect item response.
The graph for the Grade Level Expectation (GLE) is blank when Prepared
Graduate Competencies (PGCs) have only one associated Grade Level
Expectation (GLE) because the information is the same for both the GLE and
PGC.
Remember, percent correct score information cannot be compared across PGCs,
GLEs, or years.
Performance by Item Type (Refer to Page 3 of the Student Performance
Report)
CMAS assessments include selected-response and constructed-response items.
Selected-response items require students to choose the correct answer(s) from options
provided. Sometimes these are referred to as multiple choice items. In the CMAS
computer-based assessments, these can also include technology-enhanced items
referred to as drag-and-drop and hot spot. Constructed-response items require students
to develop their own answers to questions.
R. Selected-Response Scale Score
Your child’s scale score for selected-response items is shown. The arrows to the
left and right of the diamond indicate the range of scores your child would likely
receive if the assessment was taken multiple times. You can compare your
child’s scale score with the average scale scores for selected-response items for
your child’s school, district, and the state. Your child’s school and district can
compare next year’s groups of students to this year’s students by looking at
selected-response scale scores. This information can be used to support school
and district program and instructional improvement decisions.
S. Constructed-Response Scale Score
Your child’s scale score for constructed-response items is shown. The arrows to
the left and right of diamond indicate the range of scores your child would likely
receive if the assessment was taken multiple times. You can compare your
child’s scale score with the average scale scores for constructed-response items
for your child’s school, district, and the state. Your child’s school and district can
compare next year’s groups of students to this year’s students on the
constructed-response scale score. This information can be used to support
school and district program and instructional improvement decisions.
T. Graphical Representation of Selected-Response and Constructed-Response Scale Scores
A graphical representation of your child’s scale score is provided. The large
diamond on the graph represents your child’s scale score. The arrows around
your child’s score diamond show the range of scores that your child would likely
receive if the assessment was taken multiple times. The smaller diamonds
represent the average scale scores of your child’s school, district, and the state.
If your child’s score diamond is to the right of the school, district or state average
diamond, then your child performed better than that group’s average. If your
child’s diamond is to the left of the school, district or state diamond, then on
average, that group performed better than your child.
CMAS Student Performance Report
B-7
Performance Level Descriptions (Refer to Page 4 of the Student
Performance Report)
U. Performance Level Descriptions
Specific grade level and content area descriptions have been developed for each
of the four CMAS performance levels:
 Distinguished Command
 Strong Command
 Moderate Command
 Limited Command
Your child’s report will reflect the performance level descriptions specific to the
assessed grade level and content area. These performance level descriptors
describe the specific concepts and skills that students in each performance level
can typically demonstrate for your child’s assessed grade level and content area.
Performance level descriptors for each grade level and content area are included
in Appendix A of this document.
Students in the top two performance levels, Distinguished Command and Strong
Command, are considered on track to be college and career ready in the
content area of science or social studies.
CMAS Student Performance Report
B-8
Sample CMAS Student Performance Report – Page 1
CMAS Student Performance Report
B-9
Sample CMAS Student Performance Report – Page 2
CMAS Student Performance Report
B-10
Sample CMAS Student Performance Report – Page 3
CMAS Student Performance Report
B-11
Sample CMAS Student Performance Report – Page 4
CMAS Student Performance Report
B-12
Section C
A Parent and Educator Guide to Understanding
the Colorado Alternate Assessment (CoAlt)
Student Performance Report
Program Overview
CoAlt is the standards-based assessment designed specifically for students with
significant cognitive disabilities who are unable to participate in CMAS, even with
accommodations. Students must meet participation requirements to take CoAlt. CoAlt
assesses the performance expectations of the Colorado Academic Standards (CAS)
for students with significant cognitive disabilities as expressed in the Extended
Evidence Outcomes (EEOs). The primary purpose of the assessment program
is to determine the level at which Colorado students with significant cognitive disabilities
meet the EEOs of the CAS in the content areas of science and social studies at the
tested grade level. CoAlt results are intended to provide one measure of a student’s
academic progress relative to the EEOs of the CAS. CoAlt science assessments are
given to students in grades 5 and 8, while CoAlt social studies assessments are given in
grades 4 and 7, each spring.
CoAlt assessments are administered in a one-on-one setting between teachers and
students. Teachers use CoAlt scoring rubrics to evaluate student responses before
submitting performance results.
A Student Performance Report is created for each student who takes a CoAlt
assessment. This section of the guide explains the elements of the Student
Performance Report.
Types of Scores on the CoAlt Student Performance Report
To understand each part of the Student Performance Report, it is important to become
familiar with the types of assessment scores that are included on the report. Student
performance is described by a scale score, performance level and percent of points
earned. State level information is also provided in relevant sections of the Student
Performance Report to help parents understand how their child’s performance
compares to other students.
CoAlt Student Performance Report
C-1
Scale Scores
When the points a student earns on an assessment are placed on a common scale, the
student’s score becomes a scale score. Scale scores adjust for slight differences in
difficulty on versions of the assessment that can vary slightly from student to student
within a year (referred to as forms of the assessment) or between school years (referred
to as administrations). The scale score allows for a comparison of assessment scores,
within a particular grade and subject area, across administrations. As an example, a
student who receives a score of 125 on one form of the 7th grade social studies
assessment is expected to score a 125 on any form of the assessment. Scale scores
maintain their meaning and can be compared across years. For example, a student who
scores 125 on 8th grade science in 2015 will demonstrate the same level of mastery of
concepts and skills as an 8th grade science student who scores 125 in 2014. For CoAlt
science and social studies, the scale scores cannot be used to compare students’
performance across grades (e.g., grade 4 to grade 7) or subject areas (e.g., science to
social studies).
Scale scores for the CoAlt science and social studies assessments range from 0 to 250.
Scale Scores are reported for the overall test.
Performance Levels
Scale Scores are used to determine a student’s performance level for the overall
assessment. Each performance level describes a range of scores at the overall
assessment level (i.e., science or social studies). The range of scores in each
performance level was recommended by a group of Colorado educators and adopted
by the Colorado State Board of Education. Performance levels describe the concepts
and skills students are expected to demonstrate at each of the levels. The grade level
concepts and skills related to each performance level are listed on Page 2 of the
Student Performance Report. The four cross-grade and content area performance
levels are Novice, Developing, Emerging, and Exploring. Performance level
descriptors are included in Appendix B.
Students who do not respond to any items on the assessment receive an inconclusive
designation.
The top two Performance Levels indicate that with the appropriate supports, the student
is prepared for further study in the content area.
Percent of Points Earned
Percent of points earned refers to the number of points a student earned out of the total
number of points possible within a reporting category. The percent of points earned
indicator can only be used to compare performance of the individual student to the
average state performance on the specific set of items being considered. Some groups
of items may be more difficult than other sets of items; so unlike the scale score, the
percent of points earned indicator cannot be compared across groups of items or across
school years. Percent of points earned are provided at the standard level. For social
studies, the standards are history, geography, economics, and civics. For science, the
standards are physical science, life science and earth systems science.
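The percent-of-points-earned arithmetic above can be sketched in a few lines of Python. The standards named are the actual CoAlt social studies reporting categories, but the point totals are hypothetical illustrations, not values from any real report.

```python
# Sketch of the percent-of-points-earned calculation. The point totals
# below are hypothetical illustrations, not actual CoAlt values.

def percent_of_points(points_earned, points_possible):
    """Return the percent of possible points earned, to the nearest whole percent."""
    if points_possible <= 0:
        raise ValueError("points_possible must be positive")
    return round(100 * points_earned / points_possible)

# Hypothetical grade 7 social studies results by standard
points_earned = {"History": 9, "Geography": 6, "Economics": 4, "Civics": 7}
points_possible = {"History": 12, "Geography": 10, "Economics": 8, "Civics": 10}

for standard in points_earned:
    pct = percent_of_points(points_earned[standard], points_possible[standard])
    print(f"{standard}: {pct}% of points earned")
```

Because item difficulty varies between item sets, these percentages are only meaningful alongside the state average for the same set of items, as described above.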
Sample CoAlt Student Performance Report
A sample grade 7 social studies Student Performance Report is displayed at the end of
this section on pages C-6 to C-7. The sample report includes the same type of
information that is included on all of the science and social studies reports. The
information below describes each part of the report. To learn more about each part of
the Student Performance Report, match the white letters in black circles from the
sample report to the information included with the corresponding letters on the following
pages.
General Information (Refer to Page 1 of the Student Performance
Report)
A. Identification Information
The top of the Student Performance Report lists your child’s name, part of the
state student identification number (SASID), birthdate, school and district.
B. Test Date
The season and year your child took the assessment are indicated.
C. Subject Area
The subject area of your child’s assessment is identified (either science or social
studies).
D. Grade Level
The grade level of your child’s assessment is indicated.
Overall Assessment Scores (Refer to Page 1 of the Student Performance
Report)
E. Explanation of Overall Performance
A brief explanation of the overall assessment results is given to help you
understand the information provided in the box below the explanation.
F. Your Child’s Overall Scale Score and Performance Level
Your child’s overall scale score (the number between 0 and 250) and
performance level (Exploring, Emerging, Developing or Novice) are provided. An
inconclusive designation is given to students who did not respond to any items
on the assessment. The scale score and performance level included in this part
of the report represent your child’s overall performance on the assessment in the
content area (science or social studies). Grade level and content area-specific
performance level descriptors providing the concepts and skills students are
typically able to demonstrate at each level may be found on the second page of
the report.
G. Graphical Representation of Overall Performance: Scale Score and
Performance Level by Student, School, and District
Your child’s scale score is indicated by a large diamond on the graph. The
arrows to the left and right of the diamond indicate the range of scores your child
would likely receive if the assessment were taken multiple times.
The average scale score at the state level is identified to the left of the graph and
is indicated by a smaller diamond on the graph. By comparing the location of the
diamonds, you can see how your child performed in comparison to the average
student at the state level. If your child’s score diamond is to the right of the state
average diamond, then your child performed better than the state average. If
your child’s diamond is to the left of the state diamond, then on average, the
state performed better than your child.
The dotted lines on the graph show the lowest scores needed to achieve the
Emerging, Developing, and Novice performance levels. The scale scores marking
each of those cut points are indicated at the bottom of the graph.
Content Standard Performance (Refer to Page 1 of the Student
Performance Report)
H. Content Standard Descriptions
This section of the report provides descriptions for social studies (history,
geography, economics and civics) and for science (physical science, life science,
and earth systems science).
I. Points Earned
Points earned indicates how many points your child earned for each content
standard.
J. Points Possible
Points possible indicates the total number of points possible for each content
standard.
K. Graphical Representation of Subscale Performance by Student, School,
and District
The graphical representation of content standard performance shows how your
child performed in each standard. Your child’s performance is represented by a
large diamond on the graph. The arrows around your child’s diamond show the
range of scores that your child would likely receive if the assessment were taken
multiple times.
The graphical representation also shows how your child performed in comparison
to other students in the state. Performance of students in the state is
represented by smaller diamonds. If your child’s score diamond is to the right of
the state average diamond, then your child’s percent of points earned was higher
than the state average. If your child’s diamond is to the left, then your child’s
percent of points earned was lower than the state average.
L. Graph Key
The graph key indicates the bar color for your child’s score and the average
score for students in the state.
M. Document Process Number
The document number found in the bottom-left corner of the report is a unique
number, per administration, that is assigned to your child’s record by the testing
contractor.
Performance Level Descriptions (Refer to Page 2 of the Student
Performance Report)
N. Performance Level Descriptions
Specific grade level and content area descriptions have been developed for each
of the four CoAlt performance levels:
 Novice
 Developing
 Emerging
 Exploring
Your child’s report will reflect the performance level descriptions specific to the
assessed grade level and content area. These performance level descriptors describe
the specific concepts and skills that students in each performance level can typically
demonstrate for your child’s assessed grade level and content area. Performance level
descriptors are located in Appendix B.
CoAlt Student Performance Report
C-5
Sample CoAlt Student Performance Report – Page 1
CoAlt Student Performance Report
C-6
Sample CoAlt Student Performance Report – Page 2
CoAlt Student Performance Report
C-7
Section D
An Educator Guide to Understanding the
Colorado Measures of Academic Success
School and District Reports
Program Overview
CMAS are Colorado’s standards-based assessments designed to measure the
Colorado Academic Standards (CAS) in the content areas of science and social studies.
The CAS contain the concepts and skills students need to learn in order to be
successful in the current grade and to make academic progress from year to year.
CMAS science assessments are given to students in grades 5 and 8 while CMAS social
studies assessments are given in grades 4 and 7 each spring.
The science and social studies CMAS assessments are Colorado’s first statewide
computer-based assessments. The online assessments allow for new item types that
were not possible under the prior paper-based system, such as science simulations.
Online assessments also foster increased student engagement. The assessments were
designed to provide high level content area information (i.e., a science score or social
studies score) as well as standard specific scores (i.e., scores for history, geography,
civics and economics).
Purpose and Use of CMAS Results
 Report on the status and progress of student achievement
 Make judgments about student learning relative to standards
 Gauge school, district, and state year-to-year progress
 Support improvement planning (e.g., prioritize professional learning and resource
decisions, consider program alignment with academic standards, reflect on
effectiveness of school initiatives)
Standardized assessments are a valuable tool for evaluating programs. However, any
assessment can furnish only one part of the picture. CMAS science and social studies
assessment results are not able to identify, let alone measure, every factor that
contributes to the success or failure of a program. Assessment results can be most
helpful if considered as one component of an evaluation system.
CMAS School and District Reports
D-1
CMAS School and District Reports
In addition to individual Student Performance Reports, schools and districts receive
Performance Level Summaries, Content Standards Rosters and Item Analysis
Summaries. These reports summarize how students in the school or district performed
and are described later in this section.
Note that the sample reports included in this guide are for illustration purposes only.
They are provided to show the basic layout of the reports and the information they
provide.
Types of Scores on the CMAS School and District Reports
To understand each part of the CMAS school and district reports, it is important to
become familiar with the types of assessment scores that are included on the report. At
varying levels, student performance is described by scale scores, performance levels
and percent correct. State, district and school level information is provided in relevant
sections of the reports so that performance at these levels can be compared.
Scale Scores
When the points a student earns on a test are placed on a common scale, the student’s
score becomes a scale score. Scale scores adjust for slight differences in difficulty on
versions of the assessment that can vary slightly from student to student within a year
(referred to as forms of the assessment) or between school years (referred to as
administrations). The scale score allows for a comparison of assessment scores, within
a particular grade and subject area, across administrations. As an example, a student
who receives a score of 475 on one form of the 7th grade social studies assessment is
expected to score a 475 on any form of the assessment. Scale scores maintain their
meaning and can be compared across years. A student who scores 650 on 8th grade
science in 2015 will demonstrate the same level of mastery of concepts and skills as an
8th grade science student who scores 650 in 2014. For CMAS science and social
studies, the scale scores cannot be used to compare students’ performance across
grades (e.g., grade 4 to grade 7) or subject areas (e.g., science to social studies).
Scale scores for the CMAS science and social studies assessments range from 300 to
900. Scale scores are reported for the overall assessment, for content standards and
Scientific Inquiry/Nature of Science (referred to as “reporting categories”), and by
item type.
Performance Levels
Scale Scores are used to determine a student’s performance level for the overall
assessment. Each performance level describes a range of scores at the overall
assessment level (i.e., science or social studies). The range of scores in each
performance level was recommended by a group of Colorado educators and adopted
by the Colorado State Board of Education. The performance levels describe the
concepts and skills students are expected to demonstrate at each of the levels. The
grade level concepts and skills related to each performance level are listed on Page 4 of
the Student Performance Report. The four cross-grade and content area performance
levels are Distinguished Command, Strong Command, Moderate Command and Limited
Command. Performance level descriptors for each grade level and content area are
included in Appendix A.
Students in the top two performance levels, Distinguished Command and Strong
Command, are considered on track to be college and career ready in the content area
of science or social studies. Although the Moderate Command and Limited Command
performance levels indicate that students performing at these levels may need academic
support to successfully engage in further studies in the content area, it will be important
for parents and educators to discuss whether the school as a whole has made a
complete transition to the new standards. Schools that have not fully transitioned to the
new standards may have a number of students who score at the lowest two levels. In
that case, a focus on the school-level science or social studies program may be
appropriate.
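The mapping from an overall scale score to one of the four performance levels can be sketched as a cut-score lookup. The cut scores below are hypothetical placeholders; the actual ranges are those adopted by the Colorado State Board of Education and differ by grade level and content area.

```python
import bisect

# Hypothetical cut scores: the lowest scale score needed for Moderate,
# Strong, and Distinguished Command. Actual cut scores vary by grade
# and content area and are set by the State Board of Education.
CUT_SCORES = [500, 600, 700]
LEVELS = ["Limited Command", "Moderate Command",
          "Strong Command", "Distinguished Command"]

def performance_level(scale_score):
    """Return the performance level whose score range contains scale_score."""
    if not 300 <= scale_score <= 900:  # CMAS scale score range
        raise ValueError("CMAS scale scores range from 300 to 900")
    return LEVELS[bisect.bisect_right(CUT_SCORES, scale_score)]

print(performance_level(650))  # "Strong Command" under these hypothetical cuts
```

Because each cut score is the lowest score needed to achieve a level, a score exactly at a cut falls into the higher level.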
Percent Correct
Percent correct refers to the number of points a student earned out of the total number
of points possible within a reporting category. The percent correct indicator can only be
used to compare performance of the individual student to the average district and
average state performance on the specific set of items being considered. Some groups
of items may be more difficult than other sets of items; so unlike the scale score, the
percent correct indicator cannot be compared across groups of items or across school
years. Percent correct scores are provided for Prepared Graduate Competencies
(PGCs) and Grade Level Expectations (GLEs).
Percent correct information can be useful in helping schools identify areas in which
further diagnosis is warranted. As with all assessments given at a single point in time,
the data generated from this snapshot should be used in conjunction with other
evaluations of performance to provide an in-depth portrait of student achievement. Once
an area of possible weakness has been identified, supplementary data should be
gathered to further define which instructional intervention would be most effective.
Percent correct is also used on the Item Analysis Summary reports. For these
reports, the percent of students answering an item correctly is provided by school,
district and state. In this case, percent correct is an indicator of how difficult students
at the school, district and state levels found specific items.
CMAS School and District Reports
D-3
Appropriate Score Comparisons and Uses
The types of comparisons that can be made differ by the scores being compared. Some
scores (e.g., performance levels and scale scores) allow for cross year comparisons,
some (e.g., percent correct) do not. In addition, the reliability of the comparisons or
conclusions made varies with the size of the group (e.g., the number of points
contributing to a particular score or the number of students included in a comparison
group): the larger the group, the more reliable the comparison or conclusion; the
smaller the group, the less reliable. High stakes decisions should not be based on the
scores of small groups of students or on scores with a low number of points
contributing to them. The following table shows some of the comparisons that can and
cannot be made with particular types of scores.
Score Comparisons

Five types of comparison are considered:
1. Compare an individual student’s performance to a target group’s performance (e.g., student to school, district, or state) within the same year
2. Compare a group’s performance to another group’s performance (e.g., one school to another school, a district to the state, students of one race/ethnicity group to students in another race/ethnicity group) within the same year
3. Compare an individual student’s performance to a target group’s performance (e.g., school, district, or state) across years
4. Compare a group’s performance to the same group’s performance across years
5. Compare to other scores of the same type in a different subject or grade

Performance Levels: 1. YES; 2. YES; 3. YES; 4. YES; 5. NO (performance levels are content and grade specific)
Scale Scores: 1. YES; 2. YES; 3. YES; 4. YES; 5. NO (scale scores are content and grade specific)
Percent Correct: 1. YES; 2. YES; 3. NO; 4. NO (percent correct is specific to the year of the assessment); 5. NO (percent correct is specific to the PGC/GLE)
Relative Strengths and Weaknesses (Subscale Reporting Categories)*: 1. YES; 2. YES; 3. NO; 4. NO (these are specific to the year of the assessment); 5. NO (these are specific to the Reporting Category)
CMAS School and District Reports
D-4
* Potential relative strengths or weaknesses provide information about a student’s
performance in the reporting category compared to all students in the state. The
potential relative strengths and weaknesses are based on the state average
performance. They are not based on the standards and should not be interpreted in the
same way as the overall performance levels.
Some assessment scores can be used to compare the performance of different
demographic or program groups. All CMAS scores for science (grades 5 and 8) and
social studies (grades 4 and 7) can be analyzed within the same grade and subject area
for any single administration to determine which demographic or program group had the
highest average scale score, the lowest percentage achieving Strong Command, the
highest percentage achieving Moderate Command, etc.
Other scores can be used to help evaluate the academic performance of demographic
or program groups. For example, aggregations of reporting-category data can help
districts and schools identify areas of potential academic weakness for a group of
students. This same methodology can be applied to an entire school or district.
In addition, all assessment scores can be compared to district and statewide
performance within the same subject area for any administration.
CMAS School and District Reports
D-5
CMAS Performance Level Summary Report
The Performance Level Summary Report is available for each grade assessed at each school
or district. It contains aggregated performance level information across the school,
district and state. It also contains disaggregated performance level data by student
demographic and program categories and subgroups for either the school or the district.
Page 1 of a sample report is included on page D-8.
General Information
A. Identification Information
The report identifies the names and codes of the school and district.
B. Test Date
The administration season and year are indicated.
C. Subject Area
The subject area of the report is identified (either science or social studies).
D. Grade Level
The grade level of the assessment is indicated.
Performance Level Distribution Data
E. Demographic and Program Categories and Subgroups
Demographic and program categories with subgroups are listed on the left side of the
table. Results for students for whom no demographic or program information was
coded are included in the “not indicated” subgroups.
F. Total Number Tested
The number of students who should have been assessed is provided.
G. Average Scale Score
The average scale score is displayed for state, district, school and each demographic
or program subgroup. The average does not include students with no scores.
H. Performance Level Results
The number and percentage of students who achieved Limited Command, Moderate
Command, Strong Command, and Distinguished Command, as well as aggregated
Strong and Distinguished Command, are displayed for each demographic or program
subgroup.
CMAS School and District Reports
D-6
I. No Scores Reported
This is the number of students registered to take the CMAS who did not receive
scores. They are not included in the denominator for the Performance Level
percentages.
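The summary arithmetic described in items G through I can be sketched as follows. The counts are hypothetical; the key point is that students with no scores reported are excluded from the denominator for the performance level percentages.

```python
# Sketch of Performance Level Summary percentages. Counts are hypothetical,
# not from an actual report. Students with no scores are excluded from the
# percentage denominator, as described above.
from collections import Counter

level_counts = Counter({
    "Limited Command": 12,
    "Moderate Command": 30,
    "Strong Command": 25,
    "Distinguished Command": 8,
})
no_scores_reported = 5  # not counted in the denominator

valid_scores = sum(level_counts.values())
for level, n in level_counts.items():
    print(f"{level}: {n} students ({100 * n / valid_scores:.1f}%)")

# Aggregated Strong and Distinguished Command, as on the report
strong_or_above = level_counts["Strong Command"] + level_counts["Distinguished Command"]
print(f"Strong and Distinguished Command combined: {100 * strong_or_above / valid_scores:.1f}%")
```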
J. Process number
The process number found in the bottom-right corner of the report is a unique
number, per administration, that is assigned to the report by the testing contractor.
CMAS School and District Reports
D-7
Sample CMAS Performance Level Summary Report
CMAS District Reports
D-8
CMAS Content Standards Roster
The Content Standards Roster is available for each grade assessed at each school. It
lists every student for whom an answer document or online record was submitted. This
report provides the overall performance level, reporting category, and PGC/GLE data for
each student. It also provides the same information aggregated at the state, district and
school levels. A sample report is included on pages D-12 and D-13.
General Information (Refer to Page 1 of the Content Standards Roster)
A. Identification Information
The report identifies the school and district name and code.
B. Test Date
The administration season and year are indicated.
C. Subject Area
The subject area of the report is identified (either science or social studies).
D. Grade Level
The grade level of the assessment is indicated.
The general information is repeated on page 2 of the report.
Performance Level and Content Standards Information (Refer to Page 1 of the
Content Standards Roster)
E. Key
The key indicates the ranges of scale scores for each performance level for the
overall test. It also explains the symbols used to identify the performance
indicators for content standard performance (potential relative strength, typical, or
potential relative weakness).
F. Student Information
Students are identified by first name, last name, and middle initial. Expelled students are
included only in the district version of the roster and their status is indicated in this
section.
G. Content Standards Performance School Summary
The percentage and number of students in a school who show potential relative
strength (filled in circle), typical performance (half filled in circle), and potential
relative weakness (empty circle) for the reporting categories are provided for each
standard. At the state level, the distribution is approximately 15%/70%/15%.
CMAS District Reports
D-9
H. State, District, and School Average
For comparison purposes, the average overall scale score and content standard
(reporting category) scale score are shown for the state, district, and school.
I. Overall Performance Level
The overall performance level is indicated for each student on the roster.
J. Overall Scale Score
The overall scale score is indicated for each student on the roster.
K. SEM Range
The standard error of measurement (SEM) is related to the reliability of the
assessment. It can vary across the range of scale scores, especially at the very
high and low ends where there typically are fewer items measuring that level of
achievement. The SEM represents the range of overall scores the student would
be likely to earn if the assessment were taken again.
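As an illustration of the SEM range, the band of likely scores is the reported scale score plus or minus the SEM, kept within the reporting scale. The SEM value used below is hypothetical; actual SEMs vary across the score range as noted above.

```python
# Sketch of the SEM range: the band of likely scores on retest, clipped to
# the CMAS reporting scale. The SEM value below is hypothetical.
SCALE_MIN, SCALE_MAX = 300, 900  # CMAS scale score range

def sem_range(scale_score, sem):
    """Return the (low, high) band of overall scores the student would likely earn."""
    return (max(SCALE_MIN, scale_score - sem),
            min(SCALE_MAX, scale_score + sem))

print(sem_range(640, 25))  # (615, 665)
print(sem_range(890, 25))  # clipped at the top of the scale: (865, 900)
```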
L. Results for Each Content Standard (Reporting Category): Scale Score and
Performance Indicator
For each content standard (reporting category), the student’s scale score (SS)
and performance indicator (PI) of potential relative strength, typical performance,
or potential relative weakness is shown.
M. Process number
The process number found in the bottom-right corner of the report is a unique
number, per administration, that is assigned to the report by the testing contractor.
Prepared Graduate Competencies and Grade Level Expectations Performance
(Refer to Page 2 of the Content Standards Roster)
N. Student information
Students are identified by first name, last name, and middle initial.
O. State, District, and School Average
For comparison purposes, the average percentage correct is shown for the
Prepared Graduate Competencies (PGC) at the state, district, and school levels. If
there are two or more Grade Level Expectations under a PGC, percent correct is
shown for these as well.
P. Prepared Graduate Competencies and Grade Level Expectations
Prepared Graduate Competencies (PGCs) and Grade Level Expectations (GLEs)
are important parts of the Colorado Academic Standards. PGCs represent the
concepts and skills students need to master in order to be college and career
ready by the time of graduation. The GLEs are grade-specific expectations that
indicate that students are making progress toward the PGCs.
Q. Points Possible
The number of points possible for each PGC and GLE is identified.
R. Performance for Prepared Graduate Competencies and Grade Level
Expectations
This section of the report describes performance with percent correct for PGCs
and GLEs. The percentage correct for each PGC is presented. If there is more
than one GLE within a PGC, then the percentage correct by GLE is also provided.
The PGCs and GLEs are listed in the same order using the same number
references as they appear on the second page of the student performance report.
The order and text for each PGC and GLE is included in Appendix C.
CMAS District Reports
D-11
Sample CMAS Content Standards Roster
CMAS District Reports
D-12
CMAS District Reports
D-13
CMAS Item Analysis Report
A CMAS Item Analysis Report is available at the district and school levels for each
assessed grade level and content area. The report includes item level score information
at the school, district and state levels. The back of the report includes item map
information.
Information included on the Item Analysis Report can be used to identify patterns of
items on which a school is performing better or worse than the district or state, or on
which a district is performing better or worse than the state. For example, a school may
out-perform its district and the state on one GLE while performing worse than the
district and the state on another GLE. In combination with other evidence and data,
schools and districts can use the information in the CMAS Item Analysis Report to
identify patterns across standards and GLEs that may be indicative of potential areas
of strength or weakness. A sample Item Analysis Report is included on pages D-16
and D-17.
General Information (Refer to Page 1 of the Item Analysis Report)
A. Identification Information
The report identifies the school and district name and code.
B. Test Date
The administration season and year are indicated.
C. Subject Area
The subject area of the report is identified (either science or social studies).
D. Grade Level
The grade level of the assessment is indicated.
The general information is repeated on page 2 of the report.
Item Analysis Information (Refer to Page 1 of the Item Analysis Report)
E. Number of Students with Valid Scores
The number of students with valid scores is indicated.
F. Graph Key
The graph key includes explanatory text for the symbols and lines in the graph:
State, District, and School.
G. Percent of Average Points Earned
The percent of average points earned is included to the left of the graphical
representation of state, district, and school performance by item. Items that
were more difficult for students across the state have a lower percent of
average points earned. For 1-point selected response items, the percent of
students who correctly responded is recorded. For 2- and 3-point
constructed response items, the average points earned is divided by 2 or 3,
respectively, to create the percentage.
H. Numbered Items
Items are identified by numbers in blue text at the bottom of the graph and
are ordered from most difficult to least difficult, such that the easiest item is
labeled as 1.
I. Standard and Grade Level Expectation (GLE)
Below each item number, the standard and Grade Level Expectation (GLE)
for that item is listed.
J. Graphical Representation of State, District, and School Level Performance by Item
The graphical representation shows how the state, district, and school
performed on each operational item. The state is represented as a red line
with triangles, the district as a blue line with circles, and the school as a
green line with squares.
K. Document Process Number
The document number located in the bottom-right corner of the report is a
unique number that is assigned by the testing contractor.
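The percent of average points earned calculation described under item G can be sketched as follows. This is a minimal illustration with made-up item scores, not the contractor's scoring code:

```python
# Sketch of the "percent of average points earned" calculation described
# under item G. Student scores below are hypothetical.

def percent_of_average_points_earned(scores, max_points):
    """Average points earned across students, divided by the item's
    maximum possible points, expressed as a percentage."""
    average = sum(scores) / len(scores)
    return 100 * average / max_points

# 1-point selected response item: the result equals the percent of
# students who responded correctly (3 of 4 students here).
sr_scores = [1, 0, 1, 1]
print(percent_of_average_points_earned(sr_scores, 1))   # 75.0

# 3-point constructed response item: the average points earned (1.5)
# is divided by 3 to create the percentage.
cr_scores = [3, 2, 1, 0]
print(percent_of_average_points_earned(cr_scores, 3))   # 50.0
```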
Item Map Information (Refer to page 2 of the Item Analysis Report)
L. Item Map Information
Page 2 of the Item Analysis Report includes information for all of the
operational items that were included on the assessment. Items are
ordered from most to least difficult. For each item, the following information
is included:
 Difficulty Order Most to Least (matches page 1)
 Standard and Grade Level Expectation (GLE) Numbers
 Location on the test (section number and item number)
 Standard by Name
 Prepared Graduate Competency (PGC)
 Grade Level Expectation (GLE)
 Item Type (Selected Response (SR); 2-point Constructed Response (CR-2); 3-point Constructed Response (CR-3))
Sample Item Analysis Report – Page 1
Sample Item Analysis Report – Page 2
Section E
An Educator Guide to Understanding the
Colorado Assessments (CoAlt) Science and
Social Studies School and District Reports
Program Overview
CoAlt is the standards-based assessment designed specifically for students with
significant cognitive disabilities who are unable to participate in CMAS, even with
accommodations. Students must meet eligibility requirements to take CoAlt. CoAlt
assesses the performance expectations of the Colorado Academic Standards (CAS) for
students with significant cognitive disabilities as expressed in the Extended Evidence
Outcomes (EEOs) of the CAS. The primary purpose of the assessment program is to
determine the level at which Colorado students with significant cognitive disabilities meet
the EEOs of the CAS in the content areas of science and social studies at the tested
grade level. CoAlt results are intended to provide one measure of a student’s academic
progress relative to the CAS. CoAlt science assessments are given to students in grades
5 and 8, while CoAlt social studies assessments are given in grades 4 and 7, each spring.
CoAlt assessments are administered in a one-on-one setting between teacher and
student. Teachers use CoAlt scoring rubrics to evaluate student responses before
submitting performance results.
This section of the guide explains the elements of the CoAlt school and district reports.
Purpose and Use of CoAlt Results
 Report on the status and progress of student achievement
 Make judgments about student learning relative to standards
 Gauge school, district, and state year-to-year progress
 Improvement planning (e.g., prioritize professional learning and resource
decisions, consider program alignment to academic standards, reflect on
effectiveness of school initiatives)
Standardized assessments are a valuable tool for evaluating programs. However, any
assessment can furnish only one part of the picture. CoAlt science and social studies
assessment results are not able to identify, let alone measure, every factor that
contributes to the success or failure of a program. Assessment results can be most
helpful if considered as one component of an evaluation system.
CoAlt School and District Reports
In addition to individual Student Performance Reports, districts receive Performance
Level Summaries and schools and districts receive Content Standards Rosters. These
reports summarize how students in the school or district performed and are described
later in this section.
Note that the sample reports included in this guide are for illustration only. They are
provided to show the basic layout of the reports and the information they include.
Types of Scores on the CoAlt School and District Reports
To understand each part of the Student Performance Report, it is important to become
familiar with the types of assessment scores that are included on the report. Student
performance is described by a scale score, performance level and percent of points
earned.
Scale Scores
When the points a student earns on a test are placed on a common scale, the student’s
score becomes a scale score. Scale scores adjust for slight differences in difficulty
between versions of the assessment, which can vary slightly from student to student
within a year (referred to as forms of the assessment) or between school years (referred
to as administrations). The scale score allows for a comparison of assessment scores,
within a particular grade and subject area, across administrations. For example, a student
who receives a score of 125 on one form of the 7th grade social studies assessment is
expected to score 125 on any other form of that assessment. Scale scores maintain their
meaning and can be compared across years: a student who scores 125 on 8th grade
science in 2015 demonstrates the same level of mastery of concepts and skills as an
8th grade science student who scored 125 in 2014. For CoAlt science and social studies,
scale scores cannot be used to compare student performance across grades (e.g., grade
4 to grade 7) or subject areas (e.g., science to social studies).
Scale scores for the CoAlt science and social studies assessments range from 0 to 250
and are reported for the overall test.
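The role of the scale in equating forms can be illustrated with a toy conversion table. The numbers below are invented purely for illustration; actual CoAlt conversions are produced through psychometric equating:

```python
# Toy illustration of why scale scores can be compared across forms while
# raw point totals cannot. Both conversion tables are invented; real CoAlt
# raw-to-scale conversions come from psychometric equating.

raw_to_scale_form_A = {10: 120, 11: 125, 12: 130}   # slightly harder form
raw_to_scale_form_B = {11: 120, 12: 125, 13: 130}   # slightly easier form

# The same underlying performance yields different raw scores on the two
# forms (11 points on A, 12 points on B) but the same scale score:
assert raw_to_scale_form_A[11] == raw_to_scale_form_B[12] == 125
```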
Performance Levels
Scale scores are used to determine a student’s performance level for the overall
assessment. Each performance level describes a range of scores at the overall
assessment level (i.e., science or social studies). The ranges of scores in each
performance level were recommended by a group of Colorado educators and adopted by
the Colorado State Board of Education. Performance levels describe the concepts and
skills students are expected to demonstrate at each of the levels. The grade-level
concepts and skills related to each performance level are listed on page 2 of the Student
Performance Report. The four performance levels, which apply across grades and
content areas, are Novice, Developing, Emerging, and Exploring. Performance level
descriptors are included in Appendix B of this document.
Students who do not respond to any items on the assessment receive an Inconclusive
designation.
The top two performance levels indicate that, with appropriate supports, the student is
prepared for further study in the content area.
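Determining a performance level from a scale score is a cut-score lookup, sketched below. The cut scores here are invented for illustration; the actual ranges were recommended by Colorado educators and adopted by the State Board of Education:

```python
# Sketch: mapping an overall scale score (0-250) to a CoAlt performance
# level via cut scores. Cut values below are hypothetical.

CUTS = [            # (minimum scale score, level), highest level first
    (190, "Novice"),
    (160, "Developing"),
    (130, "Emerging"),
    (0,   "Exploring"),
]

def performance_level(scale_score):
    """Return the performance level for a scale score, or 'Inconclusive'
    when no score exists (the student attempted no items)."""
    if scale_score is None:
        return "Inconclusive"
    for minimum, level in CUTS:
        if scale_score >= minimum:
            return level

print(performance_level(175))   # Developing (under these invented cuts)
print(performance_level(None))  # Inconclusive
```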
Percent of Points Earned
Percent of points earned refers to the number of points a student earned out of the total
number of points possible within a reporting category. This indicator can only be used to
compare the performance of the individual student to the average state performance on
the specific set of items being considered. Some groups of items may be more difficult
than other sets of items; so, unlike the scale score, the percent of points earned indicator
cannot be compared across groups of items or across school years.
Percent of points earned is provided at the standard level. For social studies, the
standards are history, geography, economics, and civics. For science, the standards are
physical science, life science, and earth systems science.
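As a simple illustration, percent of points earned by standard is computed as shown below. The standards match those listed above, but the point totals are hypothetical:

```python
# Sketch of "percent of points earned" by reporting category (standard).
# Point totals below are hypothetical; actual points possible vary by test.

def percent_of_points_earned(points_earned, points_possible):
    return 100 * points_earned / points_possible

# One student's social studies results by standard (illustrative only):
results = {
    "History":   (7, 12),
    "Geography": (5, 8),
    "Economics": (3, 6),
    "Civics":    (6, 10),
}
for standard, (earned, possible) in results.items():
    print(f"{standard}: {percent_of_points_earned(earned, possible):.1f}%")
```

Because each standard has a different set of items with its own difficulty, these percentages describe performance on those specific items only, which is why they cannot be compared across standards or across years.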
Appropriate Score Comparisons and Uses
The types of comparisons that can be made differ by the scores being compared. Some
scores (e.g., performance levels and scale scores) allow for cross-year comparisons;
some (e.g., percent of points earned) do not. In addition, the reliability of the comparisons
or conclusions made varies depending on the size of the group (e.g., the number of
points contributing to a particular score or the number of students included in a
comparison group). In general, the larger the group, the more reliable the comparison or
conclusions will be; the smaller the group, the less reliable they will be. High-stakes
decisions should not be based on scores of small groups of students or on scores with a
low number of points contributing to them. The table below shows some of the
comparisons that can and cannot be made with particular types of scores.
SCORE COMPARISONS
Performance Levels
 Compare an individual student’s performance to a target group’s performance (e.g., student to state) within the same year: YES
 Compare a group’s performance to another group’s performance (e.g., one district to another district, a district to the state, students of one race/ethnicity group to students in another race/ethnicity group) within the same year: YES
 Compare an individual student’s performance to a target group’s performance (e.g., state) across years: YES
 Compare a group’s performance to the same group’s performance across years: YES
 Compare to other scores of the same type in a different subject or grade: NO (these are content and grade specific)
Scale Scores
 Compare an individual student’s performance to a target group’s performance within the same year: YES
 Compare a group’s performance to another group’s performance within the same year: YES
 Compare an individual student’s performance to a target group’s performance across years: YES
 Compare a group’s performance to the same group’s performance across years: YES
 Compare to other scores of the same type in a different subject or grade: NO (these are content and grade specific)
Percent of Points Earned
 Compare an individual student’s performance to a target group’s performance within the same year: YES
 Compare a group’s performance to another group’s performance within the same year: YES
 Compare an individual student’s performance to a target group’s performance across years: NO (these are specific to the year of the assessment)
 Compare a group’s performance to the same group’s performance across years: NO (these are specific to the year of the assessment)
 Compare to other scores of the same type in a different subject or grade: NO (these are specific to the standards)
Some assessment scores can be used to compare the performance of different
demographic or program groups. All CoAlt scores in science (grades 5 and 8) and in
social studies (grades 4 and 7) can be analyzed within the same grade and subject area
for any single administration to determine which demographic or program group had the
highest average scale score, the lowest percentage achieving Novice, the highest
percentage achieving Developing, etc.
Other scores can be used to help evaluate the academic performance of demographic or
program groups. For example, aggregations of reporting-category data can help districts
and schools identify areas of potential academic weakness for a group of students. This
same methodology can be applied to an entire school or district.
In addition, all assessment scores can be compared to district and statewide performance
within the same subject area for any administration.
CoAlt Performance Level Summary Report
The Performance Level Summary Report is available for each grade assessed in a
district. It contains aggregated performance level information across the district and state.
It also contains disaggregated performance level data by student demographic and
program categories and subgroups for the district. A sample Performance Level Summary
Report is included on page E-7.
General Information
A. Identification Information
The report identifies the school and district name and code.
B. Test Date
The administration season and year are indicated.
C. Subject Area
The subject area of the report is identified (either science or social studies).
D. Grade Level
The grade level of the assessment is indicated.
Performance Level Distribution Data
E. Demographic and Program Categories and Subgroups
Demographic and program categories with subgroups are listed on the left side of
the table. Results for students for whom no demographic or program information
was coded are included in the “not indicated” subgroups.
F. Total Number Tested
The number of students who should have been assessed is provided.
G. Average Scale Score
The average scale score is displayed for state and district and for each
demographic or program subgroup. The average does not include students with no
scores.
H. Performance Level Results
The number and percentage of students who achieved Exploring, Emerging,
Developing, and Novice, as well as the aggregated Developing and Novice
category, are displayed for each demographic or program subgroup.
I. No Scores Reported
The number of students registered to take CoAlt who did not receive scores or
meet the Inconclusive criterion is included. These students are not included in the
denominator for the performance level percentages.
J. Process number
The process number found in the bottom-right corner of the report is a unique
number, per administration, that is assigned to the report by the testing contractor.
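A minimal sketch of how the percentages on this report exclude students in the No Scores Reported category from the denominator (all counts below are hypothetical):

```python
# Sketch: performance-level percentages on the Performance Level Summary
# Report. Students with no scores are excluded from the denominator.
# All counts are hypothetical.

counts = {"Novice": 3, "Developing": 5, "Emerging": 2, "Exploring": 2}
no_scores_reported = 2   # reported separately, excluded from denominators

denominator = sum(counts.values())   # 12 students, not 14
percentages = {level: 100 * n / denominator for level, n in counts.items()}
print(percentages["Novice"])         # 25.0
```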
Sample CoAlt Performance Level Summary Report
CoAlt Content Standards Roster
The Content Standards Roster is available for each grade assessed at each school. It lists
every student for whom an online record was submitted. This report provides the overall
and standards-level data for each student. It also provides the same information
aggregated at the state level. A sample Content Standards Roster is included on page E-10.
General Information (Refer to Page 1 of the Content Standards Roster)
A. Identification Information
The report identifies the school and district name and code.
B. Test Date
The administration season and year are indicated.
C. Subject Area
The subject area of the report is identified (either Science or Social Studies).
D. Grade Level
The grade level of the assessment is indicated.
Performance Level and Content Standards Information (Refer to Page 1
of the Content Standards Roster)
E. Key
The key indicates the ranges of scale scores for each performance level for the
overall test.
F. Student Information
Students are identified by first, last, and middle initial. Expelled students are
included only in the district version of the roster and their status is indicated in this
section.
G. Overall Performance Level
The overall performance level is indicated for each student on the roster.
H. State, District, and School Average Scale Score
The average scale score is shown for the state, district, and school. Below, the
scale score for each student is shown. Students with an Inconclusive designation
do not have a scale score.
I. Points Possible
The number of points possible for each content standard is shown.
J. Percentage of Points Earned
This section of the report describes performance with percent of points earned by
content standard. The average percentage of points earned for the state, district,
and school are shown. Below, the percent points earned for each student is
shown. These fields are blank for students with an Inconclusive designation.
K. Process number
The process number found in the bottom-right corner of the report is a unique
number, per administration, that is assigned to the report by the testing contractor.
Sample CoAlt Content Standards Roster
Section F
Appendices
Appendix A
CMAS Performance Level Descriptors
Colorado Measures of Academic Success: Grade 4 Social Studies
Performance Level Descriptors (PLDs)
Students demonstrate mastery of social studies concepts and 21st century skills aligned to the Colorado
Academic Standards at various performance levels. The performance level descriptors are organized in a
manner that assumes students demonstrating higher levels of command have mastered the concepts and
skills within the lower levels. For example, a student at moderate command also masters the concepts and
skills of limited command.
At Distinguished Command, a student typically can
 analyze primary source documents and connect the various eras and events in Colorado history to events in
U.S. and World History;
 use geographic tools to investigate and analyze settlement patterns, how people adapt to and modify the
physical environment, and how places in Colorado have changed over time;
 analyze opportunity costs and ways to reduce financial risk to make financial decisions; and
 analyze multiple perspectives on an issue and provide solutions.
At Strong Command, a student typically can
 explain cause-and-effect relationships present in Colorado history using historical tools such as organizing
and sequencing events and reading primary sources;
 create and investigate questions about Colorado in relation to other places and examine the connections
between the physical environment and human activities such as migration;
 explain how the natural, human, and capital resources of Colorado have influenced the types of goods and
services provided;
 analyze opportunity costs and risk to make financial decisions;
 compare arguments for both sides of a public policy debate; and
 explain the origins, structure, and functions of the Colorado government and its relationship with local and
federal governments.
At Moderate Command, a student typically can
 describe how the people and cultures who have lived in Colorado have interacted with each other and have
affected the development of Colorado;
 describe how Colorado’s political structure developed, including the Colorado Constitution and the
relationship between state and national government;
 compare the physical geography of Colorado with that of neighboring states and describe how places in
Colorado are connected by technology and the movement of goods and services;
 identify and define types of economic incentives, choices, opportunity costs, and risks that individuals face;
 connect goods and services produced throughout Colorado’s history to economic incentives; and
 provide examples of civic and political issues faced by the state.
At Limited Command, a student typically can
 recognize that major political and cultural groups have affected the development of Colorado;
 use maps, grids, and other geographic tools to answer questions about Colorado;
 describe various technological developments, including those that affect Colorado industries;
 identify goods and services produced in Colorado; and
 identify the structure and functions of the Colorado government and the services it provides.
Colorado Measures of Academic Success: Grade 7 Social Studies
Performance Level Descriptors (PLDs)
Students demonstrate mastery of social studies concepts and 21st century skills aligned to the
Colorado Academic Standards at various performance levels. The performance level descriptors
are organized in a manner that assumes students demonstrating higher levels of command have
mastered the concepts and skills within the lower levels. For example, a student at moderate
command also masters the concepts and skills of limited command.
At Distinguished Command, a student typically can
 analyze historical sources while formulating historical questions and defending a thesis;
 use geographic tools to investigate and analyze data to make inferences and predictions
regarding regional issues and perspectives in the Eastern Hemisphere;
 demonstrate how supply and demand influence changes in equilibrium price and quantity;
 evaluate how various governments interact and investigate examples of global collaboration; and
 apply various definitions of good government to evaluate the actions of different governments.
At Strong Command, a student typically can
 explain the historical time periods, individuals, groups, ideas, perspectives, themes, and how
people are interconnected within regions of the Eastern Hemisphere;
 summarize the development of early civilizations, including Greece, Rome, China, Africa, and the
medieval world;
 describe how the physical environment influences economy, culture, and trade patterns;
 explain how resources, production, choices, supply, demand, price, profit, and taxes are related;
 analyze how national and international government policies influence the global community; and
 compare the rights, roles, and responsibilities of citizens in various governments.
At Moderate Command, a student typically can
 describe the contributions of various peoples and cultures in the Eastern Hemisphere;
 compare different physical systems and cultural patterns to describe how different regions and
places are interconnected;
 examine multiple points of view and issues in various regions in the Eastern Hemisphere;
 recognize how supply and demand influence price, profit, and production in a market economy;
 compare how taxes affect individual income and spending;
 compare different forms of government in the world and their sources of authority; and
 explain the rights and roles of citizens in various governments.
At Limited Command, a student typically can
 recognize the contributions of various peoples and cultures to the Eastern Hemisphere;
 use geographic tools to answer questions and identify patterns in the Eastern Hemisphere;
 identify factors that cause changes in supply, demand, and price;
 define resources and identify trade patterns based on the distribution of resources; and
 list the responsibilities and roles of citizens in various governments.
Colorado Measures of Academic Success: Grade 5 Science
Performance Level Descriptors (PLDs)
Students demonstrate mastery of science concepts and 21st century skills aligned to the Colorado
Academic Standards at various performance levels. The performance level descriptors are
organized in a manner that assumes students demonstrating higher levels of command have
mastered the concepts and skills within the lower levels. For example, a student at moderate
command also masters the concepts and skills of limited command.
At Distinguished Command, a student typically can
 evaluate and provide feedback on scientific evidence and reasoning about the separation of
mixtures and how separation affects the total weight/mass;
 develop hypotheses about why similarities and differences exist between the body systems and
parts of humans, plants, and animals;
 evaluate scientific claims about natural resources, in terms of reasonability and validity; and
 assess and provide feedback, through reasoning based on evidence, on scientific explanations
about weather and factors that change Earth’s surface.
At Strong Command, a student typically can
 explain why certain procedures that are used to separate simple mixtures work and discuss any
unexpected results;
 evaluate evidence and models of the structure and functions of human, plant, and animal organs
and organ systems;
 investigate and generate evidence that human systems are interdependent;
 analyze and interpret data to explore concerns associated with natural resources; and
 formulate testable questions and scientific explanations around weather and factors that change
Earth’s surface.
At Moderate Command, a student typically can
 discuss how the mass/weight of a mixture is a sum of its parts and design a procedure to
separate simple mixtures based on physical properties;
 create models of human, plant, and animal organ systems, and compare and contrast similarities
and differences between the organisms;
 explore and describe the origins and usage of natural resources in Colorado; and
 interpret data about Earth, including weather and changes to Earth’s surface.
At Limited Command, a student typically can
 select appropriate tools and follow procedures to separate simple mixtures;
 identify how humans, plants, and animals address basic survival needs;
 identify the functions of human body systems;
 distinguish between renewable and nonrenewable resources; and
 use appropriate tools and resources to gather data regarding weather conditions and Earth processes.
Colorado Measures of Academic Success: Grade 8 Science
Performance Level Descriptors (PLDs)
Students demonstrate mastery of science concepts and 21st century skills aligned to the Colorado
Academic Standards at various performance levels. The performance level descriptors are
organized in a manner that assumes students demonstrating higher levels of command have
mastered the concepts and skills within the lower levels. For example, a student at moderate
command also masters the concepts and skills of limited command.
At Distinguished Command, a student typically can
 design an investigation to predict the movement of an object by examining the forces applied to it;
 use models to predict amounts of energy transferred;
 analyze data and models to support claims about genetic reproduction and traits of individuals;
 use observations and models to develop and communicate a weather prediction; and
 evaluate scientific theories and investigations that explain how the solar system was formed.
At Strong Command, a student typically can
 use mathematical expressions and appropriate information from sources to describe the movement
of an object;
 analyze different forms of energy and energy transfer using tools;
 construct an experiment to show mass is conserved;
 investigate the characteristics and behaviors of waves using models, technology, and basic rules of
waves;
 analyze human impact on local ecosystems;
 use mathematics to predict the physical traits and genetic makeup of offspring; and
 relate tides, eclipses, lunar phases, and seasons to the motion and positions of the Sun, Earth, and
the Moon, using the basic rules of the solar system.
At Moderate Command, a student typically can
 analyze speed and acceleration of moving objects;
 describe different forms of energy and energy transfer;
 use a variety of sources, including popular media and peer-generated explanations, to investigate
and describe an environmental issue;
 analyze data and historical research for various weather conditions and compare to historical data
for that date and location; and
 investigate and ask testable questions about Earth’s different climates using various techniques.
At Limited Command, a student typically can
 distinguish between physical and chemical changes;
 recognize the relationship between pitch and frequency in sound;
 identify human activities that alter the ecosystem;
 recognize that genetic information is passed from one generation to the next;
 compare basic and severe weather conditions and develop an action plan for safety; and
 use tools and simulations to explore the solar system.
Appendix B
CoAlt Performance Level Descriptors
Colorado Alternate Grade 4 Social Studies Performance Level Descriptors (PLDs)
Students demonstrate social studies concepts and skills aligned to the Grade
Level Expectations and Extended Evidence Outcomes contained in the Colorado
Academic Standards.
At Novice Level, with appropriate support, a student can typically:
 Identify historical eras, groups (e.g., miners, settlers and farmers), ideas, and themes in Colorado
history
 Identify the cause and effect of growth in Colorado during various key events in U.S. history
 Integrate historical knowledge with geographical skills
 Recognize that particular dwellings, tools, and modes of transportation are specific to certain
geographic areas and cultures in Colorado’s history
 Identify regions and activities of Colorado based on specific physical features and label a map
 Identify choice and opportunity cost and compare the difference between the two
 Identify a specific perspective on an issue
 Identify the origins and structures of government
At Developing Level, with appropriate support, a student can typically:
 Sequence Colorado historical events
 Identify the locations of specific activities or events in Colorado’s history
 Identify specific factors that affected the growth of Colorado
 Match tools, modes of transportation, and products to natural resources or locations in Colorado
 Label a map using given map symbols
 Identify ways in which Colorado communities and markets were (and are) connected
 Identify the approximate value of goods
 Identify the functions of different levels of government
 Identify how people respond to positive and negative consequences
At Emerging Level, with appropriate support, a student can typically:
 Match historical Colorado cultures with related artifacts, modes of transportation, and resources
 Match physical, natural, and geographic features on a map to their appropriate symbols
 Identify types of goods, services and resources native to Colorado
 Recognize that items vary in their value
 Recognize that there are different levels of governance
At Exploring Level, with explicit modeling, a student can typically:
 Identify artifacts (e.g., tools, housing, modes of transportation and clothing) related to Colorado
history
 Identify features on a map of Colorado
 Recognize that items have value
 Recognize emergency situations and appropriate responses that affect members of the Colorado
community
 Recognize that there are laws and rules
An Inconclusive designation is given to students who did not respond to any items on the
assessment.
Colorado Alternate Grade 7 Social Studies Performance Level Descriptors (PLDs)
Students demonstrate social studies concepts and skills aligned to the Grade Level Expectations
and Extended Evidence Outcomes contained in the Colorado Academic Standards.
At Novice Level, with appropriate support, a student can typically:
 Determine appropriate questions to ask in order to learn about specific historical events
 Compare information from multiple sources related to a significant historical event
 Identify the best source of information regarding a historical event and use a historical event to
match a source with a particular perspective
 Match natural resources with ancient communities and their dwellings
 Use a map to determine where to go for a specific purpose and to determine the direction in which
to travel from one point to another
 Estimate the total purchase price of an item with sales tax included
 Recognize how supply and demand can affect price
 Recognize rights and responsibilities of citizens
At Developing Level, with appropriate support, a student can typically:
 Match artifacts with their ancient culture or location within the Eastern Hemisphere
 Select the appropriate source of information to answer questions surrounding historical events
 Recognize that sources have different purposes
 Use map symbols and directionality words to locate places on a map
 Recognize that communities were built near natural resources
 Identify the environmental resources that influenced settlement in the Eastern Hemisphere
 Recognize that the total purchase price of an item will increase because of sales tax
 Identify community needs or services that are paid for by taxes
 Differentiate between laws and rules
 Identify the positive and negative consequences of obeying laws and rules
At Emerging Level, with appropriate support, a student can typically:
 Recognize significant artifacts related to ancient civilizations of the Eastern Hemisphere
 Select the appropriate source of information to answer social studies questions
 Identify the appropriate questions to ask in order to learn more about an event or era
 Use symbols to identify a location on a map
 Identify reasons goods and services might go on sale
 Identify ways in which countries and nations resolve differences
 Recognize local laws, state laws, and federal laws and identify examples of following these
laws/rules
At Exploring Level, with explicit modeling, a student can typically:
 Recognize artifacts
 Identify part(s) of a map (e.g., title, key, compass rose, scale)
 Recognize there are different types of informational resources
 Recognize that areas have different natural resources
 Recognize that many items have a sales tax
 Recognize that all countries have laws
An Inconclusive designation is given to students who did not respond to any items on the
assessment.
Appendices
F-9
Colorado Alternate Grade 5 Science Performance Level Descriptors (PLDs)
Students demonstrate science concepts and skills aligned to the Grade Level Expectations and
Extended Evidence Outcomes contained in the Colorado Academic Standards.
At Novice Level, with appropriate support, a student can typically:
 Demonstrate that the weight of a mixture is the same before and after separation
 Distinguish between healthy choices and unhealthy choices for the human body
 Compare and contrast characteristics between groups of plants and groups of animals
 Sort animals by observable characteristics
 Identify ways to conserve resources
 Identify landforms that are created by Earth’s forces
 Identify forms of precipitation by physical characteristics
At Developing Level, with appropriate support, a student can typically:
 Determine the weight of an individual component of a mixture after separation
 Identify the function of the internal organs of the human body
 Recognize a relationship between healthy choices and a healthy body
 Understand how plants and animals get the food they need to survive
 Compare the physical characteristics of plants to plants and animals to animals
 Distinguish between renewable and nonrenewable resources
 Identify forces that create common landforms
 Use weather condition symbols to recognize different types of weather based on observable
characteristics
At Emerging Level, with appropriate support, a student can typically:
 Identify physical properties of matter
 Select appropriate tools to separate simple mixtures based on physical properties
 Separate simple mixtures based on physical properties
 Identify the functions of the sensory organs, stomach, lungs and heart
 List ways to maintain a healthy body
 List observable characteristics of animals
 Match animals to animals and plants to plants based on similar physical characteristics
 List basic survival needs for plants and animals
 List Earth’s resources
 Identify a source of energy as renewable or nonrenewable
 Label basic landforms of Earth
 Compare forms of precipitation
At Exploring Level, with explicit modeling, a student can typically:
 Recognize physical properties of matter
 Identify observable parts of the human body
 Recognize basic survival needs for plants and animals
 Identify basic Earth resources
 Recognize basic landforms of Earth
 Identify common forms of precipitation (e.g., rain and snow)
 Recognize sources of daily/weekly weather information
An Inconclusive designation is given to students who did not respond to any items on the
assessment.
Colorado Alternate Grade 8 Science Performance Level Descriptors (PLDs)
Students demonstrate science concepts and skills aligned to the Grade Level Expectations and
Extended Evidence Outcomes contained in the Colorado Academic Standards.
At Novice Level, with appropriate support, a student can typically:
 Match an object to itself before and after a physical or chemical change
 Compare and contrast different water or sound waves using wave characteristics
 Determine if different materials can absorb, reflect, or refract light
 Predict the effect of a human activity on a local ecosystem
 Identify why the appearances of the Sun and the moon change in the sky, including phases of the
moon and eclipses
At Developing Level, with appropriate support, a student can typically:
 Determine an object’s directionality and compare the speeds of moving objects
 Determine sources for light and heat
 Determine if an object has undergone a physical or chemical change
 Identify sources of waves
 Identify human activities that have an effect on local ecosystems
 Identify traits that are passed down from parent to child
 Compare safe and unsafe practices during severe weather conditions
 Use models and simulations to explore the motions of Earth, the moon, and the Sun
At Emerging Level, with appropriate support, a student can typically:
 Recognize that the speed and direction of a force can change moving objects
 Compare different forms of energy
 Label chemical and physical changes
 Label different types of waves
 Recognize the effect of human activity on the local ecosystem
 Identify similarities and differences in parents and children
 Identify severe weather conditions and follow a simple action plan for severe weather
 Recognize facts and fiction in regard to space exploration
At Exploring Level, with explicit modeling, a student can typically:
 Identify objects changing speed while moving
 Recognize that heat, light, and electricity are forms of energy
 Identify different types of waves
 Recognize stages of human aging
 Recognize different weather conditions
 Identify different climates
 Identify scientific tools related to weather and space exploration
 Acknowledge that celestial objects have patterns of movement
An Inconclusive designation is given to students who did not respond to any items on the
assessment.
Appendix C
Prepared Graduate Competency and Grade Level
Expectation Ordering
Prepared Graduate Competency and Grade Level Expectation Ordering
CMAS Social Studies
Grade 4
1 History
PGC 1: Develop an understanding of how people view, construct, and interpret history
GLE 1: Organize and sequence events to understand the concepts of chronology and cause and effect in the history of Colorado
PGC 2: Analyze key historical periods and patterns of change over time within and across nations and cultures
GLE 2: The historical eras, individuals, groups, ideas and themes in Colorado history and their relationships to key events in the United States
2 Geography
PGC 1: Develop spatial understanding, perspectives, and personal connections to the world
GLE 1: Use several types of geographic tools to answer questions about the geography of Colorado
PGC 2: Examine places and regions and the connections among them
GLE 2: Connections within and across human and physical systems are developed
3 Economics (PFL)
PGC 1: Understand the allocation of scarce resources in societies through analysis of individual choice, market interaction, and public policy
GLE 1: People respond to positive and negative incentives
PGC 2: Acquire the knowledge and economic reasoning skills to make sound financial decisions (PFL)
GLE 2: The relationship between choice and opportunity cost (PFL)
4 Civics
PGC 1: Analyze and practice rights, roles, and responsibilities of citizens
GLE 1: Analyze and debate multiple perspectives on an issue
PGC 2: Analyze the origins, structure, and functions of governments and their impacts on societies and citizens
GLE 2: The origins, structure, and functions of the Colorado government
Prepared Graduate Competency and Grade Level Expectation Ordering
CMAS Social Studies
Grade 7
1 History
PGC 1: Develop an understanding of how people view, construct, and interpret history
GLE 1: Seek and evaluate multiple historical sources with different points of view to investigate a historical question and to formulate and defend a thesis with evidence
PGC 2: Analyze key historical periods and patterns of change over time within and across nations and cultures
GLE 2: The historical eras, individuals, groups, ideas and themes within regions of the Eastern Hemisphere and their relationships with one another
2 Geography
PGC 1: Develop spatial understanding, perspectives, and personal connections to the world
GLE 1: Use geographic tools to gather data and make geographic inferences and predictions
PGC 2: Examine places and regions and connections among them
GLE 2: Regions have different issues and perspectives
3 Economics (PFL)
PGC 1: Understand the allocation of scarce resources in societies through analysis of individual choice, market interaction, and public policy
GLE 1: Supply and demand influence price and profit in a market economy
PGC 2: Acquire the knowledge and economic reasoning skills to make sound financial decisions (PFL)
GLE 2: The distribution of resources influences economic production and individual choices (PFL)
4 Civics
PGC 1: Analyze and practice rights, roles, and responsibilities of citizens
GLE 1: Compare how various nations define the rights, responsibilities, and roles of citizens
PGC 2: Analyze the origins, structure, and functions of governments and their impacts on society and citizens
GLE 2: Different forms of government and international organizations and their influence in the world community
Prepared Graduate Competency and Grade Level Expectation Ordering
CMAS Science
Grade 5
1 Physical Science
PGC 1: Apply an understanding of atomic and molecular structure to explain the properties of matter, and predict outcomes of chemical and nuclear reactions
GLE 1: Mixtures of matter can be separated regardless of how they were created; all weight and mass of the mixture are the same as the sum of weight and mass of its parts
2 Life Science
PGC 1: Analyze how various organisms grow, develop and differentiate during their lifetimes based on an interplay between genetics and their environment
GLE 1: All organisms have structures and systems with separate functions
PGC 2: Analyze the relationship between structure and function in living systems at a variety of organizational levels, and recognize living systems' dependence on natural selection
GLE 2: Human body systems have basic structures, functions, and needs
3 Earth Systems Science
PGC 1: Describe how humans are dependent on the diversity of resources provided by Earth and Sun
GLE 1: Earth and sun provide a diversity of renewable and nonrenewable resources
PGC 2: Evaluate evidence that Earth's geosphere, atmosphere, hydrosphere, and biosphere interact as a complex system
GLE 2: Earth's surface changes constantly through a variety of processes and forces
GLE 3: Weather conditions change because of the uneven heating of Earth's surface by the Sun's energy. Weather changes are measured by differences in temperature, air pressure, wind, and water in the atmosphere and type of precipitation
Prepared Graduate Competency and Grade Level Expectation Ordering
CMAS Science
Grade 8
1 Physical Science
PGC 1: Observe, explain, and predict natural phenomena governed by Newton's laws of motion, acknowledging the limitations of their application to very small or very fast objects
GLE 1: Identify and calculate the direction and magnitude of forces that act on an object, and explain the results in the object's change of motion
PGC 2: Apply an understanding that energy exists in various forms, and its transformation and conservation occur in processes that are predictable and measurable
GLE 2: There are different forms of energy, and those forms of energy can be changed from one form to another – but total energy is conserved
GLE 4: Recognize that waves such as electromagnetic, sound, seismic, and water have common characteristics and unique properties
PGC 3: Apply an understanding of atomic and molecular structure to explain the properties of matter, and predict outcomes of chemical and nuclear reactions
GLE 3: Distinguish between physical and chemical changes, noting that mass is conserved during any change
2 Life Science
PGC 1: Explain and illustrate with examples how living systems interact with the biotic and abiotic environment
GLE 1: Human activities can deliberately or inadvertently alter ecosystems and their resiliency
PGC 2: Analyze how various organisms grow, develop, and differentiate during their lifetimes based on an interplay between genetics and their environment
GLE 2: Organisms reproduce and transmit genetic information (genes) to offspring, which influences individuals' traits in the next generation
3 Earth Systems Science
PGC 1: Evaluate evidence that Earth's geosphere, atmosphere, hydrosphere, and biosphere interact as a complex system
GLE 1: Weather is a result of complex interactions of Earth's atmosphere, land and water, that are driven by energy from the sun, and can be predicted and described through complex models
GLE 2: Earth has a variety of climates defined by average temperature, precipitation, humidity, air pressure, and wind that have changed over time in a particular location
PGC 2: Describe and interpret how Earth's geologic history and place in space are relevant to our understanding of the processes that have shaped our planet
GLE 3: The solar system is comprised of various objects that orbit the Sun and are classified based on their characteristics
GLE 4: The relative positions and motions of Earth, Moon, and Sun can be used to explain observable effects such as seasons, eclipses, and Moon phases
Structure of Evaluation
Teacher Evaluations
50% Student Academic Growth (Quality Standard VI: Responsibility for student academic growth)
Evaluated using: (1) a measure of individually-attributed growth; (2) a measure of collectively-attributed growth; (3) when available, statewide summative assessments; and (4) where applicable, Colorado Growth Model data. This year districts can have another year of practice for the growth measures (SB 165).
50% Professional Practice (Quality Standards I-V: I. Mastery of content; II. Establish learning environment; III. Facilitate learning; IV. Reflect on practice; V. Demonstrate leadership)
Evaluated using: (1) observations; and (2) at least one of the following: student perception measures, peer feedback, parent/guardian feedback, or review of lesson plans/student work samples. May include additional measures.
Principals have an evaluation system aligned and almost identical to the teacher system.
www.cde.state.co.us/accountability
Overview of HB 14-1182
As Colorado implements new standards and assessments, some adjustments will
need to be made to the state’s accountability system. The greatest impact will
come in the spring of 2015 with the transition from the reading, writing and
mathematics Transitional Colorado Assessment Program (TCAP) assessments to
the Colorado Measures of Academic Success (CMAS) assessments that include the
Partnership for Assessment of Readiness for College and Careers (PARCC)
assessments in English language arts and mathematics.
House Bill 14-1182 was passed in 2014 to address the impact of the 2015
assessment transition on school and district accountability.
New, rigorous learning expectations, the Colorado Academic Standards, are now
in place in all Colorado classrooms. The goal of the new standards is to prepare all
students for success in college and careers. New state assessments are needed to
measure student progress towards meeting the new expectations: new standards
require new assessments.
These new assessments provide feedback on student performance in relation to
the new expectations. The CMAS assessments include Colorado-developed
science and social studies assessments and PARCC-developed English language
arts and mathematics assessments. In addition, there will be new alternate
assessments for students with significant cognitive disabilities. These new
assessments introduce a higher baseline for student learning: new assessments
bring new student scores.
A large part of Colorado’s educational accountability system is based on the
results from state assessments; implementing new state assessments has an
impact on district and school accountability. To ensure the validity and fairness of
our accountability frameworks (the District and School Performance Frameworks
(DPF/SPF)), HB 14-1182 outlines adjustments to the education accountability
system during the assessment transition period.
Per the new legislation, 2015 school plan type assignments and district
accreditation ratings will be based on:
 2014 school plan type assignments and district accreditation ratings1
Accountability Timeline
2014-2016*
• August 2014: Preliminary 2014
SPF/DPF reports shared with
districts
• September - November 2014:
Districts submit requests to
reconsider for 2014 plan types
• December 2014: Final 2014
SPF/DPF reports public
• Spring 2015: New CMAS PARCC
assessments administered
• Summer/fall 2015: PARCC
standard setting process
• Fall 2015: Preliminary 2015
school and district plan types
released to districts (based on 2014
ratings, participation and
assurances)
• October 2015: PARCC
achievement results released to
districts and public
• October - December 2015:
Requests to reconsider for 2015
plan types
• February/March 2016: Final
2015 plan types and accreditation
ratings made public
• Spring 2016: Informational
SPF/DPF reports shared with
districts and schools, using PARCC
results and enhanced frameworks
• September 2016: Preliminary
2016 SPF/DPF 2.0 framework
reports released to districts
* Timeline and dates are based on
the best understanding of the
assessment timelines, to date.
Please note that they are subject to
change.
1. 2014 ratings will use elementary and middle level CMAS science and social studies results for participation only and not achievement.
July 2014
 2015 assessment participation rates
 Accreditation assurances (for districts)
 Optional: 2014-15 student performance data (aligned with the Colorado Academic Standards) or postsecondary workforce data that districts may submit through the request to reconsider process
The legislation also allows more flexibility for the State Board of Education to identify additional options for schools
entering Year 5 of the accountability clock during 2015-16.
Because of the state assessment transition, CDE will not produce official 2015 School and District Performance
Frameworks. Instead, preliminary school plan types and district accreditation ratings will be assigned in the fall of 2015
using the criteria listed above. After a more in-depth request to reconsider process during the fall and winter, school
plan types and district accreditation ratings will be finalized and publicized in the late winter of 2016.
Informational framework reports incorporating results from CMAS assessments (both PARCC-developed English
language arts and math and Colorado-developed science and social studies) will be provided for educator use in the
spring of 2016. This will help districts and schools better understand their performance on the new assessments.
Additionally, these informational reports will provide a preview of other adjustments the department is making to
improve the frameworks (SPF/DPF 2.0) based upon feedback from the field.
For 2015 ratings, the request to reconsider process will be an opportunity to share more recent and aligned
performance data with the state. As less state performance data will be available in August 2015 than in prior years, the
2014 ratings will serve as the basis for the 2015 ratings. However, districts and schools may have more recent local
student performance data to share with the state. This additional data will help the state to determine the most
appropriate plan types for schools and districts.
In submitting a request to reconsider for a different school plan type or district accreditation rating, districts will be able
to submit results on local assessments (aligned with the Colorado Academic Standards). Templates for submitting local
data and guidelines for performance expectations will be available for this process by the fall of 2015. Districts may also
submit more recent postsecondary and workforce readiness data.
The reconsideration process is expected to begin in October 2015 and end in January 2016. Districts may begin working
with CDE earlier in the school year to receive help in preparing data for a submission. The department will wait to make
decisions on requests and assignment of final district accreditation ratings and recommendations for school plan types
until the CMAS PARCC assessment results are available.
HB 14-1182 does not pause or stop the accountability clock for schools or districts during the assessment transition
period. The assessment transition will affect schools and districts identified for Priority Improvement and Turnaround in
different ways, based on the year they are entering on the clock.
 Schools and districts entering Year 5 of the state accountability clock on July 1, 2015 will be subject to action by
the State Board of Education on or before June 30, 2016. While this timeline will have been set prior to the 2015 ratings, the board will be able to consider the results of the 2015 transitional ratings prior to making a determination of recommended actions.
 Schools and districts entering Year 4 of the state accountability clock on July 1, 2015 may enter Year 5 based on the 2015 ratings. Thus, it is imperative that a careful review of 2015 student performance results be completed to determine if the 2014 rating is the most appropriate to use.
Colorado law requires that the State Board of Education recommends specific action for any school, institute or district
remaining on a Priority Improvement or Turnaround plan for five consecutive years. For the 2015-16 school year, and
for ratings given in the 2015-16 school year, HB 14-1182 allows the State Board of Education to recommend an action
NOT specified in statute but still having a “comparable significance and effect.”
Schools assigned and districts accredited with a Priority Improvement or Turnaround plan will continue to receive
differentiated support from the Colorado Department of Education (CDE) during the transition year and beyond.
To date, the current Unified Improvement Planning (UIP) timelines are scheduled to remain in place. Key deadlines
include:
 Jan. 15, 2016 for the CDE review of Priority Improvement and Turnaround plans.
 April 15, 2016 for public posting of school and district plans.
In keeping with the UIP as a continuous improvement process, plans should continue to be implemented, monitored
and adjusted based on the data that is available. During the transition year, it will be useful for schools and districts to
use local assessments that are aligned with the Colorado Academic Standards to update the data analysis and progress
monitoring components (e.g., interim measures) of the plan. Target setting will also need some modifications. More
guidance and trainings to support planning during the assessment transition will be available.
Additional legislation was passed during the 2014 legislative session addressing the impact of the assessment transition
on educator evaluations. This legislation, Senate Bill 14-165, provides flexibility for districts/BOCES regarding the 50
percent measures of student learning/outcomes portion of the evaluation for the 2014-15 school year only. Teachers,
principals and specialized service professionals will receive a rating/score for each standard, including the measures of
student learning/outcomes standard. Districts have flexibility for the 2014-15 school year when determining how much
weight the measures of student learning/outcomes standard counts in the educator’s final evaluation rating. Districts
can decide to weight the measures of student learning rating anywhere from 0-50 percent of the final rating. For more
information, reference the Senate Bill 14-165 fact sheet listed below.
 Senate Bill 14-165: www.cde.state.co.us/educatoreffectiveness/sb14165factsheet
 Accountability website: www.cde.state.co.us/accountability
 Priority Improvement and Turnaround Support: www.cde.state.co.us/accountability/performanceturnaround
 Unified Improvement Planning website: www.cde.state.co.us/uip
 To view all CDE fact sheets, visit: www.cde.state.co.us/Communications/factsheetsandfaqs
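The SB 14-165 weighting flexibility described above can be sketched as a simple weighted-average calculation. This is an illustrative sketch only: the function and variable names are invented for this example and are not part of any CDE system or rating scale.

```python
def final_rating(professional_practice: float,
                 student_learning: float,
                 learning_weight: float) -> float:
    """Combine the two standard scores into one final evaluation rating.

    learning_weight is the share (0.0-0.5) given to the measures of
    student learning/outcomes standard; the remainder goes to
    professional practice (Quality Standards I-V). Per SB 14-165,
    districts may set this weight anywhere from 0 to 50 percent
    for the 2014-15 school year.
    """
    if not 0.0 <= learning_weight <= 0.5:
        raise ValueError("SB 14-165 allows a learning weight of 0-50% only")
    return (learning_weight * student_learning
            + (1.0 - learning_weight) * professional_practice)

# Under the default framework the split is 50/50:
print(final_rating(3.0, 2.0, 0.5))   # 2.5
# A district exercising full 2014-15 flexibility could weight it at 0%:
print(final_rating(3.0, 2.0, 0.0))   # 3.0
```

The example simply shows why the flexibility matters: moving the weight from 50% to 0% lets the professional practice rating alone determine the final result for the transition year.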
Rules for incorporating Student Growth for Teachers and Principals Evaluation
Excerpt from 1 CCR 301-87
Rules for Student Growth in Teacher Evaluations
5.01 (E) (7)
Method for Evaluating Teacher Performance Related to Student Academic Growth. No later than
July 2013, a description of the method for evaluating Teachers’ performance related to Student Academic Growth.
School Districts and BOCES shall categorize Teachers into appropriate categories based on the availability and technical
quality of student assessments available for the courses and subjects taught by those Teachers. School Districts and
BOCES shall then choose or develop appropriate Measures of Student Academic Growth to be used in the evaluation of
each personnel category. The Department will develop technical guidance, based on research and best practices that
emerge from the pilot of the State Model System and the implementation of other local systems during the Pilot Period,
which School Districts and BOCES may choose to use in developing their own Measures of Student Academic
Growth. This technical guidance shall address methods for ensuring that such Measures of Student Academic Growth
meet minimum standards of credibility, validity, and reliability.
Measures of Student Academic Growth shall be generated from an approach or model that makes design choices explicit
and transparent (e.g., in a value-added model, transparency about student- or school-level factors which are statistically controlled for) and has technical documentation sufficient for an outside observer to judge the technical quality of the
approach (i.e., a value-added system must provide adequate information about the model). Measures of Student
Academic Growth shall be generated from an approach or model that presents results in a manner that can be
understood and used by Educators to improve student performance.
Student Academic Growth shall be measured using multiple measures. When compiling these measures to evaluate
performance against Teacher Quality Standard VI, School Districts and BOCES shall consider the relative technical
quality and rigor of the various measures.
Measures of Student Academic Growth shall include the following:
5.01 (E) (7) (a) A measure of individually-attributed Student Academic Growth, meaning that outcomes on that measure
are attributed to an individual licensed person;
5.01 (E) (7) (b) A measure of collectively-attributed Student Academic Growth, whether on a school-wide basis or across
grades or subjects, meaning that outcomes on that measure are attributed to at least two licensed personnel (e.g.,
measures included in the school performance framework, required pursuant to section 22-11-204, C.R.S.);
5.01 (E) (7) (c) When available, Statewide Summative Assessment results; and
5.01 (E) (7) (d) For subjects with annual Statewide Summative Assessment results available in two consecutive grades,
results from the Colorado Growth Model.
Rules for Student Growth in Principal Evaluations
5.01 (E) (3)
Method for Evaluating Principal Performance Related to Student Academic Growth. No later than
July 2013, a description of the method for evaluating Principals’ performance related to Student Academic Growth. The
Measures of Student Academic Growth used for evaluating Principals’ performance against Quality Standard VII must
meet the following criteria:
5.01 (E) (3) (a) School Districts and BOCES shall ensure that data included in the school performance framework,
required pursuant to section 22-11-204, C.R.S., is used to evaluate Principal performance. School Districts and BOCES
may choose to weight specific components of the school performance framework differently than they are weighted in the
school performance framework, depending on the Principal’s responsibilities and the performance needs of the school, so
long as student longitudinal growth carries the greatest weight.
5.01 (E) (3) (b) School Districts and BOCES shall incorporate at least one other Measure of Student Academic Growth
and must ensure that the Measures of Student Academic Growth selected for Principal evaluations are consistent with the
Measures of Student Academic Growth used for the evaluation of Teachers in each Principal’s school, as described in
section 5.01 (E) (7) of these rules.
5.01 (E) (3) (c) School Districts and BOCES are strongly encouraged to involve principals in a discussion of which of the
available Measures of Student Academic Growth are appropriate to the Principals’ schools and school improvement
efforts.
5.01 (E) (3) (d) Measures of Student Academic Growth shall reflect the growth of students in all subject areas and
grades, not only those in subjects and grades that are tested using Statewide Summative Assessments, and shall reflect
the broader responsibility a Principal has for ensuring the overall outcomes of students in the building.
5.01 (E) (3) (e) School Districts and BOCES shall seek to ensure that Measures of Student Academic Growth
correspond to implementation benchmarks and targets included in the Unified Improvement Plan for the school at which a
Principal is employed.
5.01 (E) (3) (f) School Districts and BOCES shall seek to ensure that Measures of Student Academic Growth are valid,
meaning that they measure growth towards attainment of the academic standards adopted by the local school board
pursuant to § 22-7-1013, C.R.S. and that analysis and inferences from the measures can be supported by evidence and
logic.
5.01 (E) (3) (g) School Districts and BOCES shall seek to ensure that Measures of Student Academic Growth are
reliable, meaning that the measures should be reasonably stable over time and in substance and that data from the
measures will be sufficient to warrant reasonably consistent inferences.
5.01 (E) (3) (h) Early Childhood - Grade 3. For the evaluations of Principals responsible for students in early childhood
education through grade 3, measures shall be consistent with outcomes used as the basis for evaluations for Teachers
teaching these grade levels, which may include, but are not limited to, assessments of early literacy and/or mathematics
shared among members of the school community that may be used to measure student longitudinal growth.
5.01 (E) (3) (i) Grades 4 - 8. For the evaluation of Principals responsible for students in grades 4-8, a portion of the
Principal’s evaluation for Quality Standard VII shall be based on the results of the Colorado longitudinal growth model,
calculated pursuant to section 22-11-203, C.R.S., for subjects tested by Statewide Summative Assessments. The weight
of this measure may be increased to reflect the increased proportion of subjects covered by Statewide Summative
Assessments over time. A portion of the Principal’s evaluation for Quality Standard VII also shall be based on other
appropriate Measures of Student Academic Growth for students in grades 4-8, which may include, but are not limited to,
Measures of Student Academic Growth shared among the evaluated personnel in the school.
5.01 (E) (3) (j) Grades 9 - 12. For the evaluation of Principals responsible for students in grades 9-12, a portion of the
Principal’s evaluation for Quality Standard VII shall be based on the results of the Colorado longitudinal growth model,
calculated pursuant to section 22-11-203, C.R.S., for subjects tested by state summative assessments. To account for
the portion of Teachers without direct or indirect results from the Colorado longitudinal growth model, a portion of a
Principal’s growth determination may be based upon appropriate Measures of Student Academic Growth for personnel
teaching in subjects and grades not tested by Statewide Summative Assessments, which may include, but are not limited
to, Measures of Student Academic Growth shared among evaluated personnel in the school.
5.01 (E) (3) (k) For the evaluation of Principals responsible for students in multiple grade spans, School Districts and
BOCES shall select a combination of Measures of Student Academic Growth reflecting the grade levels of all students in
the school.
5.01 (E) (3) (l) When compiling Measures of Student Academic Growth to evaluate performance against Principal
Quality Standard VII, School Districts and BOCES shall give the most weight to those measures that demonstrate the
highest technical quality and rigor.
Preliminary Report of Themes from the First Administration of Science and
Social Studies Assessments and PARCC Field Testing
Spring 2014
Introduction
The Colorado Measures of Academic Success (CMAS) assessment program includes the
Colorado-developed science and social studies assessments and the Partnership for Assessment
of Readiness for College and Careers (PARCC)-developed English language arts and mathematics
assessments. PARCC is a multi-state consortium developing computer-based English language
arts and mathematics assessments. Current PARCC states include Arkansas, Colorado, Illinois,
Louisiana, Maryland, Massachusetts, Mississippi, New Jersey, New Mexico, New York, Ohio,
Rhode Island and the District of Columbia.
CMAS assessments were implemented across the state in the spring of 2014. In all elementary
and middle schools, assessments were given to students in social studies in grades 4 and 7 and
in science in grades 5 and 8. Additionally, approximately 524 schools in 96 Colorado school
districts participated in the PARCC English language arts and mathematics field test. English
language arts field tests were administered in grades 3-11 and mathematics assessments were
administered in grades 3-8 with three high school assessments (available for traditional or
integrated/international pathways).
This document summarizes themes from the first administration of the science and social
studies assessments and field testing of the English language arts and math assessments. CDE
is continuing to receive and review data from the first administration of these assessments and
will provide additional information at the August 2014 meeting of the State Board of Education.
CMAS: Science and Social Studies Spring Administration
Approximately 251,000 elementary and middle school students took CMAS Science and Social
Studies assessments this spring. The computer-based assessments consisted of selected response,
constructed response, and technology-enhanced item types. Science assessments
included simulation activities that involved a computer-enabled animation, experiment, or
other activity that allowed students to change variables, observe processes, and interact with
environments. Social studies assessments included performance event activities based on
multiple sources. Simulations and performance events required students to answer a series of
constructed response, selected response, and technology-enhanced questions.
Prepared by CDE Staff, June 2014
Site Readiness for the First Operational Assessment Administration
A variety of trainings and resources were provided to the field prior to the spring science and
social studies administration. Site readiness trainings that walked district technology
coordinators through the preparations for standing up the technology for their assessment
environments were provided both live and via webinar. Additionally, a technical site readiness
support website was created to provide a checklist and links to supporting documentation.
For continued support beyond training, 20 regional district technology coordinator meetings
were held in 10 regions across the state. These meetings provided an opportunity for local
district technology coordinators to meet, exchange expertise and ideas, and receive specific
support for the unique challenges faced by districts in preparing their assessment
environments. A Google group was used to create a community of practice for district
technology coordinators. The community of practice was designed to provide a place for
communication and collaboration about site readiness for online assessments.
CDE and Pearson conducted 87 district site technical visits between the spring of 2013 and the
spring of 2014. The visits were designed to help district technology coordinators review
infrastructure, devices, and networks within their district to confirm readiness for the
computer-based assessments. Experienced technology representatives from CDE and Pearson
answered site readiness questions regarding network and local technology configurations and
identified any potential technology risks. These visits lasted two to three hours, depending on
questions, needs, and level of readiness.
Outside of these supports, districts with concerns related to low bandwidth, thin clients, Linux
or other issues impacting online testing were instructed to contact the CDE Assessment Unit
technology specialist for targeted support.
Administration Training and Manuals
CDE and Pearson worked collaboratively to develop a variety of CMAS Science and Social
Studies manuals and training resources. In addition to a recorded webinar that was available to
all districts as needed, the Assessment Unit conducted nine in-person CMAS procedure
trainings across the state. Training locations included Grand Junction, Durango, La Junta,
Alamosa, Colorado Springs, Limon (2 training events because of weather complications),
Greeley, Broomfield/Westminster, and Centennial.
Customer Service and Support for Districts during Administration
Multiple support structures were available to schools that encountered challenges during
assessment administration. Pearson’s support model included a first line customer service
center (help desk) that was available to schools via telephone and instant message; a level 2
support team that assisted districts with more complicated or persistent technology challenges;
the Pearson Colorado program team whose team members work closely with the CDE
Assessment Unit and are dedicated to the Colorado program; and an onsite technology team
comprised of Pearson technology experts who provided onsite assistance to districts/schools
experiencing persistent technology challenges.
During the administration window, Pearson customer service received a total of 6,399 calls and
613 emails. The average speed for a customer service agent to answer a call was 21 seconds
and the average handling time of each call was 11 minutes and 40 seconds. The majority of
customer service inquiries (approximately 81%) were related to technology during
administration. Of these, about 27% of questions pertained to the online test engine, TestNav
8, and about 54% of questions were related to the PearsonAccess test management portal (e.g.,
managing test sessions). Customer service was also contacted about resolving password issues
(about 10% of calls), uploading, managing and submitting student data (about 5% of calls), and
other miscellaneous topics such as preparing technology for online testing, ordering additional
materials, and clarifying paper administration guidelines.
Additionally, district technology coordinators were provided with a telephone bridge number
staffed by the readiness team from CDE and Pearson. This gave district technology coordinators
the opportunity to ask last-minute site readiness questions or troubleshoot technology
issues in their assessment environments. The bridge was available from the week before the
assessment window opened through the first week of the assessment administration, and it
provided coverage when the Pearson help desk experienced a phone outage, so that district
technology coordinators were still able to receive support. The resiliency and
resourcefulness of the district technology coordinators were impressive. Many district
technology coordinators used the community of practice Google group to post challenges and
triumphs in their assessment environments so that their peers could learn from their
experiences and benefit from their successes.
Beyond the Pearson support structure, CDE Assessment Unit staff worked closely with districts
before, during, and after assessment administration to ensure that issues were escalated to
Pearson as needed and to resolve policy and procedural issues.
Documenting Feedback, Identifying Systemic Issues and Working towards Solutions
Throughout the administration, Pearson and CDE documented issues so that they could be
identified and resolved for future administrations. Detailed customer service logs were created
during administration to document customer interactions and identify systemic issues.
Districts were provided with multiple stakeholder surveys including surveys for district
assessment coordinators, district technology coordinators, CMAS test administrators, and CoAlt
test examiners. Survey results are currently under review and will be shared in August and
used to improve future administrations. Additionally, CDE convened a representative group of
district assessment coordinators and district technology coordinators on May 21st to discuss
and gather feedback on the science and social studies administration.
Administration Issues and Concerns
Colorado was one of the first states to use Pearson’s new online test engine, TestNav 8. The
move to the new testing platform allowed schools and districts to use iPads and Chromebooks
for testing, which districts indicated was a priority for this spring. The move to the new system
did, however, mean that schools and students had to work through the glitches and bugs
expected in a new system. With a few exceptions, schools were able to resolve issues and
complete computer-based administration successfully. Approximately 99% of Colorado students
took the online assessment, and initial data indicate that approximately 99% of those students
were able to complete testing successfully. Most of the roughly one percent of students who used
the paper-based assessments did so because of accommodation needs, rather than technology
challenges. Colorado's online schools did experience challenges setting up appropriate online
testing environments; where they were not successful in establishing those environments, they
were allowed to use the paper-based form of the test.
The lessons learned this spring will be used to improve Pearson’s online test engine and
technology preparation guidance for next year. The efforts and patience of district and school
assessment coordinators, district technology coordinators, test administrators and students
were invaluable during the administration. Based on recurrent and commonly experienced
administration issues and concerns, the following priorities have been established for future
administrations:
•	Closely align PearsonAccess training and operational sites. The PearsonAccess training
and operational sites did not tightly align in certain places. Pearson has been directed
to ensure that the two sites are closely aligned so that districts will be able to practice
and refine their skills and knowledge base as far in advance of the assessment
administration as possible.
•	Streamline the process for accommodation and special form assignment. The Student
Data Upload process was somewhat cumbersome and required districts to record
accommodation/special form information for students at more than one step of the
process. This duplicative process was confusing for some districts and required targeted
assistance in many cases. Streamlining this process has been identified as a top priority
for future administrations.
•	Improve the TestNav 8 test engine. As expected with a new test engine, glitches
occurred. In some cases students were logged out of the test when the test engine, local
internet, local intranet, device, operating system and/or browser were not interacting
optimally. When this occurred, students needed to log back in to the assessment in order
to continue testing. An estimated 3% of students experienced persistent challenges
requiring repeated login attempts. Adjusting configurations (e.g., browser selections,
content filter settings) tended to address the issue and prevent recurrence on
subsequent testing days. CDE has prioritized this issue for Pearson to address.
•	Improve the TestNav exit process. The TestNav exit process was inconsistent across
sections and overly complicated, involving too many steps. CDE has provided Pearson
with direct and explicit changes that need to be made to the exit process to make it
more efficient and intuitive.
•	Refine PearsonAccess reporting features and status indicators. Improvements to the
PearsonAccess reporting features are needed to help districts and schools track the
progress of their students during administration (i.e., which sections each student has
completed).
•	Improve Pearson customer service. Because TestNav 8 was a new system not only for
Colorado but also for Pearson, first line customer service was not able to provide solutions
as quickly as typically expected. Based in part on analysis of the customer service logs,
additional responses to common questions will continue to be developed and utilized
during the next administration.
CMAS: PARCC English Language Arts and Mathematics
PARCC assessments comprise two components: a performance-based assessment and
an end-of-year assessment. The English language arts performance-based assessment includes
literature analysis, narrative writing, and research simulation tasks. The mathematics
performance-based assessment includes tasks that require students to solve problems, express
mathematical reasoning, construct mathematical arguments, and apply concepts to solve real-world problems. Some student responses to the performance-based assessment will be
machine scored, while others will be scored by humans. The English language arts and
mathematics end-of-year assessments involve tasks that require students to demonstrate their
content-specific acquired skills and knowledge. Unlike the performance-based assessments, all
responses on the end-of-year assessment will be machine scored.
Preparation for Administration
PARCC developed multiple training resources and support documentation to prepare districts
for the field test administration. The CDE Assessment Unit also provided resources. In-person
PARCC administration trainings were included in many of the CMAS trainings throughout the
state. Webinar recordings were also available. Like the science and social studies assessments,
the PARCC assessments were delivered on Pearson’s new TestNav 8 testing engine. Therefore,
the technology resources and supports that were put in place by Colorado for the science and
social studies assessments were directly transferrable to the PARCC assessments.
PARCC Field Tests
The purpose of the PARCC field tests was to examine assessment items and ensure that the
items selected for the 2014-15 operational assessments are of high quality, to test the
assessment administration procedures, and to provide schools and districts with the
opportunity to administer the PARCC assessments. Student results will not be available for
students or schools.
The performance-based assessment field test window was open March 24-April 11, and the
end-of-year field test window was open May 5-June 6. Some students participated in online
assessment administration while others took the assessments via paper. Throughout all of the
PARCC states, roughly 790,000 students completed online performance-based and end-of-year
tests. Roughly 170,000 students took the performance-based assessments on paper and just
over 100,000 took the end-of-year on paper. In Colorado, about 279 schools in about 76
districts participated in the performance-based assessment field test. About 324 schools in
about 82 districts participated in the end-of-year field test. Across the two field tests,
approximately 26,000 Colorado students participated in the computer-based administration.
Paper participation rates are not yet available by state; however, approximately 65% of
Colorado schools were selected to test online and 35% of schools were selected to test on
paper. Most students participated in only one assessment component in one content area.
Support during the PARCC Field Test Administration
Most districts called the Pearson help desk rather than CDE. One of the main call drivers for
Colorado was password assistance; callers were often confused about the differences between
the PARCC PearsonAccess site and the science and social studies PearsonAccess site.
The performance-based test administration had a higher call volume than the end-of-year test
administration. During both administrations, the average time to handle a help desk call was
approximately 16 minutes, and calls were typically answered in less than five seconds.
When the help desk experienced delays, the number of abandoned calls increased. The top
call drivers also depended on where a school was within the testing window: at the
beginning of the window, test session management, TestNav use, and TestNav errors drove
high call volumes, while at the end of the window, returning test materials was a primary driver.
Similar to what was reported during the science and social studies assessment administration,
some districts indicated that they experienced difficulties with Pearson help desk staff, who
sometimes provided incorrect or conflicting information. Some districts experienced Java
issues that were addressed by dismissing the error and having students log back in; in other
districts, these Java errors kept students from testing until the error was fixed or a
different browser was used. Some software updates (Java and Firefox) released during the
administration were problematic, although most schools were aware of the Java update and
chose not to install the latest version to avoid complications.
Feedback from Field Test Participants
The lessons learned during this administration are being used by PARCC member states to
refine and resolve administration issues for the spring assessments. According to feedback and
the completion rates, most students were able to use the platform. Students who were
exposed to the sample questions and tutorials before testing demonstrated greater familiarity
with the platform. Districts participating in the PARCC field tests experienced technology
challenges that were similar to those experienced during the science and social studies
administration including:
•	TestNav 8 was sensitive to device, operating system, browser and test engine
interactions;
•	New item types and tools (particularly the math equation editor) were difficult for some
students to navigate;
•	Handling software updates that were sometimes scheduled during the assessment
window could be complex; and
•	Customer service did not always know how to resolve issues.
CDE gathered a group of representative districts to provide feedback on their field test
experiences. Many of the issues discussed were similar to those raised during the science and
social studies feedback session. This feedback is being used by CDE to prioritize improvements
that need to be made for future administrations. Additionally, a variety of stakeholder surveys
were provided to districts. Survey results from the end-of-year field test are still being gathered
by PARCC. Survey results will be reported by PARCC and used to identify systemic issues and
improve future processes.
Summary
Overall, Colorado districts succeeded in administering the state’s first online statewide
assessments. Challenges were encountered along the way; however, districts and schools were
committed to working through them in a manner that allowed students to complete testing.
Lessons learned during the science and social studies administration and the PARCC
administration will be invaluable to improving next year’s assessment administration. Many of
the issues and concerns expressed by districts affirmed the list of improvements that had
already been identified and prioritized by CDE. CDE will continue to work with districts, schools,
Pearson and PARCC to support districts in continuously improving future online
administrations.