October 2011 WERA Newsletter
Washington Educational Research Association
Seattle, WA

President’s Column
In Washington State 12 years
ago, we were implementing
the High School WASL for
the first time. Boy, that
seems like a hundred years
ago! Since then the state
has added CBAs, CAA and
CIA options, as well as
transitioned the on-demand
tests to the MSP and HSPE,
and most recently the EOC
tests. For those of you new to Washington state,
welcome to our world of acronyms*!
This all officially started in 1993 when the state
legislature passed the Education Reform Act (House
Bill 1209) establishing guidelines to create clear,
rigorous, academic standards for what every student
is expected to know and do by graduation. As with
changes in the state assessments over the years, the
standards too have evolved through various versions,
adding specificity to the EALRs through the
development of GLEs, and ratcheting up the math
standards to stay in line with higher expectations.
And now we are facing an entirely new set of
standards (CCSS) and an entirely new assessment
system (SBA) to be implemented over the next few
years. Yikes!
Thank goodness we have an organization like WERA
to help us navigate these treacherous waters! For
example, at the upcoming December Assessment
Conference, where WERA and OSPI forge a
partnership to help us all learn the most up-to-date
information, the more than 50 breakout sessions
include many that address the new Common Core
State Standards as well as changes in the state
assessment. Even though we are all grappling with
reduced resources, this conference is a “one-stop
shopping” bargain for the latest state updates as well
as recent advances in research that have powerful
implications for our work. Hence the theme of the
conference: “Assessing What We Value; Valuing What
We Assess.”
President’s Column
Editor’s Column
WERA Community News
Data Café
NAEP Corner
WERA Business News
WERA Network of Assessment Directors Minutes
Calendar of Events
WERA/OSPI Conference
Conference Keynote Speakers
Volunteering in Paradise
Interview with Deb Came
CogAT Author Discusses Issues…
Both the December Assessment Conference and the
one-day 2012 Spring Conference (“Measuring
Progress: Concepts and Applications” on March 29,
2012) reflect the first of WERA’s four goals for 2011-12, established by the Executive Board at our June meeting:
Provide conference programs,
professional development offerings,
and resources that address the
professional needs and emerging
issues of our members, especially in
the application of educational
research to improve instructional
and assessment practices and the
appropriate use of data and
information to make informed,
objective decisions.
WERA’s other three goals address strategies
designed to keep the organization strong
and current in providing the most cost-effective services to our members:
Update WERA’s communications
and publications in ways that will
optimize the dissemination of
relevant and valuable information
to our members in a timely and
effective manner.
Establish financial procedures and a
budget that will continue to ensure
WERA’s financial strength.
Increase WERA membership by
providing value-added membership
services.
The Standard Deviation
Page 2 / October 2011
In line with our second goal, we hope you have noticed that we are using more electronic means of
communication. In addition, we are discussing the possibilities of interactive features on the WERA website. In
line with our third goal, we updated our financial auditing procedures following the advice of an outside expert.
And in line with our fourth goal, we are discussing the idea of offering the option of WERA-sponsored Special
Interest Groups (SIGs) to members who have focused interests (e.g., early learning, ELL). All of us on the
Executive Board welcome suggestions and input from our members with regard to these or other ideas that will
help keep WERA current and proactive in optimally meeting our members’ needs.
For, as we all know, change just keeps on coming. Whether or not we like it, as educators we must keep up with
the seemingly never-ending changes, and I am happy to report that WERA is here to help us do just that.
--Nancy Katims, Ph.D., is WERA President, and Director of Assessment, Research, and Evaluation for Edmonds School
District. She can be contacted at [email protected].
*Acronym guide:
CBA: Classroom-Based Assessments
CAA: Certificate of Academic Achievement
CCSS: Common Core State Standards
CIA: Central Intellig… oops… Certificate of Individual Achievement
EALR: Essential Academic Learning Requirements
ELL: English Language Learners
GLE: Grade-Level Expectations
HSPE: High School Proficiency Exam
MSP: Measurements of Student Progress
OSPI: Office of Superintendent of Public Instruction
SBA: SMARTER Balanced Assessment
WASL: Washington Assessment of Student Learning
WERA: Washington Educational Research Association
Editor’s Column
by Andrea Meld, Ph.D.
One of the rewards of being an editor is to see good
writing take off. I would like to thank the many
contributors who have made this deluxe edition of
The Standard Deviation possible:
• President Nancy Katims for the WERA President’s Column,
• President-Elect Pete Bylsma for compiling the WERA Business News,
• Dawn Wakeley for preparing the minutes of the August 2011 Network of Assessment Directors meeting,
• Past President Bob Silverman, Board Member Christopher Hanczrik, and Executive Secretary Sharon Rockwood, who took charge of providing information and publicity for the upcoming WERA/OSPI conference in December,
• Past Board Member Phil Dommes for his evaluation of Dr. Lohman’s presentation on the identification of gifted students at the Spring 2011 WERA Conference,
• NAEP Coordinator Angie Mangiantini for her report on mapping state assessments to NAEP standards, and
• WERA Member Jill Hearne for her noteworthy essay on her experiences as a literacy volunteer in the Caribbean island of
And now a word from Peter Hendrickson, editor of
the WERA Educational Journal:
December Issue: Career and College Readiness
The December WERA Educational Journal, a
peer-reviewed publication, will focus on career and
college readiness. Look for several papers, including
a community college perspective, validity issues
and measures that predict readiness. A brace of
book reviews will embrace both the focus and the
upcoming December conference. And look for
columns on ethics, food, assessment and puzzles.
The May 2012 issue will focus on measuring growth
in student achievement. Inquiries and submissions
should be addressed to Editor Peter Hendrickson at
[email protected].
--Andrea Meld, Ph.D. is Applications and Assessment
Analyst for the Kent School District and editor of
The Standard Deviation.
“Teachers open the door, but you
must walk through it yourself”
-Chinese Proverb
Other features in this issue include an interview
with Deb Came, the new Director of Student
Information at OSPI, over a dozen data tips, tricks
and trivia in Data Café, WERA Member News,
photos from past conferences, and the Events
Calendar. We hope you enjoy this issue of The
Standard Deviation. Please send all comments,
suggestions, and contributions to the editor at
[email protected].
Check out the WERA Publications page http://www.wera-web.org/pages/publications.php
WERA Community News
Deb Came is the new Director of Student
Information at OSPI. Previously she worked at OFM
for 11 years, most recently at the Education Research
and Data Center, which is implementing the state’s
new Preschool through Workforce (P-20) longitudinal
data warehouse. (Read an interview with Deb in this
issue of TSD.)
Margaret Ho is now the English Language
Assessment Coordinator at OSPI. Margaret previously
worked for the Edmonds School District in the area
of bilingual education. She also worked in
test development for Hawaii, and at OSPI in the
Migrant and Bilingual division, where she was
instrumental in developing the WLPT-II assessment.
Scenes from recent WERA Conferences
Marty McCall is now serving as the Psychometrician
for the Smarter Balanced Assessment Consortium. Marty comes to
SBAC from the Northwest Evaluation Association
(NWEA), where she led the development of multiple
facets of computer adaptive test design. Prior to
NWEA, she was the Psychometrician for OSPI.
Andrea Meld, formerly Data Analyst in Assessment
and Student Information at OSPI, has just started as
Applications Analyst/Data Analyst for Testing and
Assessment at the Kent School District. Andrea
worked at OSPI for more than eight years.
Tonya Middling is now Coordinator of Instructional
Leadership and Professional Learning at Kent School
District. Tonya previously worked at OSPI for 11
years. Most recently, she served as Director of
Secondary Education and School Improvement, OSPI.
Don Orlich, Professor Emeritus of Science Education
at Washington State University, announced that the
10th edition of Teaching Strategies: A Guide to
Effective Instruction will be published by
Wadsworth/Cengage Learning this fall. His coauthors
are Robert Harder, Richard Callahan, Mike Trevisan,
Abbie Brown, and Darcy Miller.
Andrew Dean Ho, then at University of Iowa,
discusses “problems with proficiency” at the
Spring 2009 WERA Conference
Ben Rarick is the new Executive Director for the
State Board of Education. Mr. Rarick previously
worked for the Washington State House of
Representatives as a Senior Budget Analyst for the
Education Appropriations and Ways and Means
Committees. He was also Project Coordinator in the
Evans School of Public Affairs at the University of
Washington, and Director and Policy Associate for
the New Jersey Department of Education.
Kathryn Sprigg is now serving as Interim Director at
the Office of Accountability in the Highline School
District, replacing Alan Spicciati, who is the Interim
Superintendent. Prior to joining Highline, she worked
at OSPI, where she was the Washington State
Coordinator for the National Assessment of
Educational Progress (NAEP).
Conversation at 2009 WERA conference
Data Café
A Baker’s Dozen: 13 Data Tips, Tricks and Trivia
by Andrea Meld, Ph.D.
1. Origins of the Baker’s Dozen
If you’ve ever purchased a baker’s dozen of donuts,
cookies, or dinner rolls, you bought 13 items. There are
several theories about the origins of this term. The
standard theory goes back to the 12th century.
Apparently, bakers provided an extra item to avoid the
appearance of shortchanging their customers, which
could result in severe punishment. A charitable theory is
that during the Depression, bakers gave their customers
an extra portion as an act of kindness. Still another
explanation is based on geometry. Before the advent of
muffin tins, bakers found that the most efficient way to
arrange round objects in a rectangular cookie sheet and
avoid the corners, which conduct heat unevenly, is to
follow a compact pattern of three rows of 4, 5, and 4.
They then sold each batch as a “baker’s dozen.”
Sidebar: In 1831, Carl Friedrich Gauss, whose legacy
includes the bell-shaped curves used in statistics, proved
that hexagonal packing of spheres yields the highest
density of all 3-D lattice arrangements.
2. Create Roman Numerals in Excel
For numbers up to 3,999, the formula =ROMAN(A1)
produces Roman numerals in an adjacent column.
Examples of calendar years: 1492 = MCDXCII, 1918 =
MCMXVIII, 2011 = MMXI. You can’t add or subtract the
resulting Roman numerals, however.
3. How to Keep Leading Zeros in Excel When Pasting
in Data
Highlight the column where you will be pasting the data.
Right click to get Format Cells. From the Custom
category, highlight 000-00-0000. It will appear in the
Type box. Delete the dashes, and click on OK. When you
paste data to this column the leading zeros will remain
in place.
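The same safeguard applies when the data pass through a script instead of Excel: keep identifiers as text. A minimal Python sketch, with hypothetical values, re-pads numbers that have already lost their zeros:

```python
# IDs stored as numbers have lost their leading zeros; convert to
# text and re-pad to a fixed width (9 digits here, matching the
# 000-00-0000 pattern with the dashes removed).
raw_ids = [123456789, 7890123]  # hypothetical numeric IDs
padded = [str(n).zfill(9) for n in raw_ids]
print(padded)  # ['123456789', '007890123']
```

Treating the column as text from the start, as in the Excel tip above, avoids the problem entirely.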
4. How to Concatenate in Excel
For example, if you wish to concatenate first name,
middle initial, and last name with spaces between, and a
period after the middle initial, the formula is
=CONCATENATE(A1, " ", B1, ". ", C1). Examples:
Alice C Ford and Bill T Billings in three separate columns
become Alice C. Ford and Bill T. Billings in one column.
5. Remove Duplicate Records in Excel
I suggest copying the column of interest into another
field to preserve the original data. Highlight the copied
field and from the Data tab, select Remove Duplicates.
Continued on next page…
6. Using Conditional Statements in Excel
For example, a formula for showing whether a student
met standard based on score might read
=IF(B1>=500, "Met", "Not Met"). If the score is 500 or
above, the student met standard; otherwise the student
did not meet standard. You can also use conditional
formatting in the tool bar to highlight the scores that
met standard. In this case, the rule is that cells greater
than 499 will be highlighted in green text and background.
7. Using IF Statements in SPSS to Create New Variables
"IF" statements can be used in a variety of ways in
SPSS. Although string variables must be defined first,
numeric variables need not be. In the examples below,
MetStandard is a new string variable and Group is a
new numeric variable:
STRING MetStandard (A3).
IF (Score GE 500) MetStandard = "Yes".
IF ANY(ReportingGrade, "03", "04", "05")
Group = 1.
IF ANY(ReportingGrade, "06", "07", "08")
Group = 2.
8. How To Concatenate in SPSS
The first command creates a new variable,
StudentFullName. The second command creates the
concatenated name and removes any extra spaces
between the first and last name.
STRING StudentFullName (A16).
COMPUTE StudentFullName
=concat(rtrim(firstname), " ", lastname).
For example, firstname "Davey" and lastname
"Crocket" become "Davey Crocket".
9. Identify Duplicate Records in SPSS
You can identify duplicate records from the drop-down
menu under Data, or run the syntax that dialog
generates. The fragment below assumes the
PrimaryFirst and PrimaryLast flags have already been
created (the dialog does this with MATCH FILES
/FIRST and /LAST after sorting):
DO IF (PrimaryFirst).
COMPUTE MatchSequence=1-PrimaryLast.
ELSE.
COMPUTE MatchSequence=MatchSequence+1.
END IF.
LEAVE MatchSequence.
FORMATS MatchSequence (f7).
COMPUTE InDupGrp=MatchSequence>0.
MATCH FILES
/FILE=*
/DROP=PrimaryFirst InDupGrp MatchSequence.
VARIABLE LABELS PrimaryLast 'Indicator of
each last matching case as Primary'.
VALUE LABELS PrimaryLast 0 'Duplicate Case' 1
'Primary Case'.
FREQUENCIES VARIABLES=PrimaryLast.
10. Create Aggregate Average Scores by School in SPSS
You may want to create data files at the aggregate
level by school, for example, from student level data.
Here is an example. First convert variables to
numeric if they are in string format. Starting with
student level data, the following syntax creates a new
SPSS data file with means and standard deviations
for selected subjects by grade level and school (the
output filename is illustrative):
ALTER TYPE ReadingScaleScore MathScaleScore
ScienceScaleScore (F8.0).
AGGREGATE
/OUTFILE='SchoolMeans.sav'
/BREAK=Grade DistrictCode DistrictName
SchoolCode SchoolName
/ReadingMean=MEAN(ReadingScaleScore)
/ReadingSD=SD(ReadingScaleScore)
/MathMean=MEAN(MathScaleScore)
/MathSD=SD(MathScaleScore)
/ScienceMean=MEAN(ScienceScaleScore)
/ScienceSD=SD(ScienceScaleScore).
11. Create Data Dictionaries (file formats) for SPSS.
From the menu, click on File, then Display Data File
Information, then click on Working File. From the
Output view, click on File, then Export, and a screen will appear. Save as an Excel file to your directory and click
on OK. When you retrieve the dictionary in Excel, you can delete any extraneous material and format as you like.
12. Create Your Own Family Tree.
The New York Times has posted a web page where you can learn about multicultural families and create your
own family tree. http://www.nytimes.com/interactive/us/family-trees.html#index
13. “Artistic Effects” with Clip Art.
Right click on the image, and a menu of various effects under "Format Picture" appears that can change the
appearance of graphics for data display or PowerPoint presentations.
Below, left to right: original image, watercolor sponge effect, glow edges, pencil sketch.
-Andrea Meld, Ph.D., is editor of The Standard Deviation.
Scenes from recent
WERA Conferences
WERA members Ric Williams (Everett SD) and
Ali Williams (Mukilteo SD)
Joe Wilhoft, who is now Executive Director,
Smarter Balanced Assessment Consortium
Tara Jeffries (Mukilteo SD)
NAEP Corner
by Angie Mangiantini
In August 2011, the National Center for Education
Statistics released an analysis of the NAEP 2009 state
assessment results and state level achievement data
in reading and mathematics for grades 4 and 8. This
study assigns a NAEP scale equivalent score based on
the percent proficient on the state assessment for
the 2008-2009 school year. Please note that in all of
the analyses completed in this study, the data are
based solely on schools that participated in NAEP
reading and mathematics assessments in 2005, 2007
and 2009 at grades 4 and 8.
Mapping state standards onto the NAEP scales
The percentage of students, reported on the state
assessment as meeting standard, in each school
assessed by NAEP in 2009 is matched to the point on
the NAEP scale corresponding to that percentage. For
example, suppose 60% of eighth graders in a school that
participated in NAEP in 2009 achieved proficient on
the state assessment in mathematics. The school’s
2009 NAEP results are then reviewed to see at which
scale score on the NAEP scale 60% of the students are
performing. The NAEP equivalent score is then
assigned for the school. This process is done for
each school in the state that participated in the 2009
NAEP assessments. The NAEP equivalent scores are
then aggregated by subject and grade. Relative error
is calculated for scale scores and is a measure of how
well the mapping procedure reproduces the
percentages reported by the state as meeting the
standard in each NAEP participating school. When
the relative error is greater than .5 then it is
considered too large to support useful inferences
from the placement of the state standard on the
NAEP scale without additional evidence.
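The mapping step can be pictured with a toy calculation. This is only a simplified sketch of the equipercentile idea; the actual NCES procedure works from NAEP’s estimated score distributions, weights schools, and computes the relative error described above. The function name and data are hypothetical.

```python
import numpy as np

def naep_equivalent(pct_meeting_standard, school_naep_scores):
    """NAEP scale point that the top `pct_meeting_standard` percent
    of this school's NAEP takers reach or exceed."""
    # The score with (100 - pct) percent of students below it is the
    # score that pct percent of students score at or above.
    return float(np.percentile(school_naep_scores,
                               100 - pct_meeting_standard))

# Toy school: 100 NAEP takers with scores spread from 201 to 300,
# and 60% of its students met standard on the state test.
scores = np.arange(201, 301)
print(round(naep_equivalent(60, scores), 1))  # 240.6
```

Aggregating these school-level equivalent scores by subject and grade gives the state’s NAEP equivalent score.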
Comparisons over time
Comparisons between the 2009 mapping results and
the 2005 or 2007 mapping results in reading and
mathematics at grades 4 and 8 were conducted
separately. This was done to find the extent to which
NAEP corroborated the changes in achievement
measured on the state assessment.
Data sources
This study uses (a) NAEP data files for states that
participated in 2005, 2007 and 2009 reading and
mathematics assessments, (b) state assessment
school level files from the National Longitudinal
School State Assessment Score Database, and (c)
school level achievement data for the 2006-07 and
2008-09 school years from EDFacts.
Using NAEP as a common yardstick, these analyses
compared state proficiency standards (cut scores)
with one another.
Grade 4 Reading
In grade 4 reading for 2009, the NAEP equivalent
score for Washington was 205. This was an increase
of 3 points from 2007 and 8 points from 2005. The
2009 NAEP equivalent scores ranged from a high in
Massachusetts of 234 to a low in Tennessee of 170,
a range of 64 points. Washington is ranked 16th in
the nation in reading in this report. Between 2007
and 2009, Washington students in grade 4 made
greater gains on the NAEP reading assessment than
on the state assessment.
Grade 8 Reading
The grade 8 reading NAEP equivalent score for 2009
was 253. There was no change in the score from
2007 to 2009. Washington 8th grade students are
ranked 11th in the nation in reading. The high
equivalent score was 267 for Missouri and the lowest
score was in Texas at 201. The range is 66 points.
The relative error around Washington’s equivalent
score is greater than .5. This means that NAEP and
the state assessment may not measure the same
subject matter. Therefore, making inferences about
the state proficiency cut score for grade 8 reading in
Washington based on the NAEP equivalent scale
score is not advised. Additional studies comparing
the NAEP reading assessment and Washington’s
grade 8 reading assessment would have to be
completed before any inferences may be made.
Grade 4 Mathematics
Washington ranked third in the nation in
Mathematics. Our NAEP equivalent score was 243.
The highest score was 255 for Massachusetts and the
lowest in Tennessee with a score of 195.
Washington’s score increased by 4 points from 2007
to 2009 and by 8 points from 2005 to 2009. Between
2007 and 2009, changes for grade 4 were more
positive on the NAEP mathematics assessment than
on the state assessment.
Grade 8 Mathematics
Washington’s grade 8 mathematics proficiency
standards are higher than all other states except
Massachusetts. Washington’s NAEP equivalent score
was 288; Massachusetts’ was 300. The range of the
scores was 71 points, with Massachusetts’ high of
300 and Tennessee’s low of 229.
Washington, like 35 other states, has set a proficiency
cut score on the state reading assessment at the
Below Basic performance level for grade 4.
Conclusions cannot be made for the grade 8 2009
NAEP reading equivalent scale score due to the
relative error. The authors state on page 33 of the
study, “Setting a criterion (relative error) serves to
call attention to the cases in which we should
consider the limitation on the validity of the mapping as an explanation for otherwise unexplainable results.”
“…because the relative error criterion is arbitrary, results for all states are included in the report, regardless of
the relative error of the mapping of the standards.”
At both grade 4 and 8 in mathematics, Washington’s equivalent scale scores are among the highest in the nation.
In 4th and 8th grades, the proficiency cut scores on state mathematics assessment have been set to the NAEP Basic
achievement level. The complete report can be found at:
-Angie Mangiantini is NAEP State Coordinator for OSPI. She can be reached at [email protected]
WERA Business News
by Pete Bylsma, Ed.D.
The WERA Board gathered in June 2011 with its three new members to plan for the coming year. With President
Nancy Katims at the helm, the Board examined WERA’s membership and financial trends, ways to improve the
annual conferences, and options for professional development during the year. Other groups have gathered to
plan future activities. Here are some highlights:
• Last year we had 440 members. We have averaged more than 500 members in recent years, but Washington is still the largest state education research association in the country.
• We held four successful conferences last year. The largest was the Assessment Conference in December, and three smaller 1-day conferences focused on the new state accountability index (January, Pete Bylsma), brain research (April, John Medina), and educating gifted students (June, David Lohman). Documents for these events are posted on the WERA Web site.
• Various committees are planning the December and March conferences (see other articles in the newsletter about these events).
• Annual membership is now included in the conference registration, and the registration process has been simplified.
• WERA is in healthy financial shape. An independent CPA reviewed our current procedures and made some recommendations to ensure we conform with current standards. Another independent consultant involved in the financial audit remarked that our financial records were among the best he had ever seen from a non-profit organization.
• The Board is discussing ways to increase revenues from education agencies, businesses, and foundations so WERA can offer more funding for grants and to help pay conference fees for those who cannot afford to attend due to cuts in professional development budgets.
• We will be improving the WERA Web site and considering ways to improve communications among our members (perhaps through social networking sites).
• Nominations for next year’s Board are being compiled. If you are interested in being on the Board or know somebody who would be good in that role, please contact Nancy Katims. We will release the final slate of candidates at the December conference.
-Pete Bylsma, Ed.D., is WERA President-Elect, and Director of Assessment and Student Information for Renton School District.
WERA Network of Assessment Directors August 2011 Meeting Minutes
by Dawn Wakeley
Over 40 people from across the state attended the
August 19, 2011, meeting of the Network of
Assessment Directors at the SeaTac Hilton. The
Assessment network was pleased to have several
OSPI staff members provide their insights about
state assessments and work that occurred over the
summer in standard setting for math and science.
They also previewed work that will occur this fall as
we enter another year of teaching, learning, and
assessment in Washington state.
Network members began by informally sharing their
preliminary district results. The August network
meeting takes place before the release of state
results, so discussion helps participants put their
own district results in perspective. Pat Cummings,
Director of Research and Evaluation in Tacoma,
showed some of the data displays he uses to
visualize the assessment data over time for his
district. A number of assessment directors noted the
degree of year-to-year variability in assessment
results and discussed ways to visualize it so that
teams do not over-interpret year-to-year
changes. Comparison of individual results to a
reasonable comparison group is a good practice; for
example, the results of a school compared to its
district’s average or to a group of schools with
comparable demographics; the results of a district
compared to the state’s average or to a group of
districts with comparable demographics.
Robin Munson, OSPI Assistant Superintendent of
Assessment and Student Information, gave an update
on new agency staff, state budget challenges and
impacts, and the process for standard setting for End
of Course (EOC) math and new grades 5 and 8
science assessments. She also shared information on
WAAS Portfolio revisions and the new Washington
Language Proficiency Test, operational this year.
Budget reductions continue to impact us all, with the
certainty of more on the horizon. The OSPI
assessment budget was reduced by $700,000 with
some of that in contract renewals, staff time
(furlough 5.2 hours per employee/month) and
Collection of Evidence (COE). COE payments to
districts have been reduced from $300 to $200, and
OSPI may pay for only one COE per student per
subject area. An FAQ will be posted to the OSPI
website, with more information posted in the
Washington Assessment Weekly.
Robin reviewed the timeline for the work completed
over the summer on standard setting and how the
recommendations for the cut scores came from
multiple sources and included many teachers and
district central office staff, as well as content experts
from across the state.
• The Contrasting Groups Study involved individual ratings of students by their teachers before the tests were given (250 teachers participated, with 13,240 students across the state);
• Grade-level Panels participated in standard setting activities across three days during the summer, resulting in a set of recommended cut scores (115 participants, including teachers and district administrators);
• Articulation Panels reviewed the grade/course level recommendations, resulting in slightly adjusted cut scores so that the progression between grade levels made sense (16 participants from across the grade levels);
• The Policy Advisory Panel reviewed both sets of recommendations in light of district policy issues and made separate recommendations (13 participants, including administrators and superintendents).
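For readers new to the method, the contrasting groups logic can be sketched in a few lines: teachers judge each student as meeting or not meeting standard independently of the test, and the cut score is placed where it best separates the two groups. The toy version below simply maximizes agreement with the teacher judgments; operational standard setting is far more elaborate, and the names and data here are hypothetical.

```python
def contrasting_groups_cut(scores, teacher_says_meets):
    """Pick the cut score that best separates teacher-judged groups."""
    best_cut, best_agreement = None, -1.0
    for cut in sorted(set(scores)):
        # Fraction of students whose score-based classification
        # (score >= cut) matches the teacher's judgment.
        agreement = sum(
            (s >= cut) == m for s, m in zip(scores, teacher_says_meets)
        ) / len(scores)
        if agreement > best_agreement:
            best_cut, best_agreement = cut, agreement
    return best_cut

# Toy data: teacher judgments line up with scores of 400 and above.
scores = [380, 390, 395, 400, 410, 420, 430]
meets = [False, False, False, True, True, True, True]
print(contrasting_groups_cut(scores, meets))  # 400
```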
OSPI expects to continue with the same methodology
of standard setting. Robin emphasized the
importance of a robust data set for the contrasting
group study as the starting point for standard
setting. There will be opportunities in the new school
year for teachers to participate in a contrasting
group study for the new Biology EOC assessment.
Webinars may be used as they were last year for
teacher training. Pay special attention to course
coding, as with math classes last year, to ensure the
right EOC tests are associated with the classes. The
new Biology EOC is the grade 10 state science test, so
any grade 10 students not in a biology course this
year will still need to take the assessment for AYP.
Greta Bornemann, Director of Mathematics Teaching
and Learning, OSPI, stressed the importance of the
performance level descriptors (PLDs) in both the
contrasting group study and in providing support
and focus for teachers to better understand the
targets. She discussed the value of working with the
performance level descriptors for many teacher
groups and suggested that Assessment Directors
may continue to leverage that with the EOC Biology
work coming up this fall. The performance level
descriptors for Algebra and Geometry can be found
at www.k12.wa.us/assessment/statetesting/PLD/default.aspx
ETS conducted a mode-to-mode analysis (pencil and
paper vs. on-line) for science at the direction of OSPI
at grades 5 and 8. Matched students were selected
based on prior math and reading performance. The
conclusion reached was that “In general, students
performed very similarly regardless of testing mode.”
Items that were the same on both tests performed
similarly across the two modes.
Directors discussed the challenges of meeting state
targets for participation until adequate computer
and internet access are available. Currently there is a
great deal of variation among districts.
Robin provided some updates on AYP. The proposal
the state made to the Department of Education (DOE)
has not been approved yet. Direction to districts
regarding the AYP bar adjustment (if any) will be
provided once a decision is reached.
Superintendent Dorn is unlikely to rush into a waiver
process, wanting to clearly understand the details
embedded in any waiver process for the state and
any impact on school districts.
EOC Algebra and Geometry exams will be offered in
winter 2012. These exams are not available as an
early AYP test for grade 10 students who are in EOC
math courses ending in spring. Winter 2012 math
EOC exams are available for:
• Students in EOC math courses that conclude at
the first semester break (e.g., block schedules),
regardless of grade level;
• Students who took the spring 2011 math tests
but did not pass or were absent;
• Grade 11 and 12 students who have not met
math graduation requirements.
OSPI will release the revisions in the WAAS
assessment and the Language Proficiency test this
fall. Dr. Margaret Ho is the new coordinator for the
Language Proficiency Assessment program; the
vendor is CTB/McGraw-Hill. Districts will continue to
use the current placement tests this fall, but the
annual testing after the first of the year will be with
the new assessment. More information will be provided in
early fall.
The state report card will use new calculations for
graduation rate, based on the National Governors
Association formula. The new formula, required by
all states as of August 2012, is designed to ensure a
consistent methodology across states. This year,
results using both new and old methodologies will be
displayed on the OSPI website.
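The adjusted-cohort idea behind the formula can be illustrated with a small calculation. This is a sketch of the general form only (on-time graduates divided by the entering ninth-grade cohort, adjusted for transfers); the function name and figures are hypothetical, not taken from the report card.

```python
def adjusted_cohort_graduation_rate(on_time_graduates,
                                    first_time_9th_graders,
                                    transfers_in, transfers_out):
    """Four-year graduation rate over the adjusted entering cohort."""
    adjusted_cohort = first_time_9th_graders + transfers_in - transfers_out
    return on_time_graduates / adjusted_cohort

# Hypothetical district: 900 on-time graduates from an adjusted
# cohort of 1,050 + 50 - 100 = 1,000 students.
rate = adjusted_cohort_graduation_rate(900, 1050, 50, 100)
print(f"{rate:.1%}")  # 90.0%
```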
The availability of tools for data reporting and visualization
continues to expand. A new tool just released is the
Washington Achievement Data Explorer (WADE). To
view performance indicators by district, compare
districts, or view the state averages by a variety of
factors, go to www.cedr.us/districts.html.
The meeting then took up EOC math assessments.
Districts currently face decisions about options and
support for students who did not
pass the EOC exams, and about how to follow up with
course placement or interventions in the fall. The
meeting divided into smaller groups to discuss
testing conditions for the spring assessments, what
worked or didn’t. Teams captured their reflections
and ideas on the shared Google site. (For access,
please contact Dawn, whose contact information is at
the end of this article.)
The meeting ended with announcements about the
upcoming WERA/OSPI conference, “Assessing What
We Value, Valuing What We Assess” to be held on
Dec. 7-9, 2011. The Network of Assessment Directors
meeting will be on Friday afternoon so members can
attend preconference sessions. The spring WERA
conference “Measuring Progress - Concepts and
Applications” will be held on March 29, 2012 at Puget
Sound ESD.
-Dawn Wakeley is the Associate Director of Teaching
and Learning for the Tahoma School District, and
provides direction, along with Brian Rick (Bellingham
SD), Pat Cummings (Tacoma SD), and Amy Nelson
(Peninsula SD) for the Assessment Directors Network.
Contact Dawn at 425-413-3424 or
[email protected]
“If you can’t explain what you’re
doing in simple English, you’re
probably doing something wrong.”
-Alfred Kazin, American critic and author (1915-1998)
The Standard Deviation
Page 12 / October 2011
Calendar of Events
WERA Items
The 26th Annual Washington State
Assessment Conference
Assessing What We Value – Valuing What We Assess
Co-sponsored by WERA & OSPI
Dec. 7–9, 2011, Seattle Hilton Airport Hotel
WERA 2012 Spring Conference
Measuring Progress: Concepts and Applications
March 29, 2012, Puget Sound ESD
Renton, WA
Other Events of Interest to WERA
ASCD 2011 Fall Conference
October 28-30, 2011, Las Vegas, NV
ASCD Annual Conference 2012
A Collective Call to Action
March 24-26, 2012, Philadelphia, PA
2012 AESD Annual Conference
Hosted by North Central ESD
The AESD Network . . . The Road Ahead
April 12-14, 2012, Lake Chelan, WA
American Educational Research Association (AERA)
2012 Annual Meeting
Non Satis Scire: To Know is not Enough
April 13-17, 2012, Vancouver, BC
CCSSO National Conference on Student Assessment
June 26-20, 2012, Minneapolis, MN
American Evaluation Association
Evaluation 2011
November 2 – 5, 2011, Anaheim, CA
Washington State ASCD & Northwest ESD 101
The Many Faces of Assessment –Ensuring Effective
Instruction and Preparing for Effective Intervention
November 5, 2011, Spokane, WA
National Alliance of Black School Educators
(NABSE) 39th Annual Conference
Education is a Civil Right
November 16–20, 2011, New Orleans, LA
2011 WSSDA Annual Conference
Take Charge
November 17-20, 2011, Bellevue, WA
Implementing the Common Core System
(ICCS) Meeting
December 5-7, 2011, San Diego, CA
The 10th Annual Hawaii International
Conference on Education
January 5-8, 2012, Honolulu, Hawaii
Diane Ravitch, Research Professor, New York
University, and author of The Death and Life of the
Great American School System: How Testing and
Choice Are Undermining Education (2010), will
address the Saturday Luncheon for the WSSDA
Annual Conference on November 19th, 2011 at 12:15
P.M. Fees for the keynote alone are $30.00, without
lunch, or $60.00 with lunch. More information at:
Professional Development Opportunity
Washington Educational Research Association (WERA) and the
Office of the Superintendent of Public Instruction (OSPI)
are pleased to announce the
26th Annual Washington State Assessment Conference
Assessing What We Value – Valuing What We Assess
December 7- 9, 2011
Pre-Conference Workshops on (Wednesday) December 7
Seattle Airport Hilton Hotel and Conference Center
Dr. Bena Kallick, co-author of the “Habits of Mind” series
“Performance Based Assessment That Promotes Rigorous, Self-Directed Learning”
Dr. Doug Fisher, Professor of Language and Literacy Education, San Diego State Univ.
“The Purposeful Classroom”
-- plus over 50 breakout sessions on a variety of timely topics of value for teachers,
principals, curriculum specialists, central office staff, and post-secondary educators!
Winter 2011 Conference Registration, Agenda, Lodging information
Register Today!
Credit card, purchase order and check payments accepted online.
Please note this conference is the only statewide opportunity where OSPI staff will be providing updates on
state assessment activities, Common Core Standards, and new directions for student assessment.
Visit www.wera-web.org for more conference, membership, and sponsor information. Questions?
[email protected] or 206.417.7776 x2
All registrants will automatically receive a one-year Washington Educational Research Association (WERA)
membership ($25 value) as a benefit of registering for the annual two-day Assessment Conference. If you register
for only the Pre-Conference and would like to become a WERA member, join now.
This benefit eliminates the need to complete separate payments or forms. Just register for the conference and you
will receive membership from September 1, 2011 – August 31, 2012. Visit www.wera-web.org to learn more about
the mission, professional development opportunities, publications and other helpful information regarding
educational research and assessment provided to WERA members throughout the year.
WERA Spring Conference
March 29, 2012
“Measuring Progress: Concepts and Applications”
Featuring National and State Experts and Presentations
9:00 am – 2:45 pm
Puget Sound ESD
Visit www.wera-web.org after December 10, 2011
for updated conference and registration information.
Please join us for
The 26th Annual WERA/OSPI Assessment Conference
Assessing What We Value -- Valuing What We Assess
December 7 – 9, 2011
SeaTac Hilton Conference Center
Thursday - Dr. Bena Kallick will speak on “Performance Based Assessment That
Promotes Rigorous, Self-Directed Learning.” This keynote will focus on creating a
classroom environment that promotes self-directed learning, organized around three
keys: self-managing, self-monitoring, and self-modifying. Each will be accompanied with
examples from classrooms to show the application of these ideas.
Dr. Kallick is a private consultant providing services to school districts, state
departments of education, professional organizations, and public agencies throughout
the United States and abroad. She has worked with many school districts in Washington
state. Bena received her doctoral degree in educational evaluation from Union Graduate
School. Her areas of focus include group dynamics, creative and critical thinking, and
alternative assessment strategies for the classroom. She is the author of Assessment in
the Learning Organization; the Habits of Mind series; Strategies for Self-Directed Learning; Learning and Leading
with Habits of Mind; Habits of Mind Across the Curriculum (all co-authored with Arthur Costa); and Using Curriculum
Mapping and Assessment to Improve Student Learning (co-authored with Jeff Colosimo). Her books have been
translated into several languages including Dutch, Chinese, Spanish, Italian, Hebrew, and Arabic.
Her work with Dr. Art Costa led to the development of the Institute for Habits of Mind
(www.instituteforhabitsofmind.com), an international institute dedicated to transforming schools into places where
thinking and Habits of Mind are taught, practiced, valued and infused into the culture of the school and community.
The Institute also provides services and products to support bringing the Habits of Mind into the culture of schools
and the communities they serve. Kallick's teaching appointments have included Yale University, University of
Massachusetts Center for Creative and Critical Thinking, and Union Graduate School.
Continued on next page…
Friday - Dr. Doug Fisher will speak on “The Purposeful Classroom.”
Building student competence requires precision teaching rather than prescriptive
methods. We can increase our precision when we have a clear learning purpose, create
meaningful tasks for students, and employ systems for determining when learning goals
have been met. In this session we consider the role that learning goals play in increasing student achievement.
Douglas Fisher is Professor of Language and Literacy Education in the Department of
Teacher Education at San Diego State University, the Co-Director for the Center for the
Advancement of Reading at the California State University Chancellor’s office, and past
Director of Professional Development for the City Heights Educational Collaborative. Dr.
Fisher has worked closely with Washington state school districts. He has written many
articles on reading and literacy, differentiated instruction, and curriculum design. His
books include Creating Literacy-Rich Schools for Adolescents (with Gay Ivey); Improving Adolescent Literacy:
Strategies at Work (with Nancy Frey); and Teaching English Language Learners: A Differentiated Approach (with
Carol Rothenberg). A former early intervention specialist and language development specialist, he has experience teaching high school English, writing, and literacy development.
The Assessment Conference convenes Thursday and Friday, December 8 – 9, 2011, and offers over 50 breakout
sessions on a variety of important topics of interest to teachers, principals, curriculum specialists, central office
staff and post-secondary educators.
The Pre-Conference will be held Wednesday, December 7, 2011 and offers 10 half-day workshops. Choose one workshop in the morning and one in the afternoon. Lunch is provided. Pre-registration for specific sessions is required.
For more information, http://www.wera-web.org
-Prepared by Christopher Hanczrik, OSPI, and WERA Board Member,
and Bob Silverman, Past WERA President
Volunteering in Paradise
by Jill Hearne, Ph.D.
From S/V Lookfar, in the Eastern Caribbean
The view is stunning: behind me, a volcanic peak rises into a cloud forest. As I gaze out to the ocean, the breaking surf on the rocks below creates a mist that blows back on the sloping lawn. Birds circle overhead, and the next closest land to the east is Africa. Lunch is a multi-dish meal cooked in the small kitchen from fresh, locally grown organic vegetables called “provisions.” Am I in an expensive eco resort?
No, I am volunteering at a school on the island of Dominica, whose landscape is dominated by 7 magnificent volcanoes. Since there are no sheltered bays, there are few beaches, resulting in limited tourism and no high-rise resorts.
Rod and I have been out sailing for the last 12 years
and lately have been doing volunteer work as we sail.
This last season we spent a month on the island of
Dominica where I worked with their Director of
Curriculum, Instruction and Assessment doing staff
development with principals and teachers. We
decided on Dominica for two reasons. First, I had an
introduction through the head of an NGO, Hands
Across the Sea, which funds and delivers books to
Eastern Caribbean countries. Second, this particular
island is one of the poorest and least developed
islands in the West Indies.
Until a few years ago, only the privileged here went
to secondary school. The first class of universal
secondary students graduated in 2010, so issues of
teacher training and alignment are huge here. A very talented central staff, assisted by a consortium of educational organizations working in the Eastern Caribbean, has put in place a written standards-based curriculum and a sound assessment system. The
missing element is instruction. I concentrated my
work in literacy training and school planning. In
addition, I did a needs analysis for the Minister of
Education, noting strategic action that would help
align their work.
Most of the schools are rural and remote; transport
is mainly small mini-vans that go from town to town.
I wanted to work with as many schools as possible so
Rod would take me into town in the dinghy at 7:00 in
the morning to get on a local mini-van. An hour later
I would reach the capital where I would then either
catch a ride with a Ministry van or central office
person with a car (a rare thing, owned by very few
upper management people). It would take as much as
3 hours to get from the capital, Roseau, to most of
the schools. I would then reverse the process and
return to the boat around 6:00pm. Although schools
may be only 10 to 15 miles apart as the crow flies, it
can take 45 minutes to get from one to another,
making most staff development a wilderness trek.
While I worked with schools, Rod spent his days
doing boat varnish or building bookshelves for the
local school libraries.
Most teachers have only a secondary education, but class size is usually fewer than 8 students. Since
universal schooling here is so new, first generation
literacy is the goal in many rural villages. Teachers
usually walk or ride the bus with the children to get
to school. Everyone has a garden so there is no
starvation here; every family has a plot to grow
“provisions”: yams, sweet potatoes, breadfruit, taro
root, and cabbage. In fact the island has the highest
number of centenarians in the world!
While it is an English-speaking island, home language
is a patois heavily influenced by French, as Dominica
is positioned between Guadeloupe and Martinique.
Few people in the villages are able to read and write.
In spite of challenges, I found staff members at all
schools to be upbeat, eager to learn and very
dedicated. I managed to visit 13 elementary schools and 6 secondary schools, and to work with 45 principals and 60 teachers. These schools are open to the air,
with wooden shutters and a concrete floor. Often the
only teaching tool is the blackboard, usually heavily
pitted. Shortages of chalk and paper limit teaching activities. There are few books, hence the importance of Hands Across the Sea’s work.
Most instruction is of the chalk and talk variety,
when there is chalk! This next year I will return for
another month and work with principals on
instructional modules they can deliver individually at
their schools with the theme, “All of the students, All
of the time.” While the environment is healthy,
poverty is quite high; the GDP per capita is $10,500.
Therefore, whatever I do in staff development needs to be on a “no costs, no materials” basis.
Hands Across the Sea had no hard data to show potential donors the efficacy of access to books. To help with this, I wrote a grant in conjunction with the Peace Corps there to study the effects of access to books on students’ interest in reading, particularly among male students. Because of WERA, I was
able to reach out to my colleagues and get advice on
reading inventories. Special thanks go to Nancy
Katims and Peter Hendrickson for their advice and
counsel. It was a real boon to sit at the only Wi-Fi
connection near our anchorage and be able to reach
out to the knowledgeable research community here
in Washington state. The fact that it was also the
beach bar wasn’t too bad either!
Continued on next page…
To support this island and read more about projects in the Eastern Caribbean, go to Hands Across the Sea;
http://www.handsacrossthesea.net/. For sailing news on S/V Lookfar or discussions on Assessment;
[email protected]
Principal starting the school day
Working on literacy
Hiking with the school
Interview with Dr. Deb Came, Director of Student Information at OSPI
by Andrea Meld, Ph.D.
Hi, Deb. Thanks for agreeing to do this interview for
The Standard Deviation. I understand that before
coming to OSPI, you had worked at the Office of
Financial Management (OFM) for 11 years, most
recently at the Education Research and Data Center
(ERDC). What is the ERDC? What kind of work do they do?
ERDC was created by the legislature in 2007, and seeks to promote a seamless, coordinated preschool-to-career (P-20) experience for all learners by providing
objective analysis and information. ERDC focuses on
research and data that span educational sectors,
ranging from preschool through graduate school and
the workforce.
There has been a lot of interest in the P-20 longitudinal
data system for Washington State that ERDC has been
developing. What will this look like? What will this allow
researchers to do? Some examples?
Washington State was awarded an ARRA grant to
continue developing a P-20 longitudinal data system.
This work is a collaborative project between ERDC,
OSPI, and other partner agencies. Ultimately there will
be a more efficient and comprehensive system for P-20
data, including a data warehouse and established
protocols for sharing data with education agencies and
researchers. There are so many possibilities for
research! As the system is developed and a longer
history of data is available, the possibilities for making
connections across P-20 data are vast. A document is
on the ERDC website that illustrates the variety of P-20
questions that could be answered using the longitudinal data system.
What are some of the ways you see ERDC and OSPI
collaborating on educational research issues?
ERDC and OSPI have already been collaborating in
many ways and that partnership will continue to grow
as the data systems develop and longitudinal and
cross-sector research questions are presented. One
example is the P-20 Feedback Reports for High Schools
that were shared with schools and districts in August.
The feedback reports will soon be publicly available at
There will be more collaboration looking at long-term
outcomes of K-12 students including post-secondary
enrollments and completions as well as workforce
outcomes. Another example will be OSPI, ERDC, and
the Department of Early Learning working together
to bridge Pre-K and K-12 information.
How do you see your role as the new Director of
Student Information at OSPI? What are some of the
projects you are especially looking forward to?
The Student Information section at OSPI plays a pivotal role in collecting data from districts and providing information to numerous audiences. I
would like the group to continue being an efficient
conduit for student data and to provide expertise
about how that data can be used and its limitations. I
am looking forward to the new capabilities of the K-12 SLDS as well as the increasing opportunities for
looking at student outcomes. The addition of student
course-taking data in CEDARS gives policy-makers
and researchers more information to work with in
assessing student outcomes and education policies.
One last thing, are you a Husky or a Cougar (or some
other mascot)? Where did you go to college and
graduate school?
The Northwestern Wildcats are nearest to my heart; I
received a B.A. in economics from Northwestern
University. I continued the theme of purple and
economics and went to graduate school at UW, so I
am also a Husky.
Thank you, Deb – I’d like to wish you all the best in
the coming year.
Deb Came, Ph.D., is Director of Student Information
at OSPI.
She may be reached at 360-725-6356, or
[email protected]
Interviewer: Andrea Meld, Ph.D., Editor, The Standard Deviation
“If we are to reach real peace in this world…we shall have to begin with the children.”
-Mohandas Gandhi, Indian nonviolent civil rights leader (1869-1948)
CogAT Author Discusses Issues in Gifted Identification
by Phil Dommes, Ph.D.
In late June 2011, WERA sponsored a successful day of professional development, focusing on issues in the
identification of gifted students. Workshop presenter Dr. David Lohman, currently a professor of educational
psychology at the University of Iowa, has authored many articles and chapters on individual differences and is
the author of the newly revised Cognitive Abilities Test (CogAT). Lohman had a comfortable style which kept
the audience of fifty plus educators attentive throughout the presentation.
One of the first issues Lohman unpacked was how one describes the target group: “gifted” vs “talented.” The
term “gifted” tends to suggest an innate set of traits inherent to the individual. “Talented,” on the other hand,
has the connotation of a variable set of abilities which reflect both aptitude and experience. As is often the
case, definition can have a major influence on how one assesses, evaluates, and places students. Lohman
strongly prefers the “talented” description and refers to “talent identification” rather than “gifted identification.”
Another issue addressed by Lohman was how to aggregate scores in identifying academically talented
students. Referencing the Cognitive Abilities Test (CogAT), Lohman recommends the use of profile scores over
composite scores. Profile scores retain the marked differences that often exist in the abilities of individual
students, especially in those who are more talented. Composite scores (e.g., a total combined score for
verbal/quantitative/non-verbal) mask these important differences and increase the possibility of under-serving
students with a strong ability in one area but not another. Lohman also counsels the use of multiple measures
and differentiated measures of ability and achievement.
After a review of how tests are normed, Lohman makes a strong case for the use of local norms. These norms
help in looking at the total local population of tested students and also in looking at the relative rankings of
students within subgroups.
Lohman closed the session with a preview of the Seventh edition of the Cognitive Abilities Test. This edition
represents the most significant revision of the CogAT since it was first published in 1968. Those who are
concerned with cultural sensitivity will be especially pleased that the new version is more ELL friendly.
-Phil Dommes is Director of Assessment in the North Thurston School District and the WERA Educational Journal
Book Editor.
David Lohman’s PowerPoint is available on the WERA website.
More Scenes from recent WERA Conferences
Larry Ainsworth, giving a keynote address for the Spring
2009 WERA Conference
Caitlin Scott, Education Northwest
WERA Board Members 2011-2012
Washington Educational Research Association
Nancy Katims, President
Director of Assessment, Research & Evaluation
Edmonds School District
20420 68th Avenue West
Lynnwood, WA 98036
Phone: 425-431-7302
Fax: 425-431-7123
[email protected]
Pete Bylsma, President-Elect
Director of Assessment and Student Information Services
Renton School District
300 South West 7th Street
Renton, WA 98057
Phone: 425-204-2335
Fax: 425-204-2327
[email protected]
Bob Silverman, Past President
Vice President of Educational Technology Consulting
9306 Milburn Loop South East
Lacey, WA 98513
Phone: 360-349-7469
Fax: 360-923-0907
[email protected]
Sharon Rockwood, Executive Secretary
Washington Educational Research Association
PO Box 15822
Seattle, WA 98115
Phone: 206-417-7776 ext. 2
Fax: 206-417-4525
[email protected]
At-Large Board Members
Jodi Bongard
Executive Director of Elementary Education
Issaquah School District
565 North West Holly Street
Issaquah, WA 98027
Phone: 425-837-7025
Fax: 425-837-7676
[email protected]
(Term expires April 30, 2014)
Christopher Hanczrik
Director of Assessment Operations
Office of Superintendent of Public Instruction
PO Box 47200
Olympia, WA 98504
Phone: 360-725-6350
Fax: 360-725-6332
[email protected]
(Term expires April 30, 2014)
Kathryn Sprigg
Interim Director, Office of Accountability
Highline School District
15675 Ambaum Blvd SW
Burien WA 98166
Phone: 206-433-2334
Fax: 206-433-2351
[email protected]
(Term expires April 30, 2012)
Mike Jacobsen
Director of Assessment and Curriculum
White River School District
240 North A
Buckley, WA 98321
Phone: 360-829-3951
Fax: 360-829-3358
[email protected]
(Term expires April 30, 2012)
The mission of the Washington Educational Research Association is to improve the professional practice of educators engaged in instruction, assessment, evaluation, and research.
WERA Services
• WERA provides professional development through conferences, publications, and seminars.
• WERA provides forums to explore thoughtful approaches and a variety of views and issues in education.
• WERA provides consultation and advice to influence educational policy regarding instruction, assessment,
evaluation, and research.
The Standard Deviation Newsletter
Andrea Meld, Ph.D.
Applications and Assessment Analyst
Kent School District
Phone: 360-725-6438
[email protected]
Washington Educational
Research Association
PO Box 15822
Seattle, WA 98115
We’re on the Web!
Visit us at:
Photo Editor
Don Schmitz
Mukilteo School District
Editorial Assistants
Angie Mangiantini
Layout Designer
Michelle Sekulich
The Boeing Company
Executive Secretary
Sharon Rockwood
The Standard Deviation is published spring,
winter, and fall as an online newsletter.
Submissions are welcomed from WERA members
and others. Kindly submit articles for consideration
using APA format.