State Accountability Issues With a Focus on Title I
Gerald L. Richardson, Ph.D., Florida Department of Education
Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada, April 1999
This discussion paper deals with three issue areas that have emerged
during the last four years of state accountability in Florida: sanctions
and rewards, disaggregate reports, and data verification. Issues
pertaining to sanctions and rewards include the over- and under-representation
pertaining to sanctions and rewards include the over- and under-representation
of Title 1 schools. The major reporting issue deals with Title 1 requirements
for disaggregate reports. Data verification issues center on involving
schools and districts in the quality control process and trying to control
for "gaming" the system. Since all of these issues stem from the unique
nature of the current accountability system in Florida, the place to start
is with a brief overview.
Florida's accountability system has operated virtually unchanged for the last four years. The system is focused squarely on academic performance. It takes into account multiple measures of student performance over time. It results in a straightforward system for grouping schools that is easily understood by even the most casual observer. Minimum performance criteria were established in advance of implementation and have been held constant ever since.
Florida's accountability system is like a scorecard with six cells containing reading, writing and math results for the current and previous years. Test results are shown as the percent of students scoring at or above a particular performance standard (e.g., 50th percentile on norm-referenced tests in reading and math). These percentages are then compared to minimum performance levels (e.g., at least 33% of the students scoring above the standard at the elementary level). Each cell in a school's accountability scorecard has a zero (0) if the particular school level performance measure is at or above minimum state criteria or a one (1) if it is below minimum criteria and, therefore, "critically low." Finally, schools are put into performance groups, depending on how many (if any) school performance measures are below critically low thresholds.
The following scorecard is for a Group 1 school and indicates that the
entire school is critically low performing (a 1 in every cell, i.e., below
minimum criteria on every accountability indicator):

                 Reading   Writing   Math
Previous year       1         1        1
Current year        1         1        1
Next is a Group 4 school, which would be considered adequate with regard
to critically low criteria (a 0 in every cell):

                 Reading   Writing   Math
Previous year       0         0        0
Current year        0         0        0
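The scorecard logic described above can be sketched in a few lines of code. This is a minimal illustration, assuming the 33% elementary criterion mentioned in the text; the exact rule for mapping the number of critically low cells to Groups 2 and 3 is an assumption, not Florida's published definition.

```python
# Sketch of the accountability scorecard described above. The minimum
# criterion (33%) comes from the text's elementary-level example; the
# Group 2/3 split is an illustrative assumption.

def scorecard(pct_above_standard, minimum=33.0):
    """pct_above_standard: dict of (subject, year) -> percent of students
    at or above the performance standard. Returns a dict of 0/1 cells,
    where 1 means "critically low" (below the minimum criterion)."""
    return {cell: (0 if pct >= minimum else 1)
            for cell, pct in pct_above_standard.items()}

def performance_group(cells):
    """Assumed grouping rule: more critically low cells -> lower group."""
    low = sum(cells.values())
    if low == len(cells):
        return 1  # critically low on every measure
    if low == 0:
        return 4  # at or above minimum criteria on every measure
    return 2 if low >= len(cells) // 2 else 3

# Hypothetical school: below 33% in all six cells -> Group 1.
school = {("reading", 1997): 28, ("reading", 1998): 30,
          ("writing", 1997): 25, ("writing", 1998): 29,
          ("math", 1997): 31, ("math", 1998): 27}
cells = scorecard(school)
print(performance_group(cells))  # all six cells are 1 -> Group 1
```

The appeal of this scheme, as the text notes, is that the whole computation reduces to counting 0s and 1s, which even a casual observer can follow.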
Sanctions and Rewards with a Focus on Title I
Simply stated, Title 1 schools have been over-represented among schools sanctioned as "critically low performing" and under-represented among schools that received performance awards. When the first "critically low" list came out in November, 1995, it contained a total of 158 schools. The vast majority (98%) of the elementary schools identified were Title 1 schools, yet Title 1 schools account for just over half (55%) of all elementary schools in the state.
When accountability results were first released, there was a media frenzy about the "list of shame" and harsh retorts from schools and districts. But the lowest performing schools did respond to public pressure and made needed improvements--at least enough to get off the dreaded list. Under the state's generous exit criteria, schools could get off the list by raising test scores above minimum criteria in just one subject area. By 1996, the list had shrunk from 158 schools to 71, a decrease of 55%. Still, all but one of the remaining critically low schools had Title 1 programs. Efforts were further intensified and by 1997, there were just 30 schools on the list, 27 of which had Title 1 programs.
Among those last struggling 30 schools were ten that faced the further public embarrassment of having to appear before the State Board of Education (governor and elected cabinet) if they remained on the list for a third consecutive year after initial identification. All were Title 1 schools. Luckily, all of the last remaining schools managed to escape State Board action. At the time of the last accountability report in November, 1998, there were only four schools on the list: three new elementary schools (all Title 1) and one high school that was a repeater from the original list. Clearly, Title 1 schools have been in focus for concentrated improvement efforts.
On the other hand, Title 1 has not been prominent in recent competition
for cash awards open to all schools under the Florida Recognition Program,
despite the fact that it was patterned after a Title 1 activity that identified
higher performing, high poverty "successful" schools. Instead of $500 or
$1,000 awards that Title 1 gave to its top schools, winners in the state's
recognition program received an average of about $40,000. Of 140 schools
awarded for sustained high achievement and/or substantial improvement in
1998, only 19 were Title 1 schools. Though some of this shortfall was due
to the addition of other eligibility criteria, such as attendance rates,
Title 1 has been somewhat out of focus
for recognition and rewards.
Public Reporting and Title 1 Disaggregate Reports
Yearly results from the state's accountability system are reported in two distinct ways. For the annual press release that concentrates on critically low performing schools, there is the so-called "off, on, new" list, developed in response to media attention as to which schools came off the infamous list, which ones stayed on and which ones were newcomers. For school and district level audiences, the annual School Accountability Report categorizes all schools in Groups 1-4 based on two years of data in reading, writing and math. In addition, the report shows other outcome data such as attendance, promotion, dropout and suspension rates. Also provided are school characteristics, such as poverty, minority and mobility rates. All of these data are featured on a single-page report that lists all elementary, middle and high schools in alphabetic order within accountability group. The same data are posted on the Internet as part of the Florida Department of Education's homepage.
Title I requires that schools receiving supplemental funding disaggregate student achievement data into about a dozen subgroups, plus a number of larger comparison groups (e.g., migrant vs. nonmigrant). Who produces disaggregate reports is an important issue, since doing so can be a very time consuming, non-instructional activity for school or district staff. However, since all data are collected and processed centrally in Florida, schools can request special Title I disaggregation reports from the state. These reports are designed specifically to meet federal requirements, plus they have the added advantages of uniform analysis procedures and a uniform reporting format.
Disaggregate reports have become very popular even for non-Title 1 schools, given recent initiatives to "close the gap" between higher and lower performing subgroups. However, the Title 1 format is cumbersome and does not lend itself to providing multiple years' data on the same page; therefore, improvement (or lack thereof) is hard to view at a glance. Besides, many Title 1 subgroups exist in name only at most schools.
For the kind of school level reports that are mandated under Title 1,
there are many subgroups that simply do not have enough students to warrant
reporting, much less accountability. As shown in Table 1, when subgroups
with so few students that confidentiality might be compromised (say n<10)
are eliminated, only seven of the eleven Title 1 subgroups affect even
10% of elementary schools in the state. When subgroups with membership
below the level at which statistical validity becomes questionable
(say n<30) are eliminated, another Title 1 category falls out of
consideration.
The data in Table 1 suggest that nearly half of the disaggregate groups
required for Title 1 schools exist for such small numbers of students in
schools that reporting is not advisable for confidentiality reasons or
that accountability determinations are questionable for statistical validity
reasons. Unfortunately, this has resulted in a dual reporting system, one
for statewide distribution and one to satisfy Title 1 requirements.
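The subgroup screening just described amounts to a simple two-threshold rule. The sketch below uses the thresholds from the text (n<10 for confidentiality, n<30 for statistical validity); the subgroup names and counts are hypothetical.

```python
# Sketch of the subgroup screening described above. Thresholds come from
# the text; the subgroup names and counts are illustrative only.

def screen_subgroups(counts, confidential_n=10, valid_n=30):
    """counts: dict of subgroup -> number of tested students.
    Returns a dict of subgroup -> reporting disposition."""
    out = {}
    for group, n in counts.items():
        if n < confidential_n:
            out[group] = "suppress (confidentiality)"
        elif n < valid_n:
            out[group] = "report only (too small for accountability)"
        else:
            out[group] = "report and hold accountable"
    return out

# Hypothetical school: one subgroup is suppressed outright, one is
# reportable but too small for valid accountability determinations.
counts = {"migrant": 4, "LEP": 18, "economically disadvantaged": 212}
print(screen_subgroups(counts))
```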
Data Verification and System Gaming
Do test results accurately reflect schools' accountability status? Unfortunately, quality control schemes do not always include the players most affected. In Florida, however, district and school level staff are built into quality control procedures from the beginning. Long before school results are published by the state, appear in local news media, or affect decisions about sanctions or rewards, district personnel are given preliminary data. Superintendents designate a single contact person to be responsible for verifying data accuracy. These contact persons are usually testing coordinators, MIS directors or assistant superintendents for instruction. They typically share the data with other district staff who are familiar with their schools or distribute the data to principals for review.
Perceived discrepancies must be resolved within 30 days, after which all files are frozen for accountability purposes. While some discrepancies have been large (e.g., omission of whole schools), most involve classification questions about a few students. The most frequent problems concern exceptional (special) education and LEP classification, because these designations determine which students are included in state accountability reports. Though time consuming, the verification process can be credited for the fact that--to date--no challenge from a district or school claiming that the test data used were inaccurate has been sustained. Otherwise, system credibility might be jeopardized.
The premise that high stakes accountability systems are vulnerable to "gaming" or manipulation of results by means other than direct and appropriate instruction should always be treated as a given. If it can happen, it will happen. That is why multiple levels of audit and oversight to channel natural gaming instincts toward desirable behaviors are needed. Florida has state laws that provide fines and other administrative actions against school personnel who deliberately violate test security or manipulate results. Further, the state monitors results electronically to detect discrepancies such as pattern responding, low percent of students tested and extraordinarily high gains from one year to the next. However, only the most flagrant violations are caught in this manner.
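The electronic screening mentioned above can be pictured as a set of simple threshold checks run over each school's results. This is only a sketch: the specific cutoffs below are illustrative assumptions, not the state's actual rules, and real monitoring (e.g., detecting pattern responding) is more involved.

```python
# Sketch of electronic anomaly screening as described above. The
# thresholds (90% tested, 20-point gain) are illustrative assumptions.

def flag_school(pct_tested, gain, min_pct_tested=90.0, max_gain=20.0):
    """Flag a school for follow-up if too few students were tested or
    if year-over-year gains are extraordinarily large."""
    flags = []
    if pct_tested < min_pct_tested:
        flags.append("low percent tested")
    if gain > max_gain:
        flags.append("extraordinary gain")
    return flags

# Hypothetical school: only 72% of students tested, 26.5-point gain.
print(flag_school(pct_tested=72.0, gain=26.5))
# -> ['low percent tested', 'extraordinary gain']
```

As the text notes, such automated checks catch only the most flagrant cases; schools flagged this way still require human follow-up at the district level.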
The real oversight of test preparation and administration procedures
is applied at the district and school level with activities that range
from anecdotal reports from test proctors to sophisticated computer analysis
of answer document erasure patterns. To date we have experienced incidents
of gangs selling cribsheets, teachers and administrators changing answer
sheets, and districts using an alternate form of a norm-referenced test
in the fall to practice for the real test in the spring. One incident resulted
in the mutual decision by state and district officials to keep a school
on the critically low list until improvement could be verified the following
year.
Summary and Conclusions
Florida's first four years' experience with accountability systems has taught us numerous lessons. Whatever system is used, it must be capable of being reduced to very simple terms. If consumers do not know how the system works, it will be hard to sustain. Title 1 schools will likely remain in the spotlight in terms of needed improvements. Reward systems may need to be re-examined in order to encourage improvement in schools that need it most. High stakes outcomes will continue to fan media interest, hopefully with beneficial results. Current Title 1 reporting requirements do help focus attention on achievement gaps that need to be closed; however, the Title 1 format is cumbersome and requires subgroups that frequently exist in name only. The result has been dual reporting systems. Data verification and stakeholder involvement remain important components of the accountability system to ensure the accuracy of data and guard against gaming the system.