
Great Lakes East
Comprehensive Center

Record of Services

Fall 2007

INDIANA

State Manager: Jayne Sowers

Does curriculum make a difference in student achievement? The answer is "yes," according to researchers and scholars such as Kercheval (2001), Marzano (2003), and Zavadsky (2006). Armed with this research, the Indiana Department of Education (IDOE) selected the corresponding NCLB federal sanction for its districts in corrective action: "Institute and fully implement a new curriculum..." (U.S. Department of Education, 2006, p. 49). The IDOE Division of Compensatory Education/Title I views curriculum as having great potential to increase student achievement in the state and thus is undertaking a substantial effort to inform and support districts in creating and implementing a new curriculum in a rigorous and cohesive manner.

The IDOE Division of Compensatory Education/Title I requested assistance from Great Lakes East to identify and put into place research-based processes and tools to assist districts in curriculum development and implementation. The process of "mapping and aligning the curriculum" is nationally recognized as a powerful method and is the one the IDOE Division of Compensatory Education/Title I is adopting for state districts in corrective action. The process begins with teachers individually "mapping," or completing charts of, the content, skills, timelines, assessments, and corresponding state standards of their own teaching. Through well-defined procedures, teachers share their individual maps with other teachers. Carefully crafted dialogues lead to significant discussions and debates and, finally, to consensus on the agreed-upon curriculum. The curriculum is built by teachers within the same grade level or subject in order to provide a consistent curriculum across the district rather than one that varies from school to school. Teachers across grade levels (e.g., English 4, 5, 6) then examine the curriculum to create a challenging flow of skill development with increasing cognitive demand. The curriculum is examined for redundancies (the same standards taught repeatedly over several grade levels without higher expectations or cognitive demand) and for gaps (standards that are not taught or are not taught often enough).

The curriculum mapping process is not a "quick fix." The process requires 2 to 3 years to complete for a single subject area. For almost all Indiana districts, this is a new concept. Therefore, the IDOE Division of Compensatory Education/Title I and Great Lakes East together are developing a system of supports for the districts, including the following:

  • Building a research base regarding curriculum
  • Developing a list of essential requirements of the process and the product
  • Providing workshops on curriculum mapping
  • Training coaches to offer on-site technical assistance to the districts
  • Compiling new information and requirements in a set of 11 tools (e.g., Tool 2: Key Components—What should be in a curriculum? and Tool 5: Curriculum Self-Assessment—How does our curriculum measure up?)

In addition to the support for developing new curriculum, IDOE and Great Lakes East assisted in evaluating 52 district improvement plans and making recommendations during the summer of 2007. Upcoming assistance for districts includes three regional Title I Administrators workshops as well as the "Beyond District Improvement Plans" workshop planned for November 14, 2007. This workshop responds to needs expressed by the districts in three areas: (a) measuring the effectiveness of programs and initiatives, (b) planning professional development for teaching students with disabilities, and (c) planning professional development for teaching students learning English as a new language.

During the 2007–08 school year, Great Lakes East will be responding to a new request by IDOE to assist in developing a support system for districts with schools in improvement. Currently, Great Lakes East is reviewing the research in this area, surveying other state education agencies (SEAs) about their supports, meeting cross-divisionally with IDOE staff to determine the services they provide to schools in improvement, and convening meetings with external stakeholders to determine their concerns and ideas.

References

Kercheval, A. (2001). A case study of key effective practices in Ohio's improved school districts. Bloomington, IN: Indiana Center for Evaluation. Retrieved October 15, 2007, from http://www.indiana.edu/~ceep/projects/PDF/200107_Key_Effec_Prac_Interim_Report.pdf

Marzano, R. J. (2003). What works in schools: Translating research into action. Alexandria, VA: Association for Supervision and Curriculum Development.

U.S. Department of Education. (2006). LEA and school improvement. Non-regulatory guidance (Rev. ed.). Washington, DC: Author. Retrieved October 25, 2007, from http://www.ed.gov/policy/elsec/guid/schoolimprovementguid.pdf

Zavadsky, H. (2006). How NCLB drives success in urban schools. Educational Leadership, 64(3), 69–73.

MICHIGAN

State Manager: Gary Appel

Supporting English Language Learners (ELLs). The Michigan Department of Education (MDE) ELL Advisory Committee met at the end of September to review and finalize MDE's five-year strategic plan, a roadmap for creating a comprehensive approach to ELLs in the state. At the meeting, the advisory committee focused on implementation of the strategic plan, including the roles that individual committee members and the professional associations they represent can play. The committee discussed developing action research across the state and identified specific strategies for engaging institutions of higher education. In addition, a professional development initiative in partnership with the Michigan Association for Bilingual Education and Michigan Teachers of English to Speakers of Other Languages was presented and discussed. The work of Great Lakes East will now focus on implementation of the ELL strategic plan. The committee also reviewed and finalized MDE's comprehensive ELL directors' manual, Educating English Language Learners: A Handbook of Resources for Program Managers, and developed multiple strategies for its statewide rollout in fall 2007.

Special Education. During meetings in July, August, and September, MDE's Disproportionality Core Team, assisted by Darren Woodruff of the American Institutes for Research and Beverly Mattson of RMC Research Corporation, worked with Office of Special Education and Early Intervention Services Continuous Improvement Monitoring System staff to develop strategies and processes for determining whether a district's disproportionality is a result of inappropriate identification of students for special education and related services. Site visits began in early October. Each visit involves a review of district policies, procedures, and practices through a student record review, staff interviews, and a district self-review using a modified version of the National Center for Culturally Responsive Educational Systems rubric. Darren Woodruff joined the first site visit and participated in the debriefing that followed.

Increasing Teacher Quality. Work continued on an individual professional development plan (IPDP) process as part of MDE's Professional Learning Strategic Plan. Learning Point Associates consultant Amy Colton synthesized the work of the stakeholder group and MDE staff, and in late September she and Donna Hamilton of MDE met with the stakeholder group's structures and processes committee to refine the IPDP process. The committee includes representatives from K–12 LEAs, intermediate service districts (ISDs), higher education, and teacher unions. Work will soon begin on designing the IPDP field test, and the recruitment of districts is well underway.

In addition, work continued with MDE and the Michigan Staff Development Council on a professional development review system for schools and districts. The review system will help schools and districts assess the quality of their professional development as they work to support teacher learning and the IPDP. The core team met in July and September; it completed the standards for individual, school-level, and district-level professional development, began developing the rubrics and lists of evidence for each standard, and identified the steps in the review process.

High School Reform. Bersheril Bailey, a senior program associate with Learning Point Associates and Great Lakes East, has been working closely with MDE, ISDs, and various professional organizations to map the high school reform landscape across the state. Bailey is helping MDE identify the resources needed in Michigan to increase student achievement in high schools. She is also helping to "darken the dotted line" between MDE and ISDs by identifying innovations currently being developed or implemented across the state to improve high schools. In November, she will assist MDE in forming a Great Lakes East and MDE core high school reform team.

Statewide System of Support. A statewide system of support (SSOS) is emerging as a separate area of technical assistance need. Mike Radke, assistant director in the Office of School Improvement, requested assistance in evaluating and improving the state's system of support for schools in need of improvement. In late September, Great Lakes East, in partnership with the Center on Innovation & Improvement, designed and facilitated an SSOS Evaluation Think Tank with MDE staff and evaluators from Learning Point Associates, the American Institutes for Research, and RMC Research Corporation. The Think Tank explored strategies for evaluating the SSOS and began drafting an evaluation plan with MDE.

OHIO

State Manager: Mark Mitchell

Statewide Data System (D3A2). The request for proposals to fully develop the professional development data modules resulted in an award to the Data Driven Decisions for Academic Achievement (D3A2) project at the Hamilton County Educational Service Center (ESC). This work will be directed by Lynn Ochs, technology director at the ESC and formerly a member of the D3A2 Professional Development Committee. The Professional Development Committee held its quarterly meeting in Columbus on September 6, 2007; the next meeting of the whole group is scheduled for December 14, 2007. At the September meeting, Mark Mitchell presented the nearly complete Ohio Data Primer, Carol Daniels and Bob Reece updated the group on work on the Move Ahead Tool, and Eric James, project director for D3A2, reported on statewide implementation of the system.

As a part of this meeting, a small professional development module review subcommittee was established to provide guidance and an ongoing review function for the modules. Rachel Granatir, Steve Grant, and Mark Mitchell currently serve in this capacity. The professional development data modules, Move Ahead, and the Decision Framework are targeted for completion by the end of 2007. Universal access to all of these tools has been an overriding principle of the statewide data system; therefore, all of these tools will eventually have an online or Web-based presence. Once D3A2 is fully functioning as a data repository, all of these tools will either support its effective use or pull data from it to enable more rigorous and focused use of data to improve instructional practice and student performance on a districtwide and schoolwide basis.

Redesign of Statewide System of Support. The Ohio Department of Education (ODE) is in the process of redesigning its district and school improvement support system to ensure high-quality technical assistance by regional providers and to help build the capacity of district leadership teams for using data and planning tools designed by ODE and its partners. This redesign incorporates at least two aspects:

  • A quality assurance system for state diagnostic teams
  • The development of a focused planning model (Stage 2 of school improvement)

Quality Assurance System. The initial screening of potential school improvement process facilitators, now called State Diagnostic Team members, began with the application process through ODE. This initial screening of Stage 1 applicants was followed by a rigorous two-day performance assessment (July 30–31, 2007). Sheryl Poggi, who had led a similar effort in Illinois, brought her experience to the design of this performance assessment. The design effort included developing assessment and rating instruments for each performance-based station, a plan for the performance assessment, and an outline of the decision-making process for selecting or screening out a candidate.

Performance Assessment and Evaluation. The candidates moved through performance assessment stations in groups; the assessment was designed so that half the candidates completed the stations on one day and the other half the following day. The stations were facilitated by ODE staff from the Office of Educational Reform as well as many other offices across ODE, and these staff members acted as monitors and raters of candidates. The stations were as follows:

  • Continuous improvement plan review: Candidates were given a continuous improvement plan and real data from an Ohio district, then asked to analyze the data and respond to questions.
  • Interview and portfolio presentation: Each candidate was asked to put together a portfolio of personal experiences in education and then present this portfolio to two ODE staff, who also were interviewers.
  • Preparation for presentation: Candidates used their data review and knowledge of the district to prepare a short presentation to a simulated district leadership team.
  • Video analysis of teaching: Candidates viewed a video of classroom mathematics instruction at an elementary school in Ohio. They were then given time to comment on what they observed in the video.

At the conclusion of the two-day period, scores were compiled and totaled for each candidate for each station. The selection of candidates was based on these scores, which aligned with the consensus view of the raters regarding the candidates' skills and abilities to be effective in this work. Candidates who passed this screening were then invited to a four-day training session (August 13–16, 2007) to help prepare them for Stage 1 of the School/District Improvement Framework.

State Diagnostic Team Training and Performance Assessment (Stage 1). This event presented a final opportunity to assess participants' problem-solving, facilitation, and teaming skills, as well as to provide initial training for working with corrective action districts in Stage 1 (needs assessment).

The training was designed to be an extension of some of the performance-based activities of the previous performance assessment. Candidates engaged in simulated diagnostic team meetings, reviewed district documents, and analyzed videos of classroom practice. This training paralleled a larger meeting of State Support Team members from regional centers in Ohio. Both the State Diagnostic Team members and the State Support Team members attended large sessions to learn more about the continuous improvement planning process and how the new data tools designed by ODE support this process. At the conclusion of this meeting, State Diagnostic Team members were hired as ODE intermittent or part-time employees.

Pilot Training for Diagnostic Review Teams (Stage 1). On September 17–20, 2007, State Diagnostic Team members, along with their new colleagues from the Office of Educational Reform, gathered at the Crowne Plaza in Dublin, Ohio, for more intensive training on the diagnostic review process (September 17–18), followed by a pilot experience working in selected Columbus City Schools. Linda McDonald, Karen Sanders, and Ray Hart of RMC Research Corporation, along with Great Lakes East partners, led this effort. Also invited to the training were focused monitoring staff from the Office for Exceptional Children at ODE and staff from the Center for School Finance at ODE. The involvement of these other groups within ODE represents working across centers and an intentional effort to make the diagnostic review (Stage 1) and the later monitoring of the district continuous improvement plan and process (Stage 4) more cohesive and efficient. In the past, the focused monitoring staff and the fiscal auditors would go into districts independently of each other.

The focus of this pilot training was to familiarize team members with the Diagnostic Review instruments and to provide an opportunity to conduct a review in a school building. The Diagnostic Review in Stage 1 enables the collection of qualitative data through classroom observations; focus groups and individual interviews with district and school staff, parents, and administrators; and a review of documents. These qualitative data from a district and selected buildings feed into the Decision Framework Tool and are considered as teams and districts work through it. An additional pilot experience with a corrective action district is planned for late October 2007.

Focused Planning Model (Stage 2). In Stage 2 of the Ohio School Improvement Framework, school districts develop a Comprehensive Continuous Improvement Plan. Great Lakes East is facilitating a working group to design a step-by-step process to be used by State Diagnostic Team members and State Support Team members to guide the development of a rigorous plan, one that focuses on critical needs and includes aligned, research-based strategies and action steps. This work will align with and build from Stage 1, in which qualitative data from the Diagnostic Tool and quantitative data are generated and analyzed to address the essential questions detailed in the Decision Framework. These data will support the identification of critical needs and an understanding of root causes, which serve as the planning focus for Stage 2.

Emerging Work: State Systems of Support Evaluation. As this redesigned SSOS becomes operational at the district and school levels, it will be important to evaluate its effectiveness in terms of expected changes in behaviors and indicators at these levels. Great Lakes East will work with ODE in the coming months to design an evaluation of the redesigned SSOS.
