Special Education

Overview of the Annual Performance Report Development:

See Overview of the Development of the Annual Performance Report (APR) in the Introduction section, page 1.

Monitoring Priority: FAPE in the LRE

Indicator 11: Percent of children who were evaluated within 60 days of receiving parental consent for initial evaluation or, if the State establishes a timeframe within which the evaluation must be conducted, within that timeline.
(20 U.S.C. 1416(a)(3)(B))

Measurement:

  1. # of children for whom parental consent to evaluate was received.
  2. # of children whose evaluations were completed within 60 days (or State-established timelines*).

Account for children included in (a) but not included in (b).  Indicate the range of days beyond the timeline when the evaluation was completed and any reasons for the delays.

Percent = [(b) divided by (a)] times 100.

*The State’s established timelines to complete the initial evaluation and eligibility determinations are 30 school days for preschool students and 60 calendar days for school-age students.

New York State’s (NYS) Calculation:

NYS’ formula for calculating results for this indicator is as follows:

  1. # of children for whom parental consent to evaluate was received (does not include students whose evaluations were completed past the State-established timelines for reasons that are in compliance with State requirements).
  2. # of children whose evaluations were completed within 30 school days for preschool children and 60 calendar days for school-age students.

Percent = [(b) divided by (a)] times 100.
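Stated as code, the two-timeline calculation is straightforward. The sketch below is illustrative only: the function names are hypothetical, not the State’s actual SIRS or VR11 logic, and school-day counting is left to the caller because it depends on each district’s calendar.

```python
from datetime import date

def is_timely(consent: date, completed: date, preschool: bool,
              school_days_elapsed: int = 0) -> bool:
    """Return True if an initial evaluation met NYS timelines.

    Preschool: 30 school days from parental consent (the caller supplies the
    elapsed school-day count, since it depends on the district calendar).
    School-age: 60 calendar days from parental consent.
    """
    if preschool:
        return school_days_elapsed <= 30
    return (completed - consent).days <= 60

def indicator_11_percent(consented: int, timely: int) -> int:
    """Percent = [(b) divided by (a)] times 100, rounded to a whole percent."""
    return round(timely / consented * 100)

# FFY 2010 statewide figures: (a) = 13,760, (b) = 11,534
print(indicator_11_percent(13760, 11534))  # -> 84
```

Applied to the FFY 2010 statewide figures reported in this section, the formula yields the 84 percent result.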

Data Source:

Beginning with the 2007-08 year, NYS collects data for this indicator via the Student Information Repository System (SIRS) and verifies these data by displaying them in a VR11 report, which was developed in the PD Data System.  SIRS is NYS' individual student data reporting system.

NYS’ Method Used to Collect Data

NYS collects individual student data through SIRS.  School districts report the specific dates when special education events occur, such as the date of referral, the date of written parent consent for an initial individual evaluation, and the date of the Committee on Preschool Special Education (CPSE) or Committee on Special Education (CSE) meeting to discuss evaluation results.  Information is also collected on the number of days from receipt of parent consent to evaluate the child to the date of the CPSE or CSE meeting to discuss evaluation results.  If the number of days exceeds the State-established timelines, reasons for the delays are collected.  Some reasons are considered to be in compliance with State requirements; others are not.  Each school district’s compliance rate is calculated.  NYS requires each school district with a compliance rate below 100 percent to submit documentation demonstrating that each student’s evaluation was completed and that the district complies with the regulatory timelines for timely completion of initial individual evaluations.

Federal Fiscal Year (FFY) Measurable and Rigorous Target
FFY 2010
(2010-11 school year)
100 percent of children with parental consent to evaluate will be evaluated within State-required timelines.

Actual Target Data for FFY 2010:

In FFY 2010, 84 percent of students with parental consent to evaluate received their initial individual evaluations within State-required timelines.

  • 76.5 percent of preschool children had their initial evaluations completed within 30 school days of the date of the parent’s consent to evaluate; and
  • 90 percent of school-age students had their initial evaluations completed within 60 calendar days of the date of the parent’s consent to evaluate.
Children Evaluated Within 60 Days (or State-established timeline) during FFY 2010

| Measure | Count |
|---|---|
| a. Number of children for whom parental consent to evaluate was received | 13,760 |
| b. Number of children whose evaluations were completed within 60 days (or State-established timelines) [4] | 11,534 |
| Percent of children with parental consent to evaluate who were evaluated within 60 days (or State-established timeline) (Percent = [(b) divided by (a)] times 100) | 84% |

Account for children included in (a) but not included in (b) in the above table:

There are 2,226 students in (a) but not in (b) of the above table.  These are students whose evaluations were not completed within State-established timelines for reasons that are not in compliance with State requirements.  The chart below provides information on the extent of the delays and the reasons the initial evaluations of children were not completed within the State-established timelines.

Number of Children by Number of Days of Delay in Completing Evaluations, FFY 2010

| Reasons for Delays | 1-10 | 11-20 | 21-30 | Over 30 | Total | Percent of Total |
|---|---|---|---|---|---|---|
| An approved evaluator was not available to provide a timely evaluation. | 104 | 47 | 21 | 68 | 240 | 10.8% |
| Evaluator delays in completing evaluations. | 271 | 211 | 122 | 183 | 787 | 35.3% |
| Delays in scheduling CPSE or CSE meetings. | 490 | 286 | 165 | 258 | 1,199 | 53.9% |
| Total | 865 | 544 | 308 | 509 | 2,226 | 100% |
| Percent of Total | 38.9% | 24.4% | 13.8% | 22.9% | | |
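The Total and Percent of Total figures in the chart are simple aggregations of the binned counts.  As a minimal illustration (the dictionary labels are shortened paraphrases of the row headings, and the computed shares agree with the chart to within rounding):

```python
# Children delayed, binned by days of delay (1-10, 11-20, 21-30, over 30),
# per the FFY 2010 chart above.
delays = {
    "Approved evaluator not available": [104, 47, 21, 68],
    "Evaluator delays completing evaluations": [271, 211, 122, 183],
    "Delays scheduling CPSE/CSE meetings": [490, 286, 165, 258],
}

# Grand total of delayed evaluations: 2,226 children.
grand_total = sum(sum(bins) for bins in delays.values())

# Per-reason totals (e.g., 240 for evaluator unavailability) and each
# reason's share of all delays (e.g., 10.8%).
reason_totals = {reason: sum(bins) for reason, bins in delays.items()}
reason_shares = {reason: round(t / grand_total * 100, 1)
                 for reason, t in reason_totals.items()}

# Per-bin totals across all reasons: [865, 544, 308, 509].
bin_totals = [sum(col) for col in zip(*delays.values())]
```

The per-bin totals divided by the grand total give the 38.9 / 24.4 / 13.8 / 22.9 percent distribution of delay lengths discussed below.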

Discussion of Improvement Activities Completed and Explanation of Progress or Slippage that Occurred for FFY 2009:

Explanation of Progress or Slippage:

In 2010-11, NYS’ compliance rate improved to 84 percent, an increase of seven percentage points over the State’s rate of 77 percent in 2009-10.  This improvement is significant because the State measures its performance each year based on a different representative sample of school districts.  Therefore, with the exception of NYC, the State’s results reflect compliance only for those districts the State has not previously monitored for this indicator and do not reflect improvements made by other districts that have corrected their noncompliance.  More than 99 percent of findings of noncompliance identified in 2009-10 and 99 percent of findings identified in 2008-09 have been corrected.  Improvement for this indicator therefore demonstrates the proactive attention given to this compliance issue through the State’s improvement activities.

The percent of preschool children who had their initial evaluations completed within 30 school days of the date of the parent’s consent to evaluate improved by approximately nine percentage points.  The percent of school-age students who had their initial evaluations completed within 60 calendar days of the date of the parent’s consent to evaluate improved by five percentage points.

The percent of preschool children whose evaluations were not completed within the State-required timeline continues to significantly affect the State’s results for this indicator.  Factors affecting this rate include, but are not limited to, the following:

  • The State’s timeline for preschool evaluations (30 school days) is shorter than the federally-required 60 calendar days, which further contributes to evaluation delays.
  • State law allows the parent of a preschool child to select the approved evaluator to conduct the individual evaluation.  Parents do not always select approved evaluators who are able to complete the individual evaluation within the State’s required timeline.  These issues will be addressed through legislative and regulatory changes to be proposed in 2012 (see improvement activities section below).

A review of the length of delays indicates the following:

  • 38.9 percent of all delays in completing initial evaluations were for 1-10 days;
  • 24.4 percent for 11-20 days;
  • 13.8 percent for 21-30 days; and
  • 22.9 percent for more than 30 days.

These percentages show an improvement over the prior year in the lengths of the delays.

A review of the reasons for the delays indicates:

  • 10.8 percent of delays were because an approved evaluator was not available to provide a timely evaluation;
  • 35.3 percent because of evaluator delays in completing the evaluations; and
  • 53.9 percent related to timeliness of scheduling CPSE or CSE meetings to discuss evaluation results.

There has been significant improvement in the percentage of delays caused by the unavailability of an approved evaluator to provide a timely evaluation.  This is an issue the State has been addressing directly statewide through its approval of programs and monitoring of approved evaluators.

One major root cause of this reason for delays continues to be personnel shortages, particularly in New York City (NYC) and the other Big Four cities.  The State and NYC are implementing court settlement actions under the Jose P. court case relating to availability of professionals in personnel shortage areas (e.g., speech and language and bilingual evaluators).

Correction of FFY 2009 Findings of Noncompliance (if State reported less than 100% compliance):

Level of compliance (actual target data) State reported for FFY 2009 for this indicator:  77%

  1. Number of findings of noncompliance the State made during FFY 2009 (the period from July 1, 2009 through June 30, 2010): 165 findings (98 school districts)
  2. Number of FFY 2009 findings the State verified as timely corrected (corrected within one year from the date of notification to the local educational agency (LEA) of the finding): 114 findings (68 school districts)
  3. Number of FFY 2009 findings not verified as corrected within one year [(1) minus (2)]: 51 findings (30 school districts)

Correction of FFY 2009 Findings of Noncompliance Not Timely Corrected (corrected more than one year from identification of the noncompliance):

  4. Number of FFY 2009 findings not timely corrected (same as the number from (3) above): 51 findings (30 school districts)
  5. Number of FFY 2009 findings the State has verified as corrected beyond the one-year timeline (“subsequent correction”): 51 findings (30 school districts)
  6. Number of FFY 2009 findings not verified as corrected [(4) minus (5)]: 0 findings (0 school districts)

Actions Taken if Noncompliance Found Is Not Corrected:
Not applicable.

Verification of Correction of Noncompliance Found in FFY 2009 (either timely or subsequent):
For each district with noncompliance identified, the State verified the correction of noncompliance by requiring submission of the specific date that the individual evaluation was completed for each individual student whose evaluation was not timely.  To verify correction of noncompliance for all students, the districts were required to report to the State the percent of students who had a timely evaluation over a three-month period of time.  See http://www.p12.nysed.gov/sedcar/forms/vr/1011/html/verif11.htm.

Based on a regional sampling process, the State verified the reports of correction of noncompliance by on-site reviews.

The State verified the correction of noncompliance for NYC’s report for individual students whose evaluations were not timely in FFY 2009.  NYC’s annual submission of data for this indicator was used to verify that all children are receiving their individual evaluations within the required timelines.  For this indicator, if NYC’s data did not show 100 percent timely evaluations, this is reported as a new finding for the year reported.

Correction of Remaining FFY 2008 Findings of Noncompliance (if applicable):

  1. Number of remaining FFY 2008 findings noted in the Office of Special Education Programs’ (OSEP) June 2011 FFY 2009 APR response table for this indicator: 6 findings (3 school districts)
  2. Number of remaining FFY 2008 findings the State has verified as corrected: 6 findings (3 school districts)
  3. Number of remaining FFY 2008 findings the State has not verified as corrected [(1) minus (2)]: 0 findings (0 school districts)

Verification of Correction of Remaining FFY 2008 findings:

The State required school districts with less than a 100 percent compliance rate for this indicator to submit a statement of assurance from the School Superintendent that the identified noncompliance had been corrected.  Before submitting its assurance, each district was required to conduct a review to ensure that each identified student whose initial evaluation was not completed in compliance with State timelines, and for whom data were not already available in SIRS, had since had his or her initial evaluation completed.

Except for NYC, the districts were also required to monitor and document over a three-month period that all students (or a representative sample for the Big Four districts) had their individual evaluations completed within the required time period.  These results were required to be documented on a form provided by the State.

The State verified the correction of noncompliance for NYC’s report for individual students whose evaluations were not timely in FFY 2009.  NYC’s annual submission of data for this indicator was used to verify that all children are receiving their individual evaluations within the required timelines.  For this indicator, if NYC’s data did not show 100 percent timely evaluations, this is reported as a new finding for the year reported.

For all districts outside of NYC, school districts that had submitted a statement of assurance of corrected noncompliance were selected, based on a regional sampling methodology, for verification reviews of the accuracy of their reports.  If the review identified that the school district continued to have areas of noncompliance, a new compliance assurance plan was issued to address any instances of individual noncompliance, as well as to resolve any underlying systemic reason(s) for the noncompliance.

Correction of Remaining FFY 2007 Findings of Noncompliance (if applicable):

  1. Number of remaining FFY 2007 findings noted in OSEP’s June 2011 FFY 2009 APR response table for this indicator: 2 findings (1 school district)
  2. Number of remaining FFY 2007 findings the State has verified as corrected: 2 findings (1 school district)
  3. Number of remaining FFY 2007 findings the State has not verified as corrected [(1) minus (2)]: 0 findings (0 school districts)

Verification of Correction of Remaining FFY 2007 findings:

The State verified that the individual students reported in 2007 have since had their individual evaluations completed.  NYC’s annual submission of data for this indicator was used to verify that all children are receiving their individual evaluations within the required timelines.  For this indicator, if NYC’s data did not show 100 percent timely evaluations, this is reported as a new finding for the year in which the data is reported.

Correction of Any Remaining FFY 2006 Findings of Noncompliance (if applicable):

Not applicable.  The State issued findings based on FFY 2006 data in the FFY 2007 school year.

Correction of Any Remaining Findings of Noncompliance from FFY 2005 or Earlier (if applicable):

Not applicable.  The State issued findings based on FFY 2005 data in the FFY 2007 school year.

Additional Information Required by the OSEP APR Response Table for this Indicator (if applicable):

The State must demonstrate, in the FFY 2010 APR, due February 1, 2012, that the State is in compliance with the timely initial evaluation requirement in 34 CFR §300.301(c)(1).  Because the State reported less than 100 percent compliance for FFY 2009, the State must report on the status of correction of noncompliance reflected in the data the State reported for this indicator.

The State’s report of compliance for FFY 2010 is less than 100 percent.

The State must demonstrate, in the FFY 2010 APR, that the remaining six uncorrected noncompliance findings identified in FFY 2008 and the remaining two uncorrected noncompliance findings identified in FFY 2007 were corrected.

All uncorrected noncompliance findings identified in FFY 2008 and FFY 2007 have been corrected, as explained above.
When reporting on the correction of noncompliance, the State must report, in its FFY 2010 APR, that it has verified that each LEA with noncompliance reflected in the FFY 2009 data, and each LEA with remaining noncompliance identified in the FFY 2008 and FFY 2007 data the State reported for this indicator:  (1) is correctly implementing 34 CFR §300.301(c)(1) (i.e., achieved 100 percent compliance) based on a review of updated data such as data subsequently collected through on-site monitoring or a State data system; and (2) has completed the evaluation, although late, for any child whose initial evaluation was not timely, unless the child is no longer within the jurisdiction of the LEA, consistent with OSEP Memo 09-02.  In the FFY 2010 APR, the State must describe the specific actions that were taken to verify the correction.

The process the State used to verify the correction of noncompliance is identified above.  The verification process is based on a review of updated data through the State’s data system and on-site reviews of a sample of districts.  The State’s verification system ensures both individual and systemic correction of noncompliance.
The State’s failure to correct longstanding noncompliance raises serious questions about the effectiveness of the State’s general supervision system.  The State must take the steps necessary to ensure that it can report, in the FFY 2010 APR, that it has corrected this noncompliance.

All longstanding noncompliance findings have been corrected.

Improvement Activities Completed in 2010-11

  • The Office of Special Education used information obtained from federal technical assistance resources to further inform its activities to improve timely evaluations for students with disabilities:
    • In May 2010, the New York State Education Department (NYSED) issued annual determination letters to superintendents of school districts that were identified as having noncompliance for Indicator 11.  The National Early Childhood Technical Assistance Center (NECTAC) checklist, “Local Corrective Action Plans:  Collection and Use of Valid and Reliable Data for Determining Factors Contributing to Noncompliance” (2008), was referenced to provide school districts with examples of questions that should be considered when investigating contributing factors for noncompliance and developing improvement strategies.
    • Links to federal and State technical assistance resources were also included in the annual determination letters to assist district personnel to better understand the issues and effective practices pertaining to Indicator 11.  The link for NECTAC (http://www.nectac.org/) was among the resources listed.
    • The Office of Special Education staff participated in monthly Communities of Practice (CoP), hosted by various federal technical assistance centers, in an effort to keep updated on the latest policy information and new resources that NYSED could use directly or share with stakeholder groups.  Included in the monthly CoP calls were those sponsored by NECTAC relating to Indicator 11.
  • To improve timely correction of noncompliance, the Office of Special Education continued the use of electronic notices, sent to school districts at three-month intervals, as a reminder of the noncompliance that needs to be corrected and the next steps that will be taken by the Office of Special Education should timely correction not occur.  Special education monitoring staff also receive copies of the electronic notices and take appropriate proactive actions, including direct follow-up upon a finding that noncompliance was not corrected within nine months.
  • The State continued to provide a three-day training program for chairpersons of CSEs and CPSEs, which includes training on the timelines and process for conducting individual initial evaluations and determining eligibility for special education.  In 2010-11, 46 three-day sessions were provided throughout NYS.
  • Early Childhood Direction Centers funded by the State and NYS Special Education Quality Assurance (SEQA) staff facilitated regional meetings with preschool evaluators and school districts to identify and address the reasons that preschool students were not receiving their evaluations within the required timelines.
  • Links to technical assistance resources were provided to school districts with their notifications of findings of noncompliance.
  • The State continued to direct school districts to NECTAC for information to assist them in developing compliance assurance plans, with particular attention to NECTAC’s “Resources for Systems Change and Improvement Planning” section of the SPP/APR calendar, available at http://spp-apr-calendar.rrfcnetwork.org/explorer/view/id/650?1#category1.  Additionally, a team of NYSED Special Education Policy, Program Development and SEQA staff who work with early childhood issues and programs participates regularly in the monthly CoP calls sponsored by NECTAC to gain insight into critical issues and benchmark practices nationally.
  • At the OSEP Leadership Conference, the State met with representatives from other States, led by federal resource centers, to discuss issues around correction of noncompliance in large school districts.  NYS continues to participate as a member of a workgroup to address this issue.
  • During 2009-10, Office of Special Education staff and bilingual specialists from the Regional Special Education Technical Assistance Support Centers and staff from the Bilingual/English as a Second Language Technical Assistance Centers provided technical assistance to districts that were conducting bilingual evaluations for preschoolers.
  • The State and NYC are implementing court settlement actions under two court cases:  DD and Jose P., both relating to timely evaluations and placements of students with disabilities.
  • In 2011, State law was amended to address corporate practice law limitations for private approved evaluation programs.  This will resolve the State’s prior inability to approve new preschool evaluators and improve the availability of approved evaluators.

Revisions, with Justification, to Proposed Targets / Improvement Activities / Timelines / Resources for FFY 2010 [If applicable]

  • An amendment to the State’s regulations will be proposed in 2012 to conform the State’s timeline for timely preschool evaluations to 60 calendar days, consistent with the State’s timeline for school-age students with disabilities.
  • A bill to amend State law to modify the parent’s role to select the preschool evaluator will be submitted in 2012.

[4] 9,482 students’ evaluations were completed within 60 days (or the State-established timelines) and another 2,052 students’ evaluations were completed beyond the required timeline, but for reasons authorized in the exception provided in 34 CFR §300.301(d).

Last Updated: April 17, 2012