Massachusetts Focused Monitoring
The purpose of data verification as part of the Massachusetts General Supervision and Focused Monitoring process is to establish the reliability of data used and published by the Department of Public Health. Under IDEA 2004, the Office of Special Education Programs (OSEP) requires that data on each monitoring indicator, for each program, be reported publicly every year. OSEP also requires, as part of the State Performance Plan (SPP) and the Annual Performance Report (APR), that states report annually on the timeliness and accuracy of their data.
Data is used to select which programs receive on-site visits under Focused Monitoring protocols and to monitor program performance on an ongoing basis. Accuracy of data is therefore key both to DPH's commitment to monitor programs and to programs' efforts to provide quality services to children and families. Data from all sources (such as provider-reported data, service delivery reports, parent complaints, and performance measures) is routinely correlated to determine its accuracy and reliability. Data verification also serves as an ongoing assessment, training, and quality assurance management tool.
Data sources for the Early Intervention system include:
- Early Intervention Information System: Client Registration, Service, and Discharge data
- Service Delivery Reports
- Agency Uniform Financial Reports
- EI Parent Complaint/Procedural Safeguards system
- Annual Report/Self-Assessment
Data Verification Methodology
Gathering information through the Annual Self-Assessment process allows the state Lead Agency to set new goals and assists us in defining a baseline. The Data Verification plan, based on the Annual Self-Assessment submitted by each provider to DPH, includes the following components:
Verifying accuracy of data entry and transmittal: EIIS Forms. Section II of the Annual Self-Assessment asks programs to affirm that "EIIS forms (Referral, Evaluation, IFSP, and Discharge) are present and complete in each child's file and validated within the EIIS system as complete." EIIS data will be cross-referenced to determine whether each of the above-referenced fields is complete for each program that so affirms. Programs with 100% accuracy will receive a certificate from the Lead Agency. A program with an accuracy rate under 95% will receive a phone call from its Regional Specialist informing the Program Director of the shortfall. A program with under 90% completion will be required to respond with a file review submitted to the Regional Specialist. If the rate falls below 85%, staff will schedule a data verification visit to the program.
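The tiered follow-up described above can be sketched as a simple rule. The function name and action labels below are illustrative only, not part of any DPH or EIIS system:

```python
def eiis_followup_action(accuracy: float) -> str:
    """Map an EIIS form-completion accuracy rate (0.0-1.0) to the
    tiered follow-up described above. Labels are illustrative."""
    if accuracy == 1.0:
        return "certificate"             # 100% accuracy: certificate from the Lead Agency
    if accuracy >= 0.95:
        return "no action"               # meets the 95% bar; no follow-up stated
    if accuracy >= 0.90:
        return "phone call"              # Regional Specialist calls the Program Director
    if accuracy >= 0.85:
        return "file review"             # program must respond with a file review
    return "data verification visit"     # under 85%: staff schedule an on-site visit
```

For example, a program with 93% completion would fall into the phone-call tier, while one at 82% would be scheduled for a visit.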
Gathering baseline data on the Transition Planning Conference, a required priority area. The Annual Self-Assessment, Section III: Transition Survey asks for a list of all children enrolled in the program for six or more months who were discharged and referred to an LEA. The Lead Agency will select 10 records from a sampling of programs that reported that transition meetings were held within the required timeframes and that the LEA was present. For these 10 records, we will request that the Transition Page of the IFSP be sent to the Regional Specialist for review. This assures both that the appropriate activities occurred and that they are documented in the correct location. Providers have been notified that additional documentation on Transition practices may be required as follow-up to completing the Annual Self-Assessment.
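The record-selection step above amounts to drawing a small sample without replacement from the eligible records. A minimal sketch, with hypothetical record identifiers and an optional seed for a repeatable draw:

```python
import random

def select_transition_records(eligible_records, k=10, seed=None):
    """Draw k records for Transition Page review from the pool of
    records reported by qualifying programs. Record IDs and the
    seed parameter are illustrative, not an actual DPH procedure."""
    rng = random.Random(seed)  # seeded generator so the draw can be reproduced
    return rng.sample(eligible_records, k)
```

Seeding the generator lets the Lead Agency document and reproduce exactly which records were selected for a given review cycle.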
Verifying accuracy of data entry and transmittal, as well as eligibility information reported: Annual Self-Assessment, Section IV: File Review. Lead Agency personnel will cross-reference eligibility information, comparing what is in EIIS with what was submitted in the Annual Self-Assessment. This assures that eligibility requirements are being met with regard to assessment (the eligibility tool used), at-risk categories, and established conditions.
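The cross-referencing step is essentially a field-by-field comparison of two records for the same child. A minimal sketch, assuming hypothetical field names for the three eligibility dimensions named above:

```python
def eligibility_mismatches(eiis, self_assessment,
                           fields=("eligibility_tool",
                                   "at_risk_category",
                                   "established_condition")):
    """Return the fields where the EIIS record and the Annual
    Self-Assessment submission disagree, mapped to the pair of
    conflicting values. Field names are illustrative."""
    return {f: (eiis.get(f), self_assessment.get(f))
            for f in fields
            if eiis.get(f) != self_assessment.get(f)}
```

An empty result means the two sources agree on every checked field; a non-empty result points reviewers to the exact discrepancy.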
Lead Agency personnel continue to monitor performance measures annually. DPH continues to develop and issue "error reports," by program, every other month. The error reports highlight those programs where 5 or fewer children have errors in their data, as well as programs that have shown significant improvement since the previous report. Error reports also look closely at programs where either 20% of children or at least 75 children have significant errors in their records, particularly if those errors have persisted over time.
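The flagging criteria above can be expressed as a short classification rule. The function name and category labels are illustrative, and the sketch omits the improvement-over-time comparison, which would require the prior report:

```python
def flag_program(total_children, children_with_errors):
    """Classify a program for the bimonthly error report using the
    thresholds described above. Labels are illustrative only."""
    if children_with_errors <= 5:
        return "highlight: few errors"   # 5 or fewer children with errors
    if (children_with_errors >= 75
            or children_with_errors / total_children >= 0.20):
        return "close review"            # at least 75 children, or 20% of caseload
    return "routine"                     # between the two thresholds
```

For example, a 200-child program with 20 error records (10% of caseload, under 75 children) would fall between the thresholds, while the same program with 80 error records would be flagged for close review.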