Offered by the Office of the State Auditor

Audit of the Department of Early Education and Care Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Early Education and Care.


Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Early Education and Care (EEC) for the period July 1, 2016 through June 30, 2018.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective 1: Did EEC conduct all group care program licensing site visits in accordance with its policies and procedures?

Conclusion: No; see Finding 3

Objective 2: Did EEC ensure that corrective actions for noncompliance identified during its group care program inspections were implemented in accordance with Section 1 of Title 102 of the Code of Massachusetts Regulations (CMR)?

Conclusion: Yes

Objective 3: Did EEC investigate all reported instances of abuse or neglect in its residential programs in accordance with 606 CMR 3 and Section 9 of Chapter 15D of the General Laws, as well as other applicable EEC requirements?

Conclusion: No; see Findings 1 and 4

Objective 4: Did EEC ensure that all required background record checks (BRCs) were performed for group care program employees involved in unsupervised contact with children in accordance with 606 CMR 14?

Conclusion: No; see Finding 2

To achieve our audit objectives, we gained an understanding of the internal control environment we determined to be relevant to our audit objectives by reviewing EEC’s internal control plan and applicable laws, regulations, and agency policies and procedures. We also conducted interviews with EEC’s staff and managers. We reviewed and tested the operating effectiveness of internal controls relevant to site visits, corrective action plans for noncompliance, investigations conducted for allegations of child abuse or neglect, and program employee BRCs.

Additionally, we conducted further audit testing as described in the following subsections.

Group Care Program Licensing Site Visits

To determine whether EEC conducted all group care program licensing site visits in accordance with its policies and procedures, we selected a random nonstatistical sample of 25 group care programs from a total population of 354 group care programs that were licensed during the audit period. For this sample of programs, we reviewed the data maintained in the Licensing Education Analytical Database (LEAD) to determine whether each program had at least one license renewal site visit during the audit period. We also reviewed the entire population of 366 group care program licensing site visits during the audit period that had site visit due dates (deadlines) established in LEAD. We compared the deadlines to the dates the site visits occurred to determine whether the visits were conducted before the deadlines.
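The two tests described above, selecting a random sample of programs and comparing each visit date to its LEAD deadline, can be sketched as follows. All program IDs and dates here are invented for illustration; the actual testing was performed against LEAD records, not in code:

```python
import random
from datetime import date

# Hypothetical population of licensed group care program IDs
population = [f"GC-{n:03d}" for n in range(1, 355)]  # 354 programs

random.seed(2018)  # fixed seed so the sketch is repeatable
sample = random.sample(population, 25)  # random nonstatistical sample of 25

# Hypothetical site visit records: (program_id, deadline, actual_visit_date)
site_visits = [
    ("GC-001", date(2017, 3, 1), date(2017, 2, 20)),
    ("GC-002", date(2017, 9, 15), date(2017, 10, 2)),
]

# A visit conducted after its established LEAD deadline is flagged as late
late_visits = [pid for pid, deadline, visited in site_visits if visited > deadline]
print(len(sample), late_visits)
```

The same deadline comparison, applied across all 366 site visits with due dates in LEAD, yields the list of visits that missed their deadlines.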

Corrective Actions for Noncompliance

To determine whether EEC ensured that corrective actions for noncompliance identified during group care program inspections were implemented in accordance with its policies and procedures, we selected from LEAD a judgmental nonstatistical sample [5] of 25 group care programs from a population of 306 group care programs that were still operating at the time of our test. We then referred to LEAD to determine the most recent instances of noncompliance identified at each program during the audit period and requested from EEC the corrective action plans that were submitted to, and approved by, EEC. We conducted site visits to these programs, reviewed the corrective action plans with program management, and asked management to show and explain to us how the plans were implemented. Through our interviews and onsite review, we determined whether the plans were implemented.

Investigation of Allegations of Child Abuse or Neglect

To determine whether EEC investigated residential program compliance in response to reported instances of child abuse or neglect, we first determined whether EEC was in possession of all 51A reports submitted regarding children at residential programs during the audit period. We made this determination by obtaining a list from the Department of Children and Families (DCF) of all residential program 51A reports DCF sent to EEC during the audit period. We then obtained files from EEC containing all the 51A reports [6] it had received from DCF for the same population.

We compared the list of 51A reports created from the files provided by EEC to DCF’s list of 51A reports and reconciled the variance, identifying 447 51A reports on DCF’s list that EEC could not account for. Taking a risk-based approach, we limited our testing population to these unaccounted-for reports because they were the most likely not to have been investigated by EEC.
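The reconciliation of the two lists amounts to a set difference: whatever appears on DCF's list but not in EEC's files is unaccounted for. A minimal sketch, with invented report identifiers:

```python
# Hypothetical 51A report identifiers from each agency's records
dcf_reports = {"51A-1001", "51A-1002", "51A-1003", "51A-1004"}
eec_reports = {"51A-1001", "51A-1003"}

# Reports on DCF's list that EEC could not account for
unaccounted = sorted(dcf_reports - eec_reports)
print(unaccounted)
```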

We used the search function in LEAD to determine whether EEC had investigated the 51A reports it could not account for. If we were able to find an associated case in LEAD for a 51A report, we concluded that EEC had investigated it. If we could not locate an associated case in LEAD, we concluded that EEC had not investigated the report.

To determine whether EEC completed all required group care program investigations by the due dates set by its staff, we selected a nonstatistical random sample of 50 closed investigations from a population of 1,061 investigations completed at group care providers during the audit period. We determined whether an investigation was completed on time by comparing the date the investigation report was approved to be sent to the program, according to LEAD, to the established due date for the investigation.
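Assuming each LEAD record carries a due date and a report approval date, the on-time test reduces to a date comparison. The investigation IDs and dates below are invented for illustration:

```python
from datetime import date

# Hypothetical records: (investigation_id, due_date, report_approval_date)
investigations = [
    ("INV-17-042", date(2017, 6, 30), date(2017, 6, 28)),
    ("INV-18-007", date(2018, 1, 15), date(2018, 2, 3)),
]

# An investigation is timely if the report was approved on or before its due date;
# otherwise, record how many days late it was.
late = {
    inv_id: (approved - due).days
    for inv_id, due, approved in investigations
    if approved > due
}
print(late)
```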

Additionally, to determine whether investigation site visits were conducted when required, we selected a random sample of 35 from a population of 134 closed investigations that were identified as high risk in LEAD. We reviewed the investigation activity narratives in LEAD to determine whether licensors or investigators had conducted site visits if required.

BRCs

To determine whether EEC ensured that all required BRCs were performed for group care program employees who had unsupervised contact with children, we judgmentally selected a sample of 25 group care programs from a population of 306 group care programs that were still operating at the time of our test. We requested a list of people who were employed during the audit period from each of the 25 programs. The total population of employees from the programs in our sample was 1,663; 832 of these employees were hired during the audit period. We reviewed the BRC data in LEAD to determine whether all 1,663 program employees had the required BRCs. We also reviewed the 832 employees hired during the audit period to determine whether their Sex Offender Registry Information checks were conducted before they began working, as required by 606 CMR 14.
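The before-hire test on the 832 new hires is another date comparison: the Sex Offender Registry Information (SORI) check date must precede the employee's first day of work. A sketch with invented employee records:

```python
from datetime import date

# Hypothetical records: (employee_id, first_day_worked, sori_check_date)
employees = [
    ("E-101", date(2017, 5, 1), date(2017, 4, 25)),
    ("E-102", date(2018, 2, 12), date(2018, 2, 20)),
]

# 606 CMR 14 requires the SORI check before the employee begins working,
# so a check dated on or after the first day worked is flagged.
violations = [eid for eid, started, checked in employees if checked >= started]
print(violations)
```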

Data Reliability

We determined the reliability of the LEAD data by interviewing key users and administrators of LEAD to obtain their assessment of the reliability of the data and by cross-referencing certain residential program information from Licensing Manager 2010 to LEAD, including the total count of residential programs in our population, the program type (group care program or temporary shelter) of each program, and each program’s first license issue date, all of which should have stayed the same during the migration of data. We also traced certain fields from the BRC source documentation provided by the group care programs for a sample of employees to the BRC data in LEAD. We determined the LEAD data to be sufficiently reliable for the purposes of our audit.
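Cross-referencing fields that should have survived the migration unchanged can be sketched as a record-by-record comparison between the two systems. The record keys and field names here are hypothetical:

```python
# Hypothetical residential program records keyed by license number
licensing_manager_2010 = {
    "RP-001": {"program_type": "group care", "first_license_date": "2010-04-01"},
    "RP-002": {"program_type": "temporary shelter", "first_license_date": "2012-09-15"},
}
lead = {
    "RP-001": {"program_type": "group care", "first_license_date": "2010-04-01"},
    "RP-002": {"program_type": "temporary shelter", "first_license_date": "2012-09-15"},
}

# Fields that changed, or records that disappeared, during the migration
mismatches = [
    (lic, field)
    for lic, record in licensing_manager_2010.items()
    for field, value in record.items()
    if lead.get(lic, {}).get(field) != value
]
print(mismatches)  # an empty list means the two systems agree
```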

We assessed the reliability of the 51A reports extracted from the associate commissioner’s email by tracing a sample of these reports to LEAD and to the list of 51A reports provided by DCF. We determined that the population of 51A reports received from EEC was sufficiently reliable for the purposes of our audit.

When sampling, we used a nonstatistical sampling method, so the results of our testing could not be projected to the entire population.

5.     Nonstatistical sampling is the selection of a test group that is based on the auditor’s judgment rather than a formal statistical method.

6.     All personally identifiable information was redacted from these reports.

Date published: May 5, 2020
