
Audit of the Department of Early Education and Care Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Early Education and Care.

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Early Education and Care (EEC) for the period January 1, 2021 through December 31, 2022.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective and Conclusion

  1. Did EEC conduct annual site visits to residential care programs within the designated timeframes in accordance with Section VIII(b) of EEC’s “Differential Licensing Residential Handbook”?
     Conclusion: No; see Finding 1
  2. Did EEC investigate reported instances of abuse or neglect of children in its residential care programs in accordance with Section 3.04(5)(c) of Title 606 of the Code of Massachusetts Regulations (CMR) and Section 9(c) of Chapter 15D of the General Laws?
     Conclusion: No; see Finding 2
  3. Did EEC perform required background record checks for residential care program employees and interns in accordance with 606 CMR 14.05(1)(a) and Section 7(a)(i) of Chapter 15D of the General Laws?
     Conclusion: No; see Finding 4
  4. Did EEC perform required background record checks for Family Child Care (FCC) program licensees and household members in accordance with 606 CMR 14.05(2)?
     Conclusion: No; see Finding 5
  5. Did EEC have a language access plan as required by the Office of Access and Opportunity’s Language Access Policy Implementation Guidelines?
     Conclusion: No; see Finding 6
  6. To what extent did EEC ensure that its licensees provided staff members with trainings to recognize the signs of and prevent abduction, sexual exploitation, or human trafficking?
     Conclusion: To some extent; see Finding 7
  7. Did EEC ensure that its licensees supported LGBTQ children and teenagers by adopting the recommendations made by the Massachusetts Commission on LGBTQ Youth in its fiscal years 2021 and 2022 Report and Recommendations?
     Conclusion: No; see Finding 8
  8. Did EEC ensure that noncompliant funded programs addressed deficiencies within 14 days in accordance with the EEC Essentials Training Requirements and Applicability policy?
     Conclusion: No; see Finding 9
  9. Did EEC investigate high-risk complaints of unlicensed care in accordance with its Internal Policy Handbook for EEC Childcare Operations Staff?
     Conclusion: No; see Finding 3

To accomplish our audit objectives, we gained an understanding of the aspects of the internal control environment relevant to our objectives by interviewing EEC staff members and by reviewing EEC’s internal control plan, policies, and procedures that were effective during the audit period. We tested the operating effectiveness of certain controls related to our objectives by examining the following Licensing Education Analytical Database (LEAD) visit items:

  • EEC’s review of residential program staff records documented by EEC staff members during visits to verify the completion of background record checks for program staff members and
  • EEC’s review of funded program staff records documented by EEC staff members during visits to verify the completion of EEC Essentials training.

To obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.

Annual Site Visits

To determine whether EEC conducted annual site visits to residential care programs within the designated timeframes in accordance with Section VIII(b) of EEC’s “Differential Licensing Residential Handbook,” we took the following actions:

  • We selected a random, nonstatistical sample of 40 residential programs from a population of 364 residential programs with licenses that were active during the audit period.
  • For each program in the sample, we obtained, from LEAD, the date of the most recent visit conducted before the audit period and the dates of the next two visits.
  • Using the guidelines in EEC’s “Differential Licensing Residential Handbook,” we calculated each program’s visit due dates based on the date of the previous visit and the handbook’s prescribed intervals (typically annual), with windows of two to four months before or after the due date depending on the type of visit that occurred.
  • We then determined whether EEC conducted each visit within the calculated window (a sketch of this calculation follows the list).
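
The due-date arithmetic can be illustrated with a short Python sketch. The handbook’s actual intervals and windows vary by visit type; the annual interval, the fixed two-month window, and the function names below are our simplifying assumptions.

    from datetime import date
    from typing import Optional

    from dateutil.relativedelta import relativedelta

    def visit_window(previous_visit: date, months_before: int = 2,
                     months_after: int = 2):
        """Assume an annual cycle: the next visit is due one year after the
        previous one, with an acceptable window around that due date."""
        due = previous_visit + relativedelta(years=1)
        return (due - relativedelta(months=months_before),
                due + relativedelta(months=months_after))

    def visit_on_time(previous_visit: date, next_visit: Optional[date]) -> bool:
        """A visit counts as timely if it fell inside the window; a missing
        visit is automatically late."""
        if next_visit is None:
            return False
        window_open, window_close = visit_window(previous_visit)
        return window_open <= next_visit <= window_close

    # A visit on 2021-03-15 following one on 2020-03-01 falls inside the
    # assumed window (2021-01-01 to 2021-05-01), so it counts as timely.
    print(visit_on_time(date(2020, 3, 1), date(2021, 3, 15)))  # True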

We determined that, during the audit period, 18 (45%) residential programs out of our sample of 40 did not receive a visit or had a late visit (see Finding 1).

Reported Instances of Abuse and Neglect

To determine whether EEC investigated reported instances of abuse or neglect of a child in its residential care programs in accordance with 606 CMR 3.04(5)(c) and Section 9(c) of Chapter 15D of the General Laws, we took the following actions:

  • We obtained a spreadsheet of 51A Reports from the Department of Children and Families (DCF) and 51A Reports as PDFs from EEC.
  • We reconciled this data to include only 51A Reports that were screened in by DCF, resulting in a list of 885 51A Reports.
  • From this population, we selected a random, nonstatistical sample of 60 51A Reports. For each sampled 51A Report, we conducted a query within LEAD to identify the related EEC investigation.
  • From LEAD, we obtained the case identification number, category (based on risk level), investigation identification number, investigator, investigation due date, and investigation completion date for each sampled 51A Report.
  • We compared each investigation due date to the corresponding completion date to determine whether EEC investigated the sampled 51A Reports by their due dates (a sketch of this comparison follows the list).
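
For illustration, the comparison step can be expressed in a few lines of Python. The column names below are assumptions, not LEAD’s actual schema.

    import pandas as pd

    # Toy records standing in for the sampled 51A Reports.
    reports = pd.DataFrame({
        "intake_number": ["51A-001", "51A-002"],
        "investigation_due_date": pd.to_datetime(["2021-04-01", "2021-06-15"]),
        "investigation_completion_date": pd.to_datetime(["2021-03-28", None]),
    })

    # Timely means completed on or before the due date; a missing completion
    # date indicates the report was not investigated by EEC.
    reports["timely"] = (
        reports["investigation_completion_date"].notna()
        & (reports["investigation_completion_date"]
           <= reports["investigation_due_date"])
    )
    print(reports[["intake_number", "timely"]])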

We determined that, during the audit period, 2 (3%) of the 51A Reports out of our sample of 60 were internally investigated by the program where the 51A Report incident occurred but were not investigated by EEC, as required (see Finding 2). Additionally, during our review, we identified an issue outside the scope of our objective related to late submission of 51A Report investigations (see Other Matters).

Residential Program Employee Background Record Checks

To determine whether EEC performed required background record checks for residential care program employees and interns in accordance with 606 CMR 14.05(1)(a) and Section 7(a)(i) of Chapter 15D of the General Laws, we took the following actions:

  • We obtained the population of 364 residential programs with licenses that were active during the audit period.
  • We first selected the 20 residential programs that had been identified by EEC as noncompliant with background record checks during the audit period.
  • From the remaining population of 344 residential programs, we selected a judgmental, nonstatistical[9] sample of 20 residential programs based on a weighted score. We calculated the weighted score by dividing the number of incidents[10] a residential program had during the audit period by its licensed capacity (the number of children the program could care for at one time); a sketch of this calculation follows the list.
  • Using this approach, we selected seven high-risk, seven moderate-risk, and six low-risk residential programs.
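
The weighted-score ranking can be sketched as follows. The data and the risk-tier cutoffs are invented for illustration; the report does not state the actual thresholds.

    import pandas as pd

    # Invented data; the real population held 344 remaining programs.
    programs = pd.DataFrame({
        "program_id": ["R-101", "R-102", "R-103"],
        "incidents": [12, 3, 20],           # incidents during the audit period
        "licensed_capacity": [10, 15, 40],  # children cared for at one time
    })

    # Weighted score: incidents per licensed slot.
    programs["weighted_score"] = (programs["incidents"]
                                  / programs["licensed_capacity"])

    # Assumed cutoffs for the low/moderate/high risk tiers.
    programs["risk_tier"] = pd.cut(
        programs["weighted_score"],
        bins=[0.0, 0.25, 0.75, float("inf")],
        labels=["low", "moderate", "high"],
    )
    print(programs.sort_values("weighted_score", ascending=False))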

At our request, EEC contacted the residential programs in our sample to request their lists of employees and interns who were active during the audit period. We combined the lists received from all programs into a single list of 2,852 residential program employees and interns. We then extracted background record check data from LEAD and joined it with the employee list. For each employee and intern in the joined data set, we examined the hire date and the dates of the Criminal Offender Record Information, Sex Offender Registry Information, DCF, and fingerprint checks. We then compared these dates to determine whether each employee or intern received each background record check either before beginning employment or within three years of the previous background record check (see the sketch below).
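
A minimal sketch of the timeliness test for a single background record check component, assuming invented field names (LEAD’s schema is not public):

    from datetime import date
    from typing import Optional

    from dateutil.relativedelta import relativedelta

    def check_is_current(hire_date: date,
                         check_date: Optional[date],
                         previous_check_date: Optional[date] = None) -> bool:
        """A component passes if it was completed before the hire date or
        within three years of the previous background record check."""
        if check_date is None:
            return False  # missing component
        if check_date <= hire_date:
            return True
        if previous_check_date is not None:
            return check_date <= previous_check_date + relativedelta(years=3)
        return False

    # Checked before hire: passes.
    print(check_is_current(date(2021, 5, 1), date(2021, 4, 20)))  # True
    # No check on file: fails.
    print(check_is_current(date(2021, 5, 1), None))  # False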

We determined that, during the audit period, 39 (98%) residential programs out of our sample of 40 had one or more missing components of a background record check (see Finding 4).

FCC Background Record Checks

To determine whether EEC performed required background record checks for FCC program licensees and household members in accordance with 606 CMR 14.05(2), we took the following actions:

  • We extracted a list of 5,799 FCC programs from LEAD.
  • From that list, we selected a random, statistical[11] sample of 60 FCC programs that were active during the audit period, using a 95% confidence level,[12] a 0% expected error rate,[13] and a 5% tolerable error rate[14] (a sample-size sketch follows the list).
  • For each of the sampled FCC programs, we obtained, from LEAD, a list of the FCC programs’ licensees, household members, and people who were regularly on the premises.
  • We determined whether EEC conducted a background record check for each FCC program’s licensee, household member, and individual who was regularly on the premises by noting the dates of each component of the background record check.
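
With a 0% expected error rate, the attribute-sample size reduces to the discovery-sampling formula: the smallest n at which, if the true error rate equaled the tolerable rate, the chance of observing zero errors falls below 1 minus the confidence level. The arithmetic below is our reconstruction, not necessarily the Office’s exact method.

    import math

    confidence = 0.95  # confidence level
    tolerable = 0.05   # tolerable error rate (expected error rate is 0%)

    # Smallest n with (1 - tolerable)**n <= 1 - confidence.
    n = math.ceil(math.log(1 - confidence) / math.log(1 - tolerable))
    print(n)  # 59, consistent with the sample of 60 used above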

We determined that, during the audit period, 14 FCC licensees out of our sample of 60 (23%) had one or more missing components of a background record check. For more information, see Finding 5.

Language Access Plans

To determine whether EEC had a language access plan as required by the Office of Access and Opportunity’s Language Access Policy Implementation Guidelines, we obtained and reviewed a copy of EEC’s most recent language access plan, which did not contain a publication date. However, the plan bore the signature of an EEC commissioner who resigned in 2013.

We determined that EEC did not update its language access plan every two years (see Finding 6).

Trainings to Recognize Trafficking

To determine to what extent EEC ensured that its licensees trained their staff members to recognize the signs of and prevent abduction, sexual exploitation, or human trafficking, we took the following actions:

  • We interviewed EEC management, who informed us that they currently do not have specific training requirements related to recognizing the signs of and preventing abduction, sexual exploitation, or human trafficking.
  • We conducted additional analyses of recommendations published by the Massachusetts Interagency Human Trafficking Policy Task Force in 2013 and DCF’s Policy 2016–002 to determine what elements should be provided as training to meet the minimum requirements of the task force’s recommendations and DCF’s policy. One such training topic that we identified was screening runaway children who returned to residential programs for signs of human trafficking or sexual exploitation.
  • We reviewed 606 CMR 3 to determine what training requirements EEC had in place that closely related to the topics of sexual exploitation and human trafficking. Specifically, we identified the Runaway Policy, which residential programs are required to use during orientation and to submit to EEC for review. A Runaway Policy outlines procedures for preventing and responding to incidents where children run away from the program.
  • We selected a random, nonstatistical sample of 40 residential programs from a population of 364 residential programs with a license that was active during the audit period.
  • For each of the 40 sampled residential programs, we inspected the program’s document library within LEAD to determine whether the program submitted a Runaway Policy. We then reviewed the submitted Runaway Policies for documentation outlining the screening of runaway children who returned to residential programs for signs of human trafficking and sexual exploitation (a keyword-scan sketch follows this list).
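
The policy review itself was a document reading, but a first-pass keyword scan of the submitted policies could look like the following. The search terms and matching logic are illustrative assumptions.

    import re

    # Assumed screening-related terms.
    SCREENING_TERMS = re.compile(
        r"human trafficking|sexual exploitation|abduction",
        re.IGNORECASE,
    )

    def mentions_screening(policy_text: str) -> bool:
        """True if a Runaway Policy contains any screening-related term."""
        return bool(SCREENING_TERMS.search(policy_text))

    example = ("Upon return, staff will screen the child for signs of "
               "human trafficking or sexual exploitation.")
    print(mentions_screening(example))  # True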

We determined that, during the audit period, 30 (75%) residential programs out of our sample of 40 did not have a policy in LEAD containing language related to the screening of children for signs of abduction, sexual exploitation, or trafficking (see Finding 7).

LGBTQ Youth Recommendations

To determine whether EEC ensured that its licensees supported LGBTQ children and teenagers by adopting the recommendations made by the Massachusetts Commission on LGBTQ Youth in its fiscal years 2021 and 2022 Report and Recommendations, we interviewed EEC management regarding EEC’s implementation status for each of the five recommendations. We noted that some of the recommendations made in the fiscal years 2021 and 2022 reports had appeared in the commission’s annual reports since fiscal year 2018. In addition to the interview, we reviewed written responses from EEC summarizing its implementation status for the recommendations.

We determined that, during the audit period, EEC had not yet implemented the recommendations (see Finding 8).

Noncompliant Funded Programs

To determine whether EEC ensured that noncompliant funded programs addressed deficiencies within 14 days in accordance with the EEC Essentials Training Requirements and Applicability policy, we obtained funded program and EEC Essentials training noncompliance data from LEAD. We identified and tested all 50 noncompliant funded programs that were active during the audit period. We calculated the number of days between the notification of corrective action by EEC and the date the corrective action was implemented. This allowed us to determine whether the funded programs complied with EEC Essentials training and whether any instances of noncompliance were addressed within the required 14-day period.
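
The 14-day test can be expressed as a simple date difference; the column names below are assumptions standing in for the LEAD fields.

    import pandas as pd

    # Toy corrective-action records.
    actions = pd.DataFrame({
        "program_id": ["F-01", "F-02"],
        "notified": pd.to_datetime(["2022-02-01", "2022-03-10"]),
        "implemented": pd.to_datetime(["2022-02-10", "2022-04-20"]),
    })

    actions["days_to_correct"] = (actions["implemented"]
                                  - actions["notified"]).dt.days
    actions["timely"] = actions["days_to_correct"] <= 14
    print(actions)  # F-01 corrected in 9 days; F-02 took 41 days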

We determined that, during the audit period, 26 (52%) funded programs out of our population of 50 did not address their EEC Essentials training noncompliance in a timely manner (see Finding 9).

High-Risk Complaints

To determine whether EEC investigated high-risk complaints of unlicensed care in accordance with its Internal Policy Handbook for EEC Childcare Operations Staff, we obtained a list of unlicensed care complaints from LEAD. Of the 286 unlicensed care complaints received during the audit period, 94 were categorized as high risk in LEAD; we tested all 94. By reviewing the investigation files in LEAD, we determined whether an EEC investigator or licensor investigated each high-risk unlicensed care complaint.
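
As a sketch, the filtering and investigator check amount to the following; the field names and values are assumptions.

    import pandas as pd

    # Toy complaint records.
    complaints = pd.DataFrame({
        "complaint_id": [1, 2, 3],
        "risk_level": ["high", "low", "high"],
        "investigated_by": ["investigator", "licensor", "licensor"],
    })

    high_risk = complaints[complaints["risk_level"] == "high"]
    # Flag high-risk complaints not handled by an EEC investigator.
    print(high_risk[high_risk["investigated_by"] != "investigator"])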

We determined that, during the audit period, 9 (10%) of the 94 high-risk unlicensed care complaints in our population were investigated by EEC licensors rather than by EEC investigators (see Finding 3).

We used a combination of statistical and nonstatistical sampling methods for testing. Where we used nonstatistical sampling methods, we did not project the results of our testing to any of the populations.

Data Reliability Assessment

To determine the reliability of the data within LEAD for licensed programs, funded programs, visits, and cases, we interviewed EEC and Executive Office of Education information system and technology employees who were knowledgeable about the data. We reviewed information security policies and procedures from EEC as well as System and Organization Control reports.[15] Specifically, we reviewed these documents for security management, configuration management, contingency planning, and segregation of duties. We also tested user access controls to ensure that EEC’s LEAD users had completed cybersecurity awareness training and state background checks, that their accounts were removed upon termination, and that their access rights were updated with position changes, and we verified that EEC conducted annual user reviews. We performed further electronic checks of the data for duplicates, missing information, and records outside of the audit period.
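
The electronic checks for duplicates, missing information, and out-of-period records can be illustrated as follows, assuming a simplified extract (the audit period ran January 1, 2021 through December 31, 2022).

    import pandas as pd

    # Toy LEAD extract.
    records = pd.DataFrame({
        "case_id": [101, 101, 102],
        "visit_date": pd.to_datetime(["2021-06-01", "2021-06-01",
                                      "2024-01-05"]),
    })

    duplicates = records[records.duplicated()]
    missing = records[records.isna().any(axis=1)]
    out_of_period = records[(records["visit_date"] < "2021-01-01")
                            | (records["visit_date"] > "2022-12-31")]
    print(len(duplicates), len(missing), len(out_of_period))  # 1 0 1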

We obtained the lists of employees and interns who were active during the audit period directly from each of the 40 residential programs in our sample. However, we were unable to test the completeness and accuracy of this data without payroll information, which was necessary for tracing and vouching.[16] EEC could not provide this data to us because EEC does not gather payroll information from its programs. We used the lists of employees and interns that we obtained directly from the programs, as they were the only source available for our audit purposes. For more information, see Finding 4.

We obtained a spreadsheet of 51A Reports from DCF and PDFs of the 51A Reports that EEC received during the audit period. We used data analytics tools to extract key information, such as intake numbers, dates, and screening statuses, from the EEC PDF 51A Reports and generated a spreadsheet of 51A Reports. We then reconciled the information in this spreadsheet to the spreadsheet of 51A Reports from DCF and filtered the data set to include only those reports that DCF screened in for testing.
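
A hypothetical version of the PDF extraction step, assuming the pdfplumber library and an “Intake #:” label in the reports (the actual tooling and document layout are not described in this report):

    import re

    import pdfplumber  # assumed tooling; any PDF text extractor would do

    INTAKE = re.compile(r"Intake\s*#?:?\s*(\d+)")  # assumed label format

    def extract_intake_number(pdf_path: str):
        """Return the first intake number found in a 51A Report PDF, if any."""
        with pdfplumber.open(pdf_path) as pdf:
            text = "\n".join(page.extract_text() or "" for page in pdf.pages)
        match = INTAKE.search(text)
        return match.group(1) if match else None

    # The extracted rows can then be merged with the DCF spreadsheet on the
    # intake number and filtered to screened-in reports, for example:
    #   merged = eec_df.merge(dcf_df, on="intake_number")
    #   screened_in = merged[merged["screening_status"] == "Screened In"]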

Based on the results of the data reliability assessment procedures described above, we determined that the information we obtained was sufficiently reliable for the purposes of our audit.

9.    Auditors use judgmental (i.e., nonstatistical) sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors want to review. Auditors use their knowledge and judgment to select the most appropriate sample. For example, an auditor might select items from areas of high risk. The results of testing using judgmental sampling cannot be used to make conclusions or projections about entire populations; however, they can be used to identify specific issues, risks, or weaknesses.

10.    Some examples of incidents include allegations that result in a 51A Report, issues with the administration of medication, or hospitalization.

11.    Auditors use statistical sampling to select items for audit testing when a population is large and contains similar items. Auditors generally use a statistical software program to choose a random sample when sampling is used. The results of testing using statistical sampling, unlike those from judgmental sampling, can usually be used to make conclusions or projections about entire populations.

12.    Confidence level is a mathematically based measure of the auditor’s assurance that the sample results (statistic) are representative of the population (parameter), expressed as a percentage.

13.    Expected error rate is the number of errors that are expected in the population, expressed as a percentage. It is based on the auditor’s knowledge of factors such as prior year results, the understanding of controls gained in planning, or a probe sample.

14.    Tolerable error rate is the maximum error in the population that auditors would be willing to accept and still conclude that the result from the sample has achieved the audit objective.

15.    A System and Organization Control report is a report issued by an independent contractor on controls about a service organization’s systems relevant to security, availability, processing integrity, confidentiality, or privacy.

16.    Vouching is the inspection of supporting documentation to corroborate data.

Date published: November 25, 2024
