
Audit of the Department of Youth Services Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Youth Services.

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Youth Services (DYS) for the period July 1, 2022 through June 30, 2024.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. 

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective and Conclusion

  1. Did DYS monitor its compliance with Sections 5(a) and 5(b) of Chapter 120 of the General Laws and its own policies (see the Appendix for these DYS policies) regarding the assessments of youths committed to its custody?
     No; see Finding 1.
  2. Did DYS deliver timely, accurate, and complete educational programs, services, and support to youths committed to its custody as outlined in Sections 2 and 7 of Chapter 18A of the General Laws and Sections I–IV of its Guidelines for Serving Youth with Disabilities Under Section 504?
     No; see Finding 2 and Other Matters.
  3. Did DYS institute formal disciplinary guidelines for its 51B report resolution process regarding Section 4.35(2) of Title 110 of the Code of Massachusetts Regulations (CMR) to address supported allegations of neglect and/or abuse of youths in its care or custody? In addition, did DYS’s mentor program assist with this process?
     No; see Finding 3.

To accomplish our audit objectives, we gained an understanding of the DYS internal control environment relevant to our objectives by reviewing applicable DYS policies and procedures and by interviewing DYS officials who are responsible for the oversight of DYS’s programs. In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.

Assessments

To determine whether DYS monitored its compliance with Sections 5(a) and 5(b) of Chapter 120 of the General Laws and its own policies (see the Appendix for these DYS policies) regarding the assessments of youths committed to its custody, we took the actions described below.

We interviewed DYS employees to determine how DYS conducts assessments and reassessments of youths committed to its custody. We also conducted walk-throughs, during which we observed examples of intake procedures and the policies and procedures that DYS employees follow. We used these processes and procedures to inform our testing.

We selected a random, nonstatistical8 sample of 60 youths from a population of 421 youths committed or recommitted to DYS during the audit period to determine whether intake procedures, treatment plans, suicide assessments, progress reports, and youth substance use disorder treatments were all completed and followed according to DYS policies. Specifically, we tested the following attributes from documentation provided by DYS:

  • youths received a health and dental screening within seven days of arrival;
  • admission records were completed upon arrival;
  • offense information was recorded upon arrival;
  • property inventory was taken;
  • handbooks were provided to youths;
  • risk/need assessments were completed;
  • treatment plans were conducted;
  • treatment plans were reviewed at least monthly;
  • suicide assessments were completed as soon as possible;
  • youths met with caseworkers at least monthly;
  • substance use risk factors were assessed;
  • clinical assessment recommendations were made based on the youths’ monthly treatment plans; and
  • substance use disorder treatments were provided.
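The sampling and attribute-testing steps above can be sketched in code. This is an illustrative sketch only: the field names (such as `health_screening` and `arrival`) and the example record are assumptions for demonstration, not DYS's actual data schema.

```python
import random
from datetime import date

# Draw a random, nonstatistical sample of 60 youths from the
# population of 421 committed/recommitted youths. A fixed seed
# makes the selection reproducible for the workpapers.
random.seed(2024)
population_ids = list(range(1, 422))            # 421 youths
sample_ids = random.sample(population_ids, 60)  # each youth equally likely

def within_days(event_date, arrival_date, limit_days):
    """True if the event occurred within `limit_days` of arrival."""
    return (event_date - arrival_date).days <= limit_days

# Example attribute test (hypothetical record): health and dental
# screening completed within seven days of arrival.
record = {"arrival": date(2023, 1, 2), "health_screening": date(2023, 1, 6)}
screening_timely = within_days(record["health_screening"], record["arrival"], 7)
```

Each remaining attribute (admission records, property inventory, monthly treatment-plan reviews, and so on) would be a similar pass/fail check applied to every sampled record.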

For this objective, we found certain issues during our testing; namely, that DYS did not always follow its own policies related to monitoring assessments of youths in its custody. See Finding 1 for more information.

Review of Education Programs

To determine whether DYS delivered timely, accurate, and complete educational programs, services, and support to youths committed to its custody as outlined in Sections 2 and 7 of Chapter 18A of the General Laws and Sections I–IV of its Guidelines for Serving Youth with Disabilities Under Section 504, we first selected a random, nonstatistical sample of 40 youths out of a population of 421 youths committed and recommitted to DYS during the audit period.

In addition, we selected all 20 youths who had a 504 plan during the audit period. We then took the actions described below.

504 Plans

To determine whether DYS delivers timely, accurate, and complete 504 plans to youths committed or detained (as outlined in Sections 2 and 7 of Chapter 18A of the General Laws, and in accordance with DYS’s own internal guidelines), we took the actions described below.

We inspected the list DYS provided to us of all 20 youths who were committed or detained during the audit period and had a 504 plan. The list of 20 youths was derived from DYS’s online document management system. We reviewed each youth’s placement and status history at DYS, the 504 plan obtained by DYS from the school districts, the education profile in DYS’s online document management system, and consent forms to determine whether DYS provided services and support to youths in accordance with the 504 plans. We then inspected the list to determine whether DYS followed its own guidelines, which include the following:

  • identifying whether each youth previously had a 504 plan either at DYS or in their independent school district;
  • completing an evaluation for a 504 plan;
  • determining the eligibility for each youth to obtain a 504 plan; and
  • developing a 504 plan, if needed.

For this aspect of our objective, we found certain issues during our testing; namely, that DYS did not always deliver (1) timely, accurate, and complete educational services, specifically related to 504 plans, and (2) support to youths in its custody. See Finding 2 for more information. 

General Education

To determine whether DYS delivered timely, accurate, and complete general educational programs, services, and support to youths committed to DYS as outlined in Sections 2 and 7 of Chapter 18A of the General Laws, we took the actions described below.

Through several discussions with DYS personnel, including DYS’s general counsel and the director of education initiative, we were informed that DYS is responsible for the education of youths, in coordination with the Department of Elementary and Secondary Education, public schools, and contracted companies. Additionally, DYS provided us with the procedures that DYS and its contracted employees follow to ensure that youths committed to the department are provided with a quality education.

Using each youth’s Master Identification Number (MID), we requested and inspected each youth’s history of assessments (specifically, to see completion of core classes and the Massachusetts Comprehensive Assessment System), the education journal from DYS’s online document management system, the youth’s enrollment status and academic outcome, and the youth’s transcripts (showing completion of classes and attendance) for the 40 youths in our sample.

For this aspect of our objective, we concluded that, based on our testing, DYS met the relevant criteria. However, we found certain issues during our testing regarding DYS’s lack of written policies and procedures regarding general education. See Other Matters for more information. 

51B Reports

To determine whether DYS instituted formal disciplinary guidelines for its 51B report resolution process regarding 110 CMR 4.35(2) to address supported allegations of neglect and/or abuse of youths in its care or custody, and to determine whether DYS’s mentor program assisted with this process, we took the actions described below.

We interviewed DYS employees to determine how DYS communicates with the Department of Children and Families (DCF), conducts its own internal investigation into misconduct, and takes disciplinary action against employees involved in supported 51B reports. DCF provided us with 30 supported 51B reports against DYS employees from the audit period. DYS then provided us with its internal investigation report for the same 30 supported 51B reports.

To ensure that DYS instituted formal disciplinary action against employees with supported 51B reports, we reviewed the DCF report to identify DCF’s findings, the corresponding internal investigation completed by DYS, the Serious Incident Report (SIR), and proof of any disciplinary action taken against the employee(s). In reviewing the disciplinary action taken, we inspected the evidence to determine whether DYS took further action, aside from termination, such as offering trainings or involvement in the mentorship program. Additionally, we reviewed the hire date of the employee involved in the incident to determine whether the employee involved in the 51B report was a part of the mentorship program, and if so, whether the mentorship program provided assistance or training to the employee.

For this objective, we found certain issues during our testing; namely, that DYS did not institute sufficient disciplinary guidelines for its 51B report resolution process. See Finding 3 for more information. 

We used nonstatistical sampling methods for testing and therefore did not project the results of our testing to the corresponding populations.

Data Reliability Assessment

Population of Committed and Recommitted Youth

DYS provided us with a list of all youths committed and recommitted to DYS during the audit period, which consisted of 421 youths. To determine the reliability of the list of all committed and recommitted youths in DYS custody, we interviewed DYS employees who were knowledgeable about DYS’s online document management system, including DYS’s senior director of systems operations. We performed validity and integrity tests on the list to ensure that it did not contain certain dataset issues (i.e., blank fields, duplicate records, missing values in the MIDs, and data corresponding to dates outside the audit period). Further, to determine the accuracy of the list, we randomly selected a sample of 20 youths from the list and traced the information to the Medicaid Management Information System (MMIS) to determine whether the information matched. To determine the completeness of the list, we randomly selected a sample of 20 youths from the MMIS and traced the information to the list of committed and recommitted youths.
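The validity and integrity tests described above can be sketched as a set of record-level checks. This is a hypothetical illustration; the field names (`mid`, `commit_date`) and the sample records are assumptions, not DYS's actual dataset.

```python
from datetime import date

# Audit period: July 1, 2022 through June 30, 2024.
AUDIT_START, AUDIT_END = date(2022, 7, 1), date(2024, 6, 30)

def find_issues(records):
    """Flag blank fields, duplicate records, missing MIDs, and
    dates outside the audit period, per the tests described above."""
    issues = []
    seen_mids = set()
    for i, rec in enumerate(records):
        mid = rec.get("mid")
        if not mid:
            issues.append((i, "missing MID"))
        elif mid in seen_mids:
            issues.append((i, "duplicate record"))
        else:
            seen_mids.add(mid)
        if any(v in (None, "") for v in rec.values()):
            issues.append((i, "blank field"))
        d = rec.get("commit_date")
        if d and not (AUDIT_START <= d <= AUDIT_END):
            issues.append((i, "date outside audit period"))
    return issues

# Hypothetical records exercising each check.
records = [
    {"mid": "A1", "commit_date": date(2023, 3, 4)},   # clean
    {"mid": "A1", "commit_date": date(2023, 5, 1)},   # duplicate MID
    {"mid": "",   "commit_date": date(2022, 1, 1)},   # missing MID, blank field, out of period
]
problems = find_issues(records)
```

A record that passes every check produces no entries; the accuracy and completeness tracing to MMIS would then be performed on samples drawn from the cleaned list.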

To determine the reliability of the data from MMIS, we relied on the work performed by the Office of the State Auditor in a separate project completed in 2023 that tested certain information system controls in MMIS. As part of this work, the Office of the State Auditor reviewed existing information, tested selected system controls, and interviewed agency officials knowledgeable about the data. Additionally, we performed validity and integrity tests on all data relevant to that audit to ensure that it did not contain certain dataset issues (i.e., blank fields, duplicate records, and data corresponding to dates outside the audit period).

Population of 51A Reports 

To determine the reliability of the SIRs, which include 51A reports, maintained by DYS, we interviewed DYS employees who were knowledgeable about the data. DYS provided us with a list of all serious incidents, from DYS’s online document management system, that occurred during the audit period, which consisted of 3,201 serious incidents, including 269 51A reports filed. We performed validity and integrity tests on all serious incidents that occurred during the audit period to ensure that the data did not contain certain dataset issues (i.e., duplicate identification numbers, gaps in the identification numbers, and data corresponding to dates outside the audit period). 

Additionally, we obtained a list of all 51A reports from DCF during the audit period. To determine the accuracy of the list of 269 51A reports filed, we selected a judgmental sample9 of 20. For each 51A report in our sample, using the submission date and case number, we ensured that each 51A report was also listed in the DCF list. We determined whether the date and description of the incident in DYS’s online document management system matched the data in DCF’s list.

To determine the completeness of the 269 51A reports filed, we selected a separate judgmental sample of 20 from the DCF list. For each 51A report in our sample, using the submission date and case number, we ensured that the 51A report was also listed in DYS’s online document management system. We determined whether the date and description of the incident in DCF’s list matched the data in DYS’s online document management system.
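The two-way tracing described above can be sketched as a keyed lookup: each 51A report is keyed by its submission date and case number, and a sampled report from one system is confirmed against the other. The keys, field names, and example reports are illustrative assumptions, not the agencies' actual record layouts.

```python
from datetime import date

def index_reports(reports):
    """Index reports by (submission date, case number) for lookup."""
    return {(r["submitted"], r["case_no"]): r for r in reports}

def trace(sample, other_index):
    """Return case numbers from the sample that are not found in the
    other system, or whose incident descriptions do not match."""
    missing = []
    for r in sample:
        match = other_index.get((r["submitted"], r["case_no"]))
        if match is None or match["description"] != r["description"]:
            missing.append(r["case_no"])
    return missing

# Hypothetical report lists.
dys = [{"submitted": date(2023, 2, 1), "case_no": "51A-001", "description": "incident A"}]
dcf = [{"submitted": date(2023, 2, 1), "case_no": "51A-001", "description": "incident A"},
       {"submitted": date(2023, 4, 9), "case_no": "51A-002", "description": "incident B"}]

# Accuracy: trace the DYS sample to DCF's list.
accuracy_gaps = trace(dys, index_reports(dcf))
# Completeness: trace the DCF sample to DYS's system.
completeness_gaps = trace(dcf, index_reports(dys))
```

In this hypothetical example, the accuracy trace finds no gaps, while the completeness trace surfaces a DCF report absent from the DYS system; each such gap would be investigated with the agency.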

Based on the results of the data reliability assessment procedures described above, we determined that the information we obtained during the course of our audit was sufficiently reliable for the purposes of our audit.

8.    Auditors use nonstatistical sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors believe require review.

9.    Auditors use judgmental sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors determine are appropriate to review. Auditors use their knowledge and judgment to select the most appropriate sample. For example, an auditor might select items from areas of high risk. The results of testing using judgmental sampling cannot be used to make conclusions or projections about entire populations; however, they can be used to identify specific issues, risks, or weaknesses.

Date published: April 24, 2026
