Offered by the Office of the State Auditor

Audit of the Massachusetts Department of Correction Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Massachusetts Department of Correction.

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Massachusetts Department of Correction (DOC) for the period July 1, 2022 through June 30, 2024.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

  1. Objective: Does DOC provide verbal and written instructions to incarcerated individuals (I/Is), upon admission, that explain the procedures for gaining access to healthcare services in a timely manner, including I/Is with learning barriers and/or limited English proficiency (LEP), to ensure that they understand their rights before the initiation of any healthcare services in accordance with Sections 630.02(3), 630.08, and 630.09(1) and (2) of DOC’s Medical Services Policy and Section 650.03 of DOC’s Mental Health Services Policy regarding informed consent?
     Conclusion: Not always; see Findings 1 and 2
  2. Objective: Does DOC provide educational counseling and placement for criminally sentenced I/Is with Americans with Disabilities Act (ADA) accommodations and/or LEP, as required by Section 48 of Chapter 127 of the General Laws and Sections 441.04, 441.05, and 441.10 of DOC’s Inmate Training and Education Policy?
     Conclusion: Not always; see Findings 3 and 4

To accomplish our audit objectives, we gained an understanding of the DOC internal control environment relevant to our objectives by reviewing applicable policies and procedures and DOC’s internal control plan and by conducting interviews and observing the intake healthcare procedures and the educational counseling and placement procedures. In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.

Access to Healthcare Services

To determine whether DOC provided verbal and written instructions to I/Is upon admission, including those with learning barriers and/or LEP, that explain the procedures for gaining access to healthcare services in a timely manner, and to ensure that they understand their rights before the initiation of any healthcare services in accordance with Sections 630.02(3), 630.08, and 630.09(1) and (2) of DOC’s Medical Services Policy and Section 650.03 of DOC’s Mental Health Services Policy regarding informed consent, we took the actions described below.

We divided the total population of 4,740 commitment identification numbers for pre-trial, civilly committed, and criminally sentenced I/Is who were admitted to DOC during the audit period into two categories. For the first category, we identified a total of 695 commitment identification numbers for I/Is who had a learning barrier and/or LEP. We selected a nonstatistical,4 judgmental5 sample of 60 out of these 695 commitment identification numbers. For the second category, we identified a total of 4,045 commitment identification numbers for I/Is who did not have a learning barrier and/or LEP. We selected a statistical6 sample of 60 commitment identification numbers, using a 95% confidence level,7 a 0% expected error rate,8 and a 5% tolerable rate.9
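A statistical sample of about 60 items is consistent with a standard attribute-sampling calculation under these parameters (95% confidence, 0% expected errors, 5% tolerable rate). As an illustration only, since the report does not state which formula or software was used, the zero-expected-error case can be computed as the smallest sample size for which observing no errors gives the desired confidence:

```python
import math

def attribute_sample_size(confidence: float, tolerable_rate: float) -> int:
    """Minimum sample size for attribute sampling with a 0% expected
    error rate: the smallest n such that, if the true error rate equaled
    the tolerable rate, the chance of seeing zero errors in the sample
    would fall below (1 - confidence)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))

n = attribute_sample_size(0.95, 0.05)
print(n)  # 59, in line with the 60-item statistical samples used here
```

The function name and formula choice above are illustrative assumptions; auditors typically obtain the same figure from statistical sampling software.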

For each of the 120 commitment identification numbers in our samples, we inspected the I/I’s Offender Face Sheet10 to identify their primary and secondary languages and to determine whether they could comprehend English. We inspected the general and mental health orientation forms that were signed and dated by the I/I and/or a DOC staff witness. We also inspected the signed and dated general health consent forms and any additional consent forms related to psychotropic medications, if applicable, to determine whether the I/I understood the risks and benefits of treatments offered to them before the initiation of any healthcare services. For any I/I who refused to sign any orientation or consent forms, we inspected these forms for a signature and date from a DOC staff witness.

For the 60 commitment identification numbers of I/Is who had a learning barrier and/or LEP, we reviewed notes in the healthcare electronic files, ADA hardcopy files, and the I/I’s electronic records to determine whether the I/I was provided with any auxiliary aids, translation services, and/or alternative forms in their preferred language to understand how they could gain access to healthcare services and/or give their informed consent before the initiation of any healthcare services.

For this aspect of our objective, we found certain issues during our testing. See Findings 1 and 2 for more information.

Educational Counseling and Placement

To determine whether DOC provides educational counseling and placement for criminally sentenced I/Is with ADA accommodations and/or LEP, as required by Section 48 of Chapter 127 of the General Laws and Sections 441.04, 441.05, and 441.10 of DOC’s Inmate Training and Education Policy, we took the actions described below.

From a total population of 2,445 commitment identification numbers of criminally sentenced individuals with ADA accommodations and/or LEP who were in DOC’s custody during the audit period, we sampled a total of 104 commitment identification numbers by breaking down the population into two categories:

  • For the first category, from a total population of 2,172 commitment identification numbers for I/Is with an auditory impairment and I/Is who had no ADA accommodations, but had LEP, we targeted the 4 numbers associated with the only I/Is who had an active individualized education program (IEP) and who were under 22 years of age during the audit period. Further, we selected a statistical sample of 60 commitment identification numbers, using a 95% confidence level, a 0% expected error rate, and a 5% tolerable rate.
  • For the second category, we selected a nonstatistical, judgmental sample of 40 out of 273 commitment identification numbers for the remaining ADA accommodation population types.

For each of the 104 commitment identification numbers in our samples, we inspected the I/I’s Offender Face Sheet to identify the I/I’s primary and secondary languages and to determine whether they could comprehend English. We also inspected the I/I’s and correctional program officer’s (CPO’s) signatures and dates on the I/I’s annual personalized program plans (PPP), which were filed in the I/I’s six-part folder.11 For any I/I who refused to sign a PPP, we inspected the PPP for a witness signature and date from the CPO. For I/Is with an ADA accommodation and/or LEP, we reviewed notes in the I/Is’ healthcare records, ADA hardcopy files, and I/Is’ electronic records in DOC’s prison population database (PPD) to determine whether the I/Is were provided with any auxiliary aids, translation services, and/or alternative forms in their preferred language to understand their opportunities during the educational counseling and placement meetings.

For the 4 commitment identification numbers for I/Is with active IEPs who were under 22 years old during the audit period, we reviewed the Consent Form for Adult Inmates that notified them of their eligibility for special education services and recorded their consent.

For this aspect of our objective, we found certain issues during our testing. See Findings 3 and 4 for more information. During our testing, we also noted that special education services and evaluations are not being provided by local education agencies for I/Is who are potentially eligible for special education services. See Other Matters.

We used a combination of statistical and nonstatistical sampling methods for testing and did not project the results of our testing to any population.

Data Reliability Assessment

PPD I/I List

We obtained from DOC’s director of data analytics a list, from DOC’s PPD, of all I/Is (pre-trial, civilly committed, and criminally sentenced) who were in DOC’s custody at any point during the audit period. We also interviewed DOC officials who were knowledgeable about the data and tested certain information system general controls, including security management, access controls, configuration management, segregation of duties, and contingency planning. We reviewed the structured query language (SQL) query generated by DOC’s director of data analytics to verify that the information the query returned was from the audit period and that the correct fields were included. We tested the data to ensure that it did not contain any spreadsheet issues (e.g., hidden rows, columns, or workbooks; duplicate records; and data corresponding to dates outside the audit period). We selected a random sample of 20 out of 4,740 commitment identification numbers of I/Is (pre-trial, civilly committed, and criminally sentenced) who were admitted to DOC’s custody during the audit period. Additionally, we selected a random sample of 20 out of 8,279 commitment identification numbers of criminally sentenced I/Is who were in DOC’s custody during the audit period. For both samples, we traced the I/Is’ information (e.g., commitment identification numbers; commitment dates; names; dates of birth; active statuses; release dates, if applicable; primary languages; ADA fields; and intake high school diploma/equivalency) to their hardcopy six-part folders maintained at each DOC facility in which the I/Is in our samples were housed.
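Two of the spreadsheet checks described above, flagging duplicate records and dates outside the audit period, can be sketched as follows. The field names (`commitment_id`, `commitment_date`) are illustrative assumptions; the report does not describe the PPD extract’s schema, and such checks are commonly run in spreadsheet software rather than code:

```python
from datetime import date

# Audit period from the report: July 1, 2022 through June 30, 2024.
AUDIT_START, AUDIT_END = date(2022, 7, 1), date(2024, 6, 30)

def check_extract(rows):
    """Return a list of problems found in a data extract: duplicate
    commitment IDs and commitment dates outside the audit period."""
    seen, problems = set(), []
    for row in rows:
        cid = row["commitment_id"]
        if cid in seen:
            problems.append(f"duplicate record: {cid}")
        seen.add(cid)
        d = date.fromisoformat(row["commitment_date"])
        if not (AUDIT_START <= d <= AUDIT_END):
            problems.append(f"date outside audit period: {cid} ({d})")
    return problems
```

A clean extract returns an empty list; any findings would be followed up with the data owner before relying on the data.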

PPD User List and DOC Employee List

As part of our review of information system general controls, we obtained, from the Executive Office of Public Safety and Security’s director of application development and support, a PPD user list from the audit period, which included vendor employees. We also obtained a list of DOC employees from DOC’s Human Resources Department. We tested the data in these lists to ensure that they did not contain any spreadsheet issues (e.g., hidden rows, columns, or workbooks; duplicate records; and data corresponding to dates outside the audit period).

We selected a random sample of 20 out of 4,668 employees from a list of DOC employees from the Commonwealth’s official payroll system and an additional random sample of 20 out of 618 terminated employees from the list of employees that DOC provided to us. We verified their employment status with DOC by tracing employee information (i.e., employee identification number, name, employee status, job title, hire date, and termination date, if applicable) to employee information in the hardcopy personnel files maintained by DOC’s Human Resources Department.

We selected a random sample of 20 out of 6,723 PPD users from the PPD user list and traced user information (i.e., username, name, and job title) to the vendor and employee look-up on DOC’s intranet to determine whether they were DOC employees or vendor employees. We also compared the employee list to the PPD user list to determine when user access to PPD was granted to and terminated for employees, when applicable, and to ensure that user access was granted after employees’ official hire dates.
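The cross-match described above, comparing the employee list against the PPD user list to confirm that access was granted only on or after an employee’s official hire date, can be sketched as below. The field names are illustrative assumptions, since the report does not describe either list’s layout:

```python
from datetime import date

def access_granted_before_hire(employees, ppd_users):
    """Cross-match two extracts and return PPD user records whose
    access-grant date precedes the employee's official hire date."""
    hire_dates = {e["employee_id"]: e["hire_date"] for e in employees}
    return [
        u for u in ppd_users
        if u["employee_id"] in hire_dates
        and u["access_granted"] < hire_dates[u["employee_id"]]
    ]
```

An empty result supports the conclusion that user access was granted after employees’ official hire dates; any matches would be exceptions to investigate.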

Based on the data reliability assessment procedures described above, we determined that the data we obtained was sufficiently reliable for the purposes of our audit.

4.    Auditors use nonstatistical sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors believe they need to review.

5.    Auditors use judgmental sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors believe they need to review. Auditors use their knowledge and judgment to select the most appropriate sample. For example, an auditor might select items from areas of high risk. The results of testing using judgmental sampling cannot be used to make conclusions or projections about entire populations; however, they can be used to identify specific issues, risks, or weaknesses.

6.    Auditors use statistical sampling to select items for audit testing when a population is large (usually over 1,000) and contains similar items. Auditors generally use a statistical software program to choose a random sample when sampling is used. The results of testing using statistical sampling, unlike those from judgmental sampling, can usually be used to make conclusions or projections about entire populations.

7.    Confidence level is a mathematically based measure of the auditor’s assurance that the sample results (statistic) are representative of the population (parameter), expressed as a percentage.

8.    Expected error rate is the number of errors that are expected in the population, expressed as a percentage. It is based on the auditor’s knowledge of factors such as prior year results, the understanding of controls gained in planning, or a probe sample.

9.    The tolerable error rate (which is expressed as a percentage) is the maximum error in the population that is acceptable while still using the sample to conclude that the results from the sample have achieved the objective.

10.    An Offender Face Sheet is a hardcopy document used for DOC’s administrative purposes and includes, but is not limited to, an I/I’s personally identifiable information such as name, commitment number, commitment date, last known address, date of birth, and a picture of the I/I.

11.    According to Section 155.05 of Title 103 of the Code of Massachusetts Regulations, a six-part folder is defined as “an inmate record divided into six categories and the computerized inmate record used by the Department of Correction for the recording and/or filing of documents.”

Date published: December 19, 2025
