Overview
In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Disabled Persons Protection Commission (DPPC) for the period July 1, 2021 through June 30, 2023.
We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.
| Objective | Conclusion |
|---|---|
| 1. Did DPPC either (1) ensure that IRs, Investigation Reports, and RI Reports were filed by an APS investigator in a timely manner or (2) record the reasons why these documents were not filed in a timely manner, in accordance with 118 CMR 5.02(4)(a)(1)(a), 5.02(4)(a)(2)(a), and 5.03(2); Section II(E) of DPPC’s Retaliation Investigations Procedure; and Section III(E) of DPPC’s Investigation Assignment, Monitoring and Timeframes Procedure? | Yes; see Other Matters |
| 2. To what extent did DPPC monitor the delivery of PSPs, as required by 118 CMR 7.03(3), 7.05(1), and 7.05(2)? | To a sufficient extent |
| 3. To what extent did DPPC maintain the Registry of Abusive Care Providers to ensure that entries were accurate, current, and in compliance with Sections 15(b) and 15(f) of Chapter 19C of the General Laws? | To a sufficient extent |
To accomplish our audit objectives, we gained an understanding of DPPC’s internal control environment relevant to our objectives by reviewing applicable policies and procedures and by interviewing DPPC management. In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.
IRs, Investigation Reports, and RI Reports
Regarding IRs, Investigation Reports, and RI Reports, to determine whether DPPC either (1) ensured that these documents were filed by an APS investigator in a timely manner or (2) recorded the reasons as to why these documents were not filed in a timely manner, in accordance with 118 CMR 5.02(4)(a)(1)(a), 5.02(4)(a)(2)(a), and 5.03(2); Section II(E) of DPPC’s Retaliation Investigations Procedure; and Section III(E) of DPPC’s Investigation Assignment, Monitoring and Timeframes Procedure, we took the actions described below.
IRs
We selected a random, statistical sample of 60 IRs from the population of 4,721 IRs that external protective service agencies filed with DPPC during the audit period, using a 95% confidence level, a 0% expected error rate, and a 5% tolerable error rate. By comparing each IR’s screening date to the date that the APS investigator filed it with DPPC, we determined whether each IR was filed with DPPC within 10 calendar days of the IR’s screening date.
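The sample size above is consistent with the standard zero-expected-error (discovery) attribute-sampling formula, and the timeliness test reduces to date arithmetic. The following is a minimal sketch in Python, assuming a hypothetical extract file with screening_date and filed_date columns (the audit’s actual CMS field names are not given):

```python
import math

import pandas as pd

# Zero-expected-error (discovery) sample size:
# n = ln(1 - confidence) / ln(1 - tolerable error rate)
#   = ln(0.05) / ln(0.95) ~= 58.4, which rounds up to 59 (the audit used 60).
n = math.ceil(math.log(1 - 0.95) / math.log(1 - 0.05))

# Hypothetical IR extract, one row per IR; column names are assumptions.
irs = pd.read_excel("ir_population.xlsx", parse_dates=["screening_date", "filed_date"])

# Draw a reproducible random sample of 60 IRs from the 4,721-IR population.
sample = irs.sample(n=60, random_state=1)

# An IR is timely if filed within 10 calendar days of its screening date;
# the same test applies with 45 days for Investigation Reports and
# 30 days for PSP recommendations.
sample["days_to_file"] = (sample["filed_date"] - sample["screening_date"]).dt.days
late = sample[sample["days_to_file"] > 10]
print(f"formula sample size: {n}; potentially late IRs in sample: {len(late)}")
```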
We met with DPPC officials and observed them retrieving from the case management system (CMS) the IRs that we initially calculated as being late. We took screenshots of each potentially late IR, along with any documentation that supported the delayed submission. For each IR confirmed as late, we determined whether it was accurately listed on DPPC’s monthly monitoring reports and whether notices were sent to the protective services agency, notifying it of the late IR.
Investigation Reports
We selected a random, statistical sample of 60 Investigation Reports from the population of 2,645 Investigation Reports that APS investigators filed with DPPC during the audit period, using a 95% confidence level, a 0% expected error rate, and a 5% tolerable error rate. By comparing each Investigation Report’s screening date to the date that the APS investigator filed it with DPPC, we determined whether each Investigation Report was filed with DPPC within 45 calendar days of the Investigation Report’s screening date.
We met with DPPC officials and observed them retrieving from the CMS the Investigation Reports that we initially calculated as being late. We took screenshots of each potentially late Investigation Report, along with any documentation that supported the delayed submission. For each Investigation Report confirmed as late, we determined whether it was accurately listed on DPPC’s monthly monitoring reports and whether notices were sent to the protective services agency, notifying it of the late Investigation Report.
We also determined whether any alleged abusers reported in each Investigation Report were accurately listed in DPPC’s Registry of Abusive Care Providers.
RI Reports
We tested the entire population of 10 RI Reports that DPPC APS investigators filed with DPPC’s director of investigations during the audit period. For each RI Report, we determined whether the APS investigator filed it with the director of investigations within 60 business days of the RI Report’s screening date.
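The 60-business-day test differs from the calendar-day tests above only in the day-count convention. A minimal sketch, again with hypothetical file and column names (numpy’s busday_count counts Monday through Friday; passing a holidays= list would also exclude state holidays):

```python
import numpy as np
import pandas as pd

# Hypothetical RI Report extract; file and column names are assumptions.
ri = pd.read_excel("ri_reports.xlsx", parse_dates=["screening_date", "filed_date"])

# Count business days (Monday-Friday) from screening to filing.
ri["business_days"] = np.busday_count(
    ri["screening_date"].values.astype("datetime64[D]"),
    ri["filed_date"].values.astype("datetime64[D]"),
)

# Flag any of the 10 RI Reports filed more than 60 business days after screening.
print(ri[ri["business_days"] > 60])
```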
We met with DPPC officials and observed them retrieving from the CMS the RI Reports that we initially calculated as being late. We took screenshots of each RI Report, along with any documentation that supported the delayed submission. For the one RI Report confirmed as late, we reviewed DPPC’s Investigation Planning Form and Investigation Case Activity Log to determine whether the DPPC APS investigator documented each step of the investigation by updating the Investigation Case Activity Log and by providing investigation information to their investigation manager during the investigation process.
See Other Matters for additional information.
PSPs
To determine to what extent DPPC monitored the delivery of PSPs, as required by 118 CMR 7.03(3), 7.05(1), and 7.05(2), we took the following actions. First, we selected a random, nonstatistical sample of 10 months out of the 24 months in the audit period and obtained the monthly monitoring report corresponding to each month in our sample. We then reviewed each monthly monitoring report to determine whether DPPC provided it to the appropriate protective services agency, notifying the agency of any PSPs that were late.
In addition, we selected a random, nonstatistical sample of 40 Investigation Reports from the population of 341 Investigation Reports that had substantiated cases of abuse for which a PSP should have been recommended by an APS investigator. We inspected each Investigation Report in the CMS and noted any documentation that showed the dates that the PSP recommendations were made. We compared the dates that an APS investigator filed each Investigation Report with DPPC to the dates that the PSPs were recommended (as recorded in DPPC’s CMS). Using this sample, we determined whether each PSP was recommended within 30 calendar days of the date of DPPC’s acceptance of the Investigation Report. We also determined whether alleged abusers listed in our sample were accurately placed on DPPC’s Registry of Abusive Care Providers.
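Both determinations in this step are mechanical once the dates and a registry extract are in hand. A minimal sketch, assuming hypothetical file and column names (acceptance_date, psp_recommended_date, and a shared abuser_id field):

```python
import pandas as pd

# Hypothetical extracts; file and column names are assumptions.
reports = pd.read_excel(
    "substantiated_reports.xlsx",
    parse_dates=["acceptance_date", "psp_recommended_date"],
)
registry = pd.read_excel("registry.xlsx")  # one row per registered abuser

# Reproducible sample of 40 of the 341 substantiated Investigation Reports.
sample = reports.sample(n=40, random_state=1)

# A PSP is timely if recommended within 30 calendar days of DPPC's
# acceptance of the Investigation Report.
elapsed = (sample["psp_recommended_date"] - sample["acceptance_date"]).dt.days
print("late PSP recommendations:", (elapsed > 30).sum())

# Each substantiated abuser in the sample should appear on the registry.
missing = sample[~sample["abuser_id"].isin(registry["abuser_id"])]
print("substantiated abusers missing from the registry:", len(missing))
```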
For this objective, we found no significant issues during our testing. Therefore, we concluded that, based on our testing, DPPC met the relevant criteria regarding monitoring the delivery of PSPs.
Registry of Abusive Care Providers
To determine to what extent DPPC maintained the Registry of Abusive Care Providers to ensure that entries were accurate, current, and in compliance with Sections 15(b) and 15(f) of Chapter 19C of the General Laws, we took the following actions. First, we obtained both audit summary reports issued during the audit period. We inspected each audit summary report to determine whether it was submitted no later than October 31 of the corresponding year to the clerks of the House of Representatives and the Senate, the House and Senate Committees on Ways and Means, and the Joint Committee on Children, Families and Persons with Disabilities. We also reviewed each audit summary report to determine whether it contained the required information listed in Section 15(f) of Chapter 19C of the General Laws, as follows:
- the number of substantiated findings of abuse found or not found to have been registrable;
- the number of people on the registry;
- the number of people who were added to the registry in the last fiscal year;
- the number of substantiated findings of registrable abuse that were appealed in the last fiscal year;
- the number of substantiated findings of registrable abuse that were overturned on appeal in the last fiscal year;
- the number of requests made by employers for information from the registry and the number of such requests that were granted in the last fiscal year;
- the total number of instances in the last fiscal year in which the commission failed to notify the department or the last known employer of a care provider who was placed on the registry and the reasons for such failures; and
- the number of employers found to have failed to meet the requirements of subsection (d) [of Section 15 of Chapter 19C of the General Laws] in the last fiscal year.
We then reconciled the number of people added to the Registry of Abusive Care Providers for each fiscal year during the audit period to the number of people reported as newly added in the corresponding audit summary report to determine whether the information in the report was accurate.
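This reconciliation reduces to grouping registry additions by Massachusetts fiscal year (July 1 through June 30) and comparing the counts to the reported figures. A minimal sketch with hypothetical column names and placeholder reported totals:

```python
import pandas as pd

# Hypothetical registry extract; the reported totals below are placeholders.
registry = pd.read_excel("registry_additions.xlsx", parse_dates=["added_date"])

# Massachusetts fiscal years run July 1 through June 30, so an addition
# dated July 1 or later belongs to the next fiscal year.
registry["fiscal_year"] = registry["added_date"].dt.year + (
    registry["added_date"].dt.month >= 7
).astype(int)

counted = registry.groupby("fiscal_year").size()

# Totals as reported in each audit summary report (placeholder values).
reported = {2022: 30, 2023: 39}
for fy, n_cms in counted.items():
    print(fy, "CMS count:", n_cms, "reported:", reported.get(fy, "not reported"))
```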
Additionally, we selected a random, nonstatistical sample of 20 names out of the population of 69 names on the Registry of Abusive Care Providers. We traced each of the 20 names in our sample to the corresponding Investigation Reports in DPPC’s CMS and determined whether each person named was a substantiated abuser, which warranted their name being listed on the Registry of Abusive Care Providers. Further, we determined whether the Registry of Abusive Care Providers was current by comparing it to the samples we selected during our testing of Objectives 1 and 2, which included cases of substantiated abuse.
For this objective, we found no significant issues during our testing. Therefore, we concluded that, based on our testing, DPPC met the relevant criteria regarding maintaining the Registry of Abusive Care Providers.
We used a combination of statistical and nonstatistical sampling methods for testing, and we did not project the results of our testing to the populations corresponding to IRs, Investigation Reports, RI Reports, PSPs, or the Registry of Abusive Care Providers.
Data Reliability Assessment
To determine the reliability of the data from DPPC’s computer network system, we reviewed a System and Organization Controls 2 Type 2 Report and mapped it to relevant control areas, including DPPC’s security management, configuration management, segregation of duties, contingency planning, and access controls. We tested all 68 employees hired during the audit period to determine whether they had completed initial cybersecurity training. We also selected a nonstatistical, random sample of 8 employees out of the 142 who had computer network access during the audit period and tested whether these 8 employees completed annual cybersecurity training, whether background checks were performed at their time of hire, and whether their computer user access rights matched their titles and positions. We also determined whether employees who were terminated during the audit period had their computer network access removed. In addition, we interviewed DPPC management who were knowledgeable about the data. We observed DPPC officials query and extract the following information from the CMS:
- a total of 4,805 IRs (of which 4,721 were from within the audit period and thus constituted our testing population);
- a total of 3,012 Investigation Reports (of which 2,645 were from within the audit period and thus constituted our testing population);
- a total of 10 RI Reports;
- a total of 341 investigations with substantiated cases of abuse committed during the audit period (and which constituted our testing population);
- a total of 352 CMS users (of whom 142 were specifically DPPC employees active during the audit period and thus constituted our testing population); and
- a total of 69 registered abusers who were added to the Registry of Abusive Care Providers during the audit period.
The chief of quality assurance and audit officer then provided these records to us in Microsoft Excel spreadsheets. We ensured that the number of CMS records we observed for each document type listed above matched the corresponding number of records in the Excel spreadsheets. We performed a date range test to ensure that our testing only involved records from within the audit period. We tested the data to ensure that it did not contain any spreadsheet issues (e.g., hidden objects such as names, rows, columns, or workbooks; duplicate records; or missing values in necessary data fields).
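A sketch of how such spreadsheet integrity tests might look in Python, assuming a hypothetical extract file and column names; the hidden-object check uses openpyxl, and the remaining tests use pandas:

```python
import pandas as pd
from openpyxl import load_workbook

FILE = "cms_extract.xlsx"  # hypothetical file name

# Hidden sheets and defined names can conceal records in a workbook.
wb = load_workbook(FILE)
hidden_sheets = [ws.title for ws in wb.worksheets if ws.sheet_state != "visible"]
print("hidden sheets:", hidden_sheets, "| defined names:", list(wb.defined_names))

df = pd.read_excel(FILE, parse_dates=["screening_date"])  # assumed column names

# Date range test: every record must fall within the audit period.
in_period = df["screening_date"].between("2021-07-01", "2023-06-30")
print("records outside the audit period:", (~in_period).sum())

# Duplicate and completeness tests on the fields needed for testing.
print("duplicate records:", df.duplicated().sum())
print("missing values:", df[["case_id", "screening_date"]].isna().sum().to_dict())
```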
Specifically for testing the CMS users list, we also traced all 142 DPPC CMS users from our population to the Human Resources Compensation Management System, which is the Commonwealth’s official payroll system, to ensure that each CMS user in our population was an active DPPC employee during the audit period.
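This trace is a set-difference check between the two extracts. A minimal sketch, assuming both systems export a common identifier (a hypothetical employee_id column):

```python
import pandas as pd

# Hypothetical extracts; a shared employee_id column is an assumption.
cms_users = pd.read_excel("cms_users.xlsx")      # the 142 DPPC CMS users
payroll = pd.read_excel("hrcms_employees.xlsx")  # active DPPC employees in HR/CMS

# Any CMS user not traceable to the payroll system warrants follow-up.
unmatched = set(cms_users["employee_id"]) - set(payroll["employee_id"])
print("CMS users not traced to the payroll system:", sorted(unmatched))
```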
Based on the results of the data reliability assessment procedures described above, we determined that the information we obtained was sufficiently reliable for the purposes of our audit.
Date published: June 30, 2025