Overview
In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Developmental Services (DDS) for the period July 1, 2022 through June 30, 2024.
We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.
| Objective | Conclusion |
|---|---|
| | No; see Finding 1 |
| | No; see Finding 2 |
| | No; see Finding 3 |
| | No; see Finding 4 |
| | No; see Finding 5 |
To accomplish our audit objectives, we gained an understanding of the internal control environment relevant to our objectives by reviewing applicable policies and procedures and DDS’s internal control plan, as well as by interviewing DDS personnel. We evaluated the design of controls over DDS’s investigations, administrative reviews, MORs, and incident reports as part of our substantive testing.
In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.
Investigations
To determine whether DDS completed investigations, including their corresponding action plans and decision letters, within the timeframes required by 115 CMR 9.10(5), 9.13(1)(d), and 9.14(3), we took the following actions. We selected a statistical sample of 73 investigations from the population of 2,523 investigations that were initiated during the audit period. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range. We first determined whether any of these 73 investigations had a documented extension and, if not, whether any investigations were completed after more than 45 business days. We did this by calculating the number of business days between the case date, which represents when DDS received the complaint, and the decision letter date.
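The 45-business-day calculation described above can be sketched as follows. This is a minimal illustration only: the helper counts weekdays and does not account for Massachusetts state holidays, and the dates shown are hypothetical, not from the audit.

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count weekdays (Mon-Fri) after `start`, up to and including `end`.

    Illustrative only: the audit's definition of a business day may also
    exclude state holidays, which are not modeled here.
    """
    days = 0
    current = start
    while current < end:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days += 1
    return days

# Flag an investigation (with no documented extension) that exceeded
# 45 business days between the case date and the decision letter date.
case_date = date(2023, 1, 9)         # hypothetical case date
decision_letter = date(2023, 3, 20)  # hypothetical decision letter date
elapsed = business_days_between(case_date, decision_letter)
late = elapsed > 45  # True here: 50 business days elapsed
```

The same weekday-difference logic applies to the 30-business-day action plan check and the 15-business-day resolution letter check described later in this section.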
Next, we analyzed whether action plans were created within 30 business days after they were assigned to complaint resolution team (CRT) coordinators in accordance with 115 CMR 9. We reviewed the date when the decision letter was issued and the date when the action plan was created. We then calculated the number of business days between these two dates to identify whether any action plans were created after more than 30 business days.
To determine whether the 73 investigations in our sample were conducted in accordance with 115 CMR 9.10, we determined whether investigators took the following actions:
- conducted face-to-face interviews with alleged victims and alleged abusers;
- conducted interviews with people whom the investigator considered appropriate to interview (e.g., complainants or eyewitnesses);
- documented any refusals to be interviewed;
- reviewed pertinent documents (e.g., incident reports or medical records); and
- conducted site visits, if deemed necessary by DDS.
In addition, to determine whether DDS implemented our recommendations from our previous audit (Audit No. 2020-0234-3S), we requested and reviewed any routine audits performed during the audit period, as well as evidence of caseloads being monitored biweekly by senior investigators or the deputy director and of any action taken as a result. We also requested and reviewed the updated formal written procedures.
Lastly, we requested and reviewed the custom reports created to help monitor the timeliness of action plans.
For this objective, we found certain issues during our testing; namely, that DDS did not always issue decision letters or develop action plans for its investigations within the required timeframes. See Finding 1 for more information.
Administrative Reviews
To determine whether DDS conducted administrative reviews in accordance with the procedures and timeframes established in 115 CMR 9.11(1), 9.11(2), and 9.14(3)(b), we took the following actions. We generated a statistical sample of 74 administrative reviews from the population of 3,515 administrative reviews that were initiated during the audit period. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range.
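The sampling parameters used throughout this section (90% confidence level, 50% expected error rate, 20% desired precision range) can be related to a classical attribute-sampling size formula with a finite-population correction. The sketch below is illustrative only: the report does not specify the formula or software the auditors used, and the interpretation of the 20% precision range as ±10% is an assumption, so this will not necessarily reproduce the report's exact sample sizes.

```python
import math

def attribute_sample_size(confidence_z: float, error_rate: float,
                          precision: float, population: int) -> int:
    """Classical attribute-sampling size with finite-population correction.

    Illustrative only; actual audit sampling software may apply different
    conventions and adjustments.
    """
    # Base sample size for an effectively infinite population
    n0 = (confidence_z ** 2) * error_rate * (1 - error_rate) / precision ** 2
    # Finite-population correction
    n = n0 / (1 + n0 / population)
    return math.ceil(n)

# 90% confidence (z ≈ 1.645), 50% expected error rate, ±10% precision,
# applied to the administrative review population of 3,515
n = attribute_sample_size(1.645, 0.50, 0.10, 3515)
```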
We reviewed the date when each administrative review was finalized and the date when the resolution letter was created. We then calculated the number of business days between these two dates to determine whether any resolution letters were created after more than 15 business days.
Next, we determined whether administrative reviews were conducted in accordance with the procedures in 115 CMR 9.11. We reviewed each administrative review report from our sample. Information in each report included, but was not limited to, an allegation summary, immediate actions taken, and principal findings of the review. We then reviewed the information and determined whether each administrative review report was complete, whether each report was reviewed by the CRT, and whether each resolution letter was generated upon review by the CRT based on the administrative review report. To determine whether resolution letters were developed by CRTs within 30 business days in accordance with 115 CMR 9.14(3), we obtained and reviewed the date when each resolution letter was created and the date when it was issued. We then calculated the number of business days between these dates to determine whether any resolution letters were issued after more than 30 business days.
In addition, to determine whether DDS implemented our recommendations from our previous audit (Audit No. 2020-0234-3S), we requested and reviewed any documentation related to the development of best practices, minimum standards, and training tips for performing administrative reviews. Furthermore, we determined whether all principal findings were completed as evidence that DDS resolved the issue in its web-based system.
For this objective, we found certain issues during our testing, namely that DDS did not manage all administrative reviews properly. See Finding 2 for more information.
MORs
To determine whether DDS implemented policies and procedures to ensure that MORs were processed based on the recommendations from our prior audit (Audit No. 2020-0234-3S), we took the following actions. We requested and reviewed updated guidance and training materials and evidence of DDS working with the Medication Administration Program (MAP) coordinators to resolve any issues with MORs.
Furthermore, we generated a statistical sample of 75 non-hotline MORs from the population of 12,077 non-hotline MORs. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range. We also generated a nonstatistical sample of 35 hotline MORs from the population of 213 hotline MORs.
DDS MAP coordinators are required to review non-hotline MORs within 14 calendar days after providers finalize them. We reviewed the date when each of the 75 non-hotline MORs was finalized by the provider and the date when it was reviewed by a MAP coordinator in DDS’s web-based system. We calculated the number of calendar days between these two dates to determine whether any MORs took more than 14 calendar days to be reviewed after they were finalized.
DDS MAP coordinators are required to review hotline MORs within 14 calendar days after providers finalize them. We reviewed the date when each of the 35 hotline MORs was finalized by the provider, as well as the date when it was reviewed by a MAP coordinator. We calculated the number of calendar days between these two dates to determine whether any MOR reviews by MAP coordinators exceeded 14 calendar days.
To determine whether providers met submission timeframes for non-hotline MORs, we reviewed the date when each MOR event occurred and the date when the MOR was created in DDS’s web-based system. We calculated the number of calendar days between these two dates to determine whether any MORs took more than the allowed seven calendar days to be created. To determine whether providers met finalization timeframes for non-hotline MORs, we reviewed the date when each non-hotline MOR was created and the date when it was finalized by the provider. We then calculated the number of calendar days between these two dates to determine whether any non-hotline MORs took more than seven calendar days to be finalized after their creation.
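The seven-calendar-day creation and finalization checks described above reduce to simple date subtraction. The sketch below uses hypothetical dates, not audit data:

```python
from datetime import date

def days_late(start: date, end: date, allowed_days: int) -> int:
    """Calendar days by which `end` exceeded the allowed window after
    `start`; returns 0 if the deadline was met."""
    return max(0, (end - start).days - allowed_days)

# Seven calendar days are allowed from the MOR event to creation of the
# MOR, and another seven from creation to finalization by the provider.
event_date = date(2023, 5, 1)  # hypothetical MOR event date
created = date(2023, 5, 10)    # 9 days after the event -> 2 days late
finalized = date(2023, 5, 15)  # 5 days after creation -> on time

creation_delay = days_late(event_date, created, 7)     # 2
finalization_delay = days_late(created, finalized, 7)  # 0
```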
To determine whether providers met submission timeframes for hotline MORs, we reviewed the date when each MOR event was discovered and the date when the MOR was finalized. We calculated the number of calendar days between these two dates to determine whether any hotline MORs were finalized after more than the allowed 24 hours.
For this objective, we found certain issues during our testing; namely, that DDS did not ensure that MORs were created, finalized, and reviewed within the required timeframes. See Finding 3 for more information.
Incident Reports
To determine whether DDS implemented policies and procedures to monitor the accuracy and completeness of incident reports based on the recommendations from our prior audit (Audit No. 2020-0234-3S), we took the following actions. We requested and reviewed new custom reports that DDS sends to providers and evidence of senior investigators using the reports to identify and address concerns directly with providers.
Furthermore, we generated a statistical sample of 75 major-level incidents from the population of 7,852 major-level incidents. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range.
For our sample of major-level incidents, we conducted testing based on the timeframes for providers to submit and finalize major-level incident reports. We first reviewed the date when each incident was discovered and the date when the incident report was submitted in DDS’s web-based system. We calculated the number of calendar days between these two dates and determined whether any incident reports were submitted after more than the allowed 24 hours. We also reviewed the date when each incident report was submitted and the date when it was finalized. We then calculated the number of calendar days between these two dates to determine whether providers finalized their reports within the allowed seven calendar days after incident reports were submitted.
To determine whether providers submitted minor-level incident reports within the required timeframes, we generated a statistical sample of 75 minor-level incidents from the population of 49,714 minor-level incidents. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range.
For our sample of minor-level incidents, we conducted testing to determine whether providers met DDS’s timeframes for providers to submit and finalize minor-level incident reports. We first reviewed the date when each incident was discovered and the date when the corresponding incident report was submitted. We calculated the number of business days between these two dates and determined whether any incident reports were submitted after more than three business days. We also reviewed the date when each incident report was initially submitted and the date when it was finalized. We calculated the number of calendar days between these two dates to determine whether there were any incident reports that were finalized after more than the allowed seven calendar days.
Within our samples of major-level and minor-level incidents, we identified incident reports that were submitted to the Disabled Persons Protection Commission (DPPC) and incident reports that staff members noted were going to be reported to DPPC. We followed up with DDS about these incidents and determined whether all incidents that should have been reported to DPPC had been reported.
For this objective, we found certain issues during our testing; namely, that DDS did not ensure that its providers submitted and finalized incident reports within the required timeframes. See Finding 4 for more information.
ISPs
To determine whether DDS developed ISPs on behalf of recipients of DDS services to accommodate those who elected the self-determination option in accordance with 115 CMR 6.21 and 6.23(5) and Sections 19(b), (f), and (g) of Chapter 255 of the General Laws, we took the following actions. We generated a statistical sample of 75 recipients of DDS services with ISPs from the population of 22,593 recipients of DDS services with ISPs. To select the sample, we used a 90% confidence level, a 50% expected error rate, and a 20% desired precision range.
For our sample of recipients of DDS services with ISPs, we reviewed dates to determine whether the service coordinator collaborated with the aforementioned parties before the initial ISP meeting as well as whether providers satisfied their responsibilities 15 days before the initial ISP meeting for each of the 75 recipients of DDS services. We then calculated the number of calendar days between these dates to determine whether service coordinators and providers met timeframes before the first ISP meeting.
In addition, to determine whether ISPs were reviewed by the area director or facility director within 45 days after the initial ISP meeting in accordance with 115 CMR 6.23(5), we obtained from DDS’s web-based system, and reviewed, the date when the ISP meeting was held and the date when the area director or facility director approved or disapproved the ISP for the recipient of DDS services.
Lastly, to determine whether the self-determination option was offered to each recipient of DDS services, we determined whether the self-determination option was present within the web-based system. We also reviewed the samples within the web-based system to see whether the provider documented the DDS service recipients’ interest in the self-determination option.
For this objective, we found certain issues during our testing, namely that DDS did not offer the self-determination option to all recipients of DDS services enrolled in ISPs. See Finding 5 for more information.
We used a combination of statistical and nonstatistical sampling methods for testing, and we did not project the results of our testing to any corresponding populations.
Data Reliability Assessment
To determine the reliability of the data within DDS’s web-based system, we gained an understanding of the system and its controls, and we reviewed information security policies and procedures from DDS and the Executive Office of Health and Human Services. We also tested certain information system general controls, including security management, access controls, configuration management, contingency planning, and segregation of duties for DDS’s web-based system.
DPPC provided us with a list of intakes that it sent to DDS through the web-based system for the audit period. From this list, we selected a random sample of 25 intakes from the data provided by DPPC and then traced the DPPC intake number, screen date, and investigating/referral agency fields to the system to determine whether they matched.
Additionally, we tested the investigations, administrative reviews, incident reports, MORs, and ISPs to ensure that they did not contain certain record issues (i.e., duplicate records, missing values in necessary data fields, and dates that did not flow logically in the data).
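The record-issue tests described above amount to simple consistency checks over each dataset. A minimal sketch over hypothetical records follows; the field names and values are illustrative, not DDS's actual schema:

```python
from datetime import date

# Hypothetical records mimicking the kinds of fields screened during the
# data reliability assessment.
records = [
    {"id": "A1", "created": date(2023, 1, 5), "finalized": date(2023, 1, 9)},
    {"id": "A1", "created": date(2023, 1, 5), "finalized": date(2023, 1, 9)},  # duplicate
    {"id": "A2", "created": date(2023, 2, 1), "finalized": None},              # missing value
    {"id": "A3", "created": date(2023, 3, 10), "finalized": date(2023, 3, 1)}, # illogical dates
]

seen, duplicates, missing, illogical = set(), [], [], []
for rec in records:
    key = (rec["id"], rec["created"], rec["finalized"])
    if key in seen:          # duplicate record
        duplicates.append(rec["id"])
    seen.add(key)
    if rec["finalized"] is None:            # missing value in a necessary field
        missing.append(rec["id"])
    elif rec["finalized"] < rec["created"]:  # dates that do not flow logically
        illogical.append(rec["id"])
```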
Based on the results of the reliability assessment procedures described above, we determined that the data was sufficiently reliable for the purposes of our audit.
| Date published: | April 15, 2026 |
|---|---|