Audit of the Department of Developmental Services Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Developmental Services.

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Developmental Services (DDS) for the period July 1, 2017 through June 30, 2019.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer, the conclusion we reached regarding each objective, and where each objective is discussed in the audit findings.

  1. Did DDS conduct investigations within 45 business days after investigators were assigned and in accordance with the procedures established in Sections 9.10(5), 9.13(1)(d), 9.13(2)(a), and 9.14(3) of Title 115 of the Code of Massachusetts Regulations (CMR) for complaints associated with clients residing at privately operated group homes?

     Conclusion: No; see Findings 1 and 2

  2. Did DDS conduct administrative reviews in accordance with the procedures and timeframes in 115 CMR 9.11(1), 9.11(2), and 9.14(3)(b) for complaints associated with clients residing at privately operated group homes?

     Conclusion: No; see Finding 3

  3. Did DDS process incident reports associated with clients residing at provider-operated group homes in accordance with its Incident Management Guidelines?

     Conclusion: No; see Findings 4 and 6

  4. Were medication occurrence reports (MORs) that were associated with clients living at privately operated group homes processed in accordance with DDS’s MOR Overview Guide?

     Conclusion: No; see Finding 5

To achieve our audit objectives, we gained an understanding of the internal control environment we determined to be relevant to the objectives by reviewing DDS’s internal control plan and applicable agency policies and procedures. We also conducted interviews with DDS managers and other staff members. We reviewed, and tested the operating effectiveness of, controls related to incident reports.

We performed the following procedures to obtain sufficient, appropriate audit evidence to address the audit objectives.

Investigations

To determine whether investigations were completed within 45 business days, we selected a random, nonstatistical sample of 50 investigations from a population of 501 investigations initiated during the audit period. We first determined whether any of these 50 investigations had a documented extension and, if not, whether they were completed in more than 45 business days. We did this by calculating the number of business days between the case date (the date DDS received the complaint, which started the 45-business-day clock) and the decision letter date.
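
As an illustration, the elapsed-time test above reduces to a business-day count between two dates. The following Python sketch is a minimal example, not the audit team’s actual tooling; the dates are hypothetical, and the helper ignores state holidays. The same calculation applies to the other business-day tests in this section.

    from datetime import date, timedelta

    def business_days_between(start: date, end: date) -> int:
        """Count weekdays (Monday-Friday) after start, up to and including end."""
        days, current = 0, start
        while current < end:
            current += timedelta(days=1)
            if current.weekday() < 5:  # weekday() returns 0-4 for Monday-Friday
                days += 1
        return days

    # Flag an investigation that ran past the 45-business-day limit.
    case_date = date(2018, 1, 2)              # complaint received; the clock starts
    decision_letter_date = date(2018, 3, 20)  # decision letter issued
    if business_days_between(case_date, decision_letter_date) > 45:
        print("Late unless a documented extension exists")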

To determine whether action plans were created within 30 business days after they were assigned to complaint resolution team (CRT) coordinators and investigations were conducted in accordance with the procedures in 115 CMR 9, we generated a random, nonstatistical sample of 60 investigations from our population of 501 initiated during the audit period. For these 60 investigations, we reviewed and documented from the Home and Community Services Information System (HCSIS) the date the decision letter was issued and the date the action plan was created. We then calculated the number of business days between the two dates to identify whether any action plans were created after more than 30 business days.

To determine whether these 60 investigations were conducted in accordance with 115 CMR 9.10, we determined whether investigators took the following actions:

  • conducted face-to-face interviews with alleged victims and alleged abusers
  • interviewed other people the investigator considered it appropriate to interview (e.g., the complainant or eyewitnesses)
  • documented any refusals to be interviewed
  • reviewed pertinent documents (e.g., incident reports, medical records)
  • conducted site visits if DDS deemed them necessary.

Administrative Reviews

To determine whether administrative reviews were submitted to the assigned CRT coordinators within 15 business days, we generated a random, nonstatistical sample of 60 from a population of 1,297 administrative reviews initiated during the audit period. We reviewed, and documented from HCSIS, the date each administrative review was finalized and the date the resolution letter was created. We then calculated the number of business days between the two dates to identify any resolution letters created after more than 15 business days.

Using the same sample of 60 of the 1,297 administrative reviews, we determined whether administrative reviews were conducted in accordance with the procedures in 115 CMR 9.11. We reviewed each administrative review report from our sample in HCSIS. The information in each report included, but was not limited to, an allegation summary, immediate actions taken, and the principal findings of the review. We then reviewed this information and determined whether each administrative review report was complete, whether the CRT reviewed each report, and whether each resolution letter was generated from the CRT’s review of the administrative review report. To determine whether CRTs developed resolution letters within 30 business days in accordance with 115 CMR 9.14(3), we obtained from HCSIS, and reviewed, the date each resolution letter was created and the date it was issued. We then calculated the number of business days between the dates to identify resolution letters that were issued after more than 30 business days.
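
A simplified sketch of the completeness check described above, assuming each administrative review report is represented as a dictionary of HCSIS fields (the field names here are hypothetical):

    REQUIRED_FIELDS = ["allegation_summary", "immediate_actions", "principal_findings"]

    def is_complete(report: dict) -> bool:
        """Return True only if every required field is present and nonempty."""
        return all(report.get(field) for field in REQUIRED_FIELDS)

    report = {
        "allegation_summary": "Summary of the allegation...",
        "immediate_actions": "Actions taken by the provider...",
        "principal_findings": "",  # blank field, so the report is incomplete
    }
    print(is_complete(report))  # False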

Incident Reports

To determine whether providers submitted major-level incident reports within their required timeframes, we generated a random, nonstatistical sample of 60 from a population of 2,273 major-level incidents.

For major-level incidents, we conducted two tests based on the timeframes for providers to submit and finalize major-level incident reports. We first reviewed the date each incident was discovered and the date the incident report was submitted in HCSIS. We calculated the number of calendar days between the two dates and identified the incident reports that were submitted after more than the allowed 24 hours. We also reviewed the date each incident report was submitted and the date it was finalized in HCSIS. We then calculated the number of calendar days between the two dates to determine whether providers finalized their reports within the required seven calendar days after incident reports were submitted.
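
The two major-incident tests amount to simple date arithmetic, sketched below with hypothetical timestamps and field names; this is an illustration of the logic, not the audit team’s actual code.

    from datetime import datetime

    discovered = datetime(2018, 5, 1, 9, 0)   # incident discovered
    submitted = datetime(2018, 5, 2, 14, 30)  # incident report submitted in HCSIS
    finalized = datetime(2018, 5, 12, 10, 0)  # incident report finalized in HCSIS

    # Test 1: the report must be submitted within 24 hours of discovery.
    if (submitted - discovered).total_seconds() > 24 * 3600:
        print("Submitted after the allowed 24 hours")

    # Test 2: the report must be finalized within 7 calendar days of submission.
    if (finalized - submitted).days > 7:
        print("Finalized after the allowed 7 calendar days")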

To determine whether providers submitted minor-level incident reports within the required timeframes, we generated a random, nonstatistical sample of 60 from a population of 20,355 minor-level incidents.

We conducted two tests to determine whether providers met DDS’s timeframes for minor-level incidents. We first reviewed the date each incident was discovered and the date the corresponding incident report was submitted in HCSIS. We calculated the number of business days between the two dates and identified which incident reports were submitted after more than three business days. We also reviewed the date each incident report was initially submitted and the date it was finalized in HCSIS. We calculated the number of calendar days between the two dates to identify those that were finalized after more than the seven calendar days allowed.

Within our samples of 60 major-level and 60 minor-level incidents, we identified incident reports that were submitted to the Disabled Persons Protection Commission (DPPC) and incident reports that staff members noted were going to be reported to DPPC. We followed up with DDS about these incidents and verified that all incidents that should have been reported to DPPC had been reported.

MORs

To determine whether DDS reviewed, and providers submitted, MORs in accordance with DDS policies and procedures, we generated a random, nonstatistical sample of 60 regular MORs from a population of 8,407; we also reviewed all 136 hotline MORs submitted during the audit period.

DDS Medication Administration Program (MAP) coordinators were required to review hotline MORs within 14 calendar days after the providers finalized them. We reviewed the date each of the 136 hotline MORs was finalized by the provider, as well as the date it was reviewed by a MAP coordinator in HCSIS. We calculated the number of calendar days between the two dates to identify MORs that took more than 14 calendar days.

DDS MAP coordinators were also required to review regular MORs within 14 calendar days after providers finalized them. We reviewed the date each regular MOR in our sample was finalized by the provider and the date it was reviewed by a MAP coordinator in HCSIS. We calculated the number of calendar days between the two dates to identify MORs that took more than 14 calendar days to be reviewed after they were finalized.

To determine whether providers met submission timeframes for hotline MORs, we reviewed the date each MOR event (i.e., medication omission or violation of one of the “five Rs”) was discovered and the date the MOR was finalized in HCSIS. We calculated the number of calendar days between the two dates to determine which hotline MORs were finalized after more than the 24 hours allowed.

To determine whether providers met submission timeframes for regular MORs, we reviewed the date each MOR event occurred and the date the MOR was created in HCSIS. We calculated the number of calendar days between the two dates to determine which MORs took more than the seven calendar days allowed. To determine whether providers met finalization timeframes for regular MORs, we reviewed the date each regular MOR was created and the date it was finalized by the provider in HCSIS. We then calculated the number of calendar days between the two dates to determine which regular MORs were finalized after more than the seven calendar days allowed.
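
Taken together, the regular-MOR tests form a chain of three calendar-day checks, sketched below with hypothetical dates; this is an illustration of the logic, not the audit team’s actual code.

    from datetime import date

    event_date = date(2019, 2, 1)       # MOR event occurred
    created_date = date(2019, 2, 11)    # MOR created in HCSIS
    finalized_date = date(2019, 2, 15)  # MOR finalized by the provider
    reviewed_date = date(2019, 3, 10)   # MOR reviewed by a MAP coordinator

    checks = [
        ("created within 7 calendar days of the event", (created_date - event_date).days <= 7),
        ("finalized within 7 calendar days of creation", (finalized_date - created_date).days <= 7),
        ("reviewed within 14 calendar days of finalization", (reviewed_date - finalized_date).days <= 14),
    ]
    for label, passed in checks:
        print(label + ":", "pass" if passed else "fail")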

Because of the unique circumstances of each investigation, administrative review, incident report, and MOR (i.e., there were many different CRTs, area offices, and types of complaints investigated), we used nonstatistical sampling and therefore did not project our results to the entire population.
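
For reference, a random, nonstatistical selection of this kind can be drawn with Python’s standard library. The record identifiers below are placeholders, and the fixed seed is shown only to make the illustration reproducible.

    import random

    # Hypothetical identifiers for the population of 501 investigations.
    population = [f"INV-{i:04d}" for i in range(1, 502)]

    random.seed(2019)                       # fixed seed for reproducibility
    sample = random.sample(population, 50)  # 50 records drawn without replacement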

Data Reliability

To determine the reliability of the data in HCSIS, we gained an understanding of HCSIS and its controls. We interviewed information technology personnel and reviewed information security policies and procedures from DDS and the Executive Office of Health and Human Services. Specifically, we reviewed policies and procedures for security management, access controls, configuration management, contingency planning, and segregation of duties. We interviewed HCSIS administrators to review password management. We also tested whether HCSIS users at DDS had been granted appropriate access to the system, had completed security awareness training, had been screened before being granted access, and had been removed as users when applicable. We identified issues with security awareness training for 2019 (see Finding 6).

Further, for investigations and administrative reviews, we traced 9,184 DPPC case numbers from DPPC’s case management system, FileMaker Pro, and confirmed that they had all been uploaded to HCSIS. In response to our inquiry, DDS confirmed that system controls prevented intake information from being altered once it was uploaded. We also conducted electronic testing to identify any duplicates, blank fields, or key dates that did not flow logically in the data, and we discussed with DDS, and verified, that any blank fields were by design and not due to error.
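
The tracing step amounts to a set comparison between the two systems. A minimal sketch, assuming the case numbers from each system have been exported to Python lists (the variable names and values are hypothetical):

    # dppc_case_numbers: the 9,184 case numbers exported from FileMaker Pro
    # hcsis_case_numbers: the case numbers found in HCSIS
    dppc_case_numbers = ["C-0001", "C-0002", "C-0003"]   # placeholder data
    hcsis_case_numbers = ["C-0001", "C-0002", "C-0003"]  # placeholder data

    missing = set(dppc_case_numbers) - set(hcsis_case_numbers)
    print("All cases traced" if not missing else f"Missing from HCSIS: {sorted(missing)}")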

Additionally, for incident reports, we conducted electronic testing to identify any duplicates, blank fields, or key dates that did not flow logically in the data in HCSIS. We determined whether incident reports were reviewed by one or two levels of DDS personnel depending on the incident level.

We also conducted electronic testing of MOR data in HCSIS to identify any duplicates, blank fields, or key dates that did not flow logically in the data.
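
The electronic tests described in the last three paragraphs follow the same pattern; the pandas sketch below illustrates it for MOR data, assuming an HCSIS extract with these hypothetical file and column names.

    import pandas as pd

    df = pd.read_csv(
        "hcsis_mor_extract.csv",  # hypothetical extract file
        parse_dates=["event_date", "created_date", "finalized_date"],
    )

    # Duplicates: more than one row sharing the same record identifier.
    duplicates = df[df.duplicated(subset="mor_id", keep=False)]

    # Blank fields: any key date missing.
    blanks = df[df[["event_date", "created_date", "finalized_date"]].isna().any(axis=1)]

    # Key dates that do not flow logically: a record finalized before it was
    # created, or created before the underlying event occurred.
    illogical = df[
        (df["created_date"] < df["event_date"])
        | (df["finalized_date"] < df["created_date"])
    ]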

Based on the reliability procedures performed above, we determined that the HCSIS data were sufficiently reliable for the purposes of our audit.

Date published: June 29, 2021
