Office of the State Auditor

Audit of the Department of Public Health Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Public Health


Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Department of Public Health (DPH) for the period July 1, 2016 through June 30, 2018.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is our audit objective, indicating the question we intended our audit to answer, the conclusion we reached regarding the objective, and where the objective is discussed in the audit findings.

Objective 1: Does DPH appropriately review and respond to reported alleged incidents, including abuse, neglect, mistreatment, and misappropriation, that negatively affect residents at licensed nursing homes?

Conclusion: Not always; see Findings 1, 2, 3, and 4.

We gained an understanding of the internal controls we deemed significant to our audit objectives and evaluated the design and effectiveness of those controls for management oversight of the intake and investigation process. We completed two internal control tests. The first control test determined whether each intake was reviewed by two unique members of the intake staff, as outlined in the Complaint Unit Intake Process Manual. Our second control test determined whether all intakes designated for on-site investigation surveys had management review documented on the Complaint Processing Summary form generated to track the administrative life of an investigation.
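The two control tests above can be sketched as simple checks over intake records. This is only an illustration with invented field names ("reviewers", "disposition", and so on), not DPH's actual system or schema:

```python
# Illustrative sketch of the two internal control tests; records and
# field names are hypothetical, not DPH's actual data model.

def two_reviewer_check(intake):
    """Control test 1: each intake must be reviewed by two unique intake staff members."""
    return len(set(intake.get("reviewers", []))) >= 2

def management_review_check(intake):
    """Control test 2: intakes designated for on-site investigation surveys must
    have management review documented on the Complaint Processing Summary form."""
    if intake.get("disposition") != "on-site investigation":
        return True  # this control applies only to on-site investigation intakes
    return intake.get("management_review_documented", False)

intakes = [
    {"id": "A1", "reviewers": ["jones", "smith"],
     "disposition": "on-site investigation", "management_review_documented": True},
    {"id": "A2", "reviewers": ["jones", "jones"],
     "disposition": "on-site investigation", "management_review_documented": False},
]

exceptions = [i["id"] for i in intakes
              if not (two_reviewer_check(i) and management_review_check(i))]
print(exceptions)  # A2 fails both controls
```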

We interviewed management and staff members of the Division of Health Care Facility Licensure and Certification’s (DHCFLC’s) Complaint Unit to understand their processes, from when an intake is first reported to when an on-site investigation survey is completed. We identified and reviewed key laws, regulations, and policies that dictate how the Complaint Unit’s intake and investigation program operates when the Complaint Unit responds to alleged incidents at nursing homes. We observed the processing of complaints by information systems key to the intake and investigation program. We identified data in the Health Care Facility Reporting System (HCFRS) and the Automated Survey Process Environment (ASPEN) that could support our substantive testing.

HCFRS and ASPEN Data

To facilitate our planned substantive tests, we worked with DPH’s information technology (IT) personnel and extracted a dataset from HCFRS that included all intakes from our audit period. The data included intake submission dates, facility identifiers, incident types, harm types, and disposition types selected by Complaint Unit personnel.

We also received a dataset from the Centers for Medicare & Medicaid Services (CMS) that included all intakes that occurred during our audit period and were entered in ASPEN. The data included some overlapping fields with HCFRS intake data, as well as unique fields specific to the prioritization and survey processes.

We interviewed DPH IT personnel and DHCFLC personnel to better understand the fields and data elements in both datasets. We filtered the HCFRS dataset that we extracted to include only HCFRS intakes linked to licensed nursing homes designated for on-site investigation surveys, which totaled 1,781 records. We filtered the ASPEN dataset to include only intakes with nursing home facility identifiers. We matched the rows of the two datasets and then joined the two, forming one dataset that captured intakes from initial reporting to investigation survey completion. We reviewed the joined dataset, removed any duplicates and extraneous ASPEN rows that did not join to HCFRS intakes, and separated the remaining 1,781 rows of data into categories based on key data elements.
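The matching and joining procedure above amounts to an identifier-based join of the two filtered datasets. A minimal sketch in plain Python, with hypothetical field names (the report does not describe the tools or schema DPH's data actually uses):

```python
# Hypothetical sketch of the HCFRS/ASPEN matching and joining step.
# Field names ("intake_id", "facility_id", ...) are illustrative only.

hcfrs = [  # filtered HCFRS intakes designated for on-site investigation surveys
    {"intake_id": "I-100", "facility_id": "NH-01", "incident_type": "abuse"},
    {"intake_id": "I-101", "facility_id": "NH-02", "incident_type": "neglect"},
]
aspen = [  # filtered ASPEN intakes with nursing home facility identifiers
    {"intake_id": "I-100", "priority": "Non-IJ High", "survey_id": "S-900"},
    {"intake_id": "I-100", "priority": "Non-IJ High", "survey_id": "S-900"},  # duplicate
    {"intake_id": "I-999", "priority": "Non-IJ Medium", "survey_id": "S-901"},  # no HCFRS match
]

# Index ASPEN rows by intake identifier; setdefault keeps the first copy,
# so exact duplicates are dropped.
aspen_by_id = {}
for row in aspen:
    aspen_by_id.setdefault(row["intake_id"], row)

# Join: keep every HCFRS intake and attach ASPEN fields where a match exists.
# Extraneous ASPEN rows with no HCFRS counterpart fall away automatically.
joined = [{**h, **aspen_by_id.get(h["intake_id"], {})} for h in hcfrs]
print(len(joined))  # one row per HCFRS intake
```

The HCFRS side drives the join, mirroring the text: every one of the 1,781 HCFRS intakes survives, while unmatched ASPEN rows are discarded.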

Investigation Completion Test Methodology

Because our Complaint Processing Summary control test indicated a high risk that intakes reported to DPH and designated for on-site investigation surveys might not be investigated, we tested all 1,781 HCFRS intakes designated for on-site investigation surveys at nursing homes to verify that investigations were completed. After our data join, we observed 170 HCFRS intakes that required additional information to show that DPH had completed an investigation survey. We asked DPH to provide us with evidence that it had responded to these 170 intakes. We reviewed all responses provided and determined whether they showed that (1) an investigation survey did occur or (2) another response type was more appropriate for the case and DPH could show its rationale. Intakes that did not have an on-site investigation survey when one was required were noted as exceptions. We reviewed these exceptions with DPH program management to determine whether they agreed that surveys should have occurred in those instances.

Timeliness of Intake Prioritization Test Methodology

CMS’s State Operations Manual requires that all Non–Immediate Jeopardy (Non-IJ) intakes be prioritized within two working days. DPH’s staff told us that intakes were backlogged during our audit period. Therefore, we determined that there was a high risk that intakes were not prioritized in a timely manner. Using the joined dataset of 1,781 HCFRS intakes, we identified a population of 1,567 nursing home intakes that were designated for on-site investigation at priority Non-IJ High by DPH’s intake staff and were linked with investigation survey identifiers in ASPEN. We selected a random statistical sample of 200 intakes from the population of 1,567, using a confidence level of 95% with a sampling error range of 13.3% and an assumed noncompliance rate of 50%. For each intake in our test sample, we found the date the intake was submitted to DPH, the date DPH completed its prioritization of the intake, and the number of working days between those two dates. Intakes that took more than two working days to prioritize were noted as exceptions.
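The working-day measurement underlying this test can be sketched with a simple business-day counter. This is a minimal version that counts Monday through Friday only and ignores holidays, which a real compliance calculation would also have to exclude; the dates are invented for the example:

```python
from datetime import date, timedelta

def working_days_between(start, end):
    """Count working days (Mon-Fri) after `start` up to and including `end`.
    Minimal sketch: ignores state holidays, unlike a production calculation."""
    days = 0
    current = start
    while current < end:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days += 1
    return days

# A hypothetical intake submitted on Friday, March 3, 2017 and prioritized
# the following Wednesday took 3 working days -- an exception under the
# two-working-day rule in CMS's State Operations Manual.
elapsed = working_days_between(date(2017, 3, 3), date(2017, 3, 8))
print(elapsed)  # 3
is_exception = elapsed > 2
```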

On-Site Investigation Timeliness Test Methodology

During the interview process, Complaint Unit staff members informed us that Non-IJ High on-site investigations were not always started within the required timeframes. To test DPH’s compliance with on-site investigation timeliness requirements, we used the same population of 1,567 nursing home intakes designated for on-site investigation at priority Non-IJ High that we had used for our intake prioritization test. We also used the same random statistical sample of 200 intakes for our on-site investigation timeliness test that we had used for our intake prioritization timeliness test. The projection has a confidence level of 95% with a sampling error range of 13.3% and an assumed noncompliance rate of 50%. For each intake in our test sample, we identified the date DPH completed its final prioritization, the date the investigation staff arrived at the facility to begin the investigation survey, and the number of working days between those two dates to determine whether the investigation began within the required timeframe.

Attorney General’s Office Notification Test Methodology

During program inquiries with the Complaint Unit manager, intake supervisor, and intake staff, we determined that DPH keeps a paper record of the fax cover letters for all intakes referred to the state Attorney General’s Office (AGO). These cover letters are addressed to AGO and contain an ASPEN intake identifier. We determined that this fax cover letter was sufficient both to demonstrate that DPH referred a given intake to AGO and to allow us to assess compliance with the requirement to refer abuse to AGO in accordance with Section 72H of Chapter 111 of the General Laws.

To assess whether DPH notifies AGO when it receives intakes alleging abuse, neglect, mistreatment, or misappropriation, we identified a population of all HCFRS intakes that were designated for on-site investigation with an incident type of abuse, neglect, mistreatment, or misappropriation, totaling 430 intakes. We then selected a nonstatistical random sample of 40. For each case in our sample, we verified whether DPH had a record of a fax cover letter addressed to AGO with the specific intake identifier included. The results of the testing cannot be projected to this population.
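The selection of the nonstatistical sample can be sketched with Python's standard library. The intake identifiers here are hypothetical, and the report does not describe the tooling the audit team actually used to draw its sample:

```python
import random

# Hypothetical population of 430 intake identifiers designated for on-site
# investigation with an incident type of abuse, neglect, mistreatment,
# or misappropriation.
population = [f"I-{n:04d}" for n in range(430)]

rng = random.Random(2019)            # fixed seed so the draw is reproducible
sample = rng.sample(population, 40)  # nonstatistical random sample of 40

# random.sample draws without replacement, so all 40 intakes are distinct.
print(len(sample), len(set(sample)))
```

Because the sample is nonstatistical, its results describe only the 40 cases tested and, as the report notes, cannot be projected to the population of 430.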

Refer to Other Agency Test Methodology

When DPH determines that a case falls under another agency’s jurisdiction, it refers the case to that agency. We confirmed with Complaint Unit personnel that all available documentation associated with an intake’s referral to another agency is documented with the intake in HCFRS in the form of timestamped comments and scanned attachments. To assess whether DPH follows its Complaint Unit Intake Process Manual, we identified the population of all HCFRS intakes designated as “Refer to Other Agency” in our audit period, totaling 697 intakes. We selected a nonstatistical random sample of 50. We then verified that each intake had evidence that DPH faxed the complaint to the agency, received a follow-up from the agency, and reviewed the agency follow-up. The results of the testing cannot be projected to this population.

Recertification Survey Cited Deficiencies Follow-Up Test Methodology

After reviewing the recertification survey scores for 393 licensed nursing homes, we identified the “Nursing” category as the section of the survey most pertinent to our audit objective. We selected a judgmental sample of the 19 lowest-scoring nursing homes in the “Nursing” category (those with a score below 25) and requested the Statement of Deficiencies (SOD), which notes any instances of facility noncompliance found by DPH during the recertification survey, for each survey in our sample. We also requested and reviewed all follow-up SODs and each nursing home’s plan to correct the cited deficiencies to verify that DPH followed up with each facility until the cited deficiencies were corrected.
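The judgmental selection above is a filter-and-sort over the recertification scores. A small sketch with invented facility names and scores (the real scores are not reproduced in this report):

```python
# Illustrative selection of the judgmental sample: nursing homes scoring
# below 25 in the "Nursing" category. Facilities and scores are invented.

facilities = [
    ("Facility A", 86), ("Facility B", 24), ("Facility C", 19),
    ("Facility D", 47), ("Facility E", 12),
]

low_scorers = sorted((f for f in facilities if f[1] < 25), key=lambda f: f[1])
print(low_scorers)  # lowest first: E (12), C (19), B (24)
```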

Data Reliability Assessment

We assessed the reliability of the data from HCFRS and ASPEN that we used to support our data matching and joining procedures and ultimately our substantive testing. For both datasets, we concluded that the data were sufficiently reliable for the matching and joining procedures and for our substantive testing. We performed the following data reliability assessment tests:

  • a “trace to and from” procedure using two samples of 25 sources (written intakes) and HCFRS data
  • an intake report date validity check to confirm that all intakes were within our audit period
  • a duplicate check to confirm that no duplicate HCFRS identifiers or duplicate rows were present in our data
  • a comparative check after the joining procedure to confirm for each row that all facility identifiers in ASPEN data matched exactly with all facility identifiers in HCFRS data
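The automated checks in the list above (date validity, duplicates, and the post-join facility identifier comparison) can be sketched as assertions over the joined dataset. Field names and rows are hypothetical, not DPH's schema:

```python
from datetime import date

# Hypothetical joined rows; field names are illustrative only.
rows = [
    {"intake_id": "I-100", "report_date": date(2017, 5, 1),
     "hcfrs_facility_id": "NH-01", "aspen_facility_id": "NH-01"},
    {"intake_id": "I-101", "report_date": date(2018, 2, 14),
     "hcfrs_facility_id": "NH-02", "aspen_facility_id": "NH-02"},
]

AUDIT_START, AUDIT_END = date(2016, 7, 1), date(2018, 6, 30)

# Date validity check: every intake report date falls within the audit period.
dates_ok = all(AUDIT_START <= r["report_date"] <= AUDIT_END for r in rows)

# Duplicate check: no repeated HCFRS intake identifiers or rows.
ids = [r["intake_id"] for r in rows]
no_duplicates = len(ids) == len(set(ids))

# Comparative check: ASPEN and HCFRS facility identifiers match on every row.
facilities_match = all(r["hcfrs_facility_id"] == r["aspen_facility_id"] for r in rows)

print(dates_ok, no_duplicates, facilities_match)  # True True True
```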
Date published: September 11, 2019