
Audit of the Massachusetts Commission for the Blind Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Massachusetts Commission for the Blind.


Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of the Massachusetts Commission for the Blind (MCB) for the period July 1, 2016 through December 31, 2017.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is our audit objective, which states the question our audit was intended to answer, along with the conclusion we reached and where it is discussed in the audit findings.

Objective 1: Does MCB ensure that vocational rehabilitation (VR) program information in its client case management system, Accessible Web-Based Activity and Reporting Environment (AWARE), is complete and accurate?

Conclusion: Partially; see Findings 1 and 2

To achieve our audit objective, we gained an understanding of the internal control environment by reviewing MCB’s internal control plan and applicable laws, regulations, and agency policies and procedures, as well as conducting interviews with MCB staff members and managers. We tested the design and effectiveness of the controls we determined to be relevant to our audit objective, specifically MCB’s annual review of Rehabilitation Services Administration (RSA) standards, performance indicators, and edit checks. Additionally, we performed the following procedures.

To obtain an understanding of MCB’s process for tracking consumer satisfaction with its VR program, we interviewed MCB management to determine how MCB tracks consumer complaints. We also sent surveys to the six MCB regional directors to determine how VR consumers’ complaints about VR service delivery are recorded in each case-service record.

RSA requires that VR eligibility determinations that take more than 60 days from the time of application be explained in the case files and include evidence that consumers have agreed to the delay. To determine whether MCB complied with this requirement, we selected a judgmental sample of 7 of 13 consumers whose eligibility determinations took more than 60 days and reviewed their case files to determine whether the causes for, and consumers’ agreement to, the delays were appropriately documented.
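The timeliness test described above can be sketched in code. The following is a minimal illustration, not MCB's actual procedure; the field names and the `delay_agreement_documented` flag are assumptions made for the example.

```python
from datetime import date

def overdue_determinations(cases, limit_days=60):
    """Flag case IDs whose eligibility determination took more than
    `limit_days` from the application date and whose file does not
    document the consumer's agreement to the delay.
    Field names here are illustrative, not actual MCB data fields."""
    flagged = []
    for c in cases:
        elapsed = (c["determination_date"] - c["application_date"]).days
        if elapsed > limit_days and not c.get("delay_agreement_documented", False):
            flagged.append(c["case_id"])
    return flagged
```

In practice the auditors reviewed a judgmental sample of case files by hand; a check like this simply shows the rule being tested: more than 60 elapsed days requires documented consumer agreement.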

We determined the reliability of controls in AWARE by interviewing MCB’s AWARE database administrator and policy consultant and by testing certain general information technology controls over account management; security training; and identification and authentication policies and procedures. Further, we verified the functionality of various application controls in AWARE, including signature requirements and completion of required fields.

We extracted and reviewed consumer demographic data from AWARE. In reviewing the data, we noted that the information on how consumers were referred to the VR program indicated that most were “self-referrals” rather than the expected “physician referrals.” We brought this matter to the attention of MCB management, who told us that when information on MCB cases was imported from MCB’s legacy system (System 7) to AWARE, AWARE defaulted to indicating that all consumers were self-referred even when the referrals had come from their physicians. Additionally, in reviewing the data, we noted many instances where a required data field indicating the consumer’s race was missing (null values; see footnote 4), which raised concerns about the integrity of the information in AWARE.

To determine the number of null value errors that may have occurred when MCB transferred its case information from System 7 to AWARE, we obtained from MCB a copy of a Case Service Report dated December 1, 2017 (before the data transfer on December 4, 2017), compared the information on this report to a Case Service Report generated by MCB on December 31, 2017 (after the transfer), and identified any changes in null values.
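A before-and-after null count comparison of this kind can be sketched as follows. This is an illustrative example only; the field names and record layout are assumptions, not the actual Case Service Report format.

```python
def null_counts(records, fields):
    """Count blank or missing values for each listed field across all
    records. A value counts as null if it is None or whitespace-only."""
    counts = {f: 0 for f in fields}
    for rec in records:
        for f in fields:
            value = rec.get(f)
            if value is None or str(value).strip() == "":
                counts[f] += 1
    return counts

def null_value_changes(before, after, fields):
    """Return the fields whose null counts differ between two report
    snapshots, mapped to a (before_count, after_count) pair."""
    b, a = null_counts(before, fields), null_counts(after, fields)
    return {f: (b[f], a[f]) for f in fields if b[f] != a[f]}
```

Running such a comparison on the December 1 and December 31 reports would surface any field, such as race, whose count of blank values changed across the data transfer.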

To further assess the integrity of the data, we selected a nonstatistical random sample of 40 of the 416 VR cases in AWARE with application dates that were within our audit period and before the data import from System 7 on December 4, 2017. We then identified five RSA-required fields within our sample that we expected would be the same both before and after AWARE was implemented (application date, birth date, gender, race, and referral source) and compared the information in these data fields in AWARE to the corresponding information in System 7.
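A field-by-field sample comparison like the one above can be sketched in code. The sketch below assumes each system's cases can be represented as a mapping from case ID to a field dictionary; these structures and names are illustrative, not the actual AWARE or System 7 schemas.

```python
import random

# The five RSA-required fields compared in the audit.
COMPARED_FIELDS = ["application_date", "birth_date", "gender", "race", "referral_source"]

def sample_and_compare(aware, legacy, sample_size, fields=COMPARED_FIELDS, seed=None):
    """Draw a nonstatistical random sample of case IDs from `aware` and
    report, per sampled case, any field whose value differs from the
    legacy system, as a (legacy_value, aware_value) pair.
    `aware` and `legacy` map case ID -> dict of field values."""
    rng = random.Random(seed)
    sampled = rng.sample(sorted(aware), sample_size)
    discrepancies = {}
    for case_id in sampled:
        legacy_rec = legacy.get(case_id, {})
        diffs = {f: (legacy_rec.get(f), aware[case_id].get(f))
                 for f in fields
                 if aware[case_id].get(f) != legacy_rec.get(f)}
        if diffs:
            discrepancies[case_id] = diffs
    return discrepancies
```

Because the sample is nonstatistical, any discrepancies found this way describe only the sampled cases, not the full population of 416.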

We also traced federal fiscal year 2017 applicant data from AWARE to the federal fiscal year 2017 applicant data reported by MCB (as part of its mandated reporting to RSA) and to the federal fiscal year 2017 applicant data in System 7.
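A three-way reconciliation of applicant data can be sketched as below. The function and its inputs are hypothetical simplifications: it assumes applicant records can be reduced to ID sets and that the RSA submission is available as a count.

```python
def reconcile_applicants(aware_ids, rsa_reported_count, legacy_ids):
    """Reconcile applicant IDs in AWARE against the count reported to RSA
    and the applicant IDs in the legacy system. All inputs illustrative."""
    aware, legacy = set(aware_ids), set(legacy_ids)
    return {
        "aware_count": len(aware),
        "matches_rsa_report": len(aware) == rsa_reported_count,
        "missing_from_legacy": sorted(aware - legacy),
        "missing_from_aware": sorted(legacy - aware),
    }
```

Any mismatch in the count or either set difference would mark applicant records to trace back to the underlying case files.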

Whenever sampling was used, we applied a nonstatistical sampling approach, and as a result, we could not project our results to the entire population.

4.    Null values are data fields that are blank or for which no information is available.

Date published: March 28, 2019
