Audit of the Department of Revenue—Information Security Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Department of Revenue—Information Security

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor (OSA) has conducted a performance audit of certain activities of the Department of Revenue (DOR) for the period July 1, 2016 through December 31, 2018. For our review of DOR’s training programs, we used attendance records from April 19, 2018 through June 4, 2019.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

  1. Does DOR have a governing committee tasked with identifying, classifying, and mitigating information security risks?

     Conclusion: No; see Finding 1

  2. Has DOR designed and implemented user training programs and acknowledgement forms to protect personally identifiable information (PII)?

     Conclusion: Yes

  3. Have policies supporting PII protection been defined and documented?

     Conclusion: Yes

  4. Does DOR have documented and tested procedures to handle information security incidents?

     Conclusion: No; see Finding 2

  5. Does DOR manage risks with third-party vendors to meet Executive Office of Technology Services and Security (EOTSS) standards and National Institute of Standards and Technology (NIST) standard 800-53r4 related to the protection of PII?

     Conclusion: No; see Findings 3 and 4

We conducted this performance audit by using criteria from policies, procedures, and standards issued by DOR, as well as policies and standards issued by the Massachusetts Office of Information Technology (MassIT) before October 15, 2018. MassIT is the predecessor agency to EOTSS. In addition, we referred to enterprise security policies and standards issued by EOTSS, which had an effective date of October 15, 2018.

We also referred to NIST’s Special Publication 800-53, Revision 4, titled Security and Privacy Controls for Federal Information Systems and Organizations, and Special Publication 800-122, titled Guide to Protecting the Confidentiality of Personally Identifiable Information, as well as the Information Systems Audit and Control Association’s (ISACA’s) Control Objectives for Information and Related Technology 4.1. Although DOR is not required to follow these industry standards, OSA believes they represent best practices for information security.

We gained an understanding of the internal controls we deemed significant to our audit objectives through interviews and observations. To achieve our audit objectives, we conducted the following activities.

  • To assess whether DOR had a governing committee tasked with identifying, classifying, and mitigating information security risks, we performed the following procedures:
      ◦ We interviewed employees to determine whether an information technology (IT) strategy committee was in place and active at DOR.
      ◦ We reviewed both sets of meeting minutes from DOR’s security review board during the audit period to determine the content of the meetings and whether the board constituted an IT strategy committee.
  • To assess the design and implementation of DOR’s user training programs and acknowledgement forms aimed at protecting PII, we performed the following procedures:
      ◦ We reviewed both DOR’s and the Executive Office for Administration and Finance’s confidentiality policies to determine whether they complied with ISACA best practices.
      ◦ We reviewed DOR’s “Safeguarding DOR’s Critical Assets: Information and Public Trust” training program to determine whether it addressed the 2019 “Confidentiality Policy” and “Acceptable Use Policy.”
      ◦ We obtained DOR’s training attendance records from its Learning Management System (LMS) for the period April 19, 2018 through June 4, 2019, and reviewed records for all 1,950 users assigned to the 2018 “Safeguarding DOR’s Critical Assets: Information and Public Trust” program, as well as all 1,939 users required to sign the 2019 “Confidentiality Policy,” to determine whether the users completed these requirements.
  • To assess DOR’s IT security policies and determine whether DOR had policies and procedures covering all 16 of EOTSS’s information security policies and standards, we cross-referenced DOR’s policies and procedures with EOTSS’s.
  • To assess DOR’s security incident response procedures and determine whether they constituted a security incident response plan, we obtained and reviewed DOR’s “Security Incident Response Policy.”
  • To assess DOR’s risk management for its third-party contractors, we performed the following procedures:
      ◦ We reviewed a judgmental sample of 17 of the 22 DOR contracts with vendors identified by DOR officials as having received or accessed PII during our audit period to determine whether the contracts contained applicable information security and confidentiality provisions.
      ◦ We asked whether DOR had documented risk assessments for its third-party vendors.
      ◦ We reviewed DOR’s interdepartmental service agreement with EOTSS to determine whether it clearly spelled out each entity’s roles and responsibilities.

To assess the completeness and accuracy of training records from LMS, we interviewed the system administrator at DOR. In addition, we observed an LMS administrator obtaining the training records from LMS. We performed electronic tests to check for duplicate records and other abnormalities and assessed the query that was used to extract the data to determine its sufficiency for our testing purposes. To assess the accuracy of DOR’s list of vendors that received PII, we vouched1 vendors from this list to vendors in the Commonwealth Information Warehouse. However, because there was no other way for us to determine which vendors had access to PII, we relied on the list provided by DOR. Based on the results of these data reliability assessment procedures, we determined that the information obtained for our audit was sufficiently reliable for the purpose of our audit.

1.    Vouching is the act of comparing auditee-provided documentation to source documentation to determine the accuracy of data.
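The electronic tests for duplicate records and other abnormalities described above can be sketched as follows. This is a minimal illustration only: the record fields (user ID, course, completion date) and sample values are hypothetical and do not reflect DOR's actual LMS schema or data.

```python
from collections import Counter

# Hypothetical sample of LMS training-attendance records; the field
# names and values are illustrative, not DOR's actual data.
records = [
    {"user_id": "U001", "course": "Safeguarding 2018", "completion_date": "2018-05-01"},
    {"user_id": "U002", "course": "Safeguarding 2018", "completion_date": "2018-05-03"},
    {"user_id": "U001", "course": "Safeguarding 2018", "completion_date": "2018-05-01"},  # duplicate
    {"user_id": "U003", "course": "Safeguarding 2018", "completion_date": ""},            # missing date
]

def find_duplicates(rows):
    """Return (user, course) pairs that appear more than once."""
    counts = Counter((r["user_id"], r["course"]) for r in rows)
    return sorted(key for key, n in counts.items() if n > 1)

def find_missing_dates(rows):
    """Return user IDs whose completion date is blank (an abnormality)."""
    return sorted(r["user_id"] for r in rows if not r["completion_date"])

print(find_duplicates(records))     # -> [('U001', 'Safeguarding 2018')]
print(find_missing_dates(records))  # -> ['U003']
```

Flagged records would then be followed up with the LMS administrator rather than excluded automatically, consistent with the reliance on interviews and observation described above.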

Date published: December 13, 2019