Office of the State Auditor

Audit of the Office of Consumer Affairs and Business Regulation Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Office of Consumer Affairs and Business Regulation.


Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Office of Consumer Affairs and Business Regulation (OCABR) for the period July 1, 2022 through June 30, 2023.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective  Conclusion

  1. Was OCABR’s website in compliance with the Executive Office of Technology Services and Security’s (EOTSS’s) Enterprise Information Technology Accessibility Policy and the World Wide Web Consortium’s (W3C’s) Web Content Accessibility Guidelines (WCAG) 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility?
No; see Finding 1

  2. Did OCABR do the following to implement certain information technology (IT) governance policies:
    a. establish classification or sensitivity levels of all information of which it had custody in accordance with Section 6.2 of EOTSS’s Asset Management Standard IS.004;
    b. identify and securely delete stored information that exceeded defined retention periods on a quarterly basis in accordance with Section 6.4.2.4 of EOTSS’s Asset Management Standard IS.004; and
    c. conduct a business impact analysis or risk assessment to determine the classification level of information systems in accordance with Section 6.6.2 of EOTSS’s Asset Management Standard IS.004?
No; see Findings 2, 3, and 4

  3. Did OCABR restrict access to personally identifiable information (PII) to a narrow subset of personnel members who had a business need to access the information in accordance with Section 6.2.1 of EOTSS’s Asset Management Standard IS.004?
No; see Finding 5

To accomplish our audit objectives, we gained an understanding of the aspects of OCABR’s internal control environment relevant to our objectives by reviewing applicable policies and procedures and by interviewing OCABR staff members and management. In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.

Website Accessibility Testing

To determine whether OCABR’s website was in compliance with EOTSS’s Enterprise Information Technology Accessibility Policy and W3C’s WCAG 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility, we took the actions described below.

We selected a random, nonstatistical sample of 60 OCABR webpages out of a population of 783 OCABR webpages. We performed the procedures described below on the sampled webpages.
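
The sampling step described above can be sketched as follows. This is a minimal illustration only: the page identifiers and the fixed seed are assumptions for reproducibility, not OCABR's actual data or the auditors' actual method.

```python
import random

def select_sample(population, sample_size, seed=2023):
    """Draw a random, nonstatistical sample without replacement."""
    rng = random.Random(seed)  # fixed seed makes the draw repeatable
    return rng.sample(population, sample_size)

# Illustrative population of 783 webpage identifiers (hypothetical names)
webpages = [f"page-{i:03d}" for i in range(1, 784)]
sample = select_sample(webpages, 60)
```

Because the sample is nonstatistical, its results are not projected to the full population, consistent with the note later in this section.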

User Accessibility

  • We determined whether content on the website was able to be viewed in both portrait and landscape modes.
  • We determined whether content on the webpage was undamaged and remained readable when zoomed to 200% and 400%.

Keyboard Accessibility

  • We determined whether all elements[10] of the webpage could be navigated using only a keyboard.
  • We determined whether any elements on the webpage prevented a user from moving to a different element when using only a keyboard to navigate the webpage.
  • We determined whether the first focusable control[11] was a hyperlink that redirected to the main content of the website. The first focusable control is known as either a bypass block or a skip link.
  • We determined whether the website contained a title that was relevant to website content.
  • We determined whether there was a search function present to help users locate content.
  • We determined whether related hyperlinks allowed navigation to the intended webpage.
  • We determined whether headings on the webpage related to the content of each heading’s section.
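
The bypass-block check above can be approximated programmatically. The sketch below, which uses only Python's standard-library HTML parser, treats a page as passing if the first focusable element is an in-page anchor; the sample markup and the heuristic itself are illustrative assumptions, not the auditors' actual tooling.

```python
from html.parser import HTMLParser

# Common focusable element types (a simplifying assumption; real pages
# may also make elements focusable via tabindex).
FOCUSABLE = {"a", "button", "input", "select", "textarea"}

class FirstFocusableFinder(HTMLParser):
    """Record the tag and attributes of the first focusable element."""
    def __init__(self):
        super().__init__()
        self.first = None

    def handle_starttag(self, tag, attrs):
        if self.first is None and tag in FOCUSABLE:
            self.first = (tag, dict(attrs))

def has_skip_link(html):
    """Heuristic: True if the first focusable control is an in-page
    anchor (a bypass block / skip link)."""
    finder = FirstFocusableFinder()
    finder.feed(html)
    if finder.first is None:
        return False
    tag, attrs = finder.first
    return tag == "a" and attrs.get("href", "").startswith("#")

# Hypothetical markup for illustration
page = '<body><a href="#main-content">Skip to main content</a><nav></nav></body>'
```

A page whose first focusable control is, say, a menu button rather than a skip link would fail this check.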

Language

  • We determined whether video content found within the website had all important sounds and dialogue captioned.
  • We determined whether the language of the webpage was tagged with the correct language attribute.[12]
  • We determined whether words that appeared on the webpage matched the language to which the webpage was set.
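The language-attribute check can be sketched with the standard-library HTML parser, reading the lang attribute of the html element. The sample document is a hypothetical placeholder, not an actual OCABR page.

```python
from html.parser import HTMLParser

class LangAttributeChecker(HTMLParser):
    """Capture the lang attribute of the <html> element, if any."""
    def __init__(self):
        super().__init__()
        self.lang = None

    def handle_starttag(self, tag, attrs):
        if tag == "html" and self.lang is None:
            self.lang = dict(attrs).get("lang")

def page_language(html):
    """Return the declared page language, or None if untagged."""
    checker = LangAttributeChecker()
    checker.feed(html)
    return checker.lang

# Hypothetical English-language page
doc = '<html lang="en"><head><title>OCABR</title></head><body></body></html>'
```

A page returning None here would lack the language tag that screen readers rely on for correct pronunciation.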

Error Identification

  • We determined whether mandatory form fields alerted users if the field was left blank.
  • We determined whether there was a label for elements that required user input.
  • We determined whether the label was programmed correctly.
  • We determined whether there were examples given to assist the user in correcting mistakes (for example, a warning when entering a letter in a field meant for numbers).
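
The label check above can be sketched as matching each input field's id against the for attributes of the page's labels. The form markup and field names below are illustrative assumptions.

```python
from html.parser import HTMLParser

class FormLabelAudit(HTMLParser):
    """Collect input-field ids and the ids referenced by <label for=...>."""
    def __init__(self):
        super().__init__()
        self.input_ids = set()
        self.label_targets = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag in ("input", "select", "textarea") and "id" in a:
            self.input_ids.add(a["id"])
        elif tag == "label" and "for" in a:
            self.label_targets.add(a["for"])

def unlabeled_fields(html):
    """Return ids of input fields with no matching <label for=...>."""
    audit = FormLabelAudit()
    audit.feed(html)
    return audit.input_ids - audit.label_targets

# Hypothetical form: "zip" is labeled, "phone" is not
form = ('<form><label for="zip">ZIP code</label><input id="zip">'
        '<input id="phone"></form>')
```
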

Color Accessibility

  • We determined whether there was at least a 3:1 contrast in color and additional visual cues to distinguish hyperlinks, which WCAG recommends for users with colorblindness or other visual impairments.
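
The 3:1 threshold can be computed from WCAG 2.1's definitions of relative luminance and contrast ratio. The sketch below implements those published formulas; the specific colors tested are illustrative.

```python
def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) color in 0-255, per WCAG 2.1."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG 2.1 contrast ratio between two colors (ranges 1:1 to 21:1)."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)),
        reverse=True,
    )
    return (lighter + 0.05) / (darker + 0.05)
```

For example, a conventional link blue such as (0, 0, 238) against a white background comfortably exceeds the 3:1 minimum, while black on white yields the maximum 21:1 ratio.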

Out of the 60 webpages we selected in our sample, 18 had been removed from OCABR’s website by the time we began our testing. OCABR management informed us that these 18 webpages were for events that had already occurred and had therefore been removed.

See Finding 1 regarding hyperlinks on OCABR’s website.

IT Governance Testing

We took the following actions to determine whether OCABR established IT governance policies and procedures over the areas listed below.

Information Classification Policy

To determine whether OCABR’s information classification policy met the requirements of Section 6.2 of EOTSS’s Asset Management Standard IS.004, we interviewed knowledgeable OCABR staff members and requested OCABR’s information classification policy. We learned that OCABR did not have an information classification policy in place during the audit period.

See Finding 2 regarding OCABR’s information classification policy.

Information Disposal Plan and Procedures

To determine whether OCABR’s information disposal procedures met the requirements of Section 6.4.2.4 of EOTSS’s Asset Management Standard IS.004, we interviewed knowledgeable OCABR staff members and requested OCABR’s information disposal plan and procedures. We were informed that OCABR had not established a procedure for disposing of stored information that exceeds defined retention periods during the audit period.

See Finding 3 regarding OCABR’s information disposal procedures.

Business Impact Analysis or Risk Assessment to Determine Information System Classification

To determine whether OCABR conducted a business impact analysis or risk assessment in accordance with Section 6.6.2 of EOTSS’s Asset Management Standard IS.004, we interviewed knowledgeable OCABR staff members and requested OCABR’s business impact analysis or risk assessment used to determine the classification levels of its information systems. We were informed that OCABR did not conduct a business impact analysis or risk assessment to determine the classification levels of its information systems.

See Finding 4 regarding OCABR’s business impact analysis and/or risk assessment.

Restricted Access to PII

To determine whether OCABR restricted access to PII to the narrow subset of personnel members who had a business need to access the information in accordance with Section 6.2.1 of EOTSS’s Asset Management Standard IS.004, we took the actions described below. We requested that knowledgeable OCABR staff members identify personnel members on the OCABR employee list who had access to PII. Then, we selected a random, nonstatistical sample of 20 employees out of a population of 52 employees who had access to PII. For each employee in our sample, we then inspected IT tickets and emails to determine whether these personnel members were granted the approvals needed before gaining access to PII.

See Finding 5 regarding OCABR’s authorization process for access to PII.

Due to the small testing populations, we used nonstatistical sampling methods for testing and therefore did not project the results of our testing to any population.

Data Reliability Assessment

Web Accessibility

To determine the reliability of the site map spreadsheet we received from OCABR management, we interviewed knowledgeable OCABR staff members and checked that variable formats (e.g., dates, unique identifiers, or abbreviations) were accurate. Additionally, we ensured that there was no abbreviation of data fields, no missing data (e.g., hidden rows or columns, blank cells, or absent records), no duplicate records, and that all values in the dataset corresponded with expected values.

We selected a random sample of 20 uniform resource locators (URLs)[13] that could be accessed independently from the OCABR site map and traced each to the corresponding webpage, checking that each URL and webpage title matched the information on the OCABR website. We also selected a random sample of 20 URLs from OCABR’s website and traced each URL and webpage title to the site map to ensure that there was a complete and accurate population of URLs on the site map.
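
The two-direction tracing described above amounts to comparing the two sets of URLs against each other. A minimal sketch, with hypothetical placeholder URLs rather than actual OCABR pages:

```python
def two_way_trace(site_map_urls, website_urls):
    """Trace in both directions: site-map entries missing from the
    website (accuracy), and website URLs missing from the site map
    (completeness)."""
    site_map = set(site_map_urls)
    website = set(website_urls)
    return {
        "on_map_not_on_site": sorted(site_map - website),
        "on_site_not_on_map": sorted(website - site_map),
    }

# Illustrative inputs; in the audit, each direction used a random
# sample of 20 URLs rather than the full populations shown here.
site_map = ["https://mass.gov/ocabr/a", "https://mass.gov/ocabr/b"]
website = ["https://mass.gov/ocabr/b", "https://mass.gov/ocabr/c"]
result = two_way_trace(site_map, website)
```

Empty lists in both directions would indicate a complete and accurate site map.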

IT Governance

To determine the reliability of the employee list we received from OCABR management, we interviewed OCABR management and knowledgeable OCABR staff members and checked that variable formats (e.g., dates, unique identifiers, or abbreviations) were accurate. Additionally, we ensured that there was no abbreviation of data fields, no missing data (e.g., hidden rows or columns, blank cells, or absent records), no duplicate records, and that all values in the dataset corresponded with expected values.

We selected a random sample of 10 employees from the employee list and traced their names to CTHRU, the Commonwealth’s statewide payroll open records system, to verify the list’s accuracy. We also selected a random sample of 10 employees from CTHRU and traced their names back to the employee list provided by OCABR to ensure that we received a complete and accurate employee list.

Based on the results of the data reliability assessment procedures described above, we determined that the site map and the employee list we obtained during the course of our audit were sufficiently reliable for the purposes of our audit.

10.    An element is a part of a webpage that contains data, text, or an image.

11.    The first focusable control is the first element a user will be brought to on a webpage when navigating with a keyboard.

12.    A language tag identifies the native language of the content on the webpage or PDF (e.g., a webpage in English should have an EN language tag). The language tag is listed in the webpage’s or PDF’s properties. This, among other things, is used to help screen readers use the correct pronunciation for words.

13.    A URL uniquely identifies an internet resource, such as a website.

Date published: May 5, 2025
