Audit of the Operational Services Division Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Operational Services Division.

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Operational Services Division (OSD) for the period July 1, 2021 through December 31, 2022.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective 1. Did OSD’s Mass.gov website comply with the Executive Office of Technology Services and Security’s (EOTSS’s) Enterprise Information Technology Accessibility Policy and the Web Content Accessibility Guidelines (WCAG) 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility?

Conclusion: No; see Finding 1

Objective 2. Did OSD ensure that all contracts posted to its COMMBUYS website complied with EOTSS’s Enterprise Information Technology Accessibility Policy and WCAG 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility?

Conclusion: No; see Findings 2, 3, and 4

Objective 3. Did OSD establish information technology (IT) governance policies and procedures over the following areas:

  1. business continuity and disaster recovery plans that met the requirements of Sections 6.1.1.4 and 6.2.1 of EOTSS’s Business Continuity and Disaster Recovery Standard IS.005;
  2. an information security incident response plan and procedures that met the requirements of Sections 6.5.1 and 6.5.2 of EOTSS’s Information Security Incident Management Standard IS.009; and
  3. cybersecurity awareness training that met the requirements of Sections 6.2.3 and 6.2.4 of EOTSS’s Information Security Risk Management Standard IS.010?

Conclusion: No; see Findings 5 and 6

To accomplish our audit objectives, we gained an understanding of the aspects of OSD’s internal control environment relevant to our objectives by reviewing applicable policies and procedures and by interviewing OSD staff members and management.

To obtain sufficient, appropriate evidence to address our audit objectives, we performed the following procedures.

Web Accessibility

To determine whether OSD’s Mass.gov website met EOTSS’s Enterprise Information Technology Accessibility Policy and WCAG 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility, we tested a random, nonstatistical sample of 35 out of a total of 189 OSD Mass.gov webpages in the audit population. We performed the following procedures on the sampled webpages.
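For illustration, a selection like this can be reproduced with a short script. The sketch below is a minimal example, not the audit team’s actual tooling; the URLs are hypothetical stand-ins for the 189 webpages in the site map.

```python
import random

# Hypothetical stand-in for the audit population: the 189 OSD Mass.gov
# webpage URLs from the site map provided by OSD management.
population = [f"https://www.mass.gov/osd/page-{i:03d}" for i in range(1, 190)]

rng = random.Random(2022)            # fixed seed so the draw can be re-created
sample = rng.sample(population, 35)  # random selection without replacement

print(f"Selected {len(sample)} of {len(population)} webpages for testing")
```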

User Accessibility

  • We determined whether the webpage could be viewed in both portrait and landscape modes.
  • We determined whether, when zoomed in to 200%, content on the webpage was undamaged and remained readable.
  • We determined whether, when zoomed in to 400%, content on the webpage was undamaged and in a single column.

Keyboard Accessibility

  • We determined whether all elements[7] of the webpage could be navigated using only a keyboard.
  • We determined whether any elements on the webpage prevented a user from moving to a different element when using only a keyboard to navigate the webpage.
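Checks like these are ordinarily performed by hand, but the keyboard-trap test can be approximated with browser automation. The sketch below is an illustration only and assumes the Playwright library; the audit does not name any tooling. It presses Tab repeatedly and flags the case where focus stops advancing, a rough signal of a keyboard trap.

```python
# Rough sketch of a keyboard-trap check (assumes: pip install playwright,
# then playwright install chromium). Not the audit team's actual method.
from playwright.sync_api import sync_playwright

URL = "https://www.mass.gov/orgs/operational-services-division"  # example page

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL)

    previous = None
    for presses in range(1, 201):  # upper bound on Tab presses
        page.keyboard.press("Tab")
        # Record which element currently holds keyboard focus.
        focused = page.evaluate(
            "() => document.activeElement && document.activeElement.outerHTML"
        )
        if focused == previous:  # focus did not move: possible keyboard trap
            print(f"Focus stuck after {presses} presses at: {focused}")
            break
        previous = focused
    else:
        print("Focus kept advancing; no keyboard trap detected.")
    browser.close()
```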

Navigation Accessibility

  • We determined whether there was a search function present to help users locate content.
  • We determined whether related hyperlinks allowed navigation to the intended webpage.

Language

  • We determined whether words that appeared on the webpage matched the language tag[8] to which the webpage was set by examining its properties.
  • We determined whether proper names were identified in PDF files included on the webpage to avoid improper translation or pronunciation errors from screen readers.
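The first of these checks amounts to reading the lang attribute of the page’s <html> element and comparing it against the language of the visible text. A minimal sketch of the attribute-extraction step, using only the Python standard library and a hypothetical page source:

```python
from html.parser import HTMLParser

class LangTagFinder(HTMLParser):
    """Captures the lang attribute of the <html> element, if present."""
    def __init__(self):
        super().__init__()
        self.lang = None

    def handle_starttag(self, tag, attrs):
        if tag == "html" and self.lang is None:
            self.lang = dict(attrs).get("lang")

# Hypothetical page source; a real check would fetch the live webpage.
source = '<html lang="en"><head><title>OSD</title></head><body>...</body></html>'
finder = LangTagFinder()
finder.feed(source)
print("Declared language tag:", finder.lang)  # -> en
```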

Error Identification

  • We determined whether there was text explaining why an error occurred when a user input information into an entry field.
  • We determined whether there were examples given to assist the user in correcting mistakes (for example, a warning when entering a letter in a field meant for numbers).

Color Accessibility

  • We determined whether there was at least a 3:1 contrast in color and additional visual cues to distinguish hyperlinks, which WCAG recommends for users with colorblindness or other visual impairments.
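The 3:1 threshold comes from WCAG’s contrast-ratio formula, which compares the relative luminance of the two colors. The sketch below implements the formula as published in WCAG 2.1; the sample colors are illustrative, not taken from the audited pages.

```python
def _linearize(c8):
    """Linearize one sRGB channel (0-255) per the WCAG 2.1 definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio; always at least 1:1."""
    lighter, darker = sorted(
        (relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Illustrative check: default link blue against a white background.
ratio = contrast_ratio((0, 0, 238), (255, 255, 255))
print(f"{ratio:.2f}:1 -> {'passes' if ratio >= 3 else 'fails'} the 3:1 threshold")
```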

See Finding 1 for an issue we identified regarding hyperlinks on OSD’s Mass.gov website.

Contract Accessibility

To determine whether contracts posted to OSD’s COMMBUYS website met EOTSS’s Enterprise Information Technology Accessibility Policy and WCAG 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility, we inspected a random, nonstatistical sample of 40 unique purchase orders (POs) out of a population of 364.[9]

We also inspected statistical samples of 60 bid solicitations out of a population of 13,897 and 60 master blanket purchase orders (MBPOs) out of a population of 3,413, using a 95% confidence level, a 0% expected error rate, and a 5% tolerable error rate. (A check of these sampling parameters appears after the list below.)

  • The 60 MBPOs in our sample contained references to 25 Contract User Guides (CUGs). Of these 25, 10 were no longer in use at the time of our audit and 4 appeared twice, leaving a population of 11 unique, active CUGs that we tested for accessibility, as described below.
  • Of the 60 MBPOs in our sample, 40 contained attached contract forms: 24 contained a single contract form and 16 contained multiple contract forms, for a total of 71 unique attached contract forms that we tested for accessibility, as described below.
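With a 0% expected error rate, the sampling design described above reduces to a discovery-sampling question: the sample must be just large enough that, if the true error rate were at the 5% tolerable level, at least one error would surface with 95% probability. A quick check confirms that 60-item samples satisfy that condition:

```python
import math

confidence = 0.95      # desired confidence level
tolerable_rate = 0.05  # tolerable error rate (expected error rate is 0%)

# Smallest n with P(zero errors in sample | true rate = tolerable) <= 1 - confidence,
# i.e., (1 - tolerable_rate) ** n <= 0.05.
n_min = math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))
print(n_min)  # 59, so samples of 60 bid solicitations and 60 MBPOs suffice
```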

Additionally, to determine whether the processes for bid solicitation creation, quote creation, and PO creation related to Statewide Contracts (SWCs) met EOTSS’s Enterprise Information Technology Accessibility Policy and WCAG 2.1 for user accessibility, keyboard accessibility, navigation accessibility, language, error identification, and color accessibility, we inspected all 30 webpages available on COMMBUYS (12 related to the bid solicitation creation process, 8 related to the quote creation process, and 10 related to creating POs from SWCs). We performed the following procedures on the preceding samples.

User Accessibility

  • We determined whether the webpage could be viewed in both portrait and landscape modes.
  • We determined whether, when zoomed in to 200%, content on the webpage was undamaged and remained readable.
  • We determined whether, when zoomed in to 400%, content on the webpage was undamaged and in a single column.

Keyboard Accessibility

  • We determined whether all elements of the webpage could be navigated using only a keyboard.
  • We determined whether any elements on the webpage prevented a user from moving to a different element when using only a keyboard to navigate the webpage.

Navigation Accessibility

  • We determined whether there was a search function present to help users locate content.
  • We determined whether related hyperlinks allowed navigation to the intended webpage.

Language

  • We determined whether words that appeared on the webpage or PDF matched the language tag to which the webpage or PDF was set by examining its properties.
  • We determined whether proper names were identified in PDF files included on the webpage to avoid improper translation or pronunciation errors from screen readers.

Error Identification

  • We determined whether there was text explaining why an error occurred when a user input information into an entry field.
  • We determined whether there were examples given to assist the user in correcting mistakes (for example, a warning when entering a letter in a field meant for numbers).

Color Accessibility

  • We determined whether there was at least a 3:1 contrast in color and additional visual cues to distinguish hyperlinks.

See Findings 2, 3, and 4 for issues we identified regarding CUGs, attached contract forms, and the COMMBUYS bid solicitation and purchase order creation processes.

IT Governance

To determine whether OSD established effective IT governance policies and procedures, we performed the following procedures.

Information Security Incident Response Plan and Procedures

To determine whether OSD’s information security incident response plan and procedures complied with Sections 6.5.1 and 6.5.2 of EOTSS’s Information Security Incident Management Standard IS.009, we interviewed knowledgeable OSD staff members and requested OSD’s information security incident response plan and procedures. We learned that OSD relies on its secretariat agency, the Executive Office for Administration and Finance, for an information security incident response plan and procedures, so we inspected the Executive Office for Administration and Finance’s information security incident response plan and procedures to determine whether they complied with the aforementioned EOTSS standard.

See Finding 5 for an issue we identified regarding OSD’s information security incident response plan and procedures.

Business Continuity and Disaster Recovery

To determine whether OSD’s business continuity plan complied with Section 6.1.1.4 of EOTSS’s Business Continuity and Disaster Recovery Standard IS.005, we interviewed knowledgeable OSD staff members and inspected OSD’s business continuity plan to ensure that it addressed the following:

  • critical business processes;
  • manual and automated processes used by the agency;
  • minimum operating requirements to resume critical functions;
  • the designation of a business continuity lead;
  • clearly defined and communicated roles and responsibilities;
  • assigned points of contact; and
  • annual updates.

To determine whether OSD’s disaster recovery plan complied with Section 6.2.1 of EOTSS’s Business Continuity and Disaster Recovery Standard IS.005, we interviewed knowledgeable OSD staff members and inspected OSD’s disaster recovery plan to ensure that it addressed the following:

  • developing and maintaining processes for disaster recovery;
  • identifying relevant stakeholders;
  • conducting damage assessments of impacted IT infrastructure and applications;
  • establishing procedures that allow facility access to support the restoration of data in an emergency;
  • recovering critical agency services;
  • implementing interim means for performing critical business processes at or above minimum service levels; and
  • restoring service at the original site of impact without interruption.

See Finding 6 for an issue we identified regarding OSD’s business continuity plan.

Cybersecurity Awareness Training

To determine whether OSD’s cybersecurity awareness training met the requirements of Sections 6.2.3 and 6.2.4 of EOTSS’s Information Security Risk Management Standard IS.010, we performed the following procedures.

  • We inspected the cybersecurity awareness training certificates of completion for a random sample of 15 out of a total population of 48 newly hired employees to determine whether they completed the new hire cybersecurity awareness training within 30 days of orientation.
  • We inspected the cybersecurity awareness training certificates of completion for a random sample of 20 out of a total population of 60 employees to determine whether they completed the annual refresher cybersecurity awareness training.
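Each certificate test above is, mechanically, a date comparison. A sketch with hypothetical records (the real dates come from the certificates of completion and orientation rosters):

```python
from datetime import date, timedelta

DEADLINE = timedelta(days=30)  # new hire training due within 30 days of orientation

# Hypothetical records: (employee, orientation date, training completion date).
records = [
    ("Employee A", date(2022, 3, 7), date(2022, 3, 21)),
    ("Employee B", date(2022, 5, 2), date(2022, 5, 27)),
]

for name, orientation, completed in records:
    elapsed = completed - orientation
    status = "on time" if elapsed <= DEADLINE else "EXCEPTION"
    print(f"{name}: {status} ({elapsed.days} days after orientation)")
```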

We noted no exceptions in our testing; therefore, we conclude that, during the audit period, OSD cybersecurity awareness training met the requirements of Sections 6.2.3 and 6.2.4 of EOTSS’s Information Security Risk Management Standard IS.010.

We used a combination of statistical and nonstatistical sampling methods for testing and did not project the results of our testing to any population.

Data Reliability Assessment

Web Accessibility Testing

To determine the reliability of the site map spreadsheet we received from OSD management, we interviewed knowledgeable OSD staff members and checked that variable formats (e.g., dates, unique identifiers, and abbreviations) were accurate. Additionally, we ensured that there was no abbreviation of data fields, no missing data (e.g., hidden rows or columns, blank cells, and incomplete records), and no duplicate records and that all values in the data set corresponded with expected values.
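Checks of this kind lend themselves to scripting. The sketch below uses pandas (an assumed tool; the audit does not specify one) against a small hypothetical stand-in for the site map spreadsheet:

```python
import pandas as pd

# Hypothetical stand-in for the site map spreadsheet received from OSD.
df = pd.DataFrame({
    "Page Title": ["OSD Home", "COMMBUYS Help", None],
    "URL": [
        "https://www.mass.gov/orgs/operational-services-division",
        "https://www.mass.gov/commbuys-help",
        "https://example.com/stray-row",
    ],
})

# Missing data: blank cells or incomplete records.
print(f"{df.isna().any(axis=1).sum()} rows with blank cells")

# Duplicate records (full-row duplicates).
print(f"{df.duplicated().sum()} duplicated rows")

# Expected values: every URL should point into Mass.gov.
off_domain = ~df["URL"].str.startswith("https://www.mass.gov")
print(f"{off_domain.sum()} URLs outside mass.gov")
```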

We selected a random sample of 20 uniform resource locators (URLs)[10] that could be accessed independently from the OSD site map and traced them to the corresponding webpage, checking that each URL and page title matched the information on OSD’s Mass.gov website. We also selected a random sample of 20 URLs from OSD’s Mass.gov website and traced each URL and page title to the site map to ensure that there was a complete and accurate population of URLs on the site map.
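This two-direction trace (from the list to the website for accuracy, and from the website back to the list for completeness) is, in effect, a set-membership check run both ways. A toy sketch of the idea with hypothetical URL sets:

```python
import random

# Hypothetical populations: URLs listed in the site map vs. URLs found on the
# live site. In a clean data set the two collections agree.
site_map_urls = {f"https://www.mass.gov/osd/page-{i:03d}" for i in range(1, 190)}
live_site_urls = set(site_map_urls)

rng = random.Random(0)

# Accuracy: every sampled site map URL should exist on the live site.
for url in rng.sample(sorted(site_map_urls), 20):
    assert url in live_site_urls, f"site map URL not found on live site: {url}"

# Completeness: every sampled live URL should appear in the site map.
for url in rng.sample(sorted(live_site_urls), 20):
    assert url in site_map_urls, f"live URL missing from site map: {url}"

print("Both 20-item traces passed.")
```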

COMMBUYS Data

To assess the reliability of the COMMBUYS lists of MBPOs, POs, and bid solicitations that we received from OSD, we checked that variable formats (e.g., dates, unique identifiers, and abbreviations) were accurate. Additionally, we ensured that there was no abbreviation of data fields and no hidden rows or columns and that all MBPOs, POs, and bid solicitations in the data set were active during the audit period. We then selected random samples of 20 each of MBPOs, POs, and bid solicitations from the appropriate lists and traced them to the COMMBUYS website. We also selected random samples of 20 each of MBPOs, POs, and bid solicitations from COMMBUYS and traced them to our lists of MBPOs, POs, and bid solicitations.

IT Governance Testing

To determine the reliability of the employee list we received from OSD management, we checked that variable formats (e.g., dates, unique identifiers, and abbreviations) were accurate. Additionally, we ensured that there was no abbreviation of data fields, no missing data (e.g., hidden rows or columns, blank cells, and incomplete records), and no duplicate records and that all values in the data set corresponded with expected values.

We selected a random sample of 10 employees from the employee list and traced their names to CTHRU, the Commonwealth’s statewide payroll open records system, to verify the list’s accuracy. We also selected a random sample of 10 employees from CTHRU and traced their names back to the employee list provided by OSD to ensure that we received a complete and accurate employee list.

Based on the results of the data reliability assessment procedures described above, we determined that the site map, the COMMBUYS data lists, and the employee list were sufficiently reliable for the purposes of our audit.

[7] An element is a part of a webpage that contains data, text, or an image.

[8] A language tag identifies the native language of the content on the webpage or PDF (e.g., a webpage in English should have an EN language tag). The language tag is listed in the webpage’s or PDF’s properties. This, among other things, is used to help screen readers use the correct pronunciation for words.

[9] We arrived at this population by examining unique PO numbers to avoid testing a sample that included duplicate PO numbers (i.e., that included a PO with multiple requisitions). This resulted in a smaller, more diverse population of contracts that did not skew toward contracts that appeared multiple times.

[10] A URL uniquely identifies an internet resource, such as a website.

Date published: April 25, 2024
