
Audit of the Massachusetts Cultural Council Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Massachusetts Cultural Council.

Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor has conducted a performance audit of certain activities of the Massachusetts Cultural Council (MCC) for the period July 1, 2022 through June 30, 2024.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer; the conclusion we reached regarding each objective; and, if applicable, where each objective is discussed in the audit findings.

Objective 1: Did MCC establish and administer grant program funding in accordance with its grant guidelines?
Conclusion: Yes

Objective 2: Did MCC ensure that local cultural councils (LCCs) complied with the annual reporting requirements outlined in MCC’s “Local Cultural Council Program Guidelines”?
Conclusion: No; see Finding 1

Objective 3: Did MCC ensure that corrective actions for noncompliance with program guidelines that it identified during its review of LCCs were implemented in accordance with MCC’s “Local Cultural Council Program Guidelines”?
Conclusion: Yes

To accomplish our audit objectives, we gained an understanding of the MCC internal control environment relevant to our objectives by reviewing applicable policies and procedures, as well as by conducting interviews and walkthroughs with MCC management. We evaluated the design and implementation of the internal controls related to our objectives. We also tested the operating effectiveness of controls related to the review and approval of LCC operations before the release of funding and the committee review and approval of the grantee funding recommendation list. In addition, to obtain sufficient, appropriate evidence to address our audit objectives, we performed the procedures described below.

MCC Program Grants

To determine whether MCC established and administered grant program funding in accordance with its grant guidelines, we took the following actions. First, we divided our testing into two groups: (1) grant applicants who applied for the Cultural Sector Recovery for Individuals and Organizations Grants and (2) grant applicants who applied to all other MCC grant programs during the audit period.

For the grant applicants who applied for the Cultural Sector Recovery for Individuals and Organizations Grants, we selected a statistical[3] sample of 60 out of a total population of 9,102 grant applicants, using a 95% confidence level,[4] a 0% expected error rate,[5] and a 5% tolerable error rate.[6]

For the grant applicants who applied for all other MCC program grants, we selected a statistical sample of 47 out of a total population of 8,564 grant applicants, using a 90% confidence level, a 0% expected error rate, and a 5% tolerable error rate.
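
For illustration only, the following Python sketch shows one common way to derive a minimum sample size for attribute sampling with a 0% expected error rate at a stated confidence level and tolerable error rate. The function name and the binomial approximation are ours; the samples of 60 and 47 described above were presumably drawn with the audit team’s statistics software, which may apply additional adjustments.

    import math

    def min_attribute_sample_size(confidence, tolerable_rate):
        # Smallest n such that a sample containing zero errors supports, at the
        # stated confidence level, the conclusion that the population error rate
        # is below the tolerable rate (binomial approximation, 0% expected error).
        return math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))

    print(min_attribute_sample_size(0.95, 0.05))  # about 59 items
    print(min_attribute_sample_size(0.90, 0.05))  # about 45 items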

For each of the applicants in our samples, we reviewed the grant guidelines for each of the grants they applied to, reviewed the approved funding list for each of the grant programs, and confirmed that the grant applicant met eligibility requirements by examining their Forms W-9 and grant applications. Furthermore, we determined whether the approved amounts on the funding lists matched the payment amounts recorded in the Massachusetts Management Accounting and Reporting System (MMARS).
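
A minimal sketch of the payment comparison step, assuming hypothetical file and column names (applicant_id, approved_amount, paid_amount); the actual layouts of the funding lists and the MMARS extract are not described in this report.

    import pandas as pd

    # Hypothetical file and column names, for illustration only.
    approved = pd.read_csv("approved_funding_list.csv")   # applicant_id, approved_amount
    payments = pd.read_csv("mmars_payments.csv")          # applicant_id, paid_amount

    # Flag applicants whose approved amount does not match the recorded payment
    # (missing payments also surface here, because NaN never equals a number).
    merged = approved.merge(payments, on="applicant_id", how="left")
    mismatches = merged[merged["approved_amount"] != merged["paid_amount"]]
    print(mismatches)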

For this objective, we found no significant issues during our testing. Therefore, we concluded that, based on our testing, MCC established and administered grant program funding in accordance with its grant guidelines. 

Compliance with MCC’s “Local Cultural Council Program Guidelines”

To determine whether MCC ensured that LCCs complied with the annual reporting requirements outlined in MCC’s “Local Cultural Council Program Guidelines,” we took the following actions. First, we selected a random, nonstatistical[7] sample of 35 out of 329 LCCs for each fiscal year during the audit period.

For each of the 35 LCCs in our sample, we confirmed, using MCC’s grant management system, that the LCC submitted its council priorities and local guidelines by the required deadline of August 31, that its grant financial report was submitted by October 17, and that its annual grant decision report was submitted by its due date of January 17. We followed up with MCC regarding any instances in which LCCs did not submit annual reports by the required dates indicated in MCC’s “Local Cultural Council Program Guidelines.”
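
A minimal sketch of the deadline check, using hypothetical field names and fiscal year 2024 dates purely for illustration; the structure of the records in MCC’s grant management system is not described in this report.

    from datetime import date

    # Hypothetical submission record for one LCC, with fiscal year 2024 deadlines.
    deadlines = {
        "council_priorities_and_local_guidelines": date(2023, 8, 31),
        "grant_financial_report": date(2023, 10, 17),
        "annual_grant_decision_report": date(2024, 1, 17),
    }
    submissions = {
        "council_priorities_and_local_guidelines": date(2023, 8, 15),
        "grant_financial_report": date(2023, 11, 2),
        "annual_grant_decision_report": date(2024, 1, 10),
    }

    # Flag any report that was missing or submitted after its deadline.
    for report, due in deadlines.items():
        submitted = submissions.get(report)
        if submitted is None or submitted > due:
            print(f"{report}: not submitted by {due} (submitted {submitted})")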

For this objective, we found certain issues during our testing; namely, MCC did not always ensure that LCCs complied with the annual reporting requirements outlined in its “Local Cultural Council Program Guidelines.” See Finding 1 for more information. 

Follow-Up of LCCs’ Corrective Actions for Noncompliance

To determine whether MCC ensured that corrective actions for noncompliance with program guidelines that it identified during its review of LCCs were implemented in accordance with MCC’s “Local Cultural Council Program Guidelines,” we took the following actions. First, we selected a random, nonstatistical sample of 20 out of 93 LCCs that were more than 15 days late in submitting their annual reports during the audit period. For each of the LCCs in our sample, we examined emails from MCC to LCCs and comments in the grant management system to ensure that LCCs had addressed the corrective actions provided by MCC.
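
A minimal sketch of how such a sample could be drawn, assuming a hypothetical file with lcc_name and days_late columns; the fixed random_state is only to make the illustration reproducible.

    import pandas as pd

    # Hypothetical file and column names, for illustration only.
    lccs = pd.read_csv("lcc_annual_report_status.csv")   # lcc_name, days_late

    # Restrict to LCCs more than 15 days late, then draw a random sample of 20.
    late = lccs[lccs["days_late"] > 15]
    sample = late.sample(n=20, random_state=2024)
    print(sample[["lcc_name", "days_late"]])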

For this objective, we found no significant issues during our testing. Therefore, we concluded that, based on our testing, MCC ensured that corrective actions for noncompliance with program guidelines that it identified during its review of LCCs were implemented in accordance with MCC’s “Local Cultural Council Program Guidelines.”

We used a combination of statistical and nonstatistical sampling methods for testing. Where we used nonstatistical sampling methods, we did not project the results of our testing to any populations. 

Data Reliability Assessment

To determine the reliability of the list of program grant applicants and the list of LCCs during the audit period that we obtained from MCC’s grant management system, we interviewed MCC employees who were knowledgeable about the data to gain an understanding of the system’s controls and data management process. We reviewed System and Organization Control 2 Reports[8] covering the audit period, which addressed areas including security management, access controls, configuration management, segregation of duties, and contingency planning. We also tested whether personnel screenings and cybersecurity awareness trainings were conducted. For this aspect of our data reliability assessment, we found certain issues during our testing; namely, MCC did not ensure that all grant management system contractors completed the cybersecurity awareness training. See Finding 2 for more information.

For the list of program grant applicants, we tested the data to ensure that it did not contain certain dataset issues (i.e., duplicate records, hidden cells and rows, and blank fields). Further, we compared this list to the list of program grant payments that we obtained from MMARS to ensure accuracy. Additionally, we selected a random sample of 20 program applicants and verified the applicants’ names against their Forms W-9 to ensure accuracy. Further, we compared the total number of records in the dataset we received to the number that we observed during the data extraction process to confirm completeness.
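
A minimal sketch of the dataset checks, assuming the applicant list is exported to a CSV file (the file name and layout are hypothetical); duplicate records and blank fields can be flagged directly, while hidden cells and rows are a spreadsheet-specific issue checked separately.

    import pandas as pd

    # Hypothetical export of the grant applicant list.
    applicants = pd.read_csv("grant_applicants.csv")

    duplicates = applicants[applicants.duplicated()]
    blanks = applicants[applicants.isna().any(axis=1)]

    print(f"{len(applicants)} records, {len(duplicates)} duplicates, "
          f"{len(blanks)} records with blank fields")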

For the list of LCCs, we tested the data to ensure that it did not contain certain dataset issues (i.e., duplicate records, hidden cells and rows, and blank fields). Furthermore, we compared the list with the list of cities and towns that we obtained from the Citizen Information Service of the Secretary of the Commonwealth of Massachusetts to ensure the completeness and accuracy of the list of LCCs.
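
A minimal sketch of the completeness comparison, assuming each list is a plain text file with one municipality name per line (the file names are hypothetical).

    # Hypothetical input files: one municipality name per line.
    with open("lcc_list.txt") as f:
        lcc_municipalities = {line.strip() for line in f if line.strip()}
    with open("cis_cities_and_towns.txt") as f:
        all_municipalities = {line.strip() for line in f if line.strip()}

    # Municipalities with no LCC on the list, and LCC entries that do not
    # match a known city or town.
    print(sorted(all_municipalities - lcc_municipalities))
    print(sorted(lcc_municipalities - all_municipalities))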

Based on the results of the data reliability assessment procedures described above, we determined that the information we obtained during the course of our audit was sufficiently reliable for the purposes of our audit.

3. Auditors use statistical sampling to select items for audit testing when a population is large (usually over 1,000) and contains similar items. Auditors generally use a statistics software program to choose a random sample when statistical sampling is used. The results of testing using statistical sampling, unlike those from judgmental sampling, can usually be used to make conclusions or projections about entire populations.

4. Confidence level is a mathematically based measure of the auditor’s assurance that the sample results (statistic) are representative of the population (parameter), expressed as a percentage. A 95% confidence level means that 95 out of 100 times, the statistics accurately represent the larger population.

5. Expected error rate is the number of errors that are expected in the population, expressed as a percentage. It is based on the auditor’s knowledge of factors such as prior audit results, the understanding of controls gained in planning, or a probe sample. In this case, we assumed there were no errors in the data provided to us by the auditee.

6. The tolerable error rate (expressed as a percentage) is the maximum error rate in the population that the auditor can accept while still concluding, from the sample, that the audit objective has been achieved.

7. Auditors use nonstatistical sampling to select items for audit testing when a population is very small, the population items are not similar enough, or there are specific items in the population that the auditors believe require review.

8. A System and Organization Control report is a report, issued by an independent contractor, on controls over a service organization’s systems relevant to security, availability, processing integrity, confidentiality, or privacy.

Date published: November 26, 2025
