
Audit of the Massachusetts Environmental Police Objectives, Scope, and Methodology

An overview of the purpose and process of auditing the Massachusetts Environmental Police


Overview

In accordance with Section 12 of Chapter 11 of the Massachusetts General Laws, the Office of the State Auditor (OSA) has conducted a performance audit of the Massachusetts Environmental Police (MEP) for the period July 1, 2016 through June 30, 2018.

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Below is a list of our audit objectives, indicating each question we intended our audit to answer, the conclusion we reached regarding each objective, and where each objective is discussed in the audit findings.

  1. Does MEP ensure that environmental police officer (EPO) overtime is authorized and provided in accordance with its “MEP Overtime” policy; Section 30C of Chapter 149 of the General Laws; and Sections 7.1(B) and 7.3 of its unit 5 collective bargaining agreement (CBA)?

     Conclusion: No; see Findings 1, 2, 3, and 4 and Other Matters

  2. Does MEP ensure that it distributes overtime equitably in accordance with Section 7.2(I) of its unit 5 CBA?

     Conclusion: No; see Finding 5

To achieve our objectives, we gained an understanding of the internal controls we deemed significant to our audit objectives by reviewing the unit 5 CBA and agency policies and procedures, as well as conducting inquiries with MEP management. We evaluated the design of controls over MEP’s authorization and distribution of overtime. We also evaluated the effectiveness of controls over authorization of joint enforcement agreement (JEA) overtime as well as approval of hours worked for all overtime. We assessed whether these controls operated as intended during the audit period.

We performed the following procedures to obtain sufficient, appropriate audit evidence to address the audit objectives.

Data From the Information Management Corporation Computer-Aided Dispatch System and MMARS

To facilitate our planned substantive tests, we worked with MEP’s information technology (IT) personnel and extracted a dataset from the Information Management Corporation Computer-Aided Dispatch (IMC CAD) system that included 38,317 shifts worked during our audit period. The data included all shift logins, shift logouts, narratives created by officers, and types of patrol activity officers performed during their shifts.

We also queried a dataset from the Massachusetts Management Accounting and Reporting System (MMARS) that included all unit 5 officers who were paid premium overtime rates. The data included all overtime transactions’ dates worked, numbers of hours, time reporting codes8 used, overtime descriptions, employee identification numbers, job titles, event types, pay cycle start dates, pay cycle end dates, and dollar amounts. We filtered MMARS data by overtime event type to eliminate any non-associated overtime payments. We interviewed MEP IT personnel to better understand the fields and data elements in both datasets.

We divided the 22,512 overtime hours worked during the audit period (5,500 instances totaling $1,294,858) into two strata: stratum 1, JEA overtime (4,229 hours, 1,485 instances, and $238,211), and stratum 2, all other overtime (18,283 hours, 4,015 instances, and $1,056,647).
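As a quick consistency check, the two strata sum to the stated population totals. The sketch below uses the figures from this report; the dictionary layout is ours:

```python
# Two-stratum breakdown of the overtime population (figures from the report).
strata = {
    "JEA overtime": {"hours": 4_229, "instances": 1_485, "dollars": 238_211},
    "All other overtime": {"hours": 18_283, "instances": 4_015, "dollars": 1_056_647},
}

# Sum each measure across the strata to recover the population totals.
totals = {
    key: sum(s[key] for s in strata.values())
    for key in ("hours", "instances", "dollars")
}
print(totals)  # {'hours': 22512, 'instances': 5500, 'dollars': 1294858}
```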

After the end of our audit fieldwork, MEP provided us with additional documents that were requested but not provided to us during our audit testing. We reviewed all of this documentation (IMC Activity Reports, timesheets, Department of Conservation and Recreation [DCR] Directed Patrol Narrative sheets, operational plans, emails, and state of emergency [SOE] directives from the Governor) and considered it in drafting this report.

JEA Overtime Test Methodology

For JEA testing, we took a statistical random sample of 24 overtime transactions out of a population of 1,485, using a confidence level of 90% with a 10% tolerable error rate and 0% expected error rate.9 To determine whether overtime was authorized and whether there was documentation to substantiate work performed, we reviewed JEA Report Forms and inspected timesheets to ensure that officers charged overtime to the correct time reporting code.
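The report does not describe how the sample sizes were computed; audit shops typically use standard attribute-sampling tables. As a rough illustration only (the function and its name are ours), a simple binomial model with zero expected errors yields sizes in the same neighborhood:

```python
import math

def attribute_sample_size(confidence: float, tolerable_rate: float) -> int:
    """Smallest n such that, if the true error rate equaled the tolerable
    rate, the probability of seeing zero errors in the sample is at most
    (1 - confidence).  Assumes a 0% expected error rate (binomial model)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))

# 90% confidence with a 10% tolerable error rate:
print(attribute_sample_size(0.90, 0.10))  # 22
```

This model gives 22 for the JEA stratum's parameters (the report used 24, a slightly more conservative size) and 59 for the split-shift parameters of 95% confidence and a 5% tolerable error rate (the report used 60); the small differences likely reflect table-based or finite-population adjustments.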

All Other Overtime Test Methodology

For the remaining population of 4,015 overtime transactions, we initially tested the 12 transactions where an officer worked more than 16.5 hours in a single day. Further, we took a statistical random sample of 74 transactions from the remaining 4,003 overtime transactions. Using a confidence level of 90%, with a 20% tolerable error rate and a 50% expected error rate, we tested to determine whether overtime was authorized and had documentation to substantiate work performed. Our sample consisted of the following types of overtime.

  • We requested documentation for 29 divisional overtime transactions to ensure that overtime was authorized and to substantiate work performed. We received and examined documentation (DCR Directed Patrol Narratives, IMC Activity Reports, operational plans, emails, and SOE directives from the Governor) for 28 of the 29 divisional overtime transactions.
  • We examined operational plans for 18 off-highway vehicle overtime transactions to ensure that overtime was authorized and to substantiate work performed.
  • We examined the DCR Directed Patrol Narratives for 13 DCR directed patrol overtime transactions to determine whether officers’ supervisors approved overtime.
  • We requested operational plans for 12 recreational boating safety overtime transactions to ensure that overtime was authorized and to substantiate work performed. We received and examined operational plans for 10 of the 12 transactions.
  • We requested, but did not receive, two emails for two boat instructor training overtime transactions to ensure that overtime was offered to all eligible officers.
  • We received and examined documentation (DCR Directed Patrol Narratives, IMC Activity Reports, operational plans, emails, and SOE directives from the Governor) to determine whether the 12 transactions where an officer worked more than 16.5 hours in a single day were authorized.

Forty-Hour Test Methodology

Three sets of rules (Section 30C of Chapter 149 of the General Laws, MEP’s overtime policy, and the unit 5 CBA for the period July 1, 2015 through June 30, 2018) governed EPO overtime during the audit period. Together, these rules significantly complicate determining when officers are eligible for overtime. The methodology laid out below reflects our good-faith interpretation of those rules as applied to MEP’s Human Resources Compensation Management System (HR/CMS) data.

We queried the Commonwealth Information Warehouse10 for the audit period and conducted an analysis of those data to identify instances of premium overtime paid to officers where the officers did not meet the elements of the OSA-constructed criteria to be paid at the premium overtime rate. To be included in the analysis, an instance of overtime had to meet the following three criteria:

  1. occurring on a day when the officer worked less than 8.5 hours (in accordance with Section 30C of Chapter 149 of the General Laws and MEP’s overtime policy);
  2. occurring in a calendar week when the officer worked less than 40 hours (in accordance with Section 30C of Chapter 149 of the General Laws and MEP’s overtime policy); and
  3. occurring on a day during an eight-week period in which the average number of hours worked per week was below 40. For the purpose of this sample, we considered the eight-week period to begin seven weeks before the first day of the calendar week containing the date reported and end on the last day of the calendar week containing the date reported (in accordance with Section 30C of Chapter 149 of the General Laws).
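The three criteria above can be sketched as a filter over one officer’s daily hours. This is an illustrative reconstruction, not OSA’s actual analysis code: the input shape and the Sunday-start calendar week are assumptions, and the sick-time and leave adjustments described in the next paragraph are omitted.

```python
from datetime import date, timedelta

def week_start(d: date) -> date:
    """First day of the calendar week containing d (Sunday start assumed)."""
    return d - timedelta(days=(d.weekday() + 1) % 7)

def meets_criteria(ot_date: date, daily_hours: dict) -> bool:
    """daily_hours: one officer's hours worked, keyed by date (hypothetical
    input shape).  True when all three OSA-constructed criteria hold."""
    # 1. Fewer than 8.5 hours worked on the day of the overtime instance.
    if daily_hours.get(ot_date, 0.0) >= 8.5:
        return False
    # 2. Fewer than 40 hours worked in the calendar week containing the date.
    ws = week_start(ot_date)
    week_hours = sum(daily_hours.get(ws + timedelta(days=i), 0.0) for i in range(7))
    if week_hours >= 40:
        return False
    # 3. Average under 40 hours per week over the eight-week window that
    #    begins seven weeks before the calendar week containing the date
    #    and ends with that week.
    start = ws - timedelta(weeks=7)
    total = sum(daily_hours.get(start + timedelta(days=i), 0.0) for i in range(56))
    return total / 8 < 40
```

For example, an instance on a day with 8 hours worked and an otherwise empty eight-week window passes all three criteria, while a 9-hour day fails the first criterion immediately.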

We counted all paid leave toward hours worked, except sick time (per the unit 5 CBA); the only instances where we counted sick time as time worked were instances of federally funded boating and hunter safety overtime. Because of limitations of the system of record for approvals, IMC CAD, we did not treat any cases of reported overtime as callbacks or court time (two types of overtime named in the unit 5 CBA). To verify the accuracy of the HR/CMS data, we manually reconciled them against three officers’ timesheets: for each officer, we collected from MEP screenshots of reported time for a calendar week containing an instance of overtime that met the OSA-constructed criteria and cross-referenced those screenshots against the HR/CMS data.

Split-Shift Test Methodology

We queried MMARS for the audit period to identify days when officers worked split shifts. Using data analytics, we selected a statistical random sample of 60 of 1,834 instances when officers split shifts, with a 95% confidence level, a 5% tolerable error rate, and a 0% expected error rate. To determine whether officers received approval from MEP’s Director (Colonel) to work split shifts, we requested documentation of approvals. We also reviewed IMC Activity Reports to determine which officers worked before and after the overtime shifts.

IMC CAD Test Methodology

For IMC CAD testing, we analyzed IMC CAD by examining all shifts worked during our audit period to determine whether officers logged in or out for their work shifts. We asked management about the procedure of logging in to and out of IMC CAD. Additionally, we calculated the longest, shortest, and average shift lengths during our audit period.
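The shift-length statistics can be derived directly from paired login and logout timestamps. The sketch below is illustrative; the sample records are invented, and the actual IMC CAD extract fields may differ:

```python
from datetime import datetime

# Hypothetical (login, logout) timestamp pairs from IMC CAD shift records.
shifts = [
    (datetime(2017, 7, 1, 8, 0), datetime(2017, 7, 1, 16, 30)),
    (datetime(2017, 7, 2, 7, 0), datetime(2017, 7, 2, 15, 0)),
    (datetime(2017, 7, 3, 6, 0), datetime(2017, 7, 3, 18, 0)),
]

# Shift length in hours for each login/logout pair.
lengths = [(logout - login).total_seconds() / 3600 for login, logout in shifts]
print(f"longest={max(lengths):.1f}h shortest={min(lengths):.1f}h "
      f"average={sum(lengths) / len(lengths):.2f}h")
```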

Equitable Distribution Test Methodology

We queried all overtime transactions from MMARS and filtered them by location and job title to determine how overtime was distributed among MEP officers. We held discussions with MEP management and asked for evidence of the equitable distribution of overtime, including overtime rosters.

Data Reliability Assessment

We determined the reliability of the IMC CAD data by performing interviews and testing IT controls over user identification and authentication policies and procedures, account management, and security training and monitoring configuration. We ensured the completeness and accuracy of the data from HR/CMS by judgmentally selecting 10 officers and verifying that they worked on the selected dates using IMC Activity Reports. Also, we judgmentally selected 10 officers from IMC CAD to trace to HR/CMS data to determine whether the officers worked on selected dates.

In 2018, OSA performed a data reliability assessment of MMARS that focused on testing selected system controls (access controls, application controls, configuration management, contingency planning, and segregation of duties) for the period April 1, 2017 through March 31, 2018. As part of the current audit, we tested security management controls at MEP during the audit period to assess security awareness training and personnel screening. Further, we judgmentally selected 10 officers for each of three pay periods from MMARS and determined whether the information in MMARS matched the unit 5 CBA salary grade chart.

Based on the results of our data reliability assessments, we determined that the information obtained for our audit period was sufficiently reliable for the purpose of our audit objectives.

8.     According to the Massachusetts Human Resources Division website, “A time reporting code is a unique identifier to specify the type of time used (e.g., sick, vacation, overtime, etc.).”

9.     The confidence level is the measure of how confident we can be that our results reflect what we would have obtained if the entire population had been tested. The tolerable error rate is the maximum error in the population we would be willing to accept and still conclude that the result from the sample had achieved the audit objective. The expected error rate is the anticipated rate of occurrence of the overtime not being authorized.

10.     According to the website of the Executive Office for Administration and Finance, the Commonwealth Information Warehouse is an “integrated repository” of “financial, budgetary, human resource, payroll, and time reporting information.”

Date published: April 21, 2020
