
The Executive Office of Education Did Not Always Establish Performance Metrics or Effectively Measure the Performance of Its Information Technology Vendors.

EOE did not consistently establish performance metrics or effectively measure the performance of technology vendors.


Overview

The Executive Office of Education (EOE) did not ensure that performance metrics (i.e., key performance indicators) were measured for its information technology (IT) vendors. Specifically, for five of the six IT contracts in our sample, EOE did not provide performance measurement reports. For the other contract, although EOE did establish performance measures and said that it had evaluated the vendor’s performance, it could not show objective metrics to support this assertion. Without establishing measures and objectively monitoring vendor performance, EOE cannot hold its IT vendors accountable for contractual noncompliance and/or poor performance, which could affect its operations.

Authoritative Guidance

Section DS2 (“Manage Third-Party Services”) of the Information Systems Audit and Control Association’s (ISACA’s) document Control Objectives for Information and Related Technology 4.1 states,

The need to assure that services provided by third parties (suppliers, vendors and partners) meet business requirements requires an effective third-party management process. This process is accomplished by clearly defining the roles, responsibilities and expectations in third-party agreements as well as reviewing and monitoring such agreements for effectiveness and compliance. Effective management of third-party services minimizes the business risk associated with non-performing suppliers.

Control Objectives. . . .

Establish a process to monitor service delivery to ensure that the supplier is meeting current business requirements and continuing to adhere to the contract agreements and [service-level agreements].

Although EOE is not specifically required to adhere to ISACA's control objectives, we believe they represent best practices that should be followed.

Reasons for Noncompliance

EOE officials stated that they believe that the agency’s contract documents identify expected outcomes and deliverables and that EOE effectively measures performance over the course of a contract. According to EOE management, the agency has been in the process of transitioning the administration of its IT services to the Executive Office of Technology Services and Security (EOTSS). EOTSS is responsible for IT contract management, but EOE and EOTSS did not have clearly defined roles, responsibilities, and expectations or a process for EOE’s IT contract performance monitoring.

Recommendations

  1. EOE should establish key performance indicators for future IT contracts.
  2. EOE should develop and implement a process to measure and monitor IT vendors’ performance.
  3. EOE should develop and implement metrics to ensure that IT vendors’ performance requirements, such as project milestones and time and expense budgets, are met.

Auditee’s Response

EOE prides itself on managing vendor relationships with processes and procedures that ensure it receives the services and products for which it contracts. Executed contract documents identify expected outcomes and deliverables, and EOE effectively measures performance over the course of a contract. . . .

EOE measures performance throughout a contract cycle. Contract documentation, such as the Scope of Work, includes language identifying the procured deliverables. During a contract period, IT staff oversee processes that include daily coordination meetings, weekly reports, required signoffs on deliverables, and overarching project plans and timelines. EOE does not submit payment to vendors until user acceptance testing occurs, if warranted, or the deliverable meets the expectations identified at the outset of the contract—e.g., in the Scope of Work. For IT staff augmentation contracts, job descriptions serve to establish performance expectations, and individuals are terminated if they do not meet these expectations. Additionally, two of the contracts reviewed by the [audit team]—one for phone services and one lease agreement for desktop computers—function essentially as a purchase; the “performance metric” for these contracts simply is whether EOE received the services/goods.

The following table identifies the methods by which EOE established contract expectations and assesses vendor performance for the six contracts reviewed by the [audit team]:

Table 2.

Contract: ITT46—Network Statewide Services
Contract Expectations: Telecommunications/phone contract
Assessment of Vendor Performance: Monthly review of performance/invoices

Contract: ITC49—IT Asset Lease Services
Contract Expectations: Purchase of network equipment; printer lease; desktop lease
Assessment of Vendor Performance: Sign-off on shipment/receipt of equipment

Contract: ITS55—IBM Software, Appliances, Maintenance, and Technical Support Statewide Contract
Contract Expectations: Hardware, software, and support for DESE and DHE data systems
Assessment of Vendor Performance: Sign-off on shipment/receipt of equipment; daily performance metrics using Tivoli monitoring software

Contract: ITS53—IT Staff Augmentation Full Service and Niche Statewide Contract
Contract Expectations: Staff augmentation services; Scope of Work/job descriptions
Assessment of Vendor Performance: Daily time reporting; weekly time sign-off using [Human Resource Compensation Management System]; weekly status reports of project plan

Contract: 18ITSMS1—SIF Maintenance and Support
Contract Expectations: Maintenance of the data collection system for DESE (data collected three times a year from all school districts)
Assessment of Vendor Performance: Application uptime; nightly system performance statistics; weekly status metrics

Contract: 18ESEEK1—Adult Ed Data System
Contract Expectations: Purchase and licensing of the Adult Ed software system; Scope of Work; required functionality
Assessment of Vendor Performance: Assigned project manager to manage the implementation; user acceptance testing and sign-off on deliverables

. . .

EOE constantly strives to improve practice and identify opportunities for standardization, where appropriate, and will continue to work with both EOTSS and other secretariat partners to adopt best practices.

Auditor’s Reply

We acknowledge that EOE does perform some monitoring of its contractors’ activities. However, the assessment activities detailed in the table above appear to focus primarily on assessing contract compliance and the provision of deliverables and do not address other important aspects of performance, such as the quality of work provided. For example, in the table, EOE indicates that it uses Tivoli software to monitor the performance of the contractor it hired to provide hardware, software, and support for the DESE and DHE data systems. Although this software may allow EOE to monitor the contractor’s activities, EOE does not use the data to assess the vendor’s performance. In another example, in the table, EOE indicates that it evaluates the performance of its staff augmentation service contractor by simply monitoring the progress of the project and various attendance metrics rather than the quality of services provided by the contractor (e.g., software developers are not evaluated on coding error rates and timeliness of error resolution). Moreover, it is important to note that although in some instances EOE indicates that it assesses the performance of these contractors, EOE never gave the Office of the State Auditor (OSA) any documentation to substantiate either that it had established metrics it could use to assess their performance or that it had performed any formal assessments. 

In its response, EOE asserts that two of the contracts in our sample (one for phone services and one lease agreement for desktop computers) are essentially purchases and that therefore the performance metric for these contracts is simply whether EOE received the services/goods in question. However, in OSA’s opinion, performance metrics and measures can and should be established for every contract. For example, on these two contracts, EOE could have established performance metrics such as the timeliness of responses to any service requests or the quality of the products received. In support of this, the “State Finance Law and General Contract Requirements” policy issued jointly by the Comptroller of the Commonwealth and the Operational Services Division (OSD) states,

The Commonwealth has a responsibility to conduct monitoring and evaluation of the commodities and services it purchases. These activities can assist in identifying and reducing fiscal and programmatic risk as early as possible, thus protecting both public funds and clients being served.

As an example of a state agency that meets this requirement, the Commonwealth’s Division of Capital Asset Management and Maintenance requires government awarding authorities to complete a standard Contractor Evaluation Form for Building Projects where they rate contractors on such criteria as quality of work, timeliness, and quality of customer service. This type of evaluation provides valuable information for determining whom to hire on future contracts.

In OSA’s opinion, to properly assess the quality and effectiveness of the services provided by its contractors, EOE should establish quantifiable performance metrics and should regularly assess vendors’ compliance with those metrics.

Based on its response, EOE will take measures to address our concerns in this area.

Date published: October 11, 2019
