During the audit period, DESE did not consistently follow its risk-based approach for scheduling AE provider monitoring visits, which is based on cohort assignments. In our review of the cohort assignments for the 77 AE providers that were active during our audit period, we found that, based on the providers’ request for proposals (RFP) application scores, 13 were incorrectly assigned to lower-risk cohorts than they should have been, and 12 were incorrectly assigned to higher-risk cohorts. (The other 52 were assigned to the correct cohorts.) Monitoring visits were not conducted in the appropriate order because of the incorrect cohort assignments.
If providers are not assigned to the appropriate cohorts, higher-risk providers may not receive the assistance they need to ensure that they meet program requirements.
DESE’s “Program Quality Review and Site Visit Protocol” states,
Programs will be reviewed on a cohort-based model during the 2019–2022 funding cycle:
- approximately a third of programs will be reviewed in Year 1 (FY2019);
- approximately a third of the funded programs will be reviewed in Year 2 (FY2020);
- approximately a third of the funded programs will be reviewed in Year 3 (FY2021).
No visits will be conducted in Year 4 (FY2022) due to the next open and competitive rebid.
DESE management told us the aforementioned cohort-based model involved scoring the applications received in response to an RFP and using those scores to assess risk levels. The lowest-scoring third of AE providers are assigned to cohort 1 (high risk), the second-lowest-scoring third of AE providers are assigned to cohort 2 (medium risk), and the highest-scoring third of AE providers are assigned to cohort 3 (low risk). Cohorts are used to schedule program quality reviews (PQRs) and site visits (SVs).
Reasons for Issues
DESE lacked controls to ensure that AE provider monitoring visits were scheduled in accordance with its risk-based cohort assignment policy.
DESE should implement controls to ensure that AE providers are placed in the appropriate cohorts and monitoring visits are scheduled in accordance with its risk-based cohort assignment policy.
We do not concur with this finding. ACLS is required to monitor our subrecipient grantees, and that monitoring can take many forms as defined by the federal Uniform Grant Guidance [issued by the Office of Management and Budget], including budget reviews, communication with subrecipients, technical training, and on-site visits. In FY19, ACLS changed its monitoring protocols, adding ambitious new criteria, including the monitoring, or program quality review, of every subrecipient program within a four-year funding cycle. It should also be noted that ACLS was not required to follow a strict protocol beyond the requirements previously mentioned, and all the changes made to its protocol were approved by management and had control activities put in place.
The text in the “Program Quality Review and Site Visit Protocol” that follows the portion quoted above states:
To accelerate outcomes, PQR schedules factor in the performance of programs. Performance in this case refers to a program’s open and competitive proposal score and any funding conditions. Other factors may include previous fiscal year Measurable Skill Gains (MSG) target completion, Educational Functioning Level (EFL) completion rates when full MSG data is not available, and/or recommendations from program specialists.
ACLS makes clear that multiple factors are considered when deciding how to schedule monitoring visits, including "recommendations from program specialists." Because ACLS has chosen to meld a census approach with its risk-based approach for scheduling monitoring visits (i.e., all programs will be comprehensively monitored during the contract term), the scheduling of visits has been altered to incorporate other criteria and ensure that all subsets of ACLS's programs are considered for monitoring.
ACLS acknowledges that some, but not all, of these other factors (e.g., geographic/regional considerations) are outlined in the protocol and recognizes the need to better document any changes to the holistic program quality review protocol. However, it is important to note that ACLS did complete full program quality reviews of approximately one-third of the subgrantees in FY19, which aligns with our stated commitment and represents implementation of our new census-based monitoring approach, which will cover all programs over a three-year period.
The Office of the State Auditor (OSA) acknowledges that DESE’s oversight of its AE providers involves a number of activities. OSA’s audit work focused on evaluating one of these activities, the process DESE used to schedule monitoring visits, to determine whether they were properly prioritized and scheduled in accordance with DESE’s “Program Quality Review and Site Visit Protocol.”
The protocol dictates that monitoring visits are to be scheduled using a risk-based cohort model. Although the protocol does not delineate the actual risk assessment process to be followed, DESE management told us during our audit that the agency's risk-based cohort model involved scoring the RFP applications received and using the scores to assess risk levels: the lowest-scoring third of AE providers are assigned to cohort 1 (high risk), the second-lowest-scoring third to cohort 2 (medium risk), and the highest-scoring third to cohort 3 (low risk). In its response, DESE's ACLS Unit now asserts that it uses multiple factors when deciding how to schedule monitoring visits. However, this was not mentioned to us during our audit work, and the protocol does not mention these factors. Further, nothing in the information we reviewed stated to what extent, if any, the other factors cited in the ACLS Unit's response were used in the risk assessment/cohort determination process, and the text that the ACLS Unit quotes in its response as part of the protocol was not included in the documentation provided to us during our audit. If the ACLS Unit does use other factors to establish its risk-based cohorts, these factors should be identified in its protocol, along with how each factor should be weighted in the risk assessment process, to ensure that they are properly documented and the process is performed consistently.
Finally, although we acknowledge that DESE’s completion of full PQRs of one-third of its AE providers during fiscal year 2019 was a significant achievement, not ensuring that providers are placed in the proper risk cohorts means that some of DESE’s high-risk providers may not have been reviewed during that year. Therefore, such providers may not receive the assistance they need to meet program requirements. We again urge DESE to implement our recommendations.
Date published: June 2, 2020