Chapter II Review Preparation: A. Review Information in ED Databases
In most cases, the decision to conduct a program review was based on the
results of case managing the school. Therefore, there should be notes for the
applicable school in CMIS that document the results of the research performed
by various team members (e.g., audit, eligibility, program reviews, financial
analysis, etc.) and the reasons for recommending that a program review be conducted.
Eligibility: This section contains information about a school that may be helpful in
planning the program review, such as additional locations that have been
reported to the Department or whether the school participates in the Quality
Assurance Program or Experimental Sites Initiative. Other types of useful
information include:
If not already discussed in case managing the school, reviewers can also identify
conditions of any provisional recertification as a possible area of concentration
during the review. Additionally, reviewers could perform searches based on
ownership listings to identify whether other schools share common ownership.
Note that much of this information is summarized in the Detailed School Report
available from PEPS.
Program Review: Although CMIS should contain information about prior program
reviews, more detailed information about the types of findings and associated
liabilities can be found in the program review section of PEPS. Generally,
information is maintained for reviews closed within the last five years.
Audit: As with program reviews, CMIS should contain information about recent
Title IV audits conducted at the institution. However, the audit section of PEPS
will provide a more detailed record of the school’s audit history. Reviewers may
check PEPS or consult their team’s audit resolution specialists for additional
information on prior non-federal audits and audits by OIG staff. Serious audit
findings, especially recurring violations in program review focus items, should be
noted, added to the reviewer's on-site checklist, and reviewed for corrective
action.
If the team's audit research suggests that required audits have not been submitted,
the Department's Document Receipt and Control Center (DRCC) or the Federal
Audit Clearinghouse can be contacted to confirm whether the audits were received.
Direct Loan: This section will identify the school’s origination option, which
determines such things as whether the school controls the draw down of Direct
Loan funds, and when a school can disburse funds to students.
Keep in mind that PEPS is most valuable as a starting point for researching
institutional information. If necessary, staff may also check the source
documents.
NSLDS provides the single most complete picture of aid awarded and disbursed at a
school that can be obtained within the Department. NSLDS can provide
summary and detail funding reports for the FFEL and Direct Loan Programs, the
Pell Grant Program, and the Perkins Loan Program. The detail funding report
includes student names and funds disbursed to each student, which can be
useful in preparing a statistical sample. Schools and approximately 20 servicers
(including the National Student Clearinghouse) send Student Status Confirmation
Reports (SSCRs) to NSLDS. Thus, complete information on student attendance
can be retrieved from NSLDS. For NSLDS assistance, call 1-800-999-8219 or visit www.nsldsfap.ed.gov.
When using NSLDS data, reviewers should remember that while NSLDS is the
best single source of data on a particular school, it might not be the best source
for each data element. Because NSLDS is a repository that collects data from
many other systems, it provides a one-stop source for data that may not be
easily obtainable otherwise. Reviewers should consider it a starting point for
research, but should exercise caution when drawing conclusions about data
contained in NSLDS and should verify the data with an independent source
before using it as the basis for a program review finding. Exceptions to this
are SSCR data, Perkins Loan data, and Pell and FSEOG overpayment data,
which schools are responsible for reporting directly to NSLDS.
Guaranty Agencies: For schools that participate in the FFEL program, guaranty
agencies can often provide information very similar to the information that can be
obtained from the Direct Loan system. Query NSLDS to find which guaranty
agencies have loans for the school and contact them to request the information.
GAPS: The Department's Grant Administration and Payment System (GAPS) provides several reports that are useful in review preparation:
• Activity Report – shows the amount of cash drawn down for each “award,” or
document number (including Direct Loans, if the school is a Direct Loan
participant), along with individual cash request dates and amounts;
• Refunds Report – identifies the amount of funds the school has returned to
the Federal programs through GAPS (but does not necessarily reflect all
actual Title IV refunds); and
• Award Balances Report – summarizes the authorization, net draws, and
available balance for each award.
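As a rough illustration of how the figures on these reports fit together (the amounts below are hypothetical and are not drawn from any actual GAPS report), the available balance shown on the Award Balances Report should equal the authorization less net draws, and a large gap between cash drawn down and the school's own disbursement records can point to potential excess cash:

```python
# Hypothetical figures for a single award (document number); not actual GAPS data.
authorization = 500_000.00          # total amount authorized for the award
net_draws = 430_000.00              # cash drawn down, net of refunds returned through GAPS
disbursed_to_students = 405_000.00  # per the school's own disbursement records

# What the Award Balances Report should show for this award.
available_balance = authorization - net_draws

# Cash drawn from GAPS but not yet disbursed to students: a possible excess cash issue.
undisbursed_cash = net_draws - disbursed_to_students

print(f"Available balance:            ${available_balance:,.2f}")
print(f"Cash drawn but not disbursed: ${undisbursed_cash:,.2f}")
```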
The FISAP is now available in electronic format for schools participating
in the Campus-Based Programs. This report contains important information
about the school's management of the Campus-Based Programs. In addition to
information about the allocation of funds and the status of the school's Perkins Loan
portfolio for prior years, Part II, Section E also provides some student enrollment
numbers.
The need to access specific student data from the Pell Grant Recipient Financial
Management System (RFMS) should be rare. The Pell Operations Office can generate
reports from the RFMS database on a limited basis as a special request. If necessary,
you can request them through [email protected]. There are several initiatives
under way that will hopefully provide staff with easier access to RFMS data.
If the school being reviewed is a Direct Loan participant, reviewers may find that
the Direct Loan System can provide very valuable data. The Direct Loan System
data is similar to the data in NSLDS in many ways; however, it has the
advantages of being more accurate, more current, and more complete.
Furthermore, the Direct Loan School Relations staff maintains a limited database
for schools that participate in the program. Input from the Direct Loan staff must
be obtained during the case management of the school. Also, reviewers should
consider requesting from the Direct Loan staff any pertinent information that
might be available in their database.
There are sources of information available to reviewers other than those
contained in the Department's databases. Some examples are noted below.
1. Prior Reviews
PEPS research will identify the types of findings and the associated liabilities.
However, a review of the actual program review records, especially the
correspondence, will provide more in-depth insight into past problems at the
school. Even if there were no resultant liabilities, past findings may have
indicated serious problems that reviewers may want to look at more closely. For
example, the only finding in a prior review may have identified problems that
resulted in students being under-awarded, with the resulting resolution that the
institution agreed to retrain staff to prevent a recurrence. That review, as
reflected in PEPS (one finding, no liability), may appear to involve only insignificant
issues, but it is important to verify that the school is not still under-awarding
students. Reviewers who conducted any recent reviews may provide valuable
information about the school’s organization and procedures.
3. Complaint Profiles
Each CMT may have different methods of tracking the receipt and resolution of
complaints and referrals. Ideally, there should be a method for reviewers to
identify and research all recent complaints and referrals. Even if complaints have
been successfully resolved, they may provide some insight into operations at the
school.
Each CMT may maintain various records that provide additional useful
information. CMT files may also contain information from accrediting and
licensing bodies. For example, the CMT may keep an “institutional file” in which
it files all miscellaneous information that comes to the CMT about each school
(e.g., Campus-Based Allocation Letters that identify the institution’s
authorizations, and Perkins Loan authorizations).
5. AAAD Liaison
Reviewers should check with their respective AAAD liaison to see if there are any
AAAD records of past administrative actions, appeals of FPRDs/FADs, and/or
debarment or suspension actions against school officials. AAAD maintains a
database of all referrals and resolutions of administrative actions, appeals,
debarments and suspensions, as well as other miscellaneous information on
schools that may not be recorded elsewhere. In addition, AAAD maintains
copies of all settlement agreements. The AAAD liaison may also be aware of
any information from OGC regarding prior actions.
AAAD staff also have access to the Lexis-Nexis system, which contains
information on case law, news articles and a variety of public records, including
bankruptcy petitions, corporation registrations, judgments, tax liens, Uniform
Commercial Code filings (which identify parties holding security interests,
such as liens, against an individual or corporation), and verdicts and settlements.
7. The Internet
This discussion must begin with the caveat that reviewers should be very
circumspect about information gathered from the Internet and use only reliable
sources. That being said, there may be some useful information on the Internet
about an institution; for example, discovering through newspaper links that a
school scheduled for a program review was in the process of being sold.
If the school maintains a website, the reviewer should look it over. This may
provide additional information about the school, or may reveal potential
conflicts with information that the school has reported to ED. Also, a website is
information that is being provided to the consumer, and should be reviewed in the
same manner as printed consumer information to ensure that there is no
misrepresentation.
C. Announced/Unannounced Reviews
In general, all program reviews will be announced, although a CMT may depart
from this policy after the ACD consults with the Division Director. If a review will
be announced, the institution should be better prepared to have staff and records
available at the agreed-upon dates of the on-site review. In addition, information
on institutional administration of the Title IV programs may be requested in
advance (typically with 2 to 4 weeks' notice). Information requested should include a
Generally, the CMT will have already solicited information during the case
management process from adjunct team members from offices such as AAAD,
Direct Loan, and Quality Assurance. However, once a final decision is made to
conduct the program review, it is advisable to contact those offices again before
scheduling the review. For example, the Direct Loan Client Account Managers
(CAMs) often visit participating schools, so it is advisable to check with that office
to avoid simultaneous visits.
E. Sample Selection
Most program reviews will entail reviewing student files to evaluate the school’s
procedures for awarding and disbursing Title IV funds. The basic guideline is
that reviews should cover the two most recent closed award years. In addition,
some files from the current award year should be examined. However, the CMT
may decide that a shorter timeframe is appropriate.
Except for schools with very small Title IV populations (under 100 per award
year), reviewers should prepare a statistical sample list in advance of the review.
To identify a sample of student files to review on-site, reviewers first select a
valid statistical sample list from the population of Title IV recipients under review.
From the statistical sample, the reviewer then selects a smaller, random sample
list. The file review portion of the review begins with this random sample.
The best source of information about Title IV recipients is the institution's own
records. Many schools maintain databases that identify Title IV recipients for
each award year. To maximize the accuracy of the sample selection, ask the
institution to submit a complete, unduplicated, reconciled list (in an
electronic database format, if possible) of all Title IV recipients, by award
year. Ideally, the list should be sorted alphabetically or by social security
number, and should also identify the amount of Title IV funds received in each
program by each student in the applicable award year. Using a complete,
unduplicated list is important because the results of the review will be more
accurate, and liability extrapolations more comprehensive, if based on the entire
universe of Title IV recipients.
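As a minimal sketch of how such a list can be checked on receipt, the snippet below confirms that the list is unduplicated and totals the reported funds by program; the file name and the column names (ssn, pell, ffel_dl, fseog, fws, perkins) are assumptions about how a school might format its submission, not a prescribed ED layout:

```python
# Minimal sketch: check an electronically submitted recipient list for duplicates
# and total the reported Title IV funds by program. The file name and column
# names used here are hypothetical, not a required format.
import csv
from collections import Counter

with open("title_iv_recipients_2002_2003.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# The list should be unduplicated: flag any student listed more than once.
ssn_counts = Counter(row["ssn"] for row in rows)
duplicates = [ssn for ssn, count in ssn_counts.items() if count > 1]
print(f"{len(rows)} records, {len(duplicates)} duplicated SSNs")

# Total the funds reported for each Title IV program.
for program in ("pell", "ffel_dl", "fseog", "fws", "perkins"):
    total = sum(float(row[program] or 0) for row in rows)
    print(f"{program}: ${total:,.2f}")
```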
If the reviewer is able to obtain a complete list of Title IV recipients from the
school, he or she will select a statistical sample, using the CMO Statistical
Sampling Template (an Excel spreadsheet).
From the universe of all Title IV recipients for an award year, the Statistical
Sampling Spreadsheet will provide a report that identifies the random statistical
sample of students from that universe. The software then identifies a further
random sub-sample of 15 student records from the statistical sample group.
These records are identified with an asterisk on the report generated by the
software. These student records will be the initial focus of work for the review.
Reviewers may find that they need to expand the statistical sample to increase
the number of files to be examined.
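The actual sample sizes come from the CMO Statistical Sampling Template itself; the sketch below only illustrates the two-stage selection the spreadsheet performs, using the same hypothetical recipient list format as above and an illustrative sample size of 100:

```python
# Illustrative two-stage selection: a random statistical sample from the universe,
# then a random sub-sample of 15 records for the initial file review. The sample
# size of 100 is an example only; the CMO template determines the real size.
import csv
import random

def select_samples(recipients, sample_size, subsample_size=15, seed=None):
    rng = random.Random(seed)
    statistical_sample = rng.sample(recipients, min(sample_size, len(recipients)))
    initial_files = rng.sample(statistical_sample,
                               min(subsample_size, len(statistical_sample)))
    return statistical_sample, initial_files

with open("title_iv_recipients_2002_2003.csv", newline="") as f:  # hypothetical file
    universe = list(csv.DictReader(f))

sample, initial_files = select_samples(universe, sample_size=100, seed=2003)
print(f"Statistical sample: {len(sample)} records; "
      f"initial file review: {len(initial_files)} records")
```

Expanding the sample to examine additional files is then simply a matter of increasing the sample size.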
The Notice of Visit Letter constitutes the official written request for access to
records, required under 668.23(g), to initiate the program review process. A
standard format for the Notice of Visit letter for both announced and
unannounced reviews is provided in Appendix B. Reviewers must adjust the text
in advance, adding information relevant to the particular school. Information to
be added must include the name of the chief administrative officer, OPEID
numbers, review team member(s), and award years to be reviewed.
After the Notice of Visit letter has been sent, reviewers should contact the
institution to arrange an entrance conference to initiate the program review. It is
preferable that managers of the primary offices involved in the review be present
at the entrance conference. This is usually beneficial, in that it allows the
reviewers to become acquainted with the school officials they may be interacting with
over the course of the program review, to provide an overview of the review
process, and to discuss logistical issues (e.g., copier and computer access, record
availability, etc.).
Reviewers may find the SFA Assessment useful as a tool in the case
management process, and as an information resource. Therefore, reviewers
should become familiar with the SFA Assessment to help determine its
applicability to the circumstances of the school being evaluated. Reviewers can
then decide, at their discretion and based on the particulars of the review,
when and if the SFA Assessment can help at some point during the review process.
The SFA Assessment is the starting point for any institution's quality improvement
initiatives. The SFA Assessment is designed around the concept of self-
assessment. An institution evaluates and analyzes its aid delivery system (the
existing policies, procedures, and practices) to determine strengths and
weaknesses. The benefit from this process is that the institution assesses its
own systems and identifies areas that need improvement. The SFA Assessment
is designed to strengthen a trusting relationship with our partners as we strive
toward better service, and to improve overall performance in delivering aid and
serving students.
There are different ways that the SFA Assessment can be used throughout the
case management/ program review process.
Using the SFA Assessment while on-site
The SFA Assessment can be an
effective tool to use while on-site. If reviewers have time to spare during the
review, a portion of the SFA Assessment can be used at the beginning of the
review to help the reviewers and the institution determine areas that might need
improvement. Technical assistance and recommendations can be provided prior to the conclusion of the review.
Using the SFA Assessment after the review
The SFA Assessment can be an
effective tool to use after the review is conducted. Reviewers could have an
institution complete a section of the SFA Assessment in response to the exit
interview to strengthen areas of non-compliance and to determine if the findings
can be resolved prior to the report being written. This would be a proactive step
toward achieving mutual respect and a trusted partnership.
Further, the SFA Assessment can be an effective tool to use after the review is
closed. Oftentimes, program reviews are conducted, the findings are resolved,
the reviews are closed, and the institution begins making similar mistakes in the
same areas that resulted in the findings of the review. Reviewers can encourage
the institution to use the SFA Assessment after the review is closed to
continuously evaluate its procedures to ensure that the findings do not recur. If
an institution uses the SFA Assessment to continuously evaluate its Title IV
processes and to make improvements based on the results of the SFA
Assessment activity, then the likelihood of future findings and liabilities should be
reduced.